COLLOQUIUM: Makoto Yamada, "Approximating the 1-Wasserstein Distance: Applications in Self-Supervised Learning and Beyond"

Event Type
Seminar/Symposium
Sponsor
Siebel School of Computing and Data Science
Location
HYBRID: 2405 Siebel Center for Computer Science or online
Date
Mar 24, 2025, 3:30 pm
Originating Calendar
Siebel School Colloquium Series

Zoom: https://illinois.zoom.us/j/87113726192?pwd=xBAJvbFI0aedkNC57UF0OGMlA25Qsu.1

Refreshments Provided.

Abstract: 
In this seminar, we will first introduce an efficient approximation method for the Wasserstein distance [1]. While the Wasserstein distance is a powerful measure for comparing distributions, it is computationally expensive. To address this, we propose the Tree-Wasserstein Distance (TWD), which approximates the 1-Wasserstein distance using a tree structure. By employing L1 regularization to learn the edge weights and formulating the distance approximation as a Lasso regression problem, we achieve efficient and globally optimal solutions. Then, we introduce a self-supervised learning method based on the Wasserstein distance and SimCLR [2]. Finally, we will discuss the intersection of self-supervised learning (SSL) and neuroscience theories [3]. Inspired by predictive coding and the temporal prediction hypothesis, we propose PhiNet, an extension of SimSiam with two predictors mimicking the CA3 and CA1 regions of the hippocampus. PhiNet demonstrates more stable representation learning and better adaptability in online and continual learning scenarios. This work suggests that the temporal prediction hypothesis provides a plausible model for understanding robust and adaptive learning in SSL.
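The edge-weight learning step described above lends itself to a short illustration. The sketch below (Python with NumPy and scikit-learn; not the authors' code from [1]) uses a hypothetical toy tree over four leaves: on a fixed tree, the tree-Wasserstein distance is a weighted sum of absolute subtree-mass differences, so it is linear in the edge weights, and those weights can be fit to target 1-Wasserstein distances with L1-regularized (Lasso) regression. The tree, the feature construction, and the synthetic training pairs are illustrative assumptions, not the construction from the paper.

import numpy as np
from sklearn.linear_model import Lasso

# Toy tree over 4 leaves: parent[i] gives the parent of node i; node 6 is the root.
# Nodes 0-3 are leaves that carry the distributions' probability mass.
parent = {0: 4, 1: 4, 2: 5, 3: 5, 4: 6, 5: 6}
nodes = list(parent.keys())          # each non-root node owns the edge to its parent

def subtree_mass(mu):
    """Total mass of each node's subtree for a distribution mu over the 4 leaves."""
    mass = {v: 0.0 for v in range(7)}
    for leaf, p in enumerate(mu):
        v = leaf
        mass[v] += p
        while v in parent:           # propagate the leaf's mass up to the root
            v = parent[v]
            mass[v] += p
    return np.array([mass[v] for v in nodes])

def twd_features(mu, nu):
    """TWD(mu, nu) = sum over edges of w_e * |subtree-mass difference|, linear in w."""
    return np.abs(subtree_mass(mu) - subtree_mass(nu))

# Hypothetical training data: random distribution pairs with synthetic targets
# standing in for exact 1-Wasserstein distances (illustration only).
rng = np.random.default_rng(0)
pairs = [(rng.dirichlet(np.ones(4)), rng.dirichlet(np.ones(4))) for _ in range(200)]
X = np.stack([twd_features(mu, nu) for mu, nu in pairs])
true_w = np.array([1.0, 1.0, 0.5, 0.5, 2.0, 2.0])
y = X @ true_w + 0.01 * rng.standard_normal(len(X))

# L1-regularized regression on the edge weights (non-negative Lasso).
model = Lasso(alpha=1e-3, positive=True, fit_intercept=False).fit(X, y)
print("learned edge weights:", model.coef_)

Because the regression problem is convex, the fitted edge weights are globally optimal for the chosen regularization strength, which is the property the abstract highlights.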

[1] Makoto Yamada, Yuki Takezawa, Ryoma Sato, Han Bao, Zornitsa Kozareva, Sujith Ravi. Approximating 1-Wasserstein Distance with Trees. TMLR, 2022.

[2] Makoto Yamada, Yuki Takezawa, Guillaume Houry, Kira Michaela Dusterwald, Deborah Sulem, Han Zhao, Yao-Hung Hubert Tsai. An Empirical Study of Self-supervised Learning with Wasserstein Distance. Entropy, 2024.

[3] Satoki Ishikawa, Makoto Yamada, Han Bao, Yuki Takezawa. PhiNets: Brain-inspired Non-contrastive Learning Based on Temporal Prediction Hypothesis. ICLR, 2025.

Bio:
Makoto Yamada is currently an Associate Professor in the Machine Learning and Data Science Unit at the Okinawa Institute of Science and Technology (OIST), a leading and innovative university in Japan. He has been working on machine learning problems, particularly in explainable AI (XAI), self-supervised learning, and optimal transport, for over 10 years. His research has been published in top-tier machine learning and data mining venues such as NeurIPS, ICML, ICLR, and AISTATS. He has received several awards, including the Outstanding SPC Award from WSDM, the Best Paper Award at WSDM in 2016, and the Excellence Award from Yahoo Labs. 

Part of the Siebel School Speakers Series. Faculty Host: Han Zhao


Meeting ID: 871 1372 6192 
Passcode: csillinois


If accommodation is required, please email <erink@illinois.edu> or <communications@cs.illinois.edu>. Someone from our staff will contact you to discuss your specific needs.
