Matrix Factorization For k-Means


2022-03-21


Sibylle Hess is an Assistant Professor in the Data Mining group at TU Eindhoven in the Netherlands. Her research includes work on matrix factorization, particularly with clustering objectives, and on the relationship between this methodology and deep learning.

Sibylle started off by explaining, at a high level, how matrix factorization is related to deep learning, using autoencoders as an example. She also described how matrix factorization methods and deep learning methods differ in how similarities are initialized. To unpack the claim that the softmax formula and k-means clustering share the same working principle, Sibylle detailed how softmax works.
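To make the softmax/k-means connection concrete, here is a minimal illustrative sketch (not code from the episode): applying a softmax to the negative squared distances between a point and a set of centroids gives a soft version of the k-means assignment step, with the nearest centroid receiving the highest probability.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: exponentiate shifted logits and normalize."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Hypothetical example: one point and two centroids.
x = np.array([1.0, 0.0])
centroids = np.array([[1.0, 0.0], [0.0, 1.0]])

# Squared Euclidean distance from x to each centroid.
dists = np.sum((centroids - x) ** 2, axis=1)

# Softmax over negative distances: a soft cluster assignment.
# Scaling the logits up would sharpen this toward a hard k-means assignment.
probs = softmax(-dists)
```

The point of the sketch is only that the same normalized-exponential form appears in both places; the episode covers the actual correspondence in detail.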

She also discussed what an embedding space looks like and how the classes are distributed using cones. While spectral clustering does not have clear centroids as k-means does, it can be seen as a special kind of k-means called spherical k-means clustering.
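As a rough illustration of spherical k-means (a toy sketch, not the method discussed in the episode): points and centroids are normalized to the unit sphere, and assignment uses cosine similarity, which on unit vectors is just a dot product.

```python
import numpy as np

def spherical_kmeans(X, k, iters=50):
    """Toy spherical k-means: cluster unit-normalized points by cosine similarity."""
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    C = X[:k].copy()  # naive deterministic init: first k points as centroids
    for _ in range(iters):
        # Cosine similarity on unit vectors is a dot product; assign to the best match.
        labels = (X @ C.T).argmax(axis=1)
        for j in range(k):
            m = X[labels == j].sum(axis=0)
            n = np.linalg.norm(m)
            if n > 0:
                C[j] = m / n  # the "centroid" is the re-normalized mean direction
    return labels

# Hypothetical data clustered around two directions, (1, 0) and (0, 1).
X = np.array([[1.0, 0.1], [0.1, 1.0], [1.0, -0.1], [-0.1, 1.0]])
labels = spherical_kmeans(X, k=2)
```

Because directions rather than positions define the clusters, points along the same ray end up together regardless of their magnitude.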

Sibylle further explained the practical implications of the mathematical deductions behind spectral clustering, such as how clustering occurs in the embedding space and how to determine the right confidence for each class. But you may ask: is spectral clustering robust to deceptive inputs during training? Sibylle discussed how the model behaves under adversarial attacks and adversarial learning.

She also discussed a problem she faced during training: the model did not return probabilities. If you are wondering why that was a problem, Sibylle talked about it. She additionally spelled out the other challenges she faced during implementation and how she found her way around them.

Sibylle further discussed her results and how they compare with conventional neural networks. She went on to explain how her methods edge out conventional neural networks in terms of classification accuracy, confidence levels, and robustness to adversarial attacks.

One might expect that better results would trigger a mass migration away from traditional neural networks. Sibylle talked about this possibility and the bottlenecks users may face with these methods.

In conclusion, Sibylle shared her thoughts on further research in this field and what she is most excited about. You can follow Sibylle on Twitter @llsebyl. You can also check her website, where she plans to publish tutorials on matrix factorization with binary constraints.

Sibylle Hess

Sibylle Hess is assistant professor of data mining at TU Eindhoven, the Netherlands. She completed her Ph.D. summa cum laude in 2019 at Katharina Morik’s lab at TU Dortmund University, Germany, as an employee of the collaborative research center SFB876. She has worked on the derivation of (certifiably) robust clustering methods under the framework of matrix factorizations, and is currently interested in the implications of the relationship between k-means clustering and deep learning.


Thanks to our sponsors for their support

The developer-first MLOps platform. Build better models faster with experiment tracking, dataset versioning, and model management.
https://wandb.com/