Performance and Scalability of Recommendation Algorithms
Recent developments in algorithms for recommender systems have followed two quite distinct tracks: (1) highly scalable linear algorithms, such as EASE and SLIM, which have been shown to work well on very large, sparse datasets; and (2) deep models, which learn user and item embeddings through a neural network architecture but are computationally intensive to train. A much less well explored approach in the state of the art is the Bayesian one, in which a generative model of the recommendation process is posited and an inference algorithm such as Markov Chain Monte Carlo or Variational Inference is applied to learn full posterior distributions over the model's parameters. As with deep models, a major challenge for Bayesian approaches is the computational cost of training. This project will explore novel recommendation algorithms, focusing on performance and scalability. We will consider whether a Bayesian model with tractable inference is feasible in the recommendation setting, and will examine integrating a Bayesian approach with deep models or with simple scalable models. Beyond performance, other qualities of the recommendations will also be considered, such as diversity and novelty.
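To illustrate why linear methods like EASE scale so well, the following minimal sketch computes the EASE item-item weight matrix in closed form from a user-item interaction matrix. The toy data and the regularization value `lam` are illustrative assumptions; a real dataset would be a large sparse matrix.

```python
import numpy as np

def ease(X, lam):
    """Closed-form EASE: learn an item-item weight matrix B with zero
    diagonal by solving a ridge-regression problem analytically.

    X   : user-item interaction matrix (n_users x n_items), e.g. 0/1 clicks
    lam : L2 regularization strength
    """
    n_items = X.shape[1]
    G = X.T @ X + lam * np.eye(n_items)   # regularized Gram matrix
    P = np.linalg.inv(G)
    B = P / (-np.diag(P))                 # B[i, j] = -P[i, j] / P[j, j]
    np.fill_diagonal(B, 0.0)              # forbid trivial self-recommendation
    return B

# toy example: 4 users x 3 items
X = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)
B = ease(X, lam=1.0)
scores = X @ B  # predicted affinities; rank each user's unseen items by score
```

Training reduces to one Gram-matrix construction and one matrix inversion, which is why such models remain tractable at scales where iterative training of deep models becomes expensive.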