Research talk: Local factor models for large-scale inductive recommendation

Published January 25, 2022, 1:40
Speaker: Tobias Schnabel, Senior Researcher, Microsoft Research Redmond

In many domains, user preferences are similar locally within like-minded subgroups of users, but typically differ globally between those subgroups. Local recommendation models have been shown to substantially improve top-k recommendation performance in such settings. However, existing local models do not scale to large datasets with an increasing number of subgroups, and they do not support inductive recommendations for users not appearing in the training set. Key reasons for this are that subgroup detection and recommendation are implemented as separate steps, or that a full local model is explicitly instantiated for each subgroup. In this talk, we discuss an End-to-end Local Factor Model (ELFM) which overcomes these limitations by combining both steps and incorporating local structures through an inductive bias. Our model can be optimized end-to-end, supports incremental inference, does not require a separate model for each subgroup, and incurs only small memory and computational overhead for incorporating local structures. Empirical results show that our method substantially improves recommendation performance on large-scale datasets with millions of users and items, with a considerably smaller model size. Our user study also shows that our approach produces coherent item subgroups, which could aid in generating explainable recommendations.
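To make the high-level idea concrete, below is a minimal sketch (not the speaker's actual ELFM implementation) of an inductive local factor model: a user is represented only through their item history, a soft assignment to learned subgroups captures local structure, and each subgroup contributes a lightweight projection rather than a full separate model. All names here (`n_subgroups`, `anchors`, `local_proj`) are illustrative assumptions, not taken from the talk.

```python
import torch
import torch.nn as nn


class LocalFactorSketch(nn.Module):
    """Hypothetical sketch of an inductive local factor recommender."""

    def __init__(self, n_items: int, dim: int = 64, n_subgroups: int = 8):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, dim)                    # shared item factors
        self.anchors = nn.Parameter(torch.randn(n_subgroups, dim))    # one anchor per subgroup
        # One lightweight projection per subgroup instead of a full per-subgroup model.
        self.local_proj = nn.Parameter(torch.randn(n_subgroups, dim, dim) * 0.01)

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        """history: (batch, seq) item ids of a possibly unseen user -> (batch, n_items) scores."""
        h = self.item_emb(history).mean(dim=1)                # inductive user vector from history
        weights = torch.softmax(h @ self.anchors.T, dim=-1)   # soft subgroup membership
        # Mix per-subgroup projections by membership, then apply to the user vector.
        mixed = torch.einsum("bt,tde->bde", weights, self.local_proj)
        local_user = torch.einsum("bd,bde->be", h, mixed)
        return (h + local_user) @ self.item_emb.weight.T      # global + local scoring


# Usage: score all items for two previously unseen users given only their histories.
model = LocalFactorSketch(n_items=1000)
scores = model(torch.randint(0, 1000, (2, 5)))
top10 = scores.topk(10, dim=-1).indices
```

Because the user vector is derived entirely from the interaction history and the subgroup mixture is a soft, differentiable weighting, the whole sketch can be trained end-to-end and applied to users absent from the training set, mirroring the properties the abstract attributes to ELFM.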

Learn more about the 2021 Microsoft Research Summit: Aka.ms/researchsummit