Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-24616
Publication type: Conference paper
Type of review: Peer review (publication)
Title: Evaluation of algorithms for interaction-sparse recommendations: neural networks don't always win
Authors: Klingler, Yasamin
Lehmann, Claude
Monteiro, Joao Pedro
Saladin, Carlo
Bernstein, Abraham
Stockinger, Kurt
et al.: No
DOI: 10.21256/zhaw-24616
Proceedings: Proceedings of EDBT 2022
Conference details: 25th International Conference on Extending Database Technology, Edinburgh (online), 29 March - 1 April 2022
Issue Date: Mar-2022
Publisher / Ed. Institution: OpenProceedings
ISBN: 978-3-89318-086-8
Language: English
Subjects: Recommender system; Machine learning; Neural network
Subject (DDC): 006: Special computer methods
Abstract: In recent years, top-K recommender systems with implicit feedback data have gained interest in many real-world business scenarios. In particular, neural networks have shown promising results on these tasks. However, while traditional recommender systems are built on datasets with frequent user interactions, insurance recommenders often have access to very few user interactions, as people only buy a small number of insurance products. In this paper, we shed new light on the problem of top-K recommendations for interaction-sparse recommender problems. In particular, we analyze six recommender algorithms: a popularity-based baseline, which we compare against two matrix factorization methods (SVD++, ALS), one neural network approach (JCA), and two combinations of neural networks and factorization machines (DeepFM, NeuFM). We evaluate these algorithms on six interaction-sparse datasets and on one dataset with a less sparse interaction pattern to elucidate the unique behavior of interaction-sparse datasets. In our experimental evaluation based on real-world insurance data, DeepFM shows the best performance, followed by JCA and SVD++, which suggests that neural network approaches are the dominant technologies. For the remaining five datasets, however, we observe a different pattern: overall, the matrix factorization method SVD++ is the winner, and, surprisingly, the simple popularity-based approach comes second, followed by the neural network approach JCA. In summary, our experimental evaluation demonstrates that on interaction-sparse datasets, matrix factorization methods generally outperform neural network approaches. As a consequence, traditional, well-established methods should be part of the portfolio of algorithms for solving real-world interaction-sparse recommender problems.
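The popularity-based baseline discussed in the abstract can be sketched in a few lines: with implicit feedback, each (user, item) pair records a purchase, and the baseline simply recommends the K globally most-purchased items to everyone. The item names and toy data below are illustrative assumptions, not from the paper.

```python
from collections import Counter

def popularity_top_k(interactions, k):
    """Recommend the k most frequently bought items overall.

    interactions: iterable of (user, item) pairs (implicit feedback).
    Returns a list of items ranked by interaction count, most popular first.
    """
    counts = Counter(item for _, item in interactions)
    return [item for item, _ in counts.most_common(k)]

# Hypothetical implicit-feedback data: each pair means "user bought item".
data = [("u1", "car"), ("u2", "car"), ("u3", "home"),
        ("u1", "home"), ("u2", "life"), ("u4", "car")]
print(popularity_top_k(data, 2))  # → ['car', 'home']
```

Despite ignoring users entirely, this non-personalized baseline comes second overall on the interaction-sparse datasets in the paper's evaluation, which is why it belongs in any comparison.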
URI: https://digitalcollection.zhaw.ch/handle/11475/24616
Fulltext version: Published version
License (according to publishing contract): CC BY-NC-ND 4.0: Attribution - Non commercial - No derivatives 4.0 International
Departement: School of Engineering
Organisational Unit: Institute of Applied Information Technology (InIT)
Published as part of the ZHAW project: NQuest – Natural Language Query Exploration System
Appears in collections: Publikationen School of Engineering

Files in This Item:
File: 2022_Klingler-etal_Recommender-System_EDBT.pdf (440.43 kB, Adobe PDF)
