Conference paper Open Access

Intelligent User Interfaces for Music Discovery: The Past 20 Years and What's to Come

Peter Knees; Markus Schedl; Masataka Goto

Providing means to assist the user in finding music is one of the original motivations underlying the research field known as Music Information Retrieval (MIR). Accordingly, the first edition of ISMIR in 2000 already called for papers addressing the topic of "User interfaces for music IR". Since then, the way humans interact with technology to access and listen to music has changed substantially, not least driven by advances in MIR and related research fields such as machine learning and recommender systems. In this paper, we reflect on the evolution of MIR-driven user interfaces for music browsing and discovery over the past two decades. We argue that three major developments have transformed and shaped user interfaces during this period, each connected to a phase of new listening practices: first, connected to personal music collections, intelligent audio processing and content description algorithms that facilitate the automatic organization of repositories and finding music according to sound qualities; second, connected to collective web platforms, the exploitation of user-generated metadata pertaining to semantic descriptions; and third, connected to streaming services, the collection of online music interaction traces on a large scale and their exploitation in recommender systems. We review and contextualize work from ISMIR and related venues from all three phases and extrapolate current developments to outline possible scenarios of music recommendation and listening interfaces of the future.
Files (2.1 MB): ismir2019_paper_000003.pdf (md5:5dd9e4d4ac2dcec8edb419f1db62913e)
