How Spotify's Recommendation Algorithm Works
A 7-minute read
Spotify's algorithm does not just analyze songs. It analyzes your behavior and compares it with listeners who have similar patterns.
In 2015, Spotify launched Discover Weekly — 30 songs, delivered every Monday, tailored to each listener. By the end of 2015, users had streamed over 1.7 billion tracks from it, per a Time magazine report from December 2015. By 2025, the playlist had surpassed 100 billion tracks streamed in total, according to Spotify’s own newsroom. No DJ had curated those playlists. No team of music experts had chosen them. The whole thing was automated, driven by a system that had never listened to a single song and had no idea what music actually sounds like.
The short answer
Spotify uses three overlapping systems to recommend music: it finds people with similar taste to yours and borrows their libraries, it reads what music journalists and bloggers write about songs to understand their style and genre, and it directly analyzes the audio to measure tempo, energy, and danceability. Discover Weekly, which lands every Monday with 30 songs, is the most visible output of this machine, and the reason it works is that hundreds of millions of users are constantly training it.
The full picture
Collaborative filtering: your taste twin
The oldest and most powerful piece of the system is called collaborative filtering — the same core technique behind most recommendation algorithms on the internet. The core idea: if two people share a lot of listening history, they probably share taste. Songs one of them loves are worth recommending to the other.
Spotify reported over 600 million monthly active users as of early 2024 (per its quarterly reports). Each one is, in effect, a data point in a vast map of music taste. The algorithm finds people whose listening patterns look like yours and surfaces what they’re playing that you haven’t heard yet.
This is why Discover Weekly can recommend a band you’ve never searched for, from a country you’ve never explored, in a language you don’t speak. The algorithm doesn’t need to know anything about that band. It just knows that every person whose listening behavior resembles yours has been listening to this band recently.
Netflix does the same thing: “People who watched X also watched Y” isn’t a statement about the similarity of X and Y. It’s a statement about the similarity of the people who watched X.
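The core of the idea fits in a few lines of Python. Everything below is a toy: the users, play counts, and song names are invented, and a production system operates over hundreds of millions of listeners with far more sophisticated similarity models.

```python
# Minimal sketch of user-user collaborative filtering over a toy
# play-count matrix. All data here is illustrative, not Spotify's.
from math import sqrt

# Rows: users, values: {song: play count}.
plays = {
    "you":   {"song_a": 12, "song_b": 5, "song_c": 0},
    "twin":  {"song_a": 10, "song_b": 6, "song_c": 9},
    "other": {"song_a": 0,  "song_b": 1, "song_c": 2},
}

def cosine(u, v):
    """Cosine similarity between two users' play-count vectors."""
    songs = set(u) | set(v)
    dot = sum(u.get(s, 0) * v.get(s, 0) for s in songs)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(target, plays, k=1):
    """Recommend what the most similar user plays that the target hasn't."""
    others = {name: vec for name, vec in plays.items() if name != target}
    best = max(others, key=lambda name: cosine(plays[target], others[name]))
    unheard = [s for s, n in plays[best].items()
               if n > 0 and plays[target].get(s, 0) == 0]
    return sorted(unheard, key=lambda s: -plays[best][s])[:k]

print(recommend("you", plays))  # → ['song_c']
```

Note that `recommend` never inspects the songs themselves; it only compares listening behavior, which is exactly why it can surface a band from a country you've never explored.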
Natural language processing: reading the web
Collaborative filtering has a limitation: it can’t tell you anything about a brand new song, because nobody has listening history with it yet.
Spotify’s second system addresses this by reading. Not listening. Reading. The company’s algorithms crawl the web continuously, pulling in music blog posts, reviews, playlist descriptions, and social media discussions. They use natural language processing (NLP) to extract meaning from text the way a human music nerd would.
If a dozen blogs describe a new band as “moody indie folk with Sufjan Stevens vibes and a hint of Phoebe Bridgers,” the algorithm learns that this band occupies a certain location in the music landscape, even before any Spotify user has played them. It can then recommend the artist to people who listen to Sufjan Stevens and Phoebe Bridgers.
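One hedged way to picture this: reduce each artist to a bag of descriptive terms harvested from text about them, then compare bags. The vocabulary, snippets, and scoring below are invented for illustration; real NLP pipelines use learned embeddings rather than word counts.

```python
# Toy version of placing a new artist in the music landscape by reading
# text about them. Descriptor list and snippets are made up.
from collections import Counter

DESCRIPTORS = {"moody", "indie", "folk", "acoustic", "euphoric", "dance"}

def term_profile(snippets):
    """Count how often each known descriptor appears across text snippets."""
    words = Counter()
    for snippet in snippets:
        for word in snippet.lower().split():
            if word.strip(".,") in DESCRIPTORS:
                words[word.strip(".,")] += 1
    return words

def overlap(a, b):
    """Shared descriptor mass between two term profiles."""
    return sum(min(a[t], b[t]) for t in set(a) | set(b))

new_band = term_profile(["Moody indie folk with acoustic warmth",
                         "indie folk at its most moody"])
sufjan   = term_profile(["acoustic indie folk, often moody and quiet"])
techno   = term_profile(["euphoric dance music"])

# The new band's text profile overlaps far more with the folk artist
# than with the dance act, before anyone has streamed a single note.
print(overlap(new_band, sufjan), overlap(new_band, techno))  # → 4 0
```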
Audio analysis: what the algorithm actually measures
The third system goes directly to the audio files and measures them.
Spotify’s audio analysis model extracts dozens of properties from each track. The public-facing ones include:
- Tempo: beats per minute
- Energy: a perceptual measure of intensity, combining loudness and dynamic range
- Danceability: how suitable the track is for dancing, based on rhythm stability and beat strength
- Valence: the musical “positivity.” High valence sounds happy or euphoric, low valence sounds sad or tense.
- Acousticness: a confidence measure that the track is acoustic rather than electronically produced
- Instrumentalness: the likelihood that a track contains no vocals
These measurements let Spotify build a mathematical profile of any song and find others that are similar along multiple dimensions simultaneously — a task powered by the same neural network architectures used across modern AI.
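In miniature, "similar along multiple dimensions" just means "nearby in feature space." The feature values below are invented, and real systems use many more dimensions, but the geometry is the same.

```python
# Toy feature-space similarity: each track is a vector of audio features,
# and "similar" means a small distance between vectors.
from math import dist  # Euclidean distance, Python 3.8+

tracks = {
    "sad_ballad":   (0.2, 0.1, 0.15),   # (valence, energy, danceability)
    "club_banger":  (0.8, 0.95, 0.9),
    "breakup_song": (0.25, 0.2, 0.2),
}

def nearest(name, tracks):
    """Find the track closest to `name` in feature space."""
    target = tracks[name]
    others = {k: v for k, v in tracks.items() if k != name}
    return min(others, key=lambda k: dist(target, others[k]))

print(nearest("sad_ballad", tracks))  # → breakup_song
```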
How Discover Weekly gets assembled
Discover Weekly arrives every Monday morning with exactly 30 songs. The 30-song count is deliberate. In testing, users found shorter playlists unsatisfying and longer ones overwhelming.
For each user, the algorithm does roughly this: it identifies the songs you’ve played most this week (your “taste profile” for the moment), then finds other users who share those preferences, pulls in what those users have been playing that you haven’t heard, runs the candidates through the audio analysis filter to ensure they fit your current mood and energy preferences, and serves you the top 30 results.
The model re-runs for every user, every week: hundreds of millions of personalized playlists, regenerated weekly.
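The steps above can be sketched as a single toy pipeline. Every function and threshold here stands in for a large production system; the point is the shape of the flow, not the details.

```python
# Toy end-to-end version of the weekly pipeline: taste profile ->
# neighbors -> candidates -> audio-fit ranking -> top n. All data invented.

def assemble_playlist(you, users, features, n=2):
    # 1. Taste profile: the songs you've played this week.
    profile = set(users[you])
    # 2. Neighbors: users who share at least one song with you.
    neighbors = [u for u in users if u != you and profile & set(users[u])]
    # 3. Candidates: what neighbors play that you haven't heard.
    candidates = {s for u in neighbors for s in users[u]} - profile
    # 4. Audio-fit: rank candidates by closeness to your average energy.
    your_energy = sum(features[s] for s in profile) / len(profile)
    ranked = sorted(candidates, key=lambda s: abs(features[s] - your_energy))
    # 5. Serve the top n.
    return ranked[:n]

users = {
    "you":  ["a", "b"],
    "twin": ["a", "b", "c", "d"],
    "far":  ["e"],
}
features = {"a": 0.3, "b": 0.4, "c": 0.35, "d": 0.9, "e": 0.1}

print(assemble_playlist("you", users, features))  # → ['c', 'd']
```

Note how "far", who shares no listening history with you, contributes nothing: that is collaborative filtering's blind spot in action.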
The taste profile problem
The algorithm is designed to learn from listening behavior. This creates a few quirks.
If your kids commandeer your phone and play Moana eight times, Spotify notices. Suddenly your Discover Weekly has animated film soundtracks. This is a known frustration, and Spotify has introduced family plans and separate profiles partly for this reason.
The rut problem is more fundamental. Collaborative filtering, by design, recommends what similar people like. If your taste community all share the same blind spots, the algorithm will never surface music outside those blind spots. You can get stuck in an ever-deepening bubble of the same three subgenres.
Spotify’s fix: deliberately inject unfamiliar artists at a low rate. Not too much. Users churn if Discover Weekly feels foreign. But enough to occasionally crack open the bubble.
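One common way to implement this kind of low-rate injection is to reserve a small fraction of each playlist for out-of-bubble picks. The 10% rate and pool structure below are assumptions for illustration, not Spotify's published numbers.

```python
# Sketch of exploration injection: fill a playlist mostly from the
# familiar pool, with a few wildcards from outside the taste bubble.
import random

def with_exploration(familiar, unfamiliar, n=30, explore_rate=0.1, rng=None):
    """Blend familiar-cluster picks with a small dose of wildcards."""
    rng = rng or random.Random(0)
    n_explore = max(1, round(n * explore_rate))   # at least one wildcard
    picks = rng.sample(unfamiliar, min(n_explore, len(unfamiliar)))
    picks += familiar[: n - len(picks)]
    rng.shuffle(picks)                            # hide the seams
    return picks

playlist = with_exploration([f"fam_{i}" for i in range(50)],
                            ["wild_x", "wild_y", "wild_z"])
print(len(playlist))  # → 30
```

Tuning `explore_rate` is the whole game: too high and the playlist feels foreign, too low and the bubble never cracks.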
What Spotify actually knows about you
The public-facing version of Spotify’s algorithm is about music taste. The private reality is considerably more detailed.
Spotify tracks not just what you play, but how you play it. It knows whether you skip a song in the first 30 seconds or listen all the way through. It knows whether you add it to a playlist, share it, look up the lyrics, or replay it immediately. It knows what time of day you listen, whether you’re using headphones or a speaker, and whether you’re listening on mobile (probably moving) or desktop (probably sedentary). These behavioral signals feed into what Spotify calls streaming intelligence: a portrait of your listening not just as preference, but as behavior.
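A simple way to think about how such signals combine: each behavior carries a weight, and a (user, track) pair accumulates a score. The signal names and weights below are invented to illustrate the idea; they are not Spotify's actual model.

```python
# Toy implicit-feedback scoring: weighted behavioral signals summed
# into one engagement score per (user, track) pair. Weights are made up.

WEIGHTS = {
    "completed":     2.0,   # listened all the way through
    "replayed":      3.0,   # played again immediately
    "playlisted":    4.0,   # added to a playlist
    "shared":        4.0,
    "skipped_early": -3.0,  # skipped in the first 30 seconds
}

def engagement_score(events):
    """Sum weighted behavioral signals; unknown events count as neutral."""
    return sum(WEIGHTS.get(e, 0.0) for e in events)

print(engagement_score(["completed", "replayed", "playlisted"]))  # → 9.0
print(engagement_score(["skipped_early"]))                        # → -3.0
```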
This data is valuable beyond recommendations. Spotify sells Marquee campaigns to record labels: targeted push notifications to listeners who have already shown interest in an artist, precisely when the algorithm predicts they’re ready to engage. A label releasing an album by an artist you’ve listened to three times in the past month might pay Spotify to notify you on the day of release. The recommendation algorithm and the advertising product are, at some level, the same system.
There’s a harder question lurking here: Spotify’s algorithm shapes what becomes popular. An artist who gets recommended to millions of users has a massive advantage over one who doesn’t — regardless of relative quality. This creates feedback loops: popular artists get recommended, get more plays, get more recommendations. Newer or less-famous artists without listening history are essentially invisible to collaborative filtering. Spotify’s editorial playlists and algorithmic injections are partly an attempt to counterbalance this — but they’re also curated by humans with their own preferences and relationships with labels.
Why it matters
The design of Spotify’s algorithm is a blueprint for how recommendation systems work across the internet. Netflix, YouTube, TikTok, Amazon: every platform that tells you what to consume next is running some version of these same three systems: collaborative filtering (what do people like you like?), content-based analysis (what is this thing actually like?), and text/metadata analysis (how does the world describe this thing?). On social platforms, the algorithms get even more complex, since content from strangers competes with posts from people you follow.
Understanding this matters because these systems shape what you discover, and by extension what becomes popular. A band that gets recommended to hundreds of millions of listeners doesn’t need radio play. An artist that falls outside Spotify’s recommendation clusters, regardless of talent, will struggle to find an audience. The algorithm is not neutral.
Common misconceptions
“Spotify’s algorithm understands music the way humans do.” It doesn’t appreciate melody, emotion, or meaning. It identifies statistical patterns in audio waveforms and listening behavior. The output often seems human, but the mechanism is not.
“Skipping a song tells the algorithm you hate it.” Spotify tracks skip timing, replay behavior, playlist additions, and shares to build a nuanced signal. A song you skip every time it appears is strong negative signal; a song you skip at the 10-second mark once might just mean you weren’t in the mood.
“More listening = more accurate recommendations.” Up to a point, yes. But your recent listening matters far more than your listening history from two years ago. What you play this week shapes next week’s Discover Weekly more than your all-time stats.