Cyanite, an AI technology company specializing in music tagging and similarity search, has recently turned its tools inward, using its technology to explore how the sound of electronic music has evolved over the past few years.
By analyzing the annual Resident Advisor charts from the last three years, Cyanite looked beyond surface trends to ask: Is the sound of the scene shifting? What’s driving the changes and what could come next? The goal wasn’t just to interpret the past, but to help the industry better understand where music is heading.
You can read their full report here: The Evolution of Electronic Music (2022-2024) – AI Data Analysis with RA’s Top Tracks
To unpack the findings and the philosophy behind them, we spoke with Jakob Höflich, co-founder and CMO of Cyanite, about creativity, AI, emotional shifts in club music and what it means to make sense of music in an age of abundance.
The report distills an abundance of data into one comprehensive picture. What drives you to conduct these reports?
Honestly, we do them because they’re fun! We’ve previously analyzed German club sounds for FazeMag, and The Guardian used Cyanite data to dig into all 1,371 Eurovision finalists since 1956.
Music is deeply emotional, but stepping back and looking at it through data can be incredibly refreshing. It lets us test our hunches, challenge assumptions, and sometimes debunk long-held beliefs. We also see these reports as a way to show how metadata isn’t just for search: it’s becoming a strategic asset, as I’ve previously explored on Synchblog.
One German publisher, for example, realized through our analysis that their catalog lacked epic, cinematic music despite frequent sync demand. It was the first time they aligned catalog development with data, not gut feeling. That’s exactly the kind of value we’re passionate about delivering.
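The kind of catalog gap analysis described above can be illustrated with a minimal sketch: count how often each mood tag appears across a tagged catalog and flag in-demand moods that are missing. All data, field names, and the list of in-demand moods here are hypothetical, not Cyanite's actual API or output.

```python
from collections import Counter

# Hypothetical catalog: each track carries AI-generated mood tags,
# roughly as an auto-tagger might produce. All values are illustrative.
catalog = [
    {"title": "Track A", "moods": ["uplifting", "energetic"]},
    {"title": "Track B", "moods": ["dark", "driving"]},
    {"title": "Track C", "moods": ["uplifting", "dreamy"]},
]

# Moods that sync briefs frequently request (assumed for illustration)
sync_demand = ["epic", "cinematic", "uplifting", "dark"]

# Count how often each mood appears across the catalog
mood_counts = Counter(m for track in catalog for m in track["moods"])

# Flag in-demand moods that are absent from the catalog
gaps = [m for m in sync_demand if mood_counts[m] == 0]
print(gaps)  # → ['epic', 'cinematic']
```

In this toy example, "epic" and "cinematic" never appear in the catalog despite being in demand, which is exactly the sort of mismatch between demand and holdings that the publisher in the anecdote discovered.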
Looking at Cyanite’s Electronic Music Report, one standout finding is that producers have returned to favoring vocals over instrumental tracks. We are also seeing increasingly darker moods. Why do you think that is?
I see two things happening. First, a counter-movement: after the high-energy euphoria post-Covid, producers and listeners may now be leaning into something more introspective. Second, music has always helped us process what’s going on in the world, and lately the world has felt heavy.
From climate anxiety to political instability, we’ve all been navigating what some call a “polycrisis.” That energy inevitably makes its way into the music. And with younger generations reshaping club culture, adding more emotion, more eclecticism, and fewer rules, it’s not surprising we see both a sonic and emotional evolution in the data.
Technology now allows for very specific tags and deep dives into the data, as we see in the mood analysis. Do you think this helps in finding the right track?
Yes, but it’s not the whole story. Accurate tags are essential to narrow down the options, especially with catalogs of millions of tracks. But the “right” track? That still requires a touch of magic.
Even the most experienced music supervisor or DJ knows that data can guide, but the final selection comes down to intuition, timing, and emotion. That’s why we always say Cyanite sits at the intersection of art and science, or more precisely, emotion and engineering.
Do you think technology has had an impact on catalog discoverability, given the growth the report shows in more experimental subgenres like ambient, dubstep, and IDM?
Technology enables us to better manage musical scale, but it doesn’t drive the creative impulse. Artists and scenes shape genres, not algorithms.
That said, tools like Cyanite help elevate the discoverability of niche or experimental music by making it findable through more than just genre tags, such as mood, energy, or instrumentation. And that opens up opportunities for music that might otherwise stay buried.
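Discovery across dimensions other than genre can be sketched as a simple filter over a tagged library. The field names, values, and the `discover` helper below are assumptions for illustration, not Cyanite's actual data model.

```python
# Hypothetical tagged library: each track has genre, mood, and a
# normalized energy score between 0 and 1. All values are illustrative.
tracks = [
    {"title": "Deep Drift", "genre": "ambient", "mood": "introspective", "energy": 0.2},
    {"title": "Night Run",  "genre": "techno",  "mood": "dark",          "energy": 0.9},
    {"title": "Low Tide",   "genre": "idm",     "mood": "introspective", "energy": 0.35},
]

def discover(tracks, mood, energy_max):
    """Return tracks matching a mood below an energy ceiling, regardless of genre."""
    return [t for t in tracks if t["mood"] == mood and t["energy"] <= energy_max]

results = discover(tracks, mood="introspective", energy_max=0.5)
print([t["title"] for t in results])  # → ['Deep Drift', 'Low Tide']
```

Note that the query surfaces an ambient track and an IDM track together: by searching on mood and energy rather than genre, tracks from experimental subgenres become findable alongside more mainstream material.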
How has technology changed the game for the music industry?
If disruption were a character, it would have a soft spot for music. From Napster to TikTok to AI, we’re often the first to feel the tech tremors.
And yet, the music industry remains rightfully cautious. Many people get into music to work with artists and culture, not with tech. That’s not a bad thing; it keeps this business rooted in human connection. But I also see so many manual tasks on my desk that could be automated with AI.
What’s really changed is the scale of content. Chartmetric recently noted that 28,700 songs are uploaded daily, and that number is only going up. It’s overwhelming. At the same time, I hear more people say they’re rediscovering radio or curated playlists because algorithmic fatigue is real. So while we absolutely need AI to navigate the volume, we also need to stay intentional about how we use it. The future, I think, lies in conscious technology.
Obviously Cyanite’s technology involves AI, but as Synchtank always says, AI is great as a tool, not a replacement for creativity. Do you have any thoughts on this topic?
Yes, though the line is blurry. I often return to something Pharrell once said after hearing Maggie Rogers perform “Alaska”: true creativity requires deep self-honesty. That’s not something you can outsource to a prompt.
AI can assist and accelerate creativity, but it can’t replace the inner work that makes something truly authentic. Emotion is at the heart of music, and emotion is still, thankfully, very human. AI will mimic feelings, but it will never have them. And that’s why, in my opinion, real creativity can’t be replicated by a machine.
Can you explain why metadata and all music data is important for the music industry?
Metadata is what makes music discoverable, usable, and fair. It’s the music industry’s equivalent of Google, and without it, you can’t search, license, or even ensure proper payment for artists and rights holders. It also makes the sheer masses of today’s music manageable.
At Cyanite, we’re building a system that helps create a universal language for music discovery, one that not only supports search, but also enhances sync, recommendation, and rights management. It’s a huge task, but one we’re deeply passionate about, because when music is well-tagged, it reaches the right ears at the right time.
So far, we’ve tagged over 35 million songs, a scale that reflects just how central Cyanite has become in the tagging space. It’s fair to say we’ve helped shape the industry standard for AI-powered music tagging.
And when we zoom out from discovery to copyright, metadata becomes just as crucial: it’s the key to fair compensation. We’re proud to be a member of DDEX, contributing to metadata standards that reduce friction, prevent fraud, and ultimately protect creativity.
Lastly, through your work have you discovered an artist, track, or sync that you thought was an amazing find?
Yes! Someone on our team dropped a one-shot Jungle video in our Slack, such a vibe. It instantly took me back to seeing them live, opening for Massive Attack in Berlin.
That’s the magic of music: a single moment can bring back a whole memory. And that’s exactly the kind of connection we want to help others make with Cyanite.
If you’re interested in learning how to embrace technology and improve your catalog discoverability, with a focus on metadata as a strategic asset, then we can definitely help. You can reach out to us here.