Technology’s ability to comprehend music has grown incredibly sophisticated in the digital era. Ben Gilbert explores the emergent tools and profiles two companies specializing in MIR (music information retrieval) to assess how it can take the industry forward.
“I’ve never been wrong. I used to work in the record store,” declared James Murphy on LCD Soundsystem’s landmark 2002 recording ‘Losing My Edge’. For many years – decades, in fact – consumers faced the unenviable prospect of encountering someone who “had everything before anyone” when trying to ascertain the identity of a piece of music. A few years earlier, Jack Black had portrayed a figure with a similarly commanding presence on the cinema screen in ‘High Fidelity’. Barry Judd’s incredulity at being confronted by either indifference to ‘Blonde On Blonde’ or enthusiasm for Stevie Wonder’s ‘I Just Called To Say I Love You’ felt both terrifying and believable.
But technology’s ability to comprehend music in the digital era has made our understanding of this creative process so sophisticated that the notion of going into a shop to whistle a tune you heard fleetingly on the radio is now a distant memory. While music recognition apps such as Shazam have become embedded within global culture over the last 20 years, Artificial Intelligence (AI) and the continued advancement of high-performance data processing are pushing our ability to document, manipulate and market music into a realm that might cause Judd to shudder, notwithstanding the recent renaissance in vinyl sales.
A deal was announced earlier this month that will see SoundCloud purchase Musiio, the Singapore-based AI music curation company co-founded in 2018 by Hazel Savage. Reporting the news, Music Business Worldwide highlighted the tools Musiio deploy to “listen” to music at scale. “You can use this process to replicate anything the human ear might be able to recognize, such as the BPM and genre, right down to how much of a hit something might be. Our new technology is looking for hits, virality, talent and melodic familiarity as well as a bunch of other factors that make up great music,” explained Savage, formerly of Shazam, Pandora and Universal Music.
Eliah Seton, SoundCloud’s President, praised the “brilliant team of innovators at Musiio” and suggested he was expecting great things from the deal. “SoundCloud hosts more music from more creators than any platform on the planet. Acquiring Musiio accelerates our strategy to better understand how that music is moving in a proprietary way, which is critical to our success,” he explained in a press statement.
Vision to create a universal translation intelligence for music
Two other companies that specialize in MIR (music information retrieval) are AIMS and Cyanite, who use AI music tagging – adding descriptive metadata identifiers to songs – and sonic similarity – search functionality that enhances the discoverability of material with matching audio characteristics – to extract value from catalogs. The use cases here are numerous. Music rights holders can deploy the technology to tag vast collections, maximizing their assets and monetizing not just the hits but every song. It also promises to be an incredibly powerful tool for sync and music licensing, particularly at a time when demand is exploding across the entertainment sector and beyond.
Jakob Höflich, Co-Founder and CMO of Cyanite, spoke at Berlin’s BIMM Institute last year, where he outlined four key applications of AI in music: AI Music Creation / Search & Recommendation / Auto-tagging / AI Mastering. To produce a real-world demonstration of the third category, the technology was used to analyze Peggy Gou’s ‘Starry Night’. It broke the South Korean DJ’s 2019 track down into 10 core components: BPM / Key / Mood / Main Genre / Sub Genre / Voice / Emotional Profile / Instruments / Energy Level / Musical Era. The results they produced feel detailed, accurate and informative, turning the machine’s analysis into a tangible representation of the music’s character.
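To make the ten-component breakdown concrete, here is a minimal sketch of the kind of structured record an auto-tagging system might emit for a single track. The field names mirror the components listed above; every value is invented for illustration and is not Cyanite’s actual output.

```python
# Illustrative only: a structured record for one analyzed track.
# All values below are hypothetical, not real analysis results.
from dataclasses import dataclass


@dataclass
class TrackAnalysis:
    bpm: float
    key: str
    mood: list
    main_genre: str
    sub_genre: str
    voice: str              # e.g. "instrumental", "female", "male"
    emotional_profile: str
    instruments: list
    energy_level: str       # e.g. "low", "medium", "high"
    musical_era: str


example = TrackAnalysis(
    bpm=122.0, key="A minor", mood=["uplifting", "dreamy"],
    main_genre="electronic", sub_genre="house", voice="female",
    emotional_profile="euphoric",
    instruments=["synthesizer", "drum machine"],
    energy_level="medium", musical_era="2010s",
)

# The record carries exactly the ten core components named in the article.
assert len(example.__dataclass_fields__) == 10
```

A fixed schema like this is what “static tagging” produces; the dynamic, free-text search discussed later in the piece is built on top of records of this shape.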
In an interview with Synchtank, Höflich outlined the full scale of the company’s ambitions: “Cyanite’s vision is to create a universal translation intelligence for music – an intelligence that can translate music into anything and anything into music. Why is this important for music rights holders? Because the way someone searches for music depends a lot on their position in the industry. Music supervisors use a different language than film producers. Film producers search for music differently than sound branding agencies or YouTubers, and so on. Solving this challenge is at the core of Cyanite’s philosophy and is reflected in its unique range of products.”
“It’s about much more than tagging or sonic search – it’s about bringing human needs and artificial intelligence together in the best possible way.”
– Jakob Höflich, Cyanite
He sees opportunities across the cultural landscape for products operating in this space, commenting: “On a larger scale, any music rights holder who offers their content via a searchable platform needs to be able to respond to the different search criteria and requests mentioned above with a music search that actually takes into account where their customers are coming from. That’s where Cyanite comes in. It’s about much more than tagging or sonic search – it’s about bringing human needs and artificial intelligence together in the best possible way. It’s about picking up customers where their music search starts – whether it’s for an ad, a movie, a podcast, a game, or an Instagram reel.”
Music tagging technology has evolved “rapidly” over time
According to Höflich, music tagging technology has evolved “rapidly” over time. He expects its role to become ever more profound in the years ahead. “Cyanite started in 2017 with a simple set of eight moods and now offers over 131 as part of a total of 2,285 tags in 25 output classes. However, our latest research goes beyond the idea of a fixed set of tags and aims to understand text dynamically. Just as we search Google by simply typing in what comes to mind, Cyanite’s latest algorithms understand the semantics of the user’s input to recommend music. This means that static tagging, which is important and will retain its relevance, is complemented by a dynamic approach that takes into account the different languages and starting points in music search.”
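The dynamic approach Höflich describes – understanding the semantics of a free-text query rather than matching a fixed tag list – generally works by placing queries and tracks in a shared vector space and ranking by similarity. The sketch below illustrates that idea with a toy bag-of-words embedding and an invented three-track catalog; real systems use learned text and audio embeddings, and nothing here reflects Cyanite’s actual algorithms.

```python
# Toy illustration of semantic music search: embed query and track
# descriptions as word-count vectors, rank tracks by cosine similarity.
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Stand-in for a learned text embedding: simple word counts."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Hypothetical catalog: each track described by its static tags.
catalog = {
    "track_a": "uplifting house synthesizer euphoric",
    "track_b": "dark ambient drone cinematic tense",
    "track_c": "acoustic folk guitar warm intimate",
}


def search(query: str, top_n: int = 2) -> list:
    q = embed(query)
    ranked = sorted(catalog,
                    key=lambda t: cosine(q, embed(catalog[t])),
                    reverse=True)
    return ranked[:top_n]


print(search("tense cinematic underscore"))  # ranks track_b first
```

The point of the sketch is the architecture, not the math: the static tags remain the indexed substrate, while the query side is free text, which matches Höflich’s description of static tagging being “complemented by a dynamic approach.”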
While Cyanite are based in Germany, AIMS operate out of the Czech Republic. Both are seeking to harness the possibilities of AI to move sectors of the music industry, including sync, forward. The AIMS team has strong representation from the world of production music – co-founder Martin Nedved ran Studio Fontana, specialists in this field, for more than a decade – and they see a range of positives for potential collaborators.
“Perhaps the most valuable benefit is the amount of time our tools save music, creative and production teams. Music searches can be done in 90% less time when using AIMS, enabling clients to spend more time on business development and nurturing existing clients, knowing music pitches are taken care of,” Nedved told Synchtank. He also explained in more detail how the two core AI-based functions within their product are able to amplify rights owners’ search results for their customers.
“AI Similarity Search allows users to paste a link (from video/music streaming platforms) or upload a music file into the search bar of a music platform instead of typing in keywords. Results are then generated from the music available on their platform based on AI analysis of the track. Our Auto-Tagging AI technology generates genre, instrumentation, mood etc. tags, based on individual track audio files. This product is extremely beneficial to rights owners as it enables them to tag whole catalogs quickly, accurately, and consistently,” he said.
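At its core, the similarity search Nedved describes amounts to extracting a numeric feature vector from each track’s audio and returning the nearest neighbours of a reference track. The following sketch shows that nearest-neighbour step with hand-written vectors; it is an assumption-laden illustration of the general technique, not AIMS’s implementation, and the feature values are invented.

```python
# Hedged sketch of audio-similarity ranking: nearest neighbours over
# per-track feature vectors. Vectors here are hand-written, not
# derived from real audio analysis.
import math

# Hypothetical features: (normalized BPM, energy, acousticness)
features = {
    "reference": (0.62, 0.80, 0.10),
    "club_mix":  (0.60, 0.85, 0.05),
    "ballad":    (0.30, 0.20, 0.90),
    "ambient":   (0.20, 0.10, 0.60),
}


def distance(a, b) -> float:
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def most_similar(track: str, n: int = 2) -> list:
    ref = features[track]
    others = [t for t in features if t != track]
    return sorted(others, key=lambda t: distance(features[t], ref))[:n]


print(most_similar("reference"))  # club_mix is the closest match
```

In a production system the “paste a link or upload a file” step Nedved mentions would feed an audio-analysis model that produces these vectors automatically; the ranking logic afterwards is conceptually this simple.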
“Sonic similarity democratizes music search across the board”
Like his counterparts at Cyanite, Nedved has a deep-seated belief in the systems and processes the start-up has developed. What makes their product unique? “The simple answer is that it really works! So many different AI products offer similar solutions but you can hear the difference in the results AIMS uncovers. AIMS was conceived by people who own publishing and sync companies, so we have a deep understanding of the industry and the complexities and intricacies of music search as it applies to music users. The idea for AIMS came about because we were desperate to offer our own clients functionality to make music search simple, accurate and fun – so we sought to build the best product possible to satisfy our own wish lists.”
Specific elements that the company are keen to highlight include features that allow users to identify and highlight individual components within a piece of music. There is enhanced functionality around playlists too, an area that continues to display particular influence and vitality across the digital music sphere. Combined, Nedved believes the full scope of AI in conjunction with these tools has levelled the playing field, while also attracting “some of the most technologically savvy companies across the sync industry”. He cites the positive feedback from a roll call of clients, which include Universal Production Music, Extreme Music, Big Noise, Partisan Records and Songtradr.
“Sonic similarity democratizes music search across the board. AI isn’t subjective – it doesn’t have a favorite composer, doesn’t mind historic music tagging and doesn’t care about the size of a catalog.”
– Martin Nedved, AIMS
“Sonic similarity democratizes music search across the board. AI isn’t subjective – it doesn’t have a favorite composer, doesn’t mind historic music tagging and doesn’t care about the size of a catalog. AI analyses songs based on a multitude of criteria and brings back the best-suited music, making it a great companion to any content creator, music supervisor and editor. We see applications of our technology across the music industry – be it a platform, aggregator, live music or B2C. AIMS is accessible to everyone who makes discovering music their business,” commented Nedved.
Both AIMS and Cyanite are integrated with Synchtank via API, which means customers can run everything from one centralized system rather than using multiple platforms or tools. Coupled with the other functionality provided by the Synchtank platform – rights management, pitching and playlisting tools, automated and subscription-based licensing, presenting catalogs to different audiences and buyers – it’s a dynamic proposition that has proved beneficial for a range of partners. These include rights holder customers, such as labels, publishers and production music libraries. Equally, it has been deployed by rights user clients, including production companies searching third-party catalogs in their Synchtank system to find the right tracks for a project.
How can this technology be incorporated into a client’s business?
Höflich explained how the relationship plays out in practice. “Cyanite and Synchtank work closely together to provide Synchtank’s customers with easy and seamless access to our algorithms. To this end, Synchtank leverages Cyanite’s API to provide their customers with AI capabilities without requiring any development work on the customers’ part. The philosophy is to incorporate AI as easily, intuitively and seamlessly as possible so that Cyanite and Synchtank can create the technological uplift and enable their customers to use AI-powered music indexing and search within the industry-leading label and music publishing software.”
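The integration pattern Höflich describes – customers gaining AI capabilities “without requiring any development work” because the platform wraps the vendor’s API – is a familiar provider-behind-an-interface design. The sketch below illustrates it in miniature; every class and method name is hypothetical, and the external call is stubbed rather than hitting any real endpoint.

```python
# Illustrative integration pattern: the platform exposes one internal
# interface, and a third-party tagging service is wired in behind it,
# so customers never touch the vendor API directly.
from abc import ABC, abstractmethod


class TaggingProvider(ABC):
    @abstractmethod
    def tag(self, audio_url: str) -> dict: ...


class ExternalAITagger(TaggingProvider):
    """Stand-in for an API-backed provider. A real implementation
    would POST the audio URL to the vendor's endpoint and parse the
    JSON response; here the result is stubbed."""

    def tag(self, audio_url: str) -> dict:
        return {"genre": "electronic", "mood": ["uplifting"]}


class Platform:
    def __init__(self, provider: TaggingProvider):
        # Providers can be swapped without customer-side changes.
        self.provider = provider

    def ingest(self, audio_url: str) -> dict:
        tags = self.provider.tag(audio_url)
        return {"url": audio_url, **tags}


record = Platform(ExternalAITagger()).ingest("https://example.com/track.mp3")
print(record["genre"])  # electronic
```

The design choice matters commercially: because the AI provider sits behind the platform’s own interface, customers get tagging and search as platform features rather than as a separate tool they must adopt.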
Last year, Synchtank released Drowning in Data, an expansive report addressing the live issues facing the music industry as it grapples with the opportunities and challenges presented by exponential technological change. “Data is now the grease that allows distribution, discovery, marketing, sales and financial processing to work efficiently,” commented Synchtank CEO Rory Bernard in response to an earlier article on Music Business Worldwide which had asked: “Is artificial intelligence about to transform the sync industry?”
Bernard continued: “The sheer volume of information presented to us in the digital age sees the music industry struggling to keep up and this is where AI and machine learning has the potential to play its biggest supporting role.” The future is happening right now. It seems the tools required to manage this statistical tsunami have arrived right on time.