Sony is taking a more hands-on approach to the growing problem of AI-generated music borrowing too heavily from real artists.

According to Nikkei Asia, the company has developed new technology that can identify original, copyrighted songs inside AI-generated music and measure how much influence those works had on the final output.

The system is designed to answer a question many musicians and labels are now asking: was my music used to train this AI, and if so, how much of it shows up in what the AI produced?

How Sony tracks real songs inside AI-generated music

Sony’s tool works in two distinct ways. When AI developers are willing to cooperate, the technology can directly analyze an AI model’s training data to pinpoint which copyrighted songs were used.

When developers don’t cooperate, the system compares the AI-generated output against existing music catalogs to estimate likely sources and the degree of influence original works had on the new track.

That allows Sony to flag potential copyright issues even when AI companies keep their training methods opaque.
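Sony has not published how its detection actually works, but the catalog-comparison mode it describes resembles a similarity search: reduce each track to an audio fingerprint or embedding, then score the AI output against every reference song. The sketch below is purely illustrative, with made-up names (estimate_influence, a 128-dimension embedding, a 0.6 threshold) that are assumptions, not details from Sony.

```python
# Toy illustration of catalog comparison, NOT Sony's actual system.
# Assumes every track has already been reduced to a fixed-size audio
# embedding (e.g., by some feature extractor); "influence" is estimated
# as normalized cosine similarity above a flagging threshold.

import numpy as np

def estimate_influence(generated: np.ndarray,
                       catalog: dict[str, np.ndarray],
                       threshold: float = 0.6) -> dict[str, float]:
    """Return per-song influence shares for catalog entries whose
    similarity to the generated track exceeds `threshold`."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    sims = {title: cosine(generated, emb) for title, emb in catalog.items()}
    flagged = {t: s for t, s in sims.items() if s >= threshold}
    total = sum(flagged.values())
    # Normalize so the flagged songs' shares sum to 1.
    return {t: s / total for t, s in flagged.items()} if total else {}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    catalog = {f"song_{i}": rng.normal(size=128) for i in range(5)}
    # Fake "generated" track that leans on song_2 and song_4.
    generated = 0.7 * catalog["song_2"] + 0.6 * catalog["song_4"] \
        + 0.2 * rng.normal(size=128)
    print(estimate_influence(generated, catalog))
```

In this toy run, song_2 and song_4 are flagged and split the influence roughly 55/45, while unrelated catalog entries fall below the threshold and are ignored.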

Turning creative influence into compensation

Sony wants to make sure that artists get paid when their music helps shape an AI-generated song. The company sees this tech as a possible foundation for future revenue-sharing deals, where creators are compensated based on how much their work influenced the final track.

It could also be used during licensing talks or even built directly into AI music tools down the line. However, Sony says it has not decided when or how such a system would be rolled out.
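If influence scores like those above ever fed a revenue-sharing deal, the payout math could be as simple as a pro-rata split. This is a hypothetical sketch of that idea, not anything Sony has announced; the function name and royalty figures are invented for illustration.

```python
# Hypothetical pro-rata royalty split based on estimated influence shares.

def split_royalties(royalty_pool: float,
                    influence: dict[str, float]) -> dict[str, float]:
    """Divide `royalty_pool` in proportion to each song's influence share."""
    total = sum(influence.values())
    return {song: royalty_pool * share / total
            for song, share in influence.items()}

# Example: a $1,000 pool split across two flagged songs.
print(split_royalties(1000.0, {"song_2": 0.55, "song_4": 0.45}))
# -> {'song_2': 550.0, 'song_4': 450.0}
```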

This comes at a time when the relationship between AI companies and the music industry is already strained. Training AI models on copyrighted songs has sparked lawsuits and heated negotiations.

Rather than objecting to AI music outright, Sony is building tools to track influence and bring accountability to how such tracks are created, especially as AI-generated songs are already making their way onto the Billboard charts.
