Major record labels and music groups are supporting new legislation that would force AI companies to reveal what materials they used to train their AI models.
The music industry is rallying behind a new bill called the Transparency and Responsibility for Artificial Intelligence Networks (TRAIN) Act.
What’s the big deal?
Right now, AI companies keep their training data secret – it’s a “black box.” This makes it nearly impossible for artists and rightsholders to know if their work was used without permission.
The TRAIN Act would change that by:
- Letting rightsholders request information through court clerks
- Requiring AI companies to disclose if specific works were used
- Simplifying the legal process for creators
Who’s supporting it?
The bill has heavyweight backing from:
- All three major labels (Sony, Universal, Warner)
- Industry groups like RIAA and NMPA
- Performance rights organizations (ASCAP, BMI)
- Independent music organizations
How it works:
When creators suspect their work was used to train AI, they can:
- File a request with the clerk of any US district court
- Have the clerk issue an administrative subpoena
- Require the AI company to reveal whether their work was used
The bigger picture
This isn’t the only AI regulation in the works:
- The COPIED Act would ban unauthorized use of copyrighted works in AI training
- The NO FAKES Act targets AI voice and likeness rights
- The No AI FRAUD Act, a House bill, similarly targets unauthorized AI replicas of voice and likeness
Fun fact: Unlike an earlier proposal that would have required AI companies to disclose ALL of their training materials (which tech companies said was impossible), the TRAIN Act only requires disclosure when a rightsholder asks about particular works.