ABC News December 3, 2018

How AI is changing the music industry


This is an Inside Science story.

When a song plays on the radio, there are invisible forces at work that go beyond the creative scope of the writing, performing and producing of the song. One of those ineffable qualities is audio mastering, a process that smooths out the song and optimizes the listening experience on any device. Now, artificial intelligence algorithms are starting to work their way into this undertaking.

"Mastering is a bit of a black art," explained Thomas Birtchnell, a researcher at the University of Wollongong in Australia. "While it's not always clear what mastering does, the music comes back and it sounds better." Birtchnell, a musician himself, was intrigued when he heard about AI-based mastering services like LANDR that offer inexpensive alternatives to human-based mastering. Many younger and newer artists use LANDR to master tracks they are releasing to launch their careers (they offer a monthly service that costs $9 for four tracks). He decided to investigate AI's uses and trends of algorithm-based audio mastering in a new paper released in November.


Traditional audio mastering generally requires a room with specialized acoustics. There, a person can hear flaws in the music, such as issues in the spectral range or the stereo balance, and remove glitches, pops and crackles. "It's quality control," explained Birtchnell. The engineer also adds loudness, the idea of making the sound fuller. It's distinct from volume, he pointed out, "containing more presence and energy."
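For readers curious what "adding loudness" can look like in code, here is a minimal, hypothetical sketch in Python: it measures a track's RMS level, applies makeup gain toward an assumed target, and tames the resulting peaks with a simple soft limiter. The target level and the tanh limiter are illustrative assumptions for this sketch, not LANDR's method or any industry standard.

```python
import numpy as np

def loudness_boost(samples: np.ndarray, target_rms_db: float = -14.0) -> np.ndarray:
    """Toy 'loudness' stage: gain toward a target RMS level, then soft-limit peaks.

    `samples` is a mono float array in [-1, 1]; `target_rms_db` is an assumed
    target level chosen for illustration only.
    """
    # Measure the current RMS level in dB.
    rms = np.sqrt(np.mean(samples ** 2))
    current_db = 20 * np.log10(max(rms, 1e-9))

    # Apply makeup gain to move the track toward the target level.
    gain = 10 ** ((target_rms_db - current_db) / 20)
    boosted = samples * gain

    # Soft-limit with tanh so pushed peaks saturate instead of clipping hard.
    return np.tanh(boosted)

if __name__ == "__main__":
    # Example: a quiet 440 Hz test tone gets lifted toward the target level.
    t = np.linspace(0, 1.0, 44100, endpoint=False)
    quiet_tone = 0.05 * np.sin(2 * np.pi * 440 * t)
    louder = loudness_boost(quiet_tone)
    print(f"RMS before: {np.sqrt(np.mean(quiet_tone**2)):.3f}, "
          f"after: {np.sqrt(np.mean(louder**2)):.3f}")
```

Real mastering chains involve far more than this, but the sketch shows why the step lends itself to automation: the measurements and gain decisions are mechanical.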

LANDR, which was launched in 2014, recently announced that more than 2 million musicians have used its music creation platform to master 10 million songs.

A few years ago, Carnegie Mellon computer scientist Roger Dannenberg heard that online systems had mastered 1 million songs -- and was shocked. "That's a really big number."


Dannenberg said that it makes sense for some artists to turn to algorithms for mastering.

"In the space of music creation, I think that mastering is one of the more cut-and-dried practices that can be formalized relatively easily." Mastering is still creative, and humans can hear things that programs can't. But some aspects of mastering -- like equalizing the loudness levels of different songs on a CD or trying to match the spectral content in bass and high frequencies -- are a lot simpler to automate than composing a piece of music or doing music production.

"Maybe this is some indication of AI in creative practice, and I really think it is, but I think it's a long way from creative work -- even though there can be creative aspects," said Dannenberg.

Ryan Petersen, a Nashville-based producer and songwriter, played around with LANDR a few years ago and ultimately abandoned the service to return to human colleagues. He said that while the service is technologically impressive, it fell short because the part of the software devoted to learning has no algorithm for taste. "They've basically said their engine keeps learning by looking toward songs that get uploaded into it -- but that means it's always looking toward the past," he said. "It's never looking into the future to see how to create the next cool thing."

Birtchnell said that AI-based audio mastering is probably displacing some human jobs, but it's hard to know how many. In most cases, people using the services would not be hiring someone to audio master their tracks anyway. But it is likely shrinking opportunities for newer apprentices. "People in the industry for a long time -- they're not really worried about their jobs, but they aren't taking on newer apprentices because of the shrinking of the sector," said Birtchnell.

One area where computers could soon start making an impact is in composition -- writing a pop tune or creating a chord progression, said Dannenberg. "Those sorts of things are getting pretty good, actually competitive with what humans are doing," he said, adding that some programs he has created can generate catchy tunes.
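As a toy illustration of the kind of rule-based composition Dannenberg describes (not his software, which the article does not detail), the sketch below walks a small Markov chain over common pop chords to emit a progression. The transition probabilities are made-up illustrative values, not learned from data.

```python
import random

# Hypothetical transition table over common pop chords in C major.
TRANSITIONS = {
    "C":  [("G", 0.4), ("Am", 0.3), ("F", 0.3)],
    "G":  [("Am", 0.4), ("C", 0.3), ("F", 0.3)],
    "Am": [("F", 0.5), ("C", 0.3), ("G", 0.2)],
    "F":  [("C", 0.5), ("G", 0.5)],
}

def generate_progression(start="C", length=8):
    """Random walk over the chord graph to produce a progression."""
    progression = [start]
    for _ in range(length - 1):
        chords, weights = zip(*TRANSITIONS[progression[-1]])
        progression.append(random.choices(chords, weights=weights)[0])
    return progression

print(generate_progression())  # e.g. ['C', 'Am', 'F', 'C', 'G', 'Am', 'F', 'G']
```

Research systems are far more sophisticated than this, but the example shows how quickly plausible-sounding material can be generated from simple statistical rules.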

But the biggest weakness for AI in music right now is production, the stage at which people manipulate music after it's recorded and make decisions about the recording, such as mixing and arrangements. That's a very creative practice, with fewer boundaries -- and a place where humans are still very much needed on the creative team. "A computer can write a pop tune," said Dannenberg, "but you can't perform it and make an arrangement unless you get human performers and producers."


Of course, that can be viewed as an opportunity for more research and more applications of AI and machine learning. "I don't see any absolute barriers," he said. "I'm not a person who believes that creativity is innately human."

Birtchnell believes the creative takeover by AI will eventually occur. "As the algorithms improve, there is scope to start impacting on professional developers," he said. "So we might see in the future a tipping point where AI is on par with people, similar to white collar work where surgeons are replaced by robots, or driverless cars on roads. It always seems to be very soon but we don't know yet when it will happen."

Inside Science is an editorially independent nonprofit print, electronic and video journalism news service owned and operated by the American Institute of Physics.
