FEATURE | How Researchers At The University Of Toronto Aim To Help Musicians With Artificial Intelligence

By Anya Wassenberg on January 13, 2019

Technology is breathing vital new life into the world of classical music, just as it needs it most.

Classical music and cutting-edge technology may not seem, at first glance, to have much to do with each other. In the public imagination, in fact, classical music may seem like the last stronghold of music-making without the use of modern tech. But to see them as two separate, or even opposing, worlds is a fallacy. They don’t just intertwine; technology is breathing vital new life into the world of classical music just as it needs it most.

One of the ways that technology and classical music increasingly intersect is through enhancing the audience experience. Giddy headlines may declare that technology will save classical music, but the importance of broadening the reach, and potential appeal, of classical music through digital technology can’t be overstated. The Toronto Symphony Orchestra (TSO) pushes its recordings and videos through a number of channels, including Spotify, SoundCloud, and a YouTube channel.

Other orchestras and companies are experimenting with a variety of audience-enhancement technologies. Teatro Lirico di Cagliari, an opera company based in Sardinia, experimented with Google Glass, a form of smart glasses, in 2014. During a production of Puccini, the singers, musicians, and even stagehands posted pictures and live video feeds directly to the company’s social media accounts. Since 2016, the Boston Symphony Orchestra has incorporated a “Conductor Cam” section of seats in its Casual Friday concert series. In the designated section, patrons are loaned an iPad with enriched content such as exclusive interviews. Seated behind a large flat screen, the audience members get the same view of the musicians as conductor Andris Nelsons.

But the audience experience is just the beginning. Eric Baptiste, CEO of the Society of Composers, Authors and Music Publishers of Canada (SOCAN), has partnered with the U of T’s Department of Computer Science Innovation Lab (DCSIL) on a project designed to benefit Canadian musicians from a different angle: tracking music played around the world in order to collect and distribute the royalties. It’s ironic that, as distributing music across the globe has become easier and easier, artist revenues have dwindled. The project will use artificial intelligence (AI) in various ways to hunt down and track music under copyright wherever it is played.

ARTIFICIAL INTELLIGENCE

It’s not the only way that AI has entered the realm of music. Steve Engels is an Associate Professor, Teaching Stream, in computer science at the University of Toronto, but he made headlines last year for making music, or rather, for getting AI to make music.

“I was doing AI related to text,” he recalls. The project initially looked at getting an AI to recognize text via an algorithm. It was a student’s interest in music that led to the realization that they could look at musical compositions the same way. The algorithm works with what are called tokens, the basic units it considers as it looks for patterns. “AI looks at it note by note,” he says. Essentially, they feed music to the AI, and the AI learns. “If you know a composer well enough, you know what they would do next,” he points out. “Give it a large body of music, and it will begin to recognize patterns.”

“It has no concept of music theory,” he says. Without a background in music theory, AI bases its compositions entirely on what it has heard. “It doesn’t do what it hasn’t already seen. It’s all entirely derivative.” Or, almost. There is some room for something like creativity at times. If AI encounters a few notes, and then recognizes multiple options for what happens next in its listening repertoire, it can then improvise a little at the note or structural level of the music. “The overall piece has variations.”
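Engels doesn’t spell out the implementation, but the behaviour he describes, learning note-by-note patterns from a body of music and improvising when several continuations are plausible, can be illustrated with a toy next-note model. The sketch below is a minimal illustration of that general idea, not the U of T system; the note names and training phrases are invented placeholders.

```python
# A minimal sketch of the pattern-learning idea described above: treat each
# note as a "token", record which notes follow a given short context in the
# training pieces, and generate new music by sampling among the observed
# continuations. Illustrative only; the phrases below are invented placeholders.
import random
from collections import defaultdict

def train(melodies, context_len=2):
    """Count which note follows each context of `context_len` notes."""
    model = defaultdict(list)
    for melody in melodies:
        for i in range(len(melody) - context_len):
            context = tuple(melody[i:i + context_len])
            model[context].append(melody[i + context_len])
    return model

def generate(model, seed, length=8, context_len=2):
    """Extend a seed phrase note by note, choosing among learned options."""
    notes = list(seed)
    for _ in range(length):
        context = tuple(notes[-context_len:])
        options = model.get(context)
        if not options:  # unseen context: the model cannot invent new material
            break
        notes.append(random.choice(options))  # "improvise" among known options
    return notes

# Two toy phrases that share an opening, so the model learns more than one
# possible continuation after the context ("C4", "E4").
corpus = [
    ["C4", "E4", "G4", "E4", "C4"],
    ["C4", "E4", "F4", "D4", "C4"],
]
model = train(corpus, context_len=2)
print(generate(model, seed=["C4", "E4"]))
```

Trained on a single piece, a model like this reproduces it almost verbatim; fed several pieces that share contexts, it has multiple options at those points, which is where the small variations Engels describes come from.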

The level of variety depends on what it has heard. “At first, it would focus on a single piece. Then we started to combine these pieces.”

While the media focused on the novelty of non-human musical composition, the exercise led Engels to consider the implications. “Is this how people do it too?” he wonders. In earlier eras in particular, when style was uniform across geographic regions and Western music followed common-practice tonality, musical composition would have been influenced almost entirely by what the composer had previously heard in their life. “That’s built into the kind of art we do.”

While AI may already have found its applications in the world of pop music, it probably won’t find its way onto a classical concert stage anytime soon. “This is entirely going to be derivative,” Engels says. As he explains, AI is good at imitation. It can simulate a specific composer it’s been given, but the form would depend on what it had been fed. “If you gave it all concertos, it would come up with a concerto.” If you expose it to many different types of compositions, then the form would become more and more haphazard as it tries to incorporate all the different patterns it has learned. “There’s still that high-level function that we need humans for. We look at it as a compositional assist.”

A NEW WAY FORWARD

As the lines between music and technology blur, the possibilities open up. Willy Wong is an Associate Professor of Biomedical and Computer Engineering and director of the Engineering Music Performance Minor program, launched in the Fall 2018 semester. It came about in recognition of the fact that so many engineering students are musicians too.

Wong points out that the overlap between engineering and classical music is not new. Julian Andreas Kuerti, conductor and son of Anton Kuerti, got an honours degree in engineering and physics from U of T before switching to music. Soprano Isabel Bayrakdarian studied biomedical engineering at U of T before taking to the stage.

“Music is a very important part of their professional practice as engineers,” he notes. “We’re also pushing out a Music Technology Certificate.” It’s taking the fusion of music and technology to the student level in a new way. “Universities tend to be compartmentalized,” he says. “University tends to be a much more theoretical space.” Combining the two disciplines takes both in new directions.

Wong teaches the physics of sound. “I teach the only course in sound in the engineering program,” he says. “Where does sound come from?” For him, the fields of engineering and music are two angles on the same thing, and his love of music goes back to childhood. “My entire family is music based,” he says, including kids in lessons, and a wife — violinist Etsuko Kimura — who is an Assistant Concertmaster with the TSO.

It took some time to get the cooperative program running, but he’s excited about the possibilities that will emerge as students go through it. “It’s a vision here we want to take forward.”

Jeff Wolpert is an Adjunct Professor of Music Technology & Digital Media and Director of the Music Technology and Digital Media Program, Sound Recording. “I designed this program,” he says. He’s won multiple Juno Awards for Recording Engineer of the Year, and is a working professional alongside his teaching duties at U of T, producing records, movie scores, and other projects with some of the big names of the music business.

Enrollment in the program, which focuses on music production and music as media, held some initial surprises. “It seemed to open a side door,” he says. Many students were people who had studied music and then left the business, or who had an interest but never went into the field, with a good contingent of mature students in their 40s and even older. In other words, it has opened up an alternative pathway into the field of music beyond the traditional performance major destined for a career on the stage.

For Wolpert, technology fills in the next step after learning your craft as a musician. “The production part is making it happen,” he says. He developed the program as the realization of a need he saw. “Number 1 — to keep people in music,” he says. “You had people with music degrees and no visible means of support. They were taught about live performance, but my question is: is this how they will make a living?” There is less money to be made in performance these days, and as a composer, he sees a mastery of technology as absolutely crucial.

In his professional practice, he works on music for the movies, what he calls “the last bastion of classical music.” As he notes, there are only a few centres in the world, such as Los Angeles or New York City, where orchestral musicians can get regular work recording music for movies. Everywhere else, composers rely on electronic renditions. “One of the things they really have to do is show the movie director what the music is like.” That means being able to produce a high-quality electronic version of your score. “Sometimes the quality of the mock up makes or breaks the deal. Most composers I know are really expert in that technology.”

Once they get the gig, he says, composers in his experience most often spend virtually their entire commission on music production, making their money on the back end via royalties. “In most cases, you have to deliver the production.” No more paper scores, no more live performance.

Digital technology can be a composer’s best friend in more ways than one. “They’re always worried about repeating themselves,” Wolpert says. With technology, a composer can create a palette of sounds to be used for electronic and electroacoustic music making. He mentions that prominent film composers like Thomas Newman and Hans Zimmer will experiment extensively with the technology, looking for inspiration.

He was part of the team working with composer Jonathan Goldsmith (Take This Waltz, Sharkwater Extinction) on one project. “What we built for him was something called the Infinity Table. Technically, it’s not an instrument at all.” Instruments can be plugged into the Infinity Table to produce a virtually infinite range of different sounds. “Writing with technology has become a possibility.”

Part of the push toward composing with the assistance of technology comes from shrinking timelines for commissions, which he says often involve short lead times of three weeks to three months. “It’s about finding inspiration on that schedule.”

But, along with the benefits of tech, Wolpert also recognizes the limitations. “I greatly miss the community,” he says of the new, isolated way of producing music. When it comes to his students, he emphasizes that technology can only be as brilliant as the mind wielding it. “Just because you can program a string section, doesn’t mean you know anything about strings. Much of the character of an instrument is in the articulation,” he points out.

If classical music needs technology nowadays, technology still needs humans. For now.
