Music theory and machine learning depend on similar intellectual strengths: attention to detail, precision, and an intuitive grasp of patterns. But those strengths often manifest differently, whether in songs and concerts or through technology. Madeline Wong ’21, who majored in electrical engineering and computer science and in music, came to MIT to bring her passions for art and science together.
“Music and computer science complement one another a lot, especially because the music we consume now exists digitally,” she says.
Wong loved music from a young age, beginning with piano at five and moving to flute, mellophone, saxophone, and bassoon. She’s also self-taught in guitar and ukulele. “I don’t remember my life without music. That was always there,” she says.
Now a grad student in the MIT Music Technology Lab in the School of Humanities, Arts, and Social Sciences, she works with professor of the practice Eran Egozy ’93, MNG ’95 on ConcertCue, a mobile web app under development since 2018 that streams text, images, and media live during concerts, precisely timed to key moments in the performance. Think of it as Wikipedia synchronized to music.
“When you go to a concert, usually there’s just a program with a paragraph or two of text that you read during intermission or after. With ConcertCue, a little note will show up and tell you, ‘Hey, listen to the flutes!’ or something that’s relevant to the music at a point in time,” she explains.
For example, during a packed Ambient Orchestra performance of David Bowie’s “Blackstar” at MIT’s Kresge Auditorium, ConcertCue users could browse facts about Bowie, read about the score, and view a slideshow of related images. ConcertCue has also enhanced performances by the Boston Symphony Orchestra and the New World Symphony. Wong helps improve the app, for example by automating the process of preparing a piece for ConcertCue with a method called dynamic time warping, which aligns two renditions of a piece even when they unfold at different speeds.
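For readers curious what dynamic time warping actually does: it finds the cheapest way to line up two sequences that move at different tempos, such as a live performance and a reference recording. A minimal illustrative sketch (not ConcertCue’s actual code, and using simple 1-D feature sequences for clarity) might look like:

```python
# Minimal dynamic time warping (DTW) sketch: computes the cumulative cost
# of aligning two 1-D feature sequences. Hypothetical example only.
import numpy as np

def dtw_cost(a, b):
    """Cumulative DTW alignment cost between sequences a and b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of the three allowed moves:
            # diagonal match, or a step that stretches one sequence.
            D[i, j] = d + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# A melody aligned against a "slower" copy of itself (repeated samples)
# still matches perfectly, despite the tempo difference.
ref = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
stretched = [0.0, 0.0, 1.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0]
print(dtw_cost(ref, stretched))  # 0.0 — perfect alignment despite the stretch
```

In a real music-alignment setting, the scalar values above would be replaced by per-frame audio features, but the alignment recurrence is the same.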
Wong was inspired to attend MIT by a music technology class she sat in on during a campus visit. Having already taken Advanced Placement computer science, Wong was excited by the possibility of fusing her talents as an engineer and a musician. She says she also hoped to study with Egozy, who cofounded Harmonix Music Systems, maker of the video game Guitar Hero.
She embraced the arts in music performance studies, too: As a freshman, she was selected, by audition, as a classical pianist for the conservatory-level Emerson Scholars track in MIT Music, which offers financial assistance for advanced music lessons. She also joined the MIT Symphony Orchestra, a curricular group, to play the bassoon. Beyond formal studies, she lent her mezzo-soprano to the Chorallaries, MIT’s oldest coed a cappella group. Her tech skills came in handy during the pandemic as she helped to stage virtual concerts for the group.
As a senior, she began studying independently with Egozy as part of the Undergraduate Research Opportunities Program. Wong has made the ConcertCue user interface easier to navigate and improved its synchronization, drawing on skills she honed in a class on the fundamentals of music processing.
“I do a lot of harmonic analysis, where you look at the notes that are being played at any given point,” she says. “I take the audio, break it down, and transform it so that it can be more informative than just a waveform,” a depiction of how sound pressure, or amplitude, varies over time.
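One common way to make audio “more informative than just a waveform” is to fold its frequency spectrum into the 12 pitch classes of Western music, producing a chroma vector that reveals which notes are sounding. The sketch below illustrates the idea on a single frame; it is a hypothetical example (the sample rate and frame size are assumed values), not Wong’s actual pipeline:

```python
# Hypothetical sketch: fold one frame's spectrum into a 12-bin
# pitch-class (chroma) vector, a representation used in harmonic analysis.
import numpy as np

SR = 22050  # assumed sample rate in Hz

def chroma_frame(frame, sr=SR):
    """Return a normalized 12-bin pitch-class profile for one audio frame."""
    windowed = frame * np.hanning(len(frame))
    mags = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    chroma = np.zeros(12)
    for f, m in zip(freqs, mags):
        if f < 27.5:  # skip bins below A0, the lowest piano note
            continue
        # Convert frequency to a MIDI pitch number, then wrap to 12 classes
        pitch = int(round(69 + 12 * np.log2(f / 440.0)))
        chroma[pitch % 12] += m
    total = chroma.sum()
    return chroma / total if total > 0 else chroma

# A pure 440 Hz tone (concert A) concentrates its energy in
# pitch class index 9, which corresponds to the note A.
t = np.arange(2048) / SR
tone = np.sin(2 * np.pi * 440.0 * t)
print(np.argmax(chroma_frame(tone)))  # 9 → pitch class A
```

Stacking such vectors over successive frames yields a chromagram, the kind of note-level view that supports the harmonic analysis Wong describes.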
Her ongoing work with ConcertCue offers music lovers new depth, she says. “I just find it so cool to know more about a piece while I’m listening to it,” she says. “It’s not just this experience of listening but also understanding more about it—almost like hidden ‘Easter eggs’ in the music.”