How Artificial Intelligence Helped Make an Experimental Pop Album


From Smithsonian Magazine:

When you listen to experimental pop band YACHT’s discography, the 2019 album Chain Tripping fits right in. Pulsing with glitchy synth sounds, infectious bass riffs and the sweet voice of lead singer Claire L. Evans, Chain Tripping is a successful, Grammy-nominated album that sounds like YACHT, through and through. It’s such a success, you might never know it was generated by artificial intelligence.

But it was: Every riff, melody and lyric on Chain Tripping was developed by A.I. systems—even the album’s title.

That strange, tedious process—at times frustrating and at times awe-inspiring—is now the subject of The Computer Accent, a new documentary from directors Sebastian Pardo and Riel Roch-Decter that is “heartening or horrifying depending on your viewpoint,” according to the film’s own synopsis.

. . . .

To make Chain Tripping, the members of YACHT transformed their entire back catalog into MIDI data. (MIDI, which stands for musical instrument digital interface, allows electronic instruments and computers to communicate.) They then fed that data, piece by piece, to machine learning models—primarily Google’s MusicVAE, which helps artists “create palettes for blending and exploring musical scores.” YACHT followed the same process with songs by their musical inspirations and peers, and fed the lyrics of their songs into a lyric-generating model to come up with words.
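For readers curious what "feeding MIDI data into a machine learning model" can look like in practice, below is a minimal sketch using Magenta, the open-source Python library that includes MusicVAE. This is not YACHT's actual pipeline; the checkpoint path, file names, and the choice of the two-bar melody configuration are assumptions for illustration only.

    # Minimal sketch: load a pretrained MusicVAE checkpoint, encode a MIDI
    # melody from an existing song, and generate new two-bar melodies.
    # Assumes `pip install magenta` and a downloaded checkpoint such as
    # cat-mel_2bar_big.tar (the paths below are hypothetical).
    import note_seq
    from magenta.models.music_vae import TrainedModel, configs

    config = configs.CONFIG_MAP['cat-mel_2bar_big']   # 2-bar melody model
    model = TrainedModel(config, batch_size=4,
                         checkpoint_dir_or_path='checkpoints/cat-mel_2bar_big.tar')

    # Convert a song (already exported as MIDI) into a NoteSequence, the
    # protobuf format Magenta works with. The encode step assumes the file
    # contains a melody the 2-bar data converter can parse.
    source = note_seq.midi_file_to_note_sequence('back_catalog/song.mid')
    z, _, _ = model.encode([source])

    # Decode near the encoded point, and also sample fresh melodies from
    # the model's prior.
    variations = model.decode(z, length=32)            # 32 steps = 2 bars
    samples = model.sample(n=4, length=32, temperature=0.9)

    # Write every generation back out as MIDI for arranging in a DAW.
    for i, seq in enumerate(variations + samples):
        note_seq.sequence_proto_to_midi_file(seq, f'generated/melody_{i}.mid')

The output of a sketch like this is a pile of short MIDI fragments, not finished songs, which is why the arranging work described below still fell to the band.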

Though A.I. generated the building blocks—melodies, riffs, beats and lyrics—the members of YACHT (which, fittingly, is an acronym for Young Americans Challenging High Technology) still had to manipulate them into complete songs.

“It wasn’t something where we fed something into a model, hit print and had songs,” Evans told Ars Technica’s Nathan Mattise in 2019. “We’d have to be involved. There’d have to be a human involved at every step of the process to ultimately make music … The larger structure, lyrics, the relationship between lyrics and structure—all of these other things are beyond the technology’s capacity, which is good.”

. . . .

Evans and her bandmates, Jona Bechtolt and Rob Kieswetter, hand-selected their favorite A.I. generations and then arranged them into the songs that make up Chain Tripping. They set rules for themselves: “We can’t add anything. We can’t improvise anything. We can’t harmonize,” Bechtolt told KCRW’s Madeleine Brand in 2019. “We decided it would be just a subtractive process. So we could remove things, like we could take out a word, but we couldn’t add a word for the lyrics. Same with the drum patterns and the melodies.”

Link to the rest at Smithsonian Magazine

1 thought on “How Artificial Intelligence Helped Make an Experimental Pop Album”

  1. Big deal!
    This is what faux AI is really about:

    https://www.youtube.com/watch?v=sV0cR_Nhac0

    Best description I’ve seen is “industrial scale digital gardening”.

    There are millions of meatbags that can compose ditties, but not one that can simultaneously and individually weed, de-bug, pollinate, and fertilize an acre of mixed crops, producing not only bigger yields but also better-quality yields while using only 5% of the increasingly expensive fertilizers and pesticides.

    This particular company is VERDANT ROBOTICS.
    They are not alone in using “AI” to do meaningful things.
    That is the real 21st century.
    (If Putin doesn’t blow up the world.)

    Oh, and the real flying car was announced last week by a company called Alef in SiliValley. It too relies on “AI”.
    It’ll ship by 2025 but it’s flying today.
    And it’s a true roadworthy car, not a helicopter or airplane.
