Now and then I get by with a little help from AI
We just got the final Beatles song, but the grand fugue of humans handling AI has only begun.
The last week was bookended by the danger and delight of artificial intelligence.
Part of that protection involves detecting AI-generated content and developing reliable methods of authenticating official content. As the White House put it: “Federal agencies will use these tools to make it easy for Americans to know that the communications they receive from their government are authentic—and set an example for the private sector and governments around the world.”
This and the entire laundry list of protections represent an ambitious agenda. In the words of MIT Technology Review, “The trouble is that technologies such as watermarks are still very much works in progress. There currently are no fully reliable ways to label text or investigate whether a piece of content was machine generated. AI detection tools are still easy to fool.”
But for now let’s set aside the dread and turn to the week’s AI delight: On Nov. 2 we were gifted with a new Beatles song, “Now and Then,” followed the next day by a new music video from director Peter Jackson. Technology enabled a cohesive song to be assembled from a primitive 1970s cassette of John Lennon’s voice and piano, 1990s recordings of three-fourths of the Beatles (with George Harrison), and new recordings from Paul McCartney, Ringo Starr, and a string section.
“Now and Then” is marketed as the last Beatles song—a claim I tend to trust about 10 times more than most artists’ advertisements for a “final” concert tour—if only because half of the Fab Four is six feet under, and it sounds like they’ve now exhausted their tape archive.
I’m a lifelong Beatlemaniac even though I wasn’t born until after the band had already broken up. Because I’m a Gen Xer who inherited my Beatles fandom in this way, maybe I feel all the more deeply how this band remains one of our most powerful examples of the bygone 20th-century monoculture. The Beatles brought us together as a society around the proverbial “All You Need Is Love” campfire before we splintered into our endless phone feeds and partisan news channels.
One of my favorite streaming interludes during the pandemic was the Beatles’ “Get Back” miniseries in late 2021 on Disney+. It let me feel like a lucky fly on the wall for eight hours as the Beatles meandered their way to their last glorious creative gasp as a band. At a time when the ponderous daily routine of pandemic life felt about as easy as walking through Jell-O, escaping to a London studio and rooftop in 1969 was a welcome time warp.
Life at least appeared to be simpler when Apple was the record label of the world’s biggest band, not a tech behemoth and the world’s largest company by market cap.
Jackson relied on proprietary AI to clean and refine the audio that made “Get Back” a sharper and more immersive experience—the same AI that also made “Now and Then” possible.
The three surviving Beatles first worked on the song in the 1990s. But that era’s available technology couldn’t adequately separate Lennon’s vocal from his piano. A generation later, AI swept in to save the day.
Philosophically, this feels completely consistent with the Beatles, a band that endures in part because of how they embraced the cutting edge of 1960s recording technology and studio experimentation. They’ve continued to sound more contemporary than many of their peers because of how eagerly and playfully they chased the future.
Jackson’s music video blends and blurs all Beatle eras. We see Ringo and Paul in 2023 patiently drumming or strumming a bass, only to be invaded by the wacky 1967 Beatles—especially Lennon and his devilish grin.
Jackson struggled to unearth footage for the video—a familiar and old-fashioned problem of the human time and effort it takes to sift through dusty boxes and closets. At first he was given only one hour of scenes from the Beatles’ 1995 “Anthology” recording sessions.
As Jackson told Esquire: “‘So, if the film crew only went in for an hour to shoot this footage, why is George wearing four different shirts? Could it be four different days?’ They phoned me up and said, ‘Guess what, we’ve found more of the ’95 footage. We have 14 hours.’”
Jackson combined that treasure trove with childhood photos contributed by all four Beatle families plus the first (previously unseen) 40 seconds of Hamburg-era film owned by former drummer Pete Best.
The easier task was relying on AI technology to extract a clean Lennon vocal from his old tape that Yoko Ono had passed along to McCartney in the ’90s.
“You teach the computer what a guitar sounds like, you teach them what a human voice sounds like, you teach it what a drum sounds like, you teach it what a bass sounds like,” Jackson explained two years ago.
Jackson’s Park Road Post Production collaborators in New Zealand initially developed their AI editing technology while working on “Get Back”—based on police surveillance software.
So there we have it: The danger and delight of AI never stray too far from each other. The same first steps that lead to a new Beatles song also could lead to a police state.
In cultural and conceptual terms, humanity has been wrestling with AI since long before the Beatles broke up, although the human extinction represented by 1984’s “Terminator” always seems to sum up our fears.
We’ve been flooded with AI headlines in the last year since the emergence of ChatGPT. A new Brookings Institution assessment of this week’s executive order compiles contributions from 14 writers, including this nugget from Chinasa T. Okolo:
“Along with a general lack of AI literacy within the American population, low levels of data and privacy literacy are issues that could affect how citizens utilize and interact with AI systems. While the (executive order) emphasizes the protection of Americans’ privacy through external means, such as through the development of privacy-enhancing technologies, these methods can only do so much if citizens are not aware of how to restrict the use of their personal data and how to prevent unintended sharing of confidential information.”
That rings true from my industry experience watching news consumers drift from trusting local brands to clicking and consenting at random as they casually surfed the web. Our digital culture has been built on a vast mountain of freely shared data—the building blocks now training AI.
We’re in the middle of two major wars and on the cusp of an election year in which the potential danger of AI and deepfakes looms ever larger.
My mindset will always be more like the Beatles’: Lean into technology, innovation, and the promise of adventure. But I’m not naive enough to think that all we need is love when it comes to AI.
Lennon’s mischievous grin in the new Beatles video is utterly charming, but the more I’ve thought about the news of the last week, the more it somehow also haunts me.
The Iowa Writers’ Collaborative
Have you explored the variety of statewide voices in the Iowa Writers’ Collaborative? We contribute commentary and feature stories of interest to those who care about Iowa.