AI Released a Hit Song And Nobody Even Noticed: What Happens Next?

On November 8th, the country song “Walk My Walk” didn’t just go viral - it moonwalked up the Billboard charts. The artist? Breaking Rust: not a misunderstood indie band, but a fully AI-generated act. The vocals and lyrics were so slick, so emotionally charged, that fans argued over which real artist it sounded like. Spoiler alert: it wasn’t any of them. It was an algorithm with serious swagger. The track dominated TikTok, got playlisted on major streaming platforms, and left listeners shell-shocked when they learned it wasn’t made by a human at all. Cue existential questions.

This wasn’t a prank or a hidden marketing stunt - it was AI flexing its creative muscles. And the scariest part? Many people didn’t notice. It wasn’t until journalists and copyright lawyers started poking around that the truth came out. The line between what’s real and what’s generated is not just blurry - it’s practically invisible.

Meanwhile, AI-generated influencers are living their best (digital) lives. Enter Lil Miquela, a virtual pop star and fashion icon who might just outshine your favorite YouTuber. Created by the company Brud, she drops singles on Spotify, sports designer fits, and dishes out hot takes on social media - all while not technically existing. She’s a bundle of code with a killer sense of style. These aren’t just quirky internet experiments. They’re signals that AI is learning how to act, sound, and even feel human. For educators, public leaders, and anyone with a pulse, this raises some real questions: How do we teach people to tell the difference? And does it even matter anymore?

Emotional Bonds with Algorithmic Ghosts

Let’s talk about AI friends. No, not your Xbox avatar or that one chatbot that helped you reset your password. We mean real-ish connections. Apps like Replika let users create AI companions who chat, flirt, console, and even throw in some spicy emojis for good measure. As of 2023, millions of people were chatting daily with their algorithmic BFFs - or in some cases, AI soulmates [3]. These relationships may be one-sided, but for users, they feel undeniably real.

Here’s the wild part: the AI doesn’t feel anything back. It’s just code predicting what words might make you feel seen. And yet, it works. AI responds with sympathy, jokes, and even faux vulnerability. It’s like a mirror that talks back - and tells you exactly what you want to hear.

That’s all fun and games until you realize how deep this rabbit hole goes. Research shows that people can grow emotionally attached to AI even when they know it isn’t sentient [4]. In classrooms, this could mean students turning to AI for comfort, advice, or even life guidance. For educators and public servants, it’s a wake-up call: if AI can fake empathy, how do we teach people to recognize what’s real? And how do we design systems that don’t just trick people into trust, but earn it?

Why Critical Thinking About AI Is a Moral Imperative

Once upon a time, educators were the keepers of all knowledge - the wise Obi-Wans of academia. But now you’ve got AI whipping up essays, solving math problems, and generating surrealist paintings while you’re still buttering your toast. If no o
