Over the past two decades, technology has reinvented much of what we do in our daily lives, but the first major domino to fall was probably the advent of digital music. Next month marks 20 years since Apple’s iTunes Store launched (fun fact: a birthday shared with yours truly), which, while not the first way to get digital music online, was certainly the most far-reaching.
The digital music experience has certainly changed in the intervening years, especially with the rise of streaming in the last decade, but when it comes to Apple’s take on listening to music, there are some things that honestly haven’t changed enough. It sometimes feels like Apple believes digital music is a solved problem, with the company sitting back and dusting off its hands, but there are certainly places where the listening experience could be improved.
Not so gently down the stream
In 2019, Apple split its venerable iTunes app on the Mac into three separate apps: Music, TV, and Podcasts. While some may look back on the iTunes era with nostalgia, I’m not going to sugarcoat it: iTunes had become a hot mess. In theory, it was a good idea to split these into different apps that relate to specific types of media: people don’t want to watch TV shows or listen to podcasts the same way they listen to music.
However, as implemented, the macOS Music app is basically the old iTunes app with Apple Music’s streaming functionality bolted on. While there are benefits to having both the songs in your personal library and Apple Music’s catalog in one unified interface, especially when it comes to ease of use, it can sometimes feel like Apple is playing a clever trick. For example, one of my biggest frustrations is discovering that a specific track from an album I’ve added to my library isn’t available due to streaming rights. Why just that one track? The reason is almost always opaque, but it undercuts the idea that music in your library actually is in your library.
That’s just one example of where this fusion doesn’t always work; there are plenty of others, including matching an explicit version of a song with a clean version (or vice versa), ending up with split albums due to metadata issues, and simply getting the wrong version of a song (live instead of studio, for example). Critics of the Music app will no doubt have many more points to add, and this could easily turn into a piece based entirely on its shortcomings, but let’s talk about a few other glaring issues.
Look Ma, no transfer!
I’m going to shamelessly endorse this one from my friend and colleague Joe Rosensteel, who recently wrote an excellent article about the many issues with the Music app: Why hasn’t Apple implemented Handoff for Music?
If you’ve forgotten what Handoff is, it’s one of Apple’s Continuity features (an umbrella term for an ever-expanding set of functionality) that’s supposed to make it easy to move a task between your Apple devices. If you’ve ever started writing an email on your iPad and then turned to your Mac to find a second Mail icon in the Dock, that’s Handoff. (To be honest, sometimes I can’t get it to go away, especially with apps handed off from my Apple Watch.)
But no comparable functionality exists for music. If I pause a song I’m playing on my Mac and want to pick it up on my iPhone – a scenario not unlike the one shown in the very first iPod ad back in 2001 – I have to launch the Music app on my phone, find the song, and scrub to where I left off on the Mac. Apple’s Podcasts app gets this right: it syncs the playhead position across devices, or at least lets users opt into that sync. The closest Apple has come is letting you transfer music from an iPhone to a HomePod by bringing them close together.
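For what it’s worth, Handoff is a public API that third-party developers can already adopt, so the building blocks exist. Here’s a minimal sketch, in Swift, of how a music app could advertise its now-playing state via NSUserActivity; the activity type string, userInfo keys, and function names are hypothetical and purely illustrative, and this is not how Apple’s own Music app works.

```swift
import Foundation

// A minimal sketch of advertising "now playing" state to Handoff via NSUserActivity.
// The activity type and userInfo keys below are hypothetical.
func advertiseNowPlaying(songID: String, position: TimeInterval) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.music.now-playing")
    activity.title = "Now Playing"
    activity.userInfo = ["songID": songID, "position": position]
    activity.isEligibleForHandoff = true
    activity.becomeCurrent() // advertise the activity to the user's other nearby devices
    return activity
}

// On the receiving device, the system hands the activity to the app,
// which could then resume playback at the saved position.
func resumePlayback(from activity: NSUserActivity) {
    guard let info = activity.userInfo,
          let songID = info["songID"] as? String,
          let position = info["position"] as? TimeInterval else { return }
    // A real app would seek its player here; printing stands in for that step.
    print("Resume \(songID) at \(Int(position)) seconds")
}
```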
This brings us to another big problem.
The (Air)Play’s not the thing
AirPlay is a mess. It’s not even a hot mess; it’s just a mess. A few years ago, around the time Apple launched the first HomePod, the company sort of changed the way AirPlay worked. Back in the day, all AirPlay speakers were treated in much the same way: basically as an external speaker for playing music from your device, be it Mac, iPhone, iPad, etc. It kind of behaved as if you switched from listening on headphones to listening on a built-in speaker.
However, when the HomePods launched, with their ability to play music on their own without another device, Apple decided to treat them differently in AirPlay. Now, when you start playing music on your device and then AirPlay it to a HomePod, the currently playing song is shifted over to the HomePod’s own internal queue, as if you had used Siri to tell the speaker to play it.

This has really frustrated me, especially when I start listening to an album on my phone, play it on my HomePod mini via AirPlay, and then go back to my phone to find it’s still on the same track it was when I first AirPlayed it.
Now, instead of AirPlaying, you can control playback directly on a HomePod by opening the AirPlay menu on an iOS device and scrolling all the way down to Control Other Speakers and TVs… but then you’re essentially dropped into a version of the Music app that looks exactly like the Music app yet doesn’t let you do all the same things. (For example, I can never seem to play another song on the HomePod from this interface.)
Meanwhile, all non-HomePod smart speakers are still handled the old way, as external speakers. Figuring out how all of this should work is obviously complicated, but whatever the answer is, this isn’t it. I hope iOS 17 brings some much-needed revisions to the way AirPlay works, though I fear it won’t.