
Android vs iOS in an AI-first 2025

January 9, 2025 at 03:22 PM

Note: This is not a blog, it's a semi-private digital garden with mostly first drafts that are often co-written with an LLM. Unless I shared this link with you directly, you might be missing important context or reading an outdated perspective.


In 2025, I started carrying two phones. While my iPhone remained my primary device after 15 years of iOS loyalty, I added an Android phone to my daily carry. What started as an experiment has led to some surprising insights about the future of mobile operating systems in an AI-first world.

The Two-Phone Strategy

My iPhone now serves as a highly curated device, with notifications strictly limited to essential apps. The Android phone handles everything else, and I check it roughly three times daily, usually around meals. Despite swiping away 80-90% of notifications, this setup proves valuable: social media and other apps still maintain better signal-to-noise ratios and fewer ads in their notification channels, even while remaining quite prolific with them. It’s a bit of a necessary evil to keep up with the world.

Hardware Surprises

Unexpectedly, I found myself gravitating toward the Android device for several reasons. An old Pixel 4a, despite (or perhaps because of) its more modest price point, makes me far less nervous about drops than the iPhone’s premium glass-and-metal construction, whose slippery, cold body feels precarious to hold.

The AI-First Paradigm Shift

The rise of AI has fundamentally changed what matters in a mobile operating system. When I, like many others, chose iOS in the mid-2010s, we prioritized interface design, smooth animations, and stability. However, phones are increasingly becoming simpler I/O devices - interfaces for AI interactions where you input your request and receive an output, with minimal need for complex UI interactions. The device just needs to surface essential artifacts (Artifact-Centric Computing) and handle simple intermediate clarifications - whether through dialogue or visual annotations - before delivering the final artifact and getting out of your way.

Android’s “Flaws” Become Features

What were once considered Android’s weaknesses have evolved into strengths in this new paradigm:

  1. Faster, less “juicy” animations that prioritize efficiency over aesthetic pleasure
  2. Better “triaging” AI in Google Assistant/Gemini than in Siri
  3. Significantly improved stability compared to earlier versions

The Ecosystem Lock-in

The only factor preventing my complete switch to Android is Apple’s ecosystem integration - particularly with the Apple Watch, AirPods, and Vision Pro. Transitioning would require investing in Android-compatible alternatives to maintain a seamless multi-device experience.

The AI Advantage

Google’s superior APIs and better data interoperability across services (Photos, Reminders, Notes) make Android more AI-friendly compared to Apple’s more siloed approach. This advantage becomes increasingly important as we move toward AI-driven computing experiences. The integration between Google’s services creates a more seamless experience for AI interactions, while Apple’s Reminders and other services can be quite restrictive in their interoperability.

The AI-First Operating System

The future of mobile operating systems will be less about traditional OS features and more about AI capabilities — AI Apps as Operating Systems. As AI apps increasingly become the primary interface through which we interact with our devices, they’re essentially becoming operating systems in their own right. This shift suggests that the traditional boundaries between apps and operating systems may blur, with AI serving as the primary orchestrator of user interactions.

Looking Forward

While Android seems better positioned for an AI-first future, Google faces a significant conflict of interest. Their advertising-based revenue model, which benefits from user distraction and engagement, might prevent them from fully embracing a more efficient, AI-driven user experience. Meanwhile, Apple has the opportunity to catch up.

This transition mirrors a larger shift from app-centric to artifact-centric computing, where the focus is less on the apps themselves and more on the inputs and outputs they produce. The platform that best facilitates this shift while balancing user experience and business interests will likely emerge as the leader in this AI-first era.

Raw

I recently did something interesting, which is that I got an Android phone and basically started carrying two phones around. But my default iOS device, which I have largely been using as an operating system for the last 15 years or so, is now very selective about what apps I let notifications through on, and the Android is kind of for everything else. So I check it like three times a day, around meals usually, and that's about it. And I realize I swipe away 80-90% of the notifications on there. The reason it's good to have, versus just turning notifications off entirely, is that if you process your notifications like an email, what a lot of social media apps and other apps have basically done is they still give you better signal to noise and fewer ads on the notifications channel, even though they're still quite prolific with it. So that's how I use the Android phone. One surprising realization I had is that I was often using the Android phone a lot more, for a couple of reasons. One was the hardware. It actually has a slightly cheaper and therefore surprisingly more robust feel to use a Pixel 4a versus the iPhone Pro Max, because the iPhone has this premium but slippery feeling and cold body that feels like it would fall and break. The other reason for Android was that back when I, like a lot of people, decided in the mid-2010s to stick with iOS, interface design mattered a lot more, and iOS had slightly smoother animations. iOS was slightly more robust about not crashing, etc. And so those things were relevant. But with the rise of AI, one of the things I've realized is that phones have basically just become, in their ideal user experience, a dumb input-output box where you just say what you want to it, and then it uses only basic media to do clarifications, like a dialogue or image clarifications or things like that, and then just gives you the artifact you finally want and lets you go on with it.
So this is related, of course, to the other post I've written about artifact-centric computing taking over from app-centric computing. But in general, all of the things I used to not like about Android have sort of turned into features in a way. So for example, one subtle thing about Android is that their animations are slightly faster and a little less juicy feeling-wise. This used to be a problem back when I used to want to spend a lot of time really admiring the animations of apps and such. But at this point, I just largely want this device to be functionally useful, and so quickly doing what it's being asked for and getting out of the way, for like 95% of apps that aren't about leisure and entertainment, is actually a really good feature to have. The crashing problem seems to have largely gone away with Android. And so yeah, now I'm at this point where I thought I would use Android as a secondary operating system but now I kind of want to use it as a primary, and the only thing really holding me back is just integration with Apple Watch and AirPods and Vision Pro, and I would need to buy the Android equivalent of those devices to really get the full benefits of the ecosystem. Other than that, I think Google has better APIs and their photos and reminders and notes, and so in that way it's even more AI-friendly, in that their data will interoperate in a way that Apple Reminders is quite annoying about. And yeah, this is related to the note I made about AI apps as operating systems. I think if either of the current manufacturers stands a chance, it's more likely to be Android than iOS. However, the big risk with them is of course that Google makes so much money from ads and distracting users that I'm not sure they'd be willing to take a meaningful risk on that to deliver a good user experience. So the other possibility, of course, is that Apple catches up on its AI-first user experience quickly enough.