I left the headline as in the original, but I see this as a massive win for Apple. The device is ridiculously expensive, isn’t even on sale yet, and already has 150 apps specifically designed for it.

If Google did this, it wouldn’t get 150 dedicated apps even years after launch (and before its guaranteed demise), even if it were something super cheap, like being made of fucking cardboard.

This is something that, as an Android user, I really envy about the Apple ecosystem.

Apple: this is a new feature => devs implement it in their apps the very next day, even if it launches officially in 6 months.

Google: this is a new feature => devs ignore it, and apps only start to support it after 5-6 Android versions.

  • How do they expect developers to make apps for it without actually having it available? This is the dev kit. Yes, Apple fakes it in software so you can do the basics on a MacBook, but that’s not really testing. The device in your hands is testing.

    I recognize that it’s expensive. Being an early adopter isn’t cheap. But it’s honestly priced insanely aggressively. The resolution is a huge step up from everything else available. It’s the difference between 10 seconds of text making your eyes bleed and actually being able to work on a screen of text. You can’t get that alone for meaningfully less than the Vision Pro.

    The passthrough is the same deal. The alternatives have higher latency while also massively compromising image quality just to get anything passed through at all. And that’s before considering that it has a genuinely powerful SoC in the mix, plus cameras and processing of high enough quality to be controlled fully with gestures.

    There’s a reason all the tech enthusiast “media”, who regularly get their hands on a lot of these devices, talk about the rest like they’re nothing special but had their minds blown by the Vision Pro. It’s a huge step. And, because of Apple’s great development tools and relationships with big players, there will be a richer ecosystem than on any of the others. Solo developers already could, and have, made real apps with ARKit for phones. They’ll make real apps for Vision Pro, too.

    Other platforms are “more open”, but nobody democratizes app development like Apple. I understand the complaints about the arbitrary limitations they impose, and I don’t like all of them either, but the bottom line is that they really do make it perfectly reasonable for a single dev or small team to publish something high quality and support themselves with it, and all of that vibrant ecosystem is going to add a lot of value to Apple’s headsets.

    Just not day one. Because people need hardware to develop for.