I left the headline as the original, but I see this as a massive win for Apple. The device is ridiculously expensive and isn’t even on sale yet, and it already has 150 apps designed specifically for it.

If Google did this, it wouldn’t get 150 dedicated apps even years after launch (and its guaranteed demise), even if it were something dirt cheap, like being made of fucking cardboard.

This is something that as an Android user I envy a lot from the Apple ecosystem.

Apple: this is a new feature => devs implement it in their apps the very next day, even if it doesn’t launch officially for 6 months.

Google: this is a new feature => devs ignore it, and apps only start supporting it 5-6 Android versions later.

  • Especially with the fake “eye” it creates for you on the front of the device. It’s creepy and dystopian af. Like we’re all sitting around wearing AR goggles, with fake eyes displayed on the outside so it still looks like we’re engaging with people around us.

    I mean, I can maybe see a use case for something like this where you’re prototyping a build, modelling something, etc. Especially if you have more than one person and they can all collaborate on and interact with the same objects. But I’m having a really hard time seeing other use cases. Gaming on macOS isn’t really a thing, as much as the latest Apple silicon releases would like you to believe; AAA devs aren’t porting their games to macOS. So what else? Watching movies? Browsing the web? Why would I spend nearly $4000 on a device to do that?

    I think Apple is generally really good at taking existing tech and pushing the envelope with it, and/or making it more usable and appealing to the masses. And even if this thing does represent a big step in XR, what’s the end goal? What’s the killer app? What’s the overall… vision for the product?