
hello

This is my first post as I get things started on this little site about Apple Vision Pro!

I'm looking forward to learning how this new tech will work, as well as building apps for it in SwiftUI.

The classic Apple Hello text in Vision Pro
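In that spirit, here's roughly where I expect to start. This is just a sketch of a bare-bones visionOS app (the struct name is a placeholder of mine): a SwiftUI WindowGroup, which on Vision Pro appears as a floating window in your space.

```swift
import SwiftUI

// A bare-bones visionOS app. On Vision Pro, a WindowGroup
// appears as a resizable window floating in the user's space.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        WindowGroup {
            Text("hello")
                .font(.extraLargeTitle) // a visionOS-specific text style
                .padding()
        }
    }
}
```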

Why Apple Vision Pro?

I'm a web designer and developer and have been working on the web for over 20 years. When I began creating websites, CSS was brand new and web browsers were far more primitive than they are today, but the platform was new and exciting.

Being able to design, create, and publish my ideas for the world to see, all from the comfort of my sofa, felt revolutionary. The web has since grown to become simultaneously more important and more mundane. It's still possible to create random little sites like this one, but few people do, and it feels less like a wild, open web and more like one dominated by a handful of large, homogeneous platforms.

The iPhone launched almost 16 years ago and has created its own wonderful ecosystem of creativity. It introduced the first good implementation of a touch-based interface and made the internet available to everyone in a pocket-sized device.

Building apps was also within reach of a solo developer. The tools for building on iOS have continued to get better and better, while the technology has become more refined, which leads to what I hope will be another exciting stage in technology: spatial computing.

Spatial computing vs VR

Most headsets take a primarily virtual reality (VR) approach, in which the user is placed into an entirely simulated environment. As the technology has improved, this has produced some amazing, immersive games and simulations.

Augmented reality (AR) is trickier to support. This is when the real world remains visible and a computer overlays items on top of it. Doing so requires cameras, a lot of smart code to adjust and shape the camera input so it looks right, and sensors to detect where the headset is and where the user is looking.

With all these pieces in place, the headset can render visual elements within the video stream and make it look like the computer's output and the real world are intermingled. The result, if done well, lets us explore computing beyond the constraints of screens.
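On the developer side, visionOS exposes this blend through SwiftUI scene types. Here's a hedged sketch (the scene ID, struct name, and cube are placeholders of mine) of an ImmersiveSpace using the mixed immersion style, which keeps passthrough visible while RealityKit renders virtual content into the room:

```swift
import SwiftUI
import RealityKit

// An ImmersiveSpace with the .mixed style keeps the real world
// visible via passthrough while the app adds virtual content.
struct MixedRealityScene: Scene {
    var body: some Scene {
        ImmersiveSpace(id: "mixed-demo") {
            RealityView { content in
                // A simple virtual object: a 20 cm cube floating
                // about a metre in front of the user.
                let cube = ModelEntity(
                    mesh: .generateBox(size: 0.2),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
                )
                cube.position = [0, 1.2, -1] // metres, relative to the space's origin
                content.add(cube)
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}
```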

As much as laptops are amazing devices, having a seemingly infinite canvas within which 3D computing elements can be placed is a big step forward.
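On Vision Pro, one concrete form that canvas takes is the volumetric window: a bounded 3D region the user can place anywhere, alongside ordinary 2D windows. A minimal sketch, assuming a USDZ asset named "Globe" bundled with the app (the asset and IDs are placeholders of mine):

```swift
import SwiftUI
import RealityKit

// A volumetric window: a bounded 3D region for a model, which the
// user can position in their space like any other window.
struct GlobeVolume: Scene {
    var body: some Scene {
        WindowGroup(id: "globe") {
            Model3D(named: "Globe") { model in
                model
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } placeholder: {
                ProgressView() // shown while the model loads
            }
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```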

Apple's difference

Compared to other platforms such as the Meta Quest or PlayStation VR, Apple has taken a different approach.

Where those headsets are designed and marketed primarily as gaming devices, Apple has promoted the Vision Pro as an entertainment and creative productivity device.

I'm sure there'll be games, and maybe I'll try making some. However, with no controllers and a much higher price, I'm hoping the value Vision Pro brings will be about much more than games.

It's like when laptops were dull, beige, and focused on office productivity, and Apple promoted its laptops as creative tools. The Vision Pro is being positioned as a device that will bring spatial computing and be a creative tool, rather than just a gaming machine.

Cutting-edge technology

Apple's Vision Pro combines the highest-fidelity passthrough video with state-of-the-art eye tracking and a (hopefully) expansive array of applications.
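As I understand it, that eye tracking reaches developers only indirectly: apps don't receive raw gaze data; instead, the system highlights whatever the user is looking at, and a pinch confirms the selection. A minimal sketch (the view and its text are placeholders of mine):

```swift
import SwiftUI

// On visionOS the system maps gaze to hover effects out of process:
// the view highlights when the user looks at it, and a pinch fires
// the tap gesture. The app never sees where the eyes actually are.
struct LookToSelectCard: View {
    var body: some View {
        Text("Look at me, then pinch")
            .padding()
            .glassBackgroundEffect() // standard visionOS material
            .hoverEffect()           // opt in to the gaze highlight
            .onTapGesture {
                print("Selected by look + pinch")
            }
    }
}
```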

So what's this blog about?

I'm interested in exploring what this new platform offers from a developer's perspective. I'd like to ask questions such as: how do we build spatial experiences, what makes a good Vision Pro app, and how do we find inspiration as this platform emerges?

I have a lot to learn and I find blogging about it can be a great way to reinforce my learning. It's fun too.

Time will tell if the device is worth the asking price. Developers will explore the capabilities and build apps to see what it can do. I'm excited to be in a position to learn how to build apps for this platform, and I'll continue documenting this journey here.