
"I'm going to rewind back like 8 years to 2016. One of my favorite experiences, actually wasn't because it was in the Microsoft HoloLens, not the greatest field of view, but it was a data visualization that I saw in Microsoft HoloLens by IBM. This is my friend, Rostin, on my left, you could see his hands doing a lot of the hand tracking and you could interact with the data visualization."
"Fast forward to 2024, this is me at a hackathon with Apple Vision Pro. The advances in hardware that we see both from Meta and Apple is really just pass-through. That's what we call it, mixed reality or AR. You're actually able to code in editor. How many people have an Apple Vision Pro? You can code with Runestone, with Xcode, with any IDE of choice. Ergonomically not the best because it's still a little bit heavy, but it is possible now."
Early AR/VR work in 2016 included IBM data visualizations on the Microsoft HoloLens that combined hand-tracking interaction with a split-screen Jupyter Notebook. Direct in-headset editing was limited, requiring external manipulation and prompting the engineering of ETL-style pipelines to move data and machine-learning visualizations into headsets. By 2024, hardware advances from Apple (Vision Pro) and Meta emphasize pass-through mixed reality and enable development within standard editors and IDEs such as Runestone and Xcode. Ergonomic constraints persist due to headset weight, but current hardware supports the creation of interactive, AI-enhanced mixed-reality applications.
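To make the ETL-style pipeline idea concrete, here is a minimal sketch in Python: it extracts a tabular dataset, aggregates it, and writes a JSON payload that an in-headset visualization client could load. The file names, column names, and output schema are hypothetical illustrations, not details from the talk.

```python
import json
import pandas as pd

def extract(csv_path: str) -> pd.DataFrame:
    """Extract: load raw tabular data from disk (path is a placeholder)."""
    return pd.read_csv(csv_path)

def transform(df: pd.DataFrame) -> list[dict]:
    """Transform: aggregate rows into small records a headset viz can render."""
    grouped = df.groupby("category", as_index=False)["value"].mean()
    return grouped.to_dict(orient="records")

def load(records: list[dict], out_path: str) -> None:
    """Load: write a JSON payload for the in-headset visualization client."""
    with open(out_path, "w") as f:
        json.dump({"points": records}, f, indent=2)

if __name__ == "__main__":
    # Column names ("category", "value") and file paths are assumptions for
    # illustration; a real pipeline would match the headset app's data schema.
    frame = extract("metrics.csv")
    load(transform(frame), "viz_payload.json")
```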