Panel: Next Generation Inclusive UIs
Briefly

"We're going to do a Q&A panel. We're going to dig in a little bit more on how augmented, virtual, and extended, and mixed reality unlock the ability to integrate the power of computers more seamlessly into our physical three-dimensional world. Designing that user experience of these next generation UIs to be as inclusive as possible comes with a lot of challenges."
"If you really think about even in like AI development, you're not going to start with the most gigantic foundation model and the most amount of data. You're going to start off with something really small and simple and test it on 10 people. What I always encourage people to do since 2016 is to try as many different VR or AR, XR, demos as much as possible."
"When you're actually developing for a single experience, you're going to start with maybe just a few interactions at a single level and see if it works with actual people. If you get more than 60%, maybe you're in a good range. A lot of the time you'll actually try it in a bunch and you'll realize, this actually doesn't work. That way you're saving a lot of time when you're gameplay testing. That's the first thing I would say. It's pretty easy."
Gameplay testing for XR should begin with small, simple prototypes that focus on a few interactions to evaluate real user performance. Test early with roughly ten people to gather feedback and identify failures before scaling development. Explore many VR, AR, and XR demos to discover what works across different users and contexts. Aim for greater than a 60% success rate on core interactions to indicate acceptable usability. Early, iterative testing reduces wasted development time and surfaces accessibility issues, which is especially valuable for teams without prior game-development experience.
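To make the panel's rule of thumb concrete, here is a minimal sketch in Python of tallying per-interaction success rates from a small playtest and flagging anything below the 60% cutoff. The session-log structure, the function name, and the interaction names are hypothetical illustrations for this sketch, not part of any specific XR toolkit; only the 60% threshold and the ten-tester sample size come from the panel.

    # Minimal sketch: tally pass/fail playtest results per interaction.
    # Each session is a hypothetical dict mapping interaction -> succeeded.
    from collections import defaultdict

    SUCCESS_THRESHOLD = 0.60  # panel's suggested "good range" cutoff

    def flag_weak_interactions(sessions):
        """Return per-interaction success rates and those below threshold."""
        passes = defaultdict(int)
        attempts = defaultdict(int)
        for session in sessions:
            for interaction, succeeded in session.items():
                attempts[interaction] += 1
                passes[interaction] += int(succeeded)
        rates = {name: passes[name] / attempts[name] for name in attempts}
        weak = [name for name, rate in rates.items()
                if rate < SUCCESS_THRESHOLD]
        return rates, weak

    # Example: ten testers, two core interactions (hypothetical data).
    sessions = [
        {"grab_object": True,  "teleport": False},
        {"grab_object": True,  "teleport": True},
        {"grab_object": False, "teleport": False},
        {"grab_object": True,  "teleport": True},
        {"grab_object": True,  "teleport": False},
        {"grab_object": True,  "teleport": False},
        {"grab_object": False, "teleport": True},
        {"grab_object": True,  "teleport": False},
        {"grab_object": True,  "teleport": True},
        {"grab_object": True,  "teleport": False},
    ]

    rates, weak = flag_weak_interactions(sessions)
    print(rates)  # {'grab_object': 0.8, 'teleport': 0.4}
    print(weak)   # ['teleport'] -> rework this interaction before scaling up

With ten testers, grab_object clears the bar at 80% while teleport fails at 40%, which is exactly the kind of early signal the panel recommends acting on before building out the full experience.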
Read at InfoQ