
Hello WebXR


We are happy to share a brand new WebXR experience we have been working on called Hello WebXR!

Here is a preview video of how it looks:

We wanted to create a demo to celebrate the release of the WebXR v1.0 API!

The demo is designed as a playground where you can try different experiences and interactions in VR, and as a smooth, friendly introduction to the VR world and its particular language for newcomers.

How to run it

You just need to open the Hello WebXR page in a WebXR-capable browser (or a WebVR one, thanks to the WebXR polyfill), such as Firefox Reality or Oculus Browser on standalone devices like the Oculus Quest, or Chrome 79 or later on desktop. For an updated list of supported browsers, please visit the ImmersiveWeb.dev support table.
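
If you want to check support programmatically before sending people to the page, a minimal sketch like the one below works; it assumes the webxr-polyfill npm package is installed and is not taken from the demo's source:

```js
// Hypothetical feature-detection sketch: load the WebXR polyfill so WebVR-only
// browsers still expose navigator.xr, then check for immersive VR support.
import WebXRPolyfill from 'webxr-polyfill';

new WebXRPolyfill();

async function checkVRSupport() {
  if (!navigator.xr) {
    console.log('WebXR is not available in this browser.');
    return false;
  }
  const supported = await navigator.xr.isSessionSupported('immersive-vr');
  console.log(supported ? 'Ready for VR!' : 'No immersive VR support.');
  return supported;
}

checkVRSupport();
```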

Features

The demo starts in the main hall, where you can find:

  • Floating spheres containing 360° mono and stereo panoramas
  • A pair of sticks that you can grab to play the xylophone
  • A painting exhibition where paintings can be zoomed and inspected at will
  • A wall where you can use a graffiti spray can to paint whatever you want
  • A Twitter feed panel where you can read tweets with the hashtag #hellowebxr
  • Three doors that will teleport you to other locations:
    • A dark room to experience positional audio (can you find where the sounds come from? A minimal positional audio sketch follows this list)
    • A room displaying a classical sculpture captured using photogrammetry
    • The top of a building in a skyscraper area (are you scared of heights?)
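
As an example of how that positional audio room could be approached with three.js, here is a minimal sketch; `camera`, `bellMesh`, and the sound file are placeholders, and this is not the demo's actual code:

```js
// Positional audio sketch: a sound attached to a mesh gets quieter and pans
// as the listener (the camera) moves around it.
import * as THREE from 'three';

const listener = new THREE.AudioListener();
camera.add(listener); // `camera` is your existing THREE.PerspectiveCamera

const sound = new THREE.PositionalAudio(listener);
new THREE.AudioLoader().load('sounds/bell.ogg', (buffer) => {
  sound.setBuffer(buffer);
  sound.setRefDistance(1); // distance at which the volume starts to fall off
  sound.setLoop(true);
  sound.play();
});

bellMesh.add(sound); // `bellMesh` is the object the sound should come from
```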

Goals

Our main goal for this demo was to build a good-looking, well-performing experience where you could try different interactions and explore multiple use cases for WebXR. We used the Quest as our target device to demonstrate that WebXR is a perfectly viable platform not only for powerful desktops and headsets but also for more modest devices like the Quest or Go, where resources are scarce.

Also, by building real-world examples we learn how web technologies, tools, and processes can be optimized and improved, which helps us focus on implementing practical, useful solutions that can bring more developers and content to WebXR.

Tech

The demo was built with web technologies, using the three.js engine and, in some parts, our ECSY framework. We also used the latest standards, such as glTF with Draco compression for models and Basis for textures. The models were created in Blender, and baked lighting is used throughout the demo.
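
As an illustration of that pipeline, here is a minimal three.js sketch for loading a Draco-compressed glTF model; the file paths and decoder location are assumptions, not the demo's actual asset layout:

```js
// Load a Draco-compressed glTF model with three.js.
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';

const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath('/draco/'); // folder containing the Draco decoder files

const gltfLoader = new GLTFLoader();
gltfLoader.setDRACOLoader(dracoLoader);

gltfLoader.load('models/hall.glb', (gltf) => {
  scene.add(gltf.scene); // `scene` is your existing THREE.Scene
});
```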

We also used third-party content: the photogrammetry sculpture comes from this fantastic scan by Geoffrey Marchal on Sketchfab, the sounds are public domain recordings from freesound.org, and the classic paintings are taken from the public online galleries of the museums where they are exhibited.

Conclusions

There are many things we are happy with:

  • The overall aesthetic and “gameplay” fits perfectly with the initial concepts.
  • The way we handle the different interactions in the same room, based on proximity or state, made everything easier to scale (a minimal sketch of this idea follows the list).
  • The demo was initially created using only three.js, but we successfully integrated some functionality using ECSY.
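
For illustration, a proximity check of that kind can be as simple as the sketch below; the data structures are hypothetical, not the demo's actual code:

```js
// Each frame, enable an interaction only when a controller is close enough
// to the interactive object it belongs to.
const GRAB_DISTANCE = 0.2; // metres

function updateInteractions(controllers, interactables) {
  for (const controller of controllers) {        // THREE.Object3D controller grips
    for (const item of interactables) {          // { mesh, highlighted, ... } entries
      const distance = controller.position.distanceTo(item.mesh.position);
      item.highlighted = distance < GRAB_DISTANCE;
      // per-object state machines (idle -> hover -> grabbed) can hang off this flag
    }
  }
}
```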

And other things that we could improve:

  • We released fewer experiences than we initially planned.
  • Overall, the tooling is still a bit rough and we need to keep improving it:
    • When something goes wrong it is hard to debug remotely on the device. This is even worse if the problem comes from WebGL. ECSY tools will help here in the future.
    • State of the art technologies like Basis or glTF still lack good tools.
  • Many components could be designed to be more reusable.

What’s next?

  • One of our main goals for this project is also to have a sandbox that we could use to prototype new experiences and interactions, so you can expect this demo to grow over time.
  • At the same time, we would like to release a template project with an empty room and a set of default VR components, so you can build your own experiments using it as a boilerplate.
  • Improve input support by using the great WebXR Gamepads Module and the WebXR Input Profiles (see the sketch after this list).
  • We plan to write a more technical postmortem article explaining the implementation details and content creation.
  • ECSY was released after the project started so we only used it on some parts of the demo. We would like to port other parts in order to make them reusable in other projects easily.
  • Above all, we will keep investing in new tools to improve the workflow for content creators and developers.
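
As a rough idea of what that input support looks like, here is a sketch that reads buttons and axes through the WebXR Gamepads Module; the button and axis indices are assumptions and vary per controller profile:

```js
// Poll controller input from an active XRSession via the Gamepads Module.
function pollInput(xrSession) {
  for (const source of xrSession.inputSources) {
    if (!source.gamepad) continue;               // exposed by the Gamepads Module
    const trigger = source.gamepad.buttons[0];   // index 0 is usually the trigger
    const [x, y] = source.gamepad.axes.slice(2, 4); // thumbstick on many profiles
    if (trigger.pressed) {
      console.log(`${source.handedness} trigger pressed, stick at ${x}, ${y}`);
    }
  }
}
```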

Of course, the source code is available for everyone. Please give Hello WebXR a try and share your feedback or issues with us on the GitHub repository.