We are happy to announce that Google Daydream VR headset and Gamepad support are landing in Servo. The current implementation is WebVR 1.1 spec-compliant and supports asynchronous reprojection to achieve low-latency rendering.
If you are eager to explore, you can download an experimental three.js Rollercoaster Demo (Android
APK) compatible with Daydream-ready Android phones. Put on the headset, switch on your controller, and run the app from Daydream Home or from a direct launch.
We have contributed to many parts of the Servo browser codebase in order to enable polished WebVR experiences on Android. Conveniently, our WebVR goals have also pushed forward improvements that are useful for other areas of the Android version of Servo.
VR Application life cycle
Daydream VR applications have to gracefully handle several VR Entry flows such as transitions between the foreground and background, showing and hiding the Daydream pairing screen, and adding the GvrLayout Android View on top of the view hierarchy. To manage the different scenarios we worked on proper implementations of native EGL context lost and restore, animation loop pause/resume, immersive full-screen mode, and support for surface-size and orientation changes.
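As a rough illustration of the life-cycle handling described above, here is a minimal Rust sketch of such a state machine. All names here are hypothetical and for illustration only, not Servo's actual types:

```rust
// Hypothetical sketch of the VR entry-flow state handling described above.
// None of these names come from Servo; they only illustrate the idea.

#[derive(Debug, PartialEq)]
enum AppState {
    Foreground,  // animation loop running, EGL context alive
    Background,  // loop paused, surface may be destroyed
    ContextLost, // EGL context lost, GL resources must be rebuilt
}

struct VrApp {
    state: AppState,
    frames: u64,
}

impl VrApp {
    fn new() -> Self {
        VrApp { state: AppState::Foreground, frames: 0 }
    }

    // Called from the Android activity callbacks (onPause/onResume).
    fn pause(&mut self) {
        self.state = AppState::Background;
    }

    fn resume(&mut self) {
        // If the EGL context died while backgrounded, it must be
        // restored (shaders recompiled, textures re-uploaded) first.
        if self.state == AppState::ContextLost {
            self.restore_context();
        }
        self.state = AppState::Foreground;
    }

    fn notify_context_lost(&mut self) {
        self.state = AppState::ContextLost;
    }

    fn restore_context(&mut self) {
        // Recreate GL resources here.
    }

    // The animation loop only ticks while in the foreground.
    fn tick(&mut self) {
        if self.state == AppState::Foreground {
            self.frames += 1;
        }
    }
}
```

The point of the sketch is simply that rendering must be gated on the app state, and that context restoration has to happen before the loop resumes.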
Servo uses a NativeActivity, in combination with android-rs-glue and glutin, as the application entry point. We realized that NativeActivity ignores the Android view hierarchy because it's designed to take over the window's surface and draw directly to it. The Daydream SDK requires a
GvrLayout view in the Activity's view hierarchy in order to show the VR scene, so things didn't work out.
Research into this issue shows that most people either get rid of
NativeActivity or bypass the limitation using hacky PopupWindow modal views. The
PopupWindow hack may work for simple views like a Google AdMob banner, but it causes complications with a complex VR view. We found a more elegant solution: release the seized window and inject a custom SurfaceView whose render callbacks are redirected to the abstract implementation in glutin.
This approach works great, and we can reuse the existing code for native rendering. We do, however, intend to remove NativeActivity in the future. We'd like to create a WebView API-based Servo component that will allow developers to embed their content in standalone Android apps or in WebView-based engine ecosystems such as Cordova. This will involve modifications to various Servo layers.
Thanks to the amazing job of both the Rustlang and Servo teams, the browser can be compiled with very few steps, even on Windows now. This is true for Android too, but the packaging step was still using ant combined with Python scripts. We replaced it with a new Gradle build system for the packaging step, which offers some nice benefits:
- A scalable dependency system that makes it possible to include Gradle/aar-based dependencies such as the GoogleVR SDK.
- Relative paths for all project libraries and assets instead of multiple copies of the same files.
- Product flavors for different versions of Servo (e.g. Default, VR Browser, WebView).
- Android Studio and GPU debugger support.
The new Gradle integration paves the way for packaging Servo APKs for the Android AArch64 architecture. This is important to get optimal performance on the CPUs of VR-ready phones. Most of the Rust crates that Servo uses can be compiled for AArch64 using the aarch64-linux-android Rust compilation target. We still, however, need to fix some compilation issues with a few C/C++ dependencies that use CMake, autotools, or plain Makefiles.
Other necessary improvements to support WebVR
There is a plethora of rough edges to polish as we make progress on the WebVR implementation. This is a very useful exercise that strengthens Servo's Android support, making it a compelling platform not only for WebVR content but for graphics-intensive experiences in general. To reach this milestone, these are some of the areas we had to improve:
- Discover and fix a buffer overflow bug in unsafe code in ANGLE's Rust bindings.
- Fix eglGetProcAddress for loading core OpenGL entry points.
- Add support for packed depth-stencil attachments.
- Address Android clipping issues on some WebRender GPU shaders.
- Improve EGLContextLost detection on Android.
- Fix renderbuffer/texture storage formats used on Android.
- Research long startup times.
Daydream support on Rust WebVR
These notable Android improvements, combined with the existing cross-platform WebVR architecture, provide a solid base for Daydream integration into Servo. We started by adding Daydream support to the dependency-free rust-webvr library.
The Google VR NDK for Android provides a C/C++ API for both Daydream and Cardboard headsets. As our codebase is written in Rust, we used rust-bindgen to generate the required bindings. We also published the gvr-sys crate, so from now on anyone can easily use the GVR SDK in Rust for other use cases.
The GoogleVRService class offers the entry point to access GVR SDK and handles life-cycle operations such as initialization, shutdown, and VR Device discovery. The integration with the headset is implemented in GoogleVRDisplay. Daydream lacks positional tracking, but by using the neck model provided in the SDK, we expose a basic position vector simulating how the human head naturally rotates relative to the base of the neck.
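To illustrate the idea, here is a minimal Rust sketch of a neck-model position. The offsets below are a commonly used approximation for how far the eyes sit above and in front of the neck pivot, not necessarily the exact values the GVR SDK uses:

```rust
// Illustrative neck model: because the eyes sit above and in front of
// the neck pivot, pure head rotation produces a small translation.
// The constants are a common approximation, not the GVR SDK's values.

const NECK_VERTICAL_OFFSET: f32 = 0.075; // meters above the pivot
const NECK_FORWARD_OFFSET: f32 = 0.080;  // meters in front of the pivot

// Rotate vector v by unit quaternion q = (x, y, z, w).
fn rotate(q: [f32; 4], v: [f32; 3]) -> [f32; 3] {
    let [qx, qy, qz, qw] = q;
    // t = 2 * cross(q.xyz, v)
    let t = [
        2.0 * (qy * v[2] - qz * v[1]),
        2.0 * (qz * v[0] - qx * v[2]),
        2.0 * (qx * v[1] - qy * v[0]),
    ];
    // v' = v + w * t + cross(q.xyz, t)
    [
        v[0] + qw * t[0] + (qy * t[2] - qz * t[1]),
        v[1] + qw * t[1] + (qz * t[0] - qx * t[2]),
        v[2] + qw * t[2] + (qx * t[1] - qy * t[0]),
    ]
}

// Simulated head position for a given orientation: rotate the eye
// offset around the neck pivot, then re-center vertically so a
// neutral pose reports a position near the origin.
fn neck_model_position(orientation: [f32; 4]) -> [f32; 3] {
    let offset = [0.0, NECK_VERTICAL_OFFSET, -NECK_FORWARD_OFFSET];
    let p = rotate(orientation, offset);
    [p[0], p[1] - NECK_VERTICAL_OFFSET, p[2]]
}
```

With an identity orientation this yields a small forward offset; turning the head 90° to the side swings that offset sideways, which is exactly the effect a 3-DoF headset fakes with this model.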
A Java GvrLayout view is required in order to get a handle to the gvr_context, apply lens distortion, and enable asynchronous-reprojection-based rendering. This adds some complexity to the implementation, because it involves adding both Java Native Interface (JNI) and Java code to the modular rust-webvr library. We created a Gradle module to handle the
GvrLayout-related tasks and a helper JNIUtils class to communicate between Rust and Java.
One of the complexities of this interoperation is that the JNI FindClass function fails to find our custom Java classes. This happens because, when native Rust threads are attached to a JavaVM, the JNI AttachCurrentThread call is unaware of the current Java application context and uses the system ClassLoader instead of the one associated with the application. We fixed the issue by retrieving the ClassLoader from the NativeActivity's
jobject instance and performing loadClass calls directly on it. I'm waiting for variadic generics to land in Rust so I can extend these JNI utils and move them into their own crate, providing an API similar to the one I implemented for the C++11 SafeJNI library.
In order to present the WebGL canvas in the headset, we first tried to use a shared texture_id, as we did in the OpenVR implementation. Unfortunately, the GVR SDK allows attaching only external textures that originate from Android MediaCodec or Camera streams. We opted for a BlitFramebuffer-based solution, instead of rendering a quad, to avoid implementing the required OpenGL state-change safeguards or context switching.
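In rough terms, the presentation step can be sketched as follows, with the GL calls hidden behind a small trait so the flow is visible without a real GL context. All names here are illustrative, not Servo's actual code:

```rust
// Sketch of the BlitFramebuffer-based presentation path. The GL calls
// sit behind a trait so the flow can be shown (and tested) without a
// real context; in real code these map to glBindFramebuffer and
// glBlitFramebuffer.

trait GlBlit {
    fn bind_read_framebuffer(&mut self, fbo: u32);
    fn bind_draw_framebuffer(&mut self, fbo: u32);
    fn blit_framebuffer(&mut self, src: (i32, i32), dst: (i32, i32));
}

// Copy the WebGL canvas framebuffer into the framebuffer provided by
// the GVR frame. A quad-based copy would require shaders, vertex state
// and blend/depth safeguards; a blit needs none of that.
fn present_frame(
    gl: &mut impl GlBlit,
    canvas_fbo: u32,
    canvas_size: (i32, i32),
    gvr_fbo: u32,
    gvr_size: (i32, i32),
) {
    gl.bind_read_framebuffer(canvas_fbo);
    gl.bind_draw_framebuffer(gvr_fbo);
    gl.blit_framebuffer(canvas_size, gvr_size);
}

// A stub backend that records calls, standing in for a real GL context.
struct Recorder {
    calls: Vec<&'static str>,
}

impl GlBlit for Recorder {
    fn bind_read_framebuffer(&mut self, _f: u32) { self.calls.push("bind_read"); }
    fn bind_draw_framebuffer(&mut self, _f: u32) { self.calls.push("bind_draw"); }
    fn blit_framebuffer(&mut self, _s: (i32, i32), _d: (i32, i32)) { self.calls.push("blit"); }
}
```

The design point is that the blit touches only the read/draw framebuffer bindings, leaving the rest of the GL state untouched.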
Once the Daydream integration was tested using the pure Rust room-scale demo, we integrated it into Servo pretty quickly; it fit perfectly into the existing WebVR architecture. WebVR tests ran well, except that
VRDisplay.requestPresent() failed on some random launches. This was caused by a potential deadlock during the very specific frame in which
requestAnimationFrame is moved from window to
VRDisplay. Fortunately, this was fixed with this PR.
While presenting, a switch to a separate EGLContext is avoided: the optimized VR render path draws only into the texture framebuffer attached to the WebGL canvas. This texture is sent to the GvrLayout presentation view when
VRDisplay.submitFrame() is called, and lens distortion is then applied.
Gamepad support is a necessity for complete WebVR experiences. As with the
VRDisplay implementation, integration with the vendor-specific gamepad SDKs is implemented in rust-webvr, based on the following traits and structs:
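The exact definitions live in the rust-webvr source; the following is a simplified sketch of the kind of trait involved, with names and methods that are illustrative and may differ from the real code:

```rust
// Simplified sketch of a vendor-gamepad abstraction as described above.
// Names and methods are illustrative; see rust-webvr for the real ones.

#[derive(Clone, Debug, PartialEq)]
struct VRGamepadState {
    connected: bool,
    buttons: Vec<bool>, // pressed state per button
    axes: Vec<f64>,     // e.g. touchpad x/y in [-1.0, 1.0]
}

// Implemented once per vendor SDK (Daydream, OpenVR, ...).
trait VRGamepad {
    fn id(&self) -> u32;
    fn name(&self) -> String;
    // Poll the vendor SDK for the latest controller state.
    fn state(&self) -> VRGamepadState;
}

// A mock backend, standing in for e.g. a Daydream controller.
struct MockGamepad {
    id: u32,
}

impl VRGamepad for MockGamepad {
    fn id(&self) -> u32 { self.id }
    fn name(&self) -> String { "Mock Controller".into() }
    fn state(&self) -> VRGamepadState {
        VRGamepadState {
            connected: true,
            buttons: vec![false; 3],
            axes: vec![0.0, 0.0],
        }
    }
}
```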
These traits are used by both the WebVR thread and the DOM objects in Servo's Gamepad API implementation.
Vendor-specific SDKs don't allow using the VR gamepads independently, so
navigator.vr.getDisplays() must be called first in order to spin up the VR runtimes and make the VR gamepads discoverable in subsequent
navigator.getGamepads() calls.
The recommended way to get valid gamepad state in all browsers is to call
navigator.getGamepads() every frame within your
requestAnimationFrame callback. We created a custom GamepadList container class with two main purposes:
- Implement an indexed getter method, which is used to hide gamepads according to privacy rules. The Gamepad spec permits the browser to return inactive gamepads (e.g.,
[null, <object Gamepad>]) when gamepads are available but in a different, hidden tab.
- Poll the latest gamepad state immediately in response to the
navigator.getGamepads() API call. This is a different approach from the one implemented in Firefox, where gamepads are vsync-aligned and the data is already polled when
requestAnimationFrame is fired. Both options are equally valid, though being able to query gamepads immediately enables a bit more flexibility:
- Gamepad state can be sampled multiple times per frame, which can be very useful for motion-capture or drawing WebVR applications.
- Vsync-aligned polling can be simulated by simply calling
navigator.getGamepads() at the start of the frame.
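For illustration, here is a minimal Rust sketch of a GamepadList-style container with a privacy-aware indexed getter. The names are hypothetical; this is not Servo's actual DOM code:

```rust
// Sketch of a GamepadList-like container whose indexed getter hides
// gamepads from non-visible tabs, as the Gamepad spec permits.
// Illustrative only; not Servo's actual implementation.

struct Gamepad {
    name: String,
}

struct GamepadList {
    slots: Vec<Option<Gamepad>>, // index positions stay stable
    tab_visible: bool,
}

impl GamepadList {
    fn len(&self) -> usize {
        self.slots.len()
    }

    // Indexed getter: a hidden tab sees `None` in every slot even
    // though the gamepad objects still exist.
    fn get(&self, index: usize) -> Option<&Gamepad> {
        if !self.tab_visible {
            return None;
        }
        self.slots.get(index).and_then(|slot| slot.as_ref())
    }
}
```

Keeping the slots stable while masking them per visibility is what lets the list return shapes like `[null, <object Gamepad>]` without disturbing gamepad indices.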
We are very excited to see how far we’ve evolved the WebVR implementation on Servo. Now that Servo has a solid architecture on both desktop and mobile, our next steps will be to grow and tune up the WebGL implementation in order to create a first-class WebVR browser runtime. The Gear VR backend is coming too ;) Stay tuned!