All posts by VRguy

July 2016 developer updates


Key additions to OSVR capabilities since the last developer update:

  • Enhanced support for OSVR devices inside SteamVR including direct mode and distortion correction
  • Support for the new OSVR HDK 2, as well as new support for the HTC Vive
  • Native support for OSVR inside Unreal 4.12
  • OSVR support inside WebVR

See below for details




The SteamVR-OSVR driver allows gamers to play SteamVR games on any supported OSVR hardware. A number of features and improvements have been added to the driver recently:

  • The driver now works with the latest released version of SteamVR (v1.0.2). You should no longer need to use the old beta release.
  • Distortion correction is now provided.
  • Direct mode is supported on Windows if you have an AMD or NVIDIA graphics card (but not NVIDIA Optimus or NVIDIA in a laptop) with up-to-date drivers. First, run DisableDirectMode.exe to put the HDK in extended mode. This allows the SteamVR driver to see the HDK and determine its position relative to your other monitors. Start SteamVR and, from the menu, select Devices → Direct mode. SteamVR will restart with the HDK in direct mode.
  • Along with direct mode, we now detect your monitors and automatically set the HDK’s position properly. This means you should no longer need to edit the osvr_server.json configuration file to set the position.
  • Room setup (standing mode) should now complete successfully. Completing the room setup should also fix the head-stuck-on-the-floor problem.
  • The tracking camera is now recognized by SteamVR.



The OSVR team has completed the first step in integrating OSVR into WebVR. The patch has landed in Mozilla Firefox and is officially scheduled for release in the 49.0a1 milestone.
WebVR's goal of remaining device-agnostic aligns perfectly with the OSVR philosophy; with OSVR integrated, WebVR gains support for dozens of additional devices.

With the current integration, OSVR-WebVR allows users to experience WebVR with any device supported by OSVR. This includes HMDs such as OSVR HDK, Sensics, Vuzix, Oculus and HTC, numerous position and orientation trackers, controllers such as the Razer Hydra, HTC Vive Controller, the Nod Backspin and others. One of the most important goals of WebVR is interoperability, meaning that users can experience content across a wide spectrum of hardware.

The following capabilities are supported:

  • Device detection
  • Positional and orientation tracking
  • Display configuration, which allows users to demo WebVR in extended mode.

Upcoming improvements: Now that the initial OSVR integration with Firefox is complete, WebVR could potentially use all the available OSVR interfaces, such as eye tracking, plugins such as augmented reality target recognition, and more. Hundreds of VR devices that were not supported by WebVR are now available by virtue of the OSVR integration. Moreover, as new devices are added to OSVR, they automatically become available to WebVR. The remaining step is for WebVR to connect its API to these additional OSVR interfaces to bring out all the benefits OSVR has to offer.

The OSVR team is working to enhance this integration by incorporating the OSVR Render Manager to enable direct mode, asynchronous time warp, distortion correction and more across a range of HMDs and graphics cards, thereby improving latency and fidelity of the experience.

Useful links:

Instructions for trying OSVR in WebVR
Nightly build downloads
The WebVR 1.0 API Spec
Bug tracker for WebVR 1.0 implementation


  • OSVR is now natively integrated in Unreal 4.12. Just enable it in the Virtual Reality section of the project plugins window.
  • Updated tutorial video on YouTube for Unreal 4.12, using the built-in plugin:
  • Fixed projection matrix calculation to support off-center projections.
  • Fixed flickering black mesh issue due to an incorrect render target texture format.
  • Updates for Unreal 4.12 compatibility.
  • VR headset should now go to black after exiting VR Preview mode in the editor when running in direct mode.


  • Added an OSVR editor window accessible through the Unity menu bar. It includes:
    • Links for launching utilities (TrackerView, Reset Yaw, DirectMode Enable/Disable, etc.) located in the OSVR Runtime/SDK directory.
    • Ability to launch any osvr_server.exe and config files from the Unity editor. Useful for faster testing of various server/render configurations.
    • Links to installers and documentation.

  • Added OSVR Server Autostart feature for Windows Standalone platform. This can be enabled or disabled on the ClientKit prefab for now, until it becomes a server config option.
  • Bug fix for crash when switching scenes in direct mode.





OSVR-Central, a new application, serves as the main hub for controlling OSVR. It includes the following features:

  • Device Detection (hotplugging)
  • Ability to detect and launch OSVR server
  • OSVR Tools
  • Developer Tools
  • Access to RenderManager utilities and examples.

We plan to add auto-update features and on-demand plugin downloads soon. OSVR-Central also provides easy access to OSVR-Config and is included in the latest runtime and SDK.

Device Detection: OSVR continues to add support for new devices (see the section below about the new LaputaVR HMD), and we have added a hotplugging feature to detect when a user plugs in or removes an OSVR-compatible device. This is one of the steps we are taking to improve the ease of use of OSVR as we work toward a plug-and-play experience.

Developer Tools: A variety of useful tools are available for developers, such as RenderManager tools and demos, and other OSVR-related tools. OSVR-Central organizes them for easy access. This feature is available with the SDK.


  • OSVR-Config is now included in the OSVR runtime installer.
  • Some improvements to the display and sample configuration layouts.




With the release of HDK 2, users should note the new display descriptor and sample config files distributed with the OSVR SDK. We recommend this configuration as a good starting point for orientation-only tracking with direct mode support, distortion correction, and client-side predictive tracking.

One issue Windows 10 users may notice is that when a new display is detected, the default magnification is set to 200%, resulting in an image that looks “zoomed in”. When using the HDK 2 in extended mode, the “Change the size of text, apps, and other items” option in Windows Display Settings should be set to 100%.


The OSVR-Vive plugin adds support for the HTC Vive (and Vive Pre) HMD, tracking system, and hand controllers to OSVR. It has been released and updated to work with the latest (at this writing) build of SteamVR (1467410709). When using the Vive with OSVR, the following are all functional: tracking of the HMD and controllers within the calibrated “room setup” space, buttons and touchpads, the proximity sensor, and custom per-unit factory-calibrated display/distortion correction in both direct mode and extended mode.

Pre-compiled binaries are available for Windows. Thanks to community contributions, the plugin also builds and runs on Linux and macOS.

This is not to be confused with the SteamVR-OSVR module. The SteamVR-OSVR module is an add-in for SteamVR that allows any OSVR-supported HMD to be used with SteamVR applications. The OSVR-Vive plugin is a driver for OSVR Server that exposes the Vive hardware to OSVR applications.

Access OSVR-specific usage instructions for VIVE here
Access source code here


  • Head-tracking supported with all versions of Oculus SDK from 0.4.4 to 1.5.0.
  • Rendering is supported only with Oculus SDK 0.7 or older.


OSVR adds support for a new HMD – LaputaVR Hero. Together with the OSVR plugin, this HMD provides:

  • Orientation tracking
  • 110 degree FOV
  • 2560 x 1440 resolution
  • Distortion correction
  • No additional drivers/software required to get it running
  • Native VRPN driver


Under the hood



  • Completed implementation of C API for the OpenGL rendering library, including new example program.  Also includes utility functions to generate color and depth render-texture targets.  This paves the way to complete the port of RenderManager to non-Windows (and thus non-DirectX) platforms.
  • Direct3D C API updated to compile cleanly from both C and C++.  Direct3D C API example program added.
  • Reworked OpenGL header file structure to support a single set of headers on all platforms, including the ability for the application to decide which OpenGL header files to use.
  • For clients using CMake to build against OSVR-RenderManager, the osvrRenderManagerConfig.cmake file has been updated:
    • The osvrRenderManager target has been moved to the osvrRenderManager namespace. This means if you’ve written target_link_libraries(mytarget PUBLIC osvrRenderManager) in the past, you should update it to use osvrRenderManager::osvrRenderManager instead.
    • A new osvrrm_install_dependencies(<install_directory>) function has been added which will install the RenderManager library and all its dependency libraries to the specified path. This is useful in Windows for creating more portable installations.
  • Added alpha support for RenderManager in the OSVR-Android-Build project.  The current RenderManager master branch can now be compiled on this platform.
  • Merged a contribution from Steffen Keiss, who provided an approach to using RenderManager with windowing libraries other than SDL, along with a Qt-based example of how to do this.
  • Sharing OpenGL contexts between the application and RenderManager, including new example program showing how to do it.
  • Beta version of distortion correction for the OSVR HDK 2 compiled into the library. See sample display descriptor using this distortion mesh in “displays/OSVR_HDK_2_0.json”.
  • Exposes distortion-correction machinery for use outside of RenderManager.
  • Merged Cailei’s VRGate HMD vendor ID.
  • Added projection matrix calculation for Unreal.  Generalized D3D projection matrix calculations to support off-axis projections on both axes.  Repaired OpenGL off-axis projection matrix calculation.
  • Added renderer-independent call to blank the screen, as requested by game engines to handle transitions between scenes.  The first renderer-agnostic example program (SolidColor) provided to show how to use it.
  • Fixed crash when constructing a new RenderManager after having destroyed a previous one.
  • Performance optimizations in the render-mesh construction and in the DirectMode rendering pipeline.  Reduction of image-tearing artifacts in DirectMode.
  • Added support for additional color buffer formats.
  • Saving and restoring D3D state during the final render pass to avoid interfering with application settings.
  • Merged contribution from Matthew Dutton, who reduced the number of vertices calculated in the distortion mesh by a factor of six, greatly improving startup time.
  • Fixed crash issues in some cases for Asynchronous Time Warp.  Also waiting for rendering completion in ATW to reduce tearing.
  • No longer has the side effect of pulling a display out of DirectMode if the config file says that it should be opened in extended mode.
  • Avoids creating callback-based render targets when they are not being used, reducing GPU memory usage.
  • Synchronized the passing of multiple eyes to an ATW buffer, ensuring display of consistent views for both.
  • Enabled applications to operate via buffer-copy when using ATW, rather than requiring them to construct two sets of buffers and alternate presentations.  This enables apps to use ATW without modification, though it does incur a slight rendering time increase.
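For clients updating their CMake usage per the osvrRenderManagerConfig.cmake changes noted above, a minimal consumer project might look like this (the project and target name `myapp` is illustrative; the package, namespace, and function names are the ones described above):

```cmake
# Locate an installed OSVR-RenderManager package.
find_package(osvrRenderManager REQUIRED)

add_executable(myapp main.cpp)

# The imported target now lives in a namespace; the bare
# osvrRenderManager name no longer works.
target_link_libraries(myapp PUBLIC osvrRenderManager::osvrRenderManager)

# On Windows, install the RenderManager library and its dependencies
# alongside the application for a portable layout.
install(TARGETS myapp RUNTIME DESTINATION bin)
osvrrm_install_dependencies("${CMAKE_INSTALL_PREFIX}/bin")
```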





  • High-performance logging framework now built-in, logging to console as well as to file, with client and plugin APIs to log to those same log sinks – see below.
  • Build compatibility with Boost 1.61.0 verified and build updated accordingly.
  • Adjust Windows timer frequency when a client is connected, for lower, consistent latency.
  • Added configuration files for OSVR HDK 2


Logging interface

OSVR-using apps will now log not just to the console/terminal (if they did before), but also, when possible, to a file, in more detail. Log files are named using the application name (“osvr” as the fallback) plus the date and time, and are rotated daily if needed (if you run OSVR Server for longer than a day, for instance).  Here is an example of the logging interface running on a Mac:

  • The location for log files is platform-specific:
    • On Windows, they’re in %LOCALAPPDATA%\OSVR\Logs (for example, C:\Users\Ryan\AppData\Local\OSVR\Logs )
    • On Linux and other *nixes, the XDG directory standard is followed so they are placed preferably in $XDG_CACHE_HOME/osvr/logs, or if that environment variable is empty or not defined, $HOME/.cache/osvr/logs
    • On macOS, they are placed in your home or user directory, under Library/Logs/OSVR
    • On Android, logging output is directed to the standard Android logging interface (“logcat”) rather than file or console.
  • Where console output is visible, you’ll see information on where the log file is placed and its base filename on-screen at the top of the output.
  • Log entries are marked with a timestamp, a “severity”, and a log source in [square brackets].
  • For developers:
    • Both PluginKit and ClientKit provide logging functions that let you specify a severity and a message string.
    • Messages you log are automatically associated with your plugin or app via the log source tag when you provide a valid context as the first argument to the function (in the C interface). Failing to do so will not cause a crash or lose the message, but the message will show up with an ugly note about a missing context.
    • Existing log messages from within OSVR functionality, but triggered by your code, will also be associated with your application or plugin by inclusion of your ID in the log source tag.
    • Now would be a great time to double-check your application IDs and plugin IDs (reverse domain-name format!)
    • When you can, port any existing direct output to the new logging functionality, to provide a unified, improved experience for both users and developers.
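As a rough sketch of the naming scheme described above (the helper and its exact format string are hypothetical, not OSVR's actual implementation):

```python
from datetime import datetime
from typing import Optional

def log_filename(app_name: Optional[str], when: datetime) -> str:
    """Build a log file name from the application name ("osvr" as the
    fallback) plus the date and time, as described above.
    The format string here is illustrative only."""
    base = app_name if app_name else "osvr"
    return f"{base}_{when:%Y-%m-%d_%H-%M-%S}.log"

print(log_filename("MyVRApp", datetime(2016, 7, 1, 9, 30, 0)))
# MyVRApp_2016-07-01_09-30-00.log
print(log_filename(None, datetime(2016, 7, 1, 9, 30, 0)))
# osvr_2016-07-01_09-30-00.log
```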


  • getBinaryLocation() (used to find plugins, etc.) fixed on macOS; this should improve behavior when running the server by typing ./osvr_server
  • Build fixes for using libc++ on non-Mac platforms such as Android.
  • Improved display descriptor and distortion correction for the DK2.
  • Path normalization for server autolaunching/server locate.
  • Fixed variable naming to improve compatibility with some configurations.
  • Fix Android build issues related to gtest.
  • Fixed angular velocity transformations.
  • Various compiler warning fixes.




Release 1.2.7:

  • Updated device metadata packages to clarify that a device may be an HDK 1.3 or 1.4, since they are indistinguishable from the data that the device metadata system has access to.
  • Minor .inf updates:
    • USB serial port driver now installs on Windows 10, wrapping the bundled Windows 10 USB serial port driver and providing a name for it.
    • IR camera driver modified to fix a corner case where the displayed name in the Device Manager would not always be overridden by the .inf file on Windows 10.

Tips and Tricks


Super-sampling and Sub-sampling

Recently, attention has been drawn to “hidden settings” in Oculus software and SteamVR that enable supersampling. Supersampling renders a larger, more detailed initial buffer than is strictly needed to map 1-to-1 per pixel in the final output image, so every pixel in the output is sampled from more rendered data. It requires more graphics performance, but may lead to a perceived visual improvement – it’s a form of spatial anti-aliasing, essentially supersampling anti-aliasing (SSAA), of which MSAA is a lower-overhead variant.

OSVR RenderManager has had this capability for quite some time, as well as its inverse, subsampling, which renders fewer pixels and upscales to compensate for lower performance graphics processing. The setting is easily accessible in the RenderManager configuration, usually a separate file referenced from your OSVR server config file. The value you’d want to change is called renderOversampleFactor in the renderManagerConfig object. The default value is 1.0 (one pixel in the render buffer per pixel in the output), and the values work just like those discussed for SteamVR. Larger values render more pixels than necessary and blend/sample for the output, smaller values render fewer pixels than 1-to-1 and upscale, potentially improving performance at the expense of rendering sharpness.
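A minimal sketch of the relevant portion of such a RenderManager config file (all other fields omitted; the value shown is illustrative):

```json
{
  "renderManagerConfig": {
    "renderOversampleFactor": 1.5
  }
}
```

Values above 1.0 supersample; values below 1.0 subsample.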

Changing this setting and restarting your OSVR server will affect all native OSVR applications using RenderManager. Notably, it will not affect SteamVR as SteamVR does not use the OSVR RenderManager for distortion correction, client-side prediction, time-warp, and direct display output; however, the “Vive” supersampling configuration change should actually work with devices used through the SteamVR-OSVR plugin as well.

Client-side vs Server-side Predictive Tracking

In the context of AR and VR systems, predictive tracking refers to the process of predicting the future orientation and/or position of an object or body part. Prediction can run on the server or the client, but only one of those options should be enabled for best performance. By default, server-side prediction is enabled because it works on every machine running OSVR. However, we recommend using client-side prediction on machines that support it. This document gives an overview of predictive tracking and describes the steps to enable or disable client- and server-side prediction.
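The core idea can be sketched in a few lines (a single-axis, small-angle illustration; real implementations extrapolate the full 3D orientation, typically from angular velocity):

```python
import math

def predict_yaw(yaw_rad, yaw_rate_rad_s, dt_s):
    """Extrapolate the current yaw by the measured angular velocity over
    the prediction interval dt_s (single-axis sketch; real predictive
    tracking does this for the full 3D orientation)."""
    return yaw_rad + yaw_rate_rad_s * dt_s

# Head turning at 90 deg/s; predict 20 ms ahead to hide pipeline latency.
predicted = predict_yaw(math.radians(30.0), math.radians(90.0), 0.020)
print(math.degrees(predicted))  # ≈ 31.8 degrees
```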

Additional Contributions from the OSVR Community

  • Steve Le Roy Harris has created a number of new OSVR plugins:
    • OSVR-OpenHMD is a wrapper around OpenHMD which provides native driver support for the Oculus DK1 and other HMDs.
    • OSVR-KinectV2 provides joint position and orientation data from the Kinect for Xbox One.
    • OSVR-Wiimote uses the wiiuse library to support up to four Wii Remotes + Nunchucks.
    • OSVR-Fusion combines position data from one plugin with the orientation data from another plugin to present a fully-fledged tracker to OSVR clients.

The culmination of Steve’s work (video) is the ability to use Wii remotes, an Oculus DK1, and a Kinect to play games that require a tracked HMD and tracked controllers.

Additional Resources

As a reminder, we’ve set up developer chat rooms at Gitter, and they are tied to the GitHub projects.
The list of all rooms is here:
Some existing rooms:

The list of supported devices, operating systems and engines is here
The OSVR documentation repository is here

VR tutorials written by the experts at Sensics on key topics such as eye tracking, optical design, time warping, foveated rendering and more are here. To see only the tutorials, click on ‘key concepts’ on the right side of that page.

Want to Participate?

Interested in contributing? Start here:

Current areas of high-priority “help wanted” are improving RenderManager on non-Windows platforms, tools to simplify the user experience, support for additional devices, and a video “compositor”

Any questions or issues? email

April 2016 Developer Updates

I wanted to provide a brief update on new repositories, features, and major enhancements for the OSVR software platform.


  • New augmented reality plugin OSVR-ARToolkit
  • New initial release of interactive configuration utility OSVR-Config
  • One-click installers for end-users and developers
  • Updates to HDK firmware and OSVR Control software to support persistence modes and tracking modes.
  • Many improvements and bug fixes in the Github repositories

Details below.


In collaboration with DAQRI (see announcement here), a new plugin has been created that connects the popular ARToolKit open-source augmented reality library with OSVR. The OSVR-ARToolkit repository is here.

ARToolKit is a tracking library for the creation of augmented reality applications. It uses video tracking to calculate the real camera position and orientation relative to square physical markers or natural feature markers in real time. ARToolKit was acquired by DAQRI, which then open-sourced what used to be called the professional version of ARToolKit. The OSVR plugin receives imaging reports from video plugins, such as OpenCV, and performs marker detection. This initial implementation tracks one pattern, patt.hiro (located in the Data folder), and sends a tracker pose with the position and orientation of the marker on the path /me/hands/left, which can then be used in all the OSVR-supported game engines.

Community contributions connecting additional features of ARToolKit with OSVR are very welcome.


The OSVR-Config utility (binaries can be downloaded from the “Get OSVR” page; the source code repository is here) is an interactive utility that builds the JSON configuration file for the OSVR server and implements other features such as:

  • Enable/disable manually loaded plugins.
  • Look through and select one of a number of sample configurations.
  • Select a display configuration from a list.
  • Configure common RenderManager settings
  • Launch the currently selected OSVR Server.
  • Enable/Disable direct mode.
  • Launch the Tracker Viewer utility to test your configuration.

This is an early release and we would very much welcome feedback.


One-Click Installers

A key area of improvement identified in the recent developer survey (see here) was simplifying the installation and configuration procedures. In addition to the OSVR-Config utility, we are introducing new one-click installers, in 32- and 64-bit versions, for both developers and end-users.

The installers can be found here for the runtime version and here for the developer version. Both include the RenderManager files, so a separate RenderManager install is no longer necessary.

Firmware and OSVR Control Improvements

  • Added the ability to set persistence levels. Use the OSVR Control software, which you can download from here. The persistence level (see image below) is stored in the HDK until changed again. This feature requires upgrading the HDK firmware, which can also be done with the OSVR Control software.
  • Added the ability to choose Rotation Vector or Game Rotation Vector for the tracking mode. Rotation Vector senses magnetic north and does not drift. Game Rotation Vector is similar, except that it does not use the geomagnetic field; therefore it does not point north and drifts a bit over time. The advantage of the Game Rotation Vector is that relative rotations are more accurate and are not impacted by magnetic field changes. If using the Rotation Vector, it is recommended to calibrate the HMD any time it is moved to a different environment. To calibrate, move the HMD in a figure-eight motion for a few seconds and then place it on a flat surface (e.g., a desk) for 10 seconds. This will cause the calibration to be stored in the HDK.
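The drift described above comes from integrating small gyro errors with no absolute (magnetic) heading reference; a toy illustration (the bias value is made up):

```python
def integrated_yaw_error(bias_deg_s, seconds):
    """Yaw error accumulated by integrating a constant gyro bias with no
    absolute heading reference (the Game Rotation Vector case).  The
    magnetometer-backed Rotation Vector can correct this drift away."""
    return bias_deg_s * seconds

# A (made-up) 0.1 deg/s residual bias accumulates noticeably over 10 minutes:
print(integrated_yaw_error(0.1, 600))  # ~60 degrees of yaw error
```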


Improvements and key bug fixes


  • Improved error handling/messages in case config files can’t be loaded or parsed.
  • Added a new server auto-start API. This black-box-style API will start the currently selected OSVR server if it is not already running.


  • Fixed anti-aliasing bug in Unity where AA quality setting was not being applied by default.
  • Fixed bug where Unity was crashing when switching scenes in the editor.
  • RenderManager preview window off by default, for substantially improved performance.
  • Added RenderManager and Android dependencies to OSVRUnity.unitypackage.


  • OSVR initialization is now more robust with longer time-based timeouts when context initialization fails.
  • Preliminary support for Android added.

See also the announcement regarding native support for OSVR in Unreal 4.12


  • Added client-side predictive tracking in RenderManager, with different interval for each eye. Predictive tracking can be performed on the server side or on the client side. Client side is better because the client knows best at what time the prediction needs to be run.
  • Added black borders during time warp rather than texture clamping. This was seen when the overfill ratio was fairly small – smaller than the movement required by time warping.
  • Changed default configurations to ones that reduce tearing and judder
  • Fixed structure returned from OpenDisplay to not send unused/unset fields
  • Fixed DirectMode to work with nVidia driver version 364.72 on at least some GPUs (other GPUs may still need a 362.xx series driver)
  • Fixed aliasing in second render pass when using Direct3D. This works together with a similar fix in the Unity plugin.
  • Added ability to use built-in distortion mesh for OSVR HDK 1.3. Generic distortion meshes are planned for the near future.
  • Much faster loading of unstructured grid distortion correction
  • Mac OpenGL Core profile support is now working
  • Builds and links (not yet tested) on Android
  • On Windows, snapshots now include the redistributable version of d3dcompiler_47.dll, easing Windows 7 compatibility.
  • Improved DirectModeDebugging tool for NVIDIA-based systems: now checks to see if hybrid graphics (“Optimus”) systems have a directly-connected external video port that can support direct mode, and also performs driver version compatibility checking.
  • Fixed multiple heterogeneous GPU support on Windows.
  • Added GLES 2.0 sample program
  • Removed all legacy OpenGL function calls from RenderManager
  • Fixed #define collision on Linux/X11
  • Installer stored shortcuts in sub-folders by type
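The per-eye prediction interval mentioned at the top of this list follows from display scan-out: on a scanned panel the two eyes are illuminated at different times within the frame. A toy sketch (timings illustrative; actual offsets depend on the panel's scan-out order):

```python
def eye_prediction_intervals(base_latency_s, frame_time_s):
    """Return (first_eye, second_eye) prediction intervals when the panel
    scans out one eye before the other, half a frame apart (illustrative;
    real timings depend on the display's scan-out order and speed)."""
    first = base_latency_s
    second = base_latency_s + frame_time_s / 2.0
    return first, second

# 60 Hz panel, 20 ms baseline pipeline latency:
first, second = eye_prediction_intervals(0.020, 1.0 / 60.0)
print(round(first * 1000, 2), round(second * 1000, 2))  # 20.0 28.33
```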


  • No longer relies on a VRPN driver.
  • Builds with multiple pre-1.0 runtimes.
  • Auto-detects Oculus Rifts on Windows, Linux, and OS X.


  • Updated to work with the latest OpenVR 0.9.19.

Chat Rooms, Videos and other Resources

As a reminder, we’ve set up developer chat rooms at Gitter, and they are tied to the GitHub projects.

The list of all rooms is here:

Some existing rooms:

Ever-growing list of supported devices, operating systems and engines is here

The OSVR documentation repository is here

VR tutorials on key topics such as time warping, foveated rendering, binocular overlap and more are here. To see only the tutorials, click on ‘key concepts’ on the right side of that page.

Want to Participate?

Interested in contributing? Start here:

Current areas of high-priority “help wanted” are here for OSVR-Core and here for RenderManager

Any questions or issues? email

Thank you for being part of OSVR!

RenderManager and other developer updates

Here is a brief update on new repositories and features and major enhancements for the OSVR software platform.


  • OSVR RenderManager is now open-sourced
  • New documentation repository as well as global search
  • Dozens of new devices
  • Support for latest Unity and Unreal versions including new samples

Details below


A new documentation repository serves as the gateway for developer documentation, tips and tricks, configuration instructions and a whole lot more. Please visit

Because this is an open-source repository, we encourage you to contribute to it and share your experience.


RenderManager was previously distributed as closed source, but it has now been split into an open-source repository that performs most of the functions and two closed-source repositories (one for NVIDIA, one for AMD) that require individual NDAs with NVIDIA or AMD.

What RenderManager Provides

RenderManager provides a number of functions beyond the OSVR-Core library in support of VR rendering. It wraps the Core functions in an easy-to-use interface that implements many VR-specific needs.

  • DirectMode: On platforms that support it, RenderManager implements direct rendering to the display, bypassing operating-system delays and enabling front-buffer rendering. On Windows, this is implemented using nVidia’s VR Direct Mode rendering and AMD’s Direct-to-Display rendering. These share a common interface in RenderManager and plans are underway to extend these to new operating systems as they become available. DirectMode supports both D3D11 and OpenGL (core and legacy) on Windows.

The following capabilities are provided on all supported platforms:

  • Distortion correction: This enables undistortion of HMD lenses. Configuration files can be used to specify the type of distortion correction used, from several choices: rgb polynomial away from a center, monochromatic unstructured mesh, and rgb unstructured mesh. RenderManager includes distortion-mesh-construction for all of its rendering libraries based on all of the above input types. See RenderManager.h for more information.
  • Time Warp: Synchronous time warp is provided on all platforms. This is done at the same time as the distortion-correction render pass by reading the latest tracking information and adjusting the viewing transformation using the texture matrix to fix up changes due to motion between the start of rendering and its completion. This warping is geometrically correct for strict rotations around the center of projection and is approximated by a 2-meter distance for translations.
  • Rendering state: RenderManager produces graphics-language-specific conversion functions to describe the number and size of required textures, the viewports, projection and ModelView matrices needed to configure rendering for scenes. Configuration files specify the number of eyes, whether they are in a single screen or multiple screens, and their relative orientations. RenderManager takes all viewports and textures in their canonical (up is up) orientation and internally maps to the correct orientation, enabling the use of bitmap fonts and other rendering effects that require canonical orientation. An optional, callback-based rendering path provides these transformations for arbitrary spaces within the OSVR configuration space (head space, hand space, room space, etc.).
  • Window creation: RenderManager uses SDL on Windows, Linux, and Mac systems to construct windows of the appropriate size for a given HMD or on-screen display. Configuration file entries describe window size, placement, and orientation. For non-DirectMode operation, these show up within the operating virtual screen and can be either full-screen or windowed. For DirectMode operation, they provide full-screen operation on one or more displays.
  • OverFill & Oversampling: To enable time warp to work, the rendered view must be larger than the image to be presented on a given frame. This provides a border around the image that can be pulled in as the user’s viewport rotates. Also, the distortion caused by lenses in VR systems can cause a magnification of the screen that requires the application to render pixels at a higher density than the physical display. RenderManager handles both of these capabilities internally, hiding them from the application. Configuration file entries can adjust these, trading rendering speed against image quality at run time without changes to the code.
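The two factors combine multiplicatively when sizing the render target; a small sketch (the factors and display size here are illustrative):

```python
def render_target_size(display_w, display_h, overfill, oversample):
    """Size the render target from the physical per-eye display size, the
    overfill factor (extra border for time warp) and the oversample factor
    (extra pixel density to survive distortion magnification)."""
    w = round(display_w * overfill * oversample)
    h = round(display_h * overfill * oversample)
    return w, h

# A 1080x1200 per-eye panel with 2.0 overfill and no oversampling:
print(render_target_size(1080, 1200, 2.0, 1.0))  # (2160, 2400)
```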

Coming Soon

  • Asynchronous Time Warp is under development as of 2/15/2016. There is a single D3D11 example program that runs on DirectMode displays under Windows. This capability is not yet fully operational (the example program does not work when run without ATW enabled, and there are several open Github issues). When complete, this mode will be enabled by a configuration-file setting. It produces a separate rendering thread that re-warps and re-renders images at full rate even when the application renders too slowly to present a new image each frame.
  • Android support is under development. As of 2/15/2016, the OpenGL internal code is all compatible with OpenGL ES 2.0. Work is underway to port RenderManager to Android on top of the existing OSVR-Core port.
  • DirectMode/Linux is planned as graphics-card vendors finish drivers to enable it on this platform. It is being designed to use the same RenderManager interface and configuration files as the current Windows implementations.

Two RenderManager Interfaces

RenderManager provides two different interfaces, a Get/Present interface and a Callback interface. Example applications are provided that use each. The Callback interface provides the ability to easily render objects in multiple spaces (head space, hand space, etc.). The Get/Present interface lets the application have complete control over render-buffer construction.
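The difference between the two control flows can be sketched as follows. The class and method names are simplified stand-ins for the real C++ RenderManager API, with a tiny fake manager so the sketch is self-contained:

```python
# Simplified stand-ins for the real C++ RenderManager API (illustrative only).

class FakeRenderManager:
    def __init__(self):
        self.callbacks = []
        self.presented = []

    # Callback interface: RenderManager drives rendering, invoking each
    # registered callback once per eye with the right viewport/matrices.
    def register_callback(self, space, fn):
        self.callbacks.append((space, fn))

    def render(self):
        for eye in ("left", "right"):
            for space, fn in self.callbacks:
                fn(eye, space)
        self.present(["left-buffer", "right-buffer"])

    # Get/Present interface: the application asks for per-eye render
    # info, fills its own buffers, then hands them back to present.
    def get_render_info(self):
        return [{"eye": "left"}, {"eye": "right"}]

    def present(self, buffers):
        self.presented = buffers

# Callback style: the manager calls us once per eye.
rm = FakeRenderManager()
calls = []
rm.register_callback("/", lambda eye, space: calls.append((eye, space)))
rm.render()

# Get/Present style: we build the buffers ourselves.
rm2 = FakeRenderManager()
rm2.present([f"my-{info['eye']}-buffer" for info in rm2.get_render_info()])
```

The Callback path is convenient because the manager can hand you matrices for any registered space; the Get/Present path leaves buffer creation and reuse entirely under application control.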

Example programs

There are a number of example programs that highlight the different RenderManager interfaces and features.

Key features added to RenderManager

  • Added DirectMode (“Direct to Display”, or D2D) support on AMD GPUs for both Direct3D11 and OpenGL on Windows, through the AMD LiquidVR SDK for headset manufacturers (an NDA component, distributed as binary only).
  • Asynchronous Time Warp (currently only in a single example program)
  • Code compiles and links on OS X (still needs fixing for OpenGL/Core)
  • DirectMode works on NVIDIA cards co-installed with an Intel card
  • DirectModeDebug tool added for NVIDIA DirectMode
  • Code compiles and runs on Linux with all non-DirectMode features


Primarily User-Facing Changes

  • Windows builds now have the osvr_server_config.HDK13DirectMode.sample.json file copied to be their default osvr_server_config.json config file, enabling video-based tracking with IMU fusion and direct mode by default.
  • Server mainloop will now reduce CPU usage when no clients are connected.

Primarily Developer and Non-Windows Changes

  • Added cross-platform shared export header for libraries.
  • New analysis plugin: generic orientation predictive tracking, which can use as input any orientation device that also reports angular velocity.
  • Fixed behavior of state interfaces when timestamps are non-monotonic: for instance, when replaying a loop of motion-capture data.
  • Dozens of new devices added (see the full list).
  • Build-system compatibility with OpenCV 3.1. (Windows binaries are still shipped with OpenCV 2.4.10 by default.)
  • Updates to support additional versions of Boost (1.60) for compilation.
  • Fix automated CI build and pull-request build testing on Travis-CI (building Linux and Mac).
  • New utility shipped: osvr_list_usbserial lists VID, PID, and platform-specific path for each connected USB-Serial device, primarily useful in development. (Uses the osvrUSBSerial library, intended for use by OSVR plugins: so you can embed this capability in your plugin.)
  • Experimental single-filter IMU and video-based tracker plugin merged. Not ready for general usage, but if you’re interested in tracking technology or algorithms, contributions appreciated.
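The core idea behind generic orientation prediction is integrating the reported angular velocity forward over the prediction interval. A minimal sketch of that idea (not the plugin's exact algorithm):

```python
import math

def predict_orientation(q, omega, dt):
    """Predict orientation quaternion q (w, x, y, z) forward by dt
    seconds using body angular velocity omega (rad/s). First-order
    prediction sketch; not the OSVR plugin's actual code."""
    wx, wy, wz = omega
    angle = math.sqrt(wx*wx + wy*wy + wz*wz) * dt
    if angle < 1e-12:
        return q
    ax, ay, az = (wx*dt/angle, wy*dt/angle, wz*dt/angle)  # rotation axis
    s, c = math.sin(angle/2), math.cos(angle/2)
    w2, x2, y2, z2 = (c, ax*s, ay*s, az*s)  # incremental rotation
    w1, x1, y1, z1 = q
    # Hamilton product q * dq applies the increment in the body frame.
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

# Identity orientation, yawing at 90 deg/s, predicted 16 ms ahead:
q = predict_orientation((1.0, 0.0, 0.0, 0.0), (0.0, math.pi/2, 0.0), 0.016)
```

Predicting roughly one display interval ahead is what hides sensor-to-photon latency from the user.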


Release 1.2.6  – Recommended upgrade for all HDK users

  • A new metadata package has been added for the HDK 1.3 beltbox audio.


  • This fixes issue #7 for Windows 7 users.
  • Since we can distinguish the beltbox shipped with the HDK 1.3 from the one shipped with the 1.2 (and earlier), the device names have been updated to include an HDK version number, which may be convenient for all users. Below is a screen capture from “Devices and Printers” on a machine with both an HDK 1.2 and an HDK 1.3 plugged in.
  • Similarly, a new device metadata package has been added to warn users if they have an outdated firmware version on their infrared camera, since this can seriously impact tracking performance. Right-clicking on such a device will reveal a menu entry to bring you to a page where you can download a firmware upgrade tool. If you want to manually upgrade the firmware on your IR camera, use the firmware update tool from


  • The major version numbers on some of the “cosmetic” INF file drivers (currently that’s display, HID, and camera) have been bumped to 10 to pre-empt the basic in-box Windows 10 drivers. This isn’t critical, since they are cosmetic only (they just rename devices in the Device Manager), but it is nice to have.



  • The new OSVRInput module implements Unreal's standard controller and motion controller interfaces, covering the traditional controller button interface as well as the split motion controller interface for both buttons and hand tracking.
  • VR Preview play mode in the Unreal editor now renders the scene into your HMD in direct mode.
  • OSVR DLLs are loaded dynamically at runtime from the plugin binary directory. The plugin also searches multiple locations for the DLLs, including the engine's third-party binaries directory. Previously, these DLLs were copied to the executable directory.
  • Updates for Unreal 4.11. The plugin still works with 4.10.
  • Replaced absolute paths in the build files with engine-relative paths.
  • Breaking changes:
    • All of the original OSVR blueprint API has been deprecated. Due to design issues with the original API, we’ve decided to take a second look and create a more general purpose API. In the meantime, if you are relying on the existing blueprint API, there are instructions in for re-enabling the API. We do not recommend using the API for new projects.
    • Note: if you were using the original blueprint API for hand-tracking or controller buttons, please take a look at the new native controller and motion controller support. You can now access this functionality from your blueprints with the built-in blueprint events and functions from Unreal.


  • Updated for SteamVR-OpenVR 0.9.15
  • Special thanks to community contributor @d235j on github for this update.



A simple QT utility to help users set up their configuration parameters

Chat Rooms, Videos and other Resources

We’ve set up developer chat rooms at Gitter, and they are tied to the GitHub projects.

The list of all rooms is here:

Some existing rooms:

Ever-growing list of supported devices, operating systems and engines is here

New presentation from Unity Vision Summit 2016 on using OSVR to connect (practically) any device to AR/VR is here

Want to Participate?

Interested in contributing? Start here:

New OSVR developer resources and platform updates

A brief update on new features and major enhancements for the OSVR software platform.

Chat Rooms, Videos and other Resources

We’ve set up developer chat rooms at Gitter, and they are tied to the GitHub projects.


The list of all rooms is here:

Some existing rooms:


New video showing how to add OSVR to Unreal projects is here


New step-by-step guide for using OSVR in Unity is here


Explanation of RenderManager options for Unity and beyond is here


Ever-growing list of supported devices, operating systems and engines is here


Short blog post explaining the importance of OSVR’s role as middleware is here


New Repositories

Unity-VR-Samples (repository is here) includes the OSVR version of the samples shipping alongside Unity 5.3. This is a useful way to get started in Unity/OSVR.


  • Extensive stability, latency, performance, and reliability enhancements to the video-based (“positional”) tracking code.

  • New beacon pre-calibration application for use if you want the best performance from an HDK from the start, every time, along with an explanation of the various kinds of calibration involved in the video-based tracking – see

  • New option, enabled by default, to have the Video-IMU tracker fusion automatically reset orientation so that the camera's viewing direction is treated as forward.

  • New config files pre-set for the HDK 1.2 and 1.3 for use with RenderManager-enabled applications in direct and extended mode with optional positional tracking, and both front and rear panel trackers enabled by default. These configuration files can also be used with non-RenderManager applications.

  • New/enhanced support for the RenderManager config data to be supplied through the OSVR Server, with several sample config files bundled.

Bug fixes

  • Update display descriptors to correct projection matrices and improve distortion correction.

  • Fix video-based tracker build with some versions of the Windows SDKs.

  • Fixed compiling client applications using Visual Studio 2010.

  • Improved Video-IMU tracker fusion now requires the HMD to be held still in the camera’s view momentarily to provide a more robust determination of the camera’s pose.

  • Corrected transformation of video-based tracker pose into room space in the Video-IMU fusion, fixing mixed movement (e.g. moving sideways producing a sideways-and-down movement).

  • Fixed build system to enable building using system packages of jsoncpp on more non-Windows platforms (tested on Debian Jessie)

OSVR Render Manager

  • Improved distortion correction for the HDK 1.3.

  • Additional unstructured-grid mesh-based distortion correction strategy supported.

  • Additional mono point-sample distortion mesh strategy supported.

  • Support added for systems containing multiple, different vendor GPUs with NVIDIA direct mode, as long as the HMD is attached to a display output directly connected to the NVIDIA GPU. (This supports some, but not all, laptop integrated+discrete configurations, as well as desktop configurations with multiple GPUs or that use an integrated GPU in addition to a discrete GPU. Tested on a laptop with 6th gen Intel Core processor integrated graphics plus NVIDIA 9xxM series discrete, as well as a desktop with a 4th gen Core i7 integrated GPU plus NVIDIA 635 GT OEM discrete.)

  • New direct mode debugging/test tool that checks the NVIDIA whitelist, then tries to take displays into and then out of direct mode, with verbose messaging. Run “DirectModeDebugging” from a command prompt window if you want to see the messages it produces.

  • Supports new “oversample” configuration parameter to have the application produce a render texture sized to be over- or under-sampled when being distortion mapped and time warped.

  • Updated multi-vendor whitelist.

  • Now works with unmodified configuration files from OSVR-Core in addition to configuration files bundled with the separate Render Manager installer.

  • Update to NVIDIA GameWorks VR SDK for Headsets v1.1 – NVIDIA direct mode reported to require driver 361.43.

Bug Fixes

  • Performance improvements in building distortion meshes.

  • Improved error messages in case of failure in NVIDIA GameWorks VR direct-mode codepath.

  • No longer causes screen flashes due to re-entering direct mode if the display is already in direct mode on application startup.

Known Issues

  • Particularly with a (recommended) “Clean Install” of NVIDIA drivers, application of the “Combined Whitelist” file and rebooting the computer seems required for most systems using the NVIDIA driver version 361.43 to successfully use direct mode. (Driver whitelist issue)

OSVR Unity

  • New “Getting Started” guide for Unity developers:

  • Moved the main “rendering loop” from DisplayController to VRViewer. The biggest change is that a VRViewer with a camera (tagged “MainCamera”) can now exist in the scene before runtime (but doesn’t have to). This brings us closer to Unity’s VR setup, as Unity VR projects will always have a MainCamera in the scene, likely referenced by other objects in the scene. A Unity VR scene can be converted to an OSVR-Unity VR scene simply by adding a VRViewer component to the MainCamera (so long as there are also DisplayController and ClientKit components in the scene), without breaking existing references.

  • New “DirectMode Preview” feature, currently with an expensive implementation that leaves the VRViewer camera on during rendering. If performance suffers in DirectMode, the first optimization to make is turning this option off; it is configurable via a boolean in the editor on the DisplayController component. Preview is an upcoming RenderManager feature, so this workaround will go away soon.

  • New Unity VR Samples project:
    Good for seeing how to convert an existing Unity VR project to OSVR.

  • Updated to work with RenderManager v0.6.35, including functions for setting near and far clipping plane distances and IPD in RenderManager. The near and far clipping planes are copied from the MainCamera in the scene. The IPD is not currently set from Unity code, but the functionality is there to pass it to RenderManager; this will be removed if the IPD value should instead come from a config file.

  • Also includes “SetRoomRotationUsingHead” and “ClearRoomToWorldTransform” functionality on the RenderManager path, and a “Recenter” prefab that demonstrates using a key to “recenter” the room based on the head’s current orientation.

Bug fixes

  • Fixed head-translation bug with positional tracking + RenderManager.

  • Rendering optimizations. Fixed a bug on the non-RenderManager path that caused an extra render.

  • Fixed bug where VREye was not inheriting initial rotation of VRViewer, causing incorrect rotations when parented to an object with non-zero rotation.

  • Fixed a bug where point lights were causing the display to turn white in some scenes on the RenderManager path.


Known Issues

  • Setting up the VRSurface cameras with Image Effects is a common request. There is a potential solution in the open pull request:
    We’d like some input from the community, especially from someone with actual image effects in their game, to see if this actually works in practice and meets the needs of developers.

  • White line or white rectangle outline on one half of the display in RenderManager, only in the Unity Editor. This does not affect builds.

  • Colored rectangle on one half of the display in RenderManager, only in the Unity Editor. This does not affect builds.

  • Slow “DirectMode Preview” implementation, until the feature is moved to RenderManager.


OSVR Unreal

  • Added RenderManager support! When the game is run from Visual Studio or from the standalone package, the plugin will render the scene in direct mode on your HMD.

  • Updated documentation, including a link to a tutorial video for getting started.

  • Support for Unreal 4.10.

  • Improved error handling and logging.

  • The plugin build file now properly copies runtime DLL dependencies (e.g. OSVR-Core DLLs) to the right output binary folder during build/packaging. Previously this required manual copying.

  • Updated FOV, IPD, and projection code to use corresponding APIs from OSVR-Core. Previously these values were calculated or hard-coded in the plugin itself.

  • Removed hard-coded engine module include paths in the plugin build file. This should allow the plugin to work regardless of your Unreal engine install path (or build, if you built the engine from source).


Bug fixes

  • The ImportFromSdk.cmd script now works when SDK paths have whitespace in them.

  • Plugin now reports that the “OSVR HMD” is not connected when the OSVR server is not running or there are errors during startup.

  • Fixed a viewport issue when r.screenpercentage was set to anything other than 100. The plugin now sets r.screenpercentage on startup.

  • Fixed an error when the plugin was shut down.


  • Updated to the latest OpenVR SDK, which is compatible with the latest release of SteamVR.

  • The plugin has now been verified to work with several SteamVR based games on Steam, including Half Life 2 (beta) and Elite Dangerous.


Bug fixes

  • The plugin now reports a suggested window position based on the corresponding render manager configuration. You can now adjust where the VR compositor window shows up based on your individual display configuration.

  • Fixed an issue where the plugin failed to report the correct IPD.

  • Fixed an issue where the SteamVR server failed to initialize if the VR Compositor window’s initial position was anywhere other than (0, 0).

Want to Participate?

Interested in contributing? Start here:

Any questions or issues? email

Thank you for being part of OSVR!

Open-source RenderManager, enhanced positional tracking and more

Here is a brief update on new features and major enhancements of the OSVR software platform.

Major additions

Open-source RenderManager plugin for Unity and Unreal

The RenderManager layer (available for download, including example programs and their source code, from here: ) provides a set of high-performance, low-latency rendering utilities for game engines, OpenGL, and Direct3D applications. These services include:
  • Support for NVIDIA GameWorks VR
  • Timewarp, updating the rendered image with the freshest predicted tracking data just before providing this image to the graphics card
  • Enable/disable direct render to switch between extended desktop and direct mode
  • Front-buffer rendering
  • Distortion correction using multiple methods including distortion mesh, parametric distortion and more.

RenderManager does this in a generalized way to support multiple HMD configurations including single- and dual-port models, HMDs that prefer portrait or landscape mode, HMDs with required distortion correction and more.
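As a sketch of the parametric approach mentioned above, a radial polynomial pre-distortion looks like the following (the coefficients here are made up for illustration, not the values used for any actual HMD):

```python
def radial_distort(x, y, k1, k2):
    """Apply a radial polynomial distortion about the image center
    (coordinates normalized so the center is (0, 0)). Coefficients
    are illustrative only."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Points farther from the center are pushed outward more (pincushion),
# pre-compensating for the barrel distortion introduced by the lens:
print(radial_distort(0.5, 0.0, k1=0.2, k2=0.05))
```

A distortion mesh bakes the same mapping into vertex positions, trading per-pixel shader work for a one-time mesh computation.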


The RenderManager plugin for Unity ( ) is an open-source Unity 5.2 and Unity 5.3 plugin that uses RenderManager. It not only provides the RenderManager functionality to Unity applications but also serves as a template for integrating RenderManager into other game engines.

Support for Unreal 4.9 is also available at

Auto-calibrating positional tracking with sensor fusion

A major update to the OSVR positional tracking now includes the following:
  • Sensor fusion that combines information from the IR camera and the IMU into more accurate, smoother tracking data
  • Accounting for distortion in the IR camera
  • Auto-calibrating position, which overcomes inaccuracies in the expected positions of the IR LEDs (such as those due to manufacturing tolerances or to opening and closing the OSVR HDK front panel). This is implemented in the OSVR-Core VideoBasedTracker plugin. It uses the following logic:



Improvements to OSVR-Core

  • Improved parsing of the HDK angular velocity data, resulting in much improved predictive tracking.
  • Adjusted HDK predictive tracking interval for a better experience based on improved data.
  • Improved install directories on Linux and Mac


  • Added ClientKit API methods for “re-center” functionality. This allows maintaining a “room-to-world” transform and setting its yaw (rotation about y axis) based on the current head orientation, primarily as an aid for porting from the Oculus SDK.
  • The SDK now ships with a general-purpose, high performance header-only Kalman filter/extended Kalman filter framework, using the Eigen matrix math library
  • Fixes to support the “Ninja” build tool (via CMake) on all platforms for very fast parallel compilation.
  • Optional PluginKit and ClientKit APIs added to the “Tracker” device interface class, exposing linear and angular velocity and acceleration.
  • Removed hard OpenCV dependency from Core
    • Note that if you have an imaging plugin that isn’t using the C++ wrappers, there may be some small changes you’ll want to make in order to ensure stability.
  • The device descriptor is now consulted to determine which callbacks (position, orientation, etc.) for a tracker to trigger.
  • Improved header providing math type interoperation with the Eigen linear algebra library.
  • Fixed reference counting for imaging clients.
  • Improved configure performance by refactoring the header-dependency tests.
  • Switched some internal implementation details from Boost.Fusion to the internal “TypePack” library: should improve compile time and reduce fragility.
  • Updated version of UIforETW Event Tracing code.
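The “re-center” addition above can be pictured as extracting the yaw component of the current head orientation and folding its inverse into the room-to-world transform. Here is a sketch of that concept (not ClientKit's implementation), assuming a y-up coordinate system:

```python
import math

def yaw_from_quaternion(q):
    """Rotation about the y (up) axis encoded in a (w, x, y, z)
    quaternion, derived from where the -z (forward) axis points."""
    w, x, y, z = q
    return math.atan2(2.0 * (w * y + x * z), 1.0 - 2.0 * (x * x + y * y))

def recenter_yaw(head_q):
    """Yaw (radians) the room-to-world transform should apply so that
    the user's current facing direction becomes 'forward'. Sketch of
    the idea behind the ClientKit re-center API, not its code."""
    return -yaw_from_quaternion(head_q)

# Head turned 90 degrees about y: re-centering rotates the room back.
s, c = math.sin(math.pi / 4), math.cos(math.pi / 4)
print(round(math.degrees(recenter_yaw((c, 0.0, s, 0.0))), 1))  # -> -90.0
```

Only yaw is used so that re-centering never tilts the virtual horizon.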

Improvements to OSVR-Unity

  • Added support for Unity 5.2 and Unity 5.3
  • Added FPS controller
  • Improved support for HMDs with two video inputs such as Sensics dSight
  • Log an error if no OSVR server was detected.
  • Merged support for Locomotion interface, to support omnidirectional treadmills such as Virtuix Omni
  • Added RenderManager support, but DLLs not yet added to unitypackage

Improvements to OSVR-Unreal

  • Added support for Unreal 4.9
  • Added support for RenderManager
  • Use of the DisplayConfig API for improved display configuration reliability and flexibility.
  • Fixed stereo viewport issue.

Improvements to OSVR-HDK-Windows-Drivers

  • Fixed install in non-US locales.

Improvements to OSVR-Tracker-Viewer

  • Reduced size of binaries by not redistributing unused DLLs.


  • Adds predictive tracking to reduce perceived latency
  • Added CMakeLists.txt and instructions for compiling.
  • Added support for dSight dual-display mode in DirectMode and non-DirectMode
  • Adding mutex to make RenderManager more thread-safe
  • Alpha support for dual DirectMode displays (1-buffer rendering only)
  • Applications can now register the same buffer for multiple eyes
  • Client application sets the type of rendering to use, not config file
  • Added Visual Studio merge modules for CRT and MFC to installer
  • Added ability to update distortion mesh while running
  • Added configuration file to run in a non-directmode mono window
  • Added initial distortion correction for the OSVR HDK 1.3
  • Adds configuration file and white list for alpha support of Vuzix display
  • Adds support for two-window, two-display HMDs
  • Applications can now specify near and far clipping planes
  • Enabling the application to double-buffer to avoid copies in Time Warp
  • Making a separate client context for RenderManager (thread safety)
  • Per-eye timing information
  • Powers off DirectMode displays when we close them
  • Removed depth buffer clearing (and existence) from internal rendering pass
  • Removed frame delay when specifying both sync and app-block sync
  • Turns on vsync by default for the direct-mode examples to avoid tearing.
  • Using the release 1.0 version of the NVIDIA DirectMode API
Bug fixes:
  • Busy-waits rather than sleeps in demos to avoid O/S delays
  • Checks for zero-sized buffers to enable error report rather than crash
  • Failure to obtain high-priority rendering now returns PARTIAL rather than FAILURE
  • Handles window quit events from SDL
  • Includes the Visual Studio redistributable runtime and Direct3D compiler DLL.
  • Render routines check to see if the display has been opened to avoid crashing.
  • Render() callbacks not called for spaces that do not exist
  • Check for extra vsync to reduce incidence of screen tearing
  • Made the distortion mesh much finer to reduce artifacts
  • The code now reads and responds to the swap_eyes entry in display config
  • The code now responds to time warp enable in config (was not being used)


  • Updates for new releases of SteamVR OpenVR API

New and Updated Interfaces

New Repositories





Blender-OSVR (experimental)


Python-binding (experimental):








  • Utility to calculate LED bright/dark patterns for the OSVR HDK and other devices that use the OSVR IR tracking. Shows how much instantaneous overlap there is among bright LEDs in the different patterns. Allows selecting the number of LEDs, the number of bits to encode them in, and the stride to pick for time offsets among the LEDs. Shows maximum instantaneous power draw for different strides and bit depths.
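The overlap computation this utility performs can be sketched as follows: each LED repeats an n-bit bright/dark code, successive LEDs are offset in time by a stride of slots, and the worst case is the largest number of LEDs bright in any one slot. This is an illustrative reimplementation of the idea, not the utility's actual code:

```python
def max_simultaneous_bright(codes, stride):
    """Given each LED's repeating bright/dark code (a string of '1'/'0')
    and a stride of time slots between successive LEDs' start offsets,
    return the worst-case count of LEDs bright in the same slot."""
    n = len(codes[0])
    worst = 0
    for t in range(n):
        bright = sum(code[(t - i * stride) % n] == '1'
                     for i, code in enumerate(codes))
        worst = max(worst, bright)
    return worst

# Three LEDs sharing a 4-bit pattern: staggering them (stride=1) lowers
# the peak power draw compared with flashing in lockstep (stride=0).
codes = ["1100"] * 3
print(max_simultaneous_bright(codes, 0), max_simultaneous_bright(codes, 1))
```

Minimizing this peak is what bounds the instantaneous power draw the utility reports.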

Check the diagram on for frequent updates on the projects. Current diagram is:


New utilities

Documentation improvements

Special thanks

Thanks to all the outside contributors, with a special shout-out to @Armada651 and @d235j


OSVR support for Mac OS X, Linux, Android, SteamVR

We wanted to provide a brief update on new and enhanced support for operating systems and game engines within OSVR.

Operating system support

Mac OS X

In the words of @d235j:

Mac support for OSVR has been steadily improving. The recent HDK firmware update has resolved several USB connectivity issues. The stack cleanly compiles, and the HDK is usable on Mac OS. SteamVR-OSVR also works on Mac OS and Linux, though there are bugs in SteamVR which need to be worked around, and the SteamVR compositor is not available.
A Homebrew tap is now available at, and at present this is the preferred way to install OSVR on Mac OS. Near-term improvements will include changes to the OSVR Server configuration scheme as well as better ways to locate plugins.
Even with these minor issues, OSVR is usable on Mac OS X, as is SteamVR. The remaining issues will be resolved in the near future, with an easy-to-use binary package coming at some point.


Android

We have made several improvements to Android support, including:

  • Multiple deployment models: OSVR client and OSVR server running in two different processes; client and server running in the same process and thread; and client and server running in the same process on different threads.
  • App to launch the OSVR server if automatic launch upon startup is not desired.
  • Improved Unity “Palace” demo
  • Additional Android samples, including using the imager plugin



Linux

OSVR is compiled on Linux every day as part of our Continuous Integration server. However, we have improved the build instructions to make it easier for others. We have also seen demos of Unity/OSVR/Oculus on Linux, SteamVR/OSVR on Linux, and others.


Windows

A new OSVR driver installer has been added for Windows. It sets up drivers including:

  • CDC driver for the OSVR HDK
  • Atmel bootloader driver, in case a firmware upgrade of the OSVR HDK is desired
  • Device identification in the Windows control panel and device listing

What’s next?

Aside from improvements to currently-supported platforms, we would like to continue the work towards supporting:

  • iOS
  • Embedded configurations (Linux, Android or others on single-board computers like those from Gumstix). This could enable interesting applications.
  • Game consoles: XBOX, PlayStation, etc.

If you have another configuration you’d like to run OSVR on, we’d love to help out.

Game engine support


Quoting @d235j again:

We finally have a working SteamVR driver! This means you can use the OSVR HDK to play SteamVR games, or to develop software that uses the SteamVR API. So far most of the testing has been with jMonkeyEngine (via jMonkeyVR) and Team Fortress 2. TF2 is already quite playable with the HDK. If anyone would like to try using SteamVR-OSVR, binary builds and installation instructions are available at

Just to clarify, the SteamVR-OSVR project allows OSVR-supported hardware to be used with SteamVR. We do not yet have support for HTC Vive as an OSVR device, though I hope that this will be available soon.

What’s next

I am aware of teams working on a Blender plugin as well as CryEngine. If you have another game engine you’d like to integrate with OSVR, we’d love to help out.

Want to participate?

Interested in contributing? Start here:

Any questions or issues? email

Thank you for being part of OSVR!

OSVR 0.6 Released

This release of the OSVR projects brings dozens of updates for OSVR, including many improvements submitted by the community of OSVR developers on GitHub. Thanks to all of the contributors to OSVR!

Below is a description of the major additions to OSVR, followed by a “release notes” document detailing smaller changes and bug fixes.

OSVR community members – please pay attention to the “contributions wanted” section in each feature and see if you are able to help us accelerate the pace of development for OSVR.

Major Features in OSVR-Core

Optical video-based (“positional”) tracking

Centralized Display Interface

Render Manager

Predictive Tracking

Profiling tools


New Android Capabilities

Engine Integrations

Language Bindings

New Unity capabilities

New Plugins and Interfaces

Gesture interface

Locomotion interface

EyeTracker interface

SMI Eye Tracker plugin

Simulation plugins

Release Notes

OSVR-Core ClientKit API, Documentation, Examples

OSVR-Core PluginKit API, Documentation, Examples

OSVR-Core Internals, Bundled Plugins, Tools, and General Documentation

OSVR Tracker Viewer

Managed-OSVR .NET bindings






In Closing…

Major Features in OSVR-Core

Optical video-based (“positional”) tracking

The positional tracking feature uses IR LEDs that are embedded on the OSVR HDK along with the 100 Hz IR camera included with the OSVR HDK to provide real-time XYZ positional and orientation tracking of the HMD.

 The LEDs flash in a known pattern, which the camera detects. By comparing the location of the detected LEDs with their known physical locations, a position and orientation (pose) determination is made.

The software currently looks for two targets (LED patterns): one on the front of the HDK and one on the back. Additional targets can be added, and thus additional devices that have known IR LED patterns can also be tracked in the same space.

 It is also possible to assign different flashing patterns to multiple HDK units, thus allowing multiple HDK units to be tracked with the same camera. This is useful for multi-user play. Changing the IR codes on the HDK requires re-programming the IR LED board.

Sensics is working with select equipment developers to adapt the IR LED board and pattern to the specific shape of an object (e.g. glove, gaming weapon) so that that object can also be tracked with the OSVR camera.

The tracking code is included with the OSVR Core source code and is installed as an optional component while we are optimizing the performance.  It will be set as the standard tracker once camera distortion, LED position optimization, and sensor fusion with IMU data have been implemented.

 The image below shows a built-in debugging window that indicates the original image overlaid with beacon locations (in red, a tag of -1 means that the beacon has not been visible long enough to be identified) and reprojected 3D LED poses (in green, even for beacons not seen in the image).  RANSAC-based pose estimation from OpenCV provides the tracking.
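Conceptually, identifying a beacon amounts to matching a blob's recent bright/dark history against the known flash codes; a blob seen for fewer frames than the code length cannot yet be identified, which is the -1 tag in the debug window. A toy sketch of that matching step (the patterns, code length, and threshold here are invented for illustration):

```python
KNOWN_PATTERNS = {            # hypothetical 6-frame bright/dark codes
    "101101": 0,              # beacon IDs on one target
    "110010": 1,
    "011011": 2,
}

def identify_beacon(brightness_history, threshold=128):
    """Map a detected blob's recent brightness samples to a beacon ID.
    Returns -1 when the blob hasn't been visible long enough, mirroring
    the -1 tag in the debug window. Conceptual sketch only."""
    n = 6  # code length in frames
    if len(brightness_history) < n:
        return -1
    code = "".join('1' if b > threshold else '0'
                   for b in brightness_history[-n:])
    return KNOWN_PATTERNS.get(code, -1)

print(identify_beacon([200, 30, 210, 220, 25, 205]))  # matches "101101"
```

Once enough beacons are identified, their 2D image positions and known 3D locations feed the RANSAC-based pose estimation.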

Contributions wanted:

  • Make the system more configurable by moving configuration parameters (such as the camera to use, optimization parameters, and LED positions) into an external JSON configuration file.
  • Add Kalman optimal estimation filter to combine pose information from the video-based tracker and inertial measurements from the HDK’s inertial measurement unit into a combined pose + velocity + acceleration tracker that will provide smoother tracking and that can be used for predictive positional tracking.
  • Combine the output from multiple cameras for wide-area tracking.
  • Account for the optical distortion of the camera in the analysis.
  • Create a calibration tool that improves performance by accounting for the slight manufacturing variation in the LED position.
  • Create a tool that simplifies the process of adding new objects and IR LED patterns.
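To give a flavor of the requested camera+IMU fusion, here is a scalar Kalman update that blends a predicted position with a noisy absolute measurement (a toy one-dimensional illustration, not a proposed OSVR implementation):

```python
def kalman_1d(z_camera, x_pred, p_pred, r_camera):
    """One scalar Kalman update: blend a predicted position x_pred
    (e.g. integrated from IMU data, with variance p_pred) with a
    camera measurement z_camera (variance r_camera)."""
    k = p_pred / (p_pred + r_camera)          # Kalman gain
    x = x_pred + k * (z_camera - x_pred)      # corrected state
    p = (1.0 - k) * p_pred                    # reduced uncertainty
    return x, p

# A confident prediction (small variance) barely moves toward a noisy
# camera fix; an uncertain one snaps to it:
print(kalman_1d(1.0, 0.0, p_pred=0.01, r_camera=1.0))
print(kalman_1d(1.0, 0.0, p_pred=1.0, r_camera=0.01))
```

The full filter would carry pose, velocity, and acceleration as state, which is what enables smooth, predictive positional tracking.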

Centralized Display Interface

The OSVR-Core API now includes methods to retrieve the output of a computational model of the display. Previously, applications or game engine integrations were responsible for parsing display description JSON data and computing transformations themselves. This centralized system allows for improvements in the display model without requiring changes in applications, and also reduces the amount of code required in each application or game engine integration.

The conceptual model is “viewer-eye-surface” (also known as “viewer-screen”), rather than a “conventional camera”, as this suits virtual reality better.[1] However, it has been implemented to be usable in engines (such as Unity) that are camera-based, as the distinction is primarily conceptual.

As a demonstration of this API, a fairly minimal OpenGL sample (using SDL2 to open the window) is now included with OSVR-Core.
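The reason viewer-eye-surface differs from a conventional camera is that each eye is generally off-center relative to its screen, which yields an asymmetric viewing frustum. A minimal sketch of that geometry (not OSVR-Core's actual computation):

```python
def asymmetric_frustum(eye_x, eye_z, screen_half_width, near):
    """Left/right frustum bounds at the near plane for an eye offset
    eye_x from the screen center and eye_z in front of the screen
    plane. Illustrates why viewer-eye-surface projections are
    off-axis; units are meters."""
    left = (-screen_half_width - eye_x) * near / eye_z
    right = (screen_half_width - eye_x) * near / eye_z
    return left, right

# Centered eye -> symmetric frustum, like a conventional camera.
print(asymmetric_frustum(0.0, 0.5, 0.1, 0.1))    # roughly (-0.02, 0.02)
# Eye offset by half an IPD -> asymmetric bounds.
print(asymmetric_frustum(0.032, 0.5, 0.1, 0.1))
```

Camera-based engines can still consume this: the asymmetric bounds simply become an off-axis projection matrix for each eye's camera.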

Contributions wanted:

  • OpenGL-SDL sample that uses the distortion parameter API to apply the distortion shader.

Render Manager

The Sensics/OSVR Render Manager provides optimal low-latency rendering on any OSVR-supported device.  Render Manager currently provides an enhanced experience with NVIDIA’s Gameworks VR technology on Windows. Support for additional vendors (e.g. AMD, Intel) is being worked on. We are also exploring the options to work with graphics vendors for mobile environments.

Unlike most of the OSVR platform, Render Manager is not open-sourced at this point. The main reason is that the NVIDIA API was provided to Sensics under NDA, so we cannot expose it at this time.

Key features enabled by the Render Manager:

  • DirectMode: Enable an application to treat VR Headsets as head mounted displays that are accessible only to VR applications, bypassing rendering delays typical for Windows displays. DirectMode supports both Direct3D and OpenGL applications.
  • Front-Buffer Rendering: Renders directly to the front buffer to reduce latency.
  • Asynchronous Time Warp: Reduces latency by making just-in-time adjustments to the rendered image based on the latest head orientation after scene rendering but before sending the pixels to the display.  This is implemented in the OpenGL rendering pathway (including DirectMode) and hooks are in place to implement it in Direct3D.  It includes texture overfill on all borders for both eyes and supports all translations and rotations, given an approximate depth to apply to objects in the image.
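The orientation-correction step at the heart of asynchronous time warp can be sketched for the simplest case, a pure yaw change between render time and scan-out. The real implementation applies a full 3-D rotation (and, given a depth estimate, translation) per pixel; this simplified function is only illustrative:

```python
import math

# Illustrative core of asynchronous time warp for a yaw-only change:
# a view ray rendered under the old head orientation is re-expressed
# under the latest orientation just before scan-out, so the image
# appears world-stable without re-rendering the scene.

def timewarp_yaw(ray_angle, yaw_at_render, yaw_latest):
    # A world-fixed direction d appears at (d - yaw) in the view frame,
    # so re-expressing a ray shifts it by the yaw difference.
    return ray_angle + (yaw_at_render - yaw_latest)

# The head turned 2 degrees right after rendering, so the image content
# is shifted 2 degrees left to compensate.
corrected = timewarp_yaw(0.0, math.radians(10), math.radians(12))
```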

Coming very soon:

  • Distortion Correction: Handling the per-color distortion found in some HMDs requires post-rendering distortion.  The same buffer-overfill rendering used in Asynchronous Time Warp will provide additional image regions for rendering.
  • High-Priority Rendering: Increasing the priority of the rendering thread associated with the final pixel scan-out ensures that every frame is displayed on time.
  • Time Tracking: Telling the application what time the future frame will be displayed lets it render the appropriate scene.  This also enables the Render Manager to do predictive tracking when producing the rendering transformations and asynchronous time warp.  The system also reports the time taken by previous rendering cycles, letting the application know when to simplify the scene to maintain an optimal update rate.
  • Unity Low-level Native Plugin Interface: A Rendering Plugin will soon enable Render Manager’s features in Unity, and enable it to work with Unity’s multithreaded rendering.

 Render Manager is currently available only for OSVR running on Windows.

Several example programs and configuration files for OpenGL (fixed-pipeline and shader code versions, callback-based and client-defined buffers based) and Direct3D11 (callback-based and client-defined buffers based, library-defined device and client-defined device) are provided and open-sourced. Also included is a program with adjustable rendering latency that can be used to test the effectiveness of asynchronous time warp and predictive tracking as application performance changes.

The RenderManager library features are controlled through a JSON configuration file:


{
  "meta": {
    "schemaVersion": 1
  },
  "render_manager_parameters": {
    "direct_mode": false,
    "direct_display_index": 0,
    "asynchronous_time_warp": false,
    "render_overfill_factor": 1.0,
    "num_buffers": 2,
    "vertical_sync": true,
    "render_library": "OpenGL",
    "window_title": "OSVR",
    "window_full_screen": true,
    "window_x_position": 1280,
    "window_y_position": 0,
    "display_rotation": 0,
    "bits_per_color": 8
  }
}


 Contributions wanted:

  • We are seeking to work with additional graphics chip vendors to create a universal, multi-platform library for high-performance rendering.

Predictive Tracking

Predictive tracking reduces the perceived latency between motion and rendering by estimating head pose at a future point in time. At present, OSVR predictive tracking uses the angular velocity of the head to estimate orientation 16 ms (one frame at 60 FPS) into the future.

 Angular velocity is available as part of the orientation report from the OSVR HDK. For other HMDs that do not provide angular velocity, it can be estimated using finite differencing of successive angular position reports.
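The extrapolation described above can be sketched as follows: integrate the angular velocity over the 16 ms look-ahead and compose the resulting small rotation with the current orientation quaternion. This is an illustrative sketch, not the OSVR code, and the choice of world-frame composition is an assumption:

```python
import math

# Predict a future orientation from the current quaternion and the
# reported angular velocity. Quaternions are (w, x, y, z). Illustrative
# only; the composition order (world frame) is an assumption.

def quat_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def predict(q, omega, dt=0.016):
    """omega: angular velocity in rad/s about the x, y, z axes."""
    wx, wy, wz = (w * dt for w in omega)
    angle = math.sqrt(wx*wx + wy*wy + wz*wz)
    if angle < 1e-12:
        return q
    # small rotation accumulated over the look-ahead interval
    s = math.sin(angle / 2) / angle
    dq = (math.cos(angle / 2), wx * s, wy * s, wz * s)
    return quat_mul(dq, q)

# A head yawing at 90 deg/s is predicted ~1.44 degrees ahead.
q_now = (1.0, 0.0, 0.0, 0.0)    # identity orientation
q_pred = predict(q_now, (0.0, math.radians(90), 0.0))
```

For trackers that do not report angular velocity, omega can be obtained by finite differencing two successive orientation reports before calling the same prediction step.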

 Contributions wanted:

  • Improve the algorithm to extract velocity from non-HDK trackers.
  • Extract angular acceleration and use that to improve the quality of predictive tracking.
  • Fuse angular (yaw/pitch/roll) and linear (X/Y/Z) data to improve quality of positional tracking.
  • Configure the look-ahead time either through API or through an external configuration file.

Profiling tools

The OSVR performance profiler uses ETW (Event Tracing for Windows) to help optimize application performance by identifying bottlenecks throughout the entire software stack.


 Event Tracing for Windows (ETW) is an efficient kernel-level tracing facility that lets you log kernel or application-defined events to a log file and then interactively inspect and visualize them with a graphical tool. As the name suggests, ETW is available only for the Windows platform. However, OSVR-Core’s tracing instrumentation and custom events use an internal, cross-platform OSVR tracing API for portability.

Currently, the default libraries have tracing turned off to minimize any possible performance impact. However, the “tracing” directory contains tracing-enabled binaries along with instructions on how to swap them in to use the tracing features. See this slide deck for a brief introduction to this powerful tool:

 Contributions wanted:

  • Identify and add additional useful custom tracing events.
  • Implement the internal OSVR tracing API using other platforms’ equivalents of ETW for access to the instrumented events.
  • Measure the performance impact of running a tracing-enabled build, to potentially enable it by default.


JointClientKit

The default OSVR configuration has the client and server run as two separate processes. Among other advantages, this keeps the device-servicing code out of the “hot path” of the render loop and allows multiple clients to communicate with the same server.

 In some cases, it may be useful to run the server and client in a single process, with their main loops sharing a single thread. Examples where this might be useful include automated tests, special-purpose apps, or apps on platforms that do not support interprocess communication or multiple threads (in which case no async plugins can be used either). The new JointClientKit library was added to allow those special use cases: it provides methods to manually configure a server that will run synchronously with the client context.

 Note that the recommended usage of OSVR has not changed: you should still use ClientKit and a separate server process in nearly all cases. Other ways of simplifying the user experience, including hidden/launch-on-demand server processes, are under investigation.

Contributions wanted:

  • Automated tests for ClientKit APIs, using JointClientKit to start a temporary server with only dummy plugins loaded.

New Android Capabilities

A new device plugin has been written to support Android orientation sensors. This plugin exposes an orientation interface to the OSVR server running on an Android device. This is available here:

A new Android OpenGL ES 2.0 sample demonstrates basic OSVR usage on Android in C++. You can find this sample here:

An early version of an Android app has been written that launches the OSVR server on a device to run in the background. This eliminates the need to root the phone, which existed in previous OSVR/Android versions. You can find this code here:

The Unity Palace demo for Android can now work with the internal orientation sensors as well as with external sensors.

Contributions wanted:

  • Predictive tracking/filtering in the Android sensor plugin.
  • Sensor fusion to improve fidelity of tracking.
  • Additional sample apps.
  • Connect to Android camera to provide Imager interface for Android.

Engine Integrations

OSVR continues to expand the range of engines with which it integrates. These include:

  • Unity
  • Unreal
  • Monogame
  • Valve OpenVR (in beta)

 Here are new integrations as well as improvements to existing integrations:

Language Bindings

The .NET language bindings for OSVR have been updated to support new interface types for eye tracking. This includes 2D and 3D eye tracking, direction, location (2D), and blink interface types.

New Unity capabilities

Unity adapters for the eye tracking interface types have been added, as well as prefabs and utility behaviors that make it easier to incorporate eye tracking functionality into Unity scenes.

The optional distortion shader has been completely reworked, to be more efficient as well as to provide a better experience with Unity 4.6 free edition.

The OSVR-Unity plugin now retrieves the output of the computational display model from the OSVR-Core API. This eliminates the need to parse JSON display descriptor data in Unity, which allows for improvements in the display model without having to rebuild a game. The “VRDisplayTracked” prefab has been improved to create a stereo display at runtime based on the configured number of viewers and eyes.

 Coming soon: Distortion Mesh support. Mesh-based distortion uses a pre-computed mesh rather than a shader to do the heavy lifting for distortion. There is a working example in a branch of OSVR-Unity.

 Contributions wanted:

  • UI to display hardware specs, display parameters, and performance statistics.

New Plugins and Interfaces

Gesture interface

The gesture interface brings new functionality that allows OSVR to support devices that detect body gestures, including movements of the hands, head, and other parts of the body. This provides a way to integrate devices such as the Leap Motion®[2], Nod Labs Ring, Microsoft® Kinect®[3], Thalmic Labs Myo™[4] armband, and many others. Developers can combine the gesture interface with other interfaces to provide meaningful information, such as the orientation, position, acceleration, and velocity of the user’s body parts.

A new API has been added on the plugin and client sides to report and retrieve gestures for a device. The gesture API provides a list of preset gestures while staying flexible enough to allow custom gestures to be used.
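As a rough illustration of the preset-plus-custom design, the hypothetical registry below assigns stable numeric IDs to preset gestures and extends the table on demand for custom ones. All names here are invented for the example, not the actual OSVR gesture API:

```python
# Hypothetical gesture registry: preset gestures receive well-known IDs,
# and custom gesture names registered by plugins get new IDs on demand.
# Names and structure are illustrative, not the OSVR implementation.

PRESET_GESTURES = {"SWIPE_LEFT", "SWIPE_RIGHT", "TAP", "FIST", "OPEN_HAND"}

class GestureRegistry:
    def __init__(self):
        self._ids = {}
        for name in sorted(PRESET_GESTURES):
            self._register(name)

    def _register(self, name):
        # assign the next free ID only if the name is new
        self._ids.setdefault(name, len(self._ids))
        return self._ids[name]

    def gesture_id(self, name):
        """Return the numeric ID for a gesture, registering custom ones."""
        return self._register(name)

reg = GestureRegistry()
tap = reg.gesture_id("TAP")            # preset gesture, existing ID
wave = reg.gesture_id("VENDOR_WAVE")   # custom gesture, newly assigned ID
```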

We added a simulation plugin, com_osvr_example_Gesture (see the description of Simulation Plugins below), that uses the gesture interface to feed a list of gestures, and also created a sample client application to visually output gestures received from the plugin. These tools are useful when developing new plugins or client apps.

Using the new interface, we are working on releasing a plugin for the Nod Ring that will expose a gesture interface as well as existing interfaces.

Contributions wanted:

  • Development of new plugins for devices that have gesture recognition, such as:
  • ThalmicLabs Myo Armband
  • Logbar Ring[5]
  • New devices are welcome

Locomotion interface

The locomotion interface adds an API to support a class of devices known as omnidirectional treadmills (ODTs), which allow walking and running on a motion platform and convert this movement into navigation input in a virtual environment. Examples of devices that could use the locomotion interface include the Virtuix Omni, Cyberith Virtualizer, and Infinadeck. These devices are particularly useful for first-person shooter (FPS) games, and by combining the locomotion interface with a tracker, additional features such as body orientation and jump/crouch sensing could be added.

The API allows the ODTs to report the following data (on a 2D plane):

  • User’s navigational velocity
  • User’s navigational position
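As a sketch of how a client might consume these reports, the following integrates the 2-D navigational velocity into an in-game position. The field layout and report rate are illustrative assumptions, not the OSVR report format:

```python
# Illustrative consumption of locomotion reports: dead-reckon an in-game
# position from the user's 2-D navigational velocity on the walking plane.

def integrate(velocity_reports, dt=0.01):
    x, y = 0.0, 0.0
    for vx, vy in velocity_reports:   # meters/second on the 2-D plane
        x += vx * dt
        y += vy * dt
    return x, y

# Walking forward at 1.5 m/s for one second (100 reports at 100 Hz):
pos = integrate([(0.0, 1.5)] * 100)
```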

Contributions wanted:

  • Development of plugins for ODTs using Locomotion interface (walking / running) combined with additional OSVR interfaces (jumping, crouching, looking around)

EyeTracker interface

The EyeTracker interface provides an API to report detailed information about the movement of one or both eyes.

 This includes support for reporting:

  • 3D gaze direction – a unit directional vector in 3D
  • 2D gaze location – a position within a 2D region
  • Detection of blink events
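The two gaze representations are related by a simple projection: intersecting the 3-D gaze direction with a plane at the screen's distance yields a 2-D gaze location. The coordinate conventions in this sketch are assumptions, not the OSVR definitions:

```python
# Project a 3-D gaze direction onto a screen plane to obtain a 2-D gaze
# location. Assumes +z points from the eye toward the screen; all
# conventions here are illustrative.

def gaze_to_2d(direction, screen_distance=1.0):
    """direction: (x, y, z) unit vector; returns (u, v) on the screen."""
    x, y, z = direction
    if z <= 0:
        return None                   # looking away from the screen
    t = screen_distance / z           # scale the ray to the plane z = d
    return (x * t, y * t)

# Looking 45 degrees upward lands one screen-distance above center:
point = gaze_to_2d((0.0, 0.7071, 0.7071))
```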

Eye tracker devices are effective tools for interacting inside a VR environment, providing an intuitive way to make selections, move objects, and more. The data reported from these devices can be analyzed for human behavior studies, marketing research, and other research topics, as well as gaming applications. They can also be used to perform highly accurate virtual reality rendering, customized to the location of the pupil each frame.

A .NET binding for EyeTracker (described above) allows easy integration of the eye tracking data into Unity.

 Contributions wanted:

  • Create new plugins for eye tracking devices
  • Expand EyeTracker interface to report additional eye attributes such as pupil size, pupil aspect ratio, saccades

SMI Eye Tracker plugin

In collaboration with SensoMotoric Instruments GmbH (SMI), we are releasing a new plugin for SMI trackers. For instance, the plugin supports the SMI Upgrade Package for the Oculus Rift DK2. It uses the SMI SDK to provide real-time streaming of eye and gaze data, reported via the EyeTracker interface.

The SMI plugin also provides an OSVR Imaging interface to stream the eye tracker images.

 The plugin is available at –

 Contributions wanted:

  • Create similar plugins for other eye-tracking vendors.

Simulation plugins

Along with the newly added interfaces (eye tracker, gesture, locomotion), we provide simulation plugins that serve as examples of how to use each interface. Their purpose is to emulate a certain type of device (joystick, eye tracker, head tracker, etc.) connected to the OSVR server and feed simulation data to the client. These plugins were added as a development tool so that developers can easily run tests without needing to attach multiple devices to the computer. We will be expanding the available simulation plugins to cover every type of interface. Simulation plugins are available in OSVR-Core and can be modified for a specific purpose.

Contributions wanted:

  • Create new simulation plugins for interfaces that do not have one already.
  • Add new or improve the existing data-generating algorithms used in simulation plugins.
  • Create new simulation plugins that use a combination of interfaces, such as a joystick (button + analog).

Release Notes

Items listed here are generally in addition to the major items highlighted above, and do not include the hundreds of commits involved in the development and tuning of these or the above features – see the Git logs if you’d like to see all the details!

OSVR-Core ClientKit API, Documentation, Examples

  • All API updates have been reflected in the Managed-OSVR .NET bindings.
  • Added osvrClientCheckStatus()/ClientContext::checkStatus() method to expose whether the library was able to successfully connect to an OSVR server.
  • Display API and OpenGL examples.

OSVR-Core PluginKit API, Documentation, Examples

  • Added C++ wrapper to the configured device constructor API.
  • Added an example “configured” plugin
  • Improved self-contained example plugin.
  • Link from the documentation to the OpenCV video capture plugin – a bundled plugin that also serves as an example.
  • Improve completeness and usability of the Doxygen-generated API documentation.

OSVR-Core Internals, Bundled Plugins, Tools, and General Documentation

  • A large number of improvements, including a review of content as well as custom styling, were applied to the generated API documentation (“Doxygen” output).
  • Timestamps are now verified for in-order arrival before updating client interface object state.
  • Improved compatibility of vendored COM smart pointer code.
  • Preliminary support for building the video-based tracker with OpenCV 3 (assertions quickly fail on an MSYS2/MinGW64 test build).
  • Decreased code duplication through use of an internal “UniqueContainer” type built on existing internal “ContainerWrapper” templates.
  • Client contexts now store their own deleters to allow safe cross-library deallocation, needed for JointClientKit.
  • Vendored “Eigen” linear algebra library updated from 3.2.4 to a post-3.2.5 snapshot of the 3.2 branch.
  • Reduced noisy configure messages from CMake.
  • Moved non-automated tests to an internal examples folder so they still get built (and thus verified as buildable) by CI.
  • Compile the header-dependency tests for headers marked as C-safe as both C and C++, since some headers have extra functionality when built as C++.

OSVR Tracker Viewer

  • Adjusted the default settings to show full pose, not just orientation, for the /me/head path.
  • Print a message to the command prompt about the command line arguments available when none are passed.
  • Start up tracker viewer zoomed in (distance of 1 meter to origin) to provide a more usable experience from the start.
  • Hide coordinate axes until the associated tracker has reported at least once.
  • Compile the “coordinate axes” model into the executable itself to remove a point of failure.

Managed-OSVR .NET bindings

  • Updates to include bindings for new APIs added to ClientKit.
  • Fix a lifetime-management bug that may have resulted in crashes (double-free or use-after-free) if a ClientContext was disposed of before child objects.
  • Added accessor for the raw ClientContext within the safe handle object, for use with code that interoperates with additional native code.


  • Operation order tweaks that result in latency improvements.
  • ShaderLab/Cg/HLSL distortion shader simplified and optimized for higher performance.


  • Updated to newer version of OSVR-Unity.
  • Disabled mouselook.


  • Included OpenGL distortion shaders simplified and optimized for higher performance.


  • Display descriptor schema v1 elaborated with “as-used” details.


  • Fixed infinite loop in hand-coded matrix initialization.
  • Improved/simplified math code using Eigen and its “map” feature.

In Closing…

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, such as in this link for OSVR-Core

Interested in contributing? Start here:

Any questions or issues? email

Thank you for being part of OSVR!

Yuval (“VRguy“) and the OSVR team

[1] Kreylos, 2012, “Standard camera model considered harmful.” <>

[2] Leap Motion is a trademark of Leap Motion, Inc.

[3] Microsoft and Kinect are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.

[4] ThalmicLabs™ and Myo™ are trademarks owned by Thalmic Labs Inc.

[5] Logbar Inc. Ring is a trademark of Logbar Inc.


New interfaces, SteamVR and more

It’s been a busy month of OSVR development, and we wanted to update you about some of the new and improved capabilities:

  • ndk-build compatible Android builds
  • SteamVR plugin
  • “Reset yaw” and other OSVR core improvements
  • New eye tracker and other interfaces
  • Windows installer
  • Additional improvements

Improved Android builds

ndk-build-compatible Android builds of OSVR are now being produced – see

SteamVR plugin (alpha)

This is a SteamVR driver that allows applications written against the SteamVR API to work with hardware and software running on the OSVR software framework. See

Note that despite similar goals, the internal models of the two systems are not the same (in most cases, OSVR has a more flexible, descriptive model), so if you are writing an application from scratch, be sure to evaluate both. This driver exists primarily for compatibility, so that existing software using the SteamVR system can run on OSVR.

There are still some issues with this plugin – see – and we would appreciate the community’s help in getting it to the finish line.

OSVR Core improvements

Key improvements include:

  • “Reset yaw” tool added, allowing the user to set the yaw direction of the head tracker to a desired ‘0’ position
  • Initial support for updated OSVR HDK tracker firmware, providing higher reporting rate and angular velocity reports

New Interfaces

Interfaces – pipes of data – provide an abstraction layer for VR devices and peripherals so that a game developer does not need to hardcode support for particular hardware. Instead, just as a Windows application prints through the Windows print services, the game developer connects to the OSVR abstraction layer. If a new VR goggle were introduced in 2016, an OSVR-based game published in 2015 could support the new goggle as soon as it had an OSVR driver.

New interfaces that have been completed or are imminent are:

  • Eye tracking
  • Locomotion – supporting omnidirectional treadmills
  • Gesture
  • Skeleton – for body and finger tracking

See the interface definition section

Windows Installer

The Windows installer provides a simple way to install the OSVR server. See

Additional improvements

  • OSVR-Devel mailing/discussion list now fully up and running, now including searchable web archives and a web interface courtesy of Gmane – see
  • Managed-OSVR (used by OSVR-Unity and OSVR-Monogame):
    • Improved method of pre-loading native libraries
    • Wrappers for the State API (in addition to the existing Callback API)
    • Wrappers for Eye tracker, Direction, and Position2D device interface classes added
  • OSVR-Unity: see also
    • Switched to using externally-built Managed-OSVR, for improved reliability, quicker development, and multiplatform (including 64-bit) support
    • Tagged version 0.2 following this migration.
    • Made the ClientKit prefab a persistent singleton, preventing duplication and instability in multi-scene projects.
    • Updated way of making custom code access OSVR callbacks and state (RequiresAnalogInterface, etc)
    • Updated examples, code to use new Managed-OSVR InterfaceAdapter functionality.

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, such as in this link for OSVR-Core

Interested in contributing? Start here:

Any questions or issues? email

Thank you for being part of OSVR!

OSVR platform updates, Android distro and more

We wanted to update you about several new capabilities of OSVR:

  • OSVR Android build and example program
  • New developer discussion list
  • Support for additional HMDs
  • New imager interface
  • New MonoGame engine integration

Android Distribution for OSVR

OSVR remains committed to providing multi-platform, multi-engine, multi-device support. We have now released an initial Android OSVR distribution (fully open-sourced, of course) and also a simple Unity demo (“OSVR Palace”, which also works on Windows) to go with it. Visit, your gateway to the OSVR source code, or the OSVR-Android GitHub project directly at

New developer discussion list and archive

This new list allows you to communicate directly with other members of the OSVR community and to review archives of previous discussions. You can manage your subscription, view message archives, and review our etiquette guidelines at

If you are receiving this OSVR update, it does not mean that you are automatically subscribed to the OSVR mailing list, so please register to the developer discussion list if you are interested.

Support for additional HMDs

We added JSON device descriptors for several HMDs, including those from Vuzix, Oculus, VRVana, and Sensics. These provide support for various resolutions, varying numbers of video inputs (where supported), and distortion correction coefficients. By selecting the appropriate device descriptor, an OSVR application can automatically (without recompiling) adapt to new HMDs. This continues our drive to support an ever-growing list of peripherals and displays. Adding a display descriptor is easy, and we welcome submissions for additional models.

New Imager Interface

The imager interface allows receiving video streams from connected cameras, both locally and over the network. Multiple OSVR applications can subscribe to the same video stream. Images are sent using an OpenCV format, allowing immediate access to the de facto open-source standard for computer vision and image processing. Imagers are useful for a wide range of applications: augmented reality, SLAM, face recognition, and more. Devices such as the Leap Motion also provide an imaging interface. The imager interface also includes sample programs.

MonoGame plugin

MonoGame is free software used by game developers to make their Windows and Windows Phone games run on other systems. It currently supports Mac OS, Linux, iOS, Android, PlayStation Mobile, and the OUYA console. It implements the Microsoft XNA 4 application programming interface (API).

The OSVR plugin for MonoGame (thanks @JeroMiya !) connects OSVR to MonoGame and allows MonoGame apps to run on OSVR-supported HMDs and hardware. See an introductory video here

Additional game engine integrations are coming!

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, such as in this link for OSVR-Core

New OSVR Resources and Platform Updates

We wanted to update you regarding three news items regarding OSVR:

  • Availability of new repository for interface definitions
  • New instructional resources for OSVR
  • Config files for additional devices

OSVR interface specs and device integration proposals

A new GitHub repository has been made public covering interface specs and device-integration proposals. An interface – a pipe of data – is an important concept in OSVR. The repository includes specific types of interfaces such as:

  • Eye tracking
  • Locomotion (such as an “omnidirectional treadmill”)
  • Skeleton
  • Enhanced tracker
  • Imaging
  • Analog
  • Buttons

It includes discussions on how to factor vendor-specific devices into OSVR interfaces, as well as GraphML diagrams of the proposals.

We decided it’s best to have these discussions in the open to allow all the interested developers to follow and contribute to the formation of the OSVR interfaces. Contributions and comments are most welcome!

New instructional resources for OSVR:

  • New SlideShare presentation and speaker notes regarding path trees. OSVR maintains a “path tree” – similar to a URL or file system path – in which all the sensing and rendering data is made available. Aliases are configured in the server to essentially redirect from a semantic path (a path with a meaningful name) all the way back to the system-specific hardware details. Thus, while direct device access by name is possible, we recommend accessing semantic paths. Just like a game allows mapping of various buttons to various game actions, OSVR allows defining the connection between interfaces, analysis plugins and actions. The connection between interfaces can be pre-loaded or can be changed dynamically. This presentation covers this important OSVR concept.
  • New SlideShare presentation and speaker notes providing introduction to OSVR hardware and software.
  • New 5-minute video showing how to integrate OSVR with Unity.
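As an illustration of the aliasing described above, a server configuration might redirect the semantic path /me/head to a device-specific tracker path along these lines (the exact device path shown is an example and will vary by hardware):

```json
{
  "aliases": {
    "/me/head": "/com_osvr_Multiserver/OSVRHackerDevKit0/semantic/hmd"
  }
}
```

Applications that request /me/head keep working unchanged when the underlying hardware, and therefore the right-hand side of the alias, changes.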

We welcome any requests for additional instructional material.

Configuration files for additional devices

Now that we have made it easy to import VRPN devices into OSVR, we’ve created OSVR config files for several of them, including:

  • Xbox controller S
  • Xbox Controller 360
  • Sidewinder
  • Sidewinder 2
  • Afterglow AX1
  • Saitek ST290
  • Logitech Extreme 3D Pro joystick
  • Futaba InterLink Elite
  • Griffin PowerMate
  • ART Dtrack flystick

These files can be found under bin/external-devices in the binary snapshots.