Category Archives: Uncategorized

April 2016 Developer Updates

I wanted to provide a brief update on new repositories, new features, and major enhancements to the OSVR software platform.

Highlights

  • New augmented reality plugin: OSVR-ARToolkit
  • Initial release of the interactive configuration utility OSVR-Config
  • One-click installers for end users and developers
  • Updates to the HDK firmware and OSVR Control software to support persistence and tracking modes
  • Many improvements and bug fixes in the GitHub repositories

Details below.

OSVR-ARToolkit

In collaboration with DAQRI (see announcement here), a new plugin has been created that connects the popular ARToolKit open-source augmented reality library with OSVR. The OSVR-ARToolkit repository is here.

ARToolKit is a tracking library for the creation of augmented reality applications. It uses video tracking capabilities to calculate the real camera position and orientation relative to square physical markers or natural feature markers in real time. ARToolKit was acquired by DAQRI, which then open-sourced what used to be called the professional version of ARToolKit. The OSVR plugin receives imaging reports from video plugins, such as OpenCV, and performs marker detection. This initial implementation tracks one pattern, patt.hiro (located in the Data folder), and sends a tracker pose giving the position and orientation of the marker on the path /me/hands/left, which can then be used in all the OSVR-supported game engines.
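
To illustrate how an application can consume this marker pose, here is a minimal client sketch using the standard OSVR ClientKit C++ wrapper and its pose callback. The application identifier string is just an illustrative placeholder.

#include <osvr/ClientKit/ClientKit.h>
#include <iostream>

// Called whenever the server relays a new marker pose from the ARToolKit plugin.
void markerPoseCallback(void * /*userdata*/, const OSVR_TimeValue * /*timestamp*/,
                        const OSVR_PoseReport *report) {
    std::cout << "Marker position: " << report->pose.translation.data[0] << ", "
              << report->pose.translation.data[1] << ", "
              << report->pose.translation.data[2] << std::endl;
}

int main() {
    osvr::clientkit::ClientContext context("com.example.ARMarkerViewer"); // illustrative app ID
    // The ARToolKit plugin publishes the Hiro marker pose on this path.
    osvr::clientkit::Interface marker = context.getInterface("/me/hands/left");
    marker.registerCallback(&markerPoseCallback, NULL);
    // A real application would loop until shutdown; this sketch just pumps for a while.
    for (int i = 0; i < 100000; ++i) {
        context.update(); // callbacks fire from here
    }
    return 0;
}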

Community contributions on connecting additional features of ARToolKit with OSVR are very welcome.

OSVR-Config

The OSVR-Config utility (binaries can be downloaded from the “Get OSVR” page; the source code repository is here) is an interactive utility that builds the JSON configuration file for the OSVR server and implements other features such as:

  • Enable/disable manually loaded plugins.
  • Look through and select one of a number of sample configurations.
  • Select a display configuration from a list.
  • Configure common RenderManager settings.
  • Launch the currently selected OSVR server.
  • Enable/disable direct mode.
  • Launch the Tracker Viewer utility to test your configuration.

This is an early release and we would very much welcome feedback.

[Image: the OSVR-Config utility]

One-Click Installers

A key area of improvement we identified in the recent developer survey (see here) was to simplify the installation and configuration procedures. In addition to the OSVR-Config utility, we are introducing new one-click installers (32- and 64-bit) for both developers and end users.

The installers can be found here for the runtime version and here for the developer version. Both include the RenderManager files, so a separate RenderManager install is no longer necessary.

Firmware and OSVR Control Improvements

  • Added the ability to set persistence levels. Use the OSVR Control software, which you can download from here. The persistence level (see image below) is automatically stored in the HDK until changed again. This feature requires upgrading the HDK firmware, which can also be done with the OSVR Control software.
  • Added the ability to choose Rotation Vector or Game Rotation Vector as the tracking mode. Rotation Vector senses magnetic north and does not drift. Game Rotation Vector is similar, except that it does not use the geomagnetic field; therefore it does not point north and drifts a bit over time. The advantage of Game Rotation Vector is that relative rotations are more accurate and are not impacted by magnetic field changes. If using Rotation Vector, it is recommended to calibrate the HMD any time it is moved to a different environment. To calibrate, move the HMD in a figure-eight motion for a few seconds and then place it on a flat surface (e.g. a desk) for 10 seconds. This causes the calibration to be stored in the HDK.

[Image: OSVR Control persistence settings]

Improvements and key bug fixes

OSVR-Core

  • Improved error handling/messages in case config files can’t be loaded or parsed.
  • Added new server auto-start API. This black-box style API will start the currently selected OSVR server, if it is not already running.
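
For reference, here is a minimal sketch of how a client might use the auto-start capability before creating its context. The header and function names are assumptions on my part; check the ClientKit headers shipped in your snapshot for the exact API.

#include <osvr/ClientKit/ClientKit.h>
#include <osvr/ClientKit/ServerAutoStartC.h> // assumed header name for the auto-start API

int main() {
    // Assumed call: ask the library to launch the currently selected server
    // if one is not already running.
    osvrClientAttemptServerAutoStart();

    osvr::clientkit::ClientContext context("com.example.AutoStartDemo"); // illustrative app ID
    // ... normal ClientKit usage ...

    // Assumed call: allow the library to shut down a server that it auto-started.
    osvrClientReleaseAutoStartedServer();
    return 0;
}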

OSVR-Unity

  • Fixed anti-aliasing bug in Unity where AA quality setting was not being applied by default.
  • Fixed bug where Unity was crashing when switching scenes in the editor.
  • RenderManager preview window is now off by default, substantially improving performance.
  • Added RenderManager and Android dependencies to OSVRUnity.unitypackage.

OSVR-Unreal

  • OSVR initialization is now more robust with longer time-based timeouts when context initialization fails.
  • Preliminary support for Android added.

See also the announcement regarding native support for OSVR in Unreal 4.12.

OSVR-RenderManager

  • Added client-side predictive tracking in RenderManager, with a different prediction interval for each eye. Predictive tracking can be performed on the server side or on the client side. Client side is better because the client knows best at what time the prediction needs to be run.
  • Added black borders during time warp rather than texture clamping. This was seen when the overfill ratio was fairly small – smaller than the movement required by time warping.
  • Changed default configurations to ones that reduce tearing and judder
  • Fixed structure returned from OpenDisplay to not send unused/unset fields
  • Fixed DirectMode to work with NVIDIA driver version 364.72 on at least some GPUs (other GPUs may still need a 362.xx series driver)
  • Fixed aliasing in second render pass when using Direct3D. This works together with a similar fix in the Unity plugin.
  • Added ability to use built-in distortion mesh for OSVR HDK 1.3. Generic distortion meshes are planned for the near future.
  • Much faster loading of unstructured grid distortion correction
  • Mac OpenGL Core profile support is now working
  • Builds and links (not yet tested) on Android
  • On Windows, snapshots now include the redistributable version of d3dcompiler_47.dll, easing Windows 7 compatibility.
  • Improved DirectModeDebugging tool for NVIDIA-based systems: now checks to see if hybrid graphics (“Optimus”) systems have a directly-connected external video port that can support direct mode, and also performs driver version compatibility checking.
  • Fixed multiple heterogeneous GPU support on Windows.
  • Added GLES 2.0 sample program
  • Removed all legacy OpenGL function calls from RenderManager
  • Fixed #define collision on Linux/X11
  • Installer now stores shortcuts in sub-folders by type

OSVR-Oculus-Rift

  • No longer relies on a VRPN driver.
  • Builds with multiple pre-1.0 runtimes.
  • Auto-detects Oculus Rifts on Windows, Linux, and OS X.

SteamVR-OSVR

  • Updated to work with the latest OpenVR 0.9.19.

Chat Rooms, Videos and other Resources

As a reminder, we’ve set up developer chat rooms at Gitter, and they are tied to the GitHub projects.

The list of all rooms is here: https://gitter.im/orgs/OSVR/rooms

Some existing rooms:

The ever-growing list of supported devices, operating systems, and engines is here.

The OSVR documentation repository is here.

VR tutorials on key topics such as time warping, foveated rendering, binocular overlap and more are here. To only see the tutorials, click on ‘key concepts’ on the right side of that page.

Want to Participate?

Interested in contributing? Start here: http://osvr.github.io/contributing/

Current areas of high-priority “help wanted” are here for OSVR-Core and here for RenderManager

Any questions or issues? email support@osvr.com

Thank you for being part of OSVR!

OSVR 0.6 Released

This release of the OSVR projects brings dozens of updates for OSVR, including many improvements submitted by the community of OSVR developers on GitHub. Thanks to all of the contributors to OSVR!

Below is a description of the major additions to OSVR, followed by a “release notes” document detailing smaller changes and bug fixes.

OSVR community members – please pay attention to the “contributions wanted” section in each feature and see if you are able to help us accelerate the pace of development for OSVR.

Major Features in OSVR-Core

Optical video-based (“positional”) tracking

Centralized Display Interface

Render Manager

Predictive Tracking

Profiling tools

JointClientKit

New Android Capabilities

Engine Integrations

Language Bindings

New Unity capabilities

New Plugins and Interfaces

Gesture interface

Locomotion interface

EyeTracker interface

SMI Eye Tracker plugin

Simulation plugins

Release Notes

OSVR-Core ClientKit API, Documentation, Examples

OSVR-Core PluginKit API, Documentation, Examples

OSVR-Core Internals, Bundled Plugins, Tools, and General Documentation

OSVR Tracker Viewer

Managed-OSVR .NET bindings

OSVR-Unity

OSVR-Unity-Palace-Demo

Distortionizer

OSVR-JSON-Schemas

SteamVR-OSVR

In Closing…

Major Features in OSVR-Core

Optical video-based (“positional”) tracking

The positional tracking feature uses IR LEDs that are embedded on the OSVR HDK along with the 100 Hz IR camera included with the OSVR HDK to provide real-time XYZ positional and orientation tracking of the HMD.

 The LEDs flash in a known pattern, which the camera detects. By comparing the location of the detected LEDs with their known physical locations, a position and orientation (pose) determination is made.

The software currently looks for two targets (LED patterns): one on the front of the HDK and one on the back. Additional targets can be added, so additional devices that have known IR LED patterns can also be tracked in the same space.

 It is also possible to assign different flashing patterns to multiple HDK units, thus allowing multiple HDK units to be tracked with the same camera. This is useful for multi-user play. Changing the IR codes on the HDK requires re-programming the IR LED board.

Sensics is working with select equipment developers to adapt the IR LED board and pattern to the specific shape of an object (e.g. a glove or gaming weapon) so that the object can also be tracked with the OSVR camera.

The tracking code is included with the OSVR Core source code and is installed as an optional component while we are optimizing the performance.  It will be set as the standard tracker once camera distortion, LED position optimization, and sensor fusion with IMU data have been implemented.

 The image below shows a built-in debugging window that indicates the original image overlaid with beacon locations (in red, a tag of -1 means that the beacon has not been visible long enough to be identified) and reprojected 3D LED poses (in green, even for beacons not seen in the image).  RANSAC-based pose estimation from OpenCV provides the tracking.
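
For readers curious about the heart of that computation, the sketch below shows how OpenCV's RANSAC-based PnP solver can recover a pose from matched 2D beacon detections and known 3D LED positions. This is a simplified illustration of the technique, not the actual OSVR tracker code; the function and variable names are placeholders.

#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/core/core.hpp>
#include <vector>

// beaconModel: known 3D LED positions on the target (from the target design)
// detections:  2D image locations of the beacons identified in this frame
// cameraMatrix/distCoeffs: intrinsic calibration of the IR camera
bool estimateTargetPose(const std::vector<cv::Point3f> &beaconModel,
                        const std::vector<cv::Point2f> &detections,
                        const cv::Mat &cameraMatrix, const cv::Mat &distCoeffs,
                        cv::Mat &rvec, cv::Mat &tvec) {
    if (detections.size() < 4 || detections.size() != beaconModel.size()) {
        return false; // need at least four matched beacon correspondences
    }
    // RANSAC rejects mis-identified beacons while solving for the target's
    // rotation (rvec, a Rodrigues vector) and translation (tvec) in camera space.
    cv::solvePnPRansac(beaconModel, detections, cameraMatrix, distCoeffs, rvec, tvec);
    return true;
}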

Contributions wanted:

  • Make the system more configurable by moving configuration parameters into an external JSON configuration file. These parameters could include the camera to use, optimization parameters, and LED positions.
  • Add Kalman optimal estimation filter to combine pose information from the video-based tracker and inertial measurements from the HDK’s inertial measurement unit into a combined pose + velocity + acceleration tracker that will provide smoother tracking and that can be used for predictive positional tracking.
  • Combine the output from multiple cameras for wide-area tracking.
  • Account for the optical distortion of the camera in the analysis.
  • Create a calibration tool that improves performance by accounting for the slight manufacturing variation in the LED position.
  • Create a tool that simplifies the process of adding new objects and IR LED patterns.

Centralized Display Interface

The OSVR-Core API now includes methods to retrieve the output of a computational model of the display. Previously, applications or game engine integrations were responsible for parsing display description JSON data and computing transformations themselves. This centralized system allows for improvements in the display model without requiring changes in applications, and also reduces the amount of code required in each application or game engine integration.

The conceptual model is “viewer-eye-surface” (also known as “viewer-screen”), rather than a “conventional camera”, as this suits virtual reality better.[1] However, it has been implemented to be usable in engines (such as Unity) that are camera-based, as the distinction is primarily conceptual.

As a demonstration of this API, a fairly minimal OpenGL sample (using SDL2 to open the window) is now included with OSVR-Core.
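
As a rough sketch of how an engine integration might consume the API, the loop below enumerates viewers, eyes, and surfaces through the C display functions. The count and startup calls are shown to the best of my recollection, and the per-surface viewport and matrix accessors are left as comments; verify the exact names against the DisplayC.h header. The application identifier is a placeholder.

#include <osvr/ClientKit/ClientKit.h>
#include <osvr/ClientKit/DisplayC.h>

int main() {
    osvr::clientkit::ClientContext context("com.example.DisplayEnum"); // illustrative app ID
    OSVR_DisplayConfig display = nullptr;
    if (osvrClientGetDisplay(context.get(), &display) != OSVR_RETURN_SUCCESS) {
        return 1; // no display description available
    }
    // The display configuration may need a few update cycles before it is fully populated.
    while (osvrClientCheckDisplayStartup(display) != OSVR_RETURN_SUCCESS) {
        context.update();
    }
    OSVR_ViewerCount viewers = 0;
    osvrClientGetNumViewers(display, &viewers);
    for (OSVR_ViewerCount v = 0; v < viewers; ++v) {
        OSVR_EyeCount eyes = 0;
        osvrClientGetNumEyesForViewer(display, v, &eyes);
        for (OSVR_EyeCount e = 0; e < eyes; ++e) {
            OSVR_SurfaceCount surfaces = 0;
            osvrClientGetNumSurfacesForViewerEye(display, v, e, &surfaces);
            // For each surface, query its viewport, projection matrix, and
            // distortion parameters here (see DisplayC.h for the accessor names).
        }
    }
    osvrClientFreeDisplay(display);
    return 0;
}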

Contributions wanted:

  • OpenGL-SDL sample that uses the distortion parameter API to apply the distortion shader.

Render Manager

The Sensics/OSVR Render Manager provides optimal low-latency rendering on any OSVR-supported device.  Render Manager currently provides an enhanced experience with NVIDIA’s Gameworks VR technology on Windows. Support for additional vendors (e.g. AMD, Intel) is being worked on. We are also exploring the options to work with graphics vendors for mobile environments.

 Unlike most of the OSVR platform, the Render Manager is not open-sourced at this point. The main reason is that the NVIDIA API was provided to Sensics under NDA and thus we cannot expose it at this time.

Key features enabled by the Render Manager:

  • DirectMode: Enable an application to treat VR Headsets as head mounted displays that are accessible only to VR applications, bypassing rendering delays typical for Windows displays. DirectMode supports both Direct3D and OpenGL applications.
  • Front-Buffer Rendering: Renders directly to the front buffer to reduce latency.
  • Asynchronous Time Warp: Reduces latency by making just-in-time adjustments to the rendered image based on the latest head orientation after scene rendering but before sending the pixels to the display.  This is implemented in the OpenGL rendering pathway (including DirectMode) and hooks are in place to implement it in Direct3D.  It includes texture overfill on all borders for both eyes and supports all translations and rotations, given an approximate depth to apply to objects in the image.

Coming very soon:

  • Distortion Correction: Handling the per-color distortion found in some HMDs requires post-rendering distortion.  The same buffer-overfill rendering used in Asynchronous Time Warp will provide additional image regions for rendering.
  • High-Priority Rendering: Increasing the priority of the rendering thread associated with the final pixel scan-out ensures that every frame is displayed on time.
  • Time Tracking: Telling the application what time the future frame will be displayed lets it render the appropriate scene.  This also enables the Render Manager to do predictive tracking when producing the rendering transformations and asynchronous time warp.  The system also reports the time taken by previous rendering cycles, letting the application know when to simplify the scene to maintain an optimal update rate.
  • Unity Low-level Native Plugin Interface: A Rendering Plugin will soon enable Render Manager’s features in Unity, and enable it to work with Unity’s multithreaded rendering.

 Render Manager is currently available only for OSVR running on Windows.

Several example programs and configuration files for OpenGL (fixed-pipeline and shader code versions, callback-based and client-defined buffers based) and Direct3D11 (callback-based and client-defined buffers based, library-defined device and client-defined device) are provided and open-sourced. Also included is a program with adjustable rendering latency that can be used to test the effectiveness of asynchronous time warp and predictive tracking as application performance changes.

The RenderManager library features are controlled through a JSON configuration file:

{
  "meta": {
    "schemaVersion": 1
  },
  "render_manager_parameters": {
    "direct_mode": false,
    "direct_display_index": 0,
    "asynchronous_time_warp": false,
    "render_overfill_factor": 1.0,
    "num_buffers": 2,
    "vertical_sync": true,
    "render_library": "OpenGL",
    "window_title": "OSVR",
    "window_full_screen": true,
    "window_x_position": 1280,
    "window_y_position": 0,
    "display_rotation": 0,
    "bits_per_color": 8
  }
}

 Contributions wanted:

  • We are seeking to work with additional graphics chip vendors to create a universal, multi-platform library for high-performance rendering.

Predictive Tracking

Predictive tracking reduces the perceived latency between motion and rendering by estimating the head pose at a future point in time. At present, OSVR predictive tracking uses the angular velocity of the head to estimate orientation 16 ms (1 frame at 60 FPS) into the future.

 Angular velocity is available as part of the orientation report from the OSVR HDK. For other HMDs that do not provide angular velocity, it can be estimated using finite differencing of successive angular position reports.
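
As an illustration of the idea (not the actual OSVR implementation), the sketch below advances an orientation quaternion by an angular-velocity vector over a small look-ahead interval. Whether the incremental rotation should multiply on the left or the right depends on the frame in which the angular velocity is reported; the convention used here is an assumption.

#include <cmath>

struct Quat { double w, x, y, z; };

// Predict orientation dt seconds ahead given angular velocity (wx, wy, wz) in rad/s:
// q_pred = dq * q, where dq is the axis-angle rotation accumulated over dt.
Quat predictOrientation(const Quat &q, double wx, double wy, double wz, double dt) {
    double speed = std::sqrt(wx * wx + wy * wy + wz * wz); // |w| in rad/s
    double angle = speed * dt;                             // total rotation over dt
    if (angle < 1e-9) {
        return q; // negligible motion: prediction is just the current orientation
    }
    double s = std::sin(angle / 2.0) / speed; // scales w into the quaternion axis part
    Quat dq{std::cos(angle / 2.0), wx * s, wy * s, wz * s};
    // Hamilton product dq * q
    return Quat{dq.w * q.w - dq.x * q.x - dq.y * q.y - dq.z * q.z,
                dq.w * q.x + dq.x * q.w + dq.y * q.z - dq.z * q.y,
                dq.w * q.y - dq.x * q.z + dq.y * q.w + dq.z * q.x,
                dq.w * q.z + dq.x * q.y - dq.y * q.x + dq.z * q.w};
}

// Example: predict one 60 FPS frame (16 ms) ahead, matching the current OSVR default.
// Quat predicted = predictOrientation(current, wx, wy, wz, 0.016);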

 Contributions wanted:

  • Improve the algorithm to extract velocity from non-HDK trackers.
  • Extract angular acceleration and use that to improve the quality of predictive tracking.
  • Fuse angular (yaw/pitch/roll) and linear (X/Y/Z) data to improve quality of positional tracking.
  • Configure the look-ahead time either through API or through an external configuration file.

Profiling tools

Utilizing ETW – Event Tracing for Windows – the OSVR performance profiler allows optimizing application performance by identifying performance bottlenecks throughout the entire software stack.

 

 Event Tracing for Windows (ETW) is an efficient kernel-level tracing facility that lets you log kernel or application-defined events to a log file and then interactively inspect and visualize them with a graphical tool. As the name suggests, ETW is available only for the Windows platform. However, OSVR-Core’s tracing instrumentation and custom events use an internal, cross-platform OSVR tracing API for portability.

 Currently the default libraries have tracing turned off to minimize any possible performance impacts. However, the “tracing” directory contains tracing-enabled binaries along with instructions on how to swap them in to use the tracing features. See this slide deck for a brief introduction on this powerful tool: http://osvr.github.io/presentations/20150901-Intro-ETW-OSVR/

 Contributions wanted:

  • Identify and add additional useful custom tracing events.
  • Implement the internal OSVR tracing API using other platforms’ equivalents of ETW for access to the instrumented events.
  • Measure the performance impact of running a tracing-enabled build, to potentially enable it by default.

JointClientKit

The default OSVR configuration has the client and server run as two separate processes. Amongst other advantages, this keeps the device servicing code out of the “hot path” of the render loop and allows multiple clients to communicate with the same server.

 In some cases, it may be useful to run the server and client in a single process, with their main loops sharing a single thread. Examples where this might be useful include automated tests, special-purpose apps, or apps on platforms that do not support interprocess communication or multiple threads (in which case no async plugins can be used either). The new JointClientKit library was added to allow those special use cases: it provides methods to manually configure a server that will run synchronously with the client context.

 Note that the recommended usage of OSVR has not changed: you should still use ClientKit and a separate server process in nearly all cases. Other ways of simplifying the user experience, including hidden/launch-on-demand server processes, are under investigation.

Contributions wanted:

  • Automated tests for ClientKit APIs, using JointClientKit to start a temporary server with only dummy plugins loaded.

New Android Capabilities

A new device plugin has been written to support Android orientation sensors. This plugin exposes an orientation interface to the OSVR server running on an Android device. This is available here: https://github.com/OSVR/OSVR-Android-Plugins

A new Android OpenGL ES 2.0 sample demonstrates basic OSVR usage on Android in C++. You can find this sample here: https://github.com/OSVR/OSVR-Android-Samples

An early version of an Android app has been written that launches the OSVR server on a device to run in the background. This eliminates the need to root the phone, which existed in previous OSVR/Android versions. You can find this code here: https://github.com/OSVR/OSVR-AndroidServerLauncher

The Unity Palace demo (https://github.com/OSVR/OSVR-Unity-Palace-Demo/releases/download/v0.1.1-android/OSVR-Palace-Android-0.1.1.zip) for Android can now work with the internal orientation sensors as well as with external sensors.

Contributions wanted:

  • Predictive tracking/filtering in the Android sensor plugin.
  • Sensor fusion to improve fidelity of tracking.
  • Additional sample apps.
  • Connect to Android camera to provide Imager interface for Android.

Engine Integrations

OSVR continues to expand the range of engines with which it integrates. These include:

  • Unity
  • Unreal
  • Monogame
  • Valve OpenVR (in beta): https://github.com/OSVR/SteamVR-OSVR

 Here are new integrations as well as improvements to existing integrations:

Language Bindings

The .NET language bindings for OSVR have been updated to support new interface types for eye tracking. This includes 2D and 3D eye tracking, direction, location (2D), and blink interface types.

New Unity capabilities

Unity adapters for the eye tracking interface types have been added, as well as prefabs and utility behaviors that make it easier to incorporate eye tracking functionality into Unity scenes.

The optional distortion shader has been completely reworked, to be more efficient as well as to provide a better experience with Unity 4.6 free edition.

The OSVR-Unity plugin now retrieves the output of the computational display model from the OSVR-Core API. This eliminates the need to parse JSON display descriptor data in Unity, which allows for improvements in the display model without having to rebuild a game. The “VRDisplayTracked” prefab has been improved to create a stereo display at runtime based on the configured number of viewers and eyes.

 Coming soon: Distortion Mesh support. Mesh-based distortion uses a pre-computed mesh rather than a shader to do the heavy lifting for distortion. There is a working example in a branch of OSVR-Unity.

 Contributions wanted:

  • UI to display hardware specs, display parameters, and performance statistics.

New Plugins and Interfaces

Gesture interface

The gesture interface brings new functionality that allows OSVR to support devices that detect body gestures, including movements of the hands, head, and other parts of the body. This provides a way to integrate devices such as the Leap Motion®[2], Nod Labs Ring, Microsoft® Kinect®[3], Thalmic Labs Myo[4] armband, and many others. Developers can combine the gesture interface with other interfaces to provide meaningful information about the pose of the user's body parts, such as orientation, position, acceleration, and/or velocity.

New APIs have been added on the plugin and client sides to report and retrieve gestures for a device. The gesture API provides a list of preset gestures while staying flexible enough to allow custom gestures.

We added a simulation plugin – com_osvr_example_Gesture (see the description of simulation plugins below) – that uses the gesture interface to feed a list of gestures, and also created a sample client application that visually outputs the gestures received from the plugins. These tools are helpful when developing new plugins or client apps.

Using the new interface, we are working on releasing a plugin for Nod Ring that will expose a gesture interface as well as existing interfaces.

Contributions wanted:

  • Development of new plugins for devices that have gesture recognition, such as:
    • ThalmicLabs Myo Armband
    • Logbar Ring[5]
    • Other new devices are welcome

Locomotion interface

The locomotion interface adds an API to support the class of devices known as omnidirectional treadmills (ODTs), which allow walking and running on a motion platform and convert this movement into navigation input in a virtual environment. Some examples of devices that could use the locomotion interface are the Virtuix Omni, Cyberith Virtualizer, Infinadeck, and others. These devices are very useful for first-person shooter (FPS) games, and by combining the locomotion interface with a tracker, additional features such as body orientation and jump/crouch sensing could be added.

The API allows the ODTs to report the following data (on a 2D plane):

  • User’s navigational velocity
  • User’s navigational position

Contributions wanted:

  • Development of plugins for ODTs using Locomotion interface (walking / running) combined with additional OSVR interfaces (jumping, crouching, looking around)

EyeTracker interface

The EyeTracker interface provides an API to report detailed information about the movement of one or both eyes.

 This includes support for reporting:

  • 3D gaze direction – a unit directional vector in 3D
  • 2D gaze direction – location within 2D region
  • Detection of blink events

Eye trackers are effective tools for interacting inside a VR environment, providing an intuitive way to make selections, move objects, and so on. The data reported from these devices can be analyzed for human behavior studies, marketing research, and other research topics, as well as for gaming applications. They can also be used to perform more accurate virtual reality rendering, customized for the location of the pupil every frame.

A .NET binding for EyeTracker (described above) allows easy integration of the eye tracking data into Unity.
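
Here is a minimal sketch of consuming 3D gaze data through the direction interface from C++. The registration function and report fields follow the naming pattern of the other ClientKit callbacks but are assumptions on my part, and the semantic path is purely hypothetical; check the ClientKit headers and your eye-tracker plugin's descriptor for the real names.

#include <osvr/ClientKit/ContextC.h>
#include <osvr/ClientKit/InterfaceC.h>
#include <osvr/ClientKit/InterfaceCallbackC.h>
#include <iostream>

// Assumed report type and field names, following the pattern of other interface reports.
void gazeCallback(void * /*userdata*/, const OSVR_TimeValue * /*timestamp*/,
                  const OSVR_DirectionReport *report) {
    std::cout << "Gaze direction: " << report->direction.data[0] << ", "
              << report->direction.data[1] << ", "
              << report->direction.data[2] << std::endl;
}

int main() {
    OSVR_ClientContext ctx = osvrClientInit("com.example.GazeDemo", 0); // illustrative app ID
    OSVR_ClientInterface eye = NULL;
    // Hypothetical semantic path; the actual path depends on the plugin's descriptor and aliases.
    osvrClientGetInterface(ctx, "/me/eyes/left", &eye);
    osvrRegisterDirectionCallback(eye, &gazeCallback, NULL); // assumed registration call
    // A real application would loop until shutdown; this sketch just pumps for a while.
    for (int i = 0; i < 100000; ++i) {
        osvrClientUpdate(ctx);
    }
    osvrClientShutdown(ctx);
    return 0;
}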

 Contributions wanted:

  • Create new plugins for eye tracking devices
  • Expand EyeTracker interface to report additional eye attributes such as pupil size, pupil aspect ratio, saccades

SMI Eye Tracker plugin

In collaboration with SensoMotoric Instruments GmbH (SMI) we are releasing a new plugin for SMI trackers. For instance, the plugin supports the SMI Upgrade Package for the Oculus Rift DK2. It uses the SMI SDK to provide real-time streaming of eye and gaze data and reports it via the EyeTracker interface.

The SMI plugin also provides an OSVR Imaging interface to stream the eye tracker images.

The plugin is available at https://github.com/OSVR/OSVR-SMI

 Contributions wanted:

  • Create similar plugins for other eye-tracking vendors.

Simulation plugins

Along with the newly added interfaces (eye tracker, gesture, locomotion), we provide simulation plugins that serve as examples of how to use a certain interface. Their purpose is to emulate a certain type of device (joystick, eye tracker, head tracker, etc.) connected to the OSVR server and to feed simulated data to the client. These plugins were added as a development tool so that developers can easily run tests without needing to attach multiple devices to the computer. We will be expanding the available simulation plugins to have one for every type of interface. Simulation plugins are available in OSVR-Core and can be modified for a specific purpose.
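
To give a flavor of what such a plugin looks like, here is a condensed synchronous simulation device modeled loosely on the bundled analog example: it feeds a slowly changing analog value to any connected client. The inline descriptor string is abbreviated, and the PluginKit signatures should be verified against the headers in your snapshot.

#include <osvr/PluginKit/PluginKit.h>
#include <osvr/PluginKit/AnalogInterfaceC.h>

class SimulatedAnalogDevice {
  public:
    SimulatedAnalogDevice(OSVR_PluginRegContext ctx) : m_value(0.0) {
        OSVR_DeviceInitOptions opts = osvrDeviceCreateInitOptions(ctx);
        osvrDeviceAnalogConfigure(opts, &m_analog, 1); // one analog channel
        m_dev.initSync(ctx, "SimulatedAnalog", opts);
        // Abbreviated device descriptor (normally generated from a JSON file).
        m_dev.sendJsonDescriptor("{\"interfaces\": {\"analog\": {\"count\": 1}}}");
        m_dev.registerUpdateCallback(this);
    }
    OSVR_ReturnCode update() {
        m_value += 0.01; // simulated, slowly ramping data
        if (m_value > 1.0) { m_value = 0.0; }
        osvrDeviceAnalogSetValue(m_dev, m_analog, m_value, 0);
        return OSVR_RETURN_SUCCESS;
    }
  private:
    osvr::pluginkit::DeviceToken m_dev;
    OSVR_AnalogDeviceInterface m_analog;
    double m_value;
};

OSVR_PLUGIN(com_example_SimulatedAnalog) {
    // Hand the simulated device to the server so it owns (and deletes) it.
    osvr::pluginkit::registerObjectForDeletion(ctx, new SimulatedAnalogDevice(ctx));
    return OSVR_RETURN_SUCCESS;
}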

Contributions wanted:

  • Create new simulation plugins for interfaces that do not have one already.
  • Add new or improve the existing data generating algorithms used in simulation plugins
  • Create new simulation plugins that use a combination of various interfaces such as joystick (button + analog)

Release Notes

Items listed here are generally in addition to the major items highlighted above, and do not include the hundreds of commits involved in the development and tuning of these or the above features – see the Git logs if you’d like to see all the details!

OSVR-Core ClientKit API, Documentation, Examples

  • All API updates have been reflected in the Managed-OSVR .NET bindings.
  • Added osvrClientCheckStatus()/ClientContext::checkStatus() method to expose whether the library was able to successfully connect to an OSVR server (a brief usage sketch follows this list).
  • Display API and OpenGL examples.
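
A small usage sketch for the new status check, assuming the C++ checkStatus() method returns a boolean (the application identifier is illustrative):

#include <osvr/ClientKit/ClientKit.h>

int main() {
    osvr::clientkit::ClientContext context("com.example.StatusCheck"); // illustrative app ID
    // Give the context a number of update cycles to establish the server connection.
    for (int i = 0; i < 200 && !context.checkStatus(); ++i) {
        context.update();
    }
    if (!context.checkStatus()) {
        return 1; // no OSVR server reachable; prompt the user to start one
    }
    // ... proceed with normal ClientKit usage ...
    return 0;
}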

OSVR-Core PluginKit API, Documentation, Examples

  • Added C++ wrapper to the configured device constructor API.
  • Added an example “configured” plugin
  • Improved self-contained example plugin.
  • Linked from the documentation to the OpenCV video capture plugin – a bundled plugin that also serves as an example.
  • Improved completeness and usability of the Doxygen-generated API documentation.

OSVR-Core Internals, Bundled Plugins, Tools, and General Documentation

  • A large number of improvements, including a review of content as well as custom styling, were applied to the generated API documentation (“Doxygen” output).
  • Timestamps are now verified for in-order arrival before updating client interface object state.
  • Improved compatibility of vendored COM smart pointer code.
  • Preliminary support for building the video-based tracker with OpenCV 3 (assertions quickly fail on an MSYS2/MinGW64 test build).
  • Decreased code duplication through use of an internal “UniqueContainer” type built on existing internal “ContainerWrapper” templates.
  • Client contexts now store their own deleters to allow safe cross-library deallocation, needed for JointClientKit.
  • Vendored “Eigen” linear algebra library updated from 3.2.4 to a post-3.2.5 snapshot of the 3.2 branch.
  • Reduced noisy configure messages from CMake.
  • Moved non-automated tests to an internal examples folder so they still get built (and thus verified as buildable) by CI.
  • Compile the header-dependency tests for headers marked as C-safe as both C and C++, since some headers have extra functionality when built as C++.

OSVR Tracker Viewer

  • Adjusted the default settings to show full pose, not just orientation, for the /me/head path.
  • Print a message to the command prompt about the command line arguments available when none are passed.
  • Start up tracker viewer zoomed in (distance of 1 meter to origin) to provide a more usable experience from the start.
  • Hide coordinate axes until the associated tracker has reported at least once.
  • Compile the “coordinate axes” model into the executable itself to remove a point of failure.

Managed-OSVR .NET bindings

  • Updates to include bindings for new APIs added to ClientKit.
  • Fix a lifetime-management bug that may have resulted in crashes (double-free or use-after-free) if a ClientContext was disposed of before child objects.
  • Added accessor for the raw ClientContext within the safe handle object, for use with code that interoperates with additional native code.

OSVR-Unity

  • Operation order tweaks that result in latency improvements.
  • ShaderLab/Cg/HLSL distortion shader simplified and optimized for higher performance.

OSVR-Unity-Palace-Demo

  • Updated to newer version of OSVR-Unity.
  • Disabled mouselook.

Distortionizer

  • Included OpenGL distortion shaders simplified and optimized for higher performance.

OSVR-JSON-Schemas

  • Display descriptor schema v1 elaborated with “as-used” details.

SteamVR-OSVR

  • Fixed infinite loop in hand-coded matrix initialization.
  • Improved/simplified math code using Eigen and its “map” feature.

In Closing…

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, such as in this link for OSVR-Core.

Interested in contributing? Start here: http://osvr.github.io/contributing/

Any questions or issues? email support@osvr.com

Thank you for being part of OSVR!

Yuval (“VRguy”) and the OSVR team


[1] Kreylos, 2012, “Standard camera model considered harmful.” <http://doc-ok.org/?p=27>

[2] Leap Motion is a trademark of Leap Motion, Inc.

[3] Microsoft and Kinect are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.

[4] ThalmicLabs™ and Myo™ are trademarks owned by Thalmic Labs Inc.

[5] Logbar Inc. Ring is a trademark of Logbar Inc.

 

New interfaces, SteamVR and more

It’s been a busy month of OSVR development, and we wanted to update you about some of the new and improved capabilities:

  • ndk-build compatible Android builds
  • SteamVR plugin
  • “Reset yaw” and other OSVR core improvements
  • New eye tracker and other interfaces
  • Windows installer
  • Additional improvements

Improved Android builds

ndk-build-compatible Android builds of OSVR are now being produced – see https://github.com/OSVR/OSVR-Android-SDK

SteamVR plugin (alpha)

This is a SteamVR driver for allowing applications written against that API to work with hardware and software running with the OSVR software framework. See https://github.com/OSVR/SteamVR-OSVR

Note that despite similar goals, the internal models of the two systems are not the same (in most cases, OSVR has a more flexible, descriptive model), so if you’re writing an application from scratch, be sure to evaluate that. This driver exists primarily for compatibility reasons, so existing software using the SteamVR system can run on OSVR.

There are still some issues with this plugin – see https://github.com/OSVR/SteamVR-OSVR/issues/5 – and we would appreciate the community’s help in getting this to the finish line.

OSVR Core improvements

Key improvements include:

  • “Reset yaw” tool added, allowing you to set the yaw direction of the head tracker to a desired ‘0’ position
  • Initial support for updated OSVR HDK tracker firmware, providing higher reporting rate and angular velocity reports

New Interfaces

Interfaces – pipes of data – provide an abstraction layer for VR devices and peripherals so that a game developer does not need to hardcode support for particular hardware. Instead, just like a Windows application prints to the Windows print services, the game developer connects to the OSVR abstraction layer. If a new VR goggle was introduced in 2016, an OSVR-based game published in 2015 could support this new goggle as soon as the goggle had an OSVR driver. 

New interfaces that have been completed or are imminent are:

  • Eye tracking
  • Locomotion – supporting omnidirectional treadmills
  • Gesture
  • Skeleton – for body and finger tracking

See the interface definition section https://github.com/OSVR/OSVR-Specs-and-Proposals

Windows Installer

The Windows installer provides a simple way to install the OSVR server. See https://github.com/OSVR/OSVR-win-installer#readme

Additional improvements

  • OSVR-Devel mailing/discussion list now fully up and running, now including searchable web archives and a web interface courtesy of Gmane – see http://osvr.github.io/mailing-lists/
  • Managed-OSVR (used by OSVR-Unity and OSVR-Monogame):
    • Improved method of pre-loading native libraries
    • Wrappers for the State API (in addition to the existing Callback API)
    • Wrappers for Eye tracker, Direction, and Position2D device interface classes added
  • OSVR-Unity: see also https://github.com/OSVR/OSVR-Unity/blob/master/OSVR-Unity/CHANGES.md
    • Switched to using externally-built Managed-OSVR, for improved reliability, quicker development, and multiplatform (including 64-bit) support
    • Tagged version 0.2 following this migration.
    • Made the ClientKit prefab a persistent singleton, preventing duplication and instability in multi-scene projects.
    • Updated way of making custom code access OSVR callbacks and state (RequiresAnalogInterface, etc)
    • Updated examples, code to use new Managed-OSVR InterfaceAdapter functionality.

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, such as in this link for OSVR-Core.

Interested in contributing? Start here: http://osvr.github.io/contributing/

Any questions or issues? email support@osvr.com

Thank you for being part of OSVR!

OSVR platform updates, Android distro and more

We wanted to update you about several new capabilities of OSVR:

  • OSVR Android build and example program
  • New developer discussion list
  • Support for additional HMDs
  • New imager interface
  • New MonoGame engine integration

Android Distribution for OSVR

OSVR remains committed to providing multi-platform, multi-engine, multi-device support. We have now released an initial Android OSVR distribution (fully open-sourced, of course) and also a simple Unity demo (“OSVR Palace” which also works on Windows) to go with it. Visit osvr.github.io, your gateway to the OSVR source code or the OSVR-Android Github project directly at https://github.com/OSVR/OSVR-Android-Build

New developer discussion list and archive

This new list allows you to directly communicate with other members of the OSVR community as well as to review archives of previous discussions. You can manage your subscription, view message archives, and view our etiquette guidelines at http://lists.getosvr.org/listinfo.cgi/osvr-devel-getosvr.org

If you are receiving this OSVR update, it does not mean that you are automatically subscribed to the OSVR mailing list, so please register to the developer discussion list if you are interested.

Support for additional HMDs

We added JSON device descriptors for several HMDs, including those from Vuzix, Oculus, VRVana, and Sensics. These provide support for various resolutions, various numbers of video inputs (where supported), and distortion correction coefficients. By selecting the device descriptor, an OSVR application can automatically (without re-compiling) adapt to new HMDs. This continues our drive to support an ever-growing list of peripherals and displays. Adding a display descriptor is easy and we welcome submissions for additional models.

New Imager Interface

The imager interface allows receiving video streams from connected cameras, both locally and over the network. Multiple OSVR applications can subscribe to the same video stream. Images are sent in an OpenCV format, allowing immediate access to the de facto open-source standard for computer vision and image processing. Imagers are useful for a wide range of applications: augmented reality, SLAM, face recognition, and more. Devices such as the Leap Motion also provide an imaging interface. The imager interface also includes sample programs.

MonoGame plugin

MonoGame is free software used by game developers to make their Windows and Windows Phone games run on other systems. It currently supports Mac OS, Linux, iOS, Android, PlayStation Mobile, and the OUYA console. It implements the Microsoft XNA 4 application programming interface.

The OSVR plugin for MonoGame (thanks @JeroMiya !) connects OSVR to MonoGame and allows MonoGame apps to run on OSVR-supported HMDs and hardware. See an introductory video here

Additional game engine integrations are coming!

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, such as in this link for OSVR-Core.

New OSVR Resources and Platform Updates

We wanted to update you on three news items regarding OSVR:

  • Availability of new repository for interface definitions
  • New instructional resources for OSVR
  • Config files for additional devices

OSVR interface specs and device integration proposals

A new GitHub repository has been made public covering interface specs and device integration proposals. An interface – a pipe of data – is an important concept in OSVR. The repository includes specific types of interfaces such as:

  • Eye tracking
  • Locomotion (such as an “omnidirectional treadmill”)
  • Skeleton
  • Enhanced tracker
  • Imaging
  • Analog
  • Buttons

It includes discussions on how to factor vendor-specific devices into OSVR interfaces as well as GraphML diagrams of the proposals.

We decided it’s best to have these discussions in the open to allow all the interested developers to follow and contribute to the formation of the OSVR interfaces. Contributions and comments are most welcome!


New instructional resources for OSVR:

  • New SlideShare presentation and speaker notes regarding path trees. OSVR maintains a “path tree” – similar to a URL or file system path – in which all the sensing and rendering data is made available. Aliases are configured in the server to essentially redirect from a semantic path (a path with a meaningful name) all the way back to the system-specific hardware details. Thus, while direct device access by name is possible, we recommend accessing semantic paths. Just like a game allows mapping of various buttons to various game actions, OSVR allows defining the connection between interfaces, analysis plugins and actions. The connection between interfaces can be pre-loaded or can be changed dynamically. This presentation covers this important OSVR concept.
  • New SlideShare presentation and speaker notes providing introduction to OSVR hardware and software.
  • New 5-minute video showing how to integrate OSVR with Unity.

We welcome any requests for additional instructional material.


Configuration files for additional devices

Now that we have made it easy to import VRPN devices into OSVR, we’ve created OSVR config files for several of them, including:

  • Xbox controller S
  • Xbox Controller 360
  • Sidewinder
  • Sidewinder 2
  • Afterglow AX1
  • Saitek ST290
  • Logitech Extreme 3D Pro joystick
  • Futaba InterLink Elite
  • Griffin PowerMate
  • ART Dtrack flystick

These files can be found under bin/external-devices in the binary snapshots.
 

New OSVR Core features: path tree updates

Here is an update on some important OSVR improvements:

OSVR Core:

This relates to version v0.2-1-g0f15a69

This release is an extensive update: fully-configurable path tree, use of device descriptors for auto-routing, and more.

Path tree refresher

Path trees are an important concept in OSVR and perhaps worth a quick reminder. OSVR maintains a “path tree” – similar to a URL or file system path – in which all the sensing and rendering data is made available. Aliases are configured in the server to essentially redirect from a semantic path (a path with a meaningful name) all the way back to the system-specific hardware details. Thus, while direct device access by name is possible, it is not recommended: instead, we recommend accessing semantic paths. This accommodates use cases where the hardware is not immediately available at startup or is changed during operation without any impact to the application developer. Some examples:

  • A path capable of position, orientation, or pose (position and orientation) callbacks or state access, associated with the left hand: /me/hands/left
  • Position of the “0” controller of the Hydra, directly accessed in the namespace of the com.osvr.Multiserver plugin: /com_osvr_Multiserver/RazerHydra0/tracker/0
  • Output of the first smoothing filter in a hypothetical org.example.smoothing plugin: /org_example_smoothing/smooth_filter/0

Just like a game allows mapping of various buttons to various game actions, OSVR allows defining the connection between interfaces, analysis plugins and actions. For instance:

  • /joystick/button/1 → /actions/fire maps the first joystick button into a fire action. While the game could choose to access /joystick/button/1 directly, it is recommended to access /actions/fire in this example because this allows changing the flow of information from the hardware through the OSVR layers without changing the game itself.
  • /com_osvr_Multiserver/RazerHydra0/tracker/0 → /org_example_smoothing/smooth_filter/0 → /me/hands/left specifies that the position of the first Hydra controller goes through a smoothing filter and then is mapped to the left hand.

The connection between interfaces can be pre-loaded or can be changed dynamically.

Key changes:

  • Updated server and client internal data structures, for:
    • Fully-configurable path tree, with support for single or multi-level aliases for semantic name use with no additional performance hit. For instance, this allows a game-friendly path such as /me/hands/left to be dynamically mapped to a path in the plugin /com_osvr_Multiserver/RazerHydra0/semantic/left which in turn is mapped to the physical path /com_osvr_Multiserver/RazerHydra0/tracker/0. All these mappings are resolved up-front so they do not create a performance hit at runtime.
    • More efficient data transfer.
  • Partial to complete auto-configuration of aliases based on JSON device descriptor data embedded in plugins.
  • Full access to external (local or remote) VRPN servers/devices with Tracker, Button, and Analog interfaces. (Previously there was only experimental support for Tracker devices.) See the updated osvr_server_config.externalvrpn.sample.json
  • Attributes of display now controlled and configurable by the server: see osvr_server_config.dSight.json for config info. This allows switching between various HMDs (or other displays) without recompiling the application.
  • Display descriptor files added for a wide range of HMDs and display modes. See the OSVR HDK descriptor as an example

Breaking changes:

  • To support the data structure updates, some internal messages were changed, so v0.2+ client libraries can only communicate with v0.2+ servers: v0.1 and v0.2 do not interoperate. (However, client ABI or overall client API did not change, so you should be able to just replace the DLLs without recompiling your client application.)
  • If your device driver was not submitting a JSON device descriptor, it will be mostly inaccessible as that is used to populate the path tree. See the example on how these work.
  • The method of configuring access to external VRPN servers has changed, so if you’re using that functionality, see the updated sample config files.
  • Plugins have been renamed, and many configuration files were able to be simplified greatly. See the updated sample config files.

Additional changes:

  • New tools: osvr_print_tree displays the contents of the path tree (try -h for help with command-line arguments), and PathTreeExport exports the path tree structure to DOT format – see the documentation for usage
  • Support for auto-detecting a USB serial port name by VID/PID.
  • More efficient data transfer
  • The contents of the osvrTransform and osvrRouting shared libraries (.dll files) have been folded into osvrCommon and thus those files are now no longer produced by the build, nor need to be distributed. This is internal API only, so no changes to apps expected.

Definition of OSVR interfaces:

We are working to define and enhance additional OSVR interfaces such as eye tracking, locomotion (for devices like those from Cyberith and Virtuix) as well as support additional devices. If you are interested in participating in this process, please let us know.

Additional updates:

1. The OSVR waffle.io board contains an overview of issues currently in GitHub issue trackers for all OSVR framework projects. It summarizes the issues in a number of lists:

  • Those under “Contributions Wanted” are likely to be the easiest entry points into contributing, though in some cases they may simply be tasks that require skills that existing contributors do not have.
  • Those under “Ready” are not blocked by any other issues and are approved for development for integration.
  • Those under “In Progress” should typically show a user who is working on them – if you want to take on a task, move it here and assign yourself.
  • The “Done” list should be self-explanatory – it contains closed issues from the past 7 days (automatically removed).
  • All issues not otherwise assigned to a list end up in “Backlog”. You’re welcome to work on these (whether through verification, triage, or development), though please use the issue thread to communicate your plans and progress.

Of course, the issue lists are not all-encompassing: if you’ve got a contribution you’d like to make, we’d love to see it! Filing an issue on the right project would be a great first step.

2. Follow OSVR on Twitter @OpenSource_VR

These latest updates also include contributions from the OSVR community, for which we are very thankful.

Interested in contributing? Start here: http://osvr.github.io/contributing/

Any questions or issues? email support@osvr.com

Thank you for being part of OSVR!

Unity plugin improvements, .Net bindings and new projects

Dear OSVR developers,

We wanted to briefly update you on several OSVR improvements:

Unity plugin:

Refers to version: 0.1.71-ga184206

Key changes (see CHANGES.md file for complete list):

  • Updated plugin to be compatible with Unity 5.
  • Increased frequency of OSVR client updates for reduced latency as well as additional performance improvements
  • Dynamically sets display parameters (field of view, resolution, side by side mode) based on JSON display descriptor. This allows the same Unity app to run well with multiple different OSVR-supported HMDs

We recommend that existing users of OSVR Unity plugin re-build their project with this new plugin. The performance and usability improvements are worth it.

Additional updates:

  • Improved locating of DLLs in both the editor and built players, and improved locating of the (temporary, game-local) json file in built players. Previously, some Unity projects ran fine inside the editor but had to have files copied over to run standalone. With this fix, it is no longer necessary to move the built executable to the project root.
  • Modified DisplayInterface to look for JSON in the _Data folder at runtime. If it’s not there, it loads JSON from the Unity scene as usual.
  • Moved DLLSearchPathFixer.fix() to Awake() and removed the constructor.
  • Created a static DeviceDescriptor.Parse method and moved code from DisplayInterface.GetDeviceDescription over to this static method.
  • Untabified changes to DeviceDescriptor and DisplayInterface.

This version does not yet include support for Unity 5 64-bit – that is in progress and being tested, expected soon. https://github.com/OSVR/OSVR-Unity/issues/19

Vuzix plugin:

Version v0.1-18-g7758a1c

  • Resolved issues with orientation and zero-point.

 

.NET bindings:

The base .NET bindings for OSVR have been extracted from the OSVR-Unity plugin into a separate repo – https://github.com/OSVR/Managed-OSVR – for optimal re-usability, and are being actively improved by us and the community. This separate version currently includes support for 32- and 64-bit native libraries, has a fully MSBuild-based build system, and builds assemblies for both the .NET 2.0 and 4.5 framework versions. A NuGet package is coming soon: https://github.com/OSVR/Managed-OSVR/issues/2

New projects on osvr.github.io:

  • OSVR HDK: includes production files for the OSVR Hacker Development Kit
  • sensics/Latency-test: includes code and instructions for Arduino-based latency tester

 

Definition of OSVR interfaces:

We are working to define and enhance additional OSVR interfaces such as eye tracking, locomotion (for devices like those from Cyberith and Virtuix) as well as support additional devices. If you are interested in participating in this process, please let us know.

Additional updates:

1. We have updated a couple of white papers on the OSVR site.

2. We are providing additional suggestions for the OSVR roadmap on our Wiki page.
3. The OSVR waffle.io board contains an overview of issues currently in GitHub issue trackers for all OSVR framework projects. It summarizes the issues in a number of lists:

  • Those under “Contributions Wanted” are likely to be the easiest entry points into contributing, though in some cases they may simply be tasks that require skills that existing contributors do not have.
  • Those under “Ready” are not blocked by any other issues and are approved for development for integration.
  • Those under “In Progress” should typically show a user who is working on them – if you want to take on a task, move it here and assign yourself.
  • The “Done” list should be self-explanatory – it contains closed issues from the past 7 days (automatically removed).
  • All issues not otherwise assigned to a list end up in “Backlog”. You’re welcome to work on these (whether through verification, triage, or development), though please use the issue thread to communicate your plans and progress.

Of course, the issue lists are not all-encompassing: if you’ve got a contribution you’d like to make, we’d love to see it! Filing an issue on the right project would be a great first step.

These latest updates also include contributions from the OSVR community, for which we are very thankful.

Interested in contributing? Start here: http://osvr.github.io/contributing/

Any questions or issues? email support@osvr.com

Thank you for being part of OSVR!

 

Improved Unity plugin (v92), new Unreal and Vuzix plugins

We wanted to briefly update you on several OSVR improvements: an improved Unity plugin, a new plugin for Unreal Engine and a new plugin for Vuzix HMDs.

Improved Unity plugin

Version 92 of the Unity plugin adds several new improvements:

  • Compatibility with Unity 5
  • Added distortion correction shader that uses parameters from the display JSON descriptor (discovered through the distortionizer tool)
  • Support for wide-field HMDs with partial binocular overlap such as the Sensics dSight and VRUnion Claire
  • Performance improvements
  • Added NEWS.md file to summarize recent changes

We also updated the whitepaper describing how to convert Oculus Unity projects to OSVR Unity projects. It can be found at http://osvr.github.io/

New Unreal Engine plugin

The Unreal Engine plugin is now public on Github. Please visit http://osvr.github.io/build-with/ or https://github.com/OSVR/OSVR-Unreal

New plugin for Vuzix HMDs

This new plugin supports Wrap1200 and iWear 720 HMDs. Find the source code here https://github.com/OSVR/OSVR-Vuzix or download the latest build here: http://osvr.github.io/using/

This plugin enables OSVR applications to run on Vuzix products. We are continuing to add support for additional vendors.

These latest updates also include contributions from the OSVR community. Interested in contributing? Start here: http://osvr.github.io/contributing/

Any questions or issues? email support@osvr.com

A brief overview of the OSVR software repositories

The OSVR team opened up most of the source code repositories to the public this weekend, and a few additional repositories will be opened in the coming days.  Many have asked us for a brief overview of the project.

The best place to start is osvr.github.io. If you haven’t read the ‘Introduction to OSVR’ whitepaper, you might want to do so.

There are several GitHub projects under the /osvr organization. They are as follows:

Key projects:

  • OSVR-Core : this is the heart of the project. The OSVR_server executable connects the game to the OSVR hardware and software components.

Utilities:

  • OSVR-Tracker-Viewer is a utility that graphically shows the position and orientation of the head and hand controllers. It is also an OSVR client, and thus an example of how to connect to and extract data from the server.
  • Distortionizer: a utility to estimate distortion correction parameters for various HMDs and a shader to implement the parameters estimated by the Distortionizer. OSVR has JSON descriptor files for HMDs (and many other objects) and the distortion parameters are part of that JSON file

Game engine plugins:

  • OSVR-Unity includes a prefab component that can be imported into Unity.
  • OSVR-Unreal (to be released later this week) is an Unreal Engine plugin

Development tools:

  • OSVR-Boxstarter is a Boxstarter install that helps quickly set up a development environment on a Windows machine
  • OSVR-JSON-Editor is the source code for a tool (deployed version here) that helps create and edit the JSON descriptor files
  • OSVR-JSON-Schemas is a repository for such JSON files

Plugins:

  • OSVR-Oculus-Rift provides a plugin that allows using the position and orientation data of an Oculus device inside OSVR.
  • OSVR-Vuzix (to be released later this week) does the same for Vuzix headsets
Additional projects are coming. There is also a wiki page. Issues are currently tracked as part of the GitHub pages of the projects, and we are looking to add an open-source project management tool.

OSVR (licensed under the Apache 2.0 license) aims to create a multi-platform framework to detect, configure, and operate a wide range of VR devices, as well as to allow smart plugins that turn data into useful information. This is a big undertaking and there is a lot more work to be done to fulfill that vision, so we are looking for all the help we can get. We are super encouraged by the support and feedback we are getting from developers all over the world who believe in the open-source concept and want to make their contributions towards moving VR forward.

Let us know what you think: what’s missing, what your priorities are, and how we can get you involved. Welcome to OSVR.

OSVR now open to the public on GitHub

The OSVR project is now open to the public. All key software projects are now available and a few minor ones will be added in the next couple of days.

The best place to start is http://osvr.github.io/

Please also make sure to visit the Wiki, support portal, development blog and other OSVR sites available from the top right menu in http://osvr.github.io/

We look forward to working with the OSVR community to make OSVR better with every passing day.

Welcome!