OSVR 0.6 Released

This release of the OSVR projects brings dozens of updates for OSVR, including many improvements submitted by the community of OSVR developers on GitHub. Thanks to all of the contributors to OSVR!

Below is a description of the major additions to OSVR, followed by a “release notes” document detailing smaller changes and bug fixes.

OSVR community members – please pay attention to the “contributions wanted” section in each feature and see if you are able to help us accelerate the pace of development for OSVR.

Major Features in OSVR-Core

Optical video-based (“positional”) tracking

Centralized Display Interface

Render Manager

Predictive Tracking

Profiling tools

JointClientKit

New Android Capabilities

Engine Integrations

Language Bindings

New Unity capabilities

New Plugins and Interfaces

Gesture interface

Locomotion interface

EyeTracker interface

SMI Eye Tracker plugin

Simulation plugins

Release Notes

OSVR-Core ClientKit API, Documentation, Examples

OSVR-Core PluginKit API, Documentation, Examples

OSVR-Core Internals, Bundled Plugins, Tools, and General Documentation

OSVR Tracker Viewer

Managed-OSVR .NET bindings

OSVR-Unity

OSVR-Unity-Palace-Demo

Distortionizer

OSVR-JSON-Schemas

SteamVR-OSVR

In Closing…

Major Features in OSVR-Core

Optical video-based (“positional”) tracking

The positional tracking feature uses the IR LEDs embedded on the OSVR HDK, together with the HDK's bundled 100 Hz IR camera, to provide real-time position (XYZ) and orientation tracking of the HMD.

 The LEDs flash in a known pattern, which the camera detects. By comparing the location of the detected LEDs with their known physical locations, a position and orientation (pose) determination is made.

The software currently looks for two targets (LED patterns): one on the front of the HDK and one on the back. Additional targets can be added, so additional devices with known IR LED patterns can be tracked in the same space.

 It is also possible to assign different flashing patterns to multiple HDK units, thus allowing multiple HDK units to be tracked with the same camera. This is useful for multi-user play. Changing the IR codes on the HDK requires re-programming the IR LED board.

Sensics is working with select equipment developers to adapt the IR LED board and pattern to the specific shape of an object (e.g. glove, gaming weapon) so that that object can also be tracked with the OSVR camera.

The tracking code is included with the OSVR Core source code and is installed as an optional component while we are optimizing the performance.  It will be set as the standard tracker once camera distortion, LED position optimization, and sensor fusion with IMU data have been implemented.

 The image below shows a built-in debugging window that indicates the original image overlaid with beacon locations (in red, a tag of -1 means that the beacon has not been visible long enough to be identified) and reprojected 3D LED poses (in green, even for beacons not seen in the image).  RANSAC-based pose estimation from OpenCV provides the tracking.
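
To make the approach concrete, below is a minimal, self-contained sketch of RANSAC-based pose estimation using OpenCV's solvePnPRansac, the same routine the tracker builds on. The beacon coordinates, detected image points, and camera intrinsics here are placeholder values for illustration only, not the HDK's actual data.

#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/core/core.hpp>
#include <vector>

int main() {
    // Known 3D beacon positions on the tracked target, in meters
    // (placeholder values; the real HDK layout differs).
    std::vector<cv::Point3f> beacons = {
        {0.05f, 0.02f, 0.f}, {-0.05f, 0.02f, 0.f},
        {0.05f, -0.02f, 0.f}, {-0.05f, -0.02f, 0.f}};
    // Image locations (pixels) where those beacons were detected.
    std::vector<cv::Point2f> detections = {
        {420.f, 240.f}, {220.f, 238.f}, {424.f, 330.f}, {218.f, 332.f}};

    // Placeholder pinhole intrinsics; camera distortion is left
    // unmodeled here, matching the "contributions wanted" item below.
    cv::Matx33d cameraMatrix(700, 0, 320,
                             0, 700, 240,
                             0, 0, 1);
    cv::Mat distCoeffs;

    // Estimate the target's rotation and translation relative to the
    // camera; RANSAC rejects mis-detected or mis-identified beacons.
    cv::Mat rvec, tvec;
    cv::solvePnPRansac(beacons, detections, cameraMatrix, distCoeffs,
                       rvec, tvec);
    // rvec/tvec now hold the pose (Rodrigues rotation + translation).
    return 0;
}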

Contributions wanted:

  • Make the system more configurable by moving configuration parameters, such as the camera to use, optimization parameters, and LED positions, into an external JSON configuration file.
  • Add Kalman optimal estimation filter to combine pose information from the video-based tracker and inertial measurements from the HDK’s inertial measurement unit into a combined pose + velocity + acceleration tracker that will provide smoother tracking and that can be used for predictive positional tracking.
  • Combine the output from multiple cameras for wide-area tracking.
  • Account for the optical distortion of the camera in the analysis.
  • Create a calibration tool that improves performance by accounting for the slight manufacturing variation in the LED position.
  • Create a tool that simplifies the process of adding new objects and IR LED patterns.

Centralized Display Interface

The OSVR-Core API now includes methods to retrieve the output of a computational model of the display. Previously, applications or game engine integrations were responsible for parsing display description JSON data and computing transformations themselves. This centralized system allows for improvements in the display model without requiring changes in applications, and also reduces the amount of code required in each application or game engine integration.

The conceptual model is “viewer-eye-surface” (also known as “viewer-screen”), rather than a “conventional camera”, as this suits virtual reality better.[1] However, it has been implemented to be usable in engines (such as Unity) that are camera-based, as the distinction is primarily conceptual.

As a demonstration of this API, a fairly minimal OpenGL sample (using SDL2 to open the window) is now included with OSVR-Core.
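
In rough outline, a client using the new API looks like the sketch below. It is based on the C display API (osvr/ClientKit/DisplayC.h) shipped with this release, but treat the exact function names and signatures as approximate and consult the bundled OpenGL sample for authoritative usage.

#include <osvr/ClientKit/ContextC.h>
#include <osvr/ClientKit/DisplayC.h>

int main() {
    OSVR_ClientContext ctx = osvrClientInit("com.example.DisplayDemo", 0);

    /* Ask the core for its computational model of the display. */
    OSVR_DisplayConfig display;
    if (osvrClientGetDisplay(ctx, &display) != OSVR_RETURN_SUCCESS) {
        return 1;
    }
    /* The model needs initial tracker reports before it is usable. */
    while (osvrClientCheckDisplayStartup(display) != OSVR_RETURN_SUCCESS) {
        osvrClientUpdate(ctx);
    }

    /* Viewer-eye-surface model: iterate viewers, then eyes. */
    OSVR_ViewerCount viewers;
    osvrClientGetNumViewers(display, &viewers);
    for (OSVR_ViewerCount v = 0; v < viewers; ++v) {
        OSVR_EyeCount eyes;
        osvrClientGetNumEyesForViewer(display, v, &eyes);
        for (OSVR_EyeCount e = 0; e < eyes; ++e) {
            OSVR_Pose3 pose; /* "camera" pose for this viewer/eye */
            osvrClientGetViewerEyePose(display, v, e, &pose);
            /* ...set up and render the view from this pose... */
        }
    }

    osvrClientFreeDisplay(display);
    osvrClientShutdown(ctx);
    return 0;
}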

Contributions wanted:

  • OpenGL-SDL sample that uses the distortion parameter API to apply the distortion shader.

Render Manager

The Sensics/OSVR Render Manager provides optimal low-latency rendering on any OSVR-supported device. Render Manager currently provides an enhanced experience with NVIDIA's Gameworks VR technology on Windows. Support for additional vendors (e.g. AMD, Intel) is in progress, and we are also exploring options to work with graphics vendors for mobile environments.

 Unlike most of the OSVR platform, the Render Manager is not open-sourced at this point. The main reason is that the NVIDIA API was provided to Sensics under NDA and thus we cannot expose it at this time.

Key features enabled by the Render Manager:

  • DirectMode: Enable an application to treat VR Headsets as head mounted displays that are accessible only to VR applications, bypassing rendering delays typical for Windows displays. DirectMode supports both Direct3D and OpenGL applications.
  • Front-Buffer Rendering: Renders directly to the front buffer to reduce latency.
  • Asynchronous Time Warp: Reduces latency by making just-in-time adjustments to the rendered image based on the latest head orientation after scene rendering but before sending the pixels to the display.  This is implemented in the OpenGL rendering pathway (including DirectMode) and hooks are in place to implement it in Direct3D.  It includes texture overfill on all borders for both eyes and supports all translations and rotations, given an approximate depth to apply to objects in the image.

Coming very soon:

  • Distortion Correction: Handling the per-color distortion found in some HMDs requires post-rendering distortion.  The same buffer-overfill rendering used in Asynchronous Time Warp will provide additional image regions for rendering.
  • High-Priority Rendering: Increasing the priority of the rendering thread associated with the final pixel scan-out ensures that every frame is displayed on time.
  • Time Tracking: Telling the application what time the future frame will be displayed lets it render the appropriate scene.  This also enables the Render Manager to do predictive tracking when producing the rendering transformations and asynchronous time warp.  The system also reports the time taken by previous rendering cycles, letting the application know when to simplify the scene to maintain an optimal update rate.
  • Unity Low-level Native Plugin Interface: A Rendering Plugin will soon enable Render Manager’s features in Unity, and enable it to work with Unity’s multithreaded rendering.

 Render Manager is currently available only for OSVR running on Windows.

Several example programs and configuration files are provided and open-sourced, covering OpenGL (fixed-pipeline and shader-based versions, with callback-based and client-defined buffers) and Direct3D11 (callback-based and client-defined buffers, library-defined and client-defined devices). Also included is a program with adjustable rendering latency that can be used to test the effectiveness of asynchronous time warp and predictive tracking as application performance changes.

The RenderManager library features are controlled through a JSON configuration file:

{
  "meta": {
    "schemaVersion": 1
  },
  "render_manager_parameters": {
    "direct_mode": false,
    "direct_display_index": 0,
    "asynchronous_time_warp": false,
    "render_overfill_factor": 1.0,
    "num_buffers": 2,
    "vertical_sync": true,
    "render_library": "OpenGL",
    "window_title": "OSVR",
    "window_full_screen": true,
    "window_x_position": 1280,
    "window_y_position": 0,
    "display_rotation": 0,
    "bits_per_color": 8
  }
}

 Contributions wanted:

  • We are seeking to work with additional graphics chip vendors to create a universal, multi-platform library for high-performance rendering.

Predictive Tracking

Predictive tracking reduces the perceived latency between motion and rendering by estimating head pose at a future point in time. At present, OSVR predictive tracking uses the angular velocity of the head to estimate orientation 16 ms (one frame at 60 FPS) into the future.

 Angular velocity is available as part of the orientation report from the OSVR HDK. For other HMDs that do not provide angular velocity, it can be estimated using finite differencing of successive angular position reports.
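
As a concrete illustration of both steps, here is a small sketch using the Eigen library (which OSVR-Core vendors): angular velocity is estimated by finite differencing two successive orientation reports, then integrated over the look-ahead interval. This mirrors the idea, not OSVR's actual implementation.

#include <Eigen/Geometry>

// Estimate angular velocity from two orientation reports spaced dt
// seconds apart, then predict the orientation lookAhead seconds out
// (e.g. 0.016 for one frame at 60 FPS).
Eigen::Quaterniond predictOrientation(const Eigen::Quaterniond &qPrev,
                                      const Eigen::Quaterniond &qNow,
                                      double dt, double lookAhead) {
    // Incremental rotation between the two reports.
    Eigen::Quaterniond delta = qNow * qPrev.conjugate();
    Eigen::AngleAxisd aa(delta);
    // Finite-difference angular velocity in rad/s.
    Eigen::Vector3d omega = aa.axis() * (aa.angle() / dt);
    // Integrate that velocity over the look-ahead interval.
    double angle = omega.norm() * lookAhead;
    if (angle < 1e-9) {
        return qNow; // head effectively stationary
    }
    return Eigen::Quaterniond(
               Eigen::AngleAxisd(angle, omega.normalized())) *
           qNow;
}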

 Contributions wanted:

  • Improve the algorithm to extract velocity from non-HDK trackers.
  • Extract angular acceleration and use that to improve the quality of predictive tracking.
  • Fuse angular (yaw/pitch/roll) and linear (X/Y/Z) data to improve quality of positional tracking.
  • Configure the look-ahead time either through API or through an external configuration file.

Profiling tools

Utilizing ETW – Event Tracing for Windows – the OSVR performance profiler allows optimizing application performance by identifying performance bottlenecks throughout the entire software stack.

 

 Event Tracing for Windows (ETW) is an efficient kernel-level tracing facility that lets you log kernel or application-defined events to a log file and then interactively inspect and visualize them with a graphical tool. As the name suggests, ETW is available only for the Windows platform. However, OSVR-Core’s tracing instrumentation and custom events use an internal, cross-platform OSVR tracing API for portability.

 Currently the default libraries have tracing turned off to minimize any possible performance impacts. However, the “tracing” directory contains tracing-enabled binaries along with instructions on how to swap them in to use the tracing features. See this slide deck for a brief introduction on this powerful tool: http://osvr.github.io/presentations/20150901-Intro-ETW-OSVR/

 Contributions wanted:

  • Identify and add additional useful custom tracing events.
  • Implement the internal OSVR tracing API using other platforms’ equivalents of ETW for access to the instrumented events.
  • Measure the performance impact of running a tracing-enabled build, to potentially enable it by default.

JointClientKit

The default OSVR configuration has the client and server run as two separate processes. Amongst other advantages, this keeps the device servicing code out of the “hot path” of the render loop and allows multiple clients to communicate with the same server.

 In some cases, it may be useful to run the server and client in a single process, with their main loops sharing a single thread. Examples where this might be useful include automated tests, special-purpose apps, or apps on platforms that do not support interprocess communication or multiple threads (in which case no async plugins can be used either). The new JointClientKit library was added to allow those special use cases: it provides methods to manually configure a server that will run synchronously with the client context.
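
Usage is intended to look roughly like the following sketch. The option-building calls and type names here are assumptions based on the library's description; check osvr/JointClientKit/JointClientKitC.h for the actual API.

#include <osvr/ClientKit/ContextC.h>
#include <osvr/JointClientKit/JointClientKitC.h>

int main() {
    /* Manually configure the in-process server (names assumed). */
    OSVR_JointClientOpts opts = osvrJointClientCreateOptions();
    osvrJointClientOptionsLoadPlugin(opts, "com_osvr_example_AnalogSync");
    osvrJointClientOptionsTriggerHardwareDetect(opts);

    /* The returned context drives both main loops on one thread. */
    OSVR_ClientContext ctx =
        osvrJointClientInit("com.example.JointDemo", opts);
    for (int i = 0; i < 100; ++i) {
        osvrClientUpdate(ctx); /* services server and client together */
    }
    osvrClientShutdown(ctx);
    return 0;
}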

 Note that the recommended usage of OSVR has not changed: you should still use ClientKit and a separate server process in nearly all cases. Other ways of simplifying the user experience, including hidden/launch-on-demand server processes, are under investigation.

Contributions wanted:

  • Automated tests for ClientKit APIs, using JointClientKit to start a temporary server with only dummy plugins loaded.

New Android Capabilities

A new device plugin has been written to support Android orientation sensors. This plugin exposes an orientation interface to the OSVR server running on an Android device. This is available here: https://github.com/OSVR/OSVR-Android-Plugins

A new Android OpenGL ES 2.0 sample demonstrates basic OSVR usage on Android in C++. You can find this sample here: https://github.com/OSVR/OSVR-Android-Samples

An early version of an Android app has been written that launches the OSVR server on a device to run in the background. This eliminates the need to root the phone, which existed in previous OSVR/Android versions. You can find this code here: https://github.com/OSVR/OSVR-AndroidServerLauncher

The Unity Palace demo for Android (https://github.com/OSVR/OSVR-Unity-Palace-Demo/releases/download/v0.1.1-android/OSVR-Palace-Android-0.1.1.zip) can now work with the internal orientation sensors as well as with external sensors.

Contributions wanted:

  • Predictive tracking/filtering in the Android sensor plugin.
  • Sensor fusion to improve fidelity of tracking.
  • Additional sample apps.
  • Connect to the Android camera to provide an Imager interface on Android.

Engine Integrations

OSVR continues to expand the range of engines with which it integrates. These include:

  • Unity
  • Unreal
  • Monogame
  • Valve OpenVR (in beta): https://github.com/OSVR/SteamVR-OSVR

 Here are new integrations as well as improvements to existing integrations:

Language Bindings

The .NET language bindings for OSVR have been updated to support new interface types for eye tracking. This includes 2D and 3D eye tracking, direction, location (2D), and blink interface types.

New Unity capabilities

Unity adapters for the eye tracking interface types have been added, as well as prefabs and utility behaviors that make it easier to incorporate eye tracking functionality into Unity scenes.

The optional distortion shader has been completely reworked to be more efficient and to provide a better experience with the Unity 4.6 free edition.

The OSVR-Unity plugin now retrieves the output of the computational display model from the OSVR-Core API. This eliminates the need to parse JSON display descriptor data in Unity, which allows for improvements in the display model without having to rebuild a game. The “VRDisplayTracked” prefab has been improved to create a stereo display at runtime based on the configured number of viewers and eyes.

 Coming soon: Distortion Mesh support. Mesh-based distortion uses a pre-computed mesh rather than a shader to do the heavy lifting for distortion. There is a working example in a branch of OSVR-Unity.

 Contributions wanted:

  • UI to display hardware specs, display parameters, and performance statistics.

New Plugins and Interfaces

Gesture interface

The gesture interface brings new functionality that allows OSVR to support devices that detect body gestures, including movements of the hand, head, and other parts of the body. This provides ways to integrate devices such as the Leap Motion®[2], Nod Labs Ring, Microsoft® Kinect®[3], Thalmic Labs Myo[4] armband, and many others. Developers can combine the gesture interface with other interfaces to provide meaningful information about the pose of the user's body parts, such as orientation, position, acceleration, and/or velocity.

A new API has been added on the plugin and client sides to report and retrieve gestures for a device. The gesture API provides a list of preset gestures while staying flexible enough to allow custom gestures.

We added a simulation plugin, com_osvr_example_Gesture (see the description of simulation plugins below), that uses the gesture interface to feed a list of gestures, and we created a sample client application that visually outputs the gestures received from the plugins. These tools are helpful when developing new plugins or client apps.

Using the new interface, we are working on releasing a plugin for Nod Ring that will expose a gesture interface as well as existing interfaces.

Contributions wanted:

  • Development of new plugins for devices that have gesture recognition, such as:
  • ThalmicLabs Myo Armband
  • Logbar Ring[5]
  • New devices are welcome

Locomotion interface

The locomotion interface adds an API to support the class of devices known as omnidirectional treadmills (ODTs), which allow walking and running on a motion platform and convert this movement into navigation input in a virtual environment. Some examples of devices that could use the locomotion interface are the Virtuix Omni, Cyberith Virtualizer, and Infinadeck. These devices are very useful for first-person shooter (FPS) games, and by combining the locomotion interface with a tracker, additional features such as body orientation and jump/crouch sensing could be added.

The API allows the ODTs to report the following data (on a 2D plane):

  • User’s navigational velocity
  • User’s navigational position

Contributions wanted:

  • Development of plugins for ODTs using Locomotion interface (walking / running) combined with additional OSVR interfaces (jumping, crouching, looking around)

EyeTracker interface

The EyeTracker interface provides an API to report detailed information about the movement of one or both eyes.

 This includes support for reporting:

  • 3D gaze direction – a unit directional vector in 3D
  • 2D gaze direction – location within 2D region
  • Detection of blink events

EyeTracker devices are effective tools for interacting inside a VR environment, providing an intuitive way to make selections, move objects, and so on. The data reported from these devices can be analyzed for human behavior studies, marketing research, and other research topics, as well as used in gaming applications. Eye tracking can also drive more accurate virtual reality rendering, customized to the location of the pupil every frame.

A .NET binding for EyeTracker (described above) allows easy integration of the eye tracking data into Unity.
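
In C++, a client subscribing to gaze data might look like the sketch below, using the ClientKit wrapper. The semantic path "/me/eyes/left" and the OSVR_DirectionReport field layout are assumptions for illustration; consult the eye-tracking examples for the exact names.

#include <iostream>
#include <osvr/ClientKit/ClientKit.h>

// Called whenever a new 3D gaze direction report arrives.
void gazeCallback(void * /*userdata*/, const OSVR_TimeValue * /*t*/,
                  const OSVR_DirectionReport *report) {
    // Unit gaze vector in 3D (field names assumed).
    std::cout << "gaze direction: " << report->direction.data[0] << ", "
              << report->direction.data[1] << ", "
              << report->direction.data[2] << "\n";
}

int main() {
    osvr::clientkit::ClientContext ctx("com.example.EyeTrackerDemo");
    // Semantic path assumed; the actual alias depends on server config.
    osvr::clientkit::Interface eye = ctx.getInterface("/me/eyes/left");
    eye.registerCallback(&gazeCallback, nullptr);
    while (true) {
        ctx.update(); // pumps callbacks
    }
}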

 Contributions wanted:

  • Create new plugins for eye tracking devices
  • Expand the EyeTracker interface to report additional eye attributes such as pupil size, pupil aspect ratio, and saccades

SMI Eye Tracker plugin

In collaboration with SensoMotoric Instruments GmbH (SMI), we are releasing a new plugin for SMI trackers. For instance, the plugin supports the SMI Upgrade Package for the Oculus Rift DK2. It uses the SMI SDK to provide real-time streaming of eye and gaze data and reports it via the EyeTracker interface.

The SMI plugin also provides an OSVR Imaging interface to stream the eye tracker images.

The plugin is available at https://github.com/OSVR/OSVR-SMI

 Contributions wanted:

  • Create similar plugins for other eye-tracking vendors.

Simulation plugins

Along with the newly added interfaces (eye tracker, gesture, locomotion), we provide simulation plugins that serve as examples of how to use a given interface. Their purpose is to emulate a certain type of device (joystick, eye tracker, head tracker, etc.) connected to the OSVR server and to feed simulated data to the client. These plugins were added as a development tool, so that developers can easily run tests without needing to attach multiple devices to the computer. We will be expanding the available simulation plugins to have one for every type of interface. Simulation plugins are available in OSVR-Core and can be modified for a specific purpose. A condensed sketch of such a plugin appears below.
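
For a sense of what such a plugin involves, here is a condensed sketch of a simulation-style plugin feeding one analog channel, modeled on the bundled com_osvr_example_AnalogSync example. The JSON device descriptor a real plugin should send is omitted for brevity.

#include <osvr/PluginKit/PluginKit.h>
#include <osvr/PluginKit/AnalogInterfaceC.h>

class SimulatedAnalogDevice {
  public:
    SimulatedAnalogDevice(OSVR_PluginRegContext ctx) : m_value(0) {
        OSVR_DeviceInitOptions opts = osvrDeviceCreateInitOptions(ctx);
        osvrDeviceAnalogConfigure(opts, &m_analog, 1); // one channel
        m_dev.initSync(ctx, "SimulatedAnalog", opts);
        m_dev.registerUpdateCallback(this);
    }
    // Called every server mainloop iteration: feed simulated data.
    OSVR_ReturnCode update() {
        m_value += 0.1; // simple ramp as stand-in sensor data
        if (m_value > 10.) {
            m_value = 0;
        }
        osvrDeviceAnalogSetValue(m_dev, m_analog, m_value, 0);
        return OSVR_RETURN_SUCCESS;
    }

  private:
    osvr::pluginkit::DeviceToken m_dev;
    OSVR_AnalogDeviceInterface m_analog;
    double m_value;
};

OSVR_PLUGIN(com_example_SimulatedAnalog) {
    osvr::pluginkit::registerObjectForDeletion(
        ctx, new SimulatedAnalogDevice(ctx));
    return OSVR_RETURN_SUCCESS;
}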

Contributions wanted:

  • Create new simulation plugins for interfaces that do not have one already.
  • Add new or improved data-generating algorithms for the simulation plugins.
  • Create new simulation plugins that use a combination of interfaces, such as a joystick (button + analog).

Release Notes

Items listed here are generally in addition to the major items highlighted above, and do not include the hundreds of commits involved in the development and tuning of these or the above features – see the Git logs if you’d like to see all the details!

OSVR-Core ClientKit API, Documentation, Examples

  • All API updates have been reflected in the Managed-OSVR .NET bindings.
  • Added osvrClientCheckStatus()/ClientContext::checkStatus() method to expose whether the library was able to successfully connect to an OSVR server.
  • Display API and OpenGL examples.

OSVR-Core PluginKit API, Documentation, Examples

  • Added C++ wrapper to the configured device constructor API.
  • Added an example “configured” plugin.
  • Improved the self-contained example plugin.
  • Linked from the documentation to the OpenCV video capture plugin – a bundled plugin that also serves as an example.
  • Improved the completeness and usability of the Doxygen-generated API documentation.

OSVR-Core Internals, Bundled Plugins, Tools, and General Documentation

  • A large number of improvements, including a review of content as well as custom styling, were applied to the generated API documentation (“Doxygen” output).
  • Timestamps are now verified for in-order arrival before updating client interface object state.
  • Improved compatibility of vendored COM smart pointer code.
  • Preliminary support for building the video-based tracker with OpenCV 3 (assertions quickly fail on an MSYS2/MinGW64 test build).
  • Decreased code duplication through use of an internal “UniqueContainer” type built on existing internal “ContainerWrapper” templates.
  • Client contexts now store their own deleters to allow safe cross-library deallocation, needed for JointClientKit.
  • Vendored “Eigen” linear algebra library updated from 3.2.4 to a post-3.2.5 snapshot of the 3.2 branch.
  • Reduced noisy configure messages from CMake.
  • Moved non-automated tests to an internal examples folder so they still get built (and thus verified as buildable) by CI.
  • Compile the header-dependency tests for headers marked as C-safe as both C and C++, since some headers have extra functionality when built as C++.

OSVR Tracker Viewer

  • Adjusted the default settings to show full pose, not just orientation, for the /me/head path.
  • Print a message to the command prompt about the command line arguments available when none are passed.
  • Start up tracker viewer zoomed in (distance of 1 meter to origin) to provide a more usable experience from the start.
  • Hide coordinate axes until the associated tracker has reported at least once.
  • Compile the “coordinate axes” model into the executable itself to remove a point of failure.

Managed-OSVR .NET bindings

  • Updates to include bindings for new APIs added to ClientKit.
  • Fix a lifetime-management bug that may have resulted in crashes (double-free or use-after-free) if a ClientContext was disposed of before child objects.
  • Added accessor for the raw ClientContext within the safe handle object, for use with code that interoperates with additional native code.

OSVR-Unity

  • Operation order tweaks that result in latency improvements.
  • ShaderLab/Cg/HLSL distortion shader simplified and optimized for higher performance.

OSVR-Unity-Palace-Demo

  • Updated to newer version of OSVR-Unity.
  • Disabled mouselook.

Distortionizer

  • Included OpenGL distortion shaders simplified and optimized for higher performance.

OSVR-JSON-Schemas

  • Display descriptor schema v1 elaborated with “as-used” details.

SteamVR-OSVR

  • Fixed infinite loop in hand-coded matrix initialization.
  • Improved/simplified math code using Eigen and its “map” feature.

In Closing…

As always, the OSVR team, with the support of the community, is continuously adding smaller features and addressing known issues. You can see all of these on GitHub, for example in the OSVR-Core repository.

Interested in contributing? Start here: http://osvr.github.io/contributing/

Any questions or issues? Email support@osvr.com.

Thank you for being part of OSVR!

Yuval (“VRguy”) and the OSVR team


[1] Kreylos, 2012, “Standard camera model considered harmful.” <http://doc-ok.org/?p=27>

[2] Leap Motion is a trademark of Leap Motion, Inc.

[3] Microsoft and Kinect are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.

[4] ThalmicLabs™ and Myo™ are trademarks owned by Thalmic Labs Inc.

[5] Logbar Inc. Ring is a trademark of Logbar Inc.

 

OSVR Confirms Pre-order Date for Hacker Development Kit; Launches Content Discovery Platform


Software platform reveals 65 new supporting companies including Legendary VR and NVIDIA’s Gameworks VR

 

BERLIN (IFA Europe 2015) – Organizers of Open Source Virtual Reality (OSVR), an open software and hardware platform for virtual reality, today announced the Hacker Development Kit (HDK) v1.3, with public pre-orders opening on October 1, 2015.

It was also announced that the OSVR content discovery platform has launched in anticipation of the new HDK release. It is available for VR enthusiasts, developers and interested consumers now through OSVR.org. The platform features a fast-growing list of curated content ranging from 360-degree media to VR games from across the OSVR supporter network. It additionally provides a list of VR stores that contain OSVR-supported content, including the following:

  • Boondogl
  • Itch.Io
  • Qihoo360
  • Razer Cortex
  • Steam
  • V
  • Vrideo
  • WearVR

Curated content will rotate frequently to give all partner content equivalent exposure.

Interested parties may visit the platform here: http://www.osvr.org/featured.

Hacker Development Kit v1.3 Pre-Orders:
The HDK v1.3 will be made available to all for pre-order on the 1st of October with shipping later that month.

V1.3 improves upon v1.2 with a new optics module that maintains the HDK’s highly acclaimed picture quality while expanding the eyebox, allowing for sharper images without the need for independent interpupillary distance (IPD) adjustment lenses. It will also have individual eye focus for personalized use without glasses.

Major Software Updates

Driven by continued investments from its core team alongside a productive community of contributors, the OSVR software platform continues to grow and offer a complete set of capabilities for VR application developers. Some of the recent enhancements include:

OSVR Render Manager, allowing optimal low-latency rendering on any OSVR-supported device.

Currently supporting Gameworks VR, NVIDIA’s technology for VR headsets and games, the OSVR Render Manager provides:

  • Direct Mode—the NVIDIA driver treats VR headsets as head-mounted displays accessible only to VR applications, rather than as typical Windows monitors, providing better plug-and-play support and compatibility for VR headsets.
  • Front Render Buffering—enables the GPU to render directly to the front buffer to reduce latency.
  • Context Priority—provides headset developers with control over GPU scheduling to support advanced virtual reality features such as asynchronous time warp, which cuts latency and quickly adjusts images as gamers move their heads, without the need to re-render a new frame.

“Our work with NVIDIA is a key step towards achieving our goal of providing developers with a comprehensive infrastructure that allows creating high-performance VR and AR experiences regardless of the particular operating system, HMD platform, game engine or input peripherals”, says Yuval Boger, CEO of Sensics. “OSVR allows the VR/AR community to create exciting experiences without being locked in to any single vendor”.

Support for other graphics architectures is currently in development.

OSVR Performance Profiler
Utilizing ETW – Event Tracing for Windows – the OSVR performance profiler allows optimizing application performance by identifying performance bottlenecks throughout the entire software stack.

Event Tracing for Windows (ETW) is an efficient kernel-level tracing facility that lets you log kernel or application-defined events to a log file and then interactively inspect and visualize them with a graphical tool.

ABOUT OSVR:
OSVR™ is a software platform designed to set an open standard for virtual reality input devices, games and output to provide the best possible VR game experience. Supported by industry leaders, the OSVR framework unites developers and gamers alike under a single platform. Plug in. Play Everything.

For the full list of OSVR supporters go to www.osvr.org

Like OSVR on FB: https://www.facebook.com/OpenSourceVR.

Follow OSVR on Twitter: https://twitter.com/OpenSource_VR

PRESS CONTACTS:
press@osvr.org

# # #

APPENDIX: NEW OSVR SUPPORTERS:
The OSVR partner network has grown since the last update, with 65 new companies bringing the total number of supporters to 230.

A4VR is a one-stop creative agency specializing in turnkey virtual reality solutions. As a specialist in all areas of VR applications and 360° content, A4VR has its own motion capture studio, a 360° film production, a 3D VR graphics department, a recording studio, event specialists, and a unique interdisciplinary Innovation Unit. In their nearly 200 m² VR LAB they research and develop new solutions and products around virtual reality, in cooperation with several universities and OSVR.

Aboard the Looking Glass is a science fiction game about dimensional time and a rescue mission gone awry, developed by Henry Hoffman.

Aerys develops the Minko Engine which allows developers to build desktop, web and mobile 3D applications. It’s free and open source. Minko targets HTML5, iOS, Android, Windows, OSX and Linux. They provide everything you need to develop cross-platform applications. With a single C/C++11 code base, your 3D now targets more screens than ever! You can also integrate the Minko Engine with existing native/web apps.

The Bar L’Atelier is a living place, a rallying point between the digital and the human, offering several services aimed at stimulating socialization, individual skills improvement, and social and professional life.

Basemark is a worldwide leader in benchmark development. Basemark has started development of industry-standard VR benchmarks and is committed to supporting OSVR.

BCAA has been developing interactive, tailor-made solutions for 10 years. Their team is now building a VR framework to support this work and will be using OSVR as a solution to integrate VR hardware and software.

BlueFrog Robotics developed Buddy, the first ever companion robot accessible to everyone. It is supported by a new open platform based on Android and made with the Unity3D IDE.

Bnome is a Belgian IT SME specializing in the study and implementation of customized IT solutions. Their product, VectionVR, is a cost-effective motion simulator engineered to work with a VR headset to power immersion and the physical feeling of acceleration.

Box provides a secure cloud based platform for content storage and collaboration. Developer APIs empower creative teams to load content from Box into their own applications, including secure storage for user data and web/mobile viewers for more than 100 file types.

Boost VC is an emerging technology accelerator located in Silicon Valley. Boost VC invests in early stage VR companies.

CaptoGlove is a new VR peripheral leveraging the power of your hand; a virtual glove usable in past, present and future PC games.

Climax Studios are believers in VR and released their first title, Bandit Six, in March. The sequel Bandit Six Salvo was announced at E3 2015 and will be available later in the year. Climax is also working on unannounced projects that will be available on the OSVR software platform.

DailyVR World is a project born of a need in the field of functional rehabilitation for neurological injury. Their team uses new technologies such as virtual reality to aid in this.

DeepStreamVR is pioneering VR games to help improve the quality of life based on a decade of research that indicates VR can help relieve pain, reduce stress and build resilience.

Ergoneers supports businesses with D-Lab3, an extensive software platform for behavioral research and Dikablis, a high end head mounted eye tracking-system. D-Lab3 supports integrated data acquisition, for example via audio, video, physiological data and of course eye-tracking. Ergoneers is located in Germany and has a US subsidiary in Portland, Oregon.

Fallen Planet Studios is the creator of the successful VR Horror Experience, Affected. They look forward to bringing future titles to OSVR and to support the open source nature of this product.

FIERY THINGS is a small London/Munich based indie game studio that, among other things, develops VR games/experiences. One of them is a survival horror VR experience called “Gooze”, which is based on a real location.

Fire Panda is a Virtual Reality development studio creating exciting and innovative VR Content, Experiences and Games. They are proud to work with some of the most talented people exploring everything VR can offer. They are experienced in VR Storytelling and building experiences that immerse viewers in new, exciting worlds.

F/LAT – Free/Libre Art and Technology is a Research Lab and Community hub for artists and others to share and develop tools in mixed media and transdisciplinary arts. We focus on Open Source and Libre Software and Hardware to ensure the greatest possible freedom for the future, to use and modify the tools we use and build.

Glue Engine is the first game engine for everyone. No programming skills needed. Our Mission is to make game development very easy for professional developers, casual developers and regular users that never created a game before.

Hendesehane is a technology development company located in Ankara, Turkey. Hendesehane Nefes represents Hendesehane’s VR and Robotics product line. The product line is created for providing low latency wireless operation to the modern HMDs and USB peripherals created for Virtual Reality, Augmented Reality and Mixed Reality.

Immersive Entertainment is a software development company focusing on the creation of next-generation Virtual Reality Entertainment Content that bridges the gap between videogames and movies, providing a superior level of immersion, emotional engagement and realism.

Infinity Augmented Reality (Infinity AR) provides a revolutionary software platform that makes the digital eyewear experience a reality. The Company is the first augmented reality software platform to connect universally with digital eyewear, smartphones and tablets, integrating multiple devices into one platform.

inMotionVR combines gamification together with psychosomatic mechanics to make sure the user gets maximum engagement through VR with the movement he or she is experiencing. We believe that using a combination of gamification and VR will enable optimized therapy for a wide variety of physical and mental issues.

itch.io is an open indie game marketplace. It’s a platform for independent developers of all sizes to upload, distribute, and sell their content with absolutely no barrier to entry. At over 16,000 games it’s also a great place to find something new to play.

Jimbomania is an independent simulation and visualization developer, committed to open standards and software for all.

Kraken Realtime is a company specialized in interactive 3D visualizations and simulations. They create immersive training tools using haptic systems to completely immerse the trainee in his future work environment.

Legacy Games is a publisher and developer of PC and mobile games, including AR, VR, and other emerging platforms. We work with major partners and licensors such as HP, Intel, Lenovo, CBS, A&E, the BBC, and Crayola.

Legendary VR is focused on creating premium VR experiences and will bring content to the OSVR platform in the future. Legendary VR supports the open nature of this platform and looks forward to the growth and development this platform will bring to the VR industry and audience.

Mindemia is a technology and consultancy firm that has been doing VR development since 2013.

Minecrift is an open source total VR conversion for Minecraft that is aiming for all of the main VR platforms. OSVR support is a necessity going forward.

Motion Reality, Inc. (MRI) manufactures virtual tactical training simulators for use by the military and law enforcement. Trainees enter the system and navigate through an endless variety of scenarios meant to challenge their tactics and clearing techniques while addressing their use of force continuum. MRI’s live motion capture, modern game engine, full physics and haptic feedback blend to provide the most realistic training possible.

Narayana games is currently working on a standing to room-scale VR experience called Holodance – a spatial, rhythmic, audio-visual experience of pure ecstatic joy!

Neomancer LLC continues its tradition of bringing cutting-edge technology to its clients by joining the OSVR partnership. Neomancer was part of the team that commercially introduced HMDs and wearable computing to the NYSE and NYMEX back in the late 90s, followed by one of the first demonstrations of interactive surface computing at a Museum of Modern Art installation. With a rich past in both VR hardware and software development, Neomancer looks forward to the new renaissance of VR through the leadership of OSVR.

NeuroDigital Technologies provides stunning software and hardware solutions for fun and serious games using virtual reality. They have developed Gloveone, a wireless haptic feedback device that lets you touch and feel virtual reality!

Oasis VR is reimagining the arcade as a platform and marketplace. The platform will consist of a premium holistic system that maximizes the user’s 1:1 interactions with the virtual world. The marketplace will be comprised of physical storefronts that host developers’ premium content and provide easy access to that content for consumers.

Omnifinity’s Omnideck 6 enables you to perform natural, instinctive movement in any direction within the virtual world – walking, running, jumping, and even crawling are all possible. New dimensions of immersion for simulation training are now opened up; physical and psychological factors can be induced to affect the user’s behavior.

Overpower Studios is an independent hardcore games studio based in Belo Horizonte, Brazil. They created Scorching Skies for iOS and are now focusing on Aces High, an immersive, innovative MOBA/air-combat hybrid.

Q42 creates software for the web, mobile, and the internet of things. They are 61 nerds programming from offices in The Hague, Amsterdam, and Mountain View, working together with companies like Google, Philips, the Rijksmuseum, and the Dutch State Lottery.

realities.io is an independent company that recreates interesting places for VR Sightseeing and aims for photorealistic scenes and broad HMD support.

One Unify, a subsidiary of Unify Holdings Pte Ltd, has a mission of building a smart city powered by the intelligence of things to enable predictive automation.

RJ360VR is a startup that provides virtual reality solutions. The company has grown exponentially and has launched many VR projects.

CEOs Pet Software is a small indie studio from Berlin working on their debut VR game. S.P.Y. Robot is a unique mix of stealth, hacking, puzzle and action elements.

Schell Games is aggressively pursuing VR and AR for both entertainment and educational gaming, and is quite intrigued with the possibilities of OSVR.

Selvz’s SmartVR Platform is a white-label solution that makes it easy for enterprise customers to create, distribute, and monetize their VR content guides and experiences for desktop and mobile VR.

Serious Simulations was founded to provide professional trainers of dangerous and complex tasks with a complete suite of mixed reality training products to improve performance. The company designs and manufactures individual and small-group immersive simulators where human motion is the primary interface for the experience. Display technology is key to the experience, and the company currently offers the widest-FOV, high-resolution VR display in a wireless package.

SpheresVR crafts amazing immersive experiences that take you to another world, time, and place. Their clients are in tourism, retail, heritage, and immersive cinema.

Surprise Attack Games is an indie games label that publishes remarkable games from independent developers.

Surreal is the first fully immersive virtual world designed for state-of-the-art VR and beyond.

Pixelboom created an immersive Star Wars experience called Tatooine VR.

Terranovita Software created BouncerVR, a virtual reality arcade game for PC and Mac. You control a shield with your vision. Collect colored resources to charge your abilities and to attack, but beware! The enemy can bounce your attacks back at you.

Tiny Bull Studios is an indie developer based in Turin, Italy. The team is working on Blind, a VR adventure in which players need to rely on sound and echolocation to escape a mysterious mansion.

Toxic Games, established in 2010, is a UK-based games developer founded by former classmates. The team began development of Q.U.B.E. as a student project in 2009 and sought out investment from independent game dev veterans, Indie Fund. All three core team members are game designers and have been able to bring Q.U.B.E. to fruition without any programming expertise. Thanks to the encouragement of academic mentors and industry execs, and with the help of Indie Fund, Toxic Games went from student hobbyists to independent developers upon graduation from University.

TREEHOUSE specializes in virtual reality content creation for education, primarily focusing on a K12 audience. TREEHOUSE partners with education aggregators to develop supplemental curricula using VR tools. With arms in live-action content creation, Quest Based Learning (QBL), and passive animation, TREEHOUSE is the first VR content licensing company operating exclusively in the K12 educational space.

Ultrabox is a small company based in the UK, with a focus on developing and supporting entertaining content that’s beyond the ordinary.

Unibrain technologies is a small group of passionate programmers trying to help develop OSVR as a platform, and create new solutions to the problems posed by upcoming and existing VR tech.

Vicator is a content creation company using state of the art Virtual- and Augmented Reality technologies. We develop immersive and interactive applications for smartphones and head-mounted devices, both tethered and untethered.

VR Nerds works on a range of VR topics at www.vrnerds.de, currently Germany’s biggest virtual reality showcase, where they blog about the latest VR news, game reviews, and software and hardware.

VR Philippines is a community of VR developers and enthusiasts dedicated to promoting the advancement and widespread adoption of virtual reality technology in the Philippines.

VRARlab deals with research and development of virtual and augmented reality projects. They created the project “Hauhet”.

vrAse is the biggest revolution the Smartphone has ever seen; the first high quality VR&AR case, which enables the user to experience the smartphone like never imagined before.

Vrideo is the world’s most fully built-out immersive video platform. Our website, mobile apps, and VR apps allow content creators to distribute and consumers to stream immersive video across the universe of VR and non-VR devices.

Woojer developed a patented, wearable haptic technology that enables users to literally FEEL the sound. The addition of haptic sensation to the audio of AR/VR, instantaneously catapults the immersive experience by recapturing the emotional dimension lost when using headphones.

XAP Electronics designs and makes innovative products and electronic systems for motor racing; their products include steering wheels for the racing world. They are currently working on a VR simulator.

YoutopiaVR is a social world that is bringing together all of the things that make VR great. Current features include a social panoramic YouTube player, Street View, and a distribution system that enables downloading and launching other experiences from within the social world.

YouVisit is a VR company that empowers businesses and institutions to create and distribute immersive virtual experiences that both engage and convert their target audiences.