Virtual Reality

Our software has to be as realistic as possible, so that its users do not face too steep a learning curve: it must match the real flats as closely as possible. That is why we turned to virtual reality, which offers the most immersive experience.



The solution we put forward must work across a variety of environments, both software and hardware. We use abstraction layers to make it easy to add or remove input/output peripherals without recompiling the software.
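As a sketch of what such an abstraction layer might look like (the interfaces and loader below are illustrative, not our actual API), each peripheral family can hide behind an interface, and concrete drivers can live in separate plugin assemblies that are loaded by name at startup, so adding a device does not require recompiling the core:

```csharp
using System;
using System.Reflection;

// Hypothetical sketch of a peripheral abstraction layer.
// The core of the software only ever talks to these interfaces.
public interface IInputPeripheral
{
    string Name { get; }
    // Polled each frame; returns the user's current action, or null.
    string Poll();
}

public interface IOutputPeripheral
{
    string Name { get; }
    // Receives a description of the scene state to present to the user.
    void Present(string sceneState);
}

// Concrete drivers are instantiated by reflection from plugin
// assemblies, so the core never references them at compile time.
public static class PeripheralLoader
{
    public static IInputPeripheral LoadInput(string assemblyPath, string typeName)
    {
        Assembly plugin = Assembly.LoadFrom(assemblyPath);
        Type driverType = plugin.GetType(typeName);
        return (IInputPeripheral)Activator.CreateInstance(driverType);
    }
}
```

With this design, supporting a new controller only means dropping a new assembly next to the executable; the core keeps polling through `IInputPeripheral` unchanged.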



The 3D scene is rendered by Unity, and most of our logic is written in C#. Through Unity's Component model, these scripts interact with the different objects that make up the scene, attaching specific behaviors to them depending on their function.
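For instance, a behavior attached to a door object might look like the following MonoBehaviour (a hypothetical sketch, not one of our actual scripts): Unity calls its lifecycle methods each frame, and the script animates the GameObject it is attached to.

```csharp
using UnityEngine;

// Hypothetical example: attached to a door GameObject in the scene,
// this component gives it an "open on request" behavior.
public class SlidingDoor : MonoBehaviour
{
    public float openOffset = 1.2f;  // how far the door slides, in meters
    public float speed = 0.8f;       // sliding speed, in meters per second

    private Vector3 closedPosition;
    private bool opening;

    void Start()
    {
        // Remember where the door sits when closed.
        closedPosition = transform.position;
    }

    // Called by other components, e.g. a remote-control script.
    public void Open() => opening = true;

    void Update()
    {
        if (!opening) return;
        Vector3 target = closedPosition + Vector3.right * openOffset;
        // Move a little closer to the open position every frame.
        transform.position = Vector3.MoveTowards(
            transform.position, target, speed * Time.deltaTime);
    }
}
```

Because the behavior lives in a component rather than in the object itself, the same script can be attached to every sliding door in the flat.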

What is Virtual Reality?

It is an activity that engages the senses and immerses the user in a virtual world, in an experience as immersive as possible. It relies on different kinds of peripherals:

  • User input peripherals: controllers, keyboards, haptic feedback arms... These peripherals allow the user to interact with the environment.
  • User output peripherals: screens, virtual reality headsets, 3D glasses... These peripherals give the user a representation of the environment, which can combine visual, auditory, and other feedback.


The launchpad flats provided by Kerpape are filled with domotics: every possible element is robotized! The doors, the light switches, the shutters, and even the living room table, the kitchen sinks, and the bed: everything can be remote controlled.

These flats have two aims: to let disabled people discover the different domotic equipment currently available on the market and select what they find useful, and to give craftsmen a chance to practice installing this rather rare equipment.


Features and constraints

The project specifications impose some major constraints that must be addressed during development:

  • Several learning modes: a symbolic mode, very simplified and schematized; a free mode in which the user is in total control inside the scene; and a compromise mode offering autonomy of movement and interaction, but with guidance.
  • Learning scenarios: three are defined by the engineers at Kerpape. For example, during a doctor's visit, the user has to get the phone and then open the door.
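A scenario such as the doctor's visit can be modeled as an ordered sequence of steps the user must complete; the minimal sketch below uses our own illustrative types and step names, not Kerpape's specification:

```csharp
using System.Collections.Generic;

// Hypothetical sketch: a scenario is an ordered queue of named steps.
public class Scenario
{
    private readonly Queue<string> steps;

    public Scenario(IEnumerable<string> orderedSteps)
        => steps = new Queue<string>(orderedSteps);

    public bool Completed => steps.Count == 0;
    public string CurrentStep => Completed ? null : steps.Peek();

    // Called when the user performs an action in the scene;
    // only the currently expected step advances the scenario.
    public bool TryComplete(string action)
    {
        if (Completed || action != steps.Peek()) return false;
        steps.Dequeue();
        return true;
    }
}

// Example: the doctor's-visit scenario, as two ordered steps.
// var visit = new Scenario(new[] { "GetPhone", "OpenDoor" });
```

The `CurrentStep` property is what the guided mode could display as an indication, while the free mode would simply ignore it.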