Solving ‘Simulator Sickness’ With Smart Software, SoC Design

Intelligent software, the right processor and constant calibration can make it all go away.


The growth in virtual and augmented reality headsets is expected to explode in the coming years. The U.K.-based research firm KZero estimates headset unit volume will jump from nearly 4 million this year to 38.4 million in 2018.

But that growth rate might be stunted if users continue to struggle with “simulator sickness,” that queasy feeling that something is just not right as they navigate a virtual reality game chasing down the bad guys.

Yet for Roy Illingworth, director of systems engineering, and his colleagues at Hillcrest Labs, this is soon to be a problem of the past.

The solution, he said in an interview, lies in intelligent software, the right processor design and a focus on constant calibration.

Hillcrest engineers have devised a software solution, running on both the 32-bit ARM Cortex-M0+ and Cortex-M4, that is designed to reduce playback latency while tamping down power consumption.

That dizzy feeling
Before we get to that, let’s recap the problem: VR systems trick the brain into believing the virtual world is real. This can lead to “cue conflict,” the leading theory of what causes simulator sickness: a mismatch between what we expect to see and feel and what we are actually seeing and feeling.

Ben Lewis-Evans, a U.K.-based human factors psychologist and games researcher writing in Gamasutra, believes any latency above 46ms could start to create problems for users.

Illingworth offers some additional context:
“As you move your head around slowly, a lot of things come into play. Latency is one. You don’t want the game’s video display to lag your head’s movement. That’s poor system design, which leads to a bad user experience. A second thing is the stability of the whole system. If I stop moving, I don’t want my screen to keep moving and I don’t want the screen to jump for no obvious reason. Both of these issues can be a result of using MEMS sensors. Gyroscopes can exhibit drift and magnetometers can suffer from interference. The drift will cause the screen to continue moving when the head has stopped. Interference can cause the user’s frame of reference (head position) to be updated randomly. You have to cancel the effects of imperfect sensor performance in software, and that’s what the dynamic calibration algorithms do.”

This is a non-trivial challenge. Because MEMS sensors are tiny machines, they’re affected by temperature, aging and other environmental factors. “You have to constantly calibrate for different variations,” he said.

Interference issue
A key issue the Hillcrest Labs team observed and overcame during the design was the effect of magnetic interference.

“We found a lot of issues related to the magnetometer,” Illingworth said. “You’re playing a game near other devices with magnets in them. There is a lot of magnetic interference. Smart phones, speakers, even headsets have magnets. The magnetometer gets affected by those.”

“A large part of our software is designed to calibrate away these effects to allow for excellent performance and apply updates to the user’s frame in an intelligent manner,” he added.

The design of a head-mounted display requires an understanding of the context of the sensor fusion and calibration algorithms employed. There are different requirements for a smart phone, wrist-worn wearable or motion remote control.

The other design challenge lies in power: most virtual or augmented reality headsets operate on batteries, so power consumption is a key design constraint. Finding a solution that delivers enough performance to overcome latency and stability issues and keep power low is key, he said.

Designed in collaboration between Hillcrest Labs and Bosch Sensortec, the headset’s SoC is the BNO070, a 3x5x1mm system-in-package that combines Hillcrest Labs’ SH-1 sensor hub software, a 3-axis gyroscope, a 3-axis accelerometer, a 3-axis magnetometer, and an Atmel MCU with a Cortex-M0+ core.

“Head-mounted displays for VR and AR are just one application of sensor processing software in wearable devices today,” Illingworth said. “The proliferation of sensors in smart watches, activity trackers, motion capture devices, and more has highlighted an even broader need for high-performance sensor processing software and hardware.”