Modding an Oculus Rift for Augmented Reality with Linux on the Intel Edison

At the NASA Space App Challenge hackathon in April, Team AirOS won second place at the San Francisco event with an augmented reality (AR) headgear system that included a Linux-driven Intel Edison module hooked to an Oculus Rift. Last week we covered the first-place winner at the event, Team ScanSat, which developed a Linux- and Edison-based CubeSat nanosatellite camera design. Both teams advanced to the global competition, but did not make the final cut.

While the members of Team ScanSat knew each other beforehand, Team AirOS first met at the event, according to David Badley (the Pikachu in the Superman shirt), who spoke on behalf of his team in an interview with Linux.com. Team AirOS, which also included Patrick Chamelo, Karl Aleksander Kongas, Scott Mobley, Maria Roosiaas, and Marc Seitz, decided to enter a challenge called Space Wearables: Designing for Today’s Launch & Research Stars.

The team decided on Badley’s suggestion for a sensor-enabled AR headset for ground technicians and astronauts. Badley, who has been programming at various startups for over a decade and is currently working for Visa, had already been considering launching his own AR-related startup.

NASA is collaborating on a similar AR concept with the Osterhout Design Group. ODG developed the Android-based R-7 Glasses, as well as the Mini Augmented Vision heads-up display (HUD) glasses for drivers of BMW’s Mini cars.

Team AirOS decided to work with Badley’s own Oculus Rift VR headset, which he had to run home to fetch, as well as technology donated for the event. This included Leap Motion’s gesture controller, a camera, and an Intel Edison board with an Arduino breakout board and Grove sensors.

The group also had access to a limited “Instant Answers” version of IBM’s Watson supercomputer, to provide an interactive knowledge system for the user. However, “at the end of the day the functionality just wasn’t there,” says Badley. “We cut it from our demo.”

For the purposes of the hackathon demo, a laptop stood in for the embedded computer that would run a finished AirOS device. All the devices were connected to the laptop, which ran the Unity 3D game engine that controls the Oculus Rift. There were so many inputs, in fact, that Badley had to run out to buy a USB hub.

The Leap Motion controller, whose APIs also run on the laptop, tracks gestures via a pair of infrared cameras and various LEDs. The device was already optimized for the Oculus Rift, and can be mounted on the outside of the headgear, as shown in the photo. The camera, meanwhile, pipes video into the Oculus Rift display.

The Linux application on the Intel Edison orchestrates sensor input from the Arduino breakout and Seeed Studio’s Grove temperature and flame detectors. The Edison app then sends sensor updates to a GUI overlay on the AR display.
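Badley didn’t walk through the team’s code, but the Edison side of that pipeline is easy to picture. The following is a minimal, illustrative sketch in Python, assuming Intel’s mraa library on the Edison; the pin assignments and the thermistor constants are guesses for a Grove temperature sensor and flame sensor on the Arduino breakout, not the team’s actual configuration.

    # Illustrative sketch: poll Grove sensors on the Edison through libmraa.
    # Pin numbers and thermistor constants are assumptions, not the team's setup.
    import math
    import time
    import mraa

    TEMP_PIN = 0    # Grove temperature sensor on A0 (assumed)
    FLAME_PIN = 2   # Grove flame sensor on D2 (assumed)
    B = 4275        # thermistor B constant (typical for the Grove v1.2 sensor)
    R0 = 100000.0   # nominal thermistor resistance at 25 degrees C

    temp_in = mraa.Aio(TEMP_PIN)
    flame_in = mraa.Gpio(FLAME_PIN)
    flame_in.dir(mraa.DIR_IN)

    def read_temperature_c():
        # Convert the 10-bit ADC value to degrees Celsius via the thermistor curve.
        raw = max(temp_in.read(), 1)
        resistance = R0 * (1023.0 / raw - 1.0)
        return 1.0 / (math.log(resistance / R0) / B + 1.0 / 298.15) - 273.15

    while True:
        print("temp_c=%.1f flame=%d" % (read_temperature_c(), flame_in.read()))
        time.sleep(1)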

AirOS applications

Badley lays out a scenario in which a NASA ground technician heads out to repair a generator. The AirOS device would display navigation directions to the generator and then provide repair directions, updating the display as each step is completed. 

“If more advanced help is needed, people back at HQ would be able to monitor progress and give instructions,” says Badley. “The person at HQ could even wear a VR device with Leap Motion, and see exactly what the ground technician is seeing.”

The advantage of an immersive experience goes beyond freeing up your hands, says Badley. “The Oculus Rift can tell where you are looking and move in response to your gaze,” he says. “With AR, you can be immersed while still being in the real world. This means HUDs can be drawn over moving objects, and you can zoom in to focus on details. The Leap Motion APIs pair with the Oculus really well.”

Badley and the team had some experience with the Oculus Rift and Leap Motion, but the Edison was new to them. “The Edison is great, and the Grove sensors were fantastic to work with,” says Badley. “The flame sensor can detect light given off at the 760-1100 nanometer range. Looking to the future, AR will let you visually display light ranges at nanometer scales that aren’t visible to the human eye. It’s sort of like adding colors and senses to your vision.”

The integration between the Edison, the Leap Motion APIs, and Unity was “a bit hacky,” but it was fairly straightforward and ended up working fine, says Badley. “First, we wrote the results from the Edison/Arduino/Grove setup into a flat file,” he says. “We could have used a REST interface, but a flat file was easier. Then Unity read from that file to display data on an interval. As you add more and more data, the flat file would become less ideal, so ideally we would have something like SQLite.”
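As a rough illustration of that handoff, the Edison side might look something like the sketch below; the file path and JSON fields are invented for the example, and the Unity scene would poll and parse the same file on its own timer.

    # Illustrative flat-file handoff from the Edison to the Unity application.
    # The path and field names are assumptions for this sketch.
    import json
    import os
    import time

    SENSOR_FILE = "/tmp/airos_sensors.json"   # assumed location the reader polls

    def publish(temp_c, flame):
        # Write-then-rename so the reader never sees a half-written file.
        payload = {"temp_c": temp_c, "flame": flame, "timestamp": time.time()}
        tmp_path = SENSOR_FILE + ".tmp"
        with open(tmp_path, "w") as f:
            json.dump(payload, f)
        os.rename(tmp_path, SENSOR_FILE)       # atomic on the same filesystem

    while True:
        publish(22.5, 0)   # stand-in values; wire in the real sensor reads here
        time.sleep(0.5)

Swapping in SQLite later would mostly mean replacing publish() with an INSERT through Python’s sqlite3 module, with the display side querying the latest row instead of re-reading the whole file.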

With Unity, Team AirOS was able to build a UI over a couple of “spheres” created from the webcam feed, one for each eye, explains Badley. In this way, the Oculus Rift user can “see” the outside world while the application draws 3D objects in virtual space. The main problem with the Oculus Rift is that it takes 10 to 15 minutes to set up, says Badley. By comparison, with Samsung’s Oculus-based Gear VR headset, “you can pop a phone into a headset and you’re ready to go,” he says. “It’s a huge difference in accessibility.”

Badley is excited about the current burst of activity in the AR/VR space, noting that Leap Motion recently added methods of rendering images of one’s real hands in the Oculus Rift. He predicts there will be a “bloom of creativity” in the AR/VR arena over the next few years.

“As AR improves, it will replace VR glasses because AR will be able to handle VR,” he says. “You will be able to experience someone else’s perspective in different ways, which will change communications forever. It will change how we search for information, and even how we think.”

Badley wishes there were more of a focus on getting Linux to support AR/VR out of the box in an intuitive way. “It would have to happen fast, considering how far Microsoft and Magic Leap and others are moving into the space,” he says. “Linux could power all this, and moving towards open sourcing everything would be awesome.”