IRE

Project Duration:
June 2014 – September 2014

IRE is one of the first moved-reality systems. It lets you explore the world from the comfort of your seat, using IRE as your eyes and ears. In just three months, I developed IRE for an open house day at the University of Applied Sciences FHNW. The centerpiece of IRE is a stereo camera rig mounted on a 3-axis gimbal. The camera video is streamed to a virtual reality headset, while the gimbal is synchronized with the wearer’s head movements. Additionally, the wireless camera setup is mounted on a remote-controlled robotic platform. The headset wearer can naturally look around and explore the world as IRE. The project gained a lot of attention in national and international media.
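The core of the head synchronization is simple in principle: each head-tracking update from the headset is mapped onto target angles for the gimbal and sent over the wireless control channel. Here is a minimal Python sketch of that idea — not the actual IRE code; the `GimbalCommand` structure and the mechanical angle limits are made up for illustration:

```python
from dataclasses import dataclass


@dataclass
class GimbalCommand:
    """Target angles for the 3-axis gimbal, in degrees (hypothetical format)."""
    yaw: float
    pitch: float
    roll: float


def clamp(value: float, lo: float, hi: float) -> float:
    """Keep a value within the given bounds."""
    return max(lo, min(hi, value))


def head_pose_to_gimbal(yaw: float, pitch: float, roll: float) -> GimbalCommand:
    """Map the headset's head-tracking angles onto the gimbal axes.

    The limits below are illustrative; a real gimbal has its own
    range of motion that the command must stay within.
    """
    return GimbalCommand(
        yaw=clamp(yaw, -170.0, 170.0),
        pitch=clamp(pitch, -60.0, 60.0),
        roll=clamp(roll, -45.0, 45.0),
    )
```

In a running system, a function like this would be called on every tracking update, and the resulting command streamed to the robot alongside the driving controls.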

Quick tour of IRE

More details about IRE can be found in the following series of blog posts:

  • May I introduce IRE?

What would it be like to explore the world as a small kid? The quest to answer this question sounds like a perfect job for the Oculus Rift. For all those who don’t know the Oculus Rift, it is a display which you can wear… continue reading

  • IRE – Overview (Part 1 of 7)

The post “May I introduce IRE?” has already explained the basics of what IRE is. This post is the start of a whole series of blog posts about my journey through this project. The following diagram gives a short overview of… continue reading

  • IRE – Stereo camera (Part 2 of 7)

In order to match the Rift’s specifications as closely as possible, I had some very challenging requirements for the cameras: most standard webcams, such as those made by Logitech, don’t support 60 fps. The famous PlayStation 3 Eye camera has that high frame rate, but only… continue reading

  • IRE – Video & audio transmission (Part 3 of 7)

How can I wirelessly transmit 1080p@60fps with low latency? That was one of my biggest questions. Most FPV systems are analog, which results in poor image quality. That’s why I wanted a digital solution. First I thought of using Wi-Fi, but soon I… continue reading

  • IRE – Media processing (Part 4 of 7)

The special lenses of the Oculus Rift require the image to be pre-distorted. This can be done with the official SDK. Its documentation is quite good, but luckily I found the upcoming Oculus Rift in Action book with its nice source code examples. For my… continue reading

  • IRE – Camera Gimbal (Part 5 of 7)

Of course I wanted to make use of the Rift’s amazing head tracking and sync the camera orientation with it. Using static fisheye cameras to compute the right image for the matching spot would have been neat. Such a solution would eliminate problems with quick head… continue reading

  • IRE – Control system (Part 6 of 7)

    The last post covered how the video signal is transmitted. However, a communication channel back to the robot is also necessary. Over this channel the signals for the gimbal and the robot are sent. The smartest solution would have been to transmit the USB of… continue reading

  • IRE – Conclusion (Part 7 of 7)

    Overall I was really impressed with how well the system worked, and was also rather proud of myself. As I built it for our open day, the system had to work all day long with many different people. In this regard it worked perfectly –… continue reading
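As the media-processing post above mentions, the Rift’s lenses require the rendered image to be pre-distorted — in IRE this was done with the official SDK. For illustration only, here is the generic radial-distortion polynomial commonly used for this kind of correction, with made-up coefficients rather than the Rift’s actual parameters:

```python
def barrel_distort(x: float, y: float,
                   k1: float = 0.22, k2: float = 0.24) -> tuple[float, float]:
    """Radially distort a point given in normalized lens coordinates.

    Each point is scaled by 1 + k1*r^2 + k2*r^4, the classic polynomial
    for lens-distortion correction. The coefficients k1 and k2 here are
    illustrative, not the Rift's real values.
    """
    r2 = x * x + y * y                     # squared distance from the lens center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2   # radial scaling factor
    return x * scale, y * scale
```

Points near the center stay nearly fixed, while points toward the edge are pushed outward — exactly the warp that cancels the pincushion effect of the lenses.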