Public projects 1

Head Tracking for Wireless 3D First Person Vision

Project showcase by twhi2525

  • 9,338 views
  • 7 comments
  • 58 respects

Toolbox 2


Respected projects 0

 twhi2525 hasn't respected any projects yet.

Comments 2

  • Head Tracking for Wireless 3D First Person Vision over 2 years ago

    Hey Igor,
    Thanks, looks like you built yours from scratch. I just attached my devices to an RC car and used its controls.
    Yeah, the latency, or delay, was the main factor behind most decisions.
    I chose to use analogue video capture and transmission rather than a more stable digital stream because it was significantly faster, the same way FPV drones are operated. It means the only bottlenecks for latency were the analogue-to-digital USB capture cards and the distortion / display software. That couldn't be avoided for my task unless I had access to some very expensive analogue equipment. At least the computation was done rapidly on the laptop PC, rather than on a small on-board processor.

    In the end, the measured video latency from camera to glasses was 83 to 133 ms, plus an additional 26 ms for the pan-tilt unit to mechanically respond to head movements, so roughly 109 to 159 ms from moving your head to seeing the updated view. This was definitely noticeable but not limiting / frustrating. In fact the noisy / glitchy image quality from using analogue transmission was the worst aspect. Especially bad when it's right in front of your eyes...

  • Head Tracking for Wireless 3D First Person Vision over 2 years ago

    Hi Doug,

    I'm gonna give a full-on answer in case anyone wants to know, because it was tough!

    The Oculus headset has a pair of lenses that wrap the image around each eye and give the high field of view. Normally a 2D screen image / video / game would be passed through two digital filters to distort it accordingly (one such distortion pass is sketched below these comments).

    With my project I had written C++ code (with the OpenCV libraries) to capture, digitise and distort each frame coming from the USB capture cards; however, I couldn't get it to work for two video streams! In the end, just running two Microsoft DirectShow streaming windows next to each other gave the best results (a minimal two-stream capture loop is sketched below these comments).

    As for the distortion: I used wide-angle lenses on the cameras (2.8 mm, I think?) so the FOV was 130 degrees to suit the Oculus. It worked pretty well actually, once you adjusted the video windows for the user's eye separation.
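
Below is a minimal sketch of the kind of two-window capture-and-display loop described in these comments, written in C++ with OpenCV. The device indices (0 and 1 for the two USB capture cards), the window offsets, and the per-iteration timing print-out are assumptions for illustration, not the original code.

// Minimal two-stream capture-and-display loop (sketch, not the original code).
// Assumes the two analogue-to-digital USB capture cards enumerate as devices 0 and 1.
#include <opencv2/opencv.hpp>
#include <chrono>
#include <iostream>

int main() {
    cv::VideoCapture leftCap(0), rightCap(1);   // one capture card per eye (assumed indices)
    if (!leftCap.isOpened() || !rightCap.isOpened()) {
        std::cerr << "Could not open both capture cards\n";
        return 1;
    }

    cv::namedWindow("left eye");
    cv::namedWindow("right eye");
    cv::moveWindow("left eye", 0, 0);       // place the two windows side by side so they
    cv::moveWindow("right eye", 640, 0);    // can be shifted to match the viewer's eye separation

    cv::Mat frameL, frameR;
    for (;;) {
        auto t0 = std::chrono::steady_clock::now();
        if (!leftCap.read(frameL) || !rightCap.read(frameR))
            break;                           // a dropped stream ends the loop in this sketch
        cv::imshow("left eye", frameL);
        cv::imshow("right eye", frameR);
        auto t1 = std::chrono::steady_clock::now();
        // PC-side processing time only; the 83 to 133 ms figure above was measured
        // end to end, camera to glasses, not just this loop.
        std::cout << std::chrono::duration<double, std::milli>(t1 - t0).count()
                  << " ms\r" << std::flush;
        if (cv::waitKey(1) == 27)            // Esc to quit
            break;
    }
    return 0;
}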
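And here is a rough illustration of the per-eye distortion pass mentioned in the second comment, built with OpenCV's remap. The barrelDistort helper and the k1 / k2 coefficients are made-up example values for illustration, not the Rift SDK's actual calibration.

// Illustrative barrel-distortion pass of the kind an Oculus-style lens needs: each
// output pixel samples the input at a radially scaled position, pre-warping the image
// so the lens's own distortion is compensated. k1/k2 are example values only.
#include <opencv2/opencv.hpp>
#include <algorithm>

cv::Mat barrelDistort(const cv::Mat& src, double k1 = 0.22, double k2 = 0.24) {
    cv::Mat mapX(src.size(), CV_32FC1), mapY(src.size(), CV_32FC1);
    const double cx = src.cols / 2.0, cy = src.rows / 2.0;
    const double norm = std::min(cx, cy);                // normalise radius roughly to [0, 1]

    for (int y = 0; y < src.rows; ++y) {
        for (int x = 0; x < src.cols; ++x) {
            double dx = (x - cx) / norm, dy = (y - cy) / norm;
            double r2 = dx * dx + dy * dy;
            double scale = 1.0 + k1 * r2 + k2 * r2 * r2; // radial distortion term
            mapX.at<float>(y, x) = static_cast<float>(cx + dx * scale * norm);
            mapY.at<float>(y, x) = static_cast<float>(cy + dy * scale * norm);
        }
    }

    cv::Mat dst;
    cv::remap(src, dst, mapX, mapY, cv::INTER_LINEAR);   // pixels outside the source stay black
    return dst;
}

In a full pipeline this would run once per eye, with the (hypothetical) barrelDistort call sitting between the read and imshow calls of the loop above.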
