Augmented-reality machine works in real time

Computer-generated scenery can be realistically added to live video footage, using a machine vision system developed at Oxford University, UK.



Researchers Andrew Davison and Ian Reid say that, in the longer term, the augmented-reality system could also enable robots to navigate more effectively, or be used to virtually decorate a real house or plan engineering work. It allows a computer to build an accurate three-dimensional model of the world using only a video camera feed, while also keeping track of the camera's movement within its environment, all in real time.
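
The article does not describe the underlying algorithm, but systems of this kind typically maintain a joint probabilistic estimate of the camera position and the landmark positions, refining both with every frame. The sketch below is a minimal illustration of that idea, reduced to one spatial dimension with a linear measurement model so that a plain Kalman filter suffices; the kalman_update helper and all numbers are hypothetical, not taken from the Oxford system.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: state x, covariance P,
    measurement z with model z = H @ x + noise (covariance R)."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected state
    P = (np.eye(len(x)) - K @ H) @ P         # corrected covariance
    return x, P

# Joint state: [camera position, landmark_1 position, landmark_2 position]
x = np.array([0.0, 4.5, 9.5])                # rough initial guesses
P = np.diag([0.5, 2.0, 2.0])                 # landmarks start very uncertain

true_cam, true_lm = 0.0, np.array([5.0, 10.0])
rng = np.random.default_rng(0)

for step in range(30):
    # Predict: the camera moves +0.1 per frame, with a little process noise.
    true_cam += 0.1
    x[0] += 0.1
    P = P + np.diag([0.01, 0.0, 0.0])

    # Correct: measure each landmark's offset relative to the camera.
    for i in range(2):
        z = np.array([true_lm[i] - true_cam + rng.normal(0, 0.05)])
        H = np.zeros((1, 3))
        H[0, 0], H[0, 1 + i] = -1.0, 1.0     # z = landmark_i - camera
        x, P = kalman_update(x, P, z, H, np.array([[0.05**2]]))

print("estimated camera:", round(x[0], 2), " landmarks:", np.round(x[1:], 2))
```

The point of keeping camera and landmarks in one state is that every landmark observation also refines the camera estimate, which is what lets a single hand-held camera serve as the only sensor.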



Previously, it was necessary to calibrate a computer vision system using several markers added to a scene. The Oxford team's machine requires only an object of known size to be placed in its line of sight to perform a complete calibration.
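
The article does not give the calibration details, but for a single camera one object of known size is enough to fix the metric scale, because a pinhole model ties together an object's real size, its size in the image, and its distance. The snippet below is a hedged sketch of that relationship; the focal length and measurements are made-up numbers, not values from the Oxford system.

```python
# Pinhole relation: pixel_size = focal_length * real_size / distance,
# so a known real size plus a measured pixel size pins down the distance.
def distance_from_known_size(real_size_m, pixel_size_px, focal_length_px):
    return focal_length_px * real_size_m / pixel_size_px

# Example: an A4 sheet (0.297 m long edge) spanning 180 px in the image,
# seen by a camera with an assumed focal length of 700 px.
d = distance_from_known_size(0.297, 180.0, 700.0)
print(f"estimated distance to calibration object: {d:.2f} m")
```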



The system then automatically picks out its own visual markers from a scene. By measuring the way these markers move, the computer can judge how far away each one is, and can also rapidly determine how the camera itself is moving.
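
A plausible reading of this, not spelled out in the article, is parallax: when the camera translates, nearby markers shift farther across the image than distant ones, so the size of the shift reveals each marker's depth. The sketch below illustrates the geometry under the simplifying assumption of a sideways camera translation with a known baseline; all numbers are hypothetical.

```python
# Parallel-camera geometry: depth = focal_length * baseline / disparity.
def depth_from_disparity(baseline_m, focal_length_px, disparity_px):
    return focal_length_px * baseline_m / disparity_px

baseline = 0.05          # camera moved 5 cm between the two frames
focal = 700.0            # assumed focal length in pixels
for name, disparity in [("near marker", 35.0), ("far marker", 7.0)]:
    depth = depth_from_disparity(baseline, focal, disparity)
    print(f"{name}: shifted {disparity:.0f} px -> depth {depth:.2f} m")
```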



Special effects



A video (80MB AVI) posted to the team's home page shows the system in action. Its ability to rapidly and accurately model its environment is demonstrated using virtual furniture, including a table and shelves, which are superimposed over the live video footage.



Previously, it has only been possible to add such special effects to a scene afterwards, in the studio. The Oxford team's system can automatically rework a scene at up to 30 frames per second.
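
Reworking footage live at 30 frames per second means each frame must be tracked, augmented, and displayed within roughly 33 ms. The sketch below shows that per-frame budget in outline; track_camera and render_virtual_furniture are placeholders standing in for the real tracking and compositing stages, not the Oxford code.

```python
import time

FRAME_BUDGET_S = 1.0 / 30.0                  # ~33 ms per frame at 30 fps

def track_camera(frame):
    """Placeholder for pose estimation from the frame (hypothetical cost)."""
    time.sleep(0.005)
    return {"position": (0.0, 0.0, 0.0)}

def render_virtual_furniture(frame, pose):
    """Placeholder for projecting and compositing virtual objects."""
    time.sleep(0.010)
    return frame

for frame_index in range(5):                 # stand-in for a live video feed
    start = time.perf_counter()
    frame = object()                         # stand-in frame
    pose = track_camera(frame)
    augmented = render_virtual_furniture(frame, pose)
    elapsed = time.perf_counter() - start
    verdict = "within budget" if elapsed <= FRAME_BUDGET_S else "dropped frame"
    print(f"frame {frame_index}: {elapsed * 1000:.1f} ms ({verdict})")
```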



Davison says the key is to efficiently search for visual markers. "The system is very selective about when it looks for landmarks in its environment," he told New Scientist.
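
The article does not detail the search strategy, but a common way to be selective is "active search": predict where each known landmark should appear from the current camera estimate, then match its stored image template only inside a small window around that prediction instead of scanning the whole frame. The following is a rough sketch of that idea; match_in_window, the window radius, and the synthetic image are all illustrative.

```python
import numpy as np

def match_in_window(image, template, predicted_xy, radius):
    """Exhaustive template match restricted to a (2*radius+1)^2 window."""
    th, tw = template.shape
    px, py = predicted_xy
    best_score, best_xy = -np.inf, None
    for y in range(py - radius, py + radius + 1):
        for x in range(px - radius, px + radius + 1):
            patch = image[y:y + th, x:x + tw]
            if patch.shape != template.shape:
                continue                       # window ran off the image edge
            score = -np.sum((patch - template) ** 2)   # SSD similarity
            if score > best_score:
                best_score, best_xy = score, (x, y)
    return best_xy

rng = np.random.default_rng(1)
image = rng.random((480, 640))
template = image[200:211, 300:311].copy()      # landmark patch seen earlier
found = match_in_window(image, template, predicted_xy=(303, 198), radius=8)
print("landmark re-found at:", found)          # expect (300, 200)
```

Searching only a small predicted window per landmark keeps the per-frame cost low enough for the 30 frames-per-second budget described above.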
