Location-based Mixed Reality for Mobile Information Services

By Dr. Fotis Liarokapis

With advances in tracking technologies, new and challenging navigational applications have emerged. The availability of mobile devices with global positioning system receivers has stimulated the growth of location-based services, or LBS, such as location-sensitive directories, m-commerce and mapping applications.

Today, standard GPS devices are becoming increasingly affordable, and GPS promises to be the leading technology for LBS. Commercial applications such as in-car navigation systems are already established. However, for other potential applications such as pedestrian urban navigation, standard GPS devices still fall short in three respects: accuracy (positional error typically ranges from 10 to 50 meters), coverage in urban areas (between high buildings or inside tunnels) and coverage in indoor environments.

An assisted GPS application has been proposed in which orientation assistance is provided by computer-vision techniques that detect features along the navigation route. These could be either user-predefined fiducials or a careful selection of real-world features (e.g., parts of buildings or whole buildings).

With the combination of position and orientation it is possible to design “augmented reality” interfaces, which offer a richer cognitive experience and deliver orientation information continuously, without the limitations of maps. However, to augment the environment in an AR setup, position and orientation must be computed continuously and in real time. The key problem for vision-based AR is the difficulty of obtaining sufficiently accurate position and orientation information in real time, which is crucial for stable registration between real and virtual objects.
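To make the registration requirement concrete, the following sketch shows how a geo-referenced point could be projected into the camera image once position and orientation are known. It is a minimal illustration assuming a simple pinhole camera, a local East-North-Up coordinate frame and made-up intrinsic values; it is not the LOCUS rendering pipeline.

import numpy as np

def camera_axes(heading_rad, pitch_rad):
    """Camera basis vectors in a local East-North-Up world frame.
    heading: angle from north toward east; pitch: upward tilt."""
    ch, sh = np.cos(heading_rad), np.sin(heading_rad)
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    forward = np.array([sh * cp, ch * cp, sp])   # viewing direction
    right = np.array([ch, -sh, 0.0])             # horizontal right
    up = np.cross(right, forward)
    return right, up, forward

def project(point_world, cam_pos, heading_rad, pitch_rad,
            fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Project a geo-referenced point (metres, ENU) to pixel coordinates."""
    right, up, forward = camera_axes(heading_rad, pitch_rad)
    d = np.asarray(point_world, float) - np.asarray(cam_pos, float)
    x, y, z = d @ right, d @ up, d @ forward
    if z <= 0.0:                                  # behind the camera: nothing to overlay
        return None
    return (fx * x / z + cx, cy - fy * y / z)     # image y grows downward

# Example: annotate a building corner 25 m north of a user facing 10 degrees east of north.
print(project([0.0, 25.0, 4.0], [0.0, 0.0, 1.7], np.radians(10.0), 0.0))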

The Development of Location Context tools for UMTS Mobile Information Services research project, or LOCUS, aims to significantly enhance the current map-based user-interface paradigm on mobile devices through the use of virtual reality and augmented reality techniques. Based on the principles of both AR and VR, a prototype mixed reality interface has been designed that superimposes location-aware 3D models, 3D sound, images and textual information in both indoor and outdoor environments. As a case study, the campus of City University (London) has been modeled, and preliminary tests of the system were performed using an outdoor navigation scenario within the campus.

Research Issues

A number of principal research issues related to calibration and registration techniques are being addressed. They include:

Registration of geographical information with real objects in real time
Use of mixed reality for spatial visualization at decision points
Integration of visualized geographic information with other location-based services

System Architecture and Functionality

To explore the potential of augmented reality in practice, we have designed a tangible mixed reality interface. On the mobile device side, the system architecture comprises a tracking sub-system, a camera sub-system, a graphical representation sub-system and a user-interface sub-system.

The AR module acquires location and orientation information through a client API on the mobile device and sends it to the server. The server builds and renders the scene graph associated with the selected location and returns it to the client for portrayal.
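As a rough illustration of this exchange, the client could package its pose into a request and receive a scene description in return. The endpoint, field names and JSON format below are hypothetical placeholders rather than the actual LOCUS client API.

import json
from urllib import request

def fetch_scene(lat, lon, heading_deg, server="http://example.org/locus/scene"):
    """Send the device pose to the server and return the scene description.
    The URL and JSON schema here are placeholders for illustration only."""
    pose = {"latitude": lat, "longitude": lon, "heading": heading_deg}
    req = request.Request(server,
                          data=json.dumps(pose).encode("utf-8"),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)   # e.g. nodes of the scene graph to portray

# scene = fetch_scene(51.53, -0.10, 45.0)   # illustrative coordinates and heading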

Mixed Reality Pre-navigational Experiences

The stereotypical representation for self-localization and navigation is the map, which leads to the assumption that a digital map is also an appropriate environment representation on mobile devices. However, maps are designed for a detached overview (allocentric) rather than a self-referential personal view (egocentric), which poses new challenges. To investigate these issues, we have developed an indoor pre-navigation visual interface that simulates the urban environment in which the user will ultimately navigate.

In particular, a novel mixed reality interface, referred to as MRGIS, was implemented that uses computer-vision techniques to estimate the camera position and orientation with six degrees of freedom and to register geospatial multimedia information in real time. Two procedures are being pursued for calibration: a technique involving a set of pre-calibrated marker cards, and an algorithm using image processing to identify fiducial points in the scene.
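For the marker-card procedure, one common way to recover a six degrees-of-freedom pose is to match the known corner positions of a pre-calibrated marker against their detected image locations and solve the perspective-n-point problem, for example with OpenCV's solvePnP. The sketch below illustrates that idea under assumed marker dimensions and camera intrinsics; it is not the MRGIS implementation.

import numpy as np
import cv2

# Corners of an 8 cm marker card in its own coordinate frame (metres).
marker_3d = np.array([[-0.04, -0.04, 0.0],
                      [ 0.04, -0.04, 0.0],
                      [ 0.04,  0.04, 0.0],
                      [-0.04,  0.04, 0.0]], dtype=np.float32)

def pose_from_marker(corners_px, camera_matrix, dist_coeffs):
    """Recover the camera pose relative to a detected marker.
    corners_px: 4x2 array of the marker corners found in the image."""
    ok, rvec, tvec = cv2.solvePnP(marker_3d,
                                  np.asarray(corners_px, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)           # rotation vector -> 3x3 matrix
    return R, tvec                        # orientation and translation

# Illustrative intrinsics for a 640x480 camera; real values come from calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
# pose = pose_from_marker(detected_corners, K, np.zeros(5))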

In the context of LOCUS, geospatial information that we believe would make a significant improvement in the navigation process includes 3D maps, 2D maps (with other graphical overlays), 2D and 3D textual information (guidance, description and historical information) and spatial sound (narration).
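One possible way to organize such location-anchored content on the client is sketched below; the class and field names are illustrative assumptions, not the LOCUS data model.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GeoContent:
    """One piece of location-anchored content in the mixed reality scene."""
    kind: str                 # "3d_map", "2d_map", "text" or "spatial_sound"
    latitude: float
    longitude: float
    resource: str             # path or URL of the model, image, text or audio clip
    description: Optional[str] = None   # e.g. guidance or historical notes

@dataclass
class NavigationPoint:
    """A decision point along a route with its attached content."""
    name: str
    content: List[GeoContent] = field(default_factory=list)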

Another advantage of the system is that it provides the user with a tool to control the form of visualization and the level of interaction with geographical information, in both VR and AR environments. MRGIS can operate seamlessly with a variety of input devices.

PDA-based Navigation

Orientation will be provided by a digital compass. A hybrid approach can then be deployed that balances hardware sensing (GPS and digital compass) with computer-vision techniques to achieve the best registration results. In addition, we plan to develop new tools to characterize the spatio-temporal context defining the user's geographic information needs, and to build navigation and routing applications sensitive to this context.
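As a simplified illustration of the hybrid idea, a heading reported by the digital compass could be blended with a heading inferred from recognized visual features. The weighting scheme and function below are assumptions for demonstration, not the project's fusion algorithm.

import math

def fuse_heading(compass_deg, vision_deg, vision_weight=0.7):
    """Blend a compass heading with a vision-derived heading (degrees).
    Angles are combined on the unit circle so that 359 and 1 average to 0,
    not 180. vision_weight reflects how much the vision estimate is trusted."""
    wc, wv = 1.0 - vision_weight, vision_weight
    x = wc * math.cos(math.radians(compass_deg)) + wv * math.cos(math.radians(vision_deg))
    y = wc * math.sin(math.radians(compass_deg)) + wv * math.sin(math.radians(vision_deg))
    return math.degrees(math.atan2(y, x)) % 360.0

# When no visual features are recognised, fall back to the compass alone.
print(fuse_heading(358.0, 4.0))   # -> close to 2 degrees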

Conclusions

This research, funded by the EPSRC through the Pinpoint Faraday Partnership, aims to enhance the UK research base in the emerging field of mobile information science. LBS are a crucial element in the strategy to develop new revenue streams alongside voice, and this research could significantly improve the usability and functionality of these services. The design issues must now be addressed through holistic, task-based studies of the applications needed and their usability at the cognitive, ergonomic, informational and geographic levels.
