At present, there are several government funding initiatives to advance the field of Augmented Reality in Germany.
The BMBF (Federal Ministry of Education and Research) is investing around 39 million euros until the year 2011 to significantly advance the technology and improve Germany's competitiveness in this sector. It states that Augmented Reality is becoming more and more important, for example for surgeons who need to practice operations on realistic models of patients. It is therefore supporting three projects, namely AVILUS, AVILUSplus and VIERforES. The central point for these projects is the Fraunhofer Institute for Factory Operation and Automation IFF in Magdeburg. Besides the financial support from the BMBF, there is also support from industrial partners amounting to 170 million euros.
In the project AVILUS, a consortium of 28 companies and research facilities is developing and testing technologies. Their goal is a user-friendly technology for easily creating Virtual Reality systems. AVILUS is coordinated by Volkswagen AG. There are no concrete achievements to date, so only some examples that the project partners have come up with so far are presented:
As stated in the institution's 2008 annual report, the Fraunhofer IFF's focus in the AVILUSplus project is on the tracking of real objects. Current solutions need special markers on the object to be tracked, as well as special lighting conditions and complex calibration procedures. Therefore, markerless tracking is to be optimized and made more robust, and calibration simplified. VIERforES is a project to build training systems with tactile feedback for surgeons to practice operations on virtual patients.
Underwater AR is an application that turns an ordinary swimming pool into a coral reef. A camera in the diving goggles recognizes markers inside the pool, which are then overlaid with virtual scenery on the display. The necessary computing power is provided by an ultra-mobile computer the swimmer mounts on his back. Because the markers can be hidden by swimming movements, a magnetic-field sensor is used to keep track of the swimmer's orientation.
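The fallback between the two sensors can be sketched as a simple fusion rule: use the optical marker pose whenever a marker is detected, and fall back to the magnetic-field sensor otherwise. This is a minimal illustration only; all names and values are hypothetical and not taken from the actual Underwater AR system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    yaw_deg: float   # estimated heading of the swimmer
    source: str      # which sensor produced the estimate

def fuse_orientation(marker_yaw: Optional[float],
                     magnetometer_yaw: float) -> Pose:
    """Prefer the optical marker pose; fall back to the magnetic-field
    sensor when swimming movements hide the marker from the camera."""
    if marker_yaw is not None:
        return Pose(yaw_deg=marker_yaw, source="marker")
    return Pose(yaw_deg=magnetometer_yaw, source="magnetometer")

# Marker visible: the optical estimate wins.
print(fuse_orientation(41.5, 44.0))  # Pose(yaw_deg=41.5, source='marker')
# Marker hidden by a swimming stroke: the magnetometer keeps tracking.
print(fuse_orientation(None, 44.0))  # Pose(yaw_deg=44.0, source='magnetometer')
```

A real system would additionally smooth the two estimates over time; the hard switch here only illustrates the fallback idea.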
EXPLOAR is used to realize augmented science centers and museums. To this end, several exhibits were equipped with virtual content. As in Underwater AR, the visitor is equipped with goggles and a portable computer.
CONNECT is similar to EXPLOAR, but is used in classrooms to make lessons more illustrative.
CoSpaces stands for "Innovative Collaborative Work Environments for Individuals and Teams in Design and Engineering". A supervisor could, for example, easily retrieve data and information about the planned work and check progress and performance.
ARTHUR is the name of an Augmented Reality round table. It is used in the field of architecture and urban planning. Teams are able to discuss models without the need to physically create them. Fingers or a pen can be used to interact with the buildings, and gesture detection is also used for interaction.
MARA is a personal digital assistant. The user wears a head-up display with GPS sensors to determine his location. The prototype comes with PIM functionality and location-based services.
MQUBE is a system that comes with a 1:4 replica of a stage. It is intended for directors, stage designers and the like. The people involved can visualize their ideas more quickly and more cheaply than on the real stage.
TECHNICAL UNIVERSITY MUNICH
At the TU Munich, research is being done in the fields of Medical Augmented Reality as well as Industrial Augmented Reality.
NARVIS was an experiment using a 3D AR workspace for spine surgery. 3D CT data, fluoroscopy data and endoscopic images can be overlaid on the stereoscopic video. The system is also able to track hand-held surgical instruments using two different techniques. The tools can be graphically enhanced to make it easier for the surgeon to perceive their orientation. With an AR user interface, the surgeon is able to control the visualization of the data.
Camera Augmented Mobile C-arm is a project in which X-ray images taken during surgery are overlaid onto the live images of the operation.
Freehand SPECT for Sentinel Lymph Node Localization. In this project, AR is used for nuclear imaging. This makes it easier to locate lymph nodes during surgery, in comparison to pre-operative tomographic imaging, which can only serve as vague guidance for surgery.
Navigated Beta Probes for Optimal Tumor Resection. The goal in tumor resection is to remove the cancerous cells completely without removing too much healthy tissue. With AR, the patient can be scanned during surgery, so the data is most accurate at that time. In addition, by overlaying the images on the live video, the surgeon can work much more precisely.
3D user interfaces for medical interventions is an area of study that researches how the use of head-mounted displays and alternative interaction technologies can be advantageous over the conventional monitor/mouse combination.
ARAV (Augmented Reality Aided Vertebroplasty) is a project to overcome the problem of the view of the operation site being blocked by instruments.
Improving Depth Perception and Perception of Layout for In-Situ Visualization in Medical Augmented Reality addresses the problem that it is hard to judge how far away a virtual object is from the viewer. For example, if the surgeon gets an image of the virtual spinal cord, the surgeon has to know how far inside the patient it lies.
Virtual Mirror: Interaction Paradigm for Augmented Reality Applications. Since the virtual objects are aligned to the viewpoint of the viewer (e.g. in surgery), traditional ways of interacting with the object do not apply. Therefore, the concept of a virtual tangible mirror was conceived to overcome this restriction.
Augmented Reality Supported Patient Education and Consultation is a project that aims to provide tools to support communication between doctors and patients. Breast reconstruction was chosen as the primary clinical application.
Forlog focuses on humans who have to work in supra-adaptive logistics environments. For example, if an employee has to fetch goods from a specific place, instead of looking for cryptic storage location numbers he just has to follow visual guidance aids provided by the system through the head-up display.
Vision Targeted CAD Models is the keyword for a project that combines 3D CAD models with live imagery. The advantage in comparison to using just 2D markers lies in the known geometry of the observed object: even if large portions of the object are covered by something, it is still possible to track it.
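The occlusion robustness can be illustrated with a toy model-based tracking check: the known 3D model points are projected into the image, and a pose hypothesis is kept as long as enough of them coincide with detected 2D features, so occluded points simply count as misses. This is a hedged sketch under a simple pinhole-camera assumption; the function names, camera parameters and thresholds are hypothetical and not taken from the actual project.

```python
import numpy as np

def project(points_3d, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of 3D model points given in camera coordinates."""
    pts = np.asarray(points_3d, dtype=float)
    return np.stack([f * pts[:, 0] / pts[:, 2] + cx,
                     f * pts[:, 1] / pts[:, 2] + cy], axis=1)

def pose_supported(model_pts, detections, tol=2.0, min_inlier_frac=0.3):
    """Keep a pose hypothesis if enough projected model points have a
    nearby 2D detection -- occluded model points simply count as misses."""
    proj = project(model_pts)
    inliers = 0
    for p in proj:
        for d in detections:
            if np.linalg.norm(p - d) < tol:
                inliers += 1
                break
    return inliers / len(model_pts) >= min_inlier_frac

# Six hypothetical model points on an object 2 m in front of the camera.
model = [[0.1, 0.0, 2.0], [-0.1, 0.0, 2.0], [0.0, 0.1, 2.0],
         [0.0, -0.1, 2.0], [0.1, 0.1, 2.0], [-0.1, -0.1, 2.0]]
all_proj = project(model)
# Simulate heavy occlusion: only 2 of 6 model points are detected.
visible = [all_proj[0], all_proj[3]]
print(pose_supported(model, visible))  # True: 2/6 >= 0.3
```

A marker-based tracker, by contrast, loses the object as soon as its single marker is covered; the known-geometry approach degrades gracefully instead.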
SHEEP stands for "Shared Environment Entertainment Pasture". In this demo, a multiplayer shepherding game is used to explore the possibilities of multimodal, multiuser interaction with wearable computing in an intelligent environment. The game is centered around a table with a beamer-projected pastoral landscape. Players can use different intuitive interaction technologies (beamer, screen, HMD, touchscreen, speech, gestures) offered by the mobile and stationary computers. Building on the DWARF framework, the system uses peer-to-peer, dynamically cooperating services to integrate different mobile devices (including spectators' laptops) into the game.
The project Computer Vision Discrepancy Check tackles a problem of many construction companies. During the planning phase, CAD software is used, but the finally built structures do not always match those CAD models. The software uses anchor plates for orientation, since those are an integral part of every building structure. With their help, the 3D models from the planning phase are overlaid on images of the actually built structure.
At Audi, there have been tests using Augmented Reality to overlay different rim designs on the wheels of a car.
BMW is researching the possibility of using Augmented Reality in BMW service to support maintenance work on vehicles. While doing maintenance, BMW technicians receive additional instructions for their tasks and additional information about the parts.
EADS Military Air Systems developed a prototype for cable harness assembly. Via goggles, the technician gets additional information about the cables. Thanks to a camera within the goggles, the computer can detect where the technician is looking and thus knows which cable he is working on.
Unifeye is a product of Metaio GmbH in Munich, Germany. Their website shows several use cases, as follows.
- The system was used to advertise the new Mini Cabrio: print advertisements in several automotive magazines showed 3D models of the car on the user's screen. The view is adjusted simply by moving the advertisement around in the user's hands.
- The Danish toy manufacturer LEGO uses the system to make it possible for customers to look at the assembled contents without having to open the packaging.
- The iLiving application for the iPhone enables the user to place virtual furniture inside a room, making furnishing a room easier.
- The last example is a book, an atlas enriched with three-dimensional content when filmed with a standard webcam and a PC.
Volkswagen uses Augmented Reality to avoid having to build 1:1 prototypes of their cars. For example, to answer the question of whether a new, bigger car would fit through the drying tunnel of the paint shop, tracking pads were applied to the walls of the tunnel and photographs were taken. With these, the 3D model of the car could be overlaid on the photographs to see whether the car fits through the tunnel. Another application of Augmented Reality is the fitting of motor parts into the engine bay. AR makes this much easier because the fitting accuracy can be tested directly on a prototype.
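Once the car model is registered against the tunnel photographs, the drying-tunnel question reduces to a clearance check of the car's cross-section against the tunnel's cross-section. The sketch below shows only this final geometric test; all dimensions and the safety margin are invented for illustration and are not Volkswagen data.

```python
def fits_through(car_width_m, car_height_m,
                 tunnel_width_m, tunnel_height_m,
                 margin_m=0.05):
    """Does the overlaid car model clear the tunnel cross-section,
    keeping a safety margin on each side and above the roof?"""
    return (car_width_m + 2 * margin_m <= tunnel_width_m and
            car_height_m + margin_m <= tunnel_height_m)

# Hypothetical dimensions: a 1.90 m wide, 1.45 m tall body
# in a 2.10 m x 1.60 m tunnel cross-section.
print(fits_through(1.90, 1.45, 2.10, 1.60))  # True
print(fits_through(2.05, 1.45, 2.10, 1.60))  # False: too wide with margin
```

The value of the AR overlay is that these dimensions come from the registered 3D model and the photographed tunnel, so no physical 1:1 prototype has to be pushed through the line.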