Augmented Reality

Our past research projects

  • SensorVis

    Modern cars are equipped with an increasing number of sensors that perceive the environment, especially the area in front of the vehicle. Fusing such sensor data and analyzing it further to detect other traffic participants is expected to help driver assistance systems increase driver safety.

    For the development of such multi-sensor systems and driver assistance systems, it is necessary to visualize representations of all levels of such data, from raw data of each individual sensor up to fused data and interpreted contextual data. Such visualizations are necessary for debugging during the development of perception systems. They will also become invaluable as cars with ever more sensor functionality enter the market and (re)calibration becomes part of the daily production and maintenance routine, since the correct operation of the sensors has to be evaluated or maintained regularly. Visualization of sensor data can also bridge the gap between researchers in sensor technology and in HMI presentation concepts, leading to new, preferably visual interaction schemes in safety assistance systems...

     

  • Trackframe

    Using AR technology in large industrial plants requires a tracking framework that supports the fusion of various tracking technologies, based on careful evaluation of the tradeoffs between technical requirements and costs. The goal of the trackframe project is to systematically develop the formal basis of such a tracking framework. The concept is expected to cover algorithmic and data-related standards via which tracking systems can be dynamically merged. Furthermore, trackframe aims to provide concepts for analyzing quality criteria (such as error statistics for individual sensors) and for supporting AR engineers in planning improvements to their tracker configuration. Three demonstrators will exhibit trackframe concepts in industrial and academic setups.

     

  • PARENT

    New interaction and information concepts for car drivers are commonly evaluated in driving simulators. Such systems often require a lot of work to realize predefined scenarios.

    What about taking real traffic scenarios and putting them into simulated environments?

    With a suitable system, ergonomics engineers can ask car drivers to create a specific traffic scenario, for instance one that would cause a driver to react in a certain way. Analysis of those scenarios opens up a new opportunity for designing rules that reflect human behaviour.

    This project provides a platform addressing these issues:

    - Sensor data as well as AR- and tracking-based data is used to generate immersive scenarios

    - Driver behaviour can be captured for objective analysis

    - Interactive table-top development of scenarios

    - Multi-channel, configurable immersive presentation from different viewpoints

     

  • PRESENCCIA

    The PRESENCCIA project undertakes a research programme whose major goal is the delivery of presence in wide-area distributed mixed reality environments. The environment will include a physical installation that people can visit both physically and virtually. The installation will be the embodiment of an artificial intelligent entity that understands and learns from its interaction with people. People who inhabit the installation will at any one time be physically there, virtually there but remote, or entirely virtual beings with their own goals and capabilities for interacting with one another and with embodiments of real people. The Augmented Reality Group (FAR) is participating in work package 7, providing a reality model for the inhabited mixed environment. This is a continuation of our work on Ubiquitous Tracking systems.

     

  • Augmented Presentation

    This project aggregates various independent subprojects to create base technology for the seamless integration of well-known desktop metaphors into virtual environments, as well as intuitive interaction metaphors.

    The window-into-a-virtual-world project extends an existing 3D visualization tool to support Powerwall and CAVE environments and ART tracking devices.

    XinCave integrates the display output of an X server into a CAVE (Cave Automatic Virtual Environment). The data is represented as a texture mapped onto an object (e.g., a plane) in virtual space, so the user sees a virtual desktop placed somewhere in the virtual world. In a future SEP, this desktop is to be made responsive to user interaction directly in the CAVE.

     

  • Immersive Configuration

    In user interfaces in ubiquitous augmented reality, information can be distributed over a variety of different, but complementary, displays. For example, these can include stationary, opaque displays and see-through, head-worn displays. Users can also interact through a wide range of interaction devices. In the unplanned, everyday interactions that we would like to support, we would not know in advance the exact displays and devices to be used, or even the users who would be involved. All of these might even change during the course of interaction. Therefore, a flexible infrastructure for user interfaces in ubiquitous augmented reality should automatically accommodate a changing set of input devices and the interaction techniques with which they are used. This project embodies the first steps toward building a mixed-reality system that allows users to configure a user interface in ubiquitous augmented reality.

    This project has been conducted at Columbia University by a visiting scientist from our research group.

     

  • TUMMIC - Thoroughly consistent User-centered Man Machine Interaction in Cars

    The focus of this project lies on the HCI part of automobile technology. Even though sensory information is necessary, the project takes the sensor system as given and therefore focuses on the evaluation of HCIs.

     

  • AiRcraft

    In light of the continuously increasing air traffic of the last decades, technical aids to support the pilot in navigation and flying have become more critical than ever to assure a flawless and safe flight. Primarily, this support is given to the pilot by a graphical presentation of sensor data to depict the actual flight condition (situational awareness). Visual displays play a crucial role here, but the trend leads away from monitors on the center panel and head-up displays towards integrated head-mounted devices, which increase the pilot's freedom of movement. Most of the devices used today are proprietary solutions, many of them based on electromagnetic tracking. In the application field of augmented reality, we encounter many of the same questions, but mostly in a more general context. Camera-based optical tracking is frequently applied in this field due to some major advantages over other tracking methods. The obvious drawbacks of optical tracking, such as highly complex and time-consuming image processing, may soon be compensated by the ongoing development of faster CPUs and GPUs. Moreover, by reducing a "one-fits-all" solution to a problem-customized approach, several known problems of optical tracking can be avoided or at least reduced.

    The goal of this thesis is to design a tracking system for the LFM's flight research simulator using existing techniques from the research field of augmented reality. The approach described here adapts solutions from existing systems to the special requirements of the cockpit environment. In several respects these requirements lead to constraints that should be deliberately exploited whenever possible; in others, the cockpit environment introduces limitations that prevent the application of existing solutions without modification. Hence, a camera-based optical tracking approach was chosen, based on actively emitting fiducials in the infrared spectrum. The requirements of a platform-independent implementation, as well as the future option to migrate the system to other cockpit types, were met as far as possible throughout. Even though the requirements for approval by federal flight authorities could not be taken into account, the system was nevertheless intentionally designed beyond exclusive use in the flight simulator; this is reflected particularly in the fiducial design, which assumes a wide range of lighting conditions in the cockpit. The results presented in this thesis may support the reader in choosing a tracking system tailored to a specific problem, and show some difficulties and possible approaches to their solution. For the tracking, we use an HMD-mounted FireWire camera by PointGrey.
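
    As a rough illustration of the image processing step involved, the sketch below shows how actively emitting infrared fiducials might be detected in a single camera frame. This is a minimal sketch assuming an OpenCV-style pipeline; the threshold and minimum blob size are illustrative placeholders, not values from the thesis.

        import cv2

        def detect_ir_fiducials(gray_frame, threshold=200, min_area=4):
            """Return the centroids of bright IR blobs in a grayscale frame."""
            # Active IR fiducials appear as near-saturated blobs, so a fixed
            # threshold separates them from the darker cockpit background.
            _, binary = cv2.threshold(gray_frame, threshold, 255, cv2.THRESH_BINARY)
            # Connected-component analysis yields one candidate per blob.
            n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
            # Label 0 is the background; tiny blobs are rejected as noise.
            return [tuple(centroids[i]) for i in range(1, n)
                    if stats[i, cv2.CC_STAT_AREA] >= min_area]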

     

  • CAR

    Problem Statement

    The goal of CAR is to create a collaboration platform for computer scientists (UI programmers) and non-technicians (human factors experts, psychologists, etc.). The platform allows the collaborative design of visualizations and interaction metaphors to be used in next-generation cars with head-up displays. We focus on two scenarios: parking assistance and a tourist guide.

    On the technical level we incorporate techniques such as: layout of information on multiple displays, active user interfaces based on user modelling with eye tracking, and an improved User Interface Controller with a rapid-prototyping GUI. Additionally, a dynamically configurable set of filters (each with an appropriate GUI for tuning parameters) is provided, which can easily be instantiated and deployed with DIVE. A minimal sketch of the filter idea follows below.
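
    The following sketch illustrates the filter-chain abstraction only; the class and function names are hypothetical, and the DIVE-based deployment is not shown.

        class FilterChain:
            """Apply a configurable sequence of filters to a stream of values."""

            def __init__(self, *filters):
                self.filters = list(filters)   # callables, each value -> value

            def add(self, f):
                self.filters.append(f)

            def __call__(self, value):
                for f in self.filters:
                    value = f(value)
                return value

        def make_smoother(alpha=0.3):
            """Exponential smoothing filter with internal state."""
            state = {"y": None}
            def smooth(x):
                state["y"] = x if state["y"] is None else alpha * x + (1 - alpha) * state["y"]
                return state["y"]
            return smooth

        # Example: smooth an eye-tracker coordinate, then clamp it to [0, 1].
        chain = FilterChain(make_smoother(), lambda x: max(0.0, min(1.0, x)))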

     

  • Ski - Development of a Measurement Unit for Ground Reaction Forces in Alpine Skiing

    To support the analysis of skiing technique for the national alpine ski team, the ground reaction forces during a downhill run are measured and compared with the skiing style. Until now, a measurement device has been screwed between binding and ski for this purpose. Since this device is too old and no longer meets the requirements (too heavy and imprecise), it is to be redesigned. The collected data is overlaid onto the camera image using Augmented Reality; tracking devices such as GPS and gyroscopes are used for this.

    The goal of the project is to develop a concept for a new, lightweight, accurate, and flexibly deployable measurement unit together with an Augmented Reality-based data visualization.

    To this end, a group of several mechanical engineering students works intensively with several computer science (SEP) students and with researchers from the sports sciences. The project runs from December 2003 until about May/June 2004. The mechanical engineering students primarily take care of developing the sensor hardware, the computer scientists of an AR-based visualization of the measured data. First, the requirements for such a system are to be analyzed together with sports scientists; relatively simple prototypes will then give a first outlook on a possible final system.

     

  • Break Out

    The aim of this project was to implement a simple game that can be used to demonstrate the possibilities of augmented reality in general and of the DWARF framework in particular. The game developed during this project is based on the popular arcade game Breakout, originally released by Atari in 1976.

    In this game the player controls a racket at the bottom of the playing area and has to prevent the ball moving through the area from reaching the bottom edge. At the top of the area there is a number of blocks, which have to be destroyed by hitting them repeatedly in order to win the game.

    The player interacts with the game by moving a tracked object; the movement of this object is used to calculate the movement of the racket. For tracking functionality, Break Out depends on the ARTTracker service. Additionally, the Viewer service is used for visualization purposes.
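
    The mapping from tracked pose to racket motion can be very simple. The following sketch uses hypothetical names, not DWARF's actual service interface: the racket follows the tracked object along one axis and is clamped to the playing field.

        def racket_x_from_pose(tracked_x, field_left, field_right, half_width):
            """Map the tracked object's x coordinate to a racket position.

            The racket follows the tracked object along one axis and is
            clamped so it never leaves the playing area."""
            return max(field_left + half_width,
                       min(field_right - half_width, tracked_x))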

     

  • Navi

    Navigation Aid for the Visually Impaired

    NAVI is a navigation system for blind and visually impaired users. The main aspects of the project are:

    User Interface

    The system will be used by people who cannot see or read the output given on a screen. Therefore we must provide feedback and output in acoustic and tactile form. The input must be designed in such a way that the user knows, even without seeing anything, where in the menus he is and how he can get where he wants.

    Data filtering

    During the navigation process, many events occur for which the system has informational output for the user. Through data filtering we want to infer which information the user wants to know and which information is not of interest.
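
    One simple way to realize such filtering is a priority rule: safety-critical events always pass, while merely informational events pass only if they match the user's stated interests. The sketch below is an assumption about how this could look, not the project's actual implementation.

        from dataclasses import dataclass

        @dataclass
        class NavEvent:
            category: str   # e.g. "obstacle", "turn", "point_of_interest"
            urgency: int    # 0 = informational, 1 = relevant, 2 = safety-critical

        def filter_events(events, user_interests):
            """Keep safety-critical events and events the user cares about."""
            return [e for e in events
                    if e.urgency >= 2 or e.category in user_interests]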

     

  • heARt - Heart surgery Enhanced by Augmented Reality Techniques

    Minimally invasive or totally endoscopic cardiac surgery is an operation technique in which physicians operate through small incision points at the operating region, either using endoscopic tools or master-slave robot systems. In either case, the physician's field of view is limited to a small area around the operating region. To provide a more general overview, pre-operative imaging and planning data can be brought in through Augmented Reality techniques.

    PORT - pre-operative planning and intra-operative guidance and navigation for robotically assisted minimally invasive cardiovascular surgery:

    - Design of a Planning Tool for Port Placement in Robotically Assisted Minimally Invasive Cardiovascular Surgery (MarcoFeuerstein)

    - Design of an Intra-Operative Augmented Reality Navigation Tool for Robotically Assisted Minimally Invasive Cardiovascular Surgery (JoergTraub)

    STENT - pre-operative planning and navigation for STENT implantations:

    - Development of a Planning and Navigation Tool for Endoscopic Treatment of abdominal Aortic Aneurysms - Computer Supported Implantation of a Stent Graft (MartinGroher)

    - Mathematical Methods of Image Processing for Automated Navigation in Endoscopic Treatment of Abdominal Aortic Aneurisms - Computer Aided Implantation of a Stent Graft (PeterKeitler)

    The project started in March 2003 and finished in November 2003 as a bundle of diploma theses supervised by Prof. Gudrun Klinker (Fachgebiet Augmented Reality) and Prof. R. Bauernschmitt (German Heart Center). Many of the participants now work at the Chair for Computer Aided Medical Procedures of Prof. Nassir Navab.

     

  • Tangible User Interfaces

    Interaction techniques for Augmented Reality user interfaces (UIs) differ considerably from well-explored 2D UIs because they include new input and output devices and new interaction metaphors such as tangible interaction. For experimenting with new devices and metaphors, we propose a flexible and lightweight UI framework that supports rapid prototyping of multimodal and collaborative UIs. We use the DWARF framework as a foundation; it allows us to build highly dynamic systems enabling the exchange of components at runtime.

    Our framework is a UI architecture described by a graph of multiple input, output, and control components (the User Interface Controller, UIC).
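
    To illustrate the graph idea, the sketch below models components and directed connections between them. The component names are taken from the projects on this page, but the data structure itself is only an illustrative assumption, not the framework's actual implementation.

        from dataclasses import dataclass, field

        @dataclass
        class Component:
            name: str
            kind: str                                   # "input", "output", or "control"
            outputs: list = field(default_factory=list) # downstream components

        def connect(src, dst):
            """Add a directed edge from a producer to a consumer."""
            src.outputs.append(dst)

        # A tiny UI graph: a tracker feeds the controller, which drives a viewer.
        tracker = Component("ARTTracker", "input")
        uic = Component("UserInterfaceController", "control")
        viewer = Component("Viewer", "output")
        connect(tracker, uic)
        connect(uic, viewer)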

     

  • ARCHIE - Augmented Reality Collaborative Home Improvement Environment

    To support the collaborative design of buildings with all involved parties, ARCHIE (Augmented Reality Collaborative Home Improvement Environment) shall give the system user the most familiar way to handle information during the development process.

     

  • FixIt

    Augmented Reality (AR) allows users to view computer information that is graphically embedded within the real three-dimensional world. Using a semi-transparent head-mounted display (HMD) attached to a wearable computer, a user can inspect and manipulate objects while viewing information about these objects in the HMD. This information is typically displayed as virtual objects in the real world, thus augmenting the perception of the user. The wearable computer enables users to pursue their work as they normally do, without imposing constraints on their mobility. AR applications span from medical minimally invasive surgery to manufacturing, from machine inspection and repair to games and tourist guides.

    At the Chair for Applied Software Engineering, we have set up a basic system to perform simple Augmented Reality tasks. This Praktikum enhances the basic system, focusing on scenarios that support the inspection or repair of machines (Fischer Technik models).

    * Fixit Project Homepage: wwwbruegge.in.tum.de/projects/fixit/

     

  • TRAMP

     

  • PAARTI - Practical Application of Augmented Reality in Technical Integration

    More Information:

    A presentation of the prototype system (in German) was given by BMW at the 3rd ARVIKA forum in July 2003. The system is currently being modified and installed for production use.

     

  • Project Fata Morgana

    Project description

    Setup

    Lab project for 25 students and instructors during the spring semester 2001 at TU Munich

    In-depth project (system development, SEP) for a single student working directly with designers, technicians, and managers at BMW during the fall semester 2001-2002

    Case studies

    We placed a camera on a designer's head to record his head motions while he evaluated car shapes in different scenarios. This helped us gain an appreciation of the tracking requirements (range and speed of head rotations) that an AR system would have to satisfy in different scenarios.

     

  • Pathfinder

     

  • Praktikum 01: STARS

    Augmented Reality Lab Course (Praktikum)

    "Erweiterte Realität" (Augmented Reality, AR) ist eine neue Technologie, mit der Benutzern Computerinformationen in einer halb-transparenten Datenbrille (HMD) drei-dimensional in ihr Sichtfeld eingeblendet werden, so daß der Eindruck entsteht, daß diese virtuellen Objekt innerhalb der realen Umwelt existieren. Wenn ein Benutzer sich in seiner Umwelt bewegt, bleiben die virtuellen Objekte an "ihrem Platz"; man kann sie sich also von allen Seiten ansehen und sie wie die realen Objekte manipulieren.

    A typical problem in the maintenance of large industrial plants is that workers have to orient themselves in new environments as quickly and reliably as possible and, among many very similar-looking machine parts (e.g., many parallel pipe installations in a power plant), pick out one specific part for repair or inspection (the r-th pipe, the f-th window, the l-th ladder rung, the b-th block, the s-th screw). Such a component cannot be identified unambiguously from local information alone; it can only be identified in the extended context of the environment, for example by counting off all similar components from a known reference position.

    Augmented Reality concepts can help establish this contextual knowledge. One approach would be to develop an AR system that analyzes the local situation with a mobile TV camera carried along by the user, recognizes that no unambiguous choice can be made, and then, using global knowledge about the environment, guides the user with arrows to turn their gaze in a certain direction so that an unambiguous "starting criterion" can be found. From there, the sought component can be identified by sequential computer-based counting (with correspondingly directed head movements of the user). More "comfortable" approaches might additionally exploit global knowledge available in the environment, such as tracking information from stationary surveillance cameras that determine the user's current position more precisely and thus put less strain on the user's way of working.

    In this Praktikum, students will develop and compare different approaches to this problem.

     

  • DWARF - Distributed Wearable Augmented Reality Framework

    DWARF is a CORBA-based framework that allows the rapid prototyping of distributed Augmented Reality applications. Read more...

     

  • Arvika

    ARVIKA explores concepts of Augmented Reality (AR) for industrial applications in development, production, and service. It is a German consortium funded by the German Federal Ministry of Education and Research (BMBF) that researches Augmented Reality technologies to support work processes in development, production, and service for complex technical products and plants. The project is very application-driven and consists of five sub-projects: the vertical projects "AR in Development", "AR in Production", and "AR in Service" are complemented by the horizontal projects "User-driven System Design" and "Base Technologies for AR".

     

  • Forlog (Supra-adaptive Logistic Systems)

    The focused view on human resources within supra-adaptive logistics systems establishes the fundamentals for the mobility of knowledge and the flexibility of employees.

     

  • 3D Visualization and Exploration of Relationships and Constraints Using the Example of Sudoku Games

    In recent years, many systems for visualizing and exploring massive amounts of data have emerged. A topic that has not yet been investigated much concerns the analysis of constraints that different objects impose on one another regarding co-existence. We present concepts for visualizing mutual constraints between objects in a three-dimensional virtual and AR-based setting. We use 3x3 Sudoku games as suitable examples for investigating the underlying, more general concepts of helping users visualize and explore the constraints between different settings in rows, columns, and blocks of matrix-like arrangements. Constraints are shown as repulsive and magnetic forces making objects keep their distance or seek proximity.
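
    A minimal sketch of such a force model follows, assuming the common inverse-square form for repulsion; the exact force laws and parameters used in the project are not specified here.

        import numpy as np

        def constraint_force(pos_a, pos_b, conflicting, k_rep=1.0, k_att=0.2):
            """Force exerted on object a by object b.

            Conflicting pairs (same digit sharing a row, column, or block)
            repel with an inverse-square law; compatible pairs are weakly
            attracted, so valid placements seek proximity."""
            d = np.asarray(pos_b, dtype=float) - np.asarray(pos_a, dtype=float)
            r = np.linalg.norm(d) + 1e-9        # avoid division by zero
            if conflicting:
                return -k_rep * d / r**3        # push a away from b
            return k_att * d / r                # pull a gently towards b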

     

  • Augmented Chemistry

    The aim of this project is to help chemists create and visualize molecules and chemical reactions. It will be possible to see whether molecules have enough space to react with each other. The molecules are rotated and placed with markers in the real world and displayed on a monitor or on VR glasses.

     

  • Asyntra

    A.R.T. is a leading manufacturer of IR-optical tracking systems for professional use (www.ar-tracking.de). An important application area of A.R.T. tracking systems is the tracking and virtual reproduction of the movements of persons, which is used, among other things, to conduct ergonomic studies. This measurement task is frequently hampered by occlusions (the "line of sight" problem). In this project, the conventional ("global") tracking system is complemented by a larger number of so-called small cameras. Small cameras here means various inexpensive cameras, such as USB cameras or camera phones, which can also transmit measurement data wirelessly. These small cameras are arranged so that they observe the parts of the measurement volume that the global tracking system cannot see well, enabling reliable and accurate tracking there too. Automated routines calibrate the positions of these cameras during operation ("on the fly"). So that these small cameras can be integrated into the global tracking system, the tracking algorithms have to be extended to correctly process asynchronous observations. This makes it possible to quickly realize ad-hoc setups that track reliably and accurately even in poorly visible areas.
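
    One way to handle such asynchronous observations is sketched below under simplifying assumptions (a one-dimensional constant-position model; all class and parameter names are illustrative): measurements from all cameras are buffered with their timestamps and applied in time order by a Kalman filter whose uncertainty grows between updates.

        import heapq

        class AsyncFuser:
            """Fuse timestamped measurements arriving out of order."""

            def __init__(self, x0=0.0, var0=1.0, process_var=0.01):
                self.x, self.var = x0, var0      # state estimate and variance
                self.process_var = process_var   # uncertainty growth per unit time
                self.t = None                    # time of the last applied update
                self.pending = []                # heap of (timestamp, value, variance)

            def add(self, t, z, meas_var):
                heapq.heappush(self.pending, (t, z, meas_var))

            def estimate_at(self, t_query):
                # Apply all buffered measurements up to the query time, oldest first.
                while self.pending and self.pending[0][0] <= t_query:
                    t, z, r = heapq.heappop(self.pending)
                    if self.t is not None:          # predict: inflate uncertainty
                        self.var += self.process_var * max(t - self.t, 0.0)
                    k = self.var / (self.var + r)   # Kalman gain
                    self.x += k * (z - self.x)      # measurement update
                    self.var *= 1.0 - k
                    self.t = t
                return self.x, self.var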

     

  • Speed Up

    In various areas of our daily lives, unforeseeable events keep occurring that endanger everyone involved. If such an event results in a high number of injured persons or in a complex situation with potentially catastrophic follow-on hazards, it is called a large-scale emergency. The shared vision of SpeedUp is an integrated crisis response of rescue and emergency forces through an overall organizational and technical solution accepted by all parties involved, in which the design of suitable user interfaces plays a central role. Our vision is to integrate mobile and stationary UIs into the response workflow in such a way that the disaster can be worked through more quickly and the rescue of all injured persons is accelerated. Official project page

     

  • Avilus

    Extending tracking environments at macro and micro scales (Avilus):

    On the one hand, careful sensor fusion is used to extend the tracking range, with the emphasis on seamless transitions between different tracking systems (e.g., in large-scale logistics applications). On the other hand, a simulation environment is used to identify the essential factors influencing the achievable accuracy of high-precision tracking, so that targeted improvements can be introduced.

     

  • Virt.Arabia (KAUST K2)

    The goal of this project is to develop a virtual environment for the interactive visual exploration of Saudi Arabia. In contrast to virtual globe viewers like Google Earth, this environment will allow the user to look both above and underneath the earth's surface in an integrated way. It will thus provide interactive means for the visual exploration of 3D geological structures and dynamic seismic processes, as well as atmospheric processes and effects or built and planned infrastructure. The specific techniques required to support such functionality will be integrated into a generic infrastructure for visual computing, allowing essentially all KAUST Research Institutes to use parts of this functionality in other applications. In particular, we expect close impact on and links to the KAUST 3D Modelling and Visualisation Centre and the KAUST Computational Earth Sciences Centre.

     

  • TISCH

    Recently, interaction devices based on natural gestures have become more and more widespread, e.g. with Jeff Han's work (watch on YouTube), Microsoft Surface or the Apple iPhone. These devices particularly support multi-touch interaction, thereby enabling a single user to work with both hands or even several users in parallel.

    In this project, we explore the applications of multi-touch surfaces, both large and small. We are building and evaluating new hardware concepts as well as developing software to take advantage of the new interaction modalities.

     

  • DySenNetz

    This project provides the foundations and formal basis for building dynamic systems that fuse multiple sensors online. To this end, it lays the groundwork for building ad-hoc tracker networks for Augmented Reality, and formulates and analyzes the accumulation of measurement errors and sensor noise.
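
    As a worked example of error accumulation, consider chaining n relative measurements with independent errors: the variances add, so the standard deviation grows with the square root of the chain length. A minimal sketch of this rule (the project's actual error models are more general):

        import numpy as np

        def chained_sigma(step_sigmas):
            """Standard deviation of a pose chained from independent noisy steps.

            For independent errors, variances add along the chain:
            Var(x1 + ... + xn) = sum(Var(xi))."""
            return float(np.sqrt(np.sum(np.square(step_sigmas))))

        # Five tracker hand-offs with 2 mm error each: ~4.47 mm, not 10 mm.
        print(chained_sigma([2.0] * 5))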

     

  • SHEEP - The Shared Environment Entertainment Pasture

    In this demo, we use a multiplayer shepherding game to explore the possibilities of multimodal, multiuser interaction with wearable computing in an intelligent environment. The game is centered around a table with a projected pastoral landscape. Players can use different intuitive interaction technologies (projector, screen, HMD, touchscreen, speech, gestures) offered by the mobile and stationary computers. Building on our DWARF framework, the system uses peer-to-peer, dynamically cooperating services to integrate different mobile devices (including spectators' laptops) into the game.

     

  • FISCH (Fully Immersive Sophisticated Conformal HUD)

    AR for car drivers has the potential to reduce traffic blind times. This project develops a large-scale HUD with a distant focal plane and software-based image undistortion. A sample application shows navigation arrows superimposed directly on the street.
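
    Software-based undistortion is typically done by precomputing an inverse warp and resampling every rendered frame through it. A minimal sketch using OpenCV's standard remap operation follows; the calibration that produces the maps is assumed and not shown here.

        import cv2

        def prewarp_hud_frame(frame, map_x, map_y):
            """Resample a rendered HUD frame through a precomputed inverse warp.

            map_x and map_y (float32 arrays of the output size) store, for each
            output pixel, the source pixel that the HUD optics will project
            there, so the optics' distortion and this pre-warp cancel out."""
            return cv2.remap(frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)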

     

  • FISCH Application

     

  • TUMult

    Accidents in mass transport systems, nuclear events, and terrorist attacks differ from day-to-day emergencies. The high number of injured in so-called mass casualty incidents (MCIs) leads to a disproportion between the number of injured needing care and the resources available to provide it. The goal of the TUMult project is to develop user interfaces for mobile devices that support paramedics in performing the triage process in MCIs, and to evaluate them in different disaster control exercises. Fire fighters require mobile, computer-based triage systems that do not delay existing operational procedures. These mobile systems must therefore take the extraordinary environment into account and react flexibly to changes in the unstable situation.

     

  • Ubiquitous Tracking (Ubitrack)

    Augmented Reality (AR) applications are highly dependent from accurate and precise tracking data. Since current tracking technologies do not always provide such information everywhere in real-time application developer must combine certain trackers together to minimize the disadvantage of one tracker by another. These sensor networks can then be used to deliver positional relation information of objects to the application which then can be evaluated. Currently most AR application bring along their own customized solution of this problem. However these solutions are hardly reusable in other systems. This inhibits the development of large-scale sensor networks because there are no standard interfaces between these technologies. By introducing the Ubitrack framework it is possible to form ubiquitous tracking environments which may consist of several sensor networks.