Automotive User Interfaces
The shift from manual to automated driving brings new challenges to the automotive domain. Our research explores novel interfaces that support not only drivers during manual driving but also drivers and passengers of automated vehicles.
We evaluate novel interfaces such as a vibro-tactile belt, a shape-changing steering wheel, or ambient light patterns inside the vehicle. Scenarios include navigation, highway overtaking, keeping a safe distance, taking over control of an automated vehicle, and intuitively understanding an automated vehicle's intention.
For most of our experiments, we use a 150° fixed-base driving simulator and, depending on the research question, measure driving performance, gaze behavior, and/or driving experience.
This research is supported by the projects COMPANION and CSE.
Contact: Wilko Heuten
Augmented and Virtual Reality
We are investigating novel interaction methods and visualization techniques for Augmented and Virtual Reality. In that regard, we also fabricate new devices in our own Fablab. We evaluate our results in empirical lab studies.
Main Research Topics
- Visualization of out-of-view objects in Mixed Reality
- Attention Guidance in Cyber-Physical Systems
- Reducing alarms in Intensive Care Units
- Novel techniques for medical care
- Industry 4.0
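A recurring subproblem in visualizing out-of-view objects is mapping an object's position to a direction cue at the screen border. As an illustration only (a minimal 2D sketch, not the technique used in our studies; function name and parameters are hypothetical), the cue direction can be derived from the bearing to the object relative to the viewing direction:

```python
import math

def offscreen_indicator(obj_xy, view_dir_deg, fov_deg=90.0):
    """Return the angle (degrees, relative to the viewing direction)
    at which to draw a border cue for an object, or None if the
    object already lies within the field of view."""
    dx, dy = obj_xy
    bearing = math.degrees(math.atan2(dy, dx))              # world angle to object
    rel = (bearing - view_dir_deg + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    if abs(rel) <= fov_deg / 2:
        return None       # object visible, no cue needed
    return rel            # e.g. ~+90 degrees: cue at the left border
```

Real Mixed Reality indicators (e.g., arrows or halos) extend this idea to 3D with pitch as well as yaw, but the core step of normalizing a relative bearing stays the same.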
Contact: Sebastian Weiß
AEQUIPA
An important key to healthy aging is physical activity. The AEQUIPA project aims to develop interventions that promote physical activity in old age. To this end, the factors within a community that influence mobility are investigated and interventions are developed. A particular focus lies on measures that are applicable to all socio-economic groups within the community.
Within the scope of AEQUIPA, OFFIS examines technology-based interventions built on sensor-detected vital parameters to preserve the mobility of older people. Furthermore, OFFIS is working on preventive measures against the functional decline of muscle groups and is creating a system for monitoring and displaying physical activity.
Project websites: OFFIS AEQUIPA project page and aequipa.de
Contact: Kai von Holdt
Understandable Privacy Policies
Emerging technologies are deeply ingrained in our day-to-day lives. To use a smart watch that reminds us to stay active or social media that connects us with friends and family, we accept and allow access to our private data. Often, the privacy policies we agree to are long texts, written in a way that is difficult and time-consuming to comprehend.
The PANDIA project aims to make data protection more interactive and understandable and to help users make informed decisions about the use of their data. As part of the project, we create prototypes of a PANDIA app and browser plugin suitable for everyday use. In addition, we investigate creative interaction methods and visualizations offered by Augmented or Virtual Reality and by gamification techniques.
Rich Interactive Materials for Everyday Objects in the Home (RIME)
With the world gradually moving towards affordable smart home setups, new design and technical challenges are emerging. Each vendor has its own bespoke interaction concepts and techniques, which users must learn and remember.
These varying interaction concepts lead to users becoming frustrated, making mistakes, and having negative experiences, ultimately causing promising solutions to be discarded. However, there is an opportunity to utilise artefacts and technologies naturally embedded in daily practices as the basis for new, holistic control interfaces and mediums.
Therefore, the Rich Interactive Materials for Everyday Objects in the Home (RIME) project seeks to unlock the interactive potential for rich interaction with the materials in our smart environments.
The RIME project pursues this goal by designing, prototyping, and evaluating scalable sensor and actuator technology, together with touch interaction paradigms, for seamless integration into everyday materials and objects, enabling natural and scalable hands-on interaction with our future smart homes. As a result, the physical artefacts in our homes, such as chairs, tables, walls, and other surfaces, can be equipped with an interactive digital “skin” or contain interactive sensor and actuator materials; swiping along a table to unfold it for additional guests may become a possible scenario.
Contact: Michael Chamurnowa
Gestural interaction paradigms for smart spaces (GrIPSs)
With the advent of intelligent environments (e.g., smart homes) and wearable computing technologies, 2D gestures are slowly disappearing and being replaced by more natural interaction modalities, such as voice or spatial 3D gestures, which take advantage of the whole body for interacting with pervasive computing environments. While 3D gestural interaction has been explored for many years, there is still no general vocabulary of gestures that generalizes across different spaces and situations, nor a metric that allows for a comprehensive assessment of the quality and usability of gestures in different contexts. We aim to understand and support interaction gestures, particularly in smart environments. Here, we will look at single gestures and gesture sequences carried out not only with one hand, but also bimanually and with the support of the whole body.
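Comparing gesture executions across users and contexts often comes down to template matching. As a hedged illustration of one common approach (a $1-recognizer-style comparison; this is not the project's method, and the function names are hypothetical), two 2D gesture paths can be resampled to the same number of points and compared by their mean point-to-point distance:

```python
import math

def resample(points, n=32):
    """Resample a 2D path to n points evenly spaced along its arc length."""
    total = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    if total == 0:
        return [points[0]] * n
    step = total / (n - 1)
    pts = list(points)
    out = [pts[0]]
    acc = 0.0
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if seg > 0 and acc + seg >= step:
            # interpolate a new sample point on this segment
            t = (step - acc) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += seg
        i += 1
    while len(out) < n:       # guard against float rounding at the path end
        out.append(pts[-1])
    return out

def path_distance(a, b, n=32):
    """Mean point-to-point distance between two resampled gesture paths."""
    ra, rb = resample(a, n), resample(b, n)
    return sum(math.dist(p, q) for p, q in zip(ra, rb)) / n
```

A full recognizer would also normalize translation, scale, and rotation before comparison; the sketch shows only the resampling and distance step.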
Intelligent Self-Service Systems (ISSS) in the public sector
Self-service systems are well integrated into our everyday life, e.g., self-service checkouts in retail. However, they appear more often in the private sector than in the public sector (e.g., civil service).
In the ISSS.KOM project, we investigate the effect of interacting with an Intelligent Self-Service System (ISSS) instead of with a human (bureaucrat) in public administrative processes, such as passport renewals or car registrations. In addition, we aim to examine the level of citizen satisfaction and acceptance of ISSS in such settings.
These goals are to be achieved by designing an ISSS model that is visually and vocally responsive to humans, their languages, expressions, and emotions. We will then use an empirical approach to examine the efficiency of our prototype in five major cities in Lower Saxony.
Transparent Usage of Personal Health-Data
Monitoring, tracking and controlling individual health is becoming increasingly popular through fitness trackers and apps. But what happens to this data?
Nowadays, our individual health data is mostly in the hands of health insurance companies, doctors, and health care facilities, and is not managed by each person individually.
Since the collected data is sensitive and considered to be worthy of special protection, it is currently very difficult to allow other institutions to use this data, even though the users could benefit from sharing it, for example, in their regular health care.
The Health-X project aims to put citizens at the center of providing, using, and controlling their own health data. With the dataLOFT platform, users will be able to work with their data in an ecosystem of health applications that uses GAIA-X standards to ensure data security and privacy.
As part of the project, future users will be surveyed, and prototypes of health applications will be created and tested based on the needs captured.
A user-centered design process aims to empower users as individuals to understand the use of their data, trust the process behind it, and establish responsible usage of that data.
Main research topics:
- User-centered design
- Health data
- Smart Wearables
Gaze Behavior in Reading
Main research topics:
- Inferring reading progress from gaze behavior
- Detecting reading problems in real time
- Assistive visualizations for word and sentence level decoding
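Two signals that make gaze traces useful for reading research are the reader's rightmost position in a line and backward jumps (regressions). As a toy sketch only (assuming fixation-level x-coordinates in pixels along one line of text; this is not our actual pipeline, and the names and thresholds are hypothetical):

```python
def count_regressions(fixation_x, min_jump=20):
    """Count backward jumps (regressions) in a sequence of
    fixation x-positions along one line of text."""
    regressions = 0
    for prev, cur in zip(fixation_x, fixation_x[1:]):
        if prev - cur > min_jump:   # gaze moved left by more than min_jump px
            regressions += 1
    return regressions

def reading_progress(fixation_x, line_start, line_end):
    """Rightmost fixation so far, as a fraction of the line width."""
    frontier = max(fixation_x, default=line_start)
    return max(0.0, min(1.0, (frontier - line_start) / (line_end - line_start)))
```

Detecting reading *problems* in real time would require more than this, e.g., fixation durations and line-change detection, but frontier position and regression counts are a common starting point.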
Contact: Tobias Lunte