This project was completed in collaboration with Corinna Hirt and Kalle Reiter at the HfG Schwäbisch Gmünd. The aim of the course (Invention Design, Prof. Jörg Beck) is to research future-focused technologies and design for interactions that may arise when these technologies mature. An emphasis was placed on anticipating the societal impact of these technologies and accounting for potential pitfalls in our designs. For this project, Kalle, Corinna, and I decided to research artificial intelligence and machine learning technologies. Our project converged around the concept of adaptive user interfaces.
An Adaptive User Interface (AUI) is a user interface that dynamically adjusts its layout, elements, functionality, and/or content to a given user's needs, capabilities, and context of use.

Human-to-human interaction is characterized by bidirectional perception and reaction.
Human-computer interaction is constrained by the computer's inability to perceive its user.
With advances in sensor technologies and machine learning algorithms, computers are becoming increasingly capable of interpreting human emotions and behaviors. With this in mind, we asked ourselves what new interactions become possible once an interface can perceive its user. The project has two components:
  1. a website outlining our research and design guidelines
  2. an interactive smart-mirror installation

Which kinds of information can an adaptive UI capture?
How can these parameters be constructed from sensor data?
Disclaimer: This table is not based on peer-reviewed research, and it makes no claims about the accuracy or reliability of each algorithm. It is intended for exploratory and demonstration purposes.
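As one illustration of how such a parameter might be constructed from sensor data, a user's distance from the screen can be roughly estimated from face keypoints with a pinhole camera model. This is a minimal sketch, not code from our prototype; the focal length, keypoint coordinates, and average inter-pupillary distance below are assumed values.

```python
# Hypothetical sketch: estimating user distance from detected eye keypoints
# via the pinhole model: distance = focal_length * real_size / pixel_size.
# All constants are illustrative assumptions, not calibrated measurements.

AVG_IPD_MM = 63.0        # average adult inter-pupillary distance
FOCAL_LENGTH_PX = 950.0  # camera focal length in pixels (assumed)

def estimate_distance_mm(left_eye, right_eye):
    """Estimate camera-to-face distance from the pixel gap between the eyes."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    ipd_px = (dx * dx + dy * dy) ** 0.5
    return FOCAL_LENGTH_PX * AVG_IPD_MM / ipd_px

# Example: eyes detected 120 px apart -> roughly half a meter away
print(round(estimate_distance_mm((300, 240), (420, 240))))  # 499
```

In a real system, the keypoints would come from a face-tracking model and the focal length from a one-time camera calibration; the same geometric reasoning then turns raw pixels into a usable "distance" parameter.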

The website outlines our findings and serves as digital documentation for our project. We outline the design potential and dangers of adaptive UIs and give a technical description of how such a system could work. During the exhibition, it was displayed alongside our mirror for visitors who wanted to learn more about AUIs.

How might an AUI adapt to differences in distance?
How might an AUI adapt to differences in size?
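One simple answer to the distance question is to scale type so that it subtends a roughly constant visual angle: text grows linearly as the user steps back. The sketch below is an illustrative assumption, not a rule from our guidelines; the base size, reference distance, and clamping bounds are invented values.

```python
# Hypothetical sketch: scaling font size with viewing distance so text
# stays roughly the same apparent size. All constants are illustrative.

BASE_FONT_PX = 16.0            # comfortable size at the reference distance
REFERENCE_DISTANCE_MM = 500.0  # assumed typical viewing distance

def adaptive_font_px(distance_mm, min_px=12.0, max_px=64.0):
    """Scale the base font size linearly with distance, within sane bounds."""
    scaled = BASE_FONT_PX * distance_mm / REFERENCE_DISTANCE_MM
    return max(min_px, min(max_px, scaled))

print(adaptive_font_px(500))   # at the reference distance: 16.0
print(adaptive_font_px(2000))  # four times farther: clamped to 64.0
```

Clamping matters in practice: without an upper bound, a user across the room would blow the layout apart, and without a lower bound, text would become unreadable up close.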

Smart Mirror
Source code · Demo video
The smart mirror was displayed at the HfG Schwäbisch Gmünd semester exhibition. Our mirror was designed to playfully and intuitively communicate the present capabilities of machine perception to the general public and provoke critical discussion thereof. Equipped with facial recognition, keypoint tracking, and emotion recognition, the mirror could recognize past visitors, estimate a user’s age and gender, and display facial features and emotional cues in real time.
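Per-frame emotion classifiers are noisy, so a real-time display like the mirror's typically smooths scores over time before showing them. Below is a minimal sketch of exponential moving-average smoothing; the emotion labels and scores are illustrative and do not reproduce our prototype's actual classifier output.

```python
# Hypothetical sketch: smoothing noisy per-frame emotion scores with an
# exponential moving average so on-screen feedback does not flicker.

def smooth_emotions(frames, alpha=0.3):
    """Blend each frame's emotion scores into a running average.

    alpha controls responsiveness: higher values track the newest frame
    more closely, lower values give a steadier display.
    """
    state = {}
    for scores in frames:
        for emotion, value in scores.items():
            prev = state.get(emotion, value)
            state[emotion] = (1 - alpha) * prev + alpha * value
    return state

frames = [
    {"happy": 0.9, "neutral": 0.1},
    {"happy": 0.2, "neutral": 0.8},  # a single noisy frame
    {"happy": 0.9, "neutral": 0.1},
]
smoothed = smooth_emotions(frames)
print(max(smoothed, key=smoothed.get))  # dominant emotion stays "happy"
```

A raw per-frame readout would have briefly flipped to "neutral" on the noisy middle frame; the running average rides through it, which is the behavior a mirror-style display needs.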
Mockup mirror design (left), Screenshot of real prototype (right)
Displaying our smart mirror at the HfG's exhibition.
© Matthew Jörke, 2019