In this course we researched future-focused technologies and design for interactions that may arise when these technologies mature. An emphasis was placed on anticipating the societal impact of these technologies and accounting for potential pitfalls in our designs.
project duration: one semester
course: Innovation Design 1
semester: 3rd in 2019
Adaptive user interfaces (AUIs) are an evolution of standard static UIs, which present the same interface regardless of the user or the context of use. Most interfaces we encounter today are static. AUIs are a stepping stone towards ubiquitous computing (ambient intelligence), a futuristic vision in which every object is equipped with computing, sensing, and networking capabilities. In this vision, the border between the environment and the computing interface is blurred – the entire environment becomes an interface.
An interface is always based on communication between the user and the system, which creates interaction. Our first step was to analyze the interactions between different participants and systems. The first questions we addressed were: how can an artificial intelligence get to know the user, and who performs the adaptation?
We start from human-to-human interaction, in which people communicate directly with one another. The ability to interpret each other, draw conclusions about our environment, and react intelligently is at the core of human existence.
In human-to-computer interaction, much of this communicative information is lost. As it stands, classical computers are simply incapable of interpreting humans the way we interpret each other. In this project, we asked ourselves: what if human-computer interaction were as rich as human-to-human interaction? How could we design computers to intelligently interpret and react to human behaviour?
In a public setting, it is impossible to design a system perfectly tailored to each individual's dimensions. An interface suited for a moderately tall user will prove cumbersome for both tall and short users. What if we didn't have to design a single interface for everyone? Adaptive UI systems could conceivably detect a user's height and adapt to it, vertically aligning and scaling the interface appropriately.
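The vertical-alignment idea can be sketched in a few lines: center a panel on the user's estimated eye level, clamped to the display's usable area. This is a minimal illustration, not our installation's code; the names (`UserEstimate`, `align_panel`) and the centimetre values are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class UserEstimate:
    eye_height_cm: float  # estimated eye level above the floor, e.g. from keypoint tracking

def align_panel(user: UserEstimate,
                panel_height_cm: float = 40.0,
                min_bottom_cm: float = 80.0,
                max_top_cm: float = 220.0) -> tuple[float, float]:
    """Center the panel on the user's eye level, clamped to the display bounds.

    Returns (bottom_cm, top_cm) of the panel on the physical display.
    """
    bottom = user.eye_height_cm - panel_height_cm / 2
    # Clamp so the panel never leaves the usable display area.
    bottom = max(min_bottom_cm, min(bottom, max_top_cm - panel_height_cm))
    return bottom, bottom + panel_height_cm
```

For a user with an eye level of 180 cm the panel sits at 160–200 cm; a much shorter user gets the same panel clamped to the bottom of the usable area instead of below it.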
habits and abilities
A user's navigation habits give insight into their goals, priorities, and individual preferences. Their navigation ability may reflect slight differences in micro-interactions or gross differences in motor ability. How can adaptive interfaces adjust themselves to accelerate common tasks and accommodate individual differences in ability?
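One simple form of habit-based adaptation is reordering menu entries by usage frequency so that common tasks surface first. The sketch below is an illustrative assumption (the class and item names are invented for the example), not a description of a specific system.

```python
from collections import Counter

class AdaptiveMenu:
    """Menu that learns from use: most-selected entries float to the top."""

    def __init__(self, items):
        self._items = list(items)   # authored default order
        self._uses = Counter()      # per-item selection counts

    def select(self, item: str) -> None:
        """Record one use of an item (e.g. a tapped menu entry)."""
        self._uses[item] += 1

    def ordered(self):
        """Most-used items first; ties keep the original authored order."""
        return sorted(self._items,
                      key=lambda i: (-self._uses[i], self._items.index(i)))
```

A real deployment would temper this with stability (users also rely on spatial memory), but the counting core is this simple.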
context and surroundings
We are always on the move – surrounded by different people, navigating different social contexts, traveling within our towns, across countries, or around the world. Despite our ever-changing environments, our digital devices always remain the same. How might we incorporate situational and contextual awareness into our devices?
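Contextual adaptation can start very small: map a few sensed signals to a presentation mode. The signals, thresholds, and mode names below are assumptions chosen purely to make the idea concrete.

```python
def choose_ui_mode(ambient_lux: float, is_moving: bool, hour: int) -> str:
    """Pick a presentation mode from simple situational signals.

    ambient_lux: measured ambient light level
    is_moving:   whether the user is walking (e.g. from an accelerometer)
    hour:        local hour of day, 0-23
    """
    if is_moving:
        return "glanceable"   # larger targets, fewer elements while walking
    if ambient_lux < 10 or hour >= 22 or hour < 6:
        return "dark"         # dim surroundings or late hours
    return "standard"
```

Even this toy rule table already makes the device behave differently on a bright afternoon walk than on a couch at midnight.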
framework - magic mirror
We have developed a framework for adaptive user interface systems and an installation to demonstrate the current state of artificial perception. Our mirror was designed to playfully and intuitively communicate the present capabilities of machine perception to the general public and provoke critical discussion thereof. Equipped with facial recognition, keypoint tracking, and emotion recognition, the mirror could recognize past visitors, estimate a user’s age and gender, and display facial features and emotional cues in real-time.
A person positions themselves in front of the installation. The camera detects various characteristics of the person and compares them against its database to recognize returning visitors and to estimate the person's age and gender.
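The "compare against its database" step can be sketched as nearest-neighbor matching over face embeddings: a recognition model turns each face into a vector, and a new face is matched to the most similar stored vector. This is a hedged illustration of the general technique, not the mirror's actual pipeline; the function names, the threshold, and the toy vectors are assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(embedding, database, threshold=0.9):
    """Return the best-matching visitor's name, or None for a new visitor.

    database maps visitor names to stored embeddings; a match counts only
    if its similarity exceeds the threshold.
    """
    best_name, best_sim = None, threshold
    for name, known in database.items():
        sim = cosine(embedding, known)
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name
```

In practice the embeddings would come from a face-recognition network and the threshold would be tuned against false matches, but the database lookup itself is this comparison.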