The ADHAPT Project is a research initiative developed by the Creative Science and Arts Institute (CSAI), led by Dr. Angelo Dalli and Selina Scerri, in collaboration with the ADHD Foundation and researchers from the University of Malta.

The project has recently been awarded Research Innovation funding by Xjenza Malta (Grant No. REP 2025-081), supporting the development of a new approach to emotional regulation using artificial intelligence and sensory design.

ADHAPT focuses initially on individuals with ADHD, a condition that often includes challenges with emotional regulation, sensory overload, and maintaining focus during moments of stress.

Many people with ADHD experience sudden spikes in emotional intensity or sensory overwhelm. During these moments it can be difficult to regain calm or refocus attention. ADHAPT explores whether AI-generated visual and sensory environments can help support that process.

How ADHAPT works

When a user begins to feel overwhelmed or dysregulated, they interact with the system by speaking to the application.

The system analyzes vocal signals such as tone, rhythm, and emotional markers. Based on this information, the AI generates a personalised visual environment designed to reduce overstimulation and support emotional balance.

Instead of presenting generic relaxation exercises or meditation videos, the system dynamically responds to the user’s emotional state.

Each experience is therefore unique and adaptive, evolving based on the emotional signals detected in the user’s voice.
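The adaptive loop described above can be sketched in code. This is a minimal, hypothetical illustration only: the feature names, weights, and parameter mappings are invented for clarity and do not reflect the ADHAPT system's actual models.

```python
from dataclasses import dataclass

# Hypothetical sketch of the voice -> emotional state -> visuals pipeline.
# All features, weights, and thresholds below are illustrative placeholders.

@dataclass
class VocalFeatures:
    speech_rate: float    # syllables per second
    pitch_variance: float # normalised 0..1
    loudness: float       # normalised 0..1

def arousal_score(f: VocalFeatures) -> float:
    """Crude proxy for emotional arousal from prosodic features."""
    return min(1.0, 0.4 * f.loudness
                    + 0.3 * f.pitch_variance
                    + 0.3 * min(f.speech_rate / 6.0, 1.0))

def visual_parameters(arousal: float) -> dict:
    """Higher arousal -> slower motion, lower complexity, cooler palette,
    aiming to reduce overstimulation rather than add to it."""
    return {
        "palette": "cool" if arousal > 0.5 else "warm",
        "motion_speed": 1.0 - 0.7 * arousal,
        "visual_complexity": 1.0 - arousal,
    }

# A fast, loud, pitch-variable voice sample maps to a calming environment.
features = VocalFeatures(speech_rate=5.5, pitch_variance=0.8, loudness=0.9)
params = visual_parameters(arousal_score(features))
```

In a real system, the vocal features would come from a speech-analysis model and the parameters would drive a generative visual engine; the point here is only the shape of the mapping from detected state to environment.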

Science-informed sensory design

The ADHAPT system combines principles from psychology, visual perception, and sensory design.

Three core elements shape the environments generated by the AI:

Color and light balance

Carefully calibrated color palettes and light transitions aim to support physiological calming and reduce sensory stress.

Culturally familiar visual structures

Visual patterns inspired by local environments are used to enhance grounding and emotional recognition.

Minimal and semi-abstract imagery

The environments avoid complex visuals that may increase cognitive load. Instead, simple shapes and flowing forms create spaces that are visually engaging without becoming overwhelming.

The project is also exploring the role sound may play in emotional regulation.

Both rhythmic audio elements and ambient sound environments are being studied to understand how they interact with visual stimuli to support calming responses.

Research and future impact

ADHAPT is being developed as a research-driven system, with structured testing involving participants with ADHD.

The project will investigate how different combinations of color, motion, spatial composition, and sound affect participants' ability to recover from stress and emotional overload.
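Testing combinations of color, motion, spatial composition, and sound amounts to a factorial design. The sketch below simply enumerates candidate conditions; the specific levels are hypothetical placeholders, not the project's actual experimental conditions.

```python
from itertools import product

# Illustrative factorial design over the four stimulus dimensions
# named in the text. Levels are invented placeholders.
colors = ["cool", "warm"]
motions = ["slow", "still"]
compositions = ["open", "enclosed"]
sounds = ["ambient", "rhythmic", "none"]

# Every combination of levels: 2 * 2 * 2 * 3 = 24 candidate conditions.
conditions = list(product(colors, motions, compositions, sounds))
```

In practice, a study would likely sample or block these conditions rather than test all of them per participant, but enumerating the full design space is the usual starting point.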

While the first phase of the research focuses on ADHD, the long-term goal is broader.

The team aims to develop adaptive digital environments that support emotional wellbeing for a wide range of users, including people experiencing stress, anxiety, or cognitive fatigue.

Toward personalised digital wellbeing

Digital wellbeing tools are often based on generalised solutions that assume the same calming strategies work for everyone.

ADHAPT explores a different direction.

By combining artificial intelligence, visual science, cultural intelligence, and sound research, the project aims to move toward personalised emotional support systems that respond to individuals rather than offering one-size-fits-all experiences.

As the research progresses, the team hopes ADHAPT will contribute to a deeper understanding of how AI-generated sensory environments can support emotional regulation in real time.