
NeuroFocus XR

The project was part of my Master's thesis at the Royal College of Art, London.

Recently won an Honorable Mention at the Design Intelligence Award 2024.

NeuroFocus XR is a neurofeedback therapy for ADHD adults in Mixed Reality.

ADHD is a prevalent neurodevelopmental disorder that affects 5-10% of the world’s children and often persists into adulthood. These traits lead students to perform poorly academically, and it has also been observed that, when carried into adulthood, ADHD can lead to involvement in crime.

ADHD symptoms & Neurofeedback therapy

ADHD Brainwaves + Cognitive-behaviour therapy

Medication and Cognitive-behavioral therapy (CBT) are the primary treatment options for ADHD.

Psychopharmacological treatment, such as prescribing methylphenidate, is not always effective and may have serious side effects.

Unfortunately, these primary treatment options have potential limitations, such as medication side effects, lack of behavioral improvement, high costs, and major time commitments.

Moreover, people are hesitant to accept medication due to adverse side effects, which include loss of appetite, anxiety, insomnia, headaches, and irritability.

Somatic work, meditation, and mindful movement are some of the initial harmless treatment options for people with ADHD.

After neurofeedback therapy, patients report improvements in focus, emotional self-regulation, understanding of situations, social engagement, anger management, and more.

Underlying questions & Expert insights

To unpack the contributing factors, we need to look deeper and answer the underlying questions:

  • How do they operate?

  • How do they perceive their environment and the things around them?

  • What tasks do they do well, and for how long?

  • What’s their productivity like? Are they any different?

  • What if we liberated people with ADHD from their burdensome medication plans?

  • What if we let their creativity run wild and free?

  • How do you make these people realize their full potential?

Work style + Mindfulness

Expert’s insights into tackling such a critical condition

However, existing neurofeedback therapies are expensive and require long, consistent sessions.

The games are not engaging enough, the graphics are poor quality, sessions are appointment-based only, a neurofeedback therapist must be present throughout the therapy, and people have mixed feelings about them.

New, emerging approaches have the potential to change people’s mindsets towards both the therapy methodology and the technology through which it is delivered.

Introducing NeuroFocus XR

NeuroFocus XR is a neurofeedback therapy for ADHD adults in Mixed Reality. It empowers users with live brainwave feedback and visual stimuli, helping them self-regulate their emotions and actions. With real-time awareness of their brainwave patterns, users can choose tasks suited to their current state.

User experience, brainwaves & visual stimuli

The visual stimulus is placed in the semi-focused area of the human visual field, so the user can focus on their task while simultaneously staying aware of their brainwave patterns.

This way, users get live feedback when they become distracted or zone out, through changing color gradients at the edge of their vision that prompt them to switch to a task suited to their current brainwave state or to take a break.

Brainwave patterns + Semi-focused vision placement

The work dashboard gives an overview of people working in teams and assigns tasks according to their brainwave patterns, so that concentrating on a task does not require effort but comes naturally from their optimized brainwave state.

Working prototype & visual interaction

The final working prototype interacts with a biometric sensor to display colors according to the user’s brainwave state, making the experience individually optimized for them in Mixed Reality.

Because brainwave sensors are hard to find on the market and expensive, I prototyped the design using a heart-rate sensor, configuring it with a microcontroller and transmitting the data wirelessly to the visual-stimulus software.

With an ESP32 and a MAX30102 heart-rate sensor, I was able to send the heart-rate data to TouchDesigner and then drive the interaction in Mixed Reality via the Unity engine.
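As a rough sketch of how such a pipeline could be wired up (not the exact firmware from the project), the ESP32 might read the MAX30102 through the SparkFun MAX3010x Arduino library and stream beats per minute as plain UDP text that TouchDesigner can pick up. The Wi-Fi credentials, host IP, and port below are placeholders.

```cpp
// Illustrative ESP32 firmware sketch (Arduino framework). Assumes the SparkFun
// MAX3010x library; network details are placeholders, not the project's values.
#include <Wire.h>
#include <WiFi.h>
#include <WiFiUdp.h>
#include "MAX30105.h"   // SparkFun MAX3010x library (also drives the MAX30102)
#include "heartRate.h"  // beat-detection helper bundled with the same library

const char*     WIFI_SSID = "studio-wifi";        // placeholder
const char*     WIFI_PASS = "password";           // placeholder
const IPAddress HOST_IP(192, 168, 1, 10);         // placeholder: PC running TouchDesigner
const uint16_t  HOST_PORT = 7000;                 // placeholder: UDP listening port

MAX30105 sensor;
WiFiUDP  udp;
long     lastBeatMs = 0;
float    bpm        = 0.0f;

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  if (!sensor.begin(Wire, I2C_SPEED_FAST)) {
    Serial.println("MAX30102 not found");
    while (true) delay(1000);
  }
  sensor.setup();                     // default sampling / LED configuration
  sensor.setPulseAmplitudeRed(0x0A);  // dim red LED to show the sensor is running
}

void loop() {
  long irValue = sensor.getIR();      // raw IR reading from the pulse sensor
  if (checkForBeat(irValue)) {        // true once per detected heartbeat
    long now = millis();
    bpm = 60000.0f / (now - lastBeatMs);
    lastBeatMs = now;

    // Stream the latest BPM as plain text for the visual-stimulus software to parse.
    udp.beginPacket(HOST_IP, HOST_PORT);
    udp.printf("bpm %.1f", bpm);
    udp.endPacket();
  }
}
```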

Further, to make it hassle-free, I designed a small housing for the heart-rate sensor so the data could be captured from behind the right ear, which holds the auricular nerve pathway.

To communicate the classification of brainwaves and their related colors, I also designed a brainwave color scale that lets users identify their brainwave pattern from the colors that appear most frequently in their semi-focused vision.
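Purely as an illustration of how such a scale could be encoded, the sketch below maps a dominant EEG frequency to a display color. The band boundaries follow standard EEG conventions; the hex colors are placeholders rather than the project’s actual palette.

```cpp
// Illustrative brainwave color scale: maps a dominant EEG frequency (Hz) to the
// gradient color shown in the user's semi-focused vision. Colors are placeholders.
#include <iostream>

struct BandColor {
  const char* band;
  float minHz, maxHz;
  const char* hex;   // placeholder display color
};

const BandColor SCALE[] = {
  {"Delta (deep rest)",         0.5f,   4.0f, "#31308F"},
  {"Theta (drowsy, zoned out)", 4.0f,   8.0f, "#2D7DD2"},
  {"Alpha (relaxed, calm)",     8.0f,  12.0f, "#45CB85"},
  {"Beta (focused work)",      12.0f,  30.0f, "#F6AE2D"},
  {"Gamma (high arousal)",     30.0f, 100.0f, "#F26419"},
};

// Return the color for the band containing the given dominant frequency.
const char* colorFor(float hz) {
  for (const auto& b : SCALE)
    if (hz >= b.minHz && hz < b.maxHz) return b.hex;
  return "#FFFFFF";  // outside the scale: neutral
}

int main() {
  std::cout << "10 Hz -> " << colorFor(10.0f) << "\n";  // alpha band
}
```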

Prototyping Mixed Reality in Unity

While the initial setup went relatively smoothly, I encountered a few unexpected failures along the way, which I later overcame.

In one iteration, I implemented a feature where the color gradient moves in sync with the user’s head movements, functioning as a heads-up display. You can see it as the bluish-green gradient in the right image above.

This allowed the user to receive real-time feedback on their brainwave patterns, integrating the visual experience more directly with their physical actions.

The placement of the color gradient is well-executed, as it stays out of the user’s focused vision while still allowing them to notice the displayed color.

However, when it comes to testing this setup in a work environment, the current headset is too bulky and heavy for people to wear comfortably while working.

Although the pass-through quality has improved, it still falls short of allowing users to see clearly enough through the headset to read from their laptop screens and perform tasks effectively.

User testing & the future

I believe that as Mixed Reality technology advances in the coming years, more compact and lightweight glasses will become available.

At that point, we can conduct more rigorous user testing in real-world conditions and refine the design.