
Wearable device for Asperger’s Syndrome

 

Personal Project | Timeline: October 2021 to December 2021

01
What is Asperger’s Syndrome?

Asperger's syndrome is classified within the autism spectrum disorders (ASD), specifically on the milder end. Key indicators of Asperger's include:

Having difficulty with social interaction

Standing firm on opinions and beliefs

Focusing on rules and routines

Engaging in repetitive behaviour

02
Challenges

  • Hyperfocus - Many people develop an extreme focus on a narrow topic of interest. For children, that could be an all-consuming interest in things like train schedules or dinosaurs. This interest can fuel one-sided conversations with peers and adults.

  • Trouble recognizing social cues - People with Asperger’s might remain unaware of attempts to change the topic of conversation, which can be one reason why they have difficulties with social interactions. They may also have difficulty knowing when to lower their voices in certain locations.

  • Difficulty reading facial expressions or body language - Many autistic people have a hard time recognizing and understanding other people’s feelings. They might find body language difficult to interpret, avoid making eye contact, speak in a monotone, and display few facial expressions.

You should’ve put yourself in someone else’s shoes before doing that!

Why would I do that?? *surprised and disgusted*

Their shoes might be filthy.

People with Asperger’s often have a more literal thinking style, which can make it challenging to grasp idiomatic expressions, metaphors, and other non-literal language. As a result, they might take things too literally or struggle to understand the intended meaning behind figures of speech.

03
Empathy map

To create a shared understanding of user needs, advocate on their behalf, and aid decision-making, I tried to put myself in their shoes. By considering their emotions, thoughts, and experiences, I gained insight into what they truly need and desire.

[Empathy map]

04
User persona

Given that I was addressing the needs of two distinct user groups – individuals with Asperger's (Aspergian) and their caregivers or support network (neurotypical) – I developed user personas to gain a comprehensive understanding of their unique goals and challenges.

[Persona photo: Shawn]

ASPERGIAN
Shawn | 19 | Student

“I cannot relate to people or their emotions, and I lose a lot of friends as I seem to be indifferent. I often avoid eye contact to stay away from a potential conversation.”

About Shawn

Shawn, a college freshman, is brimming with enthusiasm as he embarks on his academic journey. His excitement is fueled by a strong desire to forge new friendships, engage in collaborative projects, and actively participate in the vibrant social and cultural scene on campus.

Goals
 

  • Make friends

  • Have a social life

  • Understand sarcasm

  • Be organized

  • Travel

Frustrations
 

  • Awkward with people

  • Apathetic

  • Can’t understand sarcasm

  • Obsessive

  • Hard to maintain relationships

[Persona photo: Marilyn]

PARENT
Marilyn | 44 | Baker

“I’m apprehensive about Shawn; I don’t know how he’s coping with his new environment. He’s inexpressive, and this worries me.”

About Marilyn

Marilyn, a dedicated baker and mother with a demanding workday, finds herself fully occupied at her bakery. Despite her busy schedule, she maintains a genuine concern for Shawn's well-being, with a constant desire to ensure his comfort and peace of mind.

Goals
 

  • Know how he’s feeling

  • Guide him

  • Make Shawn self-sufficient in the future

Frustrations
 

  • Mentally taxing

  • Decoding his feelings

  • Anxious about him in a new environment

  • No time for personal life

How
Might
We?

How can we support individuals with Asperger's syndrome in improving their decision-making skills, fostering meaningful relationships, and nurturing a stronger sense of empathy towards others?

05
Understanding the science behind emotions

A. How do emotions originate?

Neuroscience research in past decades has shown that emotions do not have ‘fingerprints’ in the brain: different networks in the brain can create the same emotion. Emotions are constructed by our brains; they are the way the brain gives meaning to bodily sensations based on past experiences. Different core networks all contribute, at different levels, to feelings such as happiness, surprise, sadness and anger.


B. How does our brain recognise emotions?

The almond-shaped amygdala attaches emotional significance to events and memories. Your brain uses your past experiences to combine information from your body, such as a pounding heart, with information from the world, like the fact that you’re waiting in a doctor’s office for test results, to construct an emotion, such as anxiety.


C. How does this help us?

While emotions are intangible and hard to describe, they serve important purposes, helping us learn, survive and initiate actions. To figure out how to respond, it is important to first decide if your emotions match the current situation. Emotional reactions can be helpful when they happen in the right situations with the right people.

06
How it works


A. Enhancement of Input Speech Data in SER (Speech Emotion Recognition)

The input data collected for emotion recognition is often corrupted by noise during the capturing phase. These impairments make feature extraction and classification less accurate, so enhancement of the input data is a critical step in emotion detection and recognition systems. In this preprocessing stage, emotional discrimination is preserved while speaker and recording variation is eliminated.
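The case study does not name a specific enhancement algorithm, so the following is a minimal sketch of one common approach in Python/NumPy: spectral subtraction followed by peak normalisation. The function name, frame sizes, and the assumption that the first quarter-second of each recording is noise-only are illustrative, not part of the original design.

```python
import numpy as np

def spectral_subtraction(signal, sr, noise_dur=0.25, frame_len=512, hop=256):
    """Rough noise reduction: estimate the noise spectrum from the first
    noise_dur seconds (assumed to be non-speech) and subtract it from every frame."""
    window = np.hanning(frame_len)

    def frames(x):
        n = 1 + max(0, (len(x) - frame_len) // hop)
        return np.stack([x[i * hop: i * hop + frame_len] * window for i in range(n)])

    # Average magnitude spectrum of the leading (noise-only) frames.
    noise_spec = np.abs(np.fft.rfft(frames(signal[: int(noise_dur * sr)]), axis=1)).mean(axis=0)

    # Subtract the noise estimate from each frame, clamp at zero, and overlap-add.
    out = np.zeros(len(signal))
    for i, frame in enumerate(frames(signal)):
        spec = np.fft.rfft(frame)
        mag = np.maximum(np.abs(spec) - noise_spec, 0.0)
        out[i * hop: i * hop + frame_len] += np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n=frame_len)

    # Peak-normalise to reduce recording-level variation between speakers.
    return out / (np.max(np.abs(out)) + 1e-9)
```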




B. Feature extraction and selection in SER

After enhancement, the speech signal is divided into meaningful units called segments, from which relevant features are extracted and classified into various categories. One type of classification is short-term classification, based on short-period characteristics such as energy, formants and pitch. The other is long-term classification; mean and standard deviation are two of the most often used long-term features. Among prosodic features, intensity, pitch, speaking rate and variance are usually important for identifying various types of emotions in the input speech signal. A few acoustic characteristics associated with emotions in speech are presented in Table 2.
 

[Table 2: Acoustic characteristics associated with emotions in speech]
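As a concrete illustration of the short-term/long-term split described above, here is a minimal Python sketch assuming the librosa library. The specific features (pitch via the YIN estimator, RMS energy, 13 MFCCs) and the summary statistics are illustrative choices, not the exact feature set of Table 2.

```python
import numpy as np
import librosa  # assumed audio-analysis library; any DSP toolkit would work

def extract_features(y, sr):
    """Short-term contours (pitch, energy, MFCCs) summarised into one
    long-term feature vector per utterance via mean and standard deviation."""
    pitch = librosa.yin(y, fmin=65, fmax=400, sr=sr)     # frame-wise fundamental frequency
    energy = librosa.feature.rms(y=y)[0]                 # frame-wise short-term energy
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # frame-wise spectral envelope

    feats = [np.mean(pitch), np.std(pitch), np.mean(energy), np.std(energy)]
    feats += list(np.mean(mfcc, axis=1)) + list(np.std(mfcc, axis=1))
    return np.array(feats)  # fixed-length vector, ready for a classifier
```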

C. Measures for Acoustics in SER

Often, a straightforward view of emotion is taken, in which emotions are assumed to exist as discrete categories. These discrete emotions sometimes have relatively clear relationships with acoustic parameters, as indicated in Table 2 for a subset of emotions. Intensity and pitch are often correlated with activation: intensity tends to increase with higher pitch and decrease with lower pitch. Factors that affect the mapping from acoustic variables to emotion include whether the speaker is acting, the degree of speaker variation, and the mood or personality of the individual.

According to the study, the fundamental emotions can be described as regions within the space defined by the axes of arousal and valence, as shown in Figure 2. Arousal represents the intensity of calmness or excitement, whereas valence represents how positive or negative the emotion is.

[Figure 2: Emotions mapped in the arousal–valence space]
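To make the quadrant reading of Figure 2 concrete, here is a tiny illustrative helper, not taken from the original project, that maps a point in the arousal–valence plane to a coarse emotion family; a real system would use finer-grained regions and confidence scores.

```python
def quadrant(arousal: float, valence: float) -> str:
    """Coarse emotion family for a point in the arousal-valence plane,
    with both coordinates assumed to lie in [-1, 1]."""
    if arousal >= 0:
        return "happy / excited" if valence >= 0 else "angry / fearful"
    return "calm / content" if valence >= 0 else "sad / bored"

# e.g. quadrant(0.7, -0.5) -> "angry / fearful" (high arousal, negative valence)
```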

D. Classification of Features in SER

Pattern recognition classifiers used for SER can broadly be categorized into two main types: linear classifiers and non-linear classifiers. Linear classifiers (e.g., the Bayes and k-nearest-neighbour classifiers) perform classification based on object features arranged linearly; these objects are usually represented as an array termed a feature vector. In contrast, non-linear classifiers (e.g., GMM and HMM classifiers) characterize objects by forming a non-linear weighted combination of them.
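Below is a minimal sketch of the two classifier families mentioned above, assuming scikit-learn and the per-utterance feature vectors from the earlier sketch. The function names, k = 5, and the 4-component diagonal GMMs are illustrative defaults, and the HMM case is omitted.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.mixture import GaussianMixture

def train_knn(X, y, k=5):
    """k-nearest-neighbour classifier on per-utterance feature vectors
    (X: n_utterances x n_features, y: emotion labels)."""
    return KNeighborsClassifier(n_neighbors=k).fit(X, y)

def train_gmms(X, y, n_components=4):
    """Non-linear alternative: fit one Gaussian mixture model per emotion class."""
    X, y = np.asarray(X), np.asarray(y)
    return {label: GaussianMixture(n_components=n_components,
                                   covariance_type="diag",
                                   random_state=0).fit(X[y == label])
            for label in np.unique(y)}

def predict_gmm(models, x):
    """Pick the emotion whose GMM assigns the highest log-likelihood to x."""
    return max(models, key=lambda label: models[label].score(x.reshape(1, -1)))
```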

07
Ideation

[Quick retrospective / ideation board]

08
Proposed solution

A wristwatch, paired with a mobile app, that detects emotional cues and guides the wearer to identify and respond to different emotional situations, be aware of their surroundings, and develop better social relationships.

[Product mockups: wristwatch concept and animated companion-app screens]
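The case study does not spell out how these pieces would fit together on the device, so the following is a purely hypothetical glue-code sketch that chains the earlier sketches (enhancement, feature extraction, classification) into a detect-and-prompt loop. The emotion labels, the prompting rule, and the caregiver notification are all invented for illustration.

```python
def on_audio_window(y, sr, knn_model):
    """Hypothetical per-window loop on the wristwatch: enhance the captured
    audio, extract features, classify the speaker's emotion, and decide whether
    to prompt the wearer or notify the caregiver app."""
    clean = spectral_subtraction(y, sr)                   # enhancement sketch (06-A)
    feats = extract_features(clean, sr)                   # feature-extraction sketch (06-B)
    emotion = knn_model.predict(feats.reshape(1, -1))[0]  # classifier sketch (06-D)

    if emotion in ("angry", "fearful", "sad"):            # invented rule, for illustration only
        return {"vibrate": True,
                "hint": f"The speaker sounds {emotion}.",
                "notify_caregiver": emotion == "fearful"}
    return {"vibrate": False}
```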

09
Future roadmap

The project’s aim was to serve as an assistive device that enhances social interaction and helps build lasting relationships with peers. Identifying the right user group, defining clear test objectives, and conducting usability studies would allow us to accurately assess the effectiveness and feasibility of the device and make the product better suited to its target users. Although the logic behind the device’s working was explored, the physical design needs to be thought through in greater depth.

Let's
connect! 

If you liked what you saw and want to get in touch, view my resume or just talk about design and art

R E A C H   M E   A T

  • LinkedIn
  • Instagram

© 2023 | Made with ♥ by Anjana Menon
