Empathy 2.0 series: How biometrics can help you understand your customers

In the first part of our series exploring what’s next in empathy and design-thinking methods, we look at how the latest cognitive research technologies offer new insights into users’ minds.

How people make decisions – from what they’ll eat for breakfast to the phone they’ll buy – is a complex blend of emotion and reason, a psychological code that, until now, no single technology has managed to crack. New biometric technologies can link people’s emotions to physical responses in their bodies, providing data and insights that identify exactly what drives people to feel the way they do.

From a design thinking perspective, biometrics and other cognitive technologies solve parts of a bigger puzzle and open a door to further innovation. We see two patterns emerging in the empathy stage: tech that enables workflow automation and tech that augments intelligence (or insights). Here we’ll dive into the latest advances in technologies that augment intelligence and explore how they might just change the way you think about customers.

How tech can supercharge Design Thinking


Augmented intelligence solutions

As state-of-the-art biometric and cognitive research technologies become more user-friendly and less expensive, we’re realizing the indispensable role they can play in user and customer research.

Viewed as an augmented intelligence answer to a lack of data or a deficit of data-analysis power, new biometric technologies stand out in several arenas: they can gather new kinds of data and, through advanced software, analyze existing data to provide further insights.

Where biometrics can change the game

Academic research: Provide insights into people’s decision-making, social behavior, risk-taking and gambling; reading and comprehension levels; memory and learning processes.

Neuromarketing: Offer measurements of people’s emotions and valence (positive vs. negative sentiment), attention and arousal.

Human-technology interaction (including human-computer, human-robot and general UX): Identify the thinking and emotions driving user satisfaction, flaws in workflow or navigation, as well as interface testing.

Healthcare: Achieve a wider scope of solutions-based research into autism, obsessive-compulsive disorder, depression, anxiety, Alzheimer’s and Parkinson’s, dementia and other conditions; assessment of treatment effects; biofeedback for performance improvement.

Virtual reality: Harness the ability to measure users’ depth of immersion and identify nausea triggers.

The new state of biometric technologies

A leap in sensor accuracy and availability, paired with falling costs, has opened the door to tech startups with innovative, business-minded ideas. Companies such as iMotions, Vempathy, isobar, audEERING and HireVue have harnessed sensor technology and new data mining methods to measure, evaluate and analyze bodily functions and parameters, matching the gathered data to individuals’ emotional states.

This technology can precisely identify emotions by measuring everything from eye movement, facial expressions and heart rate to muscle tone, sweating, vocalizations and even brainwaves. Today, biometric technology has become a powerful, consumer-level tool for understanding customers in order to create experiences that resonate with them authentically and emotionally.

4 Biometric tools for business growth

As tech startups harness biometric technologies at a consumer level, biometric data and analysis are set to become a significant cross-sector business tool, with implications in research and development, user experience and beyond. Biometrics’ reliable, accurate information about what motivates customers and users, from their base impulses to their calculated decisions, is relevant to a range of business cases and applications.

1. Enhanced customer conversations

As more companies interact with their customers entirely over digital and voice channels, the realm of voice analytics has expanded to meet companies’ needs. An increase in voice-controlled smart-home devices and other applications has also added valuable training data for these analytics, so they can more accurately assess both the content of spoken words and the voice’s non-verbal characteristics.

Beyond searching for keywords that trigger certain customer-service routines and classify customers, biometric analysis looks at non-verbal characteristics, such as the tone and rhythm of the voice, allowing for deeper insights into the emotional and motivational aspects of customer experience. Companies can also learn about customers’ motivations and impulses through biometric tools equipped with sophisticated machine learning algorithms and a multitude of parameters, all using standard smartphone and laptop technology.
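To make the idea of non-verbal voice features concrete, here is a minimal pure-Python sketch (not audEERING’s actual method, and far simpler than any production system) that computes two classic prosodic proxies per audio frame: RMS energy as a loudness measure and zero-crossing rate as a crude voicing/pitch indicator. The synthetic signal standing in for a recording is invented for illustration.

```python
import math

def frame_features(samples, frame_len=400):
    """Split a mono audio signal into fixed-length frames and compute two
    simple prosodic features per frame: RMS energy (loudness proxy) and
    zero-crossing rate (a rough pitch/voicing proxy)."""
    features = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        rms = math.sqrt(sum(x * x for x in frame) / frame_len)
        zcr = sum(
            1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0)
        ) / frame_len
        features.append({"rms": rms, "zcr": zcr})
    return features

# Synthetic 1-second "recording" at 8 kHz: a quiet low tone followed by a
# loud high tone, standing in for a calm vs. agitated voice segment.
rate = 8000
quiet = [0.1 * math.sin(2 * math.pi * 120 * t / rate) for t in range(rate // 2)]
loud = [0.8 * math.sin(2 * math.pi * 400 * t / rate) for t in range(rate // 2)]
feats = frame_features(quiet + loud)
print(feats[0])   # low energy, low zero-crossing rate
print(feats[-1])  # higher energy and higher zero-crossing rate
```

Real voice analytics stacks extract dozens of such low-level descriptors per frame and feed them to machine learning models; the point here is only that emotional cues live in these signal properties rather than in the words themselves.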

The startup audEERING developed sensAI technology to detect emotions from recorded audio signals, distinguishing differences between over 50 emotions and accurately predicting gender, age, fitness and other factors. In corporate use, this technology has identified phrases connected to a significant decrease in conversion in phone sales calls and has predicted customers with the highest risk of closing their accounts. These insights then guided companies’ focus on customer retention activities.

Even artists are dreaming up new opportunities using voice technology. In her artist residency at Nokia Bell Labs, fashiontech powerhouse Jasna Rokegem is using voice to detect emotions, and she goes one step further, building prototypes that translate emotions into haptics.

Figure 1: Interview lab setup (source: https://imotions.com/neuromarketing)

2. Tracking customer behaviour

Used alone or in combination, biometrics technologies add value to data gathering and analysis of customer feedback. As a customer research tool, these sensors are able to provide deeper and more accurate insight into customers’ motivations.

iMotions has shown how data from different sensors can provide a variety of useful information for marketing researchers, customer experience strategists and others invested in linking positive user outcomes to company growth. As seen in the image below, this technology combines three different technologies—a measurement of galvanic skin response, eye tracking and facial expression recognition—to determine if a user is feeling positively or negatively (emotional valence), how intensely they’re feeling that sentiment (arousal), where they’re looking (attention) and the emotions they’re expressing at a given moment.

When used alongside other data sources this “neuro data” can help strategists track user responses to products and services and create more personalized customer experiences.
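To illustrate what fusing these “neuro data” streams into one timeline might look like, here is a toy sketch with made-up, pre-normalized inputs; platforms like iMotions do far more sophisticated synchronization and modeling, so treat the field names and scales as assumptions for illustration only.

```python
def fuse_readings(gsr, face_valence, gaze_on_target):
    """Combine three hypothetical, pre-normalized sensor streams into a
    per-moment summary: arousal from galvanic skin response (0..1),
    valence from facial expression recognition (-1..1), and attention as
    the fraction of recent samples where gaze was on the target."""
    summary = []
    window = 3  # smoothing window for attention, in samples
    for i in range(len(gsr)):
        recent = gaze_on_target[max(0, i - window + 1): i + 1]
        summary.append({
            "arousal": gsr[i],
            "valence": face_valence[i],
            "attention": sum(recent) / len(recent),
        })
    return summary

# Toy session: arousal rises while the user smiles and looks at the product.
gsr = [0.2, 0.3, 0.6, 0.8]
face = [0.0, 0.1, 0.5, 0.7]
gaze = [0, 1, 1, 1]
for moment in fuse_readings(gsr, face, gaze):
    print(moment)
```

The value of the fused timeline is that each dimension disambiguates the others: high arousal alone could mean delight or frustration, but paired with valence and attention it becomes an interpretable moment in the customer experience.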

Figure 2: Determining valence, arousal and attention by eye tracking, facial expression recognition and measurement of galvanic skin response (source: https://imotions.com/neuromarketing)

3. Unbiased prototype validation

Validating early prototypes with customers is key to confirming that a solution truly works. Usability of mobile apps and websites has also proven to be one of the most important factors in user acquisition and retention. A systematic approach to gathering valuable user feedback should include a variety of techniques, including biometric technologies, that test for users’ attention, their emotional responses (particularly when things go wrong, such as error messages or loading delays) and their ease of use with regard to hand movements.

iMotions uses eye tracking and facial expression recognition for mobile device testing, and assesses emotions, arousal and cognitive load of website users through eye tracking, galvanic skin response and EEG. iMotions also recently integrated Emotiv’s EEG brainwave tracking solutions into its suite of biometric analyses. Meanwhile, Vempathy has recognized the importance of prototype and UX testing by specializing in it, analyzing audio and video recordings as well as textual input such as user surveys.

Figure 3: Mobile lab setup to evaluate real life or simulated environments (source: https://imotions.com/research-lab/)

4. Validating offline experiences

Biometrics innovation extends to offline experiences, such as customers’ experiences in immersive physical prototypes. Increasingly sophisticated and discreet sensor and camera technology can now capture consumers’ emotions and impulses.

iMotions, for example, has conducted studies in a retail environment using eye tracking hardware from a variety of technology suppliers. Eye tracking offers insights into both the emotional valence of experience and decision-making processes, enabling store owners to more strategically present their products within shoppers’ sight lines. In most cases, eye tracking represents one piece of a bigger customer puzzle. A customer could be looking at a product for a long time because they are attracted to it and considering buying it or because they are confused or even disgusted by it. To complete the data set and come to an accurate conclusion, eye tracking can be combined with EEG, GSR and/or facial expression recognition, together offering insight into attention, arousal and emotional valence of the customer at a particular point in time.
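The attraction-vs-confusion ambiguity described above can be expressed as a simple rule-of-thumb classifier. The thresholds, labels and function below are invented for illustration, a sketch of the reasoning rather than any vendor’s actual model.

```python
def interpret_dwell(dwell_seconds, valence, arousal):
    """Hypothetical rule of thumb for the retail example: a long gaze alone
    is ambiguous, so combine it with valence (from facial expression) and
    arousal (from GSR/EEG) to guess why the shopper kept looking.
    valence is in -1..1, arousal in 0..1; thresholds are illustrative."""
    if dwell_seconds < 2.0:
        return "passing glance"
    if arousal < 0.3:
        return "long look, low engagement - gather more data"
    if valence > 0.3:
        return "interested - considering purchase"
    if valence < -0.3:
        return "confused or put off"
    return "engaged but ambiguous"

print(interpret_dwell(5.0, 0.6, 0.7))   # interested - considering purchase
print(interpret_dwell(5.0, -0.5, 0.7))  # confused or put off
print(interpret_dwell(1.0, 0.0, 0.2))   # passing glance
```

Even this crude sketch shows why single-sensor conclusions mislead: the same five-second dwell time maps to opposite business actions depending on the accompanying valence signal.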

Move forward and grow using biometrics data

The way today’s business decisions are made reflects the general way humans make decisions: with a combination of lived experience, gut feeling and, at the best of times, methodical data. Sophisticated biometric technologies give human-centered designers a powerful combination of the three, combining the best of the human mind with technology’s accuracy and insights. Incorporating technologies based in empirical science into your process lets you reduce the guesswork in understanding people’s motivations and decision-making processes so you can move forward, ideate and grow with more certainty than ever.

Empathy 2.0: Are we there yet?

Well… We tried some of these tools, to see whether they would live up to (or surpass) their claims.

Test 1: Vempathy emotion analysis

Vempathy provides a UX testing platform that records the UX test and also offers emotion detection via webcam and AI analysis. The bold claim: what if we could distill emotion the way we categorize a biomarker? That would supercharge empathy, flatten the learning curve for gaining it (because the non-verbal language is observed for you) and save lots of time in the post-processing and interpretation of interviews and user tests.

We took Vempathy for a test run: we signed up for the free trial of the tool, which you can access immediately. We set up a test by entering the website we wanted to test (in this case Innovationstack.io), sent it to a few users, and this was the result.

A confused series of detected emotions: from a baseline of ‘neutral face’, we’re taken on a rollercoaster of emotions, from screaming to winking to plain disappointment.
As you’ll notice in the video, Vempathy’s interpretation differs quite a bit from our human interpretation. I certainly didn’t see the tester wink or scream; I just saw a man giving his neutral, slightly unenthused and genuine feedback.

Test 2: gathering high quality video & audio for emotion analysis

We thought the problem might be the webcam (which Vempathy was designed to work with), so we started looking into other software that distills emotions from higher-quality video and audio (such as audEERING’s). To test these, we first needed to accumulate some video and audio data, so with our client’s permission we recorded some interviews using a tripod camera. After taping about five of them, we noticed our fear was real: the setup is intimidating. The tripod camera continuously screams “You are being recorded.” As expected, it influenced the quality of the interviews, as people were more aware of what they said and what their body language conveyed on tape. We carried out the next interviews without the tripod and didn’t bother analyzing the previously recorded ones, since the bias induced by the tripod camera would affect their quality. We might set up a new trial with less intrusive methods: audio recording of live interviews and screen recording of video chat interviews.

Test 3: gathering EEG data

The EEG headsets made by Emotiv and Muse have proven able to measure brainwave activity and capture neurodata. We have tested some of these at events like SXSW, and while the neurodata seems quite accurate, our ‘bias allergy’ chimed in again: we feared the ‘guinea pig’ feeling would take over when wearing the crazy-looking headsets. And so it did.

We could have set up an interview with someone wearing an EEG headset, as discussed earlier in this article, but we’re still looking for a less invasive way to collect the same data. Perhaps daily power users would have less of a guinea-pig feeling? If a company like Emotiv gave access to testing with its power users, that might be a more valid testing opportunity.


Stay up to date with InnoTech tools​

We only touch on a handful of examples in this article. Find 50+ tools on innovationstack.io

What is your experience with using tech to supercharge Design Thinking? Let us know!

Thanks!

I’m Pablo Juarez, Entrepreneur-in-residence Tech Lab (Board of Innovation). Spreading innovation culture is in our DNA – if you liked the read, contribute to our mission by sharing this article.
