July 15, 2023

Unlocking the Future of UX with Biometric Research

The trajectory of innovation is never static; it curves and flexes, adapting to the nuances of our growing technological capability. As the era of digitization powers on, one tool is proving to be increasingly crucial in shaping the next generation of products, services, and processes: biometrics. In the intricate tapestry of user experience (UX) design, a myriad of sensing technologies can reveal the underlying threads that tie users to their spatial experiences. And it's micro-research that holds the magnifying glass to these threads.

Within this selection of sensing technologies, two major types help illuminate how users interact with their environments: biometric sensors and spatial sensors. Biometric sensors, including electrodermal activity (EDA), electroencephalography (EEG), and eye tracking, reveal users' automatic physiological responses. Sensor technologies measuring spatial experience include both environmental and spatial sensors.

Measuring spatial experience

Spatial sensors are pivotal in deciphering the intricacies of spatial experiences by supplying data that augments our comprehension of human interactions in various settings. For instance, visual sensors record and assess individuals' motions and engagements, detecting behavioral patterns and responses to environmental elements. Audio sensors record ambient sounds, aiding researchers in discerning the auditory cues that shape users' sentiments and perceptions. Sensors measuring temperature and humidity offer a glimpse into a space's thermal comfort, which influences occupants' overall well-being. These are only some of the many sensors that can be used to measure our spatial experience. By integrating data from them, designers can examine the multifaceted nature of spatial experiences comprehensively, make informed design decisions, and optimize the experience for various purposes.
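As a rough illustration of how data from several of these sensors might be brought together for analysis, the sketch below aligns hypothetical timestamped logs onto a single timeline in Python. The file and column names are assumptions for demonstration, not part of any specific toolkit.

```python
import pandas as pd

# Hypothetical timestamped logs from three spatial/environmental sensors.
audio = pd.read_csv("audio_levels.csv", parse_dates=["timestamp"])     # sound level (dB)
climate = pd.read_csv("temp_humidity.csv", parse_dates=["timestamp"])  # deg C, %RH
motion = pd.read_csv("visual_motion.csv", parse_dates=["timestamp"])   # motion score

# Align every stream to a common timeline with nearest-neighbour matching,
# tolerating up to one second of clock drift between sensors.
merged = audio.sort_values("timestamp")
for other in (climate, motion):
    merged = pd.merge_asof(
        merged,
        other.sort_values("timestamp"),
        on="timestamp",
        tolerance=pd.Timedelta("1s"),
        direction="nearest",
    )

# Each row now pairs sound level, temperature, humidity and movement at the
# same moment, ready for correlation with observed behaviour.
print(merged.head())
```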

Measuring users’ physiological responses

Biometric research is a specialized form of UX research that drills down into the intricate details of user interaction with technology. While macro and middle-range research provide strategic frameworks and an understanding of product development, respectively, micro-research delves into the specifics of technical usability and minute interaction points. It navigates the granular sea of users' emotional responses, physiological signals, and behaviors, bringing to the surface insights that are often invisible to traditional research methodologies. Some examples of biometric sensors include:

EDA

One potent tool in the micro-research arsenal is EDA. This biometric method tracks the changes in skin conductance linked to emotional arousal. By quantifying these responses, businesses can gain an in-depth understanding of how users react emotionally to their products, services, or processes. This can uncover invaluable data, allowing organizations to fine-tune their offerings and craft experiences that resonate more deeply with their target audience.
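As a minimal sketch of how such responses might be quantified, the snippet below counts skin conductance response (SCR) peaks in a raw EDA trace. The filter cutoff, amplitude threshold, and sampling rate are illustrative assumptions rather than standards.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def count_scrs(eda, fs=4.0, min_amplitude=0.05):
    """Count skin conductance responses (SCRs) in an EDA trace (microsiemens)."""
    # Split the slow tonic level from fast phasic activity with a low-pass
    # filter; 0.05 Hz is a common but not universal cutoff.
    b, a = butter(2, 0.05 / (fs / 2), btype="low")
    phasic = eda - filtfilt(b, a, eda)

    # Peaks in the phasic signal above the amplitude threshold are treated
    # as individual arousal events, at most one per second.
    peaks, _ = find_peaks(phasic, height=min_amplitude, distance=int(fs))
    return len(peaks)

# Synthetic example: a steady baseline with two brief bursts of arousal.
fs = 4.0
t = np.arange(0, 60, 1 / fs)
eda = 2.0 + 0.3 * np.exp(-((t - 20) ** 2) / 2) + 0.4 * np.exp(-((t - 45) ** 2) / 2)
print(count_scrs(eda, fs))  # should report the two bursts
```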

EEG

Similarly, EEG, which measures brain wave activity, provides a unique window into a user's cognitive processes. It provides real-time feedback on a user's engagement, concentration, and stress levels when interacting with a product or service. By leveraging this biometric data, businesses can understand their customers on a much more intimate level, identifying potential pain points and opportunities for improvement that would otherwise go unnoticed.
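A minimal sketch of how this kind of signal might be summarised, assuming a single EEG channel sampled at 256 Hz. The beta / (alpha + theta) ratio used here is one commonly cited engagement index, not the only possibility.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Mean power spectral density of `signal` in the [low, high) Hz band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * int(fs)))
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def engagement_index(eeg, fs=256.0):
    """beta / (alpha + theta): one common proxy for engagement and attention."""
    theta = band_power(eeg, fs, 4, 8)
    alpha = band_power(eeg, fs, 8, 13)
    beta = band_power(eeg, fs, 13, 30)
    return beta / (alpha + theta)

# Synthetic example: a strong 10 Hz alpha rhythm plus noise, which should
# yield a low engagement score (alpha dominates beta).
fs = 256.0
t = np.arange(0, 10, 1 / fs)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + 2 * np.random.randn(len(t))
print(engagement_index(eeg, fs))
```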

Eye Tracking

Eye-tracking technologies, another facet of micro-research, not only unveil how users visually interact with products or interfaces, but also can be used to pick out user arousal through pupil dilation triggered by the autonomic nervous system. They reveal when users are stressed or excited, as well as which elements draw the user's attention and which are overlooked, informing design decisions that can drastically improve usability and engagement. In a world saturated with visual information, understanding and optimizing for users' visual attention can be a game-changer.
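As a minimal illustration of the pupil-dilation side, the sketch below computes a baseline-corrected dilation around a stimulus onset. The window lengths and the synthetic trace are assumptions for demonstration only.

```python
import numpy as np

def pupil_dilation(pupil, fs, onset_s, baseline_s=1.0, window_s=3.0):
    """Mean baseline-corrected change in pupil diameter after a stimulus.

    `pupil` is a 1-D array of diameters (e.g. mm) sampled at `fs` Hz; the
    second before `onset_s` serves as the baseline.
    """
    onset = int(onset_s * fs)
    baseline = pupil[onset - int(baseline_s * fs):onset].mean()
    window = pupil[onset:onset + int(window_s * fs)]
    return float((window - baseline).mean())

# Synthetic trace: diameter ramps up ~0.2 mm shortly after a stimulus at t = 5 s,
# which a positive dilation score would flag as a likely arousal response.
fs = 60
t = np.arange(0, 10, 1 / fs)
pupil = 3.0 + 0.2 / (1 + np.exp(-(t - 5.5) * 4)) + 0.02 * np.random.randn(len(t))
print(round(pupil_dilation(pupil, fs, onset_s=5.0), 3))
```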

Technologies that measure physiological signs of excitement or stress.

Examples of biometric data and their relevance to research topics:

Electrodermal activity (EDA): used as a strong indicator of heightened physiological activity correlated with heightened emotional states.
Electroencephalography (EEG): used as a strong indicator of cognitive load, focus and attention, and confidence level in decision making.
Eye tracking: pupil dilation in response to emotional stimuli, a strong predictor of stress and excitement.

Examples of user-centered activity data and their relevance to research topics:

Accelerometer (ACC): shows the amount and speed of bodily movement.
Global Positioning System (GPS): shows exact geographical location, which can be overlaid on maps and plans.
Audiovisual (AV): shows a first-person view of what was visible to the research participant and can pick up activity and human interactions.
Eye tracking: shows how users visually interact with products or interfaces and can reveal which elements draw significant attention and which are overlooked.

Examples of environmental and spatial data, with relevant characteristics and validation studies:

Temperature, pressure, and humidity: proven indicators of environmental comfort, productivity and efficiency, functional comfort, physical comfort, and psychological comfort.
Analog gas sensor and ADS1015 analog-to-digital converter (ADC).
Light and proximity: LUX levels can indicate the transition from indoor to outdoor spaces, proximity to windows, and the availability of natural light; also shows the proximity of other objects or people to the user.
Microphone (audio): audio levels in decibels, audio sources, frequency, and pitch.
Particulate matter (PM) sensor: indicator of pollution, visibility, and building ventilation.
Visual data (camera): shows a first-person view of what was visible to the research participant, including spatial depth, object/human movement, visible color, spatial information, and audio information.
Spatial depth (LiDAR): shows mapped structures, spatial characteristics, configurations, movement of objects, and more.

The quantitative data types above, collected by this methodology, and their relevance are expanded from previous research by He et al.

Harnessing the Power of Biometric Research

These biometric tools, wielded within the context of UX research, empower businesses to go beyond traditional metrics and benchmarks. They provide a robust, nuanced understanding of user experiences that transcends what users say or think: they tap into what users feel, through automatic physiological responses to interactions and impressions, and what users do, through the patterns that emerge across the user journey. Fine-grained micro-level research essentially bridges the gap between the human and the digital, enriching business strategies with human-centric data. By integrating biometric methods into traditional UX research approaches, businesses can gain insights that drive innovation, improve customer satisfaction, and ultimately bolster the bottom line. The power of biometrics, specifically within the context of micro-level research, isn't just the future of UX research - it's the future of business.

As we navigate the vast landscape of technological innovation, we can use biometric data to help businesses align their strategy, product development, and fine-grained usability with the needs and desires of their users. It's through research at these scales that we can truly unlock the power of user experience, propelling our businesses towards a future designed with the user at its heart.

The devil is in the details. By tapping into the potential of fine-grained, in-depth, micro-level research, businesses can ensure they are not just designing for users, but are designing with them.

August 22, 2022

How will we live in the future?

The design of everything affects how we live now and how we will live in the future; it revolves around our society, our cities, our lifestyles, and our personal preferences. In a world where technology is constantly evolving, it's hard to imagine what life will be like in the future. But the unknown shouldn't stop us from wondering how we will live. We should go beyond thinking about what we want in the future and try to understand what future generations would want in their world. We cannot rely solely on the systems that have sustained us for generations. Instead, we should think of speculative futures, where we live in a world with technology and gadgets that don't exist yet. (If you are interested in the future and speculative design, you should read Speculative Everything by Anthony Dunne and Fiona Raby.)

As our society, cities, and lifestyles become more complex, the design of everything around us has to take these changes into account. Our personal preferences are also increasingly important in the way things are designed, whether it's the clothes we wear, the cars we drive, or the homes we live in. The future technology that serves our everyday needs cannot only be about comfort and efficiency. Designers will need to go above and beyond to think about how technology can be responsive and create delightful experiences that make our lives more enjoyable.

So whatever the future holds, one thing is certain: design will continue to play a vital role in our lives.

Questions about the future

A telemedicine scene from Hanna-Barbera’s 1962 animated sitcom, The Jetsons.

More than 50 years ago, The Jetsons featured a telemedicine scene that we are now so familiar with. Back then, it was hard to imagine how our hospital visits could change so drastically. Similarly, future generations will live in a world far different from the one we know today. Questions we should ask ourselves include:

  1. How will our lives change, and what are the implications for our interactions with technology and the built environment?
  2. What types of technologies now will impact how we live in the future? And how will these affect the development of future technologies?
  3. Why do we need technology in the future? Will these reasons be different from why we need technology today?

As we think about what we want for our future, we should also try to understand what future generations would want in their world. How will they live? Will they have the same occupations and work schedules? What do they do for fun?

Thinking about speculative futures can help us break free from our dependency on possibly unsustainable systems. It can help us imagine different ways of living and being in the world. It can also help us create new systems that are more equitable, inclusive, and just. (If you are interested in Inclusive Design, check out our project with Microsoft.)

Dive in:

  1. Future Work
  2. Future Living
  3. Future Mobility

November 14, 2007

#tbt Interactive Touch Screen with Z-Depth

Let's start off the conversation about interactive touch screens with z-depth. These experiments are from the INVIVIA blog in 2007. What was it like to design touch screens back then? And what exactly is Z-Depth?

Read the rest:

  1. Interactive Touch Screen with Z-Depth
  2. Curved Touch Screen
  3. Touch Screen + LED
  4. Touch Screen Instruments
  5. Touch Screen V5

July 16, 2007, Z-Depth Touch Screen // Ron

Here is a very crude, very quickly made partial demo of stereo on the touch screen. It will enable the hand to be tracked near the screen. (I didn’t have time to get the math right for this camera orientation, so I am not generating the right z-value, which we can map to size, or whatever…) What you see here is that putting stereo cameras off axis behind the screen can generate hand position…

-Ron
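For readers curious about the geometry behind Ron's stereo setup, here is a rough modern sketch (not the original 2007 code): with a rectified camera pair, depth follows from disparity as z = f * B / d, and the hand's position can be recovered from a single matched point. All numbers below are illustrative.

```python
def triangulate_hand(x_left, x_right, y, focal_px, baseline_m):
    """Rough 3-D position of a matched point seen by a rectified stereo pair.

    x_left / x_right are horizontal pixel coordinates measured from each
    image centre, y the vertical coordinate, focal_px the focal length in
    pixels, baseline_m the distance between the cameras in metres.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must lie in front of both cameras")
    z = focal_px * baseline_m / disparity  # depth: closer hand -> larger disparity
    x = x_left * z / focal_px              # sideways offset from the left camera
    y = y * z / focal_px                   # vertical offset
    return x, y, z

# Example: a 168-pixel disparity with a 700 px focal length and a 12 cm
# baseline places the hand roughly half a metre from the cameras.
print(triangulate_hand(x_left=200, x_right=32, y=-40, focal_px=700, baseline_m=0.12))
```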

Nov 14, 2007, Z-Depth Touch Table Proposal

Recent INVIVIA brainstorms and discussions have pointed to the desirability of adding Z-Depth to the Touch Table idea. The least obtrusive way to do this would be to illuminate the hand from below the screen and to use stereo cameras situated there as well. I’ve made numerous tests of this approach using different screen materials and illuminator positions.

What I’ve learned is that the ideal screen material would be a great diffuser of the white projector light and yet be transparent to the IR light illuminating the hand and provide a clear view for the IR camera. I have not found such a material. What I find instead is that a terrific screen material is almost opaque to the IR source and IR camera, and a material that functions well for the IR camera is a terrible projector screen.

To illustrate the problem I used frosted mylar, the closest material I could find to fulfilling both criteria: it lets the cameras see through somewhat while still providing an adequate projector screen.

So, here’s what the camera saw with the hand touching, looking at the underside of the frosted mylar illuminated by a couple of rows of IR LEDs (MS Surface works something like this, I assume).

Z-depth hand illumination

When the hand is slowly pulled away from the mylar (here at 1-inch increments), the diffusion of the image (very good for the projection) blurs the hand so that at 4 inches away it is no longer recognizable, not by human eyes and certainly not by a video camera.

Z-depth camera view

If all we needed was 2 inches of hand travel we could probably make a frosted mylar screen work.

Looking back at the Z-Depth stereo demo from last year is instructive. It shows that you need at least 8-10 inches of depth of movement in order not to severely constrain the user. (the cameras have the blue lights and are located above the screen)

Z-Depth demo

Here’s what the stereo cameras see:

Python hand follower stereo view

The Proposal:

Until we find the magical screen material that diffuses white light and is transparent in IR, I propose that we adapt the approach I used successfully in the Hand Follower demo above to the Touch Table environment.

My idea is that we build (or at least think hard about building) a Z-Depth module that is simply a small wide-angle USB video camera, a stacked and angled fan of IR LEDs, and a housing to protect them (and, in this case, hold the connector that gets the signals into the table).
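To make the module idea a bit more concrete, here is a small, speculative sketch of how its USB camera feed might be processed today with OpenCV: the IR-lit hand shows up as the brightest blob, and its centroid becomes the tracked position. The device index and brightness threshold are assumptions.

```python
import cv2

cap = cv2.VideoCapture(0)  # assumed device index for the module's USB camera

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # The IR-illuminated hand should be the brightest region in the image,
    # so a plain grayscale threshold isolates it from the background.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)

    # Take the largest bright blob and use its centroid as the hand position.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        m = cv2.moments(hand)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(frame, (cx, cy), 8, (0, 255, 0), 2)

    cv2.imshow("hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```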

Z-Depth module: camera and flat fan of IR LEDs together

Till next time!

I

October 16, 2007

#tbt Interactive Touch Screen Design

In 2007, when the first iPhone debuted, interactive touch screens were seen as a new and exciting way to interact with technology. All screen, no buttons, they were designed to be user-friendly and easy to use. Nowadays, touch screens are a staple of interactive technologies. But how did they get here? How were interactive touch screens designed and prototyped in 2007? We found some old INVIVIA blog posts that documented the prototyping process for touch screen design.

We found these blog posts interesting, and hope that you do too! Here's a little peek into various touch screen experiments we did in 2007:

  1. Z-Depth Touch Screen
  2. Curved Touch Screen
  3. Touch Screen + LED
  4. Touch Screen Instruments
  5. Touch Screen V5

Touch Screen LED

I thought you’d all like to see the first integrated screen and LED Guide working.

interactive touch screen

So, what's different from previous versions, you might ask?
– LED Guide, touch surface, video screen, and backing sheet are now in one strong, reasonably nice-looking frame
– LED orientation gives even light distribution across the screen using only side arrays
– Heat dissipation into the frame is reduced and now manageable.

What needs to be done next?
– Narrow-beam LED circuits need to be made and tested. As it stands, I am two circuits short of filling the screen with light and have found some narrow-beam, very bright LEDs that should be tried… they will reduce spillover and may improve performance.

– Decide on housing shape (table, kiosk…convertible?) and build it (or whatever variations seem to make sense) so you guys can have a new toy to play with and we can begin to get some applications developed and running on it…

Here are some development process pix:

The AutoCAD cross section
Interactive touch screen design prototype: mill work
Milling the slot to hold the touch surface…
Prototype: mill work
Just finished cutting the shelf that makes the connection to the next piece
Prototype: mill work
The four pieces before cutting off the clamp ends
Prototype: LED work
Partially assembled, showing the LED circuits.

cheers, ronmac (Oct 16, 2007)

If you liked this, see House Zero for some more interactive LED.

Till next time!
I
