October 3, 2023

The Magic: Illusion of Completeness (Generative AI Part 1)

If you are trained as an architect or designer, you understand the feeling of never-ending decision fatigue when it comes to design elements. The same applies to crafting an email, where nuances in wording convey subtleties in meaning. When generative AI tools such as ChatGPT or DALL-E came along, leaving it up to “fate” became easy after typing in the input prompt. One is either wildly surprised and in awe of what it produced, or extremely disappointed that it didn’t understand one’s intentions.

As the digital era progresses, businesses across various industries are continually exploring ways to harness the power of artificial intelligence (AI) to enhance their processes and outputs, whether that’s a business plan, a short movie trailer, or a sales email. By design, GenAI can create new content from scratch - be it written text, images, music, or any form of digital content - mirroring the style, context, and nuances of its training data, which can seem magical.

However, within this evolving landscape, it's essential to understand the intricacies of Generative AI and its role in shaping our perception of artificial intelligence outputs. At the heart of this is a phenomenon known as the 'Illusion of Completeness.'

The 'Illusion of Completeness' refers to the perceived accuracy or wholeness of AI-generated content, which might seem perfect to the untrained eye but may not exactly align with the specific requirements, subtleties, or context intended by the user. This illusion is influenced by various factors, including our brain's neurology, cognitive biases, and even our intrinsic appreciation for beauty. To understand this phenomenon, let's delve into some of the factors contributing to the Illusion of Completeness in Generative AI:

Value to Effort Ratio: 

In this context, the value-to-effort ratio refers to the disproportionate relationship between the minimal amount of user input and the substantial amount of AI output. Generative AI can produce large volumes of content or highly complex outputs with just a short prompt or a handful of parameters. Years ago, a speed drawing challenge asked artists to sketch the same drawing in 10 minutes, 1 minute, and 10 seconds. With DALL-E, any drawing or image can be produced in about 20 seconds, the amount of time it takes to generate a batch of 10 images.

Example of an eye drawing in 10 minutes, 1 minute, and 10 seconds (by rachelthellama).

Left: DALL-E 2 prompt: 3d realistic render, maya, ambient studio light, splash page image, sci-fi, futurism, greenery, aerial view, A city of bikes, scooters, pedestrians friendly city
Right: DALL-E 2 prompt: Future of mobility workshop and symposium poster without text (bad prompting).

The image on the left could be categorized as low effort, high value because it is perceived to be more coherent and labor-intensive. The image on the right seems to be low effort, low value: we are better at recognizing errors in text, just as we recognize that the famous six-fingered hand generated by AI is not a hand we know. From a psychological standpoint, this abundance of output against a minimal input can heighten our perception of completeness. This is seemingly the opposite of the effort justification effect in cognitive dissonance: the individual experiences wonder despite the seemingly insignificant input.

*“Effort justification” is a phenomenon whereby people come to evaluate a particular task or activity more favorably when it involves something that is difficult or unpleasant (APA).

Perceived Coherence: 

While generative AI models like ChatGPT and DALL-E can produce impressive outputs, they lack true comprehension and contextual understanding. Despite these limitations, their outputs can seem internally coherent, visually complete, and contextually appropriate at times, yet fall short in other instances.

Many times, the perception of coherence lies in the ambiguity and conciseness of both the user's inputs and the AI-generated outputs. The input prompts provided to generative AI models can be concise and open-ended, leaving room for interpretation. Users may assume that the AI understands their intentions fully and will generate outputs aligned with their expectations.

Here’s an example: 

In this example, the response attempts to convey coherence and relevance to the topic of AI's societal impacts. However, the content lacks depth, true understanding, and specific examples to substantiate the claims.

Example of ChatGPT output, achieving the illusion of completeness.

Example of how bolding and similar sentence lengths achieve the illusion of completeness.

Fill in the _______

From a neuroscientific viewpoint, our brains are naturally wired to fill in missing information. This survival-oriented mechanism aids us in interpreting the world around us. A phenomenon called "filling in" happens when the brain "fills in" missing information in a person's blind spot. Reality is a construction of our brain, and the brain has evolved for survival, not accuracy. Consequently, when we examine an AI output, our brain instinctively completes any apparent gaps, making the result seem 'whole' even if it lacks certain aspects. Many times, it's easier to believe that AI can do more than it can.

Emotional Attachment: 

When users witness generative AI producing something remarkable and aligned with their desires, they may develop an emotional attachment to the output. Specifically, the effort the user exerted to produce the prompt that led to the generated output can create a certain feeling of “ownership” towards the generated output, whether that’s a text, animation, or image. This emotional response further reinforces the belief that the AI has grasped the user’s intent comprehensively.

Confirmation Bias

Confirmation bias is a cognitive bias that affects our interaction with AI outputs. It causes us to process new information in a way that affirms our existing beliefs or expectations. So, if the AI's output is somewhat aligned with what we expect, our brain is inclined to view the output as more precise than it might be. For example, the certainty with which ChatGPT generates false information can trick many. On the other hand, if the AI output is not what we expected, we might dismiss it and give it less weight in our minds, or continue to edit the prompt until the generated output coincides with our preexisting beliefs and thoughts.

Summary: Illusion of Completeness in Generative AI

These cognitive biases are more complex than described in this article; this is merely a glimpse of why AI feels magical. And is it too good to be true? How can individuals and teams mitigate the Illusion of Completeness in generative AI and think critically about their usage of it?

  1. Promote Awareness: Users and consumers of generative AI should be educated about the technology's limitations. Schools, governments, local communities, and companies should teach how generative AI works. Understanding that AI models lack genuine comprehension and can produce unpredictable and incorrect results will foster more realistic expectations.
  2. Iterative Hybrid Initiatives: Encouraging collaboration between AI and human creators, where the AI assists and the human guides, can lead to more reliable and contextually accurate results. The human plays a critical role in providing feedback to fine-tune the machine. This will allow more diligent tracking of AI outputs and user control over generated content.
  3. Clear Communication of Intent: Users should be educated about prompt design and encouraged to provide more specific and explicit prompts to avoid misinterpretations by the AI model. A bad prompt is misleading, unclear, and ambiguous, leaving much to be “misinterpreted” by the machine and leading to irrelevant, inappropriate, ineffective, misleading, inadequate, or biased outputs.

The Illusion of Completeness in Generative AI stems from a combination of factors related to human cognition, AI limitations, and expectations. By being aware of these factors and adopting suitable strategies, we can harness the true potential of generative AI while maintaining realistic expectations about its capabilities. As AI continues to evolve, understanding the nuances of human-AI interactions becomes increasingly critical for creators and consumers of AI-generated content.

July 15, 2023

Unlocking the Future of UX with Biometric Research

The trajectory of innovation is never static; it curves and flexes, adapting to the nuances of our growing technological capability. As the era of digitization powers on, one tool is proving to be increasingly crucial in shaping the next generation of products, services, and processes: biometrics. In the intricate tapestry of user experience (UX) design, a myriad of sensing technologies can reveal the underlying threads that tie users to their spatial experiences. And it's micro-research that holds the magnifying glass to these threads.

Within the selection of sensing technologies, two major types help illuminate how users interact with their environments: biometric sensors and spatial sensors. Biometric sensors, including electrodermal activity (EDA), electroencephalography (EEG), and eye tracking, reveal users' automatic physiological responses. Sensor technologies measuring spatial experience include both environmental and spatial sensors.

Measuring spatial experience

Spatial sensors are pivotal in deciphering the intricacies of spatial experiences by supplying data that augments our comprehension of human interactions in various settings. For instance, visual sensors record and assess individuals' motions and engagements, detecting behavioral patterns and responses to environmental elements. Audio sensors record ambient sounds, aiding researchers in discerning the auditory cues that mold users' sentiments and perceptions. Sensors measuring temperature and humidity offer a glimpse into a space's thermal comfort, which influences the occupants' overall well-being. These are only some of the many types of sensors that can be used to measure our spatial experience. By integrating data from these sensors, designers can comprehensively examine the multifaceted nature of spatial experiences, enabling them to make informed design decisions and optimize the experience for various purposes.
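
Integrating data from several sensors usually starts by aligning their readings on a common timeline. The sketch below is a minimal illustration of that step, assuming simple timestamped readings; the sensor types, sampling rates, and values are invented for the example, not taken from any specific hardware.

```python
# Minimal sketch: aligning two sensor streams by nearest timestamp so they
# can be analyzed together. All readings here are invented example data.
from bisect import bisect_left

def align_nearest(timestamps, values, query_ts):
    """Return the value whose timestamp is closest to query_ts."""
    i = bisect_left(timestamps, query_ts)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - query_ts < query_ts - before else values[i - 1]

# Temperature sampled every 5 s, audio level every 2 s (timestamps in seconds).
temp_ts, temp_vals = [0, 5, 10, 15], [21.0, 21.2, 21.5, 21.4]
audio_ts, audio_vals = [0, 2, 4, 6, 8, 10, 12, 14], [40, 42, 41, 55, 58, 44, 43, 42]

# Resample both streams onto a common 5 s grid for joint analysis.
fused = [
    (t, align_nearest(temp_ts, temp_vals, t), align_nearest(audio_ts, audio_vals, t))
    for t in [0, 5, 10, 15]
]
```

Once fused this way, each row pairs the thermal and acoustic context at the same moment, which is the precondition for relating environmental conditions to behavior.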

Measuring users’ physiological responses

Biometric research is a specialized form of UX research that drills down into the intricate details of user interaction with technology. While macro and middle-range research provide necessary strategic frameworks and understandings of product development, respectively, micro-research delves into the specifics of technical usability and minute interaction points. It navigates the granular sea of users' emotional responses, physiological signals, and behaviors, bringing to the surface insights that are often invisible to traditional research methodologies. Some examples of biometric sensors include:

EDA

One potent tool in the micro-research arsenal is EDA. This biometric method tracks the changes in skin conductance linked to emotional arousal. By quantifying these responses, businesses can gain an in-depth understanding of how users react emotionally to their products, services, or processes. This can uncover invaluable data, allowing organizations to fine-tune their offerings and craft experiences that resonate more deeply with their target audience.
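
Quantifying EDA responses typically means detecting phasic rises above a slower-moving baseline. The sketch below shows that idea in its simplest form; the window size, threshold, and signal values are illustrative assumptions, not validated parameters from any published method.

```python
# Hedged sketch: flag candidate skin-conductance responses (SCRs) as samples
# that rise above a trailing-mean baseline. Parameters are illustrative only.
def detect_scr(signal, window=4, threshold=0.05):
    """Return indices where the signal exceeds its trailing mean by `threshold`."""
    events = []
    for i in range(window, len(signal)):
        baseline = sum(signal[i - window:i]) / window
        if signal[i] - baseline > threshold:
            events.append(i)
    return events

# Simulated EDA trace in microsiemens: flat baseline, then a phasic rise.
eda = [2.00, 2.01, 2.00, 2.01, 2.00, 2.01, 2.20, 2.35, 2.30, 2.10]
print(detect_scr(eda))  # → [6, 7, 8]
```

Counting and timing such events against moments in the user journey is one way the emotional-arousal data described above becomes comparable across sessions.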

EEG

Similarly, EEG, which measures brain wave activity, provides a unique window into a user's cognitive processes. It provides real-time feedback on a user's engagement, concentration, and stress levels when interacting with a product or service. By leveraging this biometric data, businesses can understand their customers on a much more intimate level, identifying potential pain points and opportunities for improvement that would otherwise go unnoticed.

Eye Tracking

Eye-tracking technologies, another facet of micro-research, not only unveil how users visually interact with products or interfaces, but can also be used to detect user arousal through pupil dilation triggered by the autonomic nervous system. They reveal when users are stressed or excited, as well as which elements draw the user's attention and which are overlooked, informing design decisions that can drastically improve usability and engagement. In a world saturated with visual information, understanding and optimizing for users' visual attention can be a game-changer.

Technologies that measure physiological signs of excitement or stress.
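
Because resting pupil size varies between people, pupil-dilation signals are usually expressed relative to a pre-stimulus baseline before comparison. The sketch below assumes a short baseline window and invented millimeter readings; both are hypothetical choices for illustration.

```python
# Illustrative sketch: baseline-corrected pupil dilation, so arousal responses
# can be compared across users with different resting pupil sizes.
def percent_dilation(pupil_mm, baseline_samples=3):
    """Express each sample as % change from the pre-stimulus baseline."""
    baseline = sum(pupil_mm[:baseline_samples]) / baseline_samples
    return [round(100 * (p - baseline) / baseline, 1) for p in pupil_mm]

# Invented pupil diameters in mm; a stimulus appears after the third sample.
trace = [3.0, 3.0, 3.0, 3.3, 3.6, 3.45]
print(percent_dilation(trace))  # → [0.0, 0.0, 0.0, 10.0, 20.0, 15.0]
```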

Examples of biometric data:

| User-Centered Biometric Data Type | Relevance to research topics |
| --- | --- |
| Electrodermal Activity (EDA) | Used as a strong indicator of heightened physiological activity correlated with heightened emotional states. |
| Electroencephalogram (EEG) | Used as a strong indicator of cognitive load, focus and attention, and confidence level in decision making. |
| Eye tracking | Pupil dilation in response to emotional stimuli; a strong predictor of stress and excitement. |

Examples of User-Centered Activity Data:

| User-Centered Data Type | Relevance to research topics |
| --- | --- |
| Accelerometer (ACC) | Shows the amount and speed of bodily movement. |
| Global Positioning System (GPS) | Shows exact geographical location, which can be overlaid on maps and plans. |
| Audiovisual (AV) | Shows a first-person perspective of what was visible to the research participant; can pick up activity and human interactions. |
| Eye tracking | Shows how users visually interact with products or interfaces; can reveal which elements draw significant attention and which are overlooked. |

Examples of Environmental and Spatial Data

| Environmental/Spatial Data | Relevant Characteristics & Validation Studies |
| --- | --- |
| Temperature, pressure, humidity | Proven indicator of environmental comfort, productivity and efficiency, and functional, physical, and psychological comfort. |
| Analog gas sensor with ADS1015 analog-to-digital converter (ADC) | |
| Light and proximity | LUX levels can indicate the transition from indoor to outdoor spaces, proximity to windows, and availability of natural light; also shows the proximity of other objects or humans to the user. |
| Microphone (audio) | Audio levels in decibels, audio sources, frequency, pitch. |
| Particulate matter (PM) sensor | Indicator of pollution, visibility, and building ventilation. |
| Visual data (camera) | Shows a first-person perspective of what was visible to the research participant; spatial depth; object/human movement; visible color and spatial information; audio information. |
| Spatial depth (LiDAR) | Shows mapped structures, spatial characteristics, configurations, movement of objects, etc. |

The quantitative data types above, collected by the methodology, and their relevance are expanded from previous research by He et al.

Harnessing the Power of Biometric Research

These biometric tools, wielded within the context of UX research, empower businesses to go beyond traditional metrics and benchmarks. They provide a robust, nuanced understanding of user experiences that transcends what users say or think. It taps into what they feel, their automatic physiological responses to interactions and impressions, and what they do, the patterns throughout the user journey. Fine-grained micro-level research essentially bridges the gap between the human and the digital, enriching business strategies with human-centric data. By integrating biometric methods into traditional UX research approaches, businesses can gain insights that drive innovation, improve customer satisfaction, and ultimately, bolster the bottom line. The power of biometrics, specifically within the context of micro-level research, isn't just the future of UX research - it's the future of business.

As we navigate the vast landscape of technological innovation, we can use more biometric data to guide businesses to align their strategy, product development, and intricate usability with the needs and desires of their users. It's through these scales that we can truly unlock the power of user experience, propelling our businesses towards a future designed with the user at its heart.

The devil is in the details. By tapping into the potential of fine-grained, in-depth, micro-level research, businesses can ensure they are not just designing for users, but are designing with them.

June 14, 2023

Future of UX Research

How necessary is UX research? The field has been growing steadily for the past two decades. More recently, it has come under attack, partly because much of it is associated with “soft” science - subjective feelings and experiences. Designers and engineers frequently think they can perform user research themselves through a couple of interviews that generate a journey map, and designs can indeed be executed without a comprehensive understanding of the problem or the user. They might not be good designs or good products, but they can be executed and pushed out to market. Why? Because with some packaging, they can seem “complete.”

From an architect's perspective, the parallels between architecture and UX research are intriguing. Architects employ scalar thinking, examining designs at various scales, from expansive urban landscapes down to the chair one might sit on, or the literal nuts and bolts of construction detail. This multi-scalar perspective is deeply ingrained in the architectural mindset, as it encompasses both temporal and spatial scales. We contemplate the long-term implications of our designs, scrutinizing how a building might shape the development of a city, and the day-to-day interactions inhabitants have with a single room. And though architects can design a beautiful, state-of-the-art building, it is not good design unless it understands the context that surrounds it and the people who will inhabit it.

Scales of UX Research

In many ways, architecture and UX research share a common purpose: to create spaces or products that not only meet functional needs but also elicit positive emotional responses. Both disciplines require a deep understanding of the intended users, balancing aesthetics and functionality, while considering the minutiae of interaction details and the broader experience.

In our approach to UX Research, we borrow from our architecture design experience, and see UX Research in three main scales:

  1. Context (years): This involves understanding the past, present, and especially the future of the industry the business is in, and how a product or service may fit into the long-term trajectory of a company or industry.
  2. Journey (days): Research at this scale looks at the user and how they interact with the product or service. This is what is traditionally thought of as UX research - journey mapping, empathy mapping, personas, etc. - and much of what is seen in the industry today.
  3. Interaction (seconds): This type of micro-analysis can help uncover interaction minutiae and how different interactions affect usability, user emotions, etc.

Challenges of Existing UX Research Methods

Understanding how UX research exists in scales can help companies visualize how to propel forward by utilizing the different scales of UX research, and not get trapped in UX research as it exists currently. As mentioned before, though there has been considerable progress in the field of user experience (UX) research over the last decade, several challenges persist: 

  1. Experience = Subjective: Despite the field being called User Experience Research, the actual experience is often dismissed as subjective and difficult to quantify, yet it holds immense untapped potential for user insight.
  2. Indirect & Intangible Benefits: The UX industry is currently awash with storyboards, user journeys, and experience maps. While these methods offer crucial insights when executed correctly, their benefits are often indirect and intangible. This poses challenges when trying to demonstrate the tangible value of UX research and garner buy-in from stakeholders.
  3. Insufficient Quant Research on End-to-End User Experience: Many existing methods focus on delivering product-, process-, or service-specific metrics, and very few attempt to understand the user in a more profound and holistic manner. Quantitative research methods, while instrumental in comparing the performance of specific UX/UI interactions, fall short of providing a holistic quantification of the user experience. Both pre- and post-design stages lack targeted quant efforts that dive into the broader tapestry of the user experience with a product or service.

Existing Quantitative UX Research 

It might seem obvious that the way to move forward is with more quantitative UX research. Here is a list of some of the most used quant UX research methods:

| Method | Typically Used For | Cost | Difficulty of Collection | Difficulty of Analysis | Type | Context of Use |
| --- | --- | --- | --- | --- | --- | --- |
| Quantitative Usability Testing | Tracking usability over time; comparing competitors | Medium | Medium | Medium | Behavioral | Task-Based |
| Web Analytics (or App Analytics) | Detecting or prioritizing problems; monitoring performance | Low | Low | High | Behavioral | Live |
| A/B Testing | Comparing two specific design options | Low | Low | Low | Behavioral | Live |
| Card Sorting | Determining IA labels and structures | Low | Low | Medium | Attitudinal | Not Using Product |
| Tree Testing | Evaluating IA hierarchies | Low | Low | Medium | Behavioral | Not Using Product |
| Surveys and Questionnaires | Gathering information about users, their attitudes, and behaviors | Low | Low | Low | Attitudinal | Any |
| Clustering Qualitative Comments | Identifying important themes in qualitative data | Low | Medium | Medium | Attitudinal | Any |
| Desirability Studies | Identifying attributes associated with your product or brand | Low | Low | Low | Attitudinal | Task-Based |
| Eyetracking Testing | Determining which UI elements are distracting, findable, or discoverable | High | High | High | Behavioral | Task-Based |
*NNGroup’s Table of Quantitative Methods

Quantitative methods in user experience (UX) research provide the framework for understanding the measurable aspects of a product's usability. These methods generate numerical data which can be statistically analyzed, providing clear, objective insights that are crucial for strategic decision-making. Current methods allow teams to: 

  1. Put a number on the usability of your product.
  2. Compare different designs (e.g., different versions or competitor products).
  3. Improve UX trade-off decisions, to see whether different designs and features are worth the change.
  4. Tie UX improvements back to organizational goals and key performance indicators.
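
As a concrete illustration of comparing two designs, a two-proportion z-test is one common way to evaluate variants on a binary metric such as task completion. This is a generic statistical sketch, not a method prescribed by the table above, and the counts are invented.

```python
# Hedged sketch: two-proportion z-test comparing design variants A and B on a
# binary success metric (e.g. task completion). Counts are invented examples.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B's completion rate (180/300) vs variant A's (150/300).
z = two_proportion_z(150, 300, 180, 300)
# |z| > 1.96 indicates a difference at the 5% significance level.
print(round(z, 2))  # → 2.46
```

Here the z statistic exceeds 1.96, so under these made-up numbers variant B's higher completion rate would be statistically significant at the 5% level.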

Though traditional quant methods like the ones in the chart above can provide data with a lot of usability insight, most are product- or service-centric, asking questions like “How long did it take the user to navigate to [Page A] instead of [Page B]?” They have a hard time quantifying questions such as “What, if anything, surprised or frustrated you about the experience?”

Future of Quant UX Research Methods

Emotion-related user experience questions like the one above are hard to quantify using traditional quant methods. Quantitative methods for researching subtleties in interaction, perception, and emotion are mainly used in academic and scientific research, including a myriad of sensing technologies that detect how various sensory inputs, spatial design, and interactions affect physiological arousal. Usually seen in psychology, architecture, sociology, and neuroscience, sensing technology has been tested in fields from sports to retail, such as using eye tracking for desirability heatmaps, but is not yet mainstream.

Modified from NNG's Current Landscape of User Research Methods

The potential of pairing different types of sensing technologies is vast. Biometric methods like EDA, EEG, and eye tracking hold the promise of unlocking rich, nuanced insights into not just isolated user interactions but the entire user experience. Their current underutilization leaves a significant gap in our comprehension of the user experience, suggesting there is much room for growth and evolution in our approach to UX research, and for us to address deeper user needs and currently unobserved pain points.

Future Landscape of User Research Methods using sensing technologies to quantify user experience.

Implementation of these methods addresses the three main challenges of traditional UX research:

  1. We can quantify user experience in terms of a spectrum of stress triggers, irritations, and pain points, and start to know not only what the pain points are, but how they affect users, seen through acute or gradual changes in physiological responses.
  2. Benefits are tangible and direct because 1) this method allows researchers to isolate and prioritize the biggest frustrations, both observable and hidden, and 2) the same sensing metric can be used to evaluate the user experience before and after a change.
  3. Experiences are evaluated from end-to-end. With this type of quantitative method, the research team is able to quantify the entire experience. This is extremely useful when companies are looking to understand how their users are interacting with multiple touchpoints for their service or product. Its ability to cover the entire journey also allows companies to start asking the right questions earlier.
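
One way to make the before-and-after comparison above concrete is a standardized effect size such as Cohen's d, computed on the same sensing metric across sessions. This is a hedged sketch under invented readings; in practice, study design and proper inferential statistics would be needed before drawing conclusions.

```python
# Hypothetical sketch: summarizing a before/after change in a sensing metric
# with Cohen's d. The per-participant arousal scores are invented examples.
import math

def cohens_d(before, after):
    mean_b = sum(before) / len(before)
    mean_a = sum(after) / len(after)
    var_b = sum((x - mean_b) ** 2 for x in before) / (len(before) - 1)
    var_a = sum((x - mean_a) ** 2 for x in after) / (len(after) - 1)
    pooled_sd = math.sqrt((var_b + var_a) / 2)
    return (mean_a - mean_b) / pooled_sd

# Mean EDA arousal scores per participant, before and after a redesign.
before = [0.62, 0.58, 0.65, 0.60, 0.63]
after = [0.48, 0.45, 0.52, 0.50, 0.47]
print(round(cohens_d(before, after), 2))  # → -4.89
```

A negative d here would mean arousal dropped after the change; the same calculation works for any sensing metric collected consistently across both sessions.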

Summary:

Traditional UX research has frequently centered around post-design evaluation, analyzing results after key decisions have been made. While this approach certainly provides valuable insights, it could limit the potential for a proactive, data-driven design strategy. The future calls for a more anticipatory approach, one that uncovers statistically significant aspects of experiences ahead of time and enables teams to focus their efforts where it matters most. With the increasing integration of biometric micro-research and advanced quantitative methods, we're entering an era where UX research can not only guide post-design refinement but also play a greater role in steering the initial stages of design and decision-making. As we usher in this future-forward perspective, the role of UX research is expanding beyond analysis and evaluation, becoming an indispensable tool for strategic, user-centric innovation. It's no longer just about understanding what happened, but illuminating what could - and should - happen next.

August 31, 2022

Future Home: How will our homes and lifestyles change?

What is the future home? Will we live in smarter and more sustainable homes? Will we live farther from the city, or closer to our friends and families? Will our homes be too smart?

With “smart home” gaining popularity in the last decade, the sufficiency of home automation as the defining aspect of the term should be questioned. The contemporary definition of “smart home” revolves around the idea of optimization, commonly seen in phrases such as smart home appliances, smart home systems, and smart home gadgets. On a system level, products address home security, thermostats, energy usage, and so on, all aspects of basic human needs for safety and comfort. On a furniture level, kinetic (or transformable) furniture currently exists for space optimization, a response to increasing rent and smaller apartments.

The Thousand Dreams of Stellavista

This efficiency- and comfort-based understanding of smart homes only scratches the surface. As early as the 1960s, science fiction dreamt up psychotropic houses that learned the personalities of previous owners, or houses made from living entities genetically designed as gentle creatures, susceptible to sentient irritation (Frank Herbert’s Whipping Star). In these visions of what a home could be, inhabitants and homes interact through more than thermostat interfaces or security alarms. The house responds to more than just the preferences of its inhabitants.

It's always interesting to watch a psychotropic house try to adjust itself to strangers, particularly those at all guarded or suspicious. The responses vary, a blend of past reactions to negative emotions, the hostility of the previous tenants, a traumatic encounter with a bailiff or burglar (though both these usually stay well away from PT houses; the dangers of an inverting balcony or the sudden deflatus of a corridor are too great). The initial reaction can be a surer indication of a house's true condition than any amount of sales talk about horsepower and moduli of elasticity.

The Thousand Dreams of Stellavista in Vermilion Sands by J.G. Ballard, 1962

Responsiveness

At the most basic level of responsiveness, an action creates a reaction. An “intelligent” object can directly respond to the user's command, or it can have a mind of its own. Current robotic furniture systems can optimize space by mounting furniture on the ceiling, allowing for slightly more flexible spaces, or by housing multifunctional, space-saving furniture in the floor. Many smart systems remain one-directional, in that they respond to the user's commands. However, to have a fully responsive home environment, the interaction between the user and the built environment must be two-directional. Could your home also rebel? Have feelings? Treat you well only if you treat it well?

This type of reciprocity plays out in our concept of a ‘sentient home’, which explores the relationship between the user's behavior, contextual occurrences, and the responsiveness of the built environment. Real-time feedback from user interaction analysis starts to address the more behavioral aspects of responsiveness, a step beyond preferences. Responsiveness becomes the mediator between users and their physical environment. The future smart home will allow spaces that both satisfy inhabitants' basic needs and help foster their identities.

Our environment should respond not only to one user but to multiple users as it engages and mediates the complexity of human interactions and social experiences with and within built environments. Future homes should address users' emotional needs by actively augmenting spatial experiences to encourage mindfulness, curiosity, and imagination. Combining the spatial possibilities of kinetic elements with the responsiveness and reciprocity of a sentient home can also trigger interactions. These future residential scenarios include sensing, kinetic, and modular technologies that respond to real-time user needs and environmental conditions. The spatial flexibility allowed by modular kinetic elements accommodates a spectrum from private to social life, and from work to leisure.

All of the above depends on our future lifestyle and needs, as well as the way we interact with technology.

Check out:

  1. Future Work
  2. Future Living
  3. Future Mobility

April 21, 2022

What is the future of work?

What will the future office environment and work experience look like? How did a pandemic change their trajectory? Would organizations have accepted remote work otherwise? Could the Metaverse and Zoom have found their place in people's everyday lives?

Over the last couple of years, the Future of Work has been a topic on many of our minds. However, the future experience of work is less talked about, and that's something we would like to address in this article.

In this article, we will cover:

  1. Technology-enabled work experiences.
  2. Work environment trends.
  3. Why will work change in the future?
  4. What is the future of work experience?

Technology-enabled work experiences:

Jacques Tati, Playtime, 1967

In 1967, director Jacques Tati anticipated the cubicle office in his film Playtime. In the same year, on an episode of the CBS show The 21st Century, Walter Cronkite introduced viewers to the Home Office of 2001. The episode shows him unveiling the future home office with a printer, a computer, a prototype videophone, a closed-circuit television system, and an “electronic correspondence machine.”

Walter Cronkite, Home Office of 2001, 1967

Cronkite’s speculative office of 2001 was premised on a predicted 30-hour workweek and a preference for suburban life. The technology of the time already afforded the possibility of bringing work home. Office designs have always reflected the way societies thought about work: a negotiation between employee and employer. As mindsets shifted from efficiency and productivity toward creativity and collaboration, the design of office spaces changed with them.

Though the technology needed to work remotely existed, most employers still required employees to work from the office, and very few people challenged the status quo. The pandemic accelerated the latent desire many employees had for a more flexible work week by giving them more agency in organizing their days. As businesses move forward, they need to consider changing social and professional norms, shifting personal preferences, and evolving values around personal and workplace wellbeing.

Work environment trends

To fully understand what work will look like in 2050, businesses will need to think about the world they will exist in, the customers or clients they will have, and the desires of their future employees. What types of existing technologies will finally come into play, and what kind of new technologies will be on the market? Where will business meetings take place? Will they even be necessary? These are all questions that must be answered.

Many companies have analyzed trends that will shape the Future of Work, such as this one from Korn Ferry and this one from HBR. There are many more where those came from, so here’s a summary:

Some trends we saw in the past couple of years: 

  1. The power has shifted from organization to people. There is more focus on fairness and equity.
  2. Scarcity of talent means promoting internal mobility and offering shorter, more flexible work weeks.
  3. Focus on employee experience and wellbeing.
  4. Many tasks usually done by humans will now be automated, freeing us to focus on human-to-human interaction.
  5. More freedom in remote working means more need for accountability. 

It has become increasingly clear that the experience of work is heavily influenced by the work environment and company culture. Companies that nurture a more inclusive culture create better working experiences for their employees.

Why will work change in the future?

Work changes because people change. Their environments change, needs change, interactions change and technologies change.

In 2006, Microsoft Hardware Group came to us with a problem: “What is the future of smartphones?” Though this might seem like a question with a known answer now, we should never underestimate the role the future of technology plays in our daily lives, whether at home or at work. In that project, we set out to envision a technology-augmented future of smartphones with future users in mind.

Similarly, envisioning the future of work involves understanding how work plays into our lives in general, and by speculating how we will interact with technology. If we can understand how technology, lifestyle, human needs, and markets will change in the future, we might be able to speculate on possible future work scenarios. Though this might seem like a lot of uncertainty, core human needs will always be driving us.

Returning to the Microsoft project: the resulting Future of Mobile exploration, codenamed “Rouge,” examined mobile within a home and work context. Using scenario-building exercises with designers, this pre-Windows Mobile 7 effort illustrated a world where the smartphone is the only device you need.

On a surface level, this project from more than a decade ago shows our desire for telecommuting and the technology-enabled freedom that comes with it. On a deeper level, it suggests an innate desire for novelty, freedom, agency, and interaction.

The next question: what will the future of the work experience look like 20 years from now?

The answer? What is the world like 20 years from now, and who is living in that world?

July 17, 2007

#tbt Curved Touch Screen Design

This post that I dug up blows my mind; I didn’t even know curved touch screens were a thing back then. For reference, Samsung’s first curved display was from 2014!

Interactive touch screens are now ubiquitous. They are found in our homes, offices and pockets. They are designed to control our lights, appliances and entertainment systems. We use them to access information and communicate with others. They have become an essential part of our lives. But it wasn't always this way. In 2007, the year the first iPhone was released, interactive touch screen designs were not as common as they are today.

So what was it like to design touch screens back then? We dug up a couple of our blog posts from 2007 to give you a little taste of touch screen prototyping:

  1. Z-Depth Touch Screen
  2. Curved Touch Screen
  3. Touch Screen + LED
  4. Touch Screen Instruments
  5. Touch Screen V5

7/17/2007 Curved Touch Screen // Ron

Ah, the joys of having a band saw that works, and lots of scrap wood!!!
It didn’t take me more than about 30mins to find a couple sacrificial scraps, mark out a radius, cut them on my band saw and find enough clamps to hold everything together long enough to see if it works…

Curved Touch Screen Prototype

I had to mount the LED array on the end with silver tape to guide the light into the edge and figure out a way to clamp the end of the plexi straight enough so that the array fit…etc.. but this is what the camera saw!


so, in short, it works like a charm… the only thing that was surprising was that as the screen starts to curve, it has to be very clean or those smudges really show.. (note the very bright smudge patterns at the bottom of the image..)
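The camera-based sensing Ron describes, an FTIR-style setup where a finger pressing the edge-lit plexi scatters light toward the camera, reduces in software to simple blob detection: threshold each grayscale frame, group bright pixels into connected regions, and treat each region's centroid as a touch point. A minimal, hypothetical sketch (not the original 2007 code, which represented frames however the camera driver delivered them; here a frame is just a 2D list of 0-255 values):

```python
def find_touches(frame, threshold=200):
    """Return (row, col) centroids of bright blobs in `frame`,
    a 2D list of 0-255 grayscale values."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill the blob, collecting its pixel coordinates.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid is the touch point.
                touches.append((sum(p[0] for p in blob) / len(blob),
                                sum(p[1] for p in blob) / len(blob)))
    return touches

# A tiny synthetic frame with one bright 2x2 "touch":
frame = [[0] * 6 for _ in range(6)]
for y, x in ((2, 2), (2, 3), (3, 2), (3, 3)):
    frame[y][x] = 255
print(find_touches(frame))  # [(2.5, 2.5)]
```

It also hints at why Ron's smudges were a problem: on the curved section they reflect enough light to cross the threshold, so a real pipeline would need background subtraction or per-region calibration before blob detection.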

Hope you enjoyed this sneak peek into how interactive curved touch screens were prototyped in 2007! Check out another one of our curvy interactive designs!

Instagram    |    LinkedIn    |    Facebook

Cambridge, MA 02138
+1 617 497 9900
info@INVIVIA.COM

© 2022 INVIVIA          Legal          Privacy