The origins of UX can be traced back centuries, but it was only in the late 19th and early 20th centuries that UX started to take shape as a distinct discipline. American engineer Frederick Winslow Taylor (1856-1915) gave birth to a movement called Taylorism, also known as scientific management, which focused on how workers interact with their tools to complete tasks efficiently. During the 1940s, Toyota developed a sociotechnical system, called the Toyota Production System, that was recognized as the first human-centered production system focused on the interaction between humans and technology. Toyota's philosophy for this new system included continuous improvement, the elimination of waste, and an emphasis on respect for team members. This approach made people part of the improvement process, bringing back the human factor that had been lost with Taylorism.
Fast-forwarding to 1955, Henry Dreyfuss, an American industrial designer, wrote a classic design text called Designing for People, in which he focused on people's experiences, good or bad, with the design of a product. One of the many useful points in his book is how to design for people of varying sizes and abilities so that they can effortlessly use a service or product.
An excerpt from Dreyfuss's book:
When the point of contact between the product and the people becomes a point of friction, then the industrial designer has failed. On the other hand, if people are made safer, more comfortable, more eager to purchase, more efficient--or just plain happier--by contact with the product, then the designer has succeeded.
During the 1970s, Xerox's research center PARC (Palo Alto Research Center) explored innovations in workplace technology. This experimental work produced groundbreaking technologies that we are still using today, for example, the graphical user interface (GUI), bitmap graphics for displaying images, and the computer mouse for navigating a personal computer.
In the early 1980s, human-computer interaction (HCI) was established as a discipline at the intersection of computer science and human factors engineering. Over the past few decades, it has grown exponentially into a broad field spanning a variety of specialties, such as ergonomics, sociology, the cognitive processes of human behavior, accessibility, and human interface design, to name just a few.
Before the rise of HCI as a discipline, the only people who interacted with computers were information technology professionals, but this changed when the first personal computers were released in the late 1970s. Suddenly anyone could use text editors and spreadsheets, play computer games, and even learn a programming language. With the personal computer now a tool accessible to ordinary people, the GUI, the visual interface through which a person interacts with the computer, was developed to make it easy to use. Soon after the GUI was adopted, computer screens became cluttered with icons that made files hard to find, and search functionality, an early extension of HCI, was introduced so users could quickly locate what they were looking for.

Another fundamental extension of HCI came in the 1980s, when users became able to communicate with other users via email, which quickly expanded into instant messaging, online forums, and social networks; interaction was no longer limited to using the computer for personal tasks, and the computer became a communication channel between humans. Throughout the 1980s, HCI also expanded across devices: users moved beyond desktop computers to the laptops and mobile phones that became available. From this gradual device expansion, technology found its way into users' daily lives, from their cars to their home appliances. Given the path HCI has already taken, it is clear the field will keep evolving with humans and become ever more integrated into their daily lives and into society.
Eventually, in the early 1990s, cognitive psychologist Donald Norman joined Apple as the vice president of the Advanced Technology Group and coined the term user experience as we know it today.
Norman on creating the term user experience:
I invented the term because I thought the human interface and usability were too narrow. I wanted to cover all aspects of the person’s experience with a system, including industrial design, graphics, the interface, the physical interaction, and the manual.
Norman's writings focus on the user's cognitive experience with products, especially technological ones. His widely influential book The Design of Everyday Things champions a product's usability and user experience over its aesthetics. For Norman, improving usability goes hand in hand with understanding the person's perception of the product. Human-related sciences, technological advancement, and design-focused disciplines all contribute to improving the interaction between a person and the technology they use. People are different.
The way they learn and understand concepts, and even their personal perceptions of how they interact with technology, differ. American businessman and philanthropist Charles Thomas Munger popularized the theory of mental models in the business and finance realm in the early 1990s, and from there it spread to many other industries, including UX. Because mental models have taken root in so many industries, there are many definitions. Susan Carey defined mental models in her 1986 journal article Cognitive Science and Science Education as follows:
A mental model represents a person’s thought process for how something works (that is, a person’s understanding of the surrounding world). Mental models are based on incomplete facts, past experiences, and even intuitive perceptions. They help shape actions and behavior, influence what people pay attention to in complicated situations, and define how people approach and solve problems.
Carey's straightforward definition of mental models paves the way for UX professionals to create unique experiences that make sense to the user. The Nielsen Norman Group, one of the most influential leaders in the UX industry, founded by Jakob Nielsen, Don Norman, and Bruce Tognazzini in 1998, highlights the following two points to keep in mind when working with mental models and the UX of a website:
- A mental model is based on belief, not facts. Thus, the model is based on what the user knows or what they think they know about your website.
- Individual users each have their own mental model. Different users will have different mental models of the same website.
The more a user interacts with your website, the more experience they build up with it, and their mental model changes accordingly. In the same way, the user interacts with other websites, which also shape their mental model. It is imperative that the UI designer understands mental models and keeps design patterns and layouts consistent to reduce cognitive load. In the next chapter, we'll go into more detail on how to design around users' mental models to improve the user experience.