[Figure 2 (bar chart; y-axis: labor force participation rate, 0–70%; age groups: 55–64, 65–74, 75+; years: 1986, 1996, 2006, 2016) Projected labor force participation rates of older adults, 1986–2016 (Toossi, 2007).]
cultural values vary with respect to work practices, communication, and family. Cultural/ethnic values are also dynamic and change over time. If ethnic/cultural factors are not considered in systems design and operations, there may be breakdowns in system team performance and overall system efficiency. To date, there is limited information on how cultural factors affect issues such as teamwork, communication, and the overall operations of systems. In general, all of the aforementioned issues underscore the need for greater human factors involvement in systems design.
1.2 Brief History of the Systems Approach and Human Factors Engineering
The use of communications technologies has become ubiquitous
in most work systems. One of the most dramatic changes has been the development of the Internet, which allows increased access to vast amounts of information by a wide variety of users as well as greater interconnectivity than ever before across time zones and distances.
Access to the Internet places greater demands on information processing and information management, and concerns about privacy and information security have become important issues within the fields of human factors and human–computer interaction (Proctor and Vu, 2010). Phase C is continuing to grow at a rapid pace, and human factors engineers are confronted with many new types of technology and work systems, such as artificial intelligence agents, human supervisory control, and virtual reality. For example, robots are increasingly being introduced into military, space, aviation, and medical domains, and research is being conducted on how to optimize human–robot teams. Issues being investigated include strategies for maximizing communication, such as using gesture or gaze, and how to optimally coordinate human–robot behavior. Ongoing research is also examining how theories and models of natural human interactions can be applied to robotic systems (e.g., Shah and Breazeal, 2010). Clearly, these types of systems present new challenges for system designers and human factors specialists.
To design today’s work systems effectively, we need to apply knowledge regarding human information-processing capabilities to the design process. The need for this type of knowledge has created a greater emphasis on issues related to human cognition within the field of human factors and has led to the emergence of cognitive engineering (Woods, 1988). Cognitive engineering focuses on the complex cognitive and knowledge-related aspects of human performance, whether carried out by humans or by machine agents (Wickens et al., 2004), and is closely aligned with the fields of cognitive science and artificial intelligence. With the emphasis on teamwork, the concept of team cognition has emerged; it refers to the interaction between intraindividual and interindividual cognitive processes and applies the conceptual tools of cognitive science to a team or group rather than to the individual. More recently, theories of macrocognition have been developed to guide complex collaborative processes and knowledge-based performance in nonroutinized, novel situations. Macrocognition emphasizes expertise out of context and teams going beyond routine methods of performing to generate new performance processes for dealing with novel situations (Fiore et al., 2010). Another construct that has emerged is neuroergonomics, which involves the study of the mechanisms that underlie human information processing through methods used in cognitive neuroscience.
These methods include neuroimaging techniques such as functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and event-related potentials (ERPs). These techniques have been applied to the assessment of workload in complex tasks and to the study of mental workload and vigilance (Parasuraman and Wilson, 2008; Proctor and Vu, 2010).
Further need for new approaches to system design comes from the changing nature of the design process.
Developments in technology and automation have not only increased the complexity of the types of systems being designed but have also changed the design process itself and the way designers think, act, and communicate. System design is an extremely complex process that proceeds over relatively long time periods in an atmosphere of uncertainty (Meister, 2000; Sage and Rouse, 2009). The process is influenced by many factors, some of which are behavioral and some of which are physical, technical, and organizational (Table 1). As noted, design also involves interaction among many people with different types and levels of knowledge and diverse backgrounds. At the most basic level, this interaction involves engineers from many different specialties; in reality, however, it also involves the users of the system being designed and organizational representatives. Further, system design often takes place under time constraints in turbulent economic and social markets. Design also involves the use of many different tools and technologies. For example, human performance models are often used to aid the design process. In this regard, there have been three major trends in the development of human performance models: manual control models, network models, and cognitive process models. Today, in many instances, sophisticated models of human behavior are simulated in virtual environments to evaluate human–system integration. In these instances a digital or numerical manikin is used to model human processes in an attempt to take into account factors, such as human behavior, that influence system reliability early in the design process (Lämkull et al., 2007; Fass and Leiber, 2009). These types of modeling techniques are being deployed in aircraft and air traffic control systems as well as in the automotive and military industries.
Currently, many models are complex and difficult to use without training. There is a strong need within the human factors community to improve the quality and usability of these models and to ensure that practitioners have the requisite skills to use them (Pew, 2008).
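To make the network-model idea above concrete, the sketch below runs a minimal Monte Carlo simulation over a small, hypothetical task network; the task names, times, and variabilities are invented for illustration and are not drawn from the cited models or from any commercial task-network tool.

```python
import random
import statistics

# Hypothetical task network for a simple monitoring task. Each task has a mean
# completion time (seconds), a standard deviation, and its successor task.
# Real task-network models also handle branching, errors, and resource limits.
TASKS = {
    "detect_alert":     {"mean": 1.5, "sd": 0.4, "next": "read_display"},
    "read_display":     {"mean": 3.0, "sd": 0.8, "next": "decide_action"},
    "decide_action":    {"mean": 2.0, "sd": 0.6, "next": "execute_response"},
    "execute_response": {"mean": 1.2, "sd": 0.3, "next": None},
}

def simulate_once(start: str = "detect_alert") -> float:
    """Sample one pass through the task chain and return the total time."""
    total, task = 0.0, start
    while task is not None:
        spec = TASKS[task]
        total += max(0.0, random.gauss(spec["mean"], spec["sd"]))
        task = spec["next"]
    return total

def predict(runs: int = 10_000) -> tuple[float, float]:
    """Monte Carlo estimate of the mean and ~95th-percentile completion time."""
    samples = [simulate_once() for _ in range(runs)]
    return statistics.mean(samples), statistics.quantiles(samples, n=20)[18]

if __name__ == "__main__":
    mean_t, p95_t = predict()
    print(f"Predicted completion time: mean {mean_t:.1f} s, 95th percentile {p95_t:.1f} s")
```

Even a toy model like this illustrates why model usability matters: the value of the prediction depends entirely on the practitioner supplying sensible task decompositions and time estimates.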
Overall, it has become apparent that we cannot restrict the application of human factors to the design of specific jobs, workplaces, or human–machine interfaces; instead, we must broaden our view of system design and consider broader sociotechnical issues. In other words, the design of today’s systems requires the adoption of a more macroergonomic approach: a top-down sociotechnical systems approach to design that is concerned with the human–organizational interface and represents a broad perspective on systems design. Sociotechnical systems integrate people with social and technical elements to accomplish system objectives.
Thus, people within these systems must demonstrate both social and technical skills and have an awareness of the broader environment to function effectively (Carayon, 2006). As illustrated throughout this chapter, a number of important trends related to the organization and design of work systems underscore the need for a macroergonomic approach, including (1) rapid developments in technology, (2) demographic shifts, (3) changes in the value system of the workforce, (4) world competition, (5) an increased concern for safety and the resulting increase in ergonomics-based litigation, and (6) the failure of traditional microergonomics (Hendrick and Kleiner, 2001; Kleiner, 2008).

Table 1 Design Process

Elements
1. Design specification
2. Design history (e.g., predecessor system data and analyses)
3. Design components transferred from a predecessor system
4. Design goals (technological and idiosyncratic)

Processes
1. Analysis of design goals (performed by both designers and human factors/ergonomics specialists)
2. Determination of design problem parameters (both)
3. Search for information to understand the design problem and parameters (both)
4. Behavioral analysis of functions and tasks (specialist only)
5. Transformation of behavioral information into physical surrogates (specialist only)
6. Development and evaluation of alternative solutions to the design problem (both, mostly designers)
7. Selection of one design solution to be followed by detailed design (both, mostly designers)
8. Design of the human–machine interface, human–computer interface, or human–robot interface (any may be primary)
9. Evaluation and testing of design outputs (both)
10. Determination of system status and development progress (both)

Factors Affecting Design
1. Nature of the design problem and of the system, equipment, or product to be designed
2. Availability of needed relevant information
3. Strategies for solution of the design problem (information-processing methods)
4. Idiosyncratic factors (designer/specialist intelligence, training, experience, skill, personality)
5. Multidisciplinary nature of the team
6. Environmental constraints and characteristics
7. Project organization and management

Source: Adapted from Meister (2000).
In sum, the nature of human–machine systems has changed drastically since the era of knobs and dials, presenting new challenges and opportunities for human factors engineers. We are faced not only with designing and evaluating new types of systems and a wider variety of systems (e.g., health care systems, living environments) but also with many different types of user populations. Many people with limited technical background and of varying ages are operating complex technology-based systems, which raises many new issues for system designers. For example, older workers may require different types of training or different work schedules to interact effectively with new technology, or operators with a limited technical background may require a different type of interface than those who are more experienced. Emergence of these types of issues reinforces the need to include human factors in system design. In the following section we present a general model of a system that will serve as background to a discussion of the system design process.
2 DEFINITION OF A SYSTEM
2.1 General System Characteristics
A system is an aggregation of elements organized in some structure (usually hierarchical) to accomplish system goals and objectives. All systems have the following characteristics: interaction of elements, structure, purpose and goals, and inputs and outputs. A system is usually composed of humans and machines and has a definable structure and organization and external boundaries that separate it from elements outside the system.
All the elements within a system interact and function to achieve system goals. Further, each system component has an effect on the other components. It is through the system inputs and outputs that the elements of a system interact and communicate. Systems also exist within an environment (physical and social), and the characteristics of this environment have an impact on the structure and the overall effectiveness of the system (Meister, 1989, 1991). For example, to be responsive to today’s highly competitive and unstable environment, systems have to be flexible and dynamic. This creates the need for changes in organizational structures. Formal, hierarchical organizations do not effectively support distributed decision making and flexible processes.
Generally, all systems have the following components: (1) elements (personnel, equipment, procedures); (2) conversion processes (processes that result in changes in system states); (3) inputs or resources (personnel abilities, technical data); (4) outputs (e.g., number of units produced); (5) an environment (physical, social, and organizational); (6) purpose and functions (the starting point in system development); (7) attributes (e.g., reliability); (8) components and programs; (9) management, agents, and decision makers; and (10) structure. These components must be considered in the design and evaluation of every system. For example, the nature of the system inputs has a significant impact on the ability of a system to produce the desired outputs. Inputs that are complex, ambiguous, or unanticipated may lead to errors or time delays in information processing, which in turn may lead to inaccurate or inappropriate responses. For instance, if there is conflicting or confusing information on a patient’s chart, a physician might have difficulty diagnosing the illness and prescribing the appropriate course of treatment.
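As a purely illustrative sketch of how these generic components relate, the following Python fragment represents a system as a small data structure whose elements apply a conversion process to inputs to produce outputs within an environment. The class name, fields, and the toy production example are assumptions made for illustration, not a formalism from Meister or the other cited sources, and only a subset of the ten components is modeled.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Illustrative subset of the generic system components described above:
# purpose, elements, environment, a conversion process, and attributes.
@dataclass
class SystemModel:
    purpose: str                                   # purpose and functions
    elements: List[str]                            # personnel, equipment, procedures
    environment: Dict[str, str]                    # physical/social/organizational context
    conversion: Callable[[dict], dict]             # process that changes system state
    attributes: Dict[str, float] = field(default_factory=dict)  # e.g., reliability

    def run(self, inputs: dict) -> dict:
        """Apply the conversion process to the inputs to produce outputs."""
        return self.conversion(inputs)

# Example: a toy production system converting labor hours into units produced.
production = SystemModel(
    purpose="produce assembled units",
    elements=["operators", "assembly line", "work procedures"],
    environment={"physical": "factory floor", "organizational": "shift-based teams"},
    conversion=lambda inputs: {"units_produced": int(inputs["labor_hours"] * inputs["rate"])},
    attributes={"reliability": 0.98},
)

print(production.run({"labor_hours": 160, "rate": 2.5}))  # -> {'units_produced': 400}
```

The point of the sketch is only that each component has an explicit place that a designer or evaluator can inspect and vary.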
There are various ways in which systems are classified. Systems can be distinguished according to degree of automation, functions and tasks, feedback mechanisms, system class, hierarchical levels, and combinations of system elements (Meister, 1991). A basic distinction between open- and closed-loop systems is usually made on the basis of the nature of a system’s feedback mechanisms. Closed-loop systems perform a process that requires continuous control and feedback for error correction: feedback mechanisms provide continuous information regarding the difference between the actual and the desired states of the system. In contrast, open-loop systems do not use feedback for continuous control; once such a system is activated, no further control is exercised. However, feedback can be used to improve future operations of the system (Sanders and McCormick, 1993). The distinction between open- and closed-loop systems is important, as the two types require different design strategies.
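The following sketch contrasts the two control styles in miniature (a hypothetical thermostat-style example, not taken from the cited sources): the open-loop version issues a preset command and exercises no further control, while the closed-loop version repeatedly compares the actual state with the desired state and corrects the error using feedback.

```python
def open_loop_heater(setting: float) -> float:
    """Open loop: apply a preset heat setting once; no feedback, no correction."""
    return setting  # whatever temperature results, no further control is exercised

def closed_loop_heater(desired: float, actual: float,
                       gain: float = 0.5, steps: int = 20) -> float:
    """Closed loop: repeatedly measure the error and correct it via feedback."""
    for _ in range(steps):
        error = desired - actual   # feedback: difference between desired and actual state
        actual += gain * error     # proportional correction toward the desired state
    return actual

print(open_loop_heater(68.0))                    # output depends entirely on the preset input
print(round(closed_loop_heater(72.0, 60.0), 1))  # converges toward the desired 72.0
```

Because the closed-loop version keeps correcting itself, it tolerates disturbances and poor initial settings; this is one reason the two system types call for different design strategies, such as displays of system state and error feedback for closed-loop control versus careful up-front calibration for open-loop control.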
We can also describe different classes of systems. For example, at a very general level we can distinguish among educational systems, production systems, maintenance systems, health care systems, transportation systems, communication systems, and military systems. Within each of these systems we can also identify subsystems, such as the social system or the technical system; complex systems generally contain a number of subsystems. Finally, we can distinguish systems according to their components or elements. For example, we can distinguish among machine systems, human (biological) systems, and human–machine systems, and, more recently, human–robot systems and collaborative team or group systems.