
Affordances Demystified

Rex Hartson, Pardha S. Pyla, in The UX Book, 2012

20.3.2 Physical Affordance

Physical affordance is a design feature that helps, aids, supports, facilitates, or enables doing something physically. Adequate size and easy-to-access location could be physical affordance features of an interface button design enabling users to click easily on the button.

Because physical affordance has to do with physical objects, we treat active interface objects on the screen, for example, as real physical objects, as they can be on the receiving end of real physical actions (such as clicking or dragging) by users. Physical affordance is associated with the “operability” characteristics of such user interface artifacts. As many in the literature have pointed out, it is clear that a button on a screen cannot really be pressed, which is why we try to use the terminology “clicking on buttons.”

Physical affordances play a starring role in interaction design for experienced or power users who have less need for elaborate cognitive affordances but whose task performance depends largely on the speed of physical actions. Design issues for physical affordances are about physical characteristics of a device or interface that afford physical manipulation. Such design issues include Fitts' law (Fitts, 1954; MacKenzie, 1992), physical disabilities and limitations, and physical characteristics of interaction devices and interaction techniques.


URL: https://www.sciencedirect.com/science/article/pii/B9780123852410000208

The Interaction Cycle

Rex Hartson, Pardha Pyla, in The UX Book (Second Edition), 2019

31.3.3.1 Physical actions—concepts

Physical actions are especially important for analysis of performance by expert users who have, to some extent, “automated” planning and translation associated with a task and for whom physical actions have become the limiting factor in task performance.

Physical affordance design factors include design of input/output devices (e.g., touchscreen design or keyboard layout), haptic devices, interaction styles and techniques, direct manipulation issues, gestural body movements, physical fatigue, and such physical human factors issues as manual dexterity, hand-eye coordination, layout, interaction using two hands and feet, and physical disabilities.

Physical affordance

A design feature that helps, aids, supports, facilitates, or enables user physical actions: clicking, touching, pointing, gesturing, and moving things (Section 30.3).


URL: https://www.sciencedirect.com/science/article/pii/B978012805342300031X

Affordances in UX Design

Rex Hartson, Pardha Pyla, in The UX Book (Second Edition), 2019

30.3.2 Physical Affordance Design Issues

Physical affordance design issues have to do with the following, each of which is explained in subsequent sections:

Helping users perform physical actions.

Accommodating physical disabilities.

Awkwardness of physical actions.

Human factors and ergonomic issues of device design.

Physicality.

Manual dexterity and Fitts’ law (Fitts, 1954; MacKenzie, 1992).

A special characteristic of physical actions that we call physical overshoot.

These physical affordance design issues and more are covered in the UX design guidelines of Chapter 33.

30.3.2.1 Helping user manipulate objects, do actions

Above all else, physical affordance design issues are about helping users perform physical actions, usually to manipulate user interface objects. It’s about how easy it is for the user to make physical actions. This definitely applies in GUI design, but can also apply to hardware interfaces with real buttons and knobs.

30.3.2.2 Physical disabilities

Some human factors and UX designs require attention to physical disabilities and physical limitations of users, and this is at the core of what physical affordances are all about. Clearly, there are individual differences among human users in their physical abilities. For example, very young or older users might have difficulty with fine motor control in using the mouse or another pointing device.

Some users naturally have physical limitations, while others develop disabilities from accidents or disease; either way, the design of physical affordances is where you accommodate their needs.

30.3.2.3 Physical awkwardness

Awkwardness is something we may not often think about in UX design but, if we do, it can be one of the easiest difficulties to avoid. Designs that require physically awkward actions, such as holding down the Ctrl, Shift, and Alt keys while dragging with the mouse button down, cost users time and energy.

Beyond that, a device that requires an awkward hand movement can lead to fatigue in repetitive usage. For example, a touchscreen mounted on the wall at eye level can cause arm fatigue from having to do interactions with one’s hand up in the air.

Another example of awkwardness results from the user having to alternate constantly among multiple input devices, such as having to move between keyboard and mouse or between either device and the touchscreen. This kind of behavior involves constant “homing” actions, never getting a chance to “settle down” physically on one device, a time-consuming and effortful distraction of cognitive focus, visual attention, and physical action.

30.3.2.4 Physicality

Physicality is a term referring to real direct physical interaction with real physical (hardware) devices like in the grasping and moving of knobs and levers. This isn’t about clicking on “soft” display controls like images of arrows, buttons, or sliders (e.g., to tune a radio or adjust the volume), but is about actually pushing, pulling, grasping, and/or turning real hardware devices such as knobs, buttons, and levers.

Example: Physicality in a Car Shifting Knob

Fig. 30-15 shows a gear-shift knob that is an archetypical example of physicality.


Fig. 30-15. A car gear-shift knob that just says physicality.

That knob just looks so graspable. It’s just sticking out there and fits your hand so nicely. The leather exterior gives it a nice texture for grasping. Almost everyone likes the kind of physicality that this thing affords. It gives a sure grip and a satisfying feeling of control as you shift confidently through the gears.

Physicality has been an issue for human factors engineers for a long time; Don Norman (2007a) brought it to the attention of the HCI/UX community.

Example: Physicality in Controls for a Radio

Fig. 30-16 shows a car radio that illustrates the issue of physicality.


Fig. 30-16. A car radio with limited physicality.

While this radio does have a physical volume control, there is no knob for tuning. Tuning is done by pushing the up and down arrows on the left-hand side. This design lacks the satisfying kind of physicality you get from the almost universally preferred grasping and turning of a knob.

30.3.2.5 Manual dexterity and Fitts’ law

Among the characteristics that affect physical affordance, especially in GUIs, are the size and location of the object to be manipulated. A large object is obviously easier to click on than a tiny one. And location of the object can determine how easy it is to get at the object to manipulate it.

These relationships are represented by Fitts’ law, an empirically-based theory expressed as a set of mathematical formulae that govern certain physical actions in human behavior. As it applies in HCI/UX, Fitts’ law governs physical movement (e.g., of the cursor) for object selection, moving, dragging, and dropping.

It’s specifically about movement from an initial position to a target object at a terminal position. The formulas predict the time to make a movement as:

Proportional to log2 of distance moved.

Inversely proportional to log2 of target cross-section normal to the direction of motion.

Also (though not expressed by Fitts' law), the time to make a movement is inversely proportional to the depth of the target along the path of movement.

The potential for errors follows the same pattern: proportional to log2 of the distance moved and inversely proportional to log2 of the target cross-section normal to the direction of motion. Similarly, accuracy in terminating the movement is proportional to the depth of the target along the path of movement.
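As a concrete illustration (not taken from the book's text), the most common quantitative form is the Shannon formulation popularized by MacKenzie (1992): MT = a + b log2(D/W + 1), where D is the distance to the target, W is the effective target size (for 2D pointing, often taken as the smaller of the target's extents), and a and b are constants fitted empirically for a given user, device, and task. A minimal Python sketch, with placeholder constants chosen only for illustration:

```python
from math import log2

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predicted movement time in seconds using the Shannon formulation
    of Fitts' law: MT = a + b * log2(D/W + 1).

    distance -- D, distance from the start position to the target center
    width    -- W, effective target size (same units as distance)
    a, b     -- device- and user-dependent constants; the defaults here are
                placeholder values for illustration, not empirical results
    """
    index_of_difficulty = log2(distance / width + 1)  # ID, in bits
    return a + b * index_of_difficulty
```

Doubling the distance or halving the target size raises the index of difficulty, and with it the predicted movement time; this is the quantitative basis for the design implications in Table 31-2.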

In Table 31-2, we translate this to what it means for UX design.

Table 31-2. How Fitts’ law plays out in UX design

Practical conclusion: Longer-distance movement takes more time (and produces more fatigue). Design implication: Group clickable objects related by task flow close together, but not so close that it could cause erroneous selection.

Practical conclusion: Small objects are harder to click on than large ones. Design implication: Make selectable objects large enough to be clicked on easily.

Practical conclusion: Smaller (shallower) objects are harder to land on as targets of movement. Design implication: Make target objects large enough for quick and accurate termination of cursor movement.

Fitts’ law is one of the elements of HCI theory and is the subject of numerous empirical studies in the early HCI literature (Fitts, 1954; MacKenzie, 1992).

30.3.2.6 Physical overshoot

What is physical overshoot?

Physical overshoot occurs when you move an object (a cursor, a slider, a lever, a switch) too far in making the physical action, beyond where you wanted it to be. An example in a computer interface is setting a slider bar and pulling the slider too far.

Example: Downshifting an Automatic Transmission

Fig. 30-17 shows the transmission gear indicator in an old pickup truck I used to have. Here you can see, from right to left at the bottom, the common linear progression of gears from low (number 1) to high (shown as a circle around “D,” meaning drive or overdrive).


Fig. 30-17. Gear shifting with an automatic transmission.

For general driving, it’s a pretty good design, but when you are coming down a long hill, you might want to downshift to third gear on the fly, using engine drag to maintain the speed limit without wearing out the brakes.

However, because the shifting movement is linear, when you pull that lever down from D, it is too easy to overshoot third gear and end up in second. The result of this error becomes immediately obvious from the engine straining at high RPM.


URL: https://www.sciencedirect.com/science/article/pii/B9780128053423000308

UX Design Guidelines

Rex Hartson, Pardha Pyla, in The UX Book (Second Edition), 2019

32.7 Physical Actions

Physical actions guidelines support users in doing physical actions, including typing, clicking, dragging in a GUI, scrolling on a web page, speaking with a voice interface, walking in a virtual environment, moving one’s hands in gestural interaction, and gazing with the eyes. This is the one part of the user’s Interaction Cycle where there is essentially no cognitive component; the user already knows what to do and how to do it.

Issues here are limited to how well the design supports the physical actions themselves: acting on user interface objects to access all features and functionality within the system. The two primary areas of design considerations are how well the design supports users in sensing the object(s) to be manipulated and how well the design supports users in doing the physical manipulation. As a simple example, it’s about seeing a button and clicking on it.

Physical actions are the one place in the Interaction Cycle where physical affordances are relevant, where you will find issues about Fitts’ law, manual dexterity, physical disabilities, awkwardness, and physical fatigue.

Fitts' law

An empirically based theory expressed as a set of mathematical formulae predicting that the time to make a cursor or other movement is proportional to log2 of the distance moved and inversely proportional to log2 of the target cross-section normal to the direction of movement (Section 30.3.2.5).

32.7.1 Sensing Objects of Physical Actions

In Fig. 32-45, we highlight the “sensing user interface object” part within the breakdown of the physical actions part of the Interaction Cycle.


Fig. 32-45. Sensing the user interface (UI) object, within physical actions.

32.7.1.1 Sensing objects to manipulate

The “Sensing user interface object” portion of the physical actions part is about designing to support user sensory (for example, visual, auditory, or tactile) needs in locating the appropriate physical affordance quickly in order to manipulate it. Sensing for physical actions is about presentation of physical affordances, and the associated design issues are similar to those of the presentation of cognitive affordances in other parts of the Interaction Cycle, including visibility, noticeability, findability, distinguishability, discernibility, sensory disabilities, and presentation medium.

Support users making physical actions with effective sensory affordances for sensing physical affordances.

Make objects to be manipulated visible, discernable, legible, noticeable, and distinguishable. When possible, locate the focus of attention (the cursor, for example) near the objects to be manipulated.

32.7.1.2 Sensing objects during manipulation

Not only is it important to be able to sense objects statically to initiate physical actions, but users also need to be able to sense the cursor and the physical affordance object dynamically to keep track of them during manipulation. As an example, in dragging a graphical object, the user’s dynamic sensory needs are supported by showing an outline of the graphical object, aiding its placement in a drawing application.

As another very simple example, if the cursor is the same color as the background, the cursor can disappear into the background while moving it, making it difficult to judge how far to move the mouse back to get it visible again.
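As a rough illustration of this kind of dynamic feedback, here is a minimal sketch (not from the book) using Python's Tkinter canvas: the object stays put while a dashed outline tracks the drag, and the move is committed only on release.

```python
import tkinter as tk

# Minimal sketch (not from the book): dashed-outline feedback while dragging a
# rectangle on a Tk canvas, so the user can track the object during manipulation.

root = tk.Tk()
canvas = tk.Canvas(root, width=400, height=300, bg="white")
canvas.pack()

item = canvas.create_rectangle(50, 50, 150, 120, fill="steelblue")
outline = None        # dashed "ghost" rectangle shown only while dragging
drag_offset = (0, 0)  # where inside the object the user grabbed it

def on_press(event):
    global outline, drag_offset
    x1, y1, x2, y2 = canvas.coords(item)
    drag_offset = (event.x - x1, event.y - y1)
    # Create the outline at the object's current position.
    outline = canvas.create_rectangle(x1, y1, x2, y2, dash=(4, 2), outline="gray40")

def on_drag(event):
    if outline is None:
        return
    # Move only the outline during the drag so placement is easy to judge.
    x1, y1, x2, y2 = canvas.coords(item)
    w, h = x2 - x1, y2 - y1
    nx, ny = event.x - drag_offset[0], event.y - drag_offset[1]
    canvas.coords(outline, nx, ny, nx + w, ny + h)

def on_release(event):
    global outline
    if outline is None:
        return
    # Commit the move to the real object and discard the outline.
    canvas.coords(item, *canvas.coords(outline))
    canvas.delete(outline)
    outline = None

canvas.tag_bind(item, "<ButtonPress-1>", on_press)
canvas.tag_bind(item, "<B1-Motion>", on_drag)
canvas.tag_bind(item, "<ButtonRelease-1>", on_release)

root.mainloop()
```

The same idea carries over to any direct-manipulation toolkit; the point is that the user can always sense where the manipulated object will land.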

32.7.2 Help User in Doing Physical Actions

In Fig. 32-46, we highlight the “manipulating user interface object” part within the breakdown of the physical actions part of the Interaction Cycle.


Fig. 32-46. Manipulating the user interface (UI) object within physical actions.

This part of the Interaction Cycle is about supporting user physical needs at the time of making physical actions; it’s about making user interface object manipulation physically easy. It’s especially about designing to make physical actions efficient for expert users.

Support users with effective physical affordances for manipulating objects; help them in doing actions.

Issues relevant to supporting physical actions include awkwardness and physical disabilities, manual dexterity and Fitts’ law, plus haptics and physicality.

Physicality

Referring to real direct physical interaction with real physical (hardware) devices like in the grasping and moving of knobs and levers (Section 30.3.2.4).

32.7.2.1 Awkwardness and physical disabilities

One of the easiest aspects of designing for physical actions is avoiding awkwardness. It’s also one of the easiest areas in which to find existing problems in UX evaluation.

Avoid physical awkwardness.

Issues of physical awkwardness are often about time and energy expended in physical motions. The classic example of this issue is a user having to alternate constantly among multiple input devices such as between a keyboard and a mouse or between either device and a touchscreen.

This device switching involves constant “homing” actions that require time-consuming and effortful distraction of cognitive focus and visual attention. Keyboard combinations requiring multiple fingers on multiple keys can also be awkward user actions that hinder smooth and fast interaction.

Accommodate physical disabilities.

Not all human users have the same physical abilities—range of motion, fine motor control, vision, or hearing. Some users are innately limited; some have disabilities due to accidents. Although in-depth coverage of accessibility issues is beyond our scope, accommodation of user disabilities is an extremely important part of designing for the physical actions part of the Interaction Cycle.

32.7.2.2 Manual dexterity and Fitts’ law

Design issues related to Fitts’ law are about movement distances, mutual object proximities, and target object size. Performance is reckoned in terms of both time and errors. In a strict interpretation, an error would be clicking anywhere except on the correct object. A more practical interpretation would limit errors to clicking on incorrect objects that are near the correct object; this is the kind of error that can have a more negative effect on the interaction. This discussion leads to the following guidelines.

Design layout to support manual dexterity and Fitts’ law.

Support targeted cursor movement by making selectable objects large enough.

The bottom line about sizes and cursor movement among targets is simple: small objects are harder to click on than large ones. Give your interaction objects enough size, both in cross-section normal to the direction of cursor movement for accuracy in reaching the target and in depth along the direction of movement to support accurate termination of movement within the target object.

Group clickable objects related by task flow close together.

Avoid fatigue and slow movement times. Large movement distances require more time and can lead to more targeting errors. Short distances between related objects will result in shorter movement times and fewer errors.

But don’t group objects too close, and don’t include unrelated objects in the grouping.

Avoid erroneous selection that can be caused by close proximity of target objects to nontarget objects.
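To make these size and distance tradeoffs concrete, here is a small sketch (not from the book) that reuses the Shannon formulation of Fitts' law from Section 30.3.2.5, with illustrative constants, to compare a small, distant toolbar icon against a larger one grouped near the task:

```python
from math import log2

def movement_time(distance, width, a=0.2, b=0.1):
    # Shannon formulation of Fitts' law: MT = a + b * log2(D/W + 1).
    # a and b are placeholder constants for illustration only.
    return a + b * log2(distance / width + 1)

# Hypothetical layout comparison (units are arbitrary pixels):
small_far_icon = movement_time(distance=800, width=16)    # tiny icon across the screen
large_near_icon = movement_time(distance=200, width=48)   # larger icon grouped near the task

print(f"Small, far icon:  {small_far_icon:.2f} s predicted")
print(f"Large, near icon: {large_near_icon:.2f} s predicted")
```

Under these assumed constants, the larger, closer target is predicted to be reached in roughly half the time, which is exactly the effect the grouping and sizing guidelines aim for.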

Example: Oops, I Missed the Icon

A software drawing application has a very large number of functions, most of which are accessible via small icons in a toolbar. Each function can also be invoked another way (e.g., via a menu choice), but our observations tell us that expert users like to use the toolbar icons.

But there are problems in using the icons: there are many of them, they are small, and they are crowded together. This, combined with the fast actions of experienced users, leads to clicking on the wrong icon more often than users would like.

32.7.2.3 Constraining physical actions to avoid physical overshoot errors

Design physical movement to avoid physical overshoot.

Physical overshoot

Moving an object (a cursor, a slider, a lever, a switch) too far in making the physical action, beyond where it was intended to be (Section 30.3.2.6).

Just as in the case of cursor movement, other kinds of physical actions can be at risk for overshoot, extending the movement beyond what was intended. This concept is best illustrated by the hair dryer switch example that follows.

Example: Blow Dry, Anyone?

Suppose you are using a hair dryer on the low setting, and you are ready to switch it off. To move the hair dryer switch takes a certain threshold pressure to overcome initial resistance. Once in motion, however, unless the user is adept at reducing this pressure instantly, the switch can move beyond the intended setting.

A strong detent at each switch position can help prevent the movement from overshooting, but it’s still easy to push the switch too far, as the photo of a hair dryer switch in Fig. 32-47 illustrates. Starting in the LOW position and pushing the switch toward OFF, the switch configuration makes it easy to move accidentally beyond OFF over to the HIGH setting.


Fig. 32-47. A hair dryer control switch inviting physical overshoot.

This physical overshoot is preventable with a switch design that goes directly from HIGH to LOW and then to OFF in a logical progression. Having the OFF position at one end of the physical movement is a kind of physical constraint or boundary condition that allows you to push the switch to OFF firmly and quickly without a careful touch or worrying about overshooting.

Why do essentially all hair dryers have this UX flaw? It’s probably easier to manufacture a switch with the neutral OFF position in the middle.

32.7.2.4 Haptics and physicality

Haptics is about the sense of touch and physical grasping, and physicality is about real physical interaction using real physical devices, such as real knobs and levers, instead of “virtual” interaction via “soft” devices.

Include physicality in your design when the alternatives are not as satisfying to the user.

Example: Beamer Without Knobs

The BMW iDrive idea seemed so good on paper. It was simplicity in itself. No panels cluttered with knobs and buttons. How cool and forward looking. Designers realized that drivers could do anything via a set of menus. But drivers soon realized that the controls for everything were buried in a maze of hierarchical menus. No longer could you reach out and tweak the heater fan speed without looking. Fortunately, physical knobs are now coming back in BMWs.

Example: Roger's New Microwave

Here is an old email from our friend, Roger Ehrich, only slightly edited:

Hey Rex, since our microwave was about 25 years old, we worried about radiation leakage, so we reluctantly got a new one. The old one had a knob that you twisted to set the time, and a START button that, unlike in Windows, actually started the thing. The new one had a digital interface and Marion and I spent over 10 minutes trying to get it to even turn on, but we got nothing but an error message. I feel you should never get an error message from an appliance! Eventually we got it to turn on. The sequence was not complicated, but it will not tolerate any variation in user behavior. The problem is that the design is modal, some buttons being multi-functional and sequential. A casual user like me will forget and get it wrong again. Better for me to take my popcorn over to a neighbor who remembers what to do. Anyway, here’s to the good old days and the timer knob.

– Regards, Roger

Example: Great Physicality in Truck Radio and Heater Knobs

Fig. 32-48 is a photo of the radio and heater controls of a pickup truck.


Fig. 32-48. Great physicality in the radio volume control and heater control knobs.

The large and easily grasped outside ring of the volume control knob is a joy to use, and it’s not doubled up with any other mode. Also note the heater control knobs below the radio. Again, the physicality of grabbing and adjusting these knobs gives great pleasure on a cold winter morning.

The only downside: no tuning knob.


URL: https://www.sciencedirect.com/science/article/pii/B9780128053423000321


The Interaction Cycle and the User Action Framework

Rex Hartson, Pardha S. Pyla, in The UX Book, 2012

Physical actions content in the UAF

UAF content under physical actions is about how an interaction design helps users actually make actions on objects (e.g., typing, clicking, and dragging in a GUI, scrolling on a Web page, speaking with a voice interface, walking in a virtual environment, hand movements in gestural interaction, gazing with eyes). This part of the UAF contains design issues pertaining to the support of users in doing physical actions. These concepts are addressed in specific child nodes about the following topics:

Existence of necessary physical affordances in user interface. Pertains to issues about providing physical affordances (e.g., UI objects to act upon) in support of users doing physical actions to access all features and functionality provided by the system.

Sensing UI objects for and during manipulation. This category includes the support of user sensory (visual, auditory, tactile, etc.) needs in regard to sensing UI objects for and during manipulation (e.g., user ability to notice and locate physical affordance UI objects to manipulate).

Manipulating UI objects, making physical actions. A primary concern in this UAF category is the support of user physical needs at the time of actually making physical actions, especially making physical actions efficient for expert users. This category includes how each UI object is manipulated and how manipulable UI objects operate.

Making UI object manipulation physically easy involves controls, UI object layout, interaction complexity, input/output devices, and interaction styles and techniques. Finally, this is the category in which you consider Fitts’ law issues having to do with the proximity of objects involved in task sequence actions.


URL: https://www.sciencedirect.com/science/article/pii/B978012385241000021X

Clickables

Mike Kuniavsky, in Smart Things, 2010

7.2.1.2 Operation

Given that the target audience of Clickables is children and the products are marketed as toys, it is not unreasonable to assume that most users would not read the instruction manuals. This assumption is reflected in the actual instruction manuals. The longest is two pages long and consists mostly of legally required disclaimers and warnings. Successful use of the Clickables devices therefore depends on how their physical affordances communicate proper use, which means that those signals need to be simple.

Every Clickables device needs to exchange data with the Pixie Hollow world. But each device takes a slightly different technical approach to data transmission.

Being unpowered, the charms need to be physically connected to a source of power to work. The charms have only one set of such contacts, which are located in the only obvious area that could make physical contact with another device.

The jewelry box's data contacts function as a button. Pushing a charm, bracelet, or game against the contacts activates lights, indicating data transfer. This creates a sensation of “magical” recognition, since no other objects pressed against the box will make it glow.

The bracelets’ contacts are flat, presumably to minimize damage while worn by active kids. Pushing a button activates the bracelets, and if the bracelets are in appropriate contact, they both glow in confirmation.

The handheld game transfers data in three ways: through its own USB cable, by touching its control pad to a bracelet, and by touching it to the jewelry box.


URL: https://www.sciencedirect.com/science/article/pii/B9780123748997000072

Getting the Information: Visual Space and Time

Colin Ware, in Visual Thinking for Information Design (Second Edition), 2022

Affordances

The 2.5-dimensional design principles discussed in the previous pages mostly apply to static, noninteractive designs. They are useful ideas, but they have taken us away from the main thread of this chapter—the connections between perception and action—to which we now return. A useful starting place is the concept of affordances developed by the psychologist James Gibson in the 1960s.* Gibson started a revolution in the study of perception by claiming that we perceive physical affordances for actions, and not images on the retina, which had been the basis of prior theories. A flattish and firm ground surface affords passage, through walking or running or maneuvering a vehicle. A horizontal surface at waist height affords support for objects and tools we are working with. Objects of a certain size, shape, and weight afford use as tools. For example, a rock can be substituted for a hammer.

* James Gibson’s book, The Senses Considered as Perceptual Systems (Boston: Houghton Mifflin, 1966), stresses the structure of information in the environment as the foundation of perception.

Perception of space is fundamentally about perception of action potential within the local environment. Negative affordances are as important as positive ones since they rule out whole classes of activity. A brick wall is a negative affordance completely ruling out easy access to large areas of space.


A pedestrian perceives the places that afford safe walking. A driver perceives the places that afford vehicle navigation. Buildings provide negative affordances in that they restrict travel. On the other hand, if a pedestrian’s goal is shopping for shoes, then affordances relating to the likely presence of shoe stores will be perceived.


Workbenches and kitchens are structured to afford good support for manual construction tasks.

Stereoscopic depth perception, cast shadows, and texture gradients are important depth cues for interacting in such an environment.

Image courtesy of Joshua Eckels.

Gibson conceived affordances as physical properties of the environment, but in reality many of the most useful affordances have to do with access to information. To make the concept of affordances more broadly applicable, computer interface designers have changed its meaning. As it is used now, it has a cognitive dimension that departs from Gibson’s definition. Cognitive affordances are readily perceived possibilities for action. For example, a computer interface may have a number of on-screen buttons that are available for metaphorically pressing with the mouse cursor.


This office space, although appearing messy, is well structured for cognitive work. The computer is the nearest object to the worker and at the focus of attention. Other information-carrying objects are papers arranged so that the most important are close to hand and near the tops of piles. Less frequently accessed sources of information, such as books, are more distant. The work space has affordances structured for efficient access to information.


URL: https://www.sciencedirect.com/science/article/pii/B9780128235676000057

A conceptual architecture for information literacy practice

Annemaree Lloyd, in Information Literacy Landscapes, 2010

Affordance

The drawing in of individuals towards co-participation occurs through affordance. These are understood as activities and interactions and defined here as ‘invitational opportunities’ (Billett et al., 2004) furnished by the environment. According to Gibson (1979), who first coined the term, affordances focus on the sources of information (e.g. symbols and artefacts) available to people within an environment. In discussing the definition of affordance, Gibson (1979, p. 27) suggests that: ‘the affordances of the environment are what it offers the animals, what it provides or furnishes, either for good or ill.’ Affordances are therefore opportunities that the setting provides, which promote interaction and action. Gibson’s (1979) use of the term relates to the perception of artefacts and symbols that characterize an environment and the meaning that people, who are engaged with that environment, attribute to them. Therefore, an affordance must be perceived as information that is meaningful or that makes a difference. Affordances are contextual, but they are not a prerequisite for action unless they are meaningfully recognized by the participants within the context (Gibson, 1979; Barnes, 2000; Lloyd-Zantiotis, 2004).

Affordances furnish information, and in relation to information literacy, they can be classified as textual, social, and physical. Textual affordances are those opportunities that allow members to engage with the codified knowledge of the institution or organization enabling them to come to know the institutional landscape and its accepted practices. Social affordances are found in the interactional opportunities that occur between members as they negotiate a shared understanding about information and practice (i.e. a sense of collaborative identity and place). Through co-participation members are afforded information about the collaborative nature of professional practice (i.e. the nature of relationships between members). The information afforded is implicit and nuanced, reflecting the shared values, beliefs and norms of the community of practice. Occasionally this information will conflict with the institutional affordances provided to members when they first engage with the institution. Physical affordances manifest through an engagement with the signs, symbols and tools and physical environments of practice. These affordances provide opportunities for members to develop, connect and become reflexive with embodied or contingent information that is closely related to the know-how aspect of performance and cannot be adequately articulated in written form.

Like information, the relationship between the affordance and the information has to be understood as meaningful and useful to particular practices (Chemero, 2003). This suggests that affordance must be understood situationally, as part of the information experience and as forming an ongoing part of knowledge construction. Examples of activities that afford opportunities to connect with information may include guiding, mentoring, rehearsal, scaffolding, modelling or coaching, narration and storytelling (Gibson, 1979; Billett 2001; Lloyd, 2005). Affordances may enable access to embodied information relating to experiences about practice. Most importantly, for an affordance to be taken up by an individual, that individual must perceive the opportunity, which suggests that the value of the affordance (which might be a resource, a tool, or a person or a piece of information) must be recognized by the individual.

When we consider affordances as part of an architecture for information literacy practice, we need to account for how the range of information activities occurs within a setting and how and why a person interacts and forms relationships with symbols, artefacts, or environmental stimuli, including people, within an information environment. Having said that, it is also important to understand that the provision of opportunities to engage with information is not evenly distributed or made available to everyone within a setting, and not everyone who participates will be given the same or similar opportunities to engage with and experience the information environment. In the workplace, for example, access to information affordances may be affected by the nature of work, e.g. differences between part-time and full-time workers or novice and expert workers, or the contesting of practice between the two groups. In other settings, such as libraries, access to affordances may be affected by librarians’ perception of their clients or librarians’ conception of themselves as gatekeepers. In an educational setting, opportunities to access information may be affected by a teacher’s perceptions of students’ abilities or need for information.

The nature of the activities and desired outcomes will influence what affordances are valued and how they are offered through co-participatory practice and used by participants. From the perspective of information literacy practice, affordances can be conceived as information experienced in the landscape, through formal, informal or incidental information seeking and dissemination activities, which encourage an individual to become reflective and reflexive about their practices and then draw individuals into membership.

In educational contexts, affordances may occur through the librarian–student interaction and be centrally focused around engaging with the online world through computer instruction (to access databases or catalogues) or instruction about analysing and evaluating textual sources (print and information and communication technology literacy) specific to the student’s discipline knowledge. In workplaces, affordances may be centrally focused on engaging and guiding the newcomer through the opportunities offered in the storylines of the community of practice. They may also provide opportunities for novices to engage with tacit and contingent sources of knowledge, which cannot be articulated or expressed in textual form but are still central to developing knowledge about practice and work performance.

Interacting with sources of information that are afforded within the context facilitates meaning making and allows the individual to develop an individual subjective position (the individual as learner) and, over time, an intersubjective position in relation to others who act in concert as a community of practice. In this respect, activities, including information seeking and information dissemination, are enmeshed and shaped within context, and facilitate information engagement and experience. This enables the individual to move towards participation in the performance of meaning-making activities, including engaging with signs and symbols that are valued by the community and making connections with others already engaged with the community.


URL: https://www.sciencedirect.com/science/article/pii/B9781843345077500070

Generative Design: Ideation, Sketching, and Critiquing

Rex Hartson, Pardha Pyla, in The UX Book (Second Edition), 2019

14.3 Sketching

The idea of sketching as an indispensable part of design goes back at least to the Middle Ages. Consider da Vinci and his famous sketchbooks. Nilsson and Ottersten (1998) describe sketching as an essential visual language for brainstorming and discussion.

14.3.1 Characteristics of Sketching

Sketching is the rapid creation of freehand drawings expressing preliminary design ideas, focusing on concepts rather than details. We credit Buxton (2007b) as the champion for sketching; much of what we say about sketching can be credited to him.

Here are some more defining characteristics of sketching (Buxton, 2007b; Tohidi, Buxton, Baecker, & Sellen, 2006):

Everyone can sketch; you do not have to be artistic.

Most ideas are conveyed more effectively with a sketch than with words.

Sketches are quick and inexpensive to create; they do not inhibit early exploration.

Sketches are disposable; there is no real investment in the sketch itself.

Sketches are timely; they can be made just in time, done in the moment, provided when needed.

Sketches should be plentiful; entertain a large number of ideas and make multiple sketches of each idea.

Textual annotations play an essential support role, explaining what is going on in each part of the sketch and how.

14.3.1.1 Sketching is essential to ideation and design

Sketching is an indispensable part of design. As Buxton (2007b) puts it, if you’re not sketching, you’re not doing design. Design is a process of creation and exploration, and sketching is a visual medium for exploration.

Sketching captures ideas into an embodied and tangible form; it externalizes the mental description of an idea for sharing, analysis, and archiving. By opening up new pathways to create new ideas, sketching acts as a multiplier in ideation.

By adding visualization to ideation, sketching adds cognitive supercharging, boosting creativity by bringing in more human senses to the task (Buxton, 2007b).

14.3.1.2 Sketching is a conversation about user experience

Sketching is not art. Sketching is not about putting pen to paper in the act of drawing, nor is it about artistic ability. A sketch is not about making a drawing or picture of a product to document a design.

A sketch is a conversation. A sketch is not just an artifact that you look at; a sketch is a conversation about design. A sketch is a medium to support a conversation among the design team.

A sketch is about the user experience, not the product. In a talk at Stanford, Buxton (2007a) challenges his audience to draw his mobile phone. But he does not mean a drawing of the phone as a product. He means something much harder—a sketch that reveals the interaction, the experience of using the phone in a situated context where the product and its physical affordances encourage one type of behavior and experience over another.

14.3.1.3 Sketching is embodied cognition to aid invention

Designers invent while sketching. A sketch is not just a way to represent your thinking; the act of making the sketch is part of the thinking (Fig. 14-5). In fact, the sketch itself is less important than the process of making it.


Fig. 14-5. A sketch to think about design

(photo courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).

The importance of involving your hands in sketching. The kinesthetics of sketching, pointing, holding, and touching bring the entire hand-eye-brain coordination feedback loop to bear on the problem solving. Your physical motor movements are coupled with visual and cognitive activity; the designer's mind and body potentiate each other in invention (Baskinger, 2008).

14.3.2 Doing Sketching

14.3.2.1 Stock up on sketching and mockup supplies

Physical Mockup

A tangible, three-dimensional prototype or model of a physical device or product, often one that can be held in the hand and often crafted rapidly out of materials at hand, used during exploration and evaluation to at least simulate physical interaction (Section 20.6.1).

Stock the ideation studio with sketching supplies such as whiteboards, blackboards, corkboards, flip chart easels, Post-it labels, tape, and marking pens. Be sure to include supplies for constructing physical mockups, including scissors, hobby knives, cardboard, foam core board, duct tape, wooden blocks, push pins, string, bits of cloth, rubber, other flexible materials, crayons, and spray paint.

14.3.2.2 Use the language of sketching

The vocabulary of sketching. To be effective at sketching for design, you must use a particular vocabulary that has not changed much over the centuries. One of the most important language features is the vocabulary of lines, which are made as freehand “open” gestures. Instead of being mechanically correct and perfectly straight, lines in sketches are roughed in and not connected precisely.

In this language, lines overlap, often extending a bit beyond the corner. Sometimes they “miss” intersecting and leave the corner open a little bit.

An unfinished appearance proposes exploration. The low resolution and detail of a sketch suggest it is a concept in the making, not a finished design. It needs to look disposable and inexpensive to make. Sketches are deliberately ambiguous and abstract, leaving “holes” for the imagination about other aspects of the design. You can see this unfinished look in the sketches of Figs. 14-6 and 14-7.


Fig. 14-6. Freehand gestural sketches for the Ticket Kiosk System

(sketches courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).


Fig. 14-7. Ideation and design exploration sketches for the Ticket Kiosk System

(sketches courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).

Keep sketches open to interpretation. Sketches can be interpreted in different ways, fostering new relationships to be seen within them, even by the person who drew them. In other words, avoid the appearance of precision; if everything is specified and the design looks finished, then the message is that you are telling something, “This is the design,” not proposing exploration, “Let us play with this and see what comes up.”

In Fig. 14-8, we show examples of designers doing sketching.


Fig. 14-8. Designers doing sketching

(photos courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).

Example: Sketching for a Laptop Projector Project

The following figures show sample sketches for the K-YAN project (K-yan means “vehicle for knowledge”), an exploratory collaboration by the Virginia Tech Industrial Design Department and IL&FS. The objective is to develop a combination laptop and projector in a single portable device for use in rural India. Thanks to Akshay Sharma of the Virginia Tech Industrial Design Department for these sketches. See Figs. 14-9–14-12 for different kinds of exploratory sketches for this project.


Fig. 14-9. Early ideation sketches of K-YAN

(sketches courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).


Fig. 14-10. Midfidelity exploration sketches of K-YAN

(sketches courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).


Fig. 14-11. Sketches to explore flip-open mechanism of K-YAN

(sketches courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).


Fig. 14-12. Sketches to explore the emotional impact of the form for K-YAN

(sketches courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).

14.3.3 Exercise 14-3: Practice in Ideation and Sketching

Goal: To get practice in ideation and sketching for design.

Activities: Doing this in a small group is strongly preferable, but you can do it with one other person.

Get out blank paper, appropriate marking pens, and any other supplies you might need for sketching.

Pick a topic, a system, or device. Our recommendation is something familiar, such as a dishwasher.

Start with some free-flow ideation about ways to design a new and improved concept of a dishwasher. Do not limit yourself to conventional designs.

Go with the flow and see what happens.

Remember that this is an exercise about the process, so what you come up with for the product is not that crucial.

Everyone should make sketches of the ideas that arise about a dishwasher design, as you go in the ideation.

Start with design sketches in the ecological perspective. For a dishwasher, this might include your dining room, kitchen, and the flow of dishes in their daily cycle. You could include something unorthodox: sketch a conveyor belt from the dinner table through your appliance and out into the dish cabinets. Sketch how avoiding the use of paper plates can save resources and not fill the trash dumps.

Make some sketches from an interaction perspective showing different ways you can operate the dishwasher: how you load and unload it and how you set wash cycle parameters and turn it on.

Make sketches that project the emotional perspective of a user experience with your product. This might be more difficult, but it is worth taking some time to try.

Ideate. Sketch, sketch, and sketch. Brainstorm and discuss.

Interaction Perspective

The design viewpoint taken within the interaction layer of the pyramid of user needs, between the ecological layer at the base and the emotional layer on top. The interaction perspective is about how users operate the system or product. It is a task and intention view, in which user and system come together. It is where users look at displays and manipulate controls, and do sensory, cognitive, and physical actions (Section 12.3.1).

Deliverables: A brief written description of the ideation process and its results, along with all your supporting sketches.

Schedule: Give yourself enough time to really get engaged in this activity.

Exercise 14-4: Ideation and Sketching for Your System

Goal: More practice in ideation and sketching for design. Do the same as you did in the previous exercise, only this time for your own system.

14.3.4 Physical Mockups as Embodied Sketches

Just as sketches are two-dimensional visual vehicles for invention, a physical mockup for ideation about a physical device or product is a three-dimensional sketch. Physical mockups as sketches, like all sketches, are made quickly, are highly disposable, and are made from at-hand materials to create tangible props for exploring design visions and alternatives.

A physical mockup is an embodied sketch because it is even more of a physical manifestation of a design idea and it is a tangible artifact for touching, holding, and acting out usage (Fig. 14-13).


Fig. 14-13. Example of rough physical mockups

(courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).

For later in the process, after design exploration is done, you may want a more finished-looking three-dimensional design representation (Fig. 14-14) to show clients, customers, and implementers.


Fig. 14-14. Example of a more finished looking physical mockup

(courtesy of Akshay Sharma, of the Virginia Tech Department of Industrial Design).


URL: https://www.sciencedirect.com/science/article/pii/B978012805342300014X