CHAPTER V. EwT - ENGAGING WITH THE TECHNOLOGY (HCI - HUMAN-COMPUTER INTERACTION)

INTRODUCTION AND EARLY ASPECTS

In this chapter we look at the interaction between the human user and the computer or other information system as such, mostly considering the user interface (UI). After an introduction, we will look at each aspect in turn, from the quantitative to the organic aspect.

» The quantitative aspect concerns amounts of things.
» The spatial aspect is particularly important in screen layout, but also in the user's position.
» The kinematic aspect is important in movement, both on the screen and of the user, especially when using a mobile device.
» The physical aspect concerns materials and physical properties like force, weight and friction.
» The organic / biotic aspect concerns how the interactions fit with human organs - which is largely the hardware components - and is the main concern of the field known as ergonomics.

V-1. OUR APPROACH TO UNDERSTANDING EwT/HCI

The danger in understanding our interaction with the computer (EwT/HCI and UI) is that we focus on one or two aspects to the detriment of others. For example, many websites look great (aesthetic aspect) but are so badly structured (formative aspect) that you just cannot get the information you want. Have you ever got to a place in a shopping website where it gives you a message and you don't know what to do?

To overcome this danger, we look at each and every aspect of the interaction between human and computer, the EwT/HCI. Doing so helps us recall all the various types of things that are important in successful human interaction with computers and other IT such as mobile phones, whether this be in user interfaces or multimedia. Some texts place the technical artefact at the centre of EwT/HCI; here we will place the human at the centre, approaching EwT/HCI in terms of what is meaningful to the human user. Since all human functioning exhibits all aspects, we will examine every aspect of EwT/HCI.

V-1.1 Overview of Aspects of EwT/HCI

In the Human Experience chapter we gave an overview of aspects of the UI or EwT/HCI:

» Quantitative aspect: Amount and number of interactions and devices.
» Spatial aspect: Spatial arrangements, location and size.
» Kinematic aspect: Movement.
» Physical aspect: How both the UI and our bodies engage physically: forces, friction, light, vibration, etc.
» Organic (biotic) aspect: How the user interface matches our organs like eyes, ears and hands, and whether it affects our health.
» Psychic aspect: Seeing colours, shapes, movement etc. on screen, hearing sounds, feeling vibration etc., controlling mouse, keyboard, etc.
» Analytic aspect: Identifying that the shapes and sounds are expressing concepts, and what type they are.
» Formative aspect: The structure of this information.
» Lingual aspect: What the information means, its content.
» Social aspect: The cultural connotations and acceptability of the information.
» Economic aspect: The limited resources of the UI and EwT/HCI.
» Aesthetic aspect: The design style of the UI and EwT/HCI: visual, aural and haptic, and how they harmonise; 'nice' touches.
» Juridical aspect: How well the UI does justice to the users or the information meaning.
» Ethical aspect: The 'generosity' (or otherwise) of the UI.
» Faith aspect: What is the deep motivation behind the UI?

Two examples are given of aspects of EwT/HCI and user interface (UI). The first gives aspects of our characteristics as human beings. The second gives aspects of the user interface.
Example 1: Aspects of human functioning with UI

Quantitative: 2 eyes, 2 ears, 2 hands.
Spatial: My hand around the mouse; fingers on keyboard.
Kinematic: Move the mouse.
Physical: Spill coffee over the keyboard! See also Intro example.
Organic (biotic):
Psychic: Visual impairment.
Analytic: Do I recognise this shape on screen (e.g. a logo)?
Formative: I don't know in which menu to look for what I want.
Lingual: In a computer game, is what is seen easily understood, within the intention of the game?
Social: Standards, e.g. Web accessibility standards.
Economic: Download time of web pages.
Aesthetic: Does the whole UI 'hang together'?
Juridical: Does the UI do justice to the user (e.g. blind, deaf, etc.)?
Ethical: Is the UI or web page too much 'in your face', promoting itself rather than serving the user?
Faith: Do users trust the UI?

Example 2: Aspects of the user interface and interaction

Quantitative: 1 screen, 2 loudspeakers, 1 mouse.
Spatial: Where things are on screen; vertical and horizontal alignment in tables.
Kinematic: Animation.
Physical: Mechanical mouse slipping on its mat!
Organic (biotic): Mouse too small (or large) for my hand.
Psychic: Yellow text on a white background is difficult to see.
Analytic: Small circle on screen: is it the letter O or the number 0? Line on screen: does its length signify anything?
Formative: The grouping of icons on a toolbar, or of entries in menus.
Lingual: In a news website, is the content accurate, trustworthy and up-to-date?
Social: Cross-cultural acceptability.
Economic: Limited screen area.
Aesthetic: Contemporary or retro styles; fun or boring?
Juridical: Is the look & feel appropriate to the content?
Ethical: A generous UI gives more than necessary, e.g. several different ways of getting the information, extra features useful to users.
Faith: What is the main vision that drove the design of the UI, e.g. 'to help users', 'to show my artistic prowess' or 'to follow object-oriented design principles'?

V-1.2 Two Directions of Interaction: Input and Output

From the viewpoint of several aspects, our interaction with IT may be seen as in two directions: input of information from human to computer, and output of information from computer to human. The information might state something, give a command or ask a question. Examples of input are where we double-click on an application, type text, operate a slider with the mouse to increase the volume of music, or thumb across our mobile phone screen to get to the next photo. Examples of output that the computer might give in response include: the window of the application appears, the words we type come up on screen and an indication is given of a spelling error, the volume of the music increases, and the next photo slides into view.

The characteristics of input and output differ, suiting the capabilities of both computer and human.

» Input from human to computer tends to be slow and simple - an information rate of a few tens of pieces of information per second. This suits the human because our ability to send messages to the computer is limited, and it suits the computer since its ability to recognise what the user wants is limited.
» Output from computer to human is fast and complex: a screenful of information (hundreds of pieces of information) can be given several times a second (such as in a fast-moving game). This suits the human since we can recognise and collate information, especially via our eyes, very fast, and it suits the computer, which can display or emit information very fast.
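To get a feel for this asymmetry, here is a rough back-of-the-envelope sketch in Python. The figures (typing speed, screen size, refresh rate) are illustrative assumptions, not measurements:

# Rough comparison of human-to-computer input rate versus
# computer-to-human output rate, using assumed illustrative figures.

typing_chars_per_second = 5          # assumed brisk typing speed
bits_per_character = 7               # roughly one ASCII character
input_bits_per_second = typing_chars_per_second * bits_per_character

pixels_on_screen = 1_000_000         # assumed roughly 1 megapixel display
bits_per_pixel = 24                  # red, green, blue at 8 bits each
refreshes_per_second = 30            # assumed refresh / update rate
output_bits_per_second = pixels_on_screen * bits_per_pixel * refreshes_per_second

print(f"Input  : about {input_bits_per_second} bits per second")
print(f"Output : about {output_bits_per_second:,} bits per second")
print(f"Output is roughly {output_bits_per_second // input_bits_per_second:,} times greater")

The exact numbers do not matter; the point is the enormous asymmetry between the two directions.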
Problems with input - such as hitting the wrong key, keys getting stuck, or giving the wrong command - are different from problems with output - such as misunderstanding what is on the screen, or not being able to see it properly.

Input to the computer is usually undertaken by fingers or hand, with devices such as keyboard, mouse or touch-pad. Output from the computer usually involves our sense of sight (device = screen) or hearing (device = loudspeakers). These are called 'channels', and will be discussed under the psychic aspect; devices will be discussed under the organic aspect.

However, neither input nor output can be understood on its own; each must be seen, with the other, as part of the EwT/HCI. For example, we type text and immediately see it appear on screen and so know what we are writing. The difference between input and output occurs with the biotic to formative aspects. In other aspects, input and output are merged into our overall interaction.

In most aspects we will discuss not only what occurs in our interaction as seen from that aspect, but also what challenges and problems there might be for user interfaces and multimedia. We will take the aspects in three groups:

» Pre-lingual aspects, which support the lingual.
» The lingual aspect, which is the most important aspect of EwT/HCI and forms a link with EMC.
» The post-lingual aspects, which affect the style and success of the EwT/HCI.

Each aspect of EwT/HCI will now be examined in detail, looking at EwT/HCI theories as well as practice. We are using the aspects not so much as categories, but rather as a way of separating out the issues that are important in EwT/HCI and UI. Many discussions of the issues in EwT/HCI focus only on certain aspects and forget others. We will cover some aspects in more detail than others. We will look at technologies and techniques in each aspect. We will also look at quality criteria in some aspects (what makes a UI or EwT/HCI good or bad in that aspect) and at various kinds of error that might afflict use of computers; each kind is usually explainable in one aspect.

V-2. NUMBER OF THINGS IN INTERACTION (Quantitative Aspect of EwT/HCI)

This concerns amounts and counts of things. For example, how many windows are open? Some applications open several windows, to show several different things. Example: MS Word has the main document window and a window showing styles. Example: the Imagine 3D virtual reality creator has four windows, showing four views of the scene being created. So the quantitative aspect here refers to the number of windows. But when we click the mouse or press a key on the keyboard, that should usually go to only one window. So the quantitative aspect here refers to one (1).

But why count things? Usually for some other reason, relating to another aspect. By itself, the quantitative aspect of EwT/HCI (e.g. counts of things in the interaction) has little meaning. Rather, the counting of things is usually a prelude to considering another aspect. For example, Miller [1956] published a paper about the number of things users can keep in mind at one time ('The Magical Number Seven, Plus or Minus Two'). It is the number of 'chunks' of things on screen which the user is expected to be aware of; we will meet Miller again in the analytic aspect. (A sketch of how a designer might use this is given below.)

Examples: "There's more information on this page than the previous one", "There's a lot of red on this screen", "5 apps running".

Exercises: Be aware, over a day's use, of how often you are aware of quantities.
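One simple way a designer might use Miller's observation is as a rule-of-thumb check on how many chunks (menu entries, toolbar groups, etc.) the user is asked to hold in mind at once. The following is a minimal sketch in Python; the function name and the choice to apply the 7-plus-or-minus-2 figure to menus are illustrative assumptions, not a rule taken from Miller's paper:

# Hypothetical rule-of-thumb check: flag any group of on-screen 'chunks'
# that exceeds Miller's 7 plus-or-minus 2 (i.e. more than 9 items).

def check_chunk_counts(groups, limit=9):
    """groups: dict mapping a group name to its list of items."""
    warnings = []
    for name, items in groups.items():
        if len(items) > limit:
            warnings.append(f"'{name}' has {len(items)} items; "
                            f"consider splitting it into sub-groups.")
    return warnings

# Illustrative example: menus of a hypothetical application.
menus = {
    "File": ["New", "Open", "Save", "Save As", "Print", "Close", "Exit"],
    "Edit": ["Undo", "Redo", "Cut", "Copy", "Paste", "Find", "Replace",
             "Select All", "Preferences", "Insert Symbol", "Clear"],
}

for warning in check_chunk_counts(menus):
    print(warning)

Such a check is only a prelude to the analytic and formative aspects, where we ask what the chunks mean and how they should be grouped.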
How do I design for this?: Usually no need, since the quantitative aspect is usually in service of others.

Going deeper (Extant ideas): -

V-3. SCREEN LAYOUT (Spatial Aspect of EwT/HCI)

The spatial aspect of EwT/HCI is particularly important on screens. For example:

» The layout of the screen: where things are, and where we expect them to be; for example, navigation links on a website are collected together in one place.
» The shape of things on screen. Usually rectangles for pictures; but the shape of icons can also help us recognise and locate them quickly.
» Spatial relationships and arrangements on screen. For example, we expect that things that line up above each other have something in common, such as in a list or table.

The importance of these is related to other aspects, as will be explained later. The mouse is excellent at functioning spatially: the mouse pointer indicates an exact position.

V-3.1 3D and 2D Space

Think about a virtual reality scene (e.g. in a 3D computer game) on your screen. There are two spatial aspects here: the positions and shapes on the screen itself, which are all in two dimensions, and the scene, which is in three dimensions. These are both spatial aspects, but one is EwT/HCI and the other is EMC.

» The two-dimensional space on the screen itself is EwT/HCI.
» The three-dimensional space of the virtual scene is EMC, because it is what the information is about, namely 3D space.

However, in a two-dimensional game or a map, both EwT/HCI and EMC are two-dimensional - which can sometimes lead to confusion. The spatial aspect is very important in EwT/HCI, but we usually need the visual psychic channel to see it; see below.

Examples: See above.

Exercises: Notice how spatial layout on screen helps or hinders you. Think what layout would be better.

How do I design for this?: See exercise.

Going deeper (Extant ideas):

V-4. ANIMATION, MOVEMENT AND MOBILITY (Kinematic Aspect of EwT/HCI)

The kinematic functioning at the UI is evident in animation, in movement of our eyes or fingers and, with mobile devices, in our moving around with the device.

Animation. Our visual psychic channel is particularly sensitive to movement, so we tend to notice it, and movement is therefore often used to attract (or distract!) attention. However, that is a matter for the psychic aspect below. The kinematic aspect itself is concerned with movement as such, as in the examples below. Animation is relevant to EwT/HCI in at least three ways. It can be used during visual output to attract attention (such as those annoying advertisements!). It can be used as decoration, to make the visual interface more 'lively'. But more important than either of these is the use of movement to let the user know what is going on. For example, in user interfaces of the past few years, windows on screen have not just appeared, but have moved smoothly into and out of view, either expanding from where the user clicked the mouse or moving around the screen. Such movement provides subliminal information to the user about what the computer or mobile phone is doing, and this provides comfort (a sketch of such movement is given below).

Movement of the user's organs. For example, the eyes move across the screen, and eye-tracking devices are available, such as in aircraft cockpits, to take account of where the pilot is looking. In computer games, movement of the hands can be tracked by a webcam or by accelerometers in hand-held devices. More about this in the biotic aspect.

Movement of the user of a mobile device: the position of the user is tracked by the global positioning system (GPS).
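As an illustration of such smooth movement, here is a minimal sketch in Python of how a window might be animated from the point where the user clicked to its final position, using an 'ease-in-out' interpolation so it starts and finishes gently. The frame count, positions and function names are illustrative assumptions, not how any particular toolkit does it:

# Minimal sketch: animate a window's top-left corner from the click
# point to its final position, easing in and out for smooth movement.

def ease_in_out(t):
    """Map linear time t in [0, 1] to an eased value in [0, 1]."""
    return 3 * t**2 - 2 * t**3          # smoothstep: gentle start and finish

def animate_window(start, end, frames=30):
    """Yield intermediate (x, y) positions over the given number of frames."""
    (x0, y0), (x1, y1) = start, end
    for frame in range(frames + 1):
        t = ease_in_out(frame / frames)
        yield (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)

# Illustrative use: a window expanding away from a mouse click at (400, 300)
# towards its final position at (100, 80).
for x, y in animate_window(start=(400, 300), end=(100, 80), frames=5):
    print(f"draw window at ({x:.0f}, {y:.0f})")

In a real UI toolkit the drawing is done by the toolkit itself; the point is simply that the steady, eased change of position is what gives the user the comforting sense of what is going on.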
Some of these will be picked up again in the psychic functioning in the UI below, because the kinematic aspect is rarely used for its own sake.

Examples:
» Animation.
- Movement of the mouse pointer.
- Movement of objects across a screen (e.g. the piece of paper in the Microsoft file-copying animation).
- Movement of our view across a landscape, e.g. as though we are flying across it.
- Flowing movement, e.g. of fluids in pipes, shown on screen.
- Non-visual movement includes: in music there is movement through the piece from beginning to end; in a document there is movement from beginning to end for the reader.
» Movement of organs. Movement of hands and fingers controlling a mouse or on a tablet. Movement of eyes - eye-tracking software. Movement of hands tracked by webcam in computer games.

Exercises: Notice, through the course of a day or week, how often movement occurs on screen, and what it means in each case.

How do I design for this?: Creating animations takes skill to make them smooth; see the psychic aspect.

Going deeper (Extant ideas): -

V-5. HARDWARE 1: MATERIALS (Physical Aspect)

This is the aspect of what is often called the basic technology. It concerns, for example:

» materials,
» electricity,
» magnetism,
» light,
» vibration,
» shocks,
» and the like.

For example, your visual output might involve the physics of electron beams travelling through a vacuum tube to hit phosphor dots on a glass surface, causing them to emit light: the cathode ray tube (CRT) used in most screens up until a few years ago. LCD screens, such as in a mobile phone, operate by other physical principles, specifically altering the orientation of complex crystals (LCD: liquid crystal display) so that they let light pass, or not. CRT and LCD have one thing in common: both produce colours by means of a triple of light-emitting dots - red, green and blue; by giving out different amounts of these three colours, almost all possible colours can be generated. Loudspeakers work by converting electrical signals into vibrations in the air, either by electromagnetic or piezoelectric effects (applying an electric field to certain crystals contracts them).

Only seldom do we need to actively consider the physical aspect of EwT/HCI, because in normal circumstances the physics works so well and reliably that we can take it for granted. Giving attention to the physical aspect is useful for at least two reasons. One is that it enables us to understand how things work, so we can perhaps understand better what is required, such as ruggedized equipment that must work in exceptional physical conditions such as in space. The other important reason why we should be aware of the physical aspect of EwT/HCI is when things go wrong; some examples are given below. Moral: keep good backups, and take care of your equipment!

Examples: Physical properties like weight, robustness, friction, stiffness, etc. are used in the exercise below. Things that can go wrong that are meaningful in the physical aspect:

» Power cuts!
» A mouse ball on a slippery surface works unreliably.
» Jam sandwiches have made your children's hands sticky - no wonder the mouse and keys end up all sticky!
» When the internal mechanism of some keys or buttons is worn or springs become weak, they don't work reliably.
» Coffee spilled on the keyboard is not good for it!
» Heat melts the case of your mouse or keyboard, distorting it.
» If your data is stored on magnetic disk (e.g. a floppy disk), then the data can be lost if a magnet gets near it.
» Bending a CD or DVD destroys it.
» Physical shock, such as dropping your computer or mobile phone, can make it go wrong.
» Overheating can make it malfunction, so do not block the air vents.
» A lightning strike can destroy the electronics of your computer.
» Fire can burn it all.

Exercises: Think about the above physical characteristics of your mobile phone or laptop. Why is each important?

How do I design for this?: That is the realm of computer and mobile phone manufacturers.

Going deeper (Extant ideas): -

V-6. HARDWARE 2: DEVICES MATCHING BODILY CHARACTERISTICS (Biotic-Organic Aspect of EwT/HCI)

In EwT/HCI, the biotic/organic aspect is concerned with the actual hardware devices that engage with our sense organs - eyes, ears, hands etc. - regardless of what physics they employ. In Figure 1 we have:

» Ears: loudspeaker.
» Eyes: screen.
» Hand: mouse, trackball, joystick; swipe-card; hand-held camera.
» Fingers: keyboard, touch-screen, touch-pad; switches, buttons; keyboards for punched cards or tape.
» Mouth, vocal organs: microphone.
» Body: force and vibration actuators ('haptic').
» Direct to nerves: tiny wires inserted in the brain.

(See details below.)

{*** Example: Think about your mobile phone, and how well or badly it fits your hand, fingers, and the distance between ear and mouth. These issues are of the organic/biotic aspect. ***}

From this aspect it matters little what material the mouse is made of; what matters is whether it fits the hand well: imagine a mouse the size of a desk - it would be unusable as a mouse!

Figure 1. Computer with input and output hardware

The organic/biotic aspect is also the realm of electronics (rather than electricity). Seen from this aspect, the various devices work by electromotive forces (EMF, measured in volts) and currents (measured in amps) operating in conductors and components. Much of this is digital electronics, in which the EMFs (voltages) are limited to two values, such as 2.7V and 3.3V, or 0V and 5V. These represent the binary alternatives of on and off (or 1 and 0), when seen from the psychic/sensitive aspect, later. But in the UI devices there is also some analog electronics, in which a continuous range of EMFs is operative. For more on this, see books on computer electronics. Here we will look only at the larger-scale devices.

So Figure 1 shows the electronics that serves these hardware devices (sound hardware to convert digital voltages into analog for the loudspeaker, display hardware to convert digital voltages into analog to drive the thousands of microscopic light-emitting devices that make up the screen, and an analog-to-digital convertor for the microphone). These are linked to the central processing unit (CPU) and memory of the computer by conductors (the bus is a multiple conductor).

V-6.1 The Organic Aspect of Input and Output

If we see the computer and its user interface in terms of model-view-controller, then the model in this aspect is the innards of the computer, including the printed circuit boards of the main memory and central processing unit, and the disks. This is shown in Figure 1. Three different channels of output (view) hardware link with three different human organs:

» Visual channel, relating to our eyes.
» Aural channel, relating to our ears.
» Haptic channel, relating to our body (often our hands).

The input (controller) hardware usually links with our hands and fingers, though there is also some sound input via microphones.
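Before looking at the output and input hardware in detail, here is a tiny sketch in Python relating to the digital electronics mentioned above, showing what 'limited to two values' means: a measured voltage is interpreted as a binary 0 or 1. The threshold figures are illustrative assumptions for a nominal 0V/5V system, not the specification of any particular chip:

# Illustrative sketch: interpreting an analog voltage as a digital bit.
# Thresholds are assumed figures for a nominal 0V / 5V logic family.

LOW_MAX = 0.8    # anything at or below this is treated as logic 0
HIGH_MIN = 2.0   # anything at or above this is treated as logic 1

def logic_level(voltage):
    if voltage <= LOW_MAX:
        return 0
    if voltage >= HIGH_MIN:
        return 1
    return None   # in between: undefined, the circuit cannot rely on it

for v in [0.1, 0.5, 1.4, 3.3, 5.0]:
    print(f"{v:>4} V -> {logic_level(v)}")

The region between the two thresholds is deliberately treated as undefined; digital electronics is designed so that signals spend as little time as possible there.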
The view (output) consists of the screen and the electronics responsible for the visual display on screen, the loudspeakers and sound electronics, and the force actuators and the electronics that controls them.

» The visual display electronics consists of the display hardware (e.g. a graphics card), which accesses some of the main computer memory and converts the digital electric charges it finds there into analog voltages to drive tiny light-emitting cells. These cells emit light that is either red, green or blue, and the intensity of the light is controlled by the EMF (voltage) fed to them: the greater the EMF, the brighter the light emitted. As the EMF varies, so the amount of light varies. The tiny light-emitters are grouped in triples (red, green, blue), and there are typically a million such triples arranged in an array in a modern visual display, and 200,000 in a mobile phone display.
» The sound electronics consists of sound hardware (e.g. a sound card) that converts some of the electric charges found in the computer's main memory into a stream of analog current that is fed through the coils of the loudspeakers. This current alternates rapidly, usually at frequencies of between 300 and 3000 times per second, and these make the coil and cone of the loudspeakers vibrate at a frequency that is audible to the human ear.
» The haptic force actuators press on our hands as vibration, or control movable seating (such as in immersive cinema). The electronics that controls this receives varying EMF from the central processing unit, and converts this into powerful currents sent through coils operating in magnetic fields, which create movement. (It is similar to loudspeakers but operating at lower frequencies.)

(The purpose of these tiny triple light-emitters cannot be understood from the point of view of the organic aspect, but only from the point of view of the psychic/sensory aspect. From that aspect we note that each different combination of red, green and blue light gives us the sensory experience of seeing a different colour. Most colours apart from flesh tones can be faithfully composed in this way. This illustrates how the biotic/organic aspect anticipates later aspects.)

The controller (input) consists of the input devices like mouse, keys and touch-sensitive screen, and the electronics responsible for linking these with the computer.

V-6.2 Input Hardware Devices

We list input devices in approximate chronological order, older first (but sometimes still used). We indicate in bold text what the user does with each, though strictly each of these is of the psychic aspect.

» Switches and plug boards: Early computers received their information by people setting switches or plug-boards (boards with lots of holes into which plugs were inserted). These operate by making contact between circuits of the computer. Suitable for manufacturing process control, e.g. measuring temperature, pressure, liquid level. Users throw the switches or plug the board's holes.
» Paper tape reader: reads holes in punched paper tape by means of photoelectric cells. The tape can be any length. People have to punch the tape ahead of it being read and feed it into the tape reader. Suitable for batch, not interactive, input.
» Card reader: reads holes in punched cards, similar to paper tape, but each card has 80 columns. People have to punch the cards ahead of them being read and feed them into the card reader. Suitable for batch, not interactive, input.
» Swipe card reader: a modern version of the punched card reader, where the information is held not by punched holes but in magnetic strips, or in bar codes that are read by lasers. The user either swipes the card or holds it for the laser to read (as at supermarket checkouts).
» Keyboard: The human user presses or hits keys and these send electric pulses to the computer. Suitable for interactive input. The original version (1970s Teletype) was like an electric typewriter; today's versions have much more sensitive keys.
» Joystick: The user pushes a stick in one of 8 directions; this causes various switches to close and send electric pulses to the computer, a different pattern of pulses for each direction. Some 'analogue' joysticks also send pulses that indicate how far the stick is pushed. Most joysticks also have a couple of keyboard-like switches that can be pressed or hit.
» Mouse: The user moves the mouse, and this sends a stream of electric pulses to the computer that indicate how the mouse moves in two directions. (By this means, in its psychic functioning, the computer keeps track of where the mouse is.) Like the joystick, the mouse usually also has two keyboard-like switches that can be pressed or hit.
» Trackball: Like an upside-down mouse, with a ball which the user moves around in any direction; the trackball sends a stream of electric pulses to the computer that indicates its movement in two directions. One advantage over the mouse is greater precision for fine movement.
» Touch-pad and touch-screen: The user touches or strokes a pad (e.g. on a laptop) or the screen (on a tablet or mobile phone); this detects where on the pad or screen the finger was placed, and sends a stream of electric pulses to the computer whose pattern indicates this position. Used almost like a mouse. The difference between touch-pad and touch-screen is meaningful in the psychic aspect, not the organic.
» Microphone input: The user speaks, and the sound waves detected by the microphone are converted first to electric waveforms, which are then converted to streams of (digital) pulses that are sent to the computer.
» Camera: The user points the camera at a view, or at themselves. Light from the view is focused by a lens onto an array of light-sensitive cells. These (or the electronics connected to them) emit pulses which are sent to the computer; typically 10 million pulses per picture.
» Devices to measure physical properties: accelerometer (measures shakes and vibrations), magnetometer (compass), gyroscope (orientation), GPS chip (position on Earth). Most of these are found in mobile phones and tablets.
» Direct connection to nerves: Tiny electrodes are inserted into nerve cells and pick up their electrical activity. When the user thinks, these nerve cells might be activated, and this activation is converted to electric pulses that are sent to the computer. This type of input device is still only experimental.

Thus all input devices except the first (switches and plug-boards) send electric pulses to the computer. What the computer does with these cannot easily be described from the point of view of the biotic (hardware) aspect, but makes sense only at the psychic aspect; see below.

Examples: See the lists above.

Exercises: Make a list of which of the above hardware UI devices you make use of over the course of a week.

How do I design for this?: This is the realm of electronic and mechanical engineers, and of the ergonomist, who ensures that hardware devices fit human physical characteristics.

Going deeper (Extant ideas): The whole field of ergonomics considers this.
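The stream of pulses from a mouse essentially reports small movements: how far it has moved left/right and up/down since the last report. As a rough sketch in Python of how such movement reports might be accumulated into a pointer position on screen (the function names and the screen size are illustrative assumptions, not how any particular driver works):

# Minimal sketch: accumulating a mouse's movement reports (dx, dy)
# into a pointer position, clamped to an assumed 1280 x 1024 screen.

SCREEN_WIDTH, SCREEN_HEIGHT = 1280, 1024

def track_pointer(movements, start=(640, 512)):
    """movements: sequence of (dx, dy) reports from the mouse."""
    x, y = start
    for dx, dy in movements:
        x = min(max(x + dx, 0), SCREEN_WIDTH - 1)    # keep pointer on screen
        y = min(max(y + dy, 0), SCREEN_HEIGHT - 1)
        yield (x, y)

# Illustrative stream of movement reports: right, right, down-left, up.
for position in track_pointer([(10, 0), (25, 0), (-5, 12), (0, -8)]):
    print("pointer at", position)

Strictly, this keeping-track belongs to the psychic functioning of the computer, as noted above; the hardware itself only delivers the pulses.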
V-6.3 Output Hardware Devices

» Screen: An array of tiny pixels (e.g. 1280 by 1024) that emit light of various colours. These can be: phosphorescent dots in a cathode ray tube, liquid crystal cells (LCD) that filter light to various colours, or plasma emitters. This gives output for the visual channel.
» Loudspeaker: A strong, light paper or plastic cone that vibrates at frequencies of up to 20,000 times a second to cause air vibrations that impact on our ears. The cone is vibrated by alternating electric currents running through a small coil suspended in a strong magnetic field. These alternating currents are created, via an electronic device known as a digital-to-analogue convertor, from electric pulses sent from the computer. Typically the computer sends 0.5 million pulses per second to each speaker (16 pulses, 30,000 times a second). This gives output for the aural channel.
» Force and vibration actuator device: The force-feedback joystick not only sends pulses to the computer, but also has tiny powerful electric magnets attached that can be activated by the computer to provide force on the user's hand holding the stick. The force can be steady, vibratory or an impact. The electric currents that cause the force are converted from streams of electric pulses sent from the computer. Another form of force output device is the 'dataglove'. This glove has a number of cells that exert force at various points of the hand, in an attempt to make the hand feel as though it is touching something. This gives output for the haptic channel.
» Direct connection to nerves: The computer sends electric pulses to tiny electrodes implanted in nerve cells, and this activates those nerve cells, causing the user to be aware of various things such as snatches of music or a feeling of sadness. This type of output device is still only experimental, and there are many safety features to design.

The electric pulses the computer sends to these devices cannot be properly understood until we take account of the psychic aspect.

Examples: See above.

Exercises: During the course of a week of using your mobile phone, observe how often, and for how long, you gain information through each channel. (Remember the vibrator on the phone, if you use it.) Then do the same when using your laptop or desktop. Then do the same when using a games console. Compare and contrast, and then think why there might be differences. Researching this in more depth might be a good topic for your final dissertation.

How do I design for this?: Do not assume everything must go onto the screen. Think especially how you can use the sound channel effectively. See later aspects for what to take into account to use each channel.

Going deeper (Extant ideas): -

V-6.4 The Innards of the Computer

So far we have considered the user interface, but there is much else with which the user engages, in the organic aspect, on the inside of the computer (the 'innards'). Some of it has informational importance, as do the UI devices above, but some does not.

Examples: Components with informational importance (often in mobile phones and tablets):

» Accelerometer in mobile phones, which detects when we shake the device.
» The hardware component which detects orientation in a tablet, so that the picture flips the right way round.
» The GPS, which detects position on the Earth.

Components without informational importance:

» Connectors - can cause havoc if faulty.
» The hinge on a laptop lid, which might break.
» Wires - which might break.
» Battery - which needs replacing sometimes.
» SD or SIMM card, which is inserted for memory.

Exercises: Make a list of the internal components that you interact with over the course of a week. If you have a tablet, think of all the little design features that have been included to make your use of it a more pleasurable experience. And what features make it annoying?

How do I design for this?: Apple Corporation is particularly good at giving attention to design things without informational importance, such as the style of the case, and how to reduce battery consumption.

Going deeper (Extant ideas):

V-6.5 Challenges and Problems

The kinds of challenges and problems explainable at this level are those to do with the hardware, our fingers, etc. and the electronics. Here are some examples:

Examples:
» Loose connections! Devices are plugged into the computer - keyboard, mouse, loudspeakers, screen, etc. - and if the connection is dirty, they won't work reliably. This is particularly important in the most demanding public multimedia, where the problems of loose connections cannot be tolerated.
» Our fingers get onto the wrong keys, and so we mis-key.
» The computer's main memory has a maximum frequency (speed) at which it can deliver, or receive and store, the electric charges by which it works. They are delivered via a special set of conductors called a bus, or by 'direct memory access'. The central processor, display hardware and sound hardware must all share this bus. Sometimes up to half the available frequency is used by the display and sound hardware, leaving only half for the central processing unit - this slows the CPU down by a factor of two, or even more. This is particularly important in high-quality multimedia that has a lot of high-definition animation and sound.

Exercises: {*** Discuss: Consider each of the devices listed at the start of this section and think / discuss what might go wrong with each. ***}

How do I design for this?:

Going deeper (Extant ideas):

Copyright (c) Andrew Basden & Janice Whatley. 16 September 2008, 18 October 2008, 3 September 2009, 22 September 2009, 25 November 2009, 20 September 2010, 14 September 2011, 14 August 2012, 17 September 2012.