- •Preface
- •Contents
- •1 Disability and Assistive Technology Systems
- •Learning Objectives
- •1.1 The Social Context of Disability
- •1.2 Assistive Technology Outcomes: Quality of Life
- •1.2.1 Some General Issues
- •1.2.2 Definition and Measurement of Quality of Life
- •1.2.3 Health Related Quality of Life Measurement
- •1.2.4 Assistive Technology Quality of Life Procedures
- •1.2.5 Summary and Conclusions
- •1.3 Modelling Assistive Technology Systems
- •1.3.1 Modelling Approaches: A Review
- •1.3.2 Modelling Human Activities
- •1.4 The Comprehensive Assistive Technology (CAT) Model
- •1.4.1 Justification of the Choice of Model
- •1.4.2 The Structure of the CAT Model
- •1.5 Using the Comprehensive Assistive Technology Model
- •1.5.1 Using the Activity Attribute of the CAT Model to Determine Gaps in Assistive Technology Provision
- •1.5.2 Conceptual Structure of Assistive Technology Systems
- •1.5.3 Investigating Assistive Technology Systems
- •1.5.4 Analysis of Assistive Technology Systems
- •1.5.5 Synthesis of Assistive Technology Systems
- •1.6 Chapter Summary
- •Questions
- •Projects
- •References
- •2 Perception, the Eye and Assistive Technology Issues
- •Learning Objectives
- •2.1 Perception
- •2.1.1 Introduction
- •2.1.2 Common Laws and Properties of the Different Senses
- •2.1.3 Multisensory Perception
- •2.1.4 Multisensory Perception in the Superior Colliculus
- •2.1.5 Studies of Multisensory Perception
- •2.2 The Visual System
- •2.2.1 Introduction
- •2.2.2 The Lens
- •2.2.3 The Iris and Pupil
- •2.2.4 Intraocular Pressure
- •2.2.5 Extraocular Muscles
- •2.2.6 Eyelids and Tears
- •2.3 Visual Processing in the Retina, Lateral Geniculate Nucleus and the Brain
- •2.3.1 Nerve Cells
- •2.3.2 The Retina
- •2.3.3 The Optic Nerve, Optic Tract and Optic Radiation
- •2.3.4 The Lateral Geniculate Body or Nucleus
- •2.3.5 The Primary Visual or Striate Cortex
- •2.3.6 The Extrastriate Visual Cortex and the Superior Colliculus
- •2.3.7 Visual Pathways
- •2.4 Vision in Action
- •2.4.1 Image Formation
- •2.4.2 Accommodation
- •2.4.3 Response to Light
- •2.4.4 Colour Vision
- •2.4.5 Binocular Vision and Stereopsis
- •2.5 Visual Impairment and Assistive Technology
- •2.5.1 Demographics of Visual Impairment
- •2.5.2 Illustrations of Some Types of Visual Impairment
- •2.5.3 Further Types of Visual Impairment
- •2.5.4 Colour Blindness
- •2.5.5 Corrective Lenses
- •2.6 Chapter Summary
- •Questions
- •Projects
- •References
- •3 Sight Measurement
- •Learning Objectives
- •3.1 Introduction
- •3.2 Visual Acuity
- •3.2.1 Using the Chart
- •3.2.2 Variations in Measuring Visual Acuity
- •3.3 Field of Vision Tests
- •3.3.1 The Normal Visual Field
- •3.3.2 The Tangent Screen
- •3.3.3 Kinetic Perimetry
- •3.3.4 Static Perimetry
- •3.4 Pressure Measurement
- •3.5 Biometry
- •3.6 Ocular Examination
- •3.7 Optical Coherence Tomography
- •3.7.1 Echo Delay
- •3.7.2 Low Coherence Interferometry
- •3.7.3 An OCT Scanner
- •3.8 Ocular Electrophysiology
- •3.8.1 The Electrooculogram (EOG)
- •3.8.2 The Electroretinogram (ERG)
- •3.8.3 The Pattern Electroretinogram
- •3.8.4 The Visual Evoked Cortical Potential
- •3.8.5 Multifocal Electrophysiology
- •3.9 Chapter Summary
- •Glossary
- •Questions
- •Projects
- •4 Haptics as a Substitute for Vision
- •Learning Objectives
- •4.1 Introduction
- •4.1.1 Physiological Basis
- •4.1.2 Passive Touch, Active Touch and Haptics
- •4.1.3 Exploratory Procedures
- •4.2 Vision and Haptics Compared
- •4.3 The Capacity of Bare Fingers in Real Environments
- •4.3.1 Visually Impaired People’s Use of Haptics Without any Technical Aid
- •4.3.2 Speech Perceived by Hard-of-hearing People Using Bare Hands
- •4.3.3 Natural Capacity of Touch and Evaluation of Technical Aids
- •4.4 Haptic Low-tech Aids
- •4.4.1 The Long Cane
- •4.4.2 The Guide Dog
- •4.4.3 Braille
- •4.4.4 Embossed Pictures
- •4.4.5 The Main Lesson from Low-tech Aids
- •4.5 Matrices of Point Stimuli
- •4.5.1 Aids for Orientation and Mobility
- •4.5.2 Aids for Reading Text
- •4.5.3 Aids for Reading Pictures
- •4.6 Computer-based Aids for Graphical Information
- •4.6.1 Aids for Graphical User Interfaces
- •4.6.2 Tactile Computer Mouse
- •4.7 Haptic Displays
- •4.7.1 Information Available via a Haptic Display
- •4.7.2 What Information Can Be Obtained with the Reduced Information?
- •4.7.3 Haptic Displays as Aids for the Visually Impaired
- •4.8 Chapter Summary
- •4.9 Concluding Remarks
- •Questions
- •Projects
- •References
- •5 Mobility: An Overview
- •Learning Objectives
- •5.1 Introduction
- •5.2 The Travel Activity
- •5.2.1 Understanding Mobility
- •5.2.2 Assistive Technology Systems for the Travel Process
- •5.3 The Historical Development of Travel Aids for Visually Impaired and Blind People
- •5.4 Obstacle Avoidance AT: Guide Dogs and Robotic Guide Walkers
- •5.4.1 Guide Dogs
- •5.4.2 Robotic Guides and Walkers
- •5.5 Obstacle Avoidance AT: Canes
- •5.5.1 Long Canes
- •5.5.2 Technology Canes
- •5.6 Other Mobility Assistive Technology Approaches
- •5.6.1 Clear-path Indicators
- •5.6.2 Obstacle and Object Location Detectors
- •5.6.3 The vOICe System
- •5.7 Orientation Assistive Technology Systems
- •5.7.1 Global Positioning System Orientation Technology
- •5.7.2 Other Technology Options for Orientation Systems
- •5.8 Accessible Environments
- •5.9 Chapter Summary
- •Questions
- •Projects
- •References
- •6 Mobility AT: The Batcane (UltraCane)
- •Learning Objectives
- •6.1 Mobility Background and Introduction
- •6.2 Principles of Ultrasonics
- •6.2.1 Ultrasonic Waves
- •6.2.2 Attenuation and Reflection Interactions
- •6.2.3 Transducer Geometry
- •6.3 Bats and Signal Processing
- •6.3.1 Principles of Bat Sonar
- •6.3.2 Echolocation Call Structures
- •6.3.3 Signal Processing Capabilities
- •6.3.4 Applicability of Bat Echolocation to Sonar System Design
- •6.4 Design and Construction Issues
- •6.4.1 Outline Requirement Specification
- •6.4.2 Ultrasonic Spatial Sensor Subsystem
- •6.4.3 Trial Prototype Spatial Sensor Arrangement
- •6.4.4 Tactile User Interface Subsystem
- •6.4.5 Cognitive Mapping
- •6.4.6 Embedded Processing Control Requirements
- •6.5 Concept Phase and Engineering Prototype Phase Trials
- •6.6 Case Study in Commercialisation
- •6.7 Chapter Summary
- •Questions
- •Projects
- •References
- •7 Navigation AT: Context-aware Computing
- •Learning objectives
- •7.1 Defining the Orientation/Navigation Problem
- •7.1.1 Orientation, Mobility and Navigation
- •7.1.2 Traditional Mobility Aids
- •7.1.3 Limitations of Traditional Aids
- •7.2 Cognitive Maps
- •7.2.1 Learning and Acquiring Spatial Information
- •7.2.2 Factors that Influence How Knowledge Is Acquired
- •7.2.3 The Structure and Form of Cognitive Maps
- •7.3 Overview of Existing Technologies
- •7.3.1 Technologies for Distant Navigation
- •7.3.2 User Interface Output Technologies
- •7.4 Principles of Mobile Context-aware Computing
- •7.4.1 Adding Context to User-computer Interaction
- •7.4.2 Acquiring Useful Contextual Information
- •7.4.3 Capabilities of Context-awareness
- •7.4.4 Application of Context-aware Principles
- •7.4.5 Technological Challenges and Unresolved Usability Issues
- •7.5 Test Procedures
- •7.5.1 Human Computer Interaction (HCI)
- •7.5.2 Cognitive Mapping
- •7.5.3 Overall Approach
- •7.6 Future Positioning Technologies
- •7.7 Chapter Summary
- •7.7.1 Conclusions
- •Questions
- •Projects
- •References
- •Learning Objectives
- •8.1 Defining the Navigation Problem
- •8.1.1 What is the Importance of Location Information?
- •8.1.2 What Mobility Tools and Traditional Maps are Available for the Blind?
- •8.2 Principles of Global Positioning Systems
- •8.2.1 What is the Global Positioning System?
- •8.2.2 Accuracy of GPS: Some General Issues
- •8.2.3 Accuracy of GPS: Some Technical Issues
- •8.2.4 Frequency Spectrum of GPS, Present and Future
- •8.2.5 Other GPS Systems
- •8.3 Application of GPS Principles
- •8.4 Design Issues
- •8.5 Development Issues
- •8.5.1 Choosing an Appropriate Platform
- •8.5.2 Choosing the GPS Receiver
- •8.5.3 Creating a Packaged System
- •8.5.4 Integration vs Stand-alone
- •8.6 User Interface Design Issues
- •8.6.1 How to Present the Information
- •8.6.2 When to Present the Information
- •8.6.3 What Information to Present
- •8.7 Test Procedures and Results
- •8.8 Case Study in Commercialisation
- •8.8.1 Understanding the Value of the Technology
- •8.8.2 Limitations of the Technology
- •8.8.3 Ongoing Development
- •8.9 Chapter Summary
- •Questions
- •Projects
- •References
- •9 Electronic Travel Aids: An Assessment
- •Learning Objectives
- •9.1 Introduction
- •9.2 Why Do an Assessment?
- •9.3 Methodologies for Assessments of Electronic Travel Aids
- •9.3.1 Eliciting User Requirements
- •9.3.2 Developing a User Requirements Specification and Heuristic Evaluation
- •9.3.3 Hands-on Assessments
- •9.3.4 Methodology Used for Assessments in this Chapter
- •9.4 Modern-day Electronic Travel Aids
- •9.4.1 The Distinction Between Mobility and Navigation Aids
- •9.4.2 The Distinction Between Primary and Secondary Aids
- •9.4.3 User Requirements: Mobility and Navigation Aids
- •9.4.4 Mobility Aids
- •9.4.5 Mobility Aids: Have They Solved the Mobility Challenge?
- •9.4.6 Navigation Aids
- •9.4.7 Navigation Aids: Have They Solved the Navigation Challenge?
- •9.5 Training
- •9.6 Chapter Summary and Conclusions
- •Questions
- •Projects
- •References
- •10 Accessible Environments
- •Learning Objectives
- •10.1 Introduction
- •10.1.1 Legislative and Regulatory Framework
- •10.1.2 Accessible Environments: An Overview
- •10.1.3 Principles for the Design of Accessible Environments
- •10.2 Physical Environments: The Streetscape
- •10.2.1 Pavements and Pathways
- •10.2.2 Road Crossings
- •10.2.3 Bollards and Street Furniture
- •10.3 Physical Environments: Buildings
- •10.3.1 General Exterior Issues
- •10.3.2 General Interior Issues
- •10.3.4 Signs and Notices
- •10.3.5 Interior Building Services
- •10.4 Environmental Information and Navigation Technologies
- •10.4.1 Audio Information System: General Issues
- •10.4.2 Some Technologies for Environmental Information Systems
- •10.5 Accessible Public Transport
- •10.5.1 Accessible Public Transportation: Design Issues
- •10.6 Chapter Summary
- •Questions
- •Projects
- •References
- •11 Accessible Bus System: A Bluetooth Application
- •Learning Objectives
- •11.1 Introduction
- •11.2 Bluetooth Fundamentals
- •11.2.1 Brief History of Bluetooth
- •11.2.2 Bluetooth Power Class
- •11.2.3 Protocol Stack
- •11.2.4 Bluetooth Profile
- •11.2.5 Piconet
- •11.3 Design Issues
- •11.3.1 System Architecture
- •11.3.2 Hardware Requirements
- •11.3.3 Software Requirements
- •11.4 Developmental Issues
- •11.4.1 Bluetooth Server
- •11.4.2 Bluetooth Client (Mobile Device)
- •11.4.3 User Interface
- •11.5 Commercialisation Issues
- •11.6 Chapter Summary
- •Questions
- •Projects
- •References
- •12 Accessible Information: An Overview
- •Learning Objectives
- •12.1 Introduction
- •12.2 Low Vision Aids
- •12.2.1 Basic Principles
- •12.3 Low Vision Assistive Technology Systems
- •12.3.1 Large Print
- •12.3.2 Closed Circuit Television Systems
- •12.3.3 Video Magnifiers
- •12.3.4 Telescopic Assistive Systems
- •12.4 Audio-transcription of Printed Information
- •12.4.1 Stand-alone Reading Systems
- •12.4.2 Read IT Project
- •12.5 Tactile Access to Information
- •12.5.1 Braille
- •12.5.2 Moon
- •12.5.3 Braille Devices
- •12.6 Accessible Computer Systems
- •12.6.1 Input Devices
- •12.6.2 Output Devices
- •12.6.3 Computer-based Reading Systems
- •12.6.4 Accessible Portable Computers
- •12.7 Accessible Internet
- •12.7.1 World Wide Web Guidelines
- •12.7.2 Guidelines for Web Authoring Tools
- •12.7.3 Accessible Adobe Portable Document Format (PDF) Documents
- •12.7.4 Bobby Approval
- •12.8 Telecommunications
- •12.8.1 Voice Dialling General Principles
- •12.8.2 Talking Caller ID
- •12.8.3 Mobile Telephones
- •12.9 Chapter Summary
- •Questions
- •Projects
- •References
- •13 Screen Readers and Screen Magnifiers
- •Learning Objectives
- •13.1 Introduction
- •13.2 Overview of Chapter
- •13.3 Interacting with a Graphical User Interface
- •13.4 Screen Magnifiers
- •13.4.1 Overview
- •13.4.2 Magnification Modes
- •13.4.3 Other Interface Considerations
- •13.4.4 The Architecture and Implementation of Screen Magnifiers
- •13.5 Screen Readers
- •13.5.1 Overview
- •13.5.2 The Architecture and Implementation of a Screen Reader
- •13.5.3 Using a Braille Display
- •13.5.4 User Interface Issues
- •13.6 Hybrid Screen Reader Magnifiers
- •13.7 Self-magnifying Applications
- •13.8 Self-voicing Applications
- •13.9 Application Adaptors
- •13.10 Chapter Summary
- •Questions
- •Projects
- •References
- •14 Speech, Text and Braille Conversion Technology
- •Learning Objectives
- •14.1 Introduction
- •14.1.1 Introducing Mode Conversion
- •14.1.2 Outline of the Chapter
- •14.2 Prerequisites for Speech and Text Conversion Technology
- •14.2.1 The Spectral Structure of Speech
- •14.2.2 The Hierarchical Structure of Spoken Language
- •14.2.3 Prosody
- •14.3 Speech-to-text Conversion
- •14.3.1 Principles of Pattern Recognition
- •14.3.2 Principles of Speech Recognition
- •14.3.3 Equipment and Applications
- •14.4 Text-to-speech Conversion
- •14.4.1 Principles of Speech Production
- •14.4.2 Principles of Acoustical Synthesis
- •14.4.3 Equipment and Applications
- •14.5 Braille Conversion
- •14.5.1 Introduction
- •14.5.2 Text-to-Braille Conversion
- •14.5.3 Braille-to-text Conversion
- •14.6 Commercial Equipment and Applications
- •14.6.1 Speech vs Braille
- •14.6.2 Speech Output in Devices for Daily Life
- •14.6.3 Portable Text-based Devices
- •14.6.4 Access to Computers
- •14.6.5 Reading Machines
- •14.6.6 Access to Telecommunication Devices
- •14.7 Discussion and the Future Outlook
- •14.7.1 End-user Studies
- •14.7.2 Discussion and Issues Arising
- •14.7.3 Future Developments
- •Questions
- •Projects
- •References
- •15 Accessing Books and Documents
- •Learning Objectives
- •15.1 Introduction: The Challenge of Accessing the Printed Page
- •15.2 Basics of Optical Character Recognition Technology
- •15.2.1 Details of Optical Character Recognition Technology
- •15.2.2 Practical Issues with Optical Character Recognition Technology
- •15.3 Reading Systems
- •15.4 DAISY Technology
- •15.4.1 DAISY Full Audio Books
- •15.4.2 DAISY Full Text Books
- •15.4.3 DAISY and Other Formats
- •15.5 Players
- •15.6 Accessing Textbooks
- •15.7 Accessing Newspapers
- •15.8 Future Technology Developments
- •15.9 Chapter Summary and Conclusion
- •15.9.1 Chapter Summary
- •15.9.2 Conclusion
- •Questions
- •Projects
- •References
- •Learning Objectives
- •16.1 Introduction
- •16.1.1 Print Impairments
- •16.1.2 Music Notation
- •16.2 Overview of Accessible Music
- •16.2.1 Formats
- •16.2.2 Technical Aspects
- •16.3 Some Recent Initiatives and Projects
- •16.3.2 Play 2
- •16.3.3 Dancing Dots
- •16.3.4 Toccata
- •16.4 Problems to Be Overcome
- •16.4.1 A Content Processing Layer
- •16.4.2 Standardization of Accessible Music Technology
- •16.5 Unifying Accessible Design, Technology and Musical Content
- •16.5.1 Braille Music
- •16.5.2 Talking Music
- •16.6 Conclusions
- •16.6.1 Design for All or Accessibility from Scratch
- •16.6.2 Applying Design for All in Emerging Standards
- •16.6.3 Accessibility in Emerging Technology
- •Questions
- •Projects
- •References
- •17 Assistive Technology for Daily Living
- •Learning Objectives
- •17.1 Introduction
- •17.2 Personal Care
- •17.2.1 Labelling Systems
- •17.2.2 Healthcare Monitoring
- •17.3 Time-keeping, Alarms and Alerting
- •17.3.1 Time-keeping
- •17.3.2 Alarms and Alerting
- •17.4 Food Preparation and Consumption
- •17.4.1 Talking Kitchen Scales
- •17.4.2 Talking Measuring Jug
- •17.4.3 Liquid Level Indicator
- •17.4.4 Talking Microwave Oven
- •17.4.5 Talking Kitchen and Remote Thermometers
- •17.4.6 Braille Salt and Pepper Set
- •17.5 Environmental Control and Use of Appliances
- •17.5.1 Light Probes
- •17.5.2 Colour Probes
- •17.5.3 Talking and Tactile Thermometers and Barometers
- •17.5.4 Using Appliances
- •17.6 Money, Finance and Shopping
- •17.6.1 Mechanical Money Indicators
- •17.6.2 Electronic Money Identifiers
- •17.6.3 Electronic Purse
- •17.6.4 Automatic Teller Machines (ATMs)
- •17.7 Communications and Access to Information: Other Technologies
- •17.7.1 Information Kiosks and Other Self-service Systems
- •17.7.2 Using Smart Cards
- •17.7.3 EZ Access®
- •17.8 Chapter Summary
- •Questions
- •Projects
- •References
- •Learning Objectives
- •18.1 Introduction
- •18.2 Education: Learning and Teaching
- •18.2.1 Accessing Educational Processes and Approaches
- •18.2.2 Educational Technologies, Devices and Tools
- •18.3 Employment
- •18.3.1 Professional and Person-centred
- •18.3.2 Scientific and Technical
- •18.3.3 Administrative and Secretarial
- •18.3.4 Skilled and Non-skilled (Manual) Trades
- •18.3.5 Working Outside
- •18.4 Recreational Activities
- •18.4.1 Accessing the Visual, Audio and Performing Arts
- •18.4.2 Games, Puzzles, Toys and Collecting
- •18.4.3 Holidays and Visits: Museums, Galleries and Heritage Sites
- •18.4.4 Sports and Outdoor Activities
- •18.4.5 DIY, Art and Craft Activities
- •18.5 Chapter Summary
- •Questions
- •Projects
- •References
- •Biographical Sketches of the Contributors
- •Index
304 9 Electronic Travel Aids: An Assessment
arriving close to their destination but being unable to find the entrance, or being unable to find their way if positioning information is temporarily unavailable. As pointed out by Golledge et al. (1998), a system that provides information about landmarks enriches the user’s experience of the environment. If this information is not provided, the device is simply acting as a path-following aid. An additional important factor is that this extra information can boost a person’s confidence; it provides a means for them to check that they are on the right path. This becomes particularly important when we recall the findings of Yerassimou (2002) that fear of becoming lost or disorientated deters many blind or visually impaired people from travelling independently. If navigation aids are to be used by this group of people they must build the user’s confidence.
There is still much research to be done to identify user requirements for electronic travel aids; those presented here are just a beginning. We can, however, see some common themes running through all of the requirements discussed here. These include the wish for safe, hassle-free, dignified independent travel. This leads to a desire for travel aids that are discreet, simple and flexible, and which allow the user choice and control.
Let us now consider a representative selection of electronic travel aids that are available today. The devices discussed here are chosen as examples of electronic travel aids using different types of technologies and serving different purposes.
The first five devices, the Miniguide, the Ultracane, the Laser Cane, the BAT “K” Sonar Cane and the vOICe, are examples of mobility aids. These are aids to assist the user in avoiding hazards. Two of the above devices, the Miniguide and the vOICe, are secondary aids. They augment the information provided by a primary aid, such as a cane or guide dog; they do not replace it. The other three mobility aids, the Ultracane, Laser Cane and the BAT “K” Sonar Cane, are examples of electronic primary mobility aids. They provide the essential information for moving safely through the environment, replacing the information provided by other primary aids such as the long cane or guide dog.
The final two devices, the BrailleNote GPS and the Victor Trekker, are examples of navigation aids. These assist users in identifying their location within the macro environment and finding routes between places. These are secondary aids. They provide different information to that provided by a primary aid and must always be used in conjunction with a primary aid such as the long cane or guide dog.
9.4.4 Mobility Aids
Mobility aids are those that assist the user in identifying a safe path through the immediate environment.
9.4.4.1 The Miniguide
The Miniguide is a truly mini, lightweight and versatile secondary mobility aid. It has two transducers. One emits a short pulse of ultrasound. The ultrasound is reflected back off objects in its path. This reflected ultrasound is received back by the second transducer. The delay between transmitting and receiving back the ultrasound is used to approximate the distance to the object.
Figure 9.2. The Audio Miniguide (© GDP Research http://www.gdp-research.com.au. Used with permission.)
Figure 9.3. The Tactile Miniguide (© GDP Research http://www.gdp-research.com.au. Used with permission.)
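The pulse-echo ranging just described can be sketched in a few lines. This is an illustrative calculation, not the Miniguide’s actual firmware; the speed-of-sound value is a standard approximation for air at room temperature, and the echo travels out to the object and back, so the one-way distance is half the round trip.

```python
# Illustrative sketch of pulse-echo ranging (not actual device firmware).
# The echo delay covers the distance to the object twice (out and back).

SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at ~20 degrees C


def distance_from_echo_delay(delay_s: float) -> float:
    """Return the approximate one-way distance (m) for a given echo delay (s)."""
    round_trip_m = SPEED_OF_SOUND_M_PER_S * delay_s
    return round_trip_m / 2.0


# Example: an echo arriving about 11.66 ms after transmission implies an
# object roughly 2 m away.
print(round(distance_from_echo_delay(0.01166), 2))
```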
Initially there were two versions of the Miniguide. One provided audio output (see Figure 9.2) and the other provided vibratory output (see Figure 9.3). These have now been combined, so that one device can provide either audio or tactile output (see Chapter 5). This gives the user greater choice and flexibility. In audio mode the device beeps to inform the user of objects within the user-selectable range. The distance to an object is represented by the pitch of the beep: high-pitched beeps indicate a nearby object, and the further away the object is, the lower the beep. The tactile mode uses vibrations to alert the user to objects within range. A distant object results in a slow rate of vibration and a nearby object causes a rapid vibration.
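The two output mappings can be illustrated with a simple sketch. The specific pitch and vibration values below are invented for illustration only; the text specifies only the direction of the mappings (closer objects give higher-pitched beeps and faster vibrations).

```python
# Illustrative sketch of the two Miniguide output mappings. All specific
# frequency values here are assumptions for illustration, not device data.


def beep_pitch_hz(distance_m: float, max_range_m: float = 4.0) -> float:
    """Audio mode: closer objects give higher-pitched beeps.
    Linear interpolation between an assumed 2000 Hz (touching)
    and 200 Hz (at the edge of the selected range)."""
    d = min(max(distance_m, 0.0), max_range_m)
    return 2000.0 - (2000.0 - 200.0) * (d / max_range_m)


def vibration_rate_hz(distance_m: float, max_range_m: float = 4.0) -> float:
    """Tactile mode: closer objects give faster vibration.
    Assumed 10 Hz when touching, down to 1 Hz at the edge of range."""
    d = min(max(distance_m, 0.0), max_range_m)
    return 10.0 - (10.0 - 1.0) * (d / max_range_m)


# A nearby object (0.5 m) beeps higher and vibrates faster than a distant one (3.5 m).
assert beep_pitch_hz(0.5) > beep_pitch_hz(3.5)
assert vibration_rate_hz(0.5) > vibration_rate_hz(3.5)
```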
The Miniguide has a number of modes of operation, making it well suited to many different environments. The user can select an object detection range of 8, 4, 2, 1 or 0.5 m. In addition there are modes with reduced sensitivity designed for the special purpose of locating gaps, such as open doors. The audio modes offer a choice of “chirp” or “sweep” output. In “chirp” mode, the device makes a short sound once in each cycle. In “sweep” mode the audio output is continuous. This continuous output assists in the detection of small changes. However, for some users it makes the interpretation of the output too complex. Switching the Miniguide on and off and changing modes is achieved through simple operation of a single button.
The Miniguide performs well against user requirements. In particular, it is small and lightweight, is very versatile, has excellent battery life and is relatively low cost.
Size and weight are important physical characteristics for convenience of use and of carrying whilst not in use, and for making the device inconspicuous, something that many users consider an essential criterion. The Miniguide is usually hand-held, but some users have successfully mounted it on a cane, wheelchair or even a hat. This flexibility in where to hold the Miniguide, combined with its multiple and easily user-selectable modes, makes it well suited to many different environments, both indoor and outdoor. Hill and Black (2003) have reported an evaluation of the Miniguide based on field trials. One of the most striking findings is the variety of purposes users put their Miniguides to. As well as the typical mobility-related obstacle avoidance, uses included locating the clothesline, following queues and locating cashiers behind counters. This flexibility and versatility is one of the Miniguide’s greatest strengths.
However, it is not without its shortcomings, primarily due to the use of short pulses of ultrasound. As these problems apply equally to the Miniguide and the Ultracane, let us look first at the Ultracane and discuss the disadvantages of both together (see Section 9.4.4.3).
9.4.4.2 The Ultracane
The Ultracane is a primary mobility aid, which combines the long cane with ultrasonic sensors. Two ultrasonic sensors are built into the shaft of a cane. One is positioned to detect obstacles straight ahead, the second to detect obstacles ahead and upwards. Short pulses of ultrasound are transmitted. The ultrasound reflects back off objects within its path and is detected by the two sensors. The delay between transmission and detection provides a basis for estimating the distance to the object. The user is informed of the presence of an object through two vibrating buttons on the handle of the cane—one button for each sensor. This means that a user can distinguish between an object close to ground level and one closer to head level. If a distant object is detected the vibration rate is slow. As the distance to the object decreases, the rate of vibration increases. The user can switch between an object detection range of 3 or 1.5 m, intended for use outdoors and indoors respectively. For further details on the Ultracane see Chapter 6.
The Ultracane has the positive points of providing more information than a standard long cane whilst being a single unit very similar to the traditional cane. This means that users gain enhanced mobility information whilst needing only one hand for mobility purposes, leaving the other free to, for example, carry bags. Having at least one hand free is something many users consider important. It offers some flexibility, with user-selectable ranges, again something of considerable importance. It also has the advantage of building on the long cane, something that most users will be familiar with before using the Ultracane, and so it is fairly straightforward to learn to use.
Though the Ultracane is based on the traditional long cane, it is a little heavier and the handle a little thicker than most long canes. This results in some users finding it difficult to maintain their usual cane technique, which could have implications for the user’s ability to detect drop-offs, such as the edges of platforms, using the cane. Both the two-point touch technique and the constant contact technique, when properly used, restrict the wrist to side-to-side movements. When the cane tip drops over the edge of a drop-off, even a very small one, this produces a subtle up-down stretch in the user’s wrist, alerting them to the change of level (Wall 2002). If correct cane technique is not maintained, the user’s wrist may be allowed to move up and down as well as side to side, making the stretch as the cane tip descends over an edge less noticeable. Other problems relate to the use of short pulses of ultrasound.
9.4.4.3 Ultrasonic Mobility Aids: Some Disadvantages
Both the Miniguide and Ultracane use short pulses of ultrasound to detect objects within range. There are some disadvantages with this. Note that here we are not talking just of the weaknesses of the technology but how they impact upon users. Assessments of electronic travel aids must always consider how the user may be affected by the technology.
Different surfaces differ in how well they reflect ultrasound. Generally, hard surfaces such as stone or metal reflect ultrasound well, whereas soft objects such as bushes or soft furnishings absorb some of the ultrasound and so reflect less well. This means that a hard object, one that reflects a lot of ultrasound, will be detected at a greater distance than a soft one, which reflects less. There is, then, an inconsistency in the amount of warning users are given about the presence of an object. Consistency of the information provided by an electronic travel aid is an important user requirement.
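The effect of surface reflectivity on warning distance can be sketched with a rough model. This is a simplifying assumption (fourth-power spreading loss on the two-way path and a fixed detection threshold), not a calibrated sonar equation for either device.

```python
# Rough illustrative model of why a hard surface is detected further away
# than a soft one. Assumption: echo intensity falls as 1/d**4 (spherical
# spreading on both the outward and return paths) and scales with the
# surface's reflection coefficient; the aid detects an object only while
# the echo stays above a fixed sensitivity threshold.


def max_detection_range_m(reflectivity: float,
                          hard_surface_range_m: float = 4.0) -> float:
    """Maximum detection distance, relative to a perfectly reflective
    surface (reflectivity = 1.0) detectable at hard_surface_range_m.
    Since echo intensity ~ reflectivity / d**4, the threshold distance
    scales with reflectivity ** 0.25."""
    return hard_surface_range_m * reflectivity ** 0.25


# A bush reflecting only 20% of the ultrasound gives noticeably less
# warning than a metal pole reflecting nearly all of it.
metal = max_detection_range_m(0.95)
bush = max_detection_range_m(0.20)
assert bush < metal
```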
Ultrasonic mobility aids are subject to interference from environmental sources of ultrasound. For example, some machinery and the air brakes on buses and lorries can emit ultrasound. If this is picked up by an ultrasonic mobility aid, such as the Miniguide or Ultracane, the aid will react as if an object has been detected. The user will wrongly be alerted to an object, when in fact none is present. This may result in confusion. In practice, however, interference is not common and, as users gain experience of using a device, they will come to suspect when interference may be giving rise to false alarms.
Ultrasonic mobility aids can detect objects above ground level. However, they cannot satisfactorily detect drop-offs, such as steps going down or the edges of platforms. This type of hazard can be particularly significant, so the ability to reliably detect drop-offs is essential. Note that the Miniguide should always be used in conjunction with a primary aid, and that it does not interfere with the use of one. The user should therefore choose a primary aid that can detect drop-offs. The Ultracane, on the other hand, is a primary aid. If correct cane technique is used, the user will be able to detect drop-offs through the cane itself (the cane, not the ultrasonic detectors); if cane technique is not correct, drop-offs may be missed.
Finally, mobility aids using short pulses of ultrasound, such as the Miniguide and Ultracane, provide the user only with information about the nearest object. The aid will always react to the closest object: if there is more than one object within range, the user will only be warned of the nearest one and will remain unaware of the others. For many situations this is adequate. It tells the user whether or not the way is clear for them to take the next steps, and so achieves the most essential element of safe movement. An important user requirement is that information should be prioritised, and this is a form of prioritisation: if the user continues on the same path, they will collide with the nearest object first, so this can be considered the priority. However, in many situations more information, such as the size and shape of the object and the spatial relationships between objects, is important for enabling the user to decide how to negotiate an obstacle, or to identify the object so that they may use it as a landmark.
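The “nearest object wins” behaviour of short-pulse devices can be sketched as follows. This is an illustrative model of the behaviour described above, not actual device logic.

```python
# Illustrative sketch: with short pulses, only the first echo back is
# reported, so of several objects in range the user is warned only about
# the closest one.


def reported_distance_m(object_distances_m: list, range_m: float = 4.0):
    """Return the distance of the first (closest) echo within range,
    or None if the way ahead is clear."""
    in_range = [d for d in object_distances_m if d <= range_m]
    return min(in_range) if in_range else None


# Three obstacles ahead, but the aid reports only the nearest one.
assert reported_distance_m([3.2, 1.1, 2.5]) == 1.1
assert reported_distance_m([6.0, 8.5]) is None  # nothing within range
```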
Other mobility aids exist which overcome at least some of these problems of short pulse-based ultrasonic devices.
9.4.4.4 The Laser Cane
The Laser Cane, available from Nurion Industries, is a primary mobility aid that combines a long cane with a laser obstacle detection system. Short pulses of laser light are transmitted; the light reflects off obstacles in its path and is detected by sensors mounted on the cane. Sensors are positioned to detect obstacles straight ahead near ground level and at head height, as well as obstacles to either side, and a downward-pointing sensor is used to detect drop-offs. Information about the presence of obstacles is provided to the user through both vibrations and beeps; the user may switch off the beeps for silent operation.
The Laser Cane has two major advantages over the Miniguide and Ultracane, both relating to the use of laser light rather than ultrasound. First, laser light can detect drop-offs, giving the user extra warning of these hazards, something which is important for safe travel. Second, laser light is largely immune to interference, so it is extremely unlikely that users will be falsely warned of an object (see Dodds 1988 for a discussion of these issues).
The Laser Cane, however, does have its own disadvantages. One important disadvantage is that not all materials will be detected. For example, light will pass straight through transparent glass without being reflected, so the laser sensors will not detect it; the user will continue to walk until the cane itself hits the glass object. So, like sonar aids, there is an inconsistency in the warning users are given about the presence of obstacles.
A further disadvantage of the Laser Cane, due to its use of short pulses of laser light, is that, like devices using short pulses of sonar, it only provides the user with information about the nearest object. The user is informed only of the approximate distance to the closest object; additional objects are ignored and information about the spatial relationships between objects is unavailable. Again, for many situations this is adequate and can be thought of as a form of prioritisation of information. However, there are some situations where more spatial information is required. Let us now look at two devices that provide a much richer source of spatial information.
9.4.4.5 The BAT “K” Sonar Cane
The BAT “K” Sonar Cane, from Bay Advanced Technologies, is a primary mobility aid that combines the long cane with an advanced sweeping sonar system (see Figure 9.4). It uses a modern revision of one of the earliest electronic travel aids, the Sonar Torch, to provide advanced spatial information. It has two transducers: one emits a prolonged beam of ultrasound, which is reflected off obstacles in its path, and the reflected ultrasound is detected by the second transducer. Distance to an object is represented to the user through beeps of differing pitches: the further away the object, the higher the beep. As the cane is moved from side to side, the sonar beam sweeps across with it, creating a continuous and unique pattern of beeps as the beam passes over each object or across multiple objects. This unique pattern of beeps is known as a “sound signature”. By remembering these sound signatures, users can learn to identify objects and gather detailed spatial information about their surroundings. This enables travellers to identify objects as landmarks, as well as to find an optimal route around or between obstacles.

A further advantage of the BAT “K” Sonar Cane is that the sonar unit can easily be detached from the cane and used independently of it (see Figure 9.5). This provides increased flexibility and means that, if the cane is damaged, only the cane need be replaced. The Sonar Torch may be used alone, but the cane provides additional safety in cases where the sonar may miss objects or drop-offs. The Sonar Torch has a maximum range of 5 m, and the user may choose between long and short range settings.
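The distance-to-pitch coding described above can be sketched as a simple mapping from echo distance to beep frequency. The frequency range and the linear form of the mapping here are illustrative assumptions, not the device's actual calibration; only the 5 m maximum range comes from the text.

```python
# Sketch of a distance-to-pitch mapping of the kind described for the
# BAT "K" Sonar Cane: nearer objects give lower-pitched beeps.
# MIN_FREQ_HZ, MAX_FREQ_HZ and the linear mapping are assumed values.

MAX_RANGE_M = 5.0     # stated maximum range of the Sonar Torch
MIN_FREQ_HZ = 200.0   # assumed pitch for an object at the cane tip
MAX_FREQ_HZ = 3000.0  # assumed pitch for an object at maximum range

def beep_frequency(distance_m):
    """Map an echo distance to a beep frequency in Hz (linear sketch),
    or None when the object is out of range (no beep)."""
    if distance_m < 0 or distance_m > MAX_RANGE_M:
        return None
    fraction = distance_m / MAX_RANGE_M
    return MIN_FREQ_HZ + fraction * (MAX_FREQ_HZ - MIN_FREQ_HZ)

# As the beam sweeps across objects at different distances, the changing
# frequencies form each object's "sound signature".
for d in (0.5, 2.5, 5.0):
    print(f"{d} m -> {beep_frequency(d):.0f} Hz")
```

The sweep then turns this per-echo mapping into a time-varying melody: each object's shape and distance profile produces its own characteristic sequence of pitches.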
Like other sonar devices, the BAT “K” Sonar Cane is occasionally subject to interference from environmental sources of ultrasound, such as the air brakes on buses, and some surfaces will be detected at a greater distance than others.
Users must interpret the complex auditory information from the BAT “K” Sonar Cane. Further research is needed to establish what impact this perceptual demand may have upon the user’s ability to attend to and process other sensory information, and how this may affect their travelling ability.

Figure 9.4. The BAT “K” Sonar Cane (© Bay Advanced Technologies http://www.batforblind.co.nz. Used with permission.)

Figure 9.5. The Sonar Torch detached from the BAT “K” Sonar Cane (© Bay Advanced Technologies http://www.batforblind.co.nz. Used with permission.)

Figure 9.6. A typical set-up for the vOICe (© Barbara Schweizer and used with permission. Photo kindly donated by Peter Meijer of The vOICe website http://www.seeingwithsound.com.)
9.4.4.6 The vOICe
The vOICe is software that claims to substitute sound for vision. The software can be run on a PC or notebook with a camera, or even on some modern camera mobile phones. The camera takes images of the environment, which are converted into sound pictures: visual contours are represented by the pitch of beeps, so every object and scene gives a unique pattern of beeps, its own sound signature. By remembering these sound signatures and learning to interpret the patterns of sounds, users can learn to visualise what is around them (see Figure 9.6). For experienced users of the vOICe, it can be equivalent to having a small amount of vision. It can, therefore, be used as a secondary mobility aid. It has the key advantages of distance viewing and of being free software that can be run on readily available hardware.
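The conversion from image to sound picture can be sketched as a sonification of a greyscale image: the image is scanned left to right, column by column, with vertical position mapped to pitch and brightness to loudness. The scan time, frequency range and toy image below are illustrative assumptions, not the actual parameters of the vOICe software.

```python
# Sketch of a vOICe-style image-to-sound mapping. The image is scanned
# left to right; row height maps to pitch (top = high) and pixel
# brightness to loudness. All constants are illustrative assumptions.

SCAN_TIME_S = 1.0            # assumed time to sweep one image
LOW_HZ, HIGH_HZ = 500.0, 5000.0

def sonify(image):
    """image: list of rows (top row first) of brightness values in 0..1.
    Returns (onset_time, frequency, loudness) tuples for bright pixels."""
    rows, cols = len(image), len(image[0])
    events = []
    for x in range(cols):                # time axis: left -> right
        onset = SCAN_TIME_S * x / cols
        for y in range(rows):            # pitch axis: top = high pitch
            brightness = image[y][x]
            if brightness > 0:
                freq = HIGH_HZ - (HIGH_HZ - LOW_HZ) * y / (rows - 1)
                events.append((onset, freq, brightness))
    return events

# A tiny diagonal edge: each column sounds one pixel, so the listener
# hears a pitch stepping downwards over the one-second scan.
toy = [[1, 0, 0],
       [0, 1, 0],
       [0, 0, 1]]
for t, f, a in sonify(toy):
    print(f"t={t:.2f}s  f={f:.0f}Hz  loudness={a}")
```

Each distinct image produces a distinct sequence of such events, which is what gives every object and scene its own sound signature.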
Because the software runs on readily available hardware, how conspicuous or inconspicuous the system is will depend on the hardware being used.
The vOICe, like the BAT “K” Sonar Cane, provides complex auditory information that must be processed by the user, placing additional cognitive demands upon them. Further research is needed to establish the consequences of this for the user’s attention and travelling ability.
