- •Preface
- •Contents
- •1 Disability and Assistive Technology Systems
- •Learning Objectives
- •1.1 The Social Context of Disability
- •1.2 Assistive Technology Outcomes: Quality of Life
- •1.2.1 Some General Issues
- •1.2.2 Definition and Measurement of Quality of Life
- •1.2.3 Health Related Quality of Life Measurement
- •1.2.4 Assistive Technology Quality of Life Procedures
- •1.2.5 Summary and Conclusions
- •1.3 Modelling Assistive Technology Systems
- •1.3.1 Modelling Approaches: A Review
- •1.3.2 Modelling Human Activities
- •1.4 The Comprehensive Assistive Technology (CAT) Model
- •1.4.1 Justification of the Choice of Model
- •1.4.2 The Structure of the CAT Model
- •1.5 Using the Comprehensive Assistive Technology Model
- •1.5.1 Using the Activity Attribute of the CAT Model to Determine Gaps in Assistive Technology Provision
- •1.5.2 Conceptual Structure of Assistive Technology Systems
- •1.5.3 Investigating Assistive Technology Systems
- •1.5.4 Analysis of Assistive Technology Systems
- •1.5.5 Synthesis of Assistive Technology Systems
- •1.6 Chapter Summary
- •Questions
- •Projects
- •References
- •2 Perception, the Eye and Assistive Technology Issues
- •Learning Objectives
- •2.1 Perception
- •2.1.1 Introduction
- •2.1.2 Common Laws and Properties of the Different Senses
- •2.1.3 Multisensory Perception
- •2.1.4 Multisensory Perception in the Superior Colliculus
- •2.1.5 Studies of Multisensory Perception
- •2.2 The Visual System
- •2.2.1 Introduction
- •2.2.2 The Lens
- •2.2.3 The Iris and Pupil
- •2.2.4 Intraocular Pressure
- •2.2.5 Extraocular Muscles
- •2.2.6 Eyelids and Tears
- •2.3 Visual Processing in the Retina, Lateral Geniculate Nucleus and the Brain
- •2.3.1 Nerve Cells
- •2.3.2 The Retina
- •2.3.3 The Optic Nerve, Optic Tract and Optic Radiation
- •2.3.4 The Lateral Geniculate Body or Nucleus
- •2.3.5 The Primary Visual or Striate Cortex
- •2.3.6 The Extrastriate Visual Cortex and the Superior Colliculus
- •2.3.7 Visual Pathways
- •2.4 Vision in Action
- •2.4.1 Image Formation
- •2.4.2 Accommodation
- •2.4.3 Response to Light
- •2.4.4 Colour Vision
- •2.4.5 Binocular Vision and Stereopsis
- •2.5 Visual Impairment and Assistive Technology
- •2.5.1 Demographics of Visual Impairment
- •2.5.2 Illustrations of Some Types of Visual Impairment
- •2.5.3 Further Types of Visual Impairment
- •2.5.4 Colour Blindness
- •2.5.5 Corrective Lenses
- •2.6 Chapter Summary
- •Questions
- •Projects
- •References
- •3 Sight Measurement
- •Learning Objectives
- •3.1 Introduction
- •3.2 Visual Acuity
- •3.2.1 Using the Chart
- •3.2.2 Variations in Measuring Visual Acuity
- •3.3 Field of Vision Tests
- •3.3.1 The Normal Visual Field
- •3.3.2 The Tangent Screen
- •3.3.3 Kinetic Perimetry
- •3.3.4 Static Perimetry
- •3.4 Pressure Measurement
- •3.5 Biometry
- •3.6 Ocular Examination
- •3.7 Optical Coherence Tomography
- •3.7.1 Echo Delay
- •3.7.2 Low Coherence Interferometry
- •3.7.3 An OCT Scanner
- •3.8 Ocular Electrophysiology
- •3.8.1 The Electrooculogram (EOG)
- •3.8.2 The Electroretinogram (ERG)
- •3.8.3 The Pattern Electroretinogram
- •3.8.4 The Visual Evoked Cortical Potential
- •3.8.5 Multifocal Electrophysiology
- •3.9 Chapter Summary
- •Glossary
- •Questions
- •Projects
- •4 Haptics as a Substitute for Vision
- •Learning Objectives
- •4.1 Introduction
- •4.1.1 Physiological Basis
- •4.1.2 Passive Touch, Active Touch and Haptics
- •4.1.3 Exploratory Procedures
- •4.2 Vision and Haptics Compared
- •4.3 The Capacity of Bare Fingers in Real Environments
- •4.3.1 Visually Impaired People’s Use of Haptics Without Any Technical Aid
- •4.3.2 Speech Perceived by Hard-of-hearing People Using Bare Hands
- •4.3.3 Natural Capacity of Touch and Evaluation of Technical Aids
- •4.4 Haptic Low-tech Aids
- •4.4.1 The Long Cane
- •4.4.2 The Guide Dog
- •4.4.3 Braille
- •4.4.4 Embossed Pictures
- •4.4.5 The Main Lesson from Low-tech Aids
- •4.5 Matrices of Point Stimuli
- •4.5.1 Aids for Orientation and Mobility
- •4.5.2 Aids for Reading Text
- •4.5.3 Aids for Reading Pictures
- •4.6 Computer-based Aids for Graphical Information
- •4.6.1 Aids for Graphical User Interfaces
- •4.6.2 Tactile Computer Mouse
- •4.7 Haptic Displays
- •4.7.1 Information Available via a Haptic Display
- •4.7.2 What Information Can Be Obtained with the Reduced Information?
- •4.7.3 Haptic Displays as Aids for the Visually Impaired
- •4.8 Chapter Summary
- •4.9 Concluding Remarks
- •Questions
- •Projects
- •References
- •5 Mobility: An Overview
- •Learning Objectives
- •5.1 Introduction
- •5.2 The Travel Activity
- •5.2.1 Understanding Mobility
- •5.2.2 Assistive Technology Systems for the Travel Process
- •5.3 The Historical Development of Travel Aids for Visually Impaired and Blind People
- •5.4 Obstacle Avoidance AT: Guide Dogs and Robotic Guide Walkers
- •5.4.1 Guide Dogs
- •5.4.2 Robotic Guides and Walkers
- •5.5 Obstacle Avoidance AT: Canes
- •5.5.1 Long Canes
- •5.5.2 Technology Canes
- •5.6 Other Mobility Assistive Technology Approaches
- •5.6.1 Clear-path Indicators
- •5.6.2 Obstacle and Object Location Detectors
- •5.6.3 The vOICe System
- •5.7 Orientation Assistive Technology Systems
- •5.7.1 Global Positioning System Orientation Technology
- •5.7.2 Other Technology Options for Orientation Systems
- •5.8 Accessible Environments
- •5.9 Chapter Summary
- •Questions
- •Projects
- •References
- •6 Mobility AT: The Batcane (UltraCane)
- •Learning Objectives
- •6.1 Mobility Background and Introduction
- •6.2 Principles of Ultrasonics
- •6.2.1 Ultrasonic Waves
- •6.2.2 Attenuation and Reflection Interactions
- •6.2.3 Transducer Geometry
- •6.3 Bats and Signal Processing
- •6.3.1 Principles of Bat Sonar
- •6.3.2 Echolocation Call Structures
- •6.3.3 Signal Processing Capabilities
- •6.3.4 Applicability of Bat Echolocation to Sonar System Design
- •6.4 Design and Construction Issues
- •6.4.1 Outline Requirement Specification
- •6.4.2 Ultrasonic Spatial Sensor Subsystem
- •6.4.3 Trial Prototype Spatial Sensor Arrangement
- •6.4.4 Tactile User Interface Subsystem
- •6.4.5 Cognitive Mapping
- •6.4.6 Embedded Processing Control Requirements
- •6.5 Concept Phase and Engineering Prototype Phase Trials
- •6.6 Case Study in Commercialisation
- •6.7 Chapter Summary
- •Questions
- •Projects
- •References
- •7 Navigation AT: Context-aware Computing
- •Learning Objectives
- •7.1 Defining the Orientation/Navigation Problem
- •7.1.1 Orientation, Mobility and Navigation
- •7.1.2 Traditional Mobility Aids
- •7.1.3 Limitations of Traditional Aids
- •7.2 Cognitive Maps
- •7.2.1 Learning and Acquiring Spatial Information
- •7.2.2 Factors that Influence How Knowledge Is Acquired
- •7.2.3 The Structure and Form of Cognitive Maps
- •7.3 Overview of Existing Technologies
- •7.3.1 Technologies for Distant Navigation
- •7.3.2 User Interface Output Technologies
- •7.4 Principles of Mobile Context-aware Computing
- •7.4.1 Adding Context to User-computer Interaction
- •7.4.2 Acquiring Useful Contextual Information
- •7.4.3 Capabilities of Context-awareness
- •7.4.4 Application of Context-aware Principles
- •7.4.5 Technological Challenges and Unresolved Usability Issues
- •7.5 Test Procedures
- •7.5.1 Human Computer Interaction (HCI)
- •7.5.2 Cognitive Mapping
- •7.5.3 Overall Approach
- •7.6 Future Positioning Technologies
- •7.7 Chapter Summary
- •7.7.1 Conclusions
- •Questions
- •Projects
- •References
- •Learning Objectives
- •8.1 Defining the Navigation Problem
- •8.1.1 What is the Importance of Location Information?
- •8.1.2 What Mobility Tools and Traditional Maps are Available for the Blind?
- •8.2 Principles of Global Positioning Systems
- •8.2.1 What is the Global Positioning System?
- •8.2.2 Accuracy of GPS: Some General Issues
- •8.2.3 Accuracy of GPS: Some Technical Issues
- •8.2.4 Frequency Spectrum of GPS, Present and Future
- •8.2.5 Other GPS Systems
- •8.3 Application of GPS Principles
- •8.4 Design Issues
- •8.5 Development Issues
- •8.5.1 Choosing an Appropriate Platform
- •8.5.2 Choosing the GPS Receiver
- •8.5.3 Creating a Packaged System
- •8.5.4 Integration vs Stand-alone
- •8.6 User Interface Design Issues
- •8.6.1 How to Present the Information
- •8.6.2 When to Present the Information
- •8.6.3 What Information to Present
- •8.7 Test Procedures and Results
- •8.8 Case Study in Commercialisation
- •8.8.1 Understanding the Value of the Technology
- •8.8.2 Limitations of the Technology
- •8.8.3 Ongoing Development
- •8.9 Chapter Summary
- •Questions
- •Projects
- •References
- •9 Electronic Travel Aids: An Assessment
- •Learning Objectives
- •9.1 Introduction
- •9.2 Why Do an Assessment?
- •9.3 Methodologies for Assessments of Electronic Travel Aids
- •9.3.1 Eliciting User Requirements
- •9.3.2 Developing a User Requirements Specification and Heuristic Evaluation
- •9.3.3 Hands-on Assessments
- •9.3.4 Methodology Used for Assessments in this Chapter
- •9.4 Modern-day Electronic Travel Aids
- •9.4.1 The Distinction Between Mobility and Navigation Aids
- •9.4.2 The Distinction Between Primary and Secondary Aids
- •9.4.3 User Requirements: Mobility and Navigation Aids
- •9.4.4 Mobility Aids
- •9.4.5 Mobility Aids: Have They Solved the Mobility Challenge?
- •9.4.6 Navigation Aids
- •9.4.7 Navigation Aids: Have They Solved the Navigation Challenge?
- •9.5 Training
- •9.6 Chapter Summary and Conclusions
- •Questions
- •Projects
- •References
- •10 Accessible Environments
- •Learning Objectives
- •10.1 Introduction
- •10.1.1 Legislative and Regulatory Framework
- •10.1.2 Accessible Environments: An Overview
- •10.1.3 Principles for the Design of Accessible Environments
- •10.2 Physical Environments: The Streetscape
- •10.2.1 Pavements and Pathways
- •10.2.2 Road Crossings
- •10.2.3 Bollards and Street Furniture
- •10.3 Physical Environments: Buildings
- •10.3.1 General Exterior Issues
- •10.3.2 General Interior Issues
- •10.3.4 Signs and Notices
- •10.3.5 Interior Building Services
- •10.4 Environmental Information and Navigation Technologies
- •10.4.1 Audio Information System: General Issues
- •10.4.2 Some Technologies for Environmental Information Systems
- •10.5 Accessible Public Transport
- •10.5.1 Accessible Public Transportation: Design Issues
- •10.6 Chapter Summary
- •Questions
- •Projects
- •References
- •11 Accessible Bus System: A Bluetooth Application
- •Learning Objectives
- •11.1 Introduction
- •11.2 Bluetooth Fundamentals
- •11.2.1 Brief History of Bluetooth
- •11.2.2 Bluetooth Power Class
- •11.2.3 Protocol Stack
- •11.2.4 Bluetooth Profile
- •11.2.5 Piconet
- •11.3 Design Issues
- •11.3.1 System Architecture
- •11.3.2 Hardware Requirements
- •11.3.3 Software Requirements
- •11.4 Developmental Issues
- •11.4.1 Bluetooth Server
- •11.4.2 Bluetooth Client (Mobile Device)
- •11.4.3 User Interface
- •11.5 Commercialisation Issues
- •11.6 Chapter Summary
- •Questions
- •Projects
- •References
- •12 Accessible Information: An Overview
- •Learning Objectives
- •12.1 Introduction
- •12.2 Low Vision Aids
- •12.2.1 Basic Principles
- •12.3 Low Vision Assistive Technology Systems
- •12.3.1 Large Print
- •12.3.2 Closed Circuit Television Systems
- •12.3.3 Video Magnifiers
- •12.3.4 Telescopic Assistive Systems
- •12.4 Audio-transcription of Printed Information
- •12.4.1 Stand-alone Reading Systems
- •12.4.2 Read IT Project
- •12.5 Tactile Access to Information
- •12.5.1 Braille
- •12.5.2 Moon
- •12.5.3 Braille Devices
- •12.6 Accessible Computer Systems
- •12.6.1 Input Devices
- •12.6.2 Output Devices
- •12.6.3 Computer-based Reading Systems
- •12.6.4 Accessible Portable Computers
- •12.7 Accessible Internet
- •12.7.1 World Wide Web Guidelines
- •12.7.2 Guidelines for Web Authoring Tools
- •12.7.3 Accessible Adobe Portable Document Format (PDF) Documents
- •12.7.4 Bobby Approval
- •12.8 Telecommunications
- •12.8.1 Voice Dialling General Principles
- •12.8.2 Talking Caller ID
- •12.8.3 Mobile Telephones
- •12.9 Chapter Summary
- •Questions
- •Projects
- •References
- •13 Screen Readers and Screen Magnifiers
- •Learning Objectives
- •13.1 Introduction
- •13.2 Overview of Chapter
- •13.3 Interacting with a Graphical User Interface
- •13.4 Screen Magnifiers
- •13.4.1 Overview
- •13.4.2 Magnification Modes
- •13.4.3 Other Interface Considerations
- •13.4.4 The Architecture and Implementation of Screen Magnifiers
- •13.5 Screen Readers
- •13.5.1 Overview
- •13.5.2 The Architecture and Implementation of a Screen Reader
- •13.5.3 Using a Braille Display
- •13.5.4 User Interface Issues
- •13.6 Hybrid Screen Reader Magnifiers
- •13.7 Self-magnifying Applications
- •13.8 Self-voicing Applications
- •13.9 Application Adaptors
- •13.10 Chapter Summary
- •Questions
- •Projects
- •References
- •14 Speech, Text and Braille Conversion Technology
- •Learning Objectives
- •14.1 Introduction
- •14.1.1 Introducing Mode Conversion
- •14.1.2 Outline of the Chapter
- •14.2 Prerequisites for Speech and Text Conversion Technology
- •14.2.1 The Spectral Structure of Speech
- •14.2.2 The Hierarchical Structure of Spoken Language
- •14.2.3 Prosody
- •14.3 Speech-to-text Conversion
- •14.3.1 Principles of Pattern Recognition
- •14.3.2 Principles of Speech Recognition
- •14.3.3 Equipment and Applications
- •14.4 Text-to-speech Conversion
- •14.4.1 Principles of Speech Production
- •14.4.2 Principles of Acoustical Synthesis
- •14.4.3 Equipment and Applications
- •14.5 Braille Conversion
- •14.5.1 Introduction
- •14.5.2 Text-to-Braille Conversion
- •14.5.3 Braille-to-text Conversion
- •14.6 Commercial Equipment and Applications
- •14.6.1 Speech vs Braille
- •14.6.2 Speech Output in Devices for Daily Life
- •14.6.3 Portable Text-based Devices
- •14.6.4 Access to Computers
- •14.6.5 Reading Machines
- •14.6.6 Access to Telecommunication Devices
- •14.7 Discussion and the Future Outlook
- •14.7.1 End-user Studies
- •14.7.2 Discussion and Issues Arising
- •14.7.3 Future Developments
- •Questions
- •Projects
- •References
- •15 Accessing Books and Documents
- •Learning Objectives
- •15.1 Introduction: The Challenge of Accessing the Printed Page
- •15.2 Basics of Optical Character Recognition Technology
- •15.2.1 Details of Optical Character Recognition Technology
- •15.2.2 Practical Issues with Optical Character Recognition Technology
- •15.3 Reading Systems
- •15.4 DAISY Technology
- •15.4.1 DAISY Full Audio Books
- •15.4.2 DAISY Full Text Books
- •15.4.3 DAISY and Other Formats
- •15.5 Players
- •15.6 Accessing Textbooks
- •15.7 Accessing Newspapers
- •15.8 Future Technology Developments
- •15.9 Chapter Summary and Conclusion
- •15.9.1 Chapter Summary
- •15.9.2 Conclusion
- •Questions
- •Projects
- •References
- •Learning Objectives
- •16.1 Introduction
- •16.1.1 Print Impairments
- •16.1.2 Music Notation
- •16.2 Overview of Accessible Music
- •16.2.1 Formats
- •16.2.2 Technical Aspects
- •16.3 Some Recent Initiatives and Projects
- •16.3.2 Play 2
- •16.3.3 Dancing Dots
- •16.3.4 Toccata
- •16.4 Problems to Be Overcome
- •16.4.1 A Content Processing Layer
- •16.4.2 Standardization of Accessible Music Technology
- •16.5 Unifying Accessible Design, Technology and Musical Content
- •16.5.1 Braille Music
- •16.5.2 Talking Music
- •16.6 Conclusions
- •16.6.1 Design for All or Accessibility from Scratch
- •16.6.2 Applying Design for All in Emerging Standards
- •16.6.3 Accessibility in Emerging Technology
- •Questions
- •Projects
- •References
- •17 Assistive Technology for Daily Living
- •Learning Objectives
- •17.1 Introduction
- •17.2 Personal Care
- •17.2.1 Labelling Systems
- •17.2.2 Healthcare Monitoring
- •17.3 Time-keeping, Alarms and Alerting
- •17.3.1 Time-keeping
- •17.3.2 Alarms and Alerting
- •17.4 Food Preparation and Consumption
- •17.4.1 Talking Kitchen Scales
- •17.4.2 Talking Measuring Jug
- •17.4.3 Liquid Level Indicator
- •17.4.4 Talking Microwave Oven
- •17.4.5 Talking Kitchen and Remote Thermometers
- •17.4.6 Braille Salt and Pepper Set
- •17.5 Environmental Control and Use of Appliances
- •17.5.1 Light Probes
- •17.5.2 Colour Probes
- •17.5.3 Talking and Tactile Thermometers and Barometers
- •17.5.4 Using Appliances
- •17.6 Money, Finance and Shopping
- •17.6.1 Mechanical Money Indicators
- •17.6.2 Electronic Money Identifiers
- •17.6.3 Electronic Purse
- •17.6.4 Automatic Teller Machines (ATMs)
- •17.7 Communications and Access to Information: Other Technologies
- •17.7.1 Information Kiosks and Other Self-service Systems
- •17.7.2 Using Smart Cards
- •17.7.3 EZ Access®
- •17.8 Chapter Summary
- •Questions
- •Projects
- •References
- •Learning Objectives
- •18.1 Introduction
- •18.2 Education: Learning and Teaching
- •18.2.1 Accessing Educational Processes and Approaches
- •18.2.2 Educational Technologies, Devices and Tools
- •18.3 Employment
- •18.3.1 Professional and Person-centred
- •18.3.2 Scientific and Technical
- •18.3.3 Administrative and Secretarial
- •18.3.4 Skilled and Non-skilled (Manual) Trades
- •18.3.5 Working Outside
- •18.4 Recreational Activities
- •18.4.1 Accessing the Visual, Audio and Performing Arts
- •18.4.2 Games, Puzzles, Toys and Collecting
- •18.4.3 Holidays and Visits: Museums, Galleries and Heritage Sites
- •18.4.4 Sports and Outdoor Activities
- •18.4.5 DIY, Art and Craft Activities
- •18.5 Chapter Summary
- •Questions
- •Projects
- •References
- •Biographical Sketches of the Contributors
- •Index
Table 5.2. Detection configurations and information channels

| System no. | Beam configuration | Information channels (total) |
|---|---|---|
| 1 | Downward, forward | 2 |
| 2 | Downward, forward, upward | 3 |
| 3 | Downward, forward, upward, reinforced upward | 4 |
| 4 | Downward, forward left, forward right | 3 |
| 5 | Downward, forward left, forward right, upward | 4 |
With beam-based sensing, the obstacle information has already been reduced to that available from a number of particular directions. In the case of some of the other technologies used to obtain information, such as the camera, the potentially available information is much more complex and decisions have to be made about how much of this information should be conveyed to the user. The tendency has generally been to focus on information about obstacle locations rather than to present an overview or more detailed information about the scene.
In the case of a technology cane, which obtains information from several different directions, a number of information channels, which could be auditory, tactile or a combination, will generally be required. Several typical beam configurations, along with the total number of information channels for each configuration, are given in Table 5.2.
The number of information channels given in Table 5.2 then has to be mapped to a user interface consisting of a suitable mixture of audio and/or tactile modes. For example, for the configuration of System 2, three vibrating tactile buttons matched to the cane-bearer’s thumb and two fingers might be appropriate. Alternatively, the downward and forward channels could be tactile and the upward channel audio. Providing both tactile and audio options for all channels could have a number of benefits, including making the device available to deafblind users. However, there may be additional costs and increased complexity. Decisions on the human–technology interface are usually based on a combination of the results of end-user trials, technical constraints and cost considerations.
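As a minimal sketch of how such a channel-to-interface mapping might be represented in software (not from the source; the `Channel` class and the example assignment for System 2 are illustrative assumptions):

```python
# A minimal sketch (not from the source) of a channel-to-interface
# mapping for a technology cane. The Channel class and the modality
# choices for System 2 are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Literal

OutputMode = Literal["tactile", "audio"]

@dataclass
class Channel:
    direction: str    # beam direction, e.g. "downward", "forward", "upward"
    mode: OutputMode  # sensory modality used to present this channel

# System 2 from Table 5.2: downward, forward and upward beams
# (three information channels in total).
system2: List[Channel] = [
    Channel("downward", "tactile"),  # e.g. vibrating button under the thumb
    Channel("forward", "tactile"),   # e.g. vibrating button under a finger
    Channel("upward", "audio"),      # e.g. a tone in a single earphone
]

def describe_interface(channels: List[Channel]) -> None:
    """Print the channel-to-mode assignment for a beam configuration."""
    for ch in channels:
        print(f"{ch.direction:>8} beam -> {ch.mode} output")

describe_interface(system2)
```

Such a representation makes the design trade-off explicit: changing a channel's mode (for example, making all channels tactile for a deafblind user) is a data change rather than a redesign.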
5.6 Other Mobility Assistive Technology Approaches
Guide dogs, long canes and technology canes are all well-established assistive technology solutions for obstacle avoidance. As indicated in Section 5.3, a number of (electronic) travel aids have been developed, though many of them have not gone beyond the prototype stage. In this section, the range of other devices that have been developed is illustrated by three examples: the Polaron, which is a clear travel path indicator; the Miniguide, which provides supplementary information to support a primary travel aid; and the vOICe system, which provides a ‘soundscape’ of a visual scene.
5.6.1 Clear-path Indicators
The engineering response to a typical travel scene (Figure 5.3) has been to decompose the problem into a number of issues and then devise systems for these sub-problems. As discussed previously, the long cane is able to detect obstacles and identify a clear path at lower leg height. However, other obstacles, such as overhanging tree branches, may well reach down to chest and head height. Laser and ultrasound technology canes are able to detect a clear path at lower leg height and, depending on the beam configuration, at other heights too. Therefore, there is a need for assistive devices that supplement the long cane by providing information on obstacles at head and chest height. Such devices are usually termed clear-path indicators, as they are generally designed simply to indicate that the forward path at chest and head level is clear and negotiable.
The Polaron
The Polaron, manufactured by Nurion-Raycal, USA, is a clear-path indicator or object detector that is available in a torch format or in a version that can be worn on a loose strap around the neck, positioned at chest height. It is described here in its chest-mounted format. The echolocation principle of ultrasonic sound is used to detect whether the path is clear, with the ultrasound transmitter and receiver worn at chest level. The range for detection of obstacles can be set at 1.22 m, 2.44 m or 4.88 m (4, 8 or 16 ft) from the user. A sophisticated human–technology interface of audio and tactile cues is used to inform the user of obstacles in the forward path, as shown in Table 5.3.
The Polaron has sufficient tactile cues to be used by a deafblind person. The chest-mounted location has the advantage of leaving the user’s hands free. This enables the Polaron to be used together with a long cane or guide dog (in one hand), while still allowing the user to carry shopping or other baggage in the other. However, this location has the drawback of being conspicuous. The Polaron can also support the independent mobility of a blind person who uses a manual wheelchair.
Wheelchair-mounted clear-path indicators
Whilst the Polaron can be used with a manual wheelchair, powered wheelchairs need a dedicated device. There are two main reasons for this. Powered wheelchairs can travel two to three times as fast as a pedestrian and therefore a faster response time is required.
Table 5.3. Audio-tactile interface cues – Polaron

| Path status | Interface response |
|---|---|
| Clear path: no obstacles | Polaron silent |
| Obstacle more than 1.83 m (6 ft) away | Polaron emits a low-frequency audible sound |
| Obstacle between 0.91 and 1.83 m (3 ft and 6 ft) away | Polaron emits a series of audible clicks and vibration at chest level |
| Obstacle less than 0.91 m (3 ft) away | Polaron emits high-pitched bleeping and a tactile vibration occurs in the neck strap |
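The cues of Table 5.3 amount to a simple distance-to-feedback mapping. A minimal sketch of that mapping follows (not from the source; the thresholds are taken from the table, but the function and its cue strings are illustrative assumptions):

```python
# A minimal sketch (not from the source): the Polaron's audio-tactile
# cues from Table 5.3 as a distance-to-feedback mapping. Thresholds
# follow the table; the function and cue strings are assumptions.

from typing import Optional

def polaron_cue(distance_m: Optional[float]) -> str:
    """Return the interface response for an obstacle at distance_m.

    distance_m is None when no obstacle is detected (clear path).
    """
    if distance_m is None:
        return "silent (clear path)"
    if distance_m > 1.83:  # more than 6 ft away
        return "low-frequency audible sound"
    if distance_m > 0.91:  # between 3 ft and 6 ft away
        return "audible clicks + vibration at chest level"
    return "high-pitched bleeping + vibration in the neck strap"

for d in (None, 2.5, 1.2, 0.5):
    print(d, "->", polaron_cue(d))
```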
In addition, blind and visually impaired users of powered wheelchairs may have further impairments that prevent them from responding quickly enough. Therefore, ‘intelligent’ wheelchairs have been developed to assist users with functions such as obstacle and collision avoidance, travelling through a narrow doorway, between pillars or along a narrow hallway or passage, as well as with landmark-based navigation. Only a small number of ‘intelligent’ wheelchairs, including the Smart Wheelchair for children, are commercially available; most are still at the prototype stage.
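To make the response-time requirement concrete, a rough worked example with assumed, illustrative numbers (not from the source):

```python
# A rough worked example with assumed, illustrative numbers (not from
# the source): why higher travel speed demands a faster response or a
# longer detection range.

reaction_time_s = 1.0    # assumed combined user + system response time
pedestrian_speed = 1.4   # m/s, a typical walking speed
wheelchair_speed = 3.0   # m/s, roughly two to three times walking speed

for label, v in (("pedestrian", pedestrian_speed),
                 ("powered wheelchair", wheelchair_speed)):
    # Distance covered before any avoiding action can even begin.
    print(f"{label}: covers {v * reaction_time_s:.1f} m "
          f"during a {reaction_time_s:.1f} s response time")
```

Under these assumptions the powered wheelchair covers roughly twice the distance before any avoiding action can begin, so its detection range must be correspondingly longer or its warning earlier.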
‘Intelligent’ wheelchairs raise the same problem of control sharing between the user and the chair system as do robotic walkers. They generally resolve it by giving the user control over high-level functions, such as directing the wheelchair to a desired location, while providing different degrees of assistance with low-level functions such as obstacle avoidance. It should be noted that ‘intelligent’ powered wheelchairs have not been developed specifically for blind wheelchair users, who may require more assistance with obstacle avoidance and with manoeuvring the chair in narrow spaces than sighted wheelchair users. The ‘intelligent’ wheelchair illustrates one approach to obstacle avoidance for powered wheelchair users, namely an integrated system that forms part of the wheelchair control system. Another approach is a separate device that can be mounted on the wheelchair, for example the Wheelchair Pathfinder manufactured by Nurion-Raycal, USA.
Wheelchair Pathfinder
The Wheelchair Pathfinder is an obstacle detection system that can be used with manual and powered mobility vehicles and wheelchairs. It uses a mixture of infrared and ultrasonic sensing devices to give information about the path to be travelled: a downward-pointing laser beam system is used for step detection and ultrasound is used to the front and sides for clear-path detection. The modes of operation are as follows:
• Forward path obstacle detection. This uses a forward path ultrasound beam. The forward detection distance can be set to 1.22 m (4 ft) or 2.44 m (8 ft). An intermittent beeping sound indicates that the forward beam has detected an object.
• Side object detection. A continuous tone indicates the presence of an object within 30.5 cm (12 in) of either side of the wheelchair. To distinguish between objects on the right and left sides of the wheelchair, different pitches (frequencies) are used, with the higher of the two pitches indicating an object to the right-hand side. Knowledge about objects to the side can facilitate navigation in enclosed spaces or through doorways.
• Step detection. A downward-pointing laser system is used for the detection of steps, curbs or drop-offs. Steps up to 1.22 m (4 ft) from the device can be detected and are indicated by a low-pitched audio signal.
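These three modes can be summarised as a mapping from sensor readings to signals. A minimal sketch follows (not from the source; the thresholds come from the text, but the function, parameter names and signal strings are illustrative assumptions):

```python
# A minimal sketch (not from the source): the Wheelchair Pathfinder's
# modes of operation as a readings-to-signals mapping. Thresholds come
# from the text; names and signal strings are assumptions.

from typing import List, Optional

def pathfinder_signals(front_m: Optional[float] = None,
                       left_m: Optional[float] = None,
                       right_m: Optional[float] = None,
                       drop_m: Optional[float] = None,
                       forward_range_m: float = 2.44) -> List[str]:
    """Return the signals for one set of readings (None = nothing detected)."""
    signals = []
    if front_m is not None and front_m <= forward_range_m:
        signals.append("intermittent beeping (forward obstacle)")
    if left_m is not None and left_m <= 0.305:    # within 12 in on the left
        signals.append("continuous lower-pitched tone (object on left)")
    if right_m is not None and right_m <= 0.305:  # within 12 in on the right
        signals.append("continuous higher-pitched tone (object on right)")
    if drop_m is not None and drop_m <= 1.22:     # step/drop-off within 4 ft
        signals.append("low-pitched signal (step, curb or drop-off)")
    return signals or ["silent (path clear)"]

print(pathfinder_signals(front_m=1.8, right_m=0.25))
```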
Although the obstacle detection information described above is transmitted using sounds, the Wheelchair Pathfinder is also available with a tactile interface
making it suitable for those who prefer tactile information, such as deafblind people. The system can be used in several different ways for clear-path detection and navigation:
• Clear-path detection. When the intermittent sound from the forward path detector is heard, the wheelchair should be turned slowly until the intermittent beeping stops. The absence of intermittent beeping indicates that the forward path is now clear.
• Finding a landmark. In this case the beeping signal is used as a homing signal: the user keeps the intermittent beeping signal in front of the wheelchair and homes in on a desired destination landmark.
• Straight travel. In this mode the side beam is used. The wheelchair is positioned within 30 cm of a wall, for example in a hallway, so that the side beam issues a constant, continuous tone. Forward travel that maintains this side-beam tone means that the wheelchair is travelling parallel to the wall. When the side-beam tone stops, an open doorway or a corridor intersection has been reached.
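The straight-travel pattern is essentially a simple decision rule driven by the side-beam tone. A minimal sketch follows (not from the source; the function and its input are illustrative assumptions standing in for the user's perception of the tone):

```python
# A minimal sketch (not from the source): the 'straight travel' usage
# pattern as a decision rule on the continuous side-beam tone.

def straight_travel_action(side_tone_on: bool) -> str:
    """Decide the next action from the state of the side-beam tone."""
    if side_tone_on:
        # Tone present: still within ~30 cm of the wall, so the chair
        # is travelling parallel to it.
        return "continue forward"
    # Tone stopped: the wall has ended, i.e. an open doorway or a
    # corridor intersection has been reached.
    return "stop: doorway or intersection reached"

print(straight_travel_action(True))
print(straight_travel_action(False))
```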
5.6.2 Obstacle and Object Location Detectors
There is a clear, though sometimes subtle, difference between obstacle avoidance and object location. The primary mobility aids, such as the guide dog and the long cane, provide obstacle avoidance information. However, there are differences between this task and the object location task, where the user actually wants to make contact with an object. This could include detecting the end of a queue, the start of a sales counter or the location of lift doors, a waste paper bin or a desk chair. A long cane is probably inappropriate for this task and, in addition, could be damaged in some circumstances, for example by closing lift doors. To derive a specification for a suitable assistive technology system, the CAT model checklist is completed, as shown in Table 5.4.
This results in a specification for a small lightweight hand-held device that could be used to explore and interrogate the surroundings of the user or find objects in the user’s immediate vicinity. The obvious analogy is a small torch whose light beam is used to illuminate the immediate surroundings of a sighted user.
Miniguide
The Miniguide (GDP 2006) is an assistive technology solution that meets many of the requirements of this specification. It is a small, hand-held, torch-like device, 80 mm long, 38 mm wide and 23 mm thick. It uses the echolocation principle with ultrasonics to detect objects within the range of its beam. It is essentially a support device to be used in conjunction with a primary mobility technology, such as the long cane or a guide dog. However, it can also be used to interrogate the spatial layout of the local space around a user who might, for example, be trying to find the position of one particular item.
Table 5.4. CAT model checklist – specification for obstacle/object detection

| Attribute | Specification |
|---|---|
| **Attribute – context** | |
| Context – cultural and social | Travelling alone on public transport or on foot socially acceptable |
| Context – national context | Modern infrastructure; anti-discrimination legislation in place |
| Context – local settings | Urban community; noisy outdoor and indoor environments; stationary and moving objects; weather includes rain, snow, wind |
| **Attribute – person** | |
| Person – social aspects | Support from family and friends; training in orientation and mobility is available |
| Person – attitudes | Willing to try new assistive technology and will persevere |
| Person – characteristics | Visually impaired with tunnel vision, blind or deafblind; physically mobile; preference for independent travel |
| **Attribute – activity** | |
| Activity – mobility | Obstacle detection; spatial awareness |
| **Attribute – assistive technology** | |
| AT – activity specification – task specification | Locates stationary objects; preferably also able to locate moving objects; able to identify some objects; obtains additional information to provide a sense of spatial awareness; identifies openings such as open doors; provides information on objects, openings etc. to the user in an accessible and easily comprehensible form |
| AT – activity specification – user requirements | Portable; small; wireless; options for sensory channels – tactile, audio; lightweight; battery lasts several hours, i.e. for a long journey |
| AT – design issues – design approach | Design for visually impaired, blind and deafblind end-user groups |
| AT – design issues – technology selection | Options: ultrasonics, laser; battery powered; integral battery or separate battery pack |
| AT – AT system – system interfaces | Provides environmental information, including object distances; should provide feedback using different sensory modalities; information provided should be unambiguous |
| AT – AT system – technical performance | Robust construction; weather resistant; high reliability; battery lasts several hours, i.e. during a long journey |
| AT – end-user issues – ease of use | Portable; wireless; fits in a pocket; easy battery replacement |
| AT – end-user issues – mode of use | Hand-held with wrist strap for safety or clips to a belt; audio via single earphone to enable the user to hear environmental sounds |
| AT – end-user issues – training requirements | Minimal; device should be intuitive to use |
| AT – end-user issues – documentation | Available in different formats: standard and large print, Braille, on audio-cassette and on the Web |
The first Miniguide was introduced to the market in 2000; this model is illustrated in Figure 5.16a. It was superseded in 2005 by a new realisation of the Miniguide concept that used an injection-moulded case, as shown in Figure 5.16b.
The description that follows is based on the User Guide for the new 2005 Miniguide model (GDP 2006). The Miniguide is battery powered and has four basic beam ranges for object detection: 0.5 m, 1 m, 2 m and 4 m. The torch-like body has a depressible ON/OFF switch and an earplug socket at the back for an audio feedback signal.
Figure 5.16a,b. Miniguide: a Miniguide circa 2000; b Miniguide circa 2005 (photographs reproduced by kind permission of GDP Research, Australia)
However, the Miniguide is basically tactile: the torch vibrates at a rate that decreases with increasing distance of the object from the device. Slow vibrations indicate that the detected object is at the limit of the Miniguide’s beam-range setting; a rapid vibration rate indicates that it is very close to the Miniguide; and no vibration indicates that there is no object within the range field of the beam. This tactile modality is suitable for visually impaired, blind and deafblind users.
However, the use of the earplug socket to obtain an audio feedback signal creates an additional or alternative sensory information channel. Thus, the Miniguide can be configured to supply (1) only tactile feedback, (2) both tactile and audio feedback or (3) only audio feedback. The use of a single earphone allows the user to perceive the ambient background environmental noises, which provide important locational information to blind people, while also receiving aural information from the Miniguide.
The Miniguide will be silent if it cannot detect an object in its current beam range setting. If an object is detected then an audio-feedback tone is produced. Two different types of audio tones are available:
1. “Chirp” audio tones. This is the default audio tone for the Miniguide. The rate of “chirp” tones heard is the aural equivalent of the rate of vibration of the small motor within the Miniguide. A high “chirp” rate indicates that the object detected is close to the device.
2. Sweep audio tones. For the sweep audio setting, a continuous tone is produced, where the pitch indicates the closeness of the detected object to the Miniguide: the higher the pitch of the sweep audio tone, the closer the object. With practice, this variation in pitch can provide the user with some spatial awareness of their immediate environment. For example, the presence of a doorway with a wall on both sides would be indicated by the sequence high (wall), low (doorway), high (wall) in the audio sweep signal.
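The Miniguide's behaviour can be summarised as a single distance-to-feedback mapping. A minimal sketch follows (not from the source; only the trends come from the text — closer object means faster vibration and chirps and a higher sweep pitch — while the actual rates and frequencies are assumptions):

```python
# A minimal sketch (not from the source): the Miniguide's
# distance-to-feedback behaviour. The 1-10 Hz vibration rate and the
# 200-2000 Hz sweep-tone range are assumed, illustrative values.

from typing import Optional

def miniguide_feedback(distance_m: Optional[float], range_m: float = 2.0) -> dict:
    """Map object distance to vibration rate, chirp rate and sweep pitch.

    distance_m is None when no object lies within the beam.
    range_m is one of the device's range settings (0.5, 1, 2 or 4 m).
    """
    if distance_m is None or distance_m > range_m:
        return {"vibration_hz": 0.0, "chirp_hz": 0.0, "sweep_hz": None}
    # Closeness in [0, 1]: 0 at the range limit, 1 at the device itself.
    closeness = 1.0 - distance_m / range_m
    rate = 1.0 + 9.0 * closeness  # assumed 1-10 Hz vibration rate
    return {
        "vibration_hz": rate,                    # tactile feedback
        "chirp_hz": rate,                        # chirps mirror the vibration rate
        "sweep_hz": 200.0 + 1800.0 * closeness,  # assumed 200-2000 Hz sweep tone
    }

for d in (None, 1.9, 1.0, 0.2):
    print(d, "->", miniguide_feedback(d))
```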
