- •Preface
- •Contents
- •1 Disability and Assistive Technology Systems
- •Learning Objectives
- •1.1 The Social Context of Disability
- •1.2 Assistive Technology Outcomes: Quality of Life
- •1.2.1 Some General Issues
- •1.2.2 Definition and Measurement of Quality of Life
- •1.2.3 Health Related Quality of Life Measurement
- •1.2.4 Assistive Technology Quality of Life Procedures
- •1.2.5 Summary and Conclusions
- •1.3 Modelling Assistive Technology Systems
- •1.3.1 Modelling Approaches: A Review
- •1.3.2 Modelling Human Activities
- •1.4 The Comprehensive Assistive Technology (CAT) Model
- •1.4.1 Justification of the Choice of Model
- •1.4.2 The Structure of the CAT Model
- •1.5 Using the Comprehensive Assistive Technology Model
- •1.5.1 Using the Activity Attribute of the CAT Model to Determine Gaps in Assistive Technology Provision
- •1.5.2 Conceptual Structure of Assistive Technology Systems
- •1.5.3 Investigating Assistive Technology Systems
- •1.5.4 Analysis of Assistive Technology Systems
- •1.5.5 Synthesis of Assistive Technology Systems
- •1.6 Chapter Summary
- •Questions
- •Projects
- •References
- •2 Perception, the Eye and Assistive Technology Issues
- •Learning Objectives
- •2.1 Perception
- •2.1.1 Introduction
- •2.1.2 Common Laws and Properties of the Different Senses
- •2.1.3 Multisensory Perception
- •2.1.4 Multisensory Perception in the Superior Colliculus
- •2.1.5 Studies of Multisensory Perception
- •2.2 The Visual System
- •2.2.1 Introduction
- •2.2.2 The Lens
- •2.2.3 The Iris and Pupil
- •2.2.4 Intraocular Pressure
- •2.2.5 Extraocular Muscles
- •2.2.6 Eyelids and Tears
- •2.3 Visual Processing in the Retina, Lateral Geniculate Nucleus and the Brain
- •2.3.1 Nerve Cells
- •2.3.2 The Retina
- •2.3.3 The Optic Nerve, Optic Tract and Optic Radiation
- •2.3.4 The Lateral Geniculate Body or Nucleus
- •2.3.5 The Primary Visual or Striate Cortex
- •2.3.6 The Extrastriate Visual Cortex and the Superior Colliculus
- •2.3.7 Visual Pathways
- •2.4 Vision in Action
- •2.4.1 Image Formation
- •2.4.2 Accommodation
- •2.4.3 Response to Light
- •2.4.4 Colour Vision
- •2.4.5 Binocular Vision and Stereopsis
- •2.5 Visual Impairment and Assistive Technology
- •2.5.1 Demographics of Visual Impairment
- •2.5.2 Illustrations of Some Types of Visual Impairment
- •2.5.3 Further Types of Visual Impairment
- •2.5.4 Colour Blindness
- •2.5.5 Corrective Lenses
- •2.6 Chapter Summary
- •Questions
- •Projects
- •References
- •3 Sight Measurement
- •Learning Objectives
- •3.1 Introduction
- •3.2 Visual Acuity
- •3.2.1 Using the Chart
- •3.2.2 Variations in Measuring Visual Acuity
- •3.3 Field of Vision Tests
- •3.3.1 The Normal Visual Field
- •3.3.2 The Tangent Screen
- •3.3.3 Kinetic Perimetry
- •3.3.4 Static Perimetry
- •3.4 Pressure Measurement
- •3.5 Biometry
- •3.6 Ocular Examination
- •3.7 Optical Coherence Tomography
- •3.7.1 Echo Delay
- •3.7.2 Low Coherence Interferometry
- •3.7.3 An OCT Scanner
- •3.8 Ocular Electrophysiology
- •3.8.1 The Electrooculogram (EOG)
- •3.8.2 The Electroretinogram (ERG)
- •3.8.3 The Pattern Electroretinogram
- •3.8.4 The Visual Evoked Cortical Potential
- •3.8.5 Multifocal Electrophysiology
- •3.9 Chapter Summary
- •Glossary
- •Questions
- •Projects
- •4 Haptics as a Substitute for Vision
- •Learning Objectives
- •4.1 Introduction
- •4.1.1 Physiological Basis
- •4.1.2 Passive Touch, Active Touch and Haptics
- •4.1.3 Exploratory Procedures
- •4.2 Vision and Haptics Compared
- •4.3 The Capacity of Bare Fingers in Real Environments
- •4.3.1 Visually Impaired People’s Use of Haptics Without any Technical Aid
- •4.3.2 Speech Perceived by Hard-of-hearing People Using Bare Hands
- •4.3.3 Natural Capacity of Touch and Evaluation of Technical Aids
- •4.4 Haptic Low-tech Aids
- •4.4.1 The Long Cane
- •4.4.2 The Guide Dog
- •4.4.3 Braille
- •4.4.4 Embossed Pictures
- •4.4.5 The Main Lesson from Low-tech Aids
- •4.5 Matrices of Point Stimuli
- •4.5.1 Aids for Orientation and Mobility
- •4.5.2 Aids for Reading Text
- •4.5.3 Aids for Reading Pictures
- •4.6 Computer-based Aids for Graphical Information
- •4.6.1 Aids for Graphical User Interfaces
- •4.6.2 Tactile Computer Mouse
- •4.7 Haptic Displays
- •4.7.1 Information Available via a Haptic Display
- •4.7.2 What Information Can Be Obtained with the Reduced Information?
- •4.7.3 Haptic Displays as Aids for the Visually Impaired
- •4.8 Chapter Summary
- •4.9 Concluding Remarks
- •Questions
- •Projects
- •References
- •5 Mobility: An Overview
- •Learning Objectives
- •5.1 Introduction
- •5.2 The Travel Activity
- •5.2.1 Understanding Mobility
- •5.2.2 Assistive Technology Systems for the Travel Process
- •5.3 The Historical Development of Travel Aids for Visually Impaired and Blind People
- •5.4 Obstacle Avoidance AT: Guide Dogs and Robotic Guide Walkers
- •5.4.1 Guide Dogs
- •5.4.2 Robotic Guides and Walkers
- •5.5 Obstacle Avoidance AT: Canes
- •5.5.1 Long Canes
- •5.5.2 Technology Canes
- •5.6 Other Mobility Assistive Technology Approaches
- •5.6.1 Clear-path Indicators
- •5.6.2 Obstacle and Object Location Detectors
- •5.6.3 The vOICe System
- •5.7 Orientation Assistive Technology Systems
- •5.7.1 Global Positioning System Orientation Technology
- •5.7.2 Other Technology Options for Orientation Systems
- •5.8 Accessible Environments
- •5.9 Chapter Summary
- •Questions
- •Projects
- •References
- •6 Mobility AT: The Batcane (UltraCane)
- •Learning Objectives
- •6.1 Mobility Background and Introduction
- •6.2 Principles of Ultrasonics
- •6.2.1 Ultrasonic Waves
- •6.2.2 Attenuation and Reflection Interactions
- •6.2.3 Transducer Geometry
- •6.3 Bats and Signal Processing
- •6.3.1 Principles of Bat Sonar
- •6.3.2 Echolocation Call Structures
- •6.3.3 Signal Processing Capabilities
- •6.3.4 Applicability of Bat Echolocation to Sonar System Design
- •6.4 Design and Construction Issues
- •6.4.1 Outline Requirement Specification
- •6.4.2 Ultrasonic Spatial Sensor Subsystem
- •6.4.3 Trial Prototype Spatial Sensor Arrangement
- •6.4.4 Tactile User Interface Subsystem
- •6.4.5 Cognitive Mapping
- •6.4.6 Embedded Processing Control Requirements
- •6.5 Concept Phase and Engineering Prototype Phase Trials
- •6.6 Case Study in Commercialisation
- •6.7 Chapter Summary
- •Questions
- •Projects
- •References
- •7 Navigation AT: Context-aware Computing
- •Learning Objectives
- •7.1 Defining the Orientation/Navigation Problem
- •7.1.1 Orientation, Mobility and Navigation
- •7.1.2 Traditional Mobility Aids
- •7.1.3 Limitations of Traditional Aids
- •7.2 Cognitive Maps
- •7.2.1 Learning and Acquiring Spatial Information
- •7.2.2 Factors that Influence How Knowledge Is Acquired
- •7.2.3 The Structure and Form of Cognitive Maps
- •7.3 Overview of Existing Technologies
- •7.3.1 Technologies for Distant Navigation
- •7.3.2 User Interface Output Technologies
- •7.4 Principles of Mobile Context-aware Computing
- •7.4.1 Adding Context to User-computer Interaction
- •7.4.2 Acquiring Useful Contextual Information
- •7.4.3 Capabilities of Context-awareness
- •7.4.4 Application of Context-aware Principles
- •7.4.5 Technological Challenges and Unresolved Usability Issues
- •7.5 Test Procedures
- •7.5.1 Human Computer Interaction (HCI)
- •7.5.2 Cognitive Mapping
- •7.5.3 Overall Approach
- •7.6 Future Positioning Technologies
- •7.7 Chapter Summary
- •7.7.1 Conclusions
- •Questions
- •Projects
- •References
- •Learning Objectives
- •8.1 Defining the Navigation Problem
- •8.1.1 What is the Importance of Location Information?
- •8.1.2 What Mobility Tools and Traditional Maps are Available for the Blind?
- •8.2 Principles of Global Positioning Systems
- •8.2.1 What is the Global Positioning System?
- •8.2.2 Accuracy of GPS: Some General Issues
- •8.2.3 Accuracy of GPS: Some Technical Issues
- •8.2.4 Frequency Spectrum of GPS, Present and Future
- •8.2.5 Other GPS Systems
- •8.3 Application of GPS Principles
- •8.4 Design Issues
- •8.5 Development Issues
- •8.5.1 Choosing an Appropriate Platform
- •8.5.2 Choosing the GPS Receiver
- •8.5.3 Creating a Packaged System
- •8.5.4 Integration vs Stand-alone
- •8.6 User Interface Design Issues
- •8.6.1 How to Present the Information
- •8.6.2 When to Present the Information
- •8.6.3 What Information to Present
- •8.7 Test Procedures and Results
- •8.8 Case Study in Commercialisation
- •8.8.1 Understanding the Value of the Technology
- •8.8.2 Limitations of the Technology
- •8.8.3 Ongoing Development
- •8.9 Chapter Summary
- •Questions
- •Projects
- •References
- •9 Electronic Travel Aids: An Assessment
- •Learning Objectives
- •9.1 Introduction
- •9.2 Why Do an Assessment?
- •9.3 Methodologies for Assessments of Electronic Travel Aids
- •9.3.1 Eliciting User Requirements
- •9.3.2 Developing a User Requirements Specification and Heuristic Evaluation
- •9.3.3 Hands-on Assessments
- •9.3.4 Methodology Used for Assessments in this Chapter
- •9.4 Modern-day Electronic Travel Aids
- •9.4.1 The Distinction Between Mobility and Navigation Aids
- •9.4.2 The Distinction Between Primary and Secondary Aids
- •9.4.3 User Requirements: Mobility and Navigation Aids
- •9.4.4 Mobility Aids
- •9.4.5 Mobility Aids: Have They Solved the Mobility Challenge?
- •9.4.6 Navigation Aids
- •9.4.7 Navigation Aids: Have They Solved the Navigation Challenge?
- •9.5 Training
- •9.6 Chapter Summary and Conclusions
- •Questions
- •Projects
- •References
- •10 Accessible Environments
- •Learning Objectives
- •10.1 Introduction
- •10.1.1 Legislative and Regulatory Framework
- •10.1.2 Accessible Environments: An Overview
- •10.1.3 Principles for the Design of Accessible Environments
- •10.2 Physical Environments: The Streetscape
- •10.2.1 Pavements and Pathways
- •10.2.2 Road Crossings
- •10.2.3 Bollards and Street Furniture
- •10.3 Physical Environments: Buildings
- •10.3.1 General Exterior Issues
- •10.3.2 General Interior Issues
- •10.3.4 Signs and Notices
- •10.3.5 Interior Building Services
- •10.4 Environmental Information and Navigation Technologies
- •10.4.1 Audio Information System: General Issues
- •10.4.2 Some Technologies for Environmental Information Systems
- •10.5 Accessible Public Transport
- •10.5.1 Accessible Public Transportation: Design Issues
- •10.6 Chapter Summary
- •Questions
- •Projects
- •References
- •11 Accessible Bus System: A Bluetooth Application
- •Learning Objectives
- •11.1 Introduction
- •11.2 Bluetooth Fundamentals
- •11.2.1 Brief History of Bluetooth
- •11.2.2 Bluetooth Power Class
- •11.2.3 Protocol Stack
- •11.2.4 Bluetooth Profile
- •11.2.5 Piconet
- •11.3 Design Issues
- •11.3.1 System Architecture
- •11.3.2 Hardware Requirements
- •11.3.3 Software Requirements
- •11.4 Developmental Issues
- •11.4.1 Bluetooth Server
- •11.4.2 Bluetooth Client (Mobile Device)
- •11.4.3 User Interface
- •11.5 Commercialisation Issues
- •11.6 Chapter Summary
- •Questions
- •Projects
- •References
- •12 Accessible Information: An Overview
- •Learning Objectives
- •12.1 Introduction
- •12.2 Low Vision Aids
- •12.2.1 Basic Principles
- •12.3 Low Vision Assistive Technology Systems
- •12.3.1 Large Print
- •12.3.2 Closed Circuit Television Systems
- •12.3.3 Video Magnifiers
- •12.3.4 Telescopic Assistive Systems
- •12.4 Audio-transcription of Printed Information
- •12.4.1 Stand-alone Reading Systems
- •12.4.2 Read IT Project
- •12.5 Tactile Access to Information
- •12.5.1 Braille
- •12.5.2 Moon
- •12.5.3 Braille Devices
- •12.6 Accessible Computer Systems
- •12.6.1 Input Devices
- •12.6.2 Output Devices
- •12.6.3 Computer-based Reading Systems
- •12.6.4 Accessible Portable Computers
- •12.7 Accessible Internet
- •12.7.1 World Wide Web Guidelines
- •12.7.2 Guidelines for Web Authoring Tools
- •12.7.3 Accessible Adobe Portable Document Format (PDF) Documents
- •12.7.4 Bobby Approval
- •12.8 Telecommunications
- •12.8.1 Voice Dialling General Principles
- •12.8.2 Talking Caller ID
- •12.8.3 Mobile Telephones
- •12.9 Chapter Summary
- •Questions
- •Projects
- •References
- •13 Screen Readers and Screen Magnifiers
- •Learning Objectives
- •13.1 Introduction
- •13.2 Overview of Chapter
- •13.3 Interacting with a Graphical User Interface
- •13.4 Screen Magnifiers
- •13.4.1 Overview
- •13.4.2 Magnification Modes
- •13.4.3 Other Interface Considerations
- •13.4.4 The Architecture and Implementation of Screen Magnifiers
- •13.5 Screen Readers
- •13.5.1 Overview
- •13.5.2 The Architecture and Implementation of a Screen Reader
- •13.5.3 Using a Braille Display
- •13.5.4 User Interface Issues
- •13.6 Hybrid Screen Reader Magnifiers
- •13.7 Self-magnifying Applications
- •13.8 Self-voicing Applications
- •13.9 Application Adaptors
- •13.10 Chapter Summary
- •Questions
- •Projects
- •References
- •14 Speech, Text and Braille Conversion Technology
- •Learning Objectives
- •14.1 Introduction
- •14.1.1 Introducing Mode Conversion
- •14.1.2 Outline of the Chapter
- •14.2 Prerequisites for Speech and Text Conversion Technology
- •14.2.1 The Spectral Structure of Speech
- •14.2.2 The Hierarchical Structure of Spoken Language
- •14.2.3 Prosody
- •14.3 Speech-to-text Conversion
- •14.3.1 Principles of Pattern Recognition
- •14.3.2 Principles of Speech Recognition
- •14.3.3 Equipment and Applications
- •14.4 Text-to-speech Conversion
- •14.4.1 Principles of Speech Production
- •14.4.2 Principles of Acoustical Synthesis
- •14.4.3 Equipment and Applications
- •14.5 Braille Conversion
- •14.5.1 Introduction
- •14.5.2 Text-to-Braille Conversion
- •14.5.3 Braille-to-text Conversion
- •14.6 Commercial Equipment and Applications
- •14.6.1 Speech vs Braille
- •14.6.2 Speech Output in Devices for Daily Life
- •14.6.3 Portable Text-based Devices
- •14.6.4 Access to Computers
- •14.6.5 Reading Machines
- •14.6.6 Access to Telecommunication Devices
- •14.7 Discussion and the Future Outlook
- •14.7.1 End-user Studies
- •14.7.2 Discussion and Issues Arising
- •14.7.3 Future Developments
- •Questions
- •Projects
- •References
- •15 Accessing Books and Documents
- •Learning Objectives
- •15.1 Introduction: The Challenge of Accessing the Printed Page
- •15.2 Basics of Optical Character Recognition Technology
- •15.2.1 Details of Optical Character Recognition Technology
- •15.2.2 Practical Issues with Optical Character Recognition Technology
- •15.3 Reading Systems
- •15.4 DAISY Technology
- •15.4.1 DAISY Full Audio Books
- •15.4.2 DAISY Full Text Books
- •15.4.3 DAISY and Other Formats
- •15.5 Players
- •15.6 Accessing Textbooks
- •15.7 Accessing Newspapers
- •15.8 Future Technology Developments
- •15.9 Chapter Summary and Conclusion
- •15.9.1 Chapter Summary
- •15.9.2 Conclusion
- •Questions
- •Projects
- •References
- •Learning Objectives
- •16.1 Introduction
- •16.1.1 Print Impairments
- •16.1.2 Music Notation
- •16.2 Overview of Accessible Music
- •16.2.1 Formats
- •16.2.2 Technical Aspects
- •16.3 Some Recent Initiatives and Projects
- •16.3.2 Play 2
- •16.3.3 Dancing Dots
- •16.3.4 Toccata
- •16.4 Problems to Be Overcome
- •16.4.1 A Content Processing Layer
- •16.4.2 Standardization of Accessible Music Technology
- •16.5 Unifying Accessible Design, Technology and Musical Content
- •16.5.1 Braille Music
- •16.5.2 Talking Music
- •16.6 Conclusions
- •16.6.1 Design for All or Accessibility from Scratch
- •16.6.2 Applying Design for All in Emerging Standards
- •16.6.3 Accessibility in Emerging Technology
- •Questions
- •Projects
- •References
- •17 Assistive Technology for Daily Living
- •Learning Objectives
- •17.1 Introduction
- •17.2 Personal Care
- •17.2.1 Labelling Systems
- •17.2.2 Healthcare Monitoring
- •17.3 Time-keeping, Alarms and Alerting
- •17.3.1 Time-keeping
- •17.3.2 Alarms and Alerting
- •17.4 Food Preparation and Consumption
- •17.4.1 Talking Kitchen Scales
- •17.4.2 Talking Measuring Jug
- •17.4.3 Liquid Level Indicator
- •17.4.4 Talking Microwave Oven
- •17.4.5 Talking Kitchen and Remote Thermometers
- •17.4.6 Braille Salt and Pepper Set
- •17.5 Environmental Control and Use of Appliances
- •17.5.1 Light Probes
- •17.5.2 Colour Probes
- •17.5.3 Talking and Tactile Thermometers and Barometers
- •17.5.4 Using Appliances
- •17.6 Money, Finance and Shopping
- •17.6.1 Mechanical Money Indicators
- •17.6.2 Electronic Money Identifiers
- •17.6.3 Electronic Purse
- •17.6.4 Automatic Teller Machines (ATMs)
- •17.7 Communications and Access to Information: Other Technologies
- •17.7.1 Information Kiosks and Other Self-service Systems
- •17.7.2 Using Smart Cards
- •17.7.3 EZ Access®
- •17.8 Chapter Summary
- •Questions
- •Projects
- •References
- •Learning Objectives
- •18.1 Introduction
- •18.2 Education: Learning and Teaching
- •18.2.1 Accessing Educational Processes and Approaches
- •18.2.2 Educational Technologies, Devices and Tools
- •18.3 Employment
- •18.3.1 Professional and Person-centred
- •18.3.2 Scientific and Technical
- •18.3.3 Administrative and Secretarial
- •18.3.4 Skilled and Non-skilled (Manual) Trades
- •18.3.5 Working Outside
- •18.4 Recreational Activities
- •18.4.1 Accessing the Visual, Audio and Performing Arts
- •18.4.2 Games, Puzzles, Toys and Collecting
- •18.4.3 Holidays and Visits: Museums, Galleries and Heritage Sites
- •18.4.4 Sports and Outdoor Activities
- •18.4.5 DIY, Art and Craft Activities
- •18.5 Chapter Summary
- •Questions
- •Projects
- •References
- •Biographical Sketches of the Contributors
- •Index
160 4 Haptics as a Substitute for Vision
P.2 Discuss whether there are any differences in usefulness for visually impaired users among the different haptic displays (GALLERY, undated). Their usefulness may vary across tasks, such as accessing the web, reading 2D graphics, and identifying virtual 3D objects and their properties (shape, texture, hardness, weight).
P.3 As discussed in the text, providing spatially distributed information at the contact areas of haptic displays is the most important way of improving their performance in identifying virtual objects. Search the Web for suitable solutions to this requirement. Note that although equidistant pins are a step in the right direction, technology allowing varying distances between sensation points would probably be better still, as it would make possible stimulus variables such as texture gradients, which are important for depth perception.
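As a starting point for P.3, the texture-gradient idea can be explored in simulation. The sketch below is purely illustrative (it assumes a hypothetical pin matrix and is not tied to any real display's API): it computes which rows of a pin matrix to raise so that row spacing shrinks toward the "far" edge, producing the kind of texture-density gradient that signals receding depth.

```python
def gradient_rows(rows=24, near_gap=4, far_gap=1):
    """Row indices of raised pin rows on a simulated pin matrix.

    Rows are spaced near_gap apart at the bottom ("near") edge and
    far_gap apart at the top ("far") edge; the increasing texture
    density toward the top mimics a depth gradient.
    """
    raised = []
    r = rows - 1                      # start at the near (bottom) edge
    while r >= 0:
        raised.append(r)
        frac = r / (rows - 1)         # 1.0 at the bottom, 0.0 at the top
        gap = round(far_gap + frac * (near_gap - far_gap))
        r -= max(gap, 1)              # always advance at least one row
    return raised

if __name__ == "__main__":
    raised = set(gradient_rows())
    for r in range(24):               # crude ASCII rendering of the matrix
        print("".join("o" if r in raised else "." for _ in range(16)))
```

Even this toy makes the point raised in P.3 visible: with equidistant pins the gap between texture rows can only change in whole-pin steps, whereas hardware allowing varying distances between sensation points could render the gradient continuously.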
References
AFB, undated, American Foundation for the Blind, http://www.afb.org
APH, undated, American Printing House for the Blind, http://www.aph.org
Austin, T., and Sleight, R., 1952, Accuracy of tactual discrimination of letters, numerals, and geometric forms, J. Experimental Psychology, Vol. 43, pp. 239–247
Bach-y-Rita, P., 1972, Brain Mechanisms in Sensory Substitution, Academic Press, New York
Bach-y-Rita, P., and Kercel, S.W., 2003, Sensory substitution and the human–machine interface, Trends in Cognitive Sciences, Vol. 7, pp. 541–546
Bach-y-Rita, P., Kaczmarek, K., Tyler, M., and Garcia-Lara, J., 1998, Form perception with a 49-point electrotactile stimulus array on the tongue, J. Rehabilitation Research and Development, Vol. 35, pp. 427–431
Bach-y-Rita, P., Kaczmarek, K.A., and Tyler, M.E., 2003, A tongue-based tactile display for portrayal of environmental characteristics, In J. Hettinger and M.W. Haas (Eds.), Virtual and adaptive environments: Applications, implications, and human performance, Erlbaum, Mahwah, NJ
Ballesteros Jiménez, S., and Heller, M.A. (Eds.), 2004, Touch, blindness and neuroscience, UNED Press, Madrid, Spain
Barth, J.L., 1982, The development and evaluation of a tactile graphics kit, J. Visual Impairment and Blindness, Vol. 76, pp. 269–273
Barth, J.L., and Foulke, E., 1979, Preview: A neglected variable in orientation and mobility, J. Visual Impairment and Blindness, Vol. 73, pp. 41–48
Bentzen, B.L., 1997, Orientation aids, In B.B. Blash, W.R. Wiener and R.L. Welsh (Eds.), Foundations of orientation and mobility, 2nd Edn., pp. 284–316. AFB Press, New York
Bergamasco, M., and Prisco, G., 1998, Design of an anthropomorphic haptic interface for the human arm, In Y. Shirai and S. Hirose (Eds.), Robotics Research, the Eighth International Symposium, pp. 278–289, Springer, London, UK
Bergamasco, M., Avizzano, C., Di Petri, G., Barbagli, F., and Frisoli, A., 2001, The museum of pure form: system architecture, In Procs of 10th IEEE International Workshop on Robot and Human Interactive Communication, pp. 112–117, IEEE Press, Piscataway, NJ
Blash, B.B., Wiener W.R., and Welsh R.L. (Eds.), 1997, Foundations of orientation and mobility, 2nd Edn., AFB Press, New York
Bledsoe, C.W., 1997, Originators of orientation and mobility training, In B.B. Blash, W.R. Wiener and R.L. Welsh (Eds.), Foundations of orientation and mobility, 2nd. Edn., pp. 580–623. AFB Press, New York
Bliss, J.C., 1978, Reading machines for the blind, In G. Gordon (Ed.), Active touch: The mechanism of recognition of objects by manipulation, A multidisciplinary approach, Pergamon Press, Oxford, UK
Brewster, S., and Murray-Smith, R. (Eds.), 2001, Haptic human-computer interaction, Springer, Berlin, Germany
Burdea, G., 1996, Force and touch feedback for virtual reality, Wiley, New York, USA
Burdea, G., and Coiffet, P., 2003, Virtual Reality Technology (2nd Edn. with CD-ROM), Wiley, New York
Burger, D., Mazurier, C., Cesarano, S., and Sagot, J., 1993, In D. Burger and J.-C. Sperandio (Eds.), Non-visual human-computer interactions, Prospects for the visually handicapped, Colloque INSERM, Vol. 228, pp. 97–114, John Libbey Eurotext, Montrouge, France
Choi, S., and Tan, H.Z., 2004, Toward realistic haptic rendering of surface textures, IEEE Computer Graphics and Applications, Vol. 24, pp. 40–47
Cholewiak, R., and Collins, A., 1991, Sensory and physiological basis of touch, In M. A. Heller and W. Schiff (Eds.), The psychology of touch, Erlbaum, Hillsdale, NJ, USA
Chomsky, C., 1986, Analytic study of the Tadoma method: Language abilities of three deaf-blind subjects, J. Speech and Hearing Research, Vol. 29, pp. 332–347
Christou, C., and Wing, A., 2001, Friction and curvature judgment, In C. Baber, M. Faint, S. Wall and A.M. Wing (Eds.), Eurohaptics 2001 Conf. Procs. (Educational Technology Research Papers, ETRP 12, ISSN 1463-9394). The University of Birmingham, Birmingham, England, pp. 36–40
Cratty, B.J., 1971, Movement and spatial awareness in blind children and youth, Thomas, Springfield, IL, USA
Cronin, V., 1977, Active and passive touch at three age levels, Developmental Psychology, Vol. 13, pp. 253–256
DMD, undated, DMD 120060, http://www.metec-ag.de/company.html
Edman, P.K., 1992, Tactile graphics, American Foundation for the Blind, New York
Eriksson, Y., 1998, Tactile pictures: Pictorial representations for the blind 1784–1940, Acta Universitatis Gothoburgensis, Gothenburg Studies in Art and Architecture, No. 4.
Farmer, L.W., and Smith D.L., 1997, Adaptive technology, In B.B. Blash, W.R. Wiener and R.L. Welsh (Eds.), Foundations of orientation and mobility (2nd. Edn.), pp. 231–259, AFB Press, New York
Foulke, E., 1991, Braille, In M. A. Heller and W. Schiff (Eds.), The psychology of touch, pp. 219–233, Erlbaum, Hillsdale, NJ
Frisoli, A., Simoncini, F., and Bergamasco, M., 2002, Mechanical design of a haptic interface for the hand, In Procs. 2002 ASME DETC 27th Biennial Mechanisms and Robotics Conference, Montreal, Canada, pp. 25–32, ASME Press, Montreal, Canada
Frisoli, A., Barbagli, F., Wu, S.-L., Ruffaldi, E., Bergamasco, M., and Salisbury, K., 2004, Evaluation of multipoint contact interfaces in haptic perception of shapes, Manuscript
Frisoli, A., Jansson, G., Bergamasco, M., and Loscos, C., 2005, Evaluation of the Pure-Form haptic displays used for exploration of works of art at museums. Procs of Worldhaptics 2005, Pisa, March 18–20, 2005, Available in Procs on CD-ROM
GALLERY, undated, http://haptic.mech.northwestern.edu/intro/gallery
Gesink, J., Guth, D., and Fehr, B., 1996, A new, talking, gyroscopic device for training blind pedestrians to walk a straight path and make accurate turns, In J.M. Tellevik and G.E. Haugum (Eds.), Conf. Procs: International Mobility Conference No. 8, Trondheim and Melhus, Norway, May 10-19, 1996, pp. 108–110, Tambartun National Resource Center for Special Education of the Visually Handicapped, Tambartun, Norway
Gibson, J.J., 1962, Observations on active touch, Psychological Review, Vol. 69, pp. 477–491
Gill, J.M., 1973, Design, production and evaluation of tactual maps for the blind, Unpublished Ph. D. thesis, University of Warwick, Warwick, UK
Gill, J.M., 1982, Production of tangible graphic displays, In W. Schiff and E. Foulke, Tactual perception: A sourcebook, pp. 405–416, Cambridge University Press, Cambridge, UK
GRAB, undated, GRAB IST-2000-26151, http://www.grab-eu.com
GRAB, 2004, Results of the validation of the maps application, GRAB IST-2000-26151, Deliverable 16/3
Guarniero, G., 1974, Experience of tactile vision, Perception, Vol. 3, pp. 101–104
Guarniero, G., 1977, Tactile vision: a personal view, J. Visual Impairment and Blindness, Vol. 71, pp. 125–130
GUIB project, 1995, Textual and graphical user interfaces for blind people, Final report, Royal National Institute for the Blind, London, UK
Guth, D.A., and LaDuke, R., 1994, The veering tendency and blind pedestrians: An analysis of the problem and literature review, J. Visual Impairment and Blindness, Vol. 88, pp. 391–400
Guth, D.A., and Rieser, J.J., 1997, Perception and the control of locomotion by blind and visually impaired pedestrians, In B.B. Blash, W.R. Wiener and R.L. Welsh (Eds.), Foundations of orientation and mobility,(2nd. Edn.), pp. 9–38, AFB Press, New York
Guth, D.A., Hill, E., and Rieser, J., 1989, Tests of blind pedestrians’ use of traffic sounds for street crossing alignment, J. Visual Impairment and Blindness, Vol. 83, pp. 461–468
Guth, D.A., LaDuke, R., and Gesink, J., 1996, Is feedback about body rotation useful for decreasing veer and increasing the accuracy of turns? Test of a new device, In J.M. Tellevik and G.E. Haugum (Eds.), Conf. Procs: Int. Mobility Conf. No. 8, Trondheim and Melhus, Norway, May 10–19, pp. 111–113, Tambartun National Resource Center for Special Education of the Visually Handicapped, Tambartun, Norway
Hännestrand, B., 1995, Människan, samhället och ledarhunden: studier i ledarhundens historia (Man, society and studies in the history of work with guide dogs), Ph. D. thesis, Uppsala Studies in Economic History, 36 (abstract in English)
Hardwick, A.J., 2004, Rendering of Moon text on simulated tactile diagrams for blind computer users by force-feedback, In S. Ballesteros Jiménez and M.A. Heller (Eds.), Touch, Blindness and Neuroscience, pp. 351–358, UNED Press, Madrid, Spain
Hatwell, Y., 2006, A survey of some contemporary findings on haptic perception, Plenary talk at the Eurohaptics Int. Conf., EH 2006, July 3–6, Paris, France, Procs. p. 3 (abstract)
Heller, M.A., 1989, Texture perception in sighted and blind observers, Perception and Psychophysics, Vol. 45, pp. 49–54
Heller, M.A. (Ed.), 2000, Touch, representation and blindness, Oxford University Press, Oxford, UK
Heller, M.A., and Schiff, W. (Eds.), 1991, The psychology of touch, Erlbaum, Hillsdale, NJ
Hollins, M., Faldowski, R., Rao, S., and Young, F., 1993, Perceptual dimensions of tactile surface texture: A multidimensional scaling analysis, Perception and Psychophysics, Vol. 54, pp. 697–705
Holmes, E., Michel, R., and Raab, A., 1995, Computerunterstützte Erkundung digitaler Karten durch Sehbehinderte (Computer supported exploration of digital maps for visually impaired people), In W. Laufenberg and J. Lötzsch (Eds.), Taktile Medien. Kolloquium über tastbare Abbildungen für Blinde, pp. 81–87, Deutsche Blindenstudienanstalt e.V., Carl-Strehl-Schule, Marburg, Germany
Holmes, E., Jansson, G., and Jansson, A., 1996, Exploring auditorily enhanced maps for travel in new environments, In D. Burger (Ed.), New technologies in the education of the visually handicapped, pp. 191–196, John Libbey Eurotext, Montrouge, France
Holmes, E., Hughes, B., and Jansson, G., 1998, Haptic perception of texture gradients, Perception, Vol. 27, pp. 993–1008
Hughes, B., and Jansson, G., 1994, Texture perception via active touch, Human Movement Science, Vol. 13, pp. 301–333
James, G., and Armstrong, J.D., 1976, Handbook on mobility maps, Mobility Monographs, No 2, Blind Mobility Research Unit, Department of Psychology, University of Nottingham, Nottingham, UK
Jansson, G., 1972, Symbols for tactile maps, In B. Lindquist and N. Trovald (Eds.), European Conf. on Educational Research for the Visually Handicapped (Rep No. 31, pp. 66–77), The Uppsala Institute for Education, Dept. of Education, Uppsala, Sweden
Jansson, G., 1983, Tactile guidance of movement, Int. J. Neuroscience, Vol. 19, pp. 37–46
Jansson, G., 1993, Perception of the amount of fluid in a vessel shaken by hand, In S.S. Valenti and J.B. Pittinger (Eds.), Studies in perception and action II, Posters presented at the VIIth Int. Conf Event Perception and Action. August 8-13, 1993. University of British Columbia, Vancouver, BC, Canada, pp. 263–267, Erlbaum, Hillsdale, NJ
Jansson, G., 1995, Information about direction – a comparison of verbal and vibro-tactile information forms, In E. Holmes, A. Jansson, G. Jansson, V. Johnson and H. Petrie (Eds.), Report on full evaluation of prototype interfaces for the MoBIC Travel Aid. Unpublished report. TIDE project 1148-MoBIC
Jansson, G., 1999, Can a haptic display rendering virtual 3D objects be useful for people with visual impairment?, J. Visual Impairment and Blindness, Vol. 93, pp. 426–429
Jansson, G., 2000a, Basic issues concerning visually impaired people’s use of haptic displays, In P. Sharkey, A. Cesarani, L. Pugnatti and A. Rizzo (Eds.), 3rd Int. Conf. Disability, Virtual Reality and Associated Technologies – Procs, pp. 33–38, 23–25 September, Alghero, Sardinia, Italy, University of Reading, Reading, UK, Also available at http://www.icdvrat.reading.ac.uk/2000/papers/2000_05.pdf
Jansson, G., 2000b, Spatial orientation and mobility of people with vision impairment, In B. Silverstone, M.A. Lang, B.P. Rosenthal and E.E. Faye (Eds.), The Lighthouse handbook on visual impairment and vision rehabilitation, pp. 359–375, Oxford University Press, New York
Jansson, G., 2001, The potential usefulness of high-tech aids for visually impaired seniors, In H.-W. Wahl and H.-E. Schulze (Eds.), On the special needs of blind and low vision seniors, pp. 231–238, IOS Press, Amsterdam
Jansson, G., 2003, Tactile maps – overview of research and development, In Y. Eriksson, G. Jansson and M. Strucel (Eds.), Tactile maps, Guidance in map production, pp. 45–78, The Swedish Braille Authority, Stockholm, Sweden
Jansson, G., and Billberger, K., 1999, The PHANToM used without visual guidance, In Proceedings of the First PHANToM Users Research Symposium (PURS99), May 21–22, 1999, Deutsches Krebsforschungszentrum, Heidelberg, Germany
Jansson, G., and Brabyn, L., 1981, Tactually guided batting, Uppsala Psychological Reports, No. 304, Department of Psychology, Uppsala University, Uppsala, Sweden
Jansson, G., and Holmes, E., 2003, Can we read depth in tactile pictures? Potentials suggested by research in tactile perception, In E. Axel and N. Levant (Eds.), Art beyond vision: A resource guide to art, creativity, and visual impairment, pp. 146–156, Art Education for the Blind and American Foundation for the Blind, New York
Jansson, G., and Ivås, A., 2001, Can the efficiency of a haptic display be increased by short-time practice in exploration?, In S. Brewster and R. Murray-Smith (Eds.), Haptic Human-Computer Interaction, pp. 85–91, Springer, Heidelberg, Germany
Jansson, G., and Larsson, K., 2002, Identification of haptic virtual objects with different degrees of complexity, In S.A. Wall, B. Riedel, A. Crossan and M.R. McGee (Eds.), Eurohaptics 2002, Conf. Procs, pp. 57–60, Edinburgh University, Edinburgh, UK
Jansson, G., and Monaci, L., 2004, Haptic identification of objects with different numbers of fingers, In S. Ballesteros Jiménez and M.A. Heller (Eds.), Touch, Blindness and Neuroscience, pp. 203–213, UNED Press, Madrid, Spain
Jansson, G., and Monaci, L., 2005, Improving haptic displays: Providing differentiated information at the contact areas is more important than increasing the number of areas. Poster at Worldhaptics 05, Pisa, March 18–20 2005. Available in Proceedings on CD-ROM
Jansson, G., and Monaci, L., 2006, Identification of real objects under conditions similar to those in haptic displays: Providing spatially distributed information at the contact areas is more important than increasing the number of areas, Virtual Reality, Vol. 9, 243–249
Jansson, G., and Öström, M., 2004, The effects of co-location of visual and haptic space on judgements of forms, In M. Buss and M. Fritschi (Eds.), Procs 4th Int. Conf Eurohaptics 2004, pp. 516–519, Technische Universität München, München, Germany
Jansson, G., and Pedersen, P., 2005, Obtaining geographical information from a virtual map with a haptic mouse, International Cartographic Conference (Theme “Maps for Blind and Visually Impaired”), A Coruña, Spain, July 9–16, Available on conference CD-ROM
Jansson, G., and Pieraccioli, C., 2004, Effects of surface properties on the perception of the form of virtual objects, In M. Buss and M. Fritschi (Eds.), Procs 4th Int. Conf. Eurohaptics 2004, pp. 211–216, Technische Universität München, München, Germany
Jansson, G., Billberger, K., Petrie, H., Colwell, C., Kornbrot, D., Fänger, J., König, H., Hardwick, A., and Furner, S., 1999, Haptic virtual environments for blind people: Exploratory experiments with two devices, Int. J. Virtual Reality, Vol. 4, pp. 10–20
Jansson, G., Bergamasco, M., and Frisoli, A., 2003, A New Option for the Visually Impaired to Experience 3D Art at Museums: Manual Exploration of Virtual Copies, Visual Impairment Research, Vol. 5, pp. 1–12
Jansson, G., Juhasz, I., and Cammilton, A., 2006, Reading virtual maps with a haptic mouse: Effects of some modifications of the tactile and audio-tactile information, British J. of Visual Impairment, Vol. 24, 60–66
Jansson, G., Juslin, P., and Poom, L., 2006, Liquid-specific properties can be utilized for haptic perception of amount of liquid in a vessel put in motion, Perception, Vol. 35, pp. 1421–1432
164 4 Haptics as a Substitute for Vision
Johnson, K., 2002, Neural basis of haptic perception, In H. Pashler and S. Yantis (Eds.), Stevens’ Handbook of Experimental Psychology, Vol. 1, Sensation and Perception, 3rd Edn., pp. 537–583, Wiley, New York
Kaczmarek, K.A., and Bach-y-Rita, P., 1995, Tactile displays, In W. Barfield and T. Furness III (Eds.), Virtual environments and advanced interface design, pp. 349–414, Oxford University Press, New York
Katz, D., 1989, The world of touch, (Original work published 1925; Translated by L.E. Kreuger), Erlbaum, Hillsdale, NJ
Kelly, T., and Schwartz, L., 1999, “Talking kiosk” for the visually impaired unveiled at Penn station, Pamphlet
Kennedy, J.M., 1993, Drawing and the blind: Pictures to touch, Yale University Press, New Haven, CT
Kennedy, J.M., Gabias, P., and Nicholls, A., 1991, In M.A. Heller and W. Schiff (Eds.), The psychology of touch, pp. 263–299, Erlbaum, Hillsdale, NJ
Klatzky, R.L., and Lederman, S.J., 1995, Identifying objects from a haptic glance, Perception and Psychophysics, Vol. 57, pp. 1111–1123
Klatzky, R.L., and Lederman, S.J., 2006, The perceived roughness of resistive virtual textures: I. Rendering by a force-feedback mouse, ACM Transactions on Applied Perception, Vol. 3, pp. 1–14
Klatzky, R.L., and Lederman, S.J., 2007, Object recognition by touch, In J. Rieser, D. Ashmead, F. Ebner and A. Corn (Eds.), Blindness and brain plasticity in navigation and object perception, Erlbaum, Mahwah, NJ
Klatzky, R.L., Lederman, S.J., and Metzger, V.A., 1985, Identifying objects by touch: An “expert system”, Perception and Psychophysics, Vol. 37, pp. 299–302
Klatzky, R.L., Loomis, J.M., Lederman, S.J., Wake, H., and Fujita, N., 1993, Haptic identification of objects and their depictions, Perception and Psychophysics, Vol. 54, pp. 170–178
Klatzky, R.L., Lederman, S.J., Hamilton, C., Grindley, M., and Swendsen, R.H., 2003, Feeling textures through a probe: Effects of probe and surface geometry and exploratory factors, Perception and Psychophysics, Vol. 65, pp. 613–631
Lederman, S., 1981, The perception of surface roughness by active and passive touch, Bull. Psychonomic Society, Vol. 18, pp. 253–255
Lederman, S.J., and Campbell, 1982, Tangible graphs for the blind, Human Factors, Vol. 24, pp. 85–100
Lederman, S.J., and Kinch, D.H., 1979, Texture in tactual maps and graphics for the visually handicapped, J. Visual Impairment and Blindness, Vol. 73, pp. 217–227
Lederman, S.J., and Klatzky, R.L., 1987, Hand movements: A window into haptic object recognition, Cognitive Psychology, Vol. 19, pp. 342–368
Lederman, S.J., and Klatzky, R.L., 1999, Sensing and displaying spatially distributed fingertip forces in haptic interfaces for teleoperators and virtual environment systems, Presence, Vol. 8, pp. 86–103
Lederman, S.J., and Klatzky, R.L., 2004a, Haptic identification of common objects: Effects of constraining the manual exploration process, Perception and Psychophysics, Vol. 66, pp. 618–628
Lederman, S.J., and Klatzky, R.L., 2004b, Multisensory texture perception, In G.A. Calvert, C. Spence and B.E. Stein (Eds.), Handbook of multisensory processes, pp. 107–122, MIT Press, Cambridge, MA
Lederman, S.J., Klatzky, R.L., Hamilton, C.L., and Ramsey, G.I., 1999, Perceiving roughness via a rigid probe: Psychophysical effects of exploration speed and mode of touch, Haptics-e, Vol. 1, pp. 1–20
Lederman, S.J., Klatzky, R.L., Tong, C., and Hamilton, C., 2006, The perceived roughness of resistive virtual textures: II. Effects of varying viscosity with a force-feedback device, ACM Transactions on Applied Perception, Vol. 3, pp. 15–30
Leotta, D.F., Rabinowitz, W.M., Reed, C.M., and Durlach, N.I., 1988, Preliminary results of speech reception tests obtained with the synthetic Tadoma system, J. Rehabilitation Research and Development, Vol. 25, pp. 45–52
Lötzsch, J., 1995, Von audio-taktilen Grafiken zu interaktiven 3D-Modellen (From audio-tactile graphics to interactive 3D models), In W. Laufenberg and J. Lötzsch (Eds.), Taktile Medien. Kolloquium über tastbare Abbildungen für Blinde, pp. 130–136, Deutsche Blindenstudienanstalt e.V., Carl-Strehl-Schule, Marburg, Germany
Loo, C.K.C., Hall, L.A., McCloskey, D.I., and Rowe, M.J., 1983, Proprioceptive contributions to tactile identification of figures: Dependence on figure size, Behavioral Brain Research, Vol. 7, pp. 383–386
Magee, L.E., and Kennedy, J.M., 1980, Exploring pictures tactually, Nature, Vol. 283, pp. 287–288
References 165
McLaughlin, M.L., Hespanha, J.P., and Sukhatme, G.S., 2002, Touch in virtual environments: Haptics and the design of interactive systems, Prentice Hall, Upper Saddle River, NJ
Metria, 2003, Utveckling av Tactile Mapper och Tactile GIS (Development of Tactile Mapper and Tactile GIS), Metria, Kiruna, Sweden
Michel, R., 1999, Interaktiver Layoutentwurf für individuelle taktile Karten (Interactive layout plan for individual tactile maps), Ph.D. thesis, Der Fakultät für Informatik, Otto-von-Guericke Universität, Magdeburg, Germany
Millar, S., 1997, Reading by touch, Routledge, London, UK
Monkman, G., and Taylor, P., 1993, Thermal tactile sensing, IEEE Trans. Robotics and Automation, Vol. 9, pp. 313–318
Öhngren, G., 1992, Touching voices: Components of direct tactually supported speechreading, Acta Universitatis Upsaliensis, Comprehensive Summaries of Uppsala Dissertations from the Faculty of Social Sciences, 32, Uppsala University, Uppsala, Sweden
Palacz, O., and Kurcz, E., 1978, Przydatność zmodyfikowanego elektroftalmu EWL-300 wg Starkiewicza dla niewidomych (The usefulness of the modified Electrophthalm EWL-300 designed by Starkiewicz for the blind), Klinika Oczna, Vol. 48, pp. 61–63
PAP, undated, http://www.papenmeier.de/reha/rehae.htm
Parkes, D., 1988, “Nomad”: An audio-tactile tool for the acquisition, use and management of spatially distributed information by visually impaired people, In A.F. Tatham and A.G. Dodds (Eds.), Procs 2nd Int. Symp. on Maps and Graphics for Visually Impaired People, pp. 24–29, London, UK
Pawluk, D.T.V., van Buskirk, C.P., Killebrew, J.H., Hsiao, S.S., and Johnson, K.O., 1998, Control and pattern specification for a high density tactile array, Procs. ASME Dynamic Systems and Control Division, Vol. DSC-64, American Society of Mechanical Engineers (http://www.asme.org)
PFM, undated, IST-2000-29580-PURE-FORM, http://www.pureform.org
PHA, undated, http://www.sensable.com
Pietrzak, T., Pecci, I., and Martin, B., 2006, Static and dynamic tactile cues experiments with VTPlayer mouse, Procs. Eurohaptics Int. Conf., EH 2006, July 3–6, 2006, Paris, France, pp. 63–69
Rabinowitz, W.M., Henderson, D.R., Reed, C.M., Delhorne, L.A., and Durlach, N.I., 1990, Continuing evaluation of a synthetic Tadoma system, J. Acoustical Society of America, Vol. 87, p. 88
Reed, C.M., Rabinowitz, W.M., Durlach, N.I., Braida, L.D., Conway-Fithian, S., and Schultz, M.C., 1985, Research on the Tadoma method of speech communication, J. Acoustical Society of America, Vol. 77, pp. 247–257
Reed, C.M., Durlach, N.I., and Delhorne, L.A., 1992, Natural methods of tactual communication, In I. A. Summers (Ed.), Tactile aids for the hearing impaired, pp. 218–230, Whurr Publishers, London, UK
Richardson, B.L., Symmons, M.A., and Wuillemin, D.B., 2004, The relative importance of cutaneous and kinesthetic cues in raised line drawings identification, In S. Ballesteros Jiménez and M.A. Heller (Eds.), Touch, blindness, and neuroscience, pp. 247–250, UNED Press, Madrid, Spain
RNIB, undated, Royal National Institute of the Blind, http://www.rnib.org
Runeson, S., and Frykholm, G., 1981, Visual perception of lifted weight, J. Experimental Psychology: Human Perception and Performance, Vol. 7, pp. 733–740
Schenkman, B., 1985, Human echolocation: The detection of objects by the blind, Acta Universitatis Upsaliensis, Abstracts of Uppsala Dissertations from the Faculty of Social Sciences 36, Uppsala University, Uppsala, Sweden
Schenkman, B., and Jansson, G., 1986, The detection and localization of objects by the blind with the aid of long cane tapping sounds, Human Factors, Vol. 28, pp. 607–618
Schiff, W., and Foulke, E., 1982, Tactual perception: A sourcebook, Cambridge University Press, Cambridge, UK
Schiff, W., and Isikow, H., 1966, Stimulus redundancy in the tactile perception of histograms, Int. J. for the Education of the Blind, Vol. 16, pp. 1–11
Schultz, M.C., Norton, S.J., Conway-Fithian, S., and Reed, C.M., 1984, A survey of the use of the Tadoma method in the United States and Canada, Volta Review, Vol. 68, pp. 733–737
Shinohara, M., Shimizu, Y., and Mochizuki, A., 1998, Three-dimensional tactile display for the blind, IEEE Trans. Rehabilitation Engineering, Vol. 6, pp. 249–256
Sjöström, C., 2002, Non-visual haptic interaction design: Guidelines and applications, Ph.D. thesis, Number 2:2002, Division of Rehabilitation Engineering Research, Department of Design Sciences (CERTEC), Lund Institute of Technology, Lund, Sweden
Starkiewicz, W., and Kuliszewski, Y., 1963, Active energy radiating system: the 80-channel elektroftalm, In Procs International Congress on Technology and Blindness, American Foundation for the Blind, New York
Stevens, J.C., 1992, Aging and spatial acuity of touch, J. Gerontology, Vol. 47, pp. 35–40
Stevens, J.C., Foulke, E., and Patterson, M., 1996, Tactile acuity, ageing, and Braille reading in long-term blindness, J. Experimental Psychology: Applied, Vol. 2, pp. 91–106
Summers, I.R. (Ed.), 1992, Tactile aids for the hearing impaired, Whurr Publishers, London, UK
Symmons, M., and Richardson, B., 2000, Raised line drawings are spontaneously explored with a single finger, Perception, Vol. 29, pp. 621–626
Symmons, M.A., Richardson, B.L., and Wuillemin, D.B., 2004, Active versus passive touch: Superiority depends more on the task than the mode, In S. Ballesteros Jiménez and M.A. Heller (Eds.), Touch, blindness, and neuroscience, pp. 179–185, UNED Press, Madrid, Spain
Tan, H.Z., 2006, The role of psychophysics in haptic research: An engineer’s perspective, Plenary talk at Eurohaptics Int. Conf., EH 2006, July 3–6, Paris, France, Procs. p. 5 (abstract)
Thayer, S., 1982, Social touching, In W. Schiff and E. Foulke (Eds.), Tactual perception: A sourcebook, pp. 263–304, Cambridge University Press, Cambridge, UK
TRACE, undated, TRACE R&D Center, http://trace.wisc.edu/world/doc_access/
Turvey, M.T., and Carello, C., 1995, Dynamic touch, In W. Epstein and S. Rogers (Eds.), Perception of space and motion, pp. 401–490, Academic Press, San Diego, CA
Ungar, S., Simpson, A., and Blades, M., 2004, Strategies for organizing information while learning a map by blind and sighted people, In S. Ballesteros Jiménez and M.A. Heller (Eds.), Touch, Blindness and Neuroscience, pp. 271–280, UNED Press, Madrid, Spain
van Erp, J.B.F., Carter, J., and Andrew, I., 2006, ISO’s work on tactile and haptic interaction guidelines, Eurohaptics Int. Conf., EH 2006, July 3–6, 2006, Paris, France, Procs., pp. 467–470
Vega-Bermudez, F., Johnson, K., and Hsiao, S.S., 1991, Human tactile pattern recognition: Active versus passive touch, velocity effects, and pattern of confusion, J. Neurophysiology, Vol. 65, pp. 531–546
VIEWPLUS, undated, http://www.viewplus.com/products/touch-audio-learning/
VTPL, undated, http://www.virtouch2.com
Wall, S.A., and Brewster, S., 2004, Providing external memory aids in haptic visualizations for blind computer users, Procs. Fifth International Conf. on Disability, Virtual Reality and Associated Technology, 20th–22nd Sept. 2004, Oxford, United Kingdom, pp. 157–164, University of Reading, Reading, UK, Also available at http://www.icdvrat.reading.ac.uk/2004/index.htm
Wall, S.A., and Brewster, S., 2006, Feeling what you hear: Tactile feedback for navigation of audio graphs, In Procs. ACM CHI 2006 (Montreal, Canada), ACM Press Addison-Wesley, pp. 1123–1132
Wall, S.A., Paynter, K., Shillito, A.M., Wright, M., and Scali, S., 2002, The effect of haptic feedback and stereo graphics in a 3D target acquisition task, In S.A. Wall, B. Riedel, A. Crossan and M.R. McGee (Eds.), Eurohaptics 2002, Conf. Procs, pp. 23–29, University of Edinburgh, Edinburgh, UK
White, B.W., Saunders, F.A., Scadden, L., Bach-y-Rita, P., and Collins, C.C., 1970, Seeing with the skin, Perception and Psychophysics, Vol. 7, pp. 23–27
Whitstock, R.H., Franck, L., and Haneline, R., 1997, Dog guides, In B.B. Blasch, W.R. Wiener and R.L. Welsh (Eds.), Foundations of orientation and mobility, 2nd Edn., pp. 260–283, AFB Press, New York
Wies, E.F., Gardner, J.A., O’Modhrain, S., and Bulatov, V.L., 2001, Web-based touch display for accessible science education, In S.A. Brewster and R. Murray-Smith (Eds.), Haptic human-computer interaction, pp. 52–60, Springer, Berlin, Germany
Wing, A.M., Haggard, P., and Flanagan, J.R. (Eds.), 1996, Hand and brain: The neurophysiology and psychology of hand movements, Academic Press, San Diego, CA, USA
Yu, W., Ramloll, R., and Brewster, S., 2001, Haptic graphs for blind computer users, In S. Brewster and R. Murray-Smith (Eds.), Haptic human-computer interaction, pp. 41–51, Springer, Berlin
Zimmerman, G.J., and Roman, C.A., 1997, Services for children and adults: Standard program design, In B.B. Blasch, W.R. Wiener and R.L. Welsh (Eds.), Foundations of orientation and mobility, 2nd Edn., pp. 383–406, AFB Press, New York
