- •Preface
- •Contents
- •1 Disability and Assistive Technology Systems
- •Learning Objectives
- •1.1 The Social Context of Disability
- •1.2 Assistive Technology Outcomes: Quality of Life
- •1.2.1 Some General Issues
- •1.2.2 Definition and Measurement of Quality of Life
- •1.2.3 Health Related Quality of Life Measurement
- •1.2.4 Assistive Technology Quality of Life Procedures
- •1.2.5 Summary and Conclusions
- •1.3 Modelling Assistive Technology Systems
- •1.3.1 Modelling Approaches: A Review
- •1.3.2 Modelling Human Activities
- •1.4 The Comprehensive Assistive Technology (CAT) Model
- •1.4.1 Justification of the Choice of Model
- •1.4.2 The Structure of the CAT Model
- •1.5 Using the Comprehensive Assistive Technology Model
- •1.5.1 Using the Activity Attribute of the CAT Model to Determine Gaps in Assistive Technology Provision
- •1.5.2 Conceptual Structure of Assistive Technology Systems
- •1.5.3 Investigating Assistive Technology Systems
- •1.5.4 Analysis of Assistive Technology Systems
- •1.5.5 Synthesis of Assistive Technology Systems
- •1.6 Chapter Summary
- •Questions
- •Projects
- •References
- •2 Perception, the Eye and Assistive Technology Issues
- •Learning Objectives
- •2.1 Perception
- •2.1.1 Introduction
- •2.1.2 Common Laws and Properties of the Different Senses
- •2.1.3 Multisensory Perception
- •2.1.4 Multisensory Perception in the Superior Colliculus
- •2.1.5 Studies of Multisensory Perception
- •2.2 The Visual System
- •2.2.1 Introduction
- •2.2.2 The Lens
- •2.2.3 The Iris and Pupil
- •2.2.4 Intraocular Pressure
- •2.2.5 Extraocular Muscles
- •2.2.6 Eyelids and Tears
- •2.3 Visual Processing in the Retina, Lateral Geniculate Nucleus and the Brain
- •2.3.1 Nerve Cells
- •2.3.2 The Retina
- •2.3.3 The Optic Nerve, Optic Tract and Optic Radiation
- •2.3.4 The Lateral Geniculate Body or Nucleus
- •2.3.5 The Primary Visual or Striate Cortex
- •2.3.6 The Extrastriate Visual Cortex and the Superior Colliculus
- •2.3.7 Visual Pathways
- •2.4 Vision in Action
- •2.4.1 Image Formation
- •2.4.2 Accommodation
- •2.4.3 Response to Light
- •2.4.4 Colour Vision
- •2.4.5 Binocular Vision and Stereopsis
- •2.5 Visual Impairment and Assistive Technology
- •2.5.1 Demographics of Visual Impairment
- •2.5.2 Illustrations of Some Types of Visual Impairment
- •2.5.3 Further Types of Visual Impairment
- •2.5.4 Colour Blindness
- •2.5.5 Corrective Lenses
- •2.6 Chapter Summary
- •Questions
- •Projects
- •References
- •3 Sight Measurement
- •Learning Objectives
- •3.1 Introduction
- •3.2 Visual Acuity
- •3.2.1 Using the Chart
- •3.2.2 Variations in Measuring Visual Acuity
- •3.3 Field of Vision Tests
- •3.3.1 The Normal Visual Field
- •3.3.2 The Tangent Screen
- •3.3.3 Kinetic Perimetry
- •3.3.4 Static Perimetry
- •3.4 Pressure Measurement
- •3.5 Biometry
- •3.6 Ocular Examination
- •3.7 Optical Coherence Tomography
- •3.7.1 Echo Delay
- •3.7.2 Low Coherence Interferometry
- •3.7.3 An OCT Scanner
- •3.8 Ocular Electrophysiology
- •3.8.1 The Electrooculogram (EOG)
- •3.8.2 The Electroretinogram (ERG)
- •3.8.3 The Pattern Electroretinogram
- •3.8.4 The Visual Evoked Cortical Potential
- •3.8.5 Multifocal Electrophysiology
- •3.9 Chapter Summary
- •Glossary
- •Questions
- •Projects
- •4 Haptics as a Substitute for Vision
- •Learning Objectives
- •4.1 Introduction
- •4.1.1 Physiological Basis
- •4.1.2 Passive Touch, Active Touch and Haptics
- •4.1.3 Exploratory Procedures
- •4.2 Vision and Haptics Compared
- •4.3 The Capacity of Bare Fingers in Real Environments
- •4.3.1 Visually Impaired People’s Use of Haptics Without any Technical Aid
- •4.3.2 Speech Perceived by Hard-of-hearing People Using Bare Hands
- •4.3.3 Natural Capacity of Touch and Evaluation of Technical Aids
- •4.4 Haptic Low-tech Aids
- •4.4.1 The Long Cane
- •4.4.2 The Guide Dog
- •4.4.3 Braille
- •4.4.4 Embossed Pictures
- •4.4.5 The Main Lesson from Low-tech Aids
- •4.5 Matrices of Point Stimuli
- •4.5.1 Aids for Orientation and Mobility
- •4.5.2 Aids for Reading Text
- •4.5.3 Aids for Reading Pictures
- •4.6 Computer-based Aids for Graphical Information
- •4.6.1 Aids for Graphical User Interfaces
- •4.6.2 Tactile Computer Mouse
- •4.7 Haptic Displays
- •4.7.1 Information Available via a Haptic Display
- •4.7.2 What Information Can Be Obtained with the Reduced Information?
- •4.7.3 Haptic Displays as Aids for the Visually Impaired
- •4.8 Chapter Summary
- •4.9 Concluding Remarks
- •Questions
- •Projects
- •References
- •5 Mobility: An Overview
- •Learning Objectives
- •5.1 Introduction
- •5.2 The Travel Activity
- •5.2.1 Understanding Mobility
- •5.2.2 Assistive Technology Systems for the Travel Process
- •5.3 The Historical Development of Travel Aids for Visually Impaired and Blind People
- •5.4 Obstacle Avoidance AT: Guide Dogs and Robotic Guide Walkers
- •5.4.1 Guide Dogs
- •5.4.2 Robotic Guides and Walkers
- •5.5 Obstacle Avoidance AT: Canes
- •5.5.1 Long Canes
- •5.5.2 Technology Canes
- •5.6 Other Mobility Assistive Technology Approaches
- •5.6.1 Clear-path Indicators
- •5.6.2 Obstacle and Object Location Detectors
- •5.6.3 The vOICe System
- •5.7 Orientation Assistive Technology Systems
- •5.7.1 Global Positioning System Orientation Technology
- •5.7.2 Other Technology Options for Orientation Systems
- •5.8 Accessible Environments
- •5.9 Chapter Summary
- •Questions
- •Projects
- •References
- •6 Mobility AT: The Batcane (UltraCane)
- •Learning Objectives
- •6.1 Mobility Background and Introduction
- •6.2 Principles of Ultrasonics
- •6.2.1 Ultrasonic Waves
- •6.2.2 Attenuation and Reflection Interactions
- •6.2.3 Transducer Geometry
- •6.3 Bats and Signal Processing
- •6.3.1 Principles of Bat Sonar
- •6.3.2 Echolocation Call Structures
- •6.3.3 Signal Processing Capabilities
- •6.3.4 Applicability of Bat Echolocation to Sonar System Design
- •6.4 Design and Construction Issues
- •6.4.1 Outline Requirement Specification
- •6.4.2 Ultrasonic Spatial Sensor Subsystem
- •6.4.3 Trial Prototype Spatial Sensor Arrangement
- •6.4.4 Tactile User Interface Subsystem
- •6.4.5 Cognitive Mapping
- •6.4.6 Embedded Processing Control Requirements
- •6.5 Concept Phase and Engineering Prototype Phase Trials
- •6.6 Case Study in Commercialisation
- •6.7 Chapter Summary
- •Questions
- •Projects
- •References
- •7 Navigation AT: Context-aware Computing
- •Learning Objectives
- •7.1 Defining the Orientation/Navigation Problem
- •7.1.1 Orientation, Mobility and Navigation
- •7.1.2 Traditional Mobility Aids
- •7.1.3 Limitations of Traditional Aids
- •7.2 Cognitive Maps
- •7.2.1 Learning and Acquiring Spatial Information
- •7.2.2 Factors that Influence How Knowledge Is Acquired
- •7.2.3 The Structure and Form of Cognitive Maps
- •7.3 Overview of Existing Technologies
- •7.3.1 Technologies for Distant Navigation
- •7.3.2 User Interface Output Technologies
- •7.4 Principles of Mobile Context-aware Computing
- •7.4.1 Adding Context to User-computer Interaction
- •7.4.2 Acquiring Useful Contextual Information
- •7.4.3 Capabilities of Context-awareness
- •7.4.4 Application of Context-aware Principles
- •7.4.5 Technological Challenges and Unresolved Usability Issues
- •7.5 Test Procedures
- •7.5.1 Human Computer Interaction (HCI)
- •7.5.2 Cognitive Mapping
- •7.5.3 Overall Approach
- •7.6 Future Positioning Technologies
- •7.7 Chapter Summary
- •7.7.1 Conclusions
- •Questions
- •Projects
- •References
- •Learning Objectives
- •8.1 Defining the Navigation Problem
- •8.1.1 What is the Importance of Location Information?
- •8.1.2 What Mobility Tools and Traditional Maps are Available for the Blind?
- •8.2 Principles of Global Positioning Systems
- •8.2.1 What is the Global Positioning System?
- •8.2.2 Accuracy of GPS: Some General Issues
- •8.2.3 Accuracy of GPS: Some Technical Issues
- •8.2.4 Frequency Spectrum of GPS, Present and Future
- •8.2.5 Other GPS Systems
- •8.3 Application of GPS Principles
- •8.4 Design Issues
- •8.5 Development Issues
- •8.5.1 Choosing an Appropriate Platform
- •8.5.2 Choosing the GPS Receiver
- •8.5.3 Creating a Packaged System
- •8.5.4 Integration vs Stand-alone
- •8.6 User Interface Design Issues
- •8.6.1 How to Present the Information
- •8.6.2 When to Present the Information
- •8.6.3 What Information to Present
- •8.7 Test Procedures and Results
- •8.8 Case Study in Commercialisation
- •8.8.1 Understanding the Value of the Technology
- •8.8.2 Limitations of the Technology
- •8.8.3 Ongoing Development
- •8.9 Chapter Summary
- •Questions
- •Projects
- •References
- •9 Electronic Travel Aids: An Assessment
- •Learning Objectives
- •9.1 Introduction
- •9.2 Why Do an Assessment?
- •9.3 Methodologies for Assessments of Electronic Travel Aids
- •9.3.1 Eliciting User Requirements
- •9.3.2 Developing a User Requirements Specification and Heuristic Evaluation
- •9.3.3 Hands-on Assessments
- •9.3.4 Methodology Used for Assessments in this Chapter
- •9.4 Modern-day Electronic Travel Aids
- •9.4.1 The Distinction Between Mobility and Navigation Aids
- •9.4.2 The Distinction Between Primary and Secondary Aids
- •9.4.3 User Requirements: Mobility and Navigation Aids
- •9.4.4 Mobility Aids
- •9.4.5 Mobility Aids: Have They Solved the Mobility Challenge?
- •9.4.6 Navigation Aids
- •9.4.7 Navigation Aids: Have They Solved the Navigation Challenge?
- •9.5 Training
- •9.6 Chapter Summary and Conclusions
- •Questions
- •Projects
- •References
- •10 Accessible Environments
- •Learning Objectives
- •10.1 Introduction
- •10.1.1 Legislative and Regulatory Framework
- •10.1.2 Accessible Environments: An Overview
- •10.1.3 Principles for the Design of Accessible Environments
- •10.2 Physical Environments: The Streetscape
- •10.2.1 Pavements and Pathways
- •10.2.2 Road Crossings
- •10.2.3 Bollards and Street Furniture
- •10.3 Physical Environments: Buildings
- •10.3.1 General Exterior Issues
- •10.3.2 General Interior Issues
- •10.3.4 Signs and Notices
- •10.3.5 Interior Building Services
- •10.4 Environmental Information and Navigation Technologies
- •10.4.1 Audio Information System: General Issues
- •10.4.2 Some Technologies for Environmental Information Systems
- •10.5 Accessible Public Transport
- •10.5.1 Accessible Public Transportation: Design Issues
- •10.6 Chapter Summary
- •Questions
- •Projects
- •References
- •11 Accessible Bus System: A Bluetooth Application
- •Learning Objectives
- •11.1 Introduction
- •11.2 Bluetooth Fundamentals
- •11.2.1 Brief History of Bluetooth
- •11.2.2 Bluetooth Power Class
- •11.2.3 Protocol Stack
- •11.2.4 Bluetooth Profile
- •11.2.5 Piconet
- •11.3 Design Issues
- •11.3.1 System Architecture
- •11.3.2 Hardware Requirements
- •11.3.3 Software Requirements
- •11.4 Developmental Issues
- •11.4.1 Bluetooth Server
- •11.4.2 Bluetooth Client (Mobile Device)
- •11.4.3 User Interface
- •11.5 Commercialisation Issues
- •11.6 Chapter Summary
- •Questions
- •Projects
- •References
- •12 Accessible Information: An Overview
- •Learning Objectives
- •12.1 Introduction
- •12.2 Low Vision Aids
- •12.2.1 Basic Principles
- •12.3 Low Vision Assistive Technology Systems
- •12.3.1 Large Print
- •12.3.2 Closed Circuit Television Systems
- •12.3.3 Video Magnifiers
- •12.3.4 Telescopic Assistive Systems
- •12.4 Audio-transcription of Printed Information
- •12.4.1 Stand-alone Reading Systems
- •12.4.2 Read IT Project
- •12.5 Tactile Access to Information
- •12.5.1 Braille
- •12.5.2 Moon
- •12.5.3 Braille Devices
- •12.6 Accessible Computer Systems
- •12.6.1 Input Devices
- •12.6.2 Output Devices
- •12.6.3 Computer-based Reading Systems
- •12.6.4 Accessible Portable Computers
- •12.7 Accessible Internet
- •12.7.1 World Wide Web Guidelines
- •12.7.2 Guidelines for Web Authoring Tools
- •12.7.3 Accessible Adobe Portable Document Format (PDF) Documents
- •12.7.4 Bobby Approval
- •12.8 Telecommunications
- •12.8.1 Voice Dialling General Principles
- •12.8.2 Talking Caller ID
- •12.8.3 Mobile Telephones
- •12.9 Chapter Summary
- •Questions
- •Projects
- •References
- •13 Screen Readers and Screen Magnifiers
- •Learning Objectives
- •13.1 Introduction
- •13.2 Overview of Chapter
- •13.3 Interacting with a Graphical User Interface
- •13.4 Screen Magnifiers
- •13.4.1 Overview
- •13.4.2 Magnification Modes
- •13.4.3 Other Interface Considerations
- •13.4.4 The Architecture and Implementation of Screen Magnifiers
- •13.5 Screen Readers
- •13.5.1 Overview
- •13.5.2 The Architecture and Implementation of a Screen Reader
- •13.5.3 Using a Braille Display
- •13.5.4 User Interface Issues
- •13.6 Hybrid Screen Reader Magnifiers
- •13.7 Self-magnifying Applications
- •13.8 Self-voicing Applications
- •13.9 Application Adaptors
- •13.10 Chapter Summary
- •Questions
- •Projects
- •References
- •14 Speech, Text and Braille Conversion Technology
- •Learning Objectives
- •14.1 Introduction
- •14.1.1 Introducing Mode Conversion
- •14.1.2 Outline of the Chapter
- •14.2 Prerequisites for Speech and Text Conversion Technology
- •14.2.1 The Spectral Structure of Speech
- •14.2.2 The Hierarchical Structure of Spoken Language
- •14.2.3 Prosody
- •14.3 Speech-to-text Conversion
- •14.3.1 Principles of Pattern Recognition
- •14.3.2 Principles of Speech Recognition
- •14.3.3 Equipment and Applications
- •14.4 Text-to-speech Conversion
- •14.4.1 Principles of Speech Production
- •14.4.2 Principles of Acoustical Synthesis
- •14.4.3 Equipment and Applications
- •14.5 Braille Conversion
- •14.5.1 Introduction
- •14.5.2 Text-to-Braille Conversion
- •14.5.3 Braille-to-text Conversion
- •14.6 Commercial Equipment and Applications
- •14.6.1 Speech vs Braille
- •14.6.2 Speech Output in Devices for Daily Life
- •14.6.3 Portable Text-based Devices
- •14.6.4 Access to Computers
- •14.6.5 Reading Machines
- •14.6.6 Access to Telecommunication Devices
- •14.7 Discussion and the Future Outlook
- •14.7.1 End-user Studies
- •14.7.2 Discussion and Issues Arising
- •14.7.3 Future Developments
- •Questions
- •Projects
- •References
- •15 Accessing Books and Documents
- •Learning Objectives
- •15.1 Introduction: The Challenge of Accessing the Printed Page
- •15.2 Basics of Optical Character Recognition Technology
- •15.2.1 Details of Optical Character Recognition Technology
- •15.2.2 Practical Issues with Optical Character Recognition Technology
- •15.3 Reading Systems
- •15.4 DAISY Technology
- •15.4.1 DAISY Full Audio Books
- •15.4.2 DAISY Full Text Books
- •15.4.3 DAISY and Other Formats
- •15.5 Players
- •15.6 Accessing Textbooks
- •15.7 Accessing Newspapers
- •15.8 Future Technology Developments
- •15.9 Chapter Summary and Conclusion
- •15.9.1 Chapter Summary
- •15.9.2 Conclusion
- •Questions
- •Projects
- •References
- •Learning Objectives
- •16.1 Introduction
- •16.1.1 Print Impairments
- •16.1.2 Music Notation
- •16.2 Overview of Accessible Music
- •16.2.1 Formats
- •16.2.2 Technical Aspects
- •16.3 Some Recent Initiatives and Projects
- •16.3.2 Play 2
- •16.3.3 Dancing Dots
- •16.3.4 Toccata
- •16.4 Problems to Be Overcome
- •16.4.1 A Content Processing Layer
- •16.4.2 Standardization of Accessible Music Technology
- •16.5 Unifying Accessible Design, Technology and Musical Content
- •16.5.1 Braille Music
- •16.5.2 Talking Music
- •16.6 Conclusions
- •16.6.1 Design for All or Accessibility from Scratch
- •16.6.2 Applying Design for All in Emerging Standards
- •16.6.3 Accessibility in Emerging Technology
- •Questions
- •Projects
- •References
- •17 Assistive Technology for Daily Living
- •Learning Objectives
- •17.1 Introduction
- •17.2 Personal Care
- •17.2.1 Labelling Systems
- •17.2.2 Healthcare Monitoring
- •17.3 Time-keeping, Alarms and Alerting
- •17.3.1 Time-keeping
- •17.3.2 Alarms and Alerting
- •17.4 Food Preparation and Consumption
- •17.4.1 Talking Kitchen Scales
- •17.4.2 Talking Measuring Jug
- •17.4.3 Liquid Level Indicator
- •17.4.4 Talking Microwave Oven
- •17.4.5 Talking Kitchen and Remote Thermometers
- •17.4.6 Braille Salt and Pepper Set
- •17.5 Environmental Control and Use of Appliances
- •17.5.1 Light Probes
- •17.5.2 Colour Probes
- •17.5.3 Talking and Tactile Thermometers and Barometers
- •17.5.4 Using Appliances
- •17.6 Money, Finance and Shopping
- •17.6.1 Mechanical Money Indicators
- •17.6.2 Electronic Money Identifiers
- •17.6.3 Electronic Purse
- •17.6.4 Automatic Teller Machines (ATMs)
- •17.7 Communications and Access to Information: Other Technologies
- •17.7.1 Information Kiosks and Other Self-service Systems
- •17.7.2 Using Smart Cards
- •17.7.3 EZ Access®
- •17.8 Chapter Summary
- •Questions
- •Projects
- •References
- •Learning Objectives
- •18.1 Introduction
- •18.2 Education: Learning and Teaching
- •18.2.1 Accessing Educational Processes and Approaches
- •18.2.2 Educational Technologies, Devices and Tools
- •18.3 Employment
- •18.3.1 Professional and Person-centred
- •18.3.2 Scientific and Technical
- •18.3.3 Administrative and Secretarial
- •18.3.4 Skilled and Non-skilled (Manual) Trades
- •18.3.5 Working Outside
- •18.4 Recreational Activities
- •18.4.1 Accessing the Visual, Audio and Performing Arts
- •18.4.2 Games, Puzzles, Toys and Collecting
- •18.4.3 Holidays and Visits: Museums, Galleries and Heritage Sites
- •18.4.4 Sports and Outdoor Activities
- •18.4.5 DIY, Art and Craft Activities
- •18.5 Chapter Summary
- •Questions
- •Projects
- •References
- •Biographical Sketches of the Contributors
- •Index
Biographical Sketches of the Contributors
Elizabeth M. Ball
Liz Ball began a PhD in December 2002 at the Ergonomics and Safety Research Institute, Loughborough University. Since April 2004 Liz has also worked part-time as a project co-ordinator in the Campaigns Team at Sense. Before this, in 2002 Liz completed an MSc in Information Technology (Human Factors) and in 1999 graduated with a BSc in Psychology, also from Loughborough University. In 1998 she was awarded the departmental prize for psychology and in 1999 was awarded the Sir Robert Martyn Faculty Prize for academic achievement and service to the community. Liz has also worked as an information officer for Rethink and as a learning support assistant in a mainstream school.
Liz’s research, for both her MSc and PhD, is in the field of orientation and mobility for blind and partially sighted people. Her MSc project looked at user requirements for navigation systems to be used by visually impaired pedestrians. Her PhD is looking at blind and partially sighted people’s experiences of learning to become independent travellers and is focussing on orientation and mobility training.
Paul Blenkhorn
Paul Blenkhorn is currently Professor of Assistive Technology in the School of Informatics, University of Manchester (formerly the Department of Computation, UMIST), where he has been for the past 12 years. Prior to that he was first a Research Fellow at the Research Centre for the Education of the Visually Handicapped (1983–1986) and then Director of R&D and one of the founders of the assistive technology company Dolphin Systems (1986–1991).
For many years his interests have been in the two areas of computer access and the use of technology to support people with profound and multiple disabilities. He has developed screen readers and magnifiers on several platforms, including the BBC Micro, the NEC 8201 and MS-DOS (in the 1980s), and more recently Microsoft Windows.
Nicholas A. Bradley
Since August 2006, Nick has been working as a Research Fellow for NHS Education for Scotland, based in Glasgow, Scotland. He undertakes research into the development of techniques for improving the quality of patient care across various health care professions. Prior to this, Nick managed a usability team at a contact centre software company called Graham Technology, headquartered in Inchinnan, Scotland. In 2005 he completed a PhD in the Department of Computer and Information Sciences at Strathclyde University. His research involved investigating Human Computer Interaction (HCI) issues of context-aware mobile systems, particularly those used by people with visual impairments to navigate and orientate. Prior to this, Nick graduated in 2001 with a BSc (Hons) in Ergonomics from the Human Sciences Department at Loughborough University. He also received a Diploma of Professional Studies (DPS) after completing a student placement at Nickleby HFE (Human Factors and Ergonomics) Ltd, Glasgow, UK. On his graduation day he was awarded the Departmental Prize for Ergonomics. He is a Graduate Member of the Ergonomics Society, a research affiliate of the Transport Research Institute and a member of the Glasgow Interactive Systems Group (GIST).
David Crombie
David Crombie has an MSc in Economics from the University of London and works in the Utrecht School of the Arts, Research Institute on Digital Cultures (URIDC), as well as running the International Projects department at DEDICON (formerly FNB Nederland). He is a founding member of the Openfocus Institute in Amsterdam and chairs the European Accessible Information Network. He has worked on many European accessible content projects and has collaborated with many of the major European organisations and individuals in this field. The HARMONICA project aimed to provide a solid strategic framework for networked access to music and related multimedia services; he co-ordinated the MIRACLE project to build a worldwide on-line library of music in alternative formats. He is responsible for the accessibility components of the WEDELMUSIC, TEDUB, MULTIREADER and MUSIC NETWORK projects. He is currently involved in several cross-sectoral domains and initiatives relating to accessible software architectures, information modelling and intelligent knowledge management strategies.
Mark D. Dunlop
Dr Dunlop is currently a Senior Lecturer in the Department of Computer and Information Sciences, Strathclyde University, Glasgow. Before joining Strathclyde, Mark was a senior scientist at the Danish Centre for Human Machine Interaction at the Risø National Laboratory. Prior to Risø, he was a lecturer in Computing Science at the University of Glasgow. He has co-organised three meetings of the International Conference Series on MobileHCI (MobileHCI 99, 01 and 04), has chaired the MobileHCI steering committee and is currently a member of the board of the journals Personal and Ubiquitous Computing and Advances in Human Computer Interaction. His research interests lie in the area of usability and evaluation of mobile systems, in particular context-aware systems, novel text input methods and visualisation on hand-helds.
Gareth Evans
Gareth Evans is currently a Senior Lecturer in the School of Informatics, University of Manchester (formerly the Department of Computation, UMIST). His current research interests include access to computer-based information for people with print impairments; sensory stimulation systems for people with profound and multiple disabilities; and speech synthesis for minority languages. This work has led to the development of a number of commercial systems.
James R. Fruchterman
A technology entrepreneur, Jim Fruchterman has been a rocket scientist, founded two of the foremost optical character recognition companies, and developed a successful line of reading machines for the blind. Fruchterman co-founded Calera Recognition Systems in 1982. Calera developed character recognition that would allow computers to read virtually all printed text. In 1989 Fruchterman founded Arkenstone, a 501(c)(3) non-profit social enterprise, to produce reading machines for the disabled community based on the Calera technology. Following the sale of the Arkenstone product line in 2000, Fruchterman used the resulting capital to expand Benetech, with an explicit goal to use the power of technology to serve humanity. Benetech has numerous social technology ventures, including Bookshare.org, an on-line electronic books library for people with disabilities, and the Martus Human Rights Bulletin System. Fruchterman has also been active in public service, with two stints on U.S. federal advisory committees on disability issues. He has received numerous awards for his work, including the 2006 MacArthur Fellowship and the Skoll Award for Social Entrepreneurship in 2004 and 2006. He was named an Outstanding Social Entrepreneur 2003 by the Schwab Foundation and, as such, has been a regular participant in the World Economic Forum in Davos, Switzerland, since 2003. Fruchterman believes that technology is the ultimate leveller, allowing disadvantaged people to achieve more equality in society.
Han Leong Goh
Han Leong Goh received his Bachelor's degree in Engineering in 2002 from the National University of Singapore. He joined the Department of Engineering, National University of Singapore in 2002 as a PhD student; his current research interests include mobile computing and sensor networking.
Marion Hersh
Dr Hersh is currently a Senior Lecturer in the Department of Electronics and Electrical Engineering, University of Glasgow. She previously held an appointment as a Post-doctoral Research Fellow (1982–1985) at the University of Sussex.
Her current research interests fall into three main areas:
•Assistive technology for deaf, blind and deafblind people.
•The application of fuzzy and soft computing techniques to sustainable design and decision making.
•Ethics and social responsibility issues in science and engineering.
She has run the successful conference series entitled ‘Vision and Hearing Impairment: Rehabilitation Engineering and Support Technologies for Independent Living’, with funding from the EC High Level Scientific Conferences Programme, jointly with Prof. M. Johnson of the University of Strathclyde. Current research projects in the area of assistive technology include the development of principles of sustainable and accessible design, open and distance learning for people with mobility impairments with colleagues from Germany, Hungary and Romania, and intelligent audiometry with her research student Roddy Sutherland.
As well as research, Dr Hersh also has well developed teaching activities in the area of assistive technology, including an undergraduate module written together with Prof. M.A. Johnson. She also regularly supervises undergraduate projects in assistive technology and is looking to market or otherwise distribute an alarm clock for deafblind people developed through these student projects.
Other recent research projects have included the Application of Fuzzy Methods and Other Soft Computing Techniques to Sustainable Design and Decision Making, in collaboration with Dr I. Hamburg, Institut Arbeit und Technik, Wissenschaftszentrum Nordrhein-Westfalen, Germany, and Mr L. Padeanau of the University of Craiova, Romania. Dr Hersh has recently completed a book on Computational Mathematics for Sustainable Development, also published by Springer-Verlag. Dr Hersh is a Member of the Institute of Mathematics and its Applications, the Institution of Electrical Engineers (UK) and the Institute of Electrical and Electronics Engineers (USA).
Rüdiger Hoffmann
Rüdiger Hoffmann studied Radio Engineering at the Dresden University of Technology and worked from 1971 to 1982 in the electronics industry. He completed his Dr.-Ing. and Habilitation theses in system theory. In 1982, he started his research work in speech communication at the Dresden University of Technology and, in 1992, was appointed to the Chair for Speech Communication. He is now the director of the Laboratory of Acoustics and Speech Communication.
His teaching and research topics are signal and system theory, with a special focus on speech processing. He is involved in various research and industrial projects. The main research projects of his group include participation in the speech translation project Verbmobil (1991–2000), the Dresden Speech Synthesis System (DRESS) and the unified experimental system for speech synthesis and recognition (UASR). The industrial application of the results is mainly directed at the inclusion of speech technology in embedded systems. He has written two textbooks on signal processing (in German) and made numerous contributions to scientific conferences and technical journals. He is the series editor of the “Studientexte zur Sprachkommunikation” (Lecture Notes in Speech Communication).
Brian S. Hoyle
Brian Hoyle is Professor in Image and Vision Systems in the Institute of Integrated Information Systems at the University of Leeds, UK. His interests span a number of areas that critically link sensing systems with integrated processing to yield innovative advances. One example was a monitoring system for a sub-sea robot-welding system, for which he was awarded the IEE Measurement Prize. Further major work has been in spearheading the new technology of industrial process tomography (PT); he is a leading member of the world-leading research group the Virtual Centre for Industrial Process Tomography (www.vcipt.org). He has published his work widely and has gained large-scale international and national government and industry funding support. His work has also included the application of ultrasonic spatial sensing in aids for visually impaired people.
Knowledge/technology transfer has been a continuous thread in his work. He is the founder of three spin-out companies: Graticule (www.graticule.com), in component mapping systems; Industrial Tomography Systems (www.itoms.com), in process sensing using electrical PT; and Sound Foresight, in ultrasonic sensory aids for the visually impaired. Sound Foresight was awarded the BBC “Tomorrow’s World Award for a Health Innovation” in 2002 for the revolutionary “UltraCane”, which provides tactile information about potential hazards to its user (www.UltraCane.com). Its engineering was awarded the Sony-sponsored European Electronics Design Application Award in 2003 and the Horner’s Award in 2005.
Gunnar Jansson
Gunnar Jansson took his PhD in Psychology in 1969 at Uppsala University, Uppsala, Sweden, on topics in visual event perception. He has been employed at the Department of Psychology at the same university as research assistant, research associate and assistant research professor (1957–1979), as well as associate professor (1979–1997). In 1984 he was visiting associate professor at the Department of Psychology, Cornell University, Ithaca, NY, and he has been a visiting researcher at that university, as well as at the Smith-Kettlewell Institute of Visual Sciences, San Francisco, CA, several times. Gunnar has made frequent visits to research laboratories in Europe, the USA, Australia and Japan. He has guided some 100 students at different levels in their research tasks, from undergraduate papers up to PhD theses. Since 1997 he has been emeritus but continues with both research and teaching.
Gunnar’s research after his PhD thesis has mainly been directed towards basic and applied problems in connection with the substitution of vision with other senses for visually impaired people, especially concerning aids for orientation and mobility, tactile pictures and haptic displays. In one EU project in which he participated (MOBIC), the aim was to develop an orientation aid for visually impaired people; in another (Pure-Form), to develop a haptic display presenting virtual copies of museum statues for haptic exploration. A third EU project (ENACTIVE) tries to find ways of increasing the haptic interaction between the computer user and the computer, and a fourth (MICOLE) aims to develop multimodal aids for the inclusion of visually impaired children. Typically, Gunnar’s research problems concern theoretical analyses and empirical experiments on what information is important for people’s solution of the tasks, and evaluations of available technical products and prototypes. His work has been reported in about 75 peer-reviewed journals and books, as well as in a large number of contributions to international conferences and reports in Swedish in different contexts. He has co-edited a book on visual perception theory: G. Jansson, S.S. Bergström and W. Epstein (eds) (1994) Perceiving Events and Objects, published by Erlbaum, Hillsdale, NJ, USA.
714 Biographical Sketches
Tai Fook Lim Jerry
Tai Fook Lim Jerry received his Bachelor of Electrical Engineering degree from the National University of Singapore in June 2005. He has since joined the education sector, where he teaches at a public school. His current interests include J2ME applications and database development.
Michael A. Johnson
Professor Johnson’s academic career has concentrated on control engineering, both theory and applications. He has significant industrial control applications and research experience. He is the author and co-author of books and technical papers on power generation, wastewater control, power transmission and control benchmarking. He has supervised a number of student projects in the area of assistive technology and has successfully trained over 40 engineers to PhD level in the last 20 years. He is joint Editor of the Springer-Verlag London monograph series Advances in Industrial Control and of the Springer-Verlag London Advanced Textbooks in Control and Signal Processing series. He was an Associate Editor of AUTOMATICA from 1985 to 2002.
His interest in assistive technology came through undergraduate teaching commitments, which included student group study dissertations at second-year level and assorted project supervision at Masters and undergraduate level. These were used to explore some electrical and electronic engineering (EEE) aspects of assistive devices for hearing-impaired people. Subsequent collaboration with Dr. Hersh at the University of Glasgow led to a set of targeted activities in this area, including a new course module, a new conference series and some research projects.
Professor Johnson retired from academic life in 2002 and was made an Emeritus Professor of the University of Strathclyde in 2003. Emeritus Professor Johnson is a member of Deafblind UK and the British Retinitis Pigmentosa Society.
David Keating
Since 1990, David has been head of the National ElectroDiagnostic Imaging Unit at Gartnavel General Hospital in Glasgow, where he is employed as an NHS Consultant Medical Physicist. He is also an Honorary Reader at the University of Glasgow. Before moving into vision science, he worked on signal processing systems in cardiology and on the development of synthetic speech aids for patients with acquired speech loss, for which he was awarded a PhD in 1988.
Current research interests lie in the fields of anatomical and functional imaging of the visual system. Recent book chapters and publications have concentrated on multifocal electrophysiology and optical coherence tomography systems.
He is a member of the International Society for Clinical Electrophysiology of Vision (ISCEV) and its newly formed British Chapter (BriSCEV); the Association for Research in Vision and Ophthalmology (ARVO); the Institute of Physics; and the Institute of Physics and Engineering in Medicine.
Charles LaPierre
Charles LaPierre is the Chief Technical Officer and co-founder of Sendero Group LLC. He holds Bachelor’s and Master’s degrees in Electrical Engineering from Carleton
University, Canada. His work includes the integration of GPS and dead-reckoning technology. He is one of the inventors of Strider, the first talking GPS for blind people, which evolved from his fourth-year Bachelor of Engineering project (1993), and he is one of the patent holders for Strider. He has been a software engineer for a number of adaptive technology companies, including VisuAide, Arkenstone, Benetech, and Sendero Group LLC.
Roger Lenoir
Roger Lenoir, MSc in Sound & Music Technology, was born in 1972 in Geleen, the Netherlands. He studied Music Science at the University of Amsterdam and Music Technology at the Utrecht School of the Arts (Hilversum). From his second year he focused on the analysis and development of abstract music representation systems, with special attention to the relation between the representation and performance of music. In 1998 he received his BA in Music Software Development (summa cum laude) and Electronic Music Composition. In the same year he also received his European Media MSc in Sound & Music Technology from the University of Portsmouth, focusing on Music Software Development (summa cum laude), with a thesis on “Representing musical expression in computer systems”. In 2000 he joined the research department of FNB Nederland (now DEDICON), where his activities have focused on the development and implementation of new computer models of sound and music for the provision of (musical) information to vision- and print-impaired users. Since his appointment to this position he has been working towards the notion of fundamentally accessible frameworks for information processing that incorporate various layers of abstraction into manageable and practical systems. One of Roger’s perspectives on the field of (fundamental) accessibility is artistic: in addition to being a programmer and system architect, he is an artist who expresses himself through drawings, computer-generated graphics, music and poetry. He is also a founding member of the Openfocus Institute in Amsterdam.
Michael May
Michael May has been a pioneer in new product and business development in a variety of industries for over 25 years. He was on the founding team that invented the world’s first laser turntable, and he invented portable heat devices for the medical and sports markets. Mike was Vice President of Arkenstone, where he was instrumental in the development of the Strider GPS project, beginning in 1994. He founded two adaptive technology companies, CustomEyes Computer Systems and Sendero Group.
Mike May founded the Sendero Group in January 2000 to make location information accessible to blind people worldwide on laptops, PDAs and mobile phones. Three U.S. presidents have recognized him personally for his efforts in developing and promoting technology and sports for people who are blind or visually impaired. Vice-President Gore said at the White House in 1996, “It won’t be long before we see Mike and others wearing GPS devices on their wrists.” In the fall of 2001, Mike May and the Sendero Group were awarded a five-year, $2.5 million
Research and Development Grant called “Wayfinding Technologies for People with Visual Impairments: Research and Development of an Integrated Platform.”
In addition to product development, Mike May lectures extensively on wayfinding technology and on the scientific and psychological aspects of vision recovery, having been totally blind from the age of three and then regaining low vision at age 46 due to stem cell and cornea transplants. He is the subject of Crashing Through by Robert Kurson, a story of risk, adventure and the man who dared to see.
Stuart Parks
Stuart Parks is currently a principal clinical scientist with North Glasgow NHS Trust, where he provides specialist services to six Scottish Health Boards. His research interests mainly involve the investigation of spatial and temporal processing mechanisms in the visual pathway. He is currently conducting research into the retino-toxic effects of GABA-ergic antiepileptic medication, new methods for imaging the effects of retinal vein thrombosis, and the development of cortical and retinal imaging techniques for paediatric ophthalmology.
Stuart is a member of the International Society for Clinical Electrophysiology of Vision (ISCEV) and its newly formed British Chapter (BriSCEV); the Association for Research in Vision and Ophthalmology (ARVO); and the Institute of Physics.
Kok Kiong Tan
Kok Kiong Tan received his PhD in 1995 from the National University of Singapore (NUS). Prior to joining the university, he was a research fellow at SIMTech, a national R&D institute spearheading the promotion of R&D in local manufacturing industries, where he was involved in managing industrial projects. He is currently an associate professor at NUS, and his research interests are in precision motion control and instrumentation, advanced process control, controller autotuning and general industrial automation.
Dean A. Waters
Dean Waters gained a BSc in Ecology from the University of East Anglia in 1988. He worked for the RSPB and the National Trust on wildlife conservation until starting a PhD at the University of Bristol on the interactions between echolocating bats and their insect prey. This was followed by a post-doctoral position on echolocation signal processing using artificial neural networks. In 1995, he took up a lectureship at the University of Leeds, where he conducts research into bat echolocation and bioacoustics. Dean is a director of Sound Foresight Ltd, a University of Leeds spin-out company that develops ultrasonic guidance systems for visually impaired people.
