The client does not run in a browser; it runs in its own window and makes little use of HTML, so the Web Accessibility Initiative (WAI) standards are not sufficient and in some cases not even relevant.
Anyone who has a vision impairment and uses a screen reader to access a computer and the web cannot access SL, because even the textual information displayed in the client is not exposed to the screen reader.
- Include an accessibility section in the help.
- Make the help screens accessible without a mouse.
- Make the text in help sizable.
- Make any text on the client configurable for size and color, including the menus, the avatar names, messages.
- Enable the numbering of objects on the screen so that, instead of having to click on an object, you can choose it by number (rather like the 'say what you see' feature in Windows Vista).
- A text-to-voice feature for chat, in stereo so that each avatar's location can be estimated, with the ability to configure the voice to fit the avatar.
- Provide a text list of avatars in the vicinity and voice announcements of entries and exits.
- Simulation of an electronic white stick.
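The stereo-chat idea in the list above bottoms out in a small amount of geometry: given the listener's position and facing, derive left/right channel gains from the speaker's bearing so a spoken chat line seems to come from the right direction. A minimal sketch in Python, assuming a flat 2D layout and a constant-power pan law; Second Life exposes no such API, so every name here is hypothetical:

```python
import math

def stereo_gains(listener_pos, listener_heading, speaker_pos):
    """Return (left, right) channel gains so a spoken chat line
    appears to come from the speaker's direction.

    listener_heading is the angle (radians) the listener faces,
    measured counter-clockwise from the +x axis.
    """
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    # Bearing of the speaker relative to where the listener faces:
    # 0 = straight ahead, +pi/2 = hard left, -pi/2 = hard right.
    bearing = math.atan2(dy, dx) - listener_heading
    # Constant-power pan: map bearing to a pan position in [-1, 1]
    # (-1 = full left, +1 = full right) and split the energy so the
    # total loudness stays roughly constant as the speaker moves.
    pan = -math.sin(bearing)          # speaker to the left -> negative pan
    theta = (pan + 1) * math.pi / 4   # 0 .. pi/2
    return math.cos(theta), math.sin(theta)

# A speaker directly to the listener's left ends up all in the left
# channel; a speaker straight ahead splits evenly between both.
left, right = stereo_gains((0, 0), 0.0, (0, 1))
```

Feeding these gains into whatever TTS engine renders the chat line would give the rough positional cue the list item asks for, without any change to the chat protocol itself.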
Second Life Class Action Suit
The first, one-time barrier is that the registration process uses a CAPTCHA that a blind person cannot complete; for a solution to this problem see 'Bloor helps ITA do it better than Google'.
But the real problem is the user interface, which gives a visual representation of the SL terrain, any avatars in your vicinity, any object you can interact with, and any instructions shown on SL notice displays. None of this information is available via a screen reader, and none of it can be pointed at without a mouse. Further, controls such as chat, search and help can only be activated by a mouse click.
No Second Life For Visually Impaired
If you access the Second Life client viewer with a screen reader such as Hal, JAWS or Window-Eyes, nothing will be spoken aloud and nothing will appear on your braille display.
Presently, not only is SL incompatible with screen readers, the SL website itself is largely inaccessible to people with visual impairments. Feedback from an online questionnaire I designed shows that 8 out of 10 visually impaired users were unable to register for an account on the SL website, because the site does not conform to the W3C accessibility guidelines: linked images have no alt attributes, and form fields are not linked correctly to their labels.
After attempting to register for an account, one questionnaire participant responded:
“I found no easy step-by-step guide that would say what to expect, or even give me any reason to overcome the obstacles for joining.” Asked why they wanted to join SL, they replied: “..an online community to join. But only if it represented a cross-section of real life. I’m not interested in anything that so flagrantly excludes disabled people.”
Accessibility and Second Life (Revised)
A student relying solely on a screen reader will be shut out from Second Life.
What should you do if you have a visually impaired student in a course that uses Second Life? Think about which learning objectives led you to choose Second Life. Is it communication? Then alternative chatrooms or Skype could be enabled.
Is it a visual experience? Then you can treat Second Life as you would other graphics or animation - that is, provide lots of descriptive text.
Accessibility and democracy in Second Life
It would require a tremendous amount of Alt tagging and/or audio describing to make the rich and evolving virtual world of "Second Life" intelligible, useful and enjoyable to blind and low-vision users.
[SLED] Blind people in SL - Idle speculation
This would be easier with a client SDK that could trap text, use text-to-speech and allow keyboard macros, but given the existing client, could we not have a HUD or head-mounted scripted object that 'spoke' information: location, people's names as they came and went, object IDs? Within the current system, these would probably have to be pre-recorded and linked to specific text, say in a notecard. Alternatively, objects in an 'accessible' area could self-report, say if someone approached within a certain distance for a certain time. This area could be made the home location for participants. We could even run a competition to design accessible vending machines that used sound instead of, or as well as, text.
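The self-reporting object idea can be prototyped outside SL as a small dwell-detection loop: an object announces itself only after an avatar has stayed within its radius for a minimum time, so a blind user sweeping through an area is not flooded by everything at once. A speculative sketch, with all class and parameter names invented for illustration (no real SL or LSL API is used):

```python
class TalkingObject:
    """An 'accessible area' object that announces its name once an
    avatar has stayed within `radius` metres for `dwell` seconds."""

    def __init__(self, name, pos, radius=5.0, dwell=2.0):
        self.name = name
        self.pos = pos
        self.radius = radius
        self.dwell = dwell
        self._entered_at = None   # time the avatar came into range
        self._announced = False   # avoid repeating the announcement

    def update(self, avatar_pos, now):
        """Call on each tick; returns an announcement string or None."""
        dx = avatar_pos[0] - self.pos[0]
        dy = avatar_pos[1] - self.pos[1]
        in_range = (dx * dx + dy * dy) ** 0.5 <= self.radius
        if not in_range:
            # Leaving the area re-arms the object for a later visit.
            self._entered_at = None
            self._announced = False
            return None
        if self._entered_at is None:
            self._entered_at = now
        if not self._announced and now - self._entered_at >= self.dwell:
            self._announced = True
            return f"{self.name} is nearby."
        return None

vendor = TalkingObject("Vending machine", (10, 10))
vendor.update((10, 9), now=0.0)   # in range, but not yet dwelt long enough
vendor.update((10, 9), now=2.5)   # dwell satisfied: object announces itself
```

In-world, the returned string would be handed to a TTS voice or a chat channel; the dwell threshold is the "certain distance for a certain time" rule the post describes.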
To aid people with visual impairments - most people who are blind aren't actually 'blind' - it would be great to have control over field of view in the client, which could effectively allow mouse view of a small angle to be the equivalent of a magnified image, much as PC viewing software allows the whole screen to be enlarged. Sadly, this would not easily include text. However, if we had a HUD object repeating any 'heard' text in the mouselook view, then even this might be possible. This would require chat in the mouselook view...
Ah well, maybe when I have a PhD student to throw at it...
[SLED] Re: Blind people in SL - Idle speculation
However, the Second Life client doesn't currently give screen reader access to chat or IM text. In fact, you can't even read the menus with JAWS. If the client did offer that most basic accessibility (chat, IM and menus), blind users would still need some assistance getting around.
[IxDA Discuss] Target.com Loses Accessibility Law Suit
I was part of a discussion of the accessibility of virtual worlds like Second Life for people who "browse with their ears". It turned out that the first problem wasn't even in Second Life itself. It was that the login page was designed inaccessibly. People using a screen reader couldn't even get into the worlds to find out whether they could use them. Nothing special, new or difficult. Just a login screen. But just as much a barrier as any locked door.
Three Dimensional Web Interfaces
Perhaps we should not focus exclusively on screen readers and haptics to provide access for blind people in 3D virtual reality. If the aim of virtual reality is to become ever more lifelike, let's think about the actual real-life experience of individuals moving about in the world and how they interact with other people.
Blind and low-vision people are generally mobile outside familiar surroundings with the aid of a cane, a guide dog or a sighted companion. When more assistance is needed, there is usually a member of store staff or a passer-by whom one can ask for directions or other information. The latter is not something only blind people do; it is natural human behaviour.
Why not have a service avatar that provides a similar service? Imagine a humanoid robot like C-3PO, the protocol droid in Star Wars, who could guide a player's avatar, give verbal directions, describe scenes and activities, and so on. This is rather like a personal tour guide. Add some more services, such as language translation for players in other countries, ASL for players who are deaf, and information retrieval to answer questions knowledgeably, and you broaden the appeal and usefulness of such an avatar well beyond sight-impaired players.
I think a lot of the necessary technology already exists. In Japan, for example, some stores have robots that can greet customers and even take them to a particular department. Voice and natural-language recognition, text-to-speech and text-to-ASL engines, and language-translation software are already very advanced and improving. The underlying architecture of the virtual space must have some basic navigation functions that could respond to verbal commands in lieu of a joystick, or whatever it is that players use to travel about in Second Life.
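The verbal-command navigation imagined above could bottom out in a small command grammar mapped onto the movement controls a client already performs from the keyboard. A hedged sketch of such a parser; the phrase set and action names are invented for illustration, not taken from any real Second Life interface:

```python
import re

# Map spoken phrases (as recognised text) onto primitive movement
# actions a client could already perform from keyboard input.
COMMANDS = [
    (re.compile(r"^(go|walk) forward(?: (\d+))?$"), "forward"),
    (re.compile(r"^turn left(?: (\d+))?$"), "turn_left"),
    (re.compile(r"^turn right(?: (\d+))?$"), "turn_right"),
    (re.compile(r"^stop$"), "stop"),
]

def parse_command(utterance):
    """Turn a recognised utterance into an (action, amount) pair,
    or None if the phrase is not in the grammar. The optional
    trailing number ("walk forward 5") is the repeat count."""
    text = utterance.strip().lower()
    for pattern, action in COMMANDS:
        m = pattern.match(text)
        if m:
            digits = [g for g in m.groups() if g and g.isdigit()]
            amount = int(digits[0]) if digits else 1
            return action, amount
    return None

parse_command("walk forward 5")   # -> ("forward", 5)
parse_command("jump")             # -> None (not in the grammar)
```

A speech-recognition front end would supply the utterance text, and the guide avatar (or the client itself) would execute the resulting action, which is the "verbal commands in lieu of a joystick" the post describes.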
A service companion avatar should probably become a standard feature in 3D virtual reality in the same way that online help is a ubiquitous feature in Windows.