Friday 30 November 2007

SLED Accessibility Threads

Following are excerpts from the SLED mailing list dealing with accessibility and visual impairment.


SLeek for the vision impaired?

Sean FitzGerald (seanf at tig.com.au) writes,

But the really neat thing about SLeek that leads me to think it may be useful for the vision impaired (if screen readers work with it) is that it has a "Follow" function that lets you nominate to hitch your av to another av in range... a virtual version of a vision impaired person holding the arm of a guide. It works quite well. Then the guide just has to describe the environment.



Blind people in SL - Idle speculation

Mike Reddy (mike.reddy at newport.ac.uk) writes,

This would be easier with a client SDK that could trap text, use text to speech and allow keyboard macros, but given the existing client could we not have a HUD or head mounted scripted object that 'spoke' information. Location, people's names as they came and went, object IDs. Within the current system, these would probably have to be pre-recorded and linked to specific text, say in a notecard. Alternatively, objects in an 'accessible' area could be able to self report, say if someone approached them within a certain distance for a certain time. This area could be made the home location for participants. We could even run a competition to design accessible vending machines that used sound instead/as well as text.

To aid people with visual impairments - most people who are blind aren't actually 'blind' - it would be great to have control over field of view in the client, which could effectively allow mouse view of a small angle to be the equivalent of a magnified image, much as PC viewing software allows the whole screen to be enlarged. Sadly, this would not easily include text. However, if we had a HUD object repeating any 'heard' text in the mouselook view, then even this might be possible. This would require chat in the mouselook view...

Ah well, maybe when I have a PhD student to throw at it...
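The field-of-view idea above is easy to quantify: magnifying the view is equivalent to narrowing the camera's vertical field of view. A minimal sketch of the maths as a standalone C++ helper (hypothetical, not Linden Lab code):

    #include <cmath>

    // Narrow the vertical field of view so the scene appears magnified:
    // tan(fov'/2) = tan(fov/2) / magnification.
    double magnifiedFov(double baseFovRadians, double magnification)
    {
        return 2.0 * std::atan(std::tan(baseFovRadians / 2.0) / magnification);
    }

    // Example: a 60-degree default FOV at 4x magnification narrows to ~16.4 degrees.

As Mike notes, this would magnify the 3D scene but not the UI text, which is rendered at a fixed size.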


Jeff Hiles (jeffrey.hiles at wright.edu) wrote,

As Danielle said, right now you would have to pair a blind student with another student or with an assistant who could navigate, read, and describe what's on the screen. That's not unique to people with visual disabilities, though.

The visually impaired could participate more directly, though, if the SL client was accessible to screen readers. I know blind people who have embraced instant messaging with clients that work with JAWS. So, in theory, it would be possible for people who can't see to carry on their own text conversations in Second Life. That degree of independence, I think, would make the experience more immediate and immersive.

However, the Second Life client doesn't currently give screen reader access to chat or IM text. In fact, you can't even read the menus with JAWS. If the client did have that most basic accessibility--chat, IM and menus--blind users would still need some assistance getting around.


Accessibility

Lisa Dawley (lisadawley at boisestate.edu) writes,
I was doing a presentation in our amphitheater one day. A gentleman in a wheelchair asked me if I could make the stadium accessible, because there wasn't a seat large enough for him to "land" his wheelchair and he had to float.


Second life for the visually impaired

Roome, Thomas C (thomas.roome at student.utdallas.edu) writes,

In the near future the Internet will make a shift from web sites to 3D environment spaces. The same information that is on a web site can be available to people in a 3D environment, but the question is how can a 3D environment be accessible to people with disabilities? The UTD Accessibility Island will be trying to find answers to this question. One of the island's goals is to provide information on video game accessibility and general information on different disabilities. Another goal is to create a conference center for people to discuss different topics around accessibility. The last major goal of the island is to provide some land for research and development, and I want to form an in-world research team of scripters, programmers, educators and people with disabilities. If you would like to become a research team member, then please contact Tom06 Castro or e-mail thomas.roome at student.utdallas.edu



Further thoughts on people with visual disabilities in Second Life

Jeff Hiles (jeffrey.hiles at wright.edu) writes,

When I work with JAWS users in real life, they sometimes ask me to give them my arm and guide them where they need to go. What if you could "give an arm" in Second Life and lead someone around? Better yet, what if you could do that invisibly so no one else in Second Life knew you were there? The key would be for you to be able to guide someone remotely, without having to be in the same room as the person you were guiding.

For example, as a guide, you would have the ability to move your friend's avatar through Second Life, and to see what that avatar would see. But your friend would have control of chat and IM. From your computer, you would move the avatar through Second Life wherever your friend asked you to take it. The two of you would communicate by voice, say through Skype, and you would describe everything you saw.


Danielle Mirliss (dmirliss at yahoo.com, Danielle Damone in SL) also comments,

I also work closely with several students on my campus that are blind and they would be willing to give us feedback on the experience.

Disability in SL

The BBC disability website, Ouch!, discusses some of the appeal of SL in Staying in is the New Going Out:
A new nightclub called Wheelies officially opens its doors this Friday, the 1st September, at 9pm UK time.

Owned by Simon Stevens, who has cerebral palsy, Wheelies aims to make guests feel comfortable about disability as well as dancing, drinking and just plain having a good time.

And in the comments, Kopilo Hallard quite rightly says,
The point is that he couldn't go out and socialise and SL gives him a platform so that he can meet his needs (ie socialisation) even in his current physical state.

This gives him an escape from reality, a breath from being physically unable to do things.

Besides that point, SL is a great way to network with people from all over the world, and to gain perspectives which may not be able to be gained in one's geographical region due to cultural, social or other norms.

Also SL gives developing artists - music, graphics, programming, etc. - a way of gaining exposure which they cannot easily gain in their day-to-day life; in a similar way to MySpace, if you like, except the music can be played live.

Additionally, on the SLED mailing list, Jeff Hiles (Farley Scarborough in SL) recommends,
In addition to the many articles on the Web about Simon Stevens and his Wheelies night club, you may want to look at Fez Rutherford's blog, "2nDisability". He has created avatar animations that simulate disabilities.

Also, Cubey Terra has made three very nice wheelchairs that are available free at the GNUbie Store at Indigo. They are down the ramp and to the left.

http://slurl.com/secondlife/Indigo/195/62/40

Thursday 29 November 2007

Mailing Lists & Fora

Generally, accessibility issues at Linden Lab seem to be handled by John Lester, AKA Pathfinder Linden.

"Pathfinder Linden: well, I'm very interested in things we might be able to do from LL's perspective to make SL more accessible

Pathfinder Linden: so please hit me with recommendations :)"

The following resources might be useful:


SLED also has a forum with a thread called Supporting Visually Impaired Users, though it looks like it hasn't had any posts since January 2007.

From that forum Jonathon Richter has the following to say,
I concur that we ought to frame the problem in terms of the various affordances that Second Life as a medium allows its users - indeed, the benefits of SL over other types of learning media are precisely the selling points as to why we want accessibility to these incredible learning environments, yes? So, first - documenting the various affordances and the skills/inputs required to successfully navigate the media and receive said benefits is crucial.
And Jeff Hiles (Farley Scarborough in SL) has this to point out:
"There's a slogan in the disability-rights movement that goes 'Nothing about us without us.'"

Following are some extracts from the SLED mailing list that deal with visual impairment:

Different users of Second Life

Jeff Hiles writes,
But I think the most promising technological aid lies in ways to make SL chat accessible, since the SL client isn't accessible to screen readers.
If I had to accommodate someone today, I'd approach it like a stage production and provide a skilled audio describer. The describer might join the blind person or communicate through VOIP. The person could then tell the describer where to move the avatar and what to do with it, while the describer summed up the scenes along the way. If it was done right, perhaps no one in SL would know the person was blind.

Ideally, the person's screen reader would have access to his avatar's chat so that communication would be direct, not through the describer. I'm not sure if that level of accessibility is possible yet.


SimTeach hosts the Education Wiki and also recently published a transcript of a meeting called "Supporting Students With Disabilities", which was about
"discussing accessibility issues within Second Life, with a particular focus on how to best accommodate students with disabilities when SL is used for educational purposes"


This is the summary:

* The user interface and software of SL does not currently allow much freedom in regards to how it is manipulated (e.g., mouse versus keyboard). The UI is currently not JAWS-compliant either. The use of XML-based user interfaces in future versions could provide great flexibility for tuning the software to a user's needs.

* The vast amount of visual information in SL is currently inaccessible to residents with visual disabilities. The addition of metadata (like the ALT and LONGDESC tags used for images in HTML) was suggested. While enforcing the inclusion of helpful metadata is tricky, it was agreed that educational builds at least should adhere to a standard.

* Regarding accommodations for a student with disabilities in SL, it was suggested that equivalent RL practices could be applied. A blind student might have a companion to assist him or her. This led to an interesting question regarding whether the companion or the student or both would have avatars in SL.



The following parts of the conversation have been cut from their context and reassembled without the intervening and off-topic posts:

Kate Spatula: have any of you had an instance where a person with a disability, say visual issues, was involved in a class using SL?
buridan Simon: /not that i know of

Ali Andrews: not yet

Gus Plisskin: Kate: Not visual issues, but I've built foot pedals for those with carpal tunnel who can't use mouse buttons.

Kate Spatula: so that's one concern... the interface is very mouse-heavy on here, isn't it?

buridan Simon: /mouse heavy as compared to?

Ali Andrews: especially when building

Janor Slichter: more keyboard commands to drive menus and actions?

Gus Plisskin: yes, but SL needs mouse heavy. An alternative would be very tough

Janor Slichter: the way gestures work in chat?

buridan Simon: i dunno, i find that i use the arrow keys a fair amount

buridan Simon: /and the tab

otakup0pe Neumann: Hello everyone. I know that lots of builders do just that

otakup0pe Neumann: Rez a cube, and use tab/arrow keys / numpad for the specifics

buridan Simon: /what would be nice is better proximity detection for friends and colleagues with audio cues.... so a friend approaches and a sound could get louder....

Rubaiyat Shatner: I think a big issue with accessibility is to somehow expose the data so that it can be read if it is text and translated if not

Corwin Carillon: if the client was JAWS compliant you would get some of that with HUDs buridan

Janor Slichter: Kate, are you referring to being able to add special functions, like with add-ins, to the client, to accommodate certain needs?

Kate Spatula: that's one approach that could be taken, or providing hooks for external software to use (like JAWS requires), or these could all be options built in to SL

otakup0pe Neumann: I sense this is a direction that LL wants to move in... but i really have no idea

Farley Scarborough: JAWS access and keyboard access are both very standard on Windows apps

Kate Spatula: so here's the difficult question, if you had a class where SL was a key facet, and one student was blind, what would you do?

Ellie Brewster: seems to me that you'd have to get them a companion

Ellie Brewster: just as you do in a rl class

buridan Simon: /All of the students that i've had that were visually impaired had a companion assigned anyway

Kate Spatula: so would they have an avatar on here or just the companion or both

Farley Scarborough: There are professional describers we use in RL

Krisjohn Twin: @Kate: I just walked into this room, sat down at a pre-defined spot and started typing. How hard could that be to script for someone who is blind? Most of the 3D interface in SL is wasted.

buridan Simon: /it is true the 3d doesn't matter as much as proximity

Krisjohn Twin: Heck, an IRC bridge to this room would probably be more than enough to participate in this discussion.

Ellie Brewster: you can use sound files as cues. Tie them to the scenery

Farley Scarborough: Ah, but the visuals... they aren't wasted on the blind.

otakup0pe Neumann: And scripting movement will get more interesting with libsl.

Farley Scarborough: Listen to an audio described movie

Kate Spatula: so let's consider this room. could we augment it to make it more accessible beyond just visual

Farley Scarborough: the visual description is very important

Gus Plisskin: For the person who's visually-impaired, rather than blind

Ellie Brewster: what about using a different channel for viz impaired?

Gus Plisskin: with description? that'd work

otakup0pe Neumann: Do you mean chat channel Ellie ?

Ellie Brewster: yes

buridan Simon: i think someone has an irc bridge

otakup0pe Neumann: there are several

otakup0pe Neumann: we have developed one (we being my company)

otakup0pe Neumann: and i know there is one with libsl

buridan Simon: /Actually i know irc, and im bridge

otakup0pe Neumann: and the #secondlife irc channel runs one

Kate Spatula: i'm looking right now at pictures of some famous philosophers hanging on the walls. the environment could provide a list of tagged objects to the user

Kate Spatula: which would be useful to scripters as well

otakup0pe Neumann: my company is in the process of developing a "hidden" metadata system for SL objects

otakup0pe Neumann: uhh. hidden is a bad word.

otakup0pe Neumann: ubiquitous ? heh.

buridan Simon: /hah good luck with that... tagging perhaps, but object standard metadata...

Kate Spatula: the challenge, as it is in web accessibility, is making sure the data is provided

buridan Simon: /cidoc is a bugger

otakup0pe Neumann: maybe metadata is also a poor word ;)

buridan Simon: /metadata is the word... it means data about data

otakup0pe Neumann: I know. There are many kinds of metadata.

otakup0pe Neumann: And as we just saw, only so much room in a script.

Kate Spatula: i'm sure i could force rubaiyat to tag Trotsky's, but what about *insert random place* here

otakup0pe Neumann: and kate, good point again. tagging the whole grid is a daunting task =O

otakup0pe Neumann: let alone both grids !

otakup0pe Neumann: and having them all work together.

Ali Andrews: but isn't it tagged already, in the edit window?

buridan Simon: /tagging is also an area where you will have a good number of people who vary, and some who participate resistantly by tagging wrongly

otakup0pe Neumann: That's a different kind of tagging Ali.

Ali Andrews: how is it different? It can list the name, description... it just needs to be done consistently as we do when we build web pages

buridan Simon: 'everything is a cube'

otakup0pe Neumann: This is true Ali

otakup0pe Neumann: Consistency is the key.

Kate Spatula: there is a difference, ali.

Ali Andrews: so at least for our educational builds we can start a standard

otakup0pe Neumann: it's metadata, but not strictly descriptive

otakup0pe Neumann: i wonder how many "objects" are around here.

Kate Spatula: web pages have a structure that supports the use of those descriptions. however, accessing just names and descriptions on here is fairly unstructured

otakup0pe Neumann: lack of consistency....

buridan Simon points out that there are standards, and it is better to attempt to conform to a standard than to create one anew

Bryan Mnemonic: does linden tag any objects with metadata at all?

Pathfinder Linden: not really, not in the sense you're thinking about

otakup0pe Neumann: object name, description, groups, that is all metadata

otakup0pe Neumann: but yeah. not too "descriptive"

Kate Spatula: kind of like all the image alt tags that say "image"

Kate Spatula: here's a related issue... avatars and disability. aside from pathfinder with his lack of a nose and rubaiyat's inability to store fat, none of our avatars really display signs of disability

Bryan Mnemonic: I wonder if that can be added to the "edit" build window so folks can begin adding specific tags, or a limited number of them based on a drop-down menu

Kate Spatula: going back to our blind student, would his avatar have a white cane and sunglasses?

otakup0pe Neumann: kate : it's up to them

buridan Simon: /it would seem to me that identity is up to them

Bryan Mnemonic: for instance, we right click on this couch, and when we click "more" perhaps there could be an additional tab

Bryan Mnemonic: with metadata options

buridan Simon: /metadata that can be filled with 'arse' and related words

Farley Scarborough: Ah, but of course the blind student couldn't see where to click, so wouldn't have a mouse

Kate Spatula: i'm not saying to enforce avatar appearances, but consider the importance of avatar image to reflecting disability

otakup0pe Neumann: do you think other avatars would require that visual cue?

Kate Spatula: there are two sides to the issue. one is whether other avatars need the cues, but then there's also the importance of digital identity and one's avatar image

Pathfinder Linden: there was a paper about social cues and avatars in SL recently...

Farley Scarborough: To expand Kate's question... Many people with disabilities you can see in RL can go through SL without anyone knowing. Could someone who can't see?

otakup0pe Neumann: i think it should be the option of the avatar in question

Kate Spatula: that's the point... are the avatar options enough to reflect some of these needs

otakup0pe Neumann: both sides really. does the blind student want to advertise that they are blind ?

otakup0pe Neumann: and do other avatars want those cues ?

Pathfinder Linden: aha, here it is: http://www.stanford.edu/group/vhil/papers/second_life_nonverbal.pdf

Kate Spatula: actually, they might want to advertise that they're blind, or in a wheelchair, etc. it's part of their identity and it's an issue of whether that person wants to express that and if they can express that

otakup0pe Neumann: but it's up to them.

Kate Spatula: RL example: i once roleplayed with a person who insisted that his elf cleric had a wheelchair built by gnomes

Farley Scarborough: It's only up to them if they can get about without anyone knowing.

Kate Spatula: from a disability critical studies viewpoint, SL has an extreme bias towards physical perfection and able-bodiedness

buridan Simon: /metadata is always a social and organizational issue first

Kate Spatula: for me, i think one of the larger open questions is how do people with disabilities view SL... both in terms of its usability and its potential

Farley Scarborough: There's a slogan among disability rights advocates: "Nothing about us without us."

Krisjohn Twin: For me, a larger question is how do people who barely know how to turn on a PC cope with SL at all? I've got plenty of able-bodied staff that have no chance coping with this environment.
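As an aside, buridan Simon's proximity-cue idea above is simple to prototype client-side. A sketch, assuming the viewer can poll nearby avatar positions (the types and names here are hypothetical):

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Map the distance to a friend's avatar onto a cue volume:
    // full volume when adjacent, silent beyond maxRange metres.
    float proximityGain(const Vec3& self, const Vec3& other, float maxRange = 20.0f)
    {
        const float dx = other.x - self.x;
        const float dy = other.y - self.y;
        const float dz = other.z - self.z;
        const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        return std::clamp(1.0f - dist / maxRange, 0.0f, 1.0f);
    }

Played through a looping ambient sound per friend, the cue would get louder as they approached, exactly as suggested.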

Friday 23 November 2007

The Acoustic Ecology of the First Person Shooter

I posted about Mark Grimshaw's PhD thesis on another blog earlier this year, but as it's relevant to this project too I thought it might be useful to repeat the link, especially as he gave a cut-down presentation at DiGRA '07 in Tokyo.

Audio RTS

It sounds like an unusual prospect, but Jean-Luc Pontico has created Sound RTS, a convincing audio-only real-time strategy game. It's cross-platform (Windows, Linux, Mac), localised (French, English, German, Italian, Spanish), and free!

I'm particularly impressed with the menu system and the speech samples used: very clear and easy to use. Real-time 3D audio works reasonably well to give the impression of units moving around, and continual reports of building progress are useful, analogous to visually observing the state of a build or a progress bar.

Apparently it has multiplayer features and allows players to use their own maps as well. There's an active forum and blog.

This is an impressive game with high quality production values and a complete RTS implementation.

Tuesday 20 November 2007

Crackdown Audio

At the weekend I was speaking to Roland Peddie, one of my old colleagues from the games industry. He mentioned that his last game, Crackdown, received a BAFTA for its use of audio; he was the audio programmer.

Team Xbox have an interview with Raymond Usher, the audio director, about The Audio of Crackdown in which he refers to the code Roland wrote:

"a revolutionary audio shader that uses the surrounding geometry to dynamically color and time delay the direct and indirect audio reflections."
"When we heard the results of our complex Reverb/Reflections/Convolution or “Audio-Shader” system in Crackdown, we knew that we could make our gunfights sound like that, only in real-time! Because we are simulating true reflections on every 3D voice in the game, with the right content we could immerse the player in a way never before heard."


Real-time "early-reflection" processing might be useful for our current project as a way of situating the user in a complex and dynamic environment.

Funkhouser, Thomas A.; Tsingos, Nicolas; Carlbom, Ingrid; Elko, Gary; Sondhi, Mohan; West, Jim. Interactive Acoustic Modeling for Virtual Environments <http://www.cs.princeton.edu/~funk/acoustics.html> (Last accessed 22nd November 2007)



Crackdown. Real Time Worlds (Microsoft Game Studios: 20th February 2007). Xbox 360.

See also Metacritic and VGChartz

Monday 19 November 2007

Gesture Self-Voicing

It's not my intention to continue adding self-voicing capabilities to this viewer, but I did just throw together some code which announces gestures. This currently only works on your own avatar, but similar code could clearly announce the (visual-only) gestures and animations of other nearby users.

I've committed my code changes to the repository. Please let me know if you're able to use them, as I haven't tested getting the code and building from scratch, and I'm also only committing those changes that I think are needed, but I could very well be wrong.
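For anyone curious, the shape of the change is tiny (the committed code may well differ; speakText() here stands in for whatever wrapper the self-voicing viewer already uses for chat):

    #include <string>

    void speakText(const std::string& text); // assumed self-voicing wrapper

    // Hypothetical hook, called when the local avatar plays a gesture.
    // Announcing other residents' gestures would hook the equivalent
    // remote-animation event.
    void onGestureTriggered(const std::string& gestureName)
    {
        speakText("Gesture: " + gestureName);
    }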

Wednesday 14 November 2007

Self Voicing : Proof of concept

Without wanting to jump the gun and announce this project as a success before it's got started, I just had a nice affirmation that self-voicing is useful.

I wanted to take some screenshots to illustrate accessibility problems for another post to this blog, but I was dealing with some other applications and still had my headphones on while my self-voicing SL viewer was running in the background, with my avatar at the start location on Orientation Island. This is clearly a busy thoroughfare, so it's perhaps not surprising that after a while someone would try to chat to me. What did surprise me was when my viewer spoke the chat text and interrupted me from my other work.

I'd only tested it with bots and alts previously, which is a bit contrived. This occasion was less of a technical affirmation and more of a social one: the viewer alerted me to something that needed my response, similar to the way instant message clients use audio to alert the user. In that sense it worked: someone in SL wrote me a message and wanted me to respond. Although I was AFK as far as SL was concerned, I was still able to keep my presence 'alive' without needing to see the chat message. The same thing happened a little later with a friendship notification.

It's also interesting to reflect on the adage "One person's accessibility issue is another's usability issue." I am ostensibly working on an accessibility project, but it turns out to have positive usability results.

Friday 9 November 2007

Audio Game Walkthrough

I find it difficult to play audio games. I imagine this is because I do not have the quality of auditory attention that blind players do, and despite having many years of experience playing and developing conventional games, this difference in auditory skill clearly affects my ability to design a game for a blind player.

In order to try to immerse myself in the space of an audio game I've been listening to Ivan Fegundez's walkthrough of GMA Games' Lone Wolf Mission 2.

My initial reactions to this recording and to my own playthrough of Terraformers were similar, in that I felt confused and alienated by the audioscape. I wonder, though, if this is simply the unfamiliarity of interface, context and meaning that anyone faces when playing, or watching someone else play, a new game.



Aside from the mildly comic interruptions from the speaker's mother and the ringing phone, I found it very interesting to listen to this game, as it gave me an opportunity to try to get inside the head of an accomplished audio gamer. One of the most interesting aspects was the way I tried to adapt to the audio-only stimulus: by shutting my eyes I found that I could increase my concentration on the sounds of the game, and despite the extremely fast speech announcements, after some time I found that I was filtering for only the relevant information based on pattern recognition. After hearing the spoken announcements from the game I became used to the structure of the sentences, and was able to focus my attention only on those key phrases which contained the variable data, for example "Island 100 off port twenty three hundred yards". With this data I was able to construct a mental model in real time. Projecting myself into this mental space, I felt my relation to the other game entities in terms of direction and distance, such that when the submarine's engine was running I could imagine myself moving forwards through the space, using the announcements to maintain triangulation between myself and the other objects in the water.
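That mental model is essentially dead reckoning: each announcement is a bearing plus a range, which fixes the object's position relative to the listener. The conversion is one line of trigonometry (a generic sketch, not GMA Games' code):

    #include <cmath>

    struct Position { double x, y; }; // x to starboard, y dead ahead

    // Convert a spoken bearing (degrees relative to our heading) and
    // range into listener-relative coordinates.
    Position fromBearingRange(double bearingDegrees, double rangeYards)
    {
        const double pi = 3.141592653589793;
        const double rad = bearingDegrees * pi / 180.0;
        return { rangeYards * std::sin(rad), rangeYards * std::cos(rad) };
    }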

Accessibility SDKs

The Mozilla developers have an extensive article on MSAA, with lots of general advice for developers:
  • Use MSAA whenever you have created custom controls where you're handling the drawing, mouse and keyboard accessibility in your own code. MSAA is the only way to let all AT's know what your code is doing.

  • Don't use MSAA if you're not running on a Windows platform ... hope that one was obvious.

  • Don't use MSAA for exposing documents or other specialized data, if it's important that the user get access to formatting information.
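For a custom-drawn client like SL's, the entry point into MSAA is the WM_GETOBJECT window message: when an assistive technology asks, the window hands back an IAccessible implementation describing its UI. The Windows-side plumbing is brief (gAccessible, the object that actually describes the interface, is the real work and is assumed to exist):

    #include <windows.h>
    #include <oleacc.h> // link with oleacc.lib

    extern IAccessible* gAccessible; // app-provided description of the UI

    // Called from the window procedure on WM_GETOBJECT: answer
    // screen readers' queries with our IAccessible implementation.
    LRESULT handleGetObject(HWND hwnd, WPARAM wParam, LPARAM lParam)
    {
        if (static_cast<DWORD>(lParam) == static_cast<DWORD>(OBJID_CLIENT) && gAccessible)
            return LresultFromObject(IID_IAccessible, wParam, gAccessible);
        return DefWindowProc(hwnd, WM_GETOBJECT, wParam, lParam);
    }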

The RNIB, as usual, has some good advice on Effective Keyboard Access:
"All input functionality needs to be keyboard operable but not necessarily all user interface elements. If there are multiple ways to perform a task, only one of them needs to be available from the keyboard, though it is best if all possible forms are keyboard operable."
ISO/FDIS 9241-171:2007, 9.3.2 Enable full use via keyboard, Note 5.

In particular they highlight the following issues pertinent to SL:
We often come across screens that contain a huge number of controls. There are sometimes good reasons for this but less cluttered screens are often more usable.

Tab order is a critical aspect of accessibility for a keyboard user.

It should be possible to move the keyboard focus to non-editable interface elements

This is followed by a section on The Programmatic Interface with the following key points:
Access technologies need to be able to identify accurately all the different types of controls and their labels

Visible focus - This is the 'I-beam', highlight or outline that indicates which screen element has the input focus, ie where an action from the keyboard will take place. This is essential information for a keyboard or voice input user who doesn't have the luxury of just moving the mouse and clicking.

Compatibility with access technologies - This is mainly achieved by using standard accessibility services provided by the operating system and software toolkits.
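A concrete viewer-side starting point for the tab-order and visible-focus points above is an explicit tab order that includes non-editable elements, with the focused element as the single place to draw a focus ring and to notify ATs. A minimal sketch (Widget is a hypothetical stand-in for the viewer's UI element type):

    #include <cstddef>
    #include <vector>

    struct Widget; // the viewer's UI element type (placeholder)

    // One explicit list of every focusable element, editable or not.
    class TabOrder
    {
    public:
        void add(Widget* w) { widgets_.push_back(w); }
        Widget* current() const
        {
            return widgets_.empty() ? nullptr : widgets_[index_];
        }
        Widget* next() // bound to the Tab key
        {
            if (!widgets_.empty())
                index_ = (index_ + 1) % widgets_.size();
            return current();
        }
    private:
        std::vector<Widget*> widgets_;
        std::size_t index_ = 0;
    };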

In terms of Second Life, there are clients/viewers for three different operating systems, which would imply using (at least) three different accessibility SDKs:
  • OS X
  • Windows
  • KDE, GNOME (Unix)

This current pilot project will only attempt a Windows prototype client. To be fully cross-platform, something like an abstracted accessibility API would need to be implemented in the application (similar to Mozilla's technique), wrapping the OS-specific APIs.
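The shape of that abstraction might be something like this: the viewer talks to one small interface, and each platform supplies a backend wrapping MSAA, the OS X accessibility API or AT-SPI (all names hypothetical):

    #include <memory>
    #include <string>

    // What the viewer needs from the platform, independent of the OS.
    class AccessibilityBackend
    {
    public:
        virtual ~AccessibilityBackend() = default;
        virtual void announce(const std::string& text) = 0;           // speak or report text
        virtual void focusChanged(const std::string& widgetName) = 0; // notify ATs of focus moves
    };

    // One implementation per platform, selected at build time.
    std::unique_ptr<AccessibilityBackend> makePlatformBackend();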

This approach would seem to be appropriate for the user navigating around the window-like elements of SL, but something more is needed to describe the main content of the screen. Whether this is sonification similar to that used in Terraformers, or a Guide-Bot as imagined by Josh Markwodt, or the descriptive and radar techniques prototyped by IBM, is as yet unclear. User testing on a variety of prototypes would need to be conducted to get a better idea of which way to proceed.

Local Services (Brighton, UK)

These resources might be useful for contacting visually impaired people in the local area, for interviews and application testing:

National Association of Local Societies for Visually Impaired People, region 2 (South East) has a number of local societies, including The Brighton Society for the Blind.

The RNIB has a residential home in Brighton, Wavertree House.

Brighton and Hove City Council Sensory Services team includes Rehabilitation Officers for the Visually Impaired (ROVI).

Thursday 8 November 2007

Accessibility Analysis Literature Review

I've been considering the work others have already conducted on analysing the inaccessibility of SL:

Abrahams Accessibility
The client does not run in a browser, it runs in its own window, it does not use HTML to any great extent and therefore the Web Accessibility standards (WAI) are not sufficient and in some cases not relevant.
Anyone that has a vision impairment and uses a screen reader to access a computer and the web can not access SL, because even the textual information displayed in the client is not accessible by the screen reader.
  1. Include an accessibility section in the help.
  2. Make the help screens accessible without a mouse.
  3. Make the text in help sizable.
  4. Make any text on the client configurable for size and color, including the menus, the avatar names, messages.
  5. Enable the numbering of objects on the screen so that instead of having to click on an object you can choose the object by number (rather like the 'say what you see' feature in Vista).
  6. A text-to-voice feature for chat, in stereo so that the avatar's location can be estimated, and the ability to configure the voice to fit the avatar.
  7. Provide a text list of avatars in the vicinity and voice announcements of entries and exits.
  8. Simulation of an electronic white stick.
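Point 6 in Abrahams' list, placing the synthesised voice in stereo, reduces to computing a pan value from where the speaking avatar stands relative to the listener. A sketch:

    #include <algorithm>
    #include <cmath>

    // Pan in [-1, +1] (full left to full right) for a speaker at
    // (dx, dy) relative to a listener facing along +y, +x to the right.
    float panForSpeaker(float dx, float dy)
    {
        const float halfPi = 1.5707963f;
        const float bearing = std::atan2(dx, dy); // radians, 0 = dead ahead
        return std::clamp(bearing / halfPi, -1.0f, 1.0f); // 90 degrees or more = hard pan
    }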


Second Life Class Action Suit
The first, one-time barrier is that the registration process uses a captcha that a blind person cannot use; for a solution to this problem see 'Bloor helps ITA do it better than Google'.
But the real problem comes with the user interface, which gives a visual representation of the SL terrain, any avatars in your vicinity, any object you can interact with, and any instruction displayed on SL notice displays. None of this information is available via a screen reader and none of it can be pointed at without a mouse. Further, the controls such as chat, search, help can only be activated by a mouse click.


No Second Life For Visually Impaired
If you access the Second Life client viewer with a screen reader like Hal, JAWS or Window-Eyes, nothing will be spoken aloud, and nothing will appear on your braille line.
Presently, not only is SL not compatible with screen readers, the SL website itself is largely inaccessible to people with visual impairments. Feedback from an online questionnaire I designed demonstrates that 8 out of 10 visually impaired users were unable to register for an account on the SL website. This is due to the fact that the site does not conform with W3C accessibility guidelines. Linked images have no alt attributes and form fields do not link correctly.

After attempting to register for an account one questionnaire participant responded by saying:

“I found no easy step by step guide that would say what to expect, or even give me any reason to overcome the obstacles for joining.” On their reasons for wanting to join SL: “...an online community to join. But only if it represented a cross-section of real life. I’m not interested in anything that so flagrantly excludes disabled people.”


Accessibility and Second Life (Revised)
A student relying solely on a screen reader will be shut out from Second Life.

What to do if you have a visually impaired student in a course using Second Life? Think about what learning objectives made you choose Second Life. Is it communication? Maybe alternate chatrooms or Skype could be enabled.

Is it a visual experience? Then you can treat Second Life as you would other graphics or animation - that is, provide lots of descriptive text.


Accessibility and democracy in Second Life
It would require a tremendous amount of Alt tagging and/or audio describing to make the rich and evolving virtual world of "Second Life" intelligible, useful and enjoyable to blind and low-vision users.


[SLED] Blind people in SL - Idle speculation
This would be easier with a client SDK that could trap text, use text to speech and allow keyboard macros, but given the existing client could we not have a HUD or head mounted scripted object that 'spoke' information. Location, people's names as they came and went, object IDs. Within the current system, these would probably have to be pre-recorded and linked to specific text, say in a notecard. Alternatively, objects in an 'accessible' area could be able to self report, say if someone approached them within a certain distance for a certain time. This area could be made the home location for participants. We could even run a competition to design accessible vending machines that used sound instead/as well as text.

To aid people with visual impairments - most people who are blind aren't actually 'blind' - it would be great to have control over field of view in the client, which could effectively allow mouse view of a small angle to be the equivalent of a magnified image, much as PC viewing software allows the whole screen to be enlarged. Sadly, this would not easily include text. However, if we had a HUD object repeating any 'heard' text in the mouselook view, then even this might be possible. This would require chat in the mouselook view...

Ah well, maybe when I have a PhD student to throw at it...


[SLED] Re: Blind people in SL - Idle speculation
However, the Second Life client doesn't currently give screen reader access to chat or IM text. In fact, you can't even read the menus with JAWS. If the client did have that most basic accessibility--chat, IM and menus--blind users would still need some assistance getting around.


[IxDA Discuss] Target.com Loses Accessibility Law Suit
I was part of a discussion of accessibility of virtual worlds like Second Life, for people who "browse with their ears". It turned out that the first problem wasn't even in Second Life itself. It was that the login page was designed inaccessibly. People using a screen reader couldn't even get into the worlds to find out if they could use them or not. Nothing special, new or difficult. Just a login screen. But just as much a barrier as any locked door.


Three Dimensional Web Interfaces
Perhaps we should not focus exclusively on screen readers and haptics to provide access for blind people in 3D virtual reality. If the aim of virtual reality is to become more and more lifelike, let's think about the actual real-life experience of individuals moving about in the real world and how they interact with other people.

Blind and low-vision people generally are mobile outside familiar surroundings with the aid of a cane, a guide dog or a sighted companion. When more assistance is needed, there is usually a store staff person or a passerby whom one can ask for directions or other information. This latter is not something that just blind people do. It is natural human behaviour.

Why not have a service avatar provide a similar service? Imagine a humanoid robot like C-3PO, the protocol android in Star Wars, who could guide the avatar of a player, give verbal directions, describe scenes and activities, etc. This is rather like a personal tour guide. Add some more services, like language translation for players in other countries, ASL for players who are deaf, and information retrieval to answer questions knowledgeably, and you broaden the appeal and usefulness of such an avatar. They would serve more than just the sight-impaired players.

I think there is a lot of technology already out there that could be brought to bear on this. In Japan, for example, some stores have robots that can greet customers and even take them to a particular department. Voice and natural language recognition, text-to-speech and text-to-ASL engines, and language translation software are already very advanced and improving. The underlying architecture of the virtual space must have some basic navigation functions that might respond to verbal commands in lieu of a joystick or whatever it is that players use to travel about in Second Life.

A service companion avatar should probably become a standard feature in 3D virtual reality in the same way that online help is a ubiquitous feature in Windows.

Tuesday 6 November 2007

Code Available

I've set up a project page on Google Code where you can download the source to my viewer. You should follow the instructions on how to download and build the default viewer first; once you've successfully built that locally, you can try using my indra directory instead.

Good luck!
Please post on the project page or here if there are any problems.

Thursday 1 November 2007

Self Voicing

I've just added self voicing to the Windows viewer.
Here's an example.

I launch SL from Visual Studio and walk up to an object called "Healthy", who chats to me. Everything he and I write in chat is spoken.
I also demo clicking the object, which responds with a chat message and issues me a notification, also spoken.
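The post doesn't say which speech engine is involved, but on Windows the obvious candidate is SAPI 5, and the core of a speak-this-line call is only a few lines. A minimal standalone sketch (the spoken string is demo text):

    #include <windows.h>
    #include <sapi.h> // SAPI 5

    int main()
    {
        CoInitialize(nullptr);
        ISpVoice* voice = nullptr;
        if (SUCCEEDED(CoCreateInstance(CLSID_SpVoice, nullptr, CLSCTX_ALL,
                                       IID_ISpVoice,
                                       reinterpret_cast<void**>(&voice))))
        {
            // SPF_ASYNC returns immediately, so speech wouldn't block a render loop.
            voice->Speak(L"Hello from the self-voicing viewer", SPF_ASYNC, nullptr);
            voice->WaitUntilDone(5000); // only needed in this standalone demo
            voice->Release();
        }
        CoUninitialize();
        return 0;
    }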



I got my inspiration for this test from the following films:





You can purchase this product from SL Exchange.