Friday, 30 November 2007

SLED Accessibility Threads

Following are excerpts from the SLED mailing list dealing with accessibility and visual impairment.


SLeek for the vision impaired?

Sean FitzGerald (seanf at tig.com.au) writes,

But the really neat thing about SLeek that leads me to think it may be useful for the vision impaired (if screen readers work with it) is that it has a "Follow" function that lets you nominate to hitch your av to another av in range... a virtual version of a vision impaired person holding the arm of a guide. It works quite well. Then the guide just has to describe the environment.



Blind people in SL - Idle speculation

Mike Reddy (mike.reddy at newport.ac.uk) writes,

This would be easier with a client SDK that could trap text, use text-to-speech and allow keyboard macros, but given the existing client could we not have a HUD or head-mounted scripted object that 'spoke' information: location, people's names as they came and went, object IDs? Within the current system, these would probably have to be pre-recorded and linked to specific text, say in a notecard. Alternatively, objects in an 'accessible' area could self-report, say if someone approached them within a certain distance for a certain time. This area could be made the home location for participants. We could even run a competition to design accessible vending machines that used sound instead of/as well as text.
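In-world, Mike's self-reporting objects would be written in LSL using a proximity sensor; purely as an illustration of the dwell-time trigger he describes, here is a minimal Python sketch (the 10 m radius, 3 s dwell time, and vending-machine text are all invented values, not Second Life defaults):

```python
# Sketch of a self-describing object's trigger logic: announce itself
# once an avatar has stayed within a set radius for a set time.

class SelfDescribingObject:
    def __init__(self, description, radius=10.0, dwell_time=3.0):
        self.description = description
        self.radius = radius
        self.dwell_time = dwell_time
        self.entered_at = {}   # avatar -> time they first came into range

    def update(self, avatar, distance, now):
        """Called periodically; returns the description once an avatar
        has stayed in range long enough, else None."""
        if distance > self.radius:
            # Out of range: forget them, so the dwell timer restarts.
            self.entered_at.pop(avatar, None)
            return None
        first_seen = self.entered_at.setdefault(avatar, now)
        if now - first_seen >= self.dwell_time:
            return self.description
        return None

vendor = SelfDescribingObject("Accessible vending machine: touch to hear the menu.")
print(vendor.update("SomeAvatar", distance=5.0, now=0.0))  # just arrived: no announcement yet
print(vendor.update("SomeAvatar", distance=5.0, now=4.0))  # dwelled long enough: speaks
```

The real thing would live in an LSL sensor event and use llSay or a sound clip instead of returning a string, but the gating logic would be the same.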

To aid people with visual impairments - most people who are blind aren't actually 'blind' - it would be great to have control over field of view in the client, which could effectively allow mouse view of a small angle to be the equivalent of a magnified image, much as PC viewing software allows the whole screen to be enlarged. Sadly, this would not easily include text. However, if we had a HUD object repeating any 'heard' text in the mouselook view, then even this might be possible. This would require chat in the mouselook view...
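The magnification Mike describes follows from simple projection geometry: narrowing the camera's field of view scales everything on screen by roughly the ratio of the tangents of the half-angles. A quick Python check (the 60-degree default is an illustrative figure, not the actual SL client value):

```python
import math

def zoom_factor(default_fov_deg, narrow_fov_deg):
    """Approximate on-screen magnification gained by narrowing a
    perspective camera's field of view."""
    half = lambda deg: math.radians(deg) / 2.0
    return math.tan(half(default_fov_deg)) / math.tan(half(narrow_fov_deg))

# Narrowing an assumed 60-degree default view to 10 degrees:
print(round(zoom_factor(60, 10), 1))  # roughly 6.6x magnification
```

So even a modest FOV control would give low-vision users a useful screen-magnifier effect for the 3D scene, though, as Mike notes, not for the 2D UI text drawn on top of it.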

Ah well, maybe when I have a PhD student to throw at it...


Jeff Hiles (jeffrey.hiles at wright.edu) writes,

As Danielle said, right now you would have to pair a blind student with another student or with an assistant who could navigate, read, and describe what's on the screen. That's not unique to people with visual disabilities, though.

The visually impaired could participate more directly, though, if the SL client was accessible to screen readers. I know blind people who have embraced instant messaging with clients that work with JAWS. So, in theory, it would be possible for people who can't see to carry on their own text conversations in Second Life. That degree of independence, I think, would make the experience more immediate and immersive.

However, the Second Life client doesn't currently give screen reader access to chat or IM text. In fact, you can't even read the menus with JAWS. If the client did have that most basic accessibility--chat, IM and menus--blind users would still need some assistance getting around.


Accessibility

Lisa Dawley (lisadawley at boisestate.edu) writes,
I was doing a presentation in our amphitheater one day. A gentleman in a wheelchair asked me if I could make the stadium accessible, because there wasn't a seat large enough for him to "land" his wheelchair and he had to float.


Second life for the visually impaired

Roome, Thomas C (thomas.roome at student.utdallas.edu) writes,

In the near future the Internet will make a shift from web sites to 3D environment spaces. The same information that is on a web site can be made available to people in a 3D environment, but the question is: how can a 3D environment be accessible to people with disabilities? The UTD Accessibility Island will be trying to find answers to this question. One of the island's goals is to provide information on video game accessibility and general information on different disabilities. Another goal is to create a conference center for people to discuss different topics around accessibility. The last major goal of the island is to provide some land for research and development, and I want to form an in-world research team of scripters, programmers, educators and people with disabilities. If you would like to become a research team member, then please contact Tom06 Castro or e-mail thomas.roome at student.utdallas.edu



Further thoughts on people with visual disabilities in Second Life

Jeff Hiles (jeffrey.hiles at wright.edu) writes,

When I work with JAWS users in real life, they sometimes ask me to give them my arm and guide them where they need to go. What if you could "give an arm" in Second Life and lead someone around? Better yet, what if you could do that invisibly so no one else in Second Life knew you were there? The key would be for you to be able to guide someone remotely, without having to be in the same room as the person you were guiding.

For example, as a guide, you would have the ability to move your friend's avatar through Second Life, and to see what that avatar would see. But your friend would have control of chat and IM. From your computer, you would move the avatar through Second Life wherever your friend asked you to take it. The two of you would communicate by voice, say through Skype, and you would describe everything you saw.
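No such shared-control mode exists in the Second Life client; purely as a sketch of the division of control Jeff proposes, with movement reserved for the remote guide and chat reserved for the avatar's owner, here is a toy Python model (all names and the API shape are invented):

```python
class SharedAvatar:
    """Toy model of one avatar with two operators: the owner keeps chat,
    a remote guide keeps movement. Not a real Second Life API."""
    def __init__(self, owner, guide):
        self.owner = owner
        self.guide = guide
        self.position = (0, 0)
        self.chat_log = []

    def move(self, who, dx, dy):
        if who != self.guide:
            raise PermissionError("only the guide may move the avatar")
        x, y = self.position
        self.position = (x + dx, y + dy)

    def say(self, who, text):
        if who != self.owner:
            raise PermissionError("only the owner may chat")
        self.chat_log.append(text)

av = SharedAvatar(owner="student", guide="assistant")
av.move("assistant", 3, 4)   # guide steers the avatar remotely
av.say("student", "Hello!")  # owner speaks for themselves
print(av.position, av.chat_log)  # (3, 4) ['Hello!']
```

The point of the split is the one Jeff makes: the blind user keeps the conversation, and with it the immediacy, while navigation is delegated.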


Danielle Mirliss (dmirliss at yahoo.com, Danielle Damone in SL) also comments,

I also work closely with several students on my campus who are blind, and they would be willing to give us feedback on the experience.

4 comments:

minister of enlightenment said...

so basically, SL is not accessible. I must say, I find the solutions proposed less than compelling: I've been looking at the novint falcon: i think a touch aware interface with "gps" capabilities would be interesting. After all, that is how I navigate IRL: I hardly use a sighted guide....

http://home.novint.com/products/novint_falcon.php

Gareth R White said...

Thanks for your comments, MoE.

Your idea of using a virtual GPS is definitely interesting, and one that others have picked up on too. In some ways the virtual nature of SL makes it almost redundant though as the constraints of our physical laws do not always apply here. Specifically SL provides Search and Teleport features which allow you to find people, places and events, and instantaneously relocate your avatar to their location.
This high-level movement then is not problematic, it's rather the fine-grained control that your GPS isn't able to help you with in the real world either. Once your RL GPS or SL Search and Teleport has got you to the general area of interest you still need some means of finding your way around that space.
It's at this stage that your suggestion of the Novint Falcon would be useful as a haptic bridge between the virtual and actual worlds - a sort of white stick or long cane used as a mobility tool to feel the surrounding environment.
It would be great if haptic devices like the Novint were widespread, but in practice we'll have a much broader impact by providing a free software client that uses audio feedback to provide an impression of the surroundings rather than requiring users to buy an expensive piece of haptic equipment.
At this stage the most important goal for this research project is to quickly create some kind of interface to SL for blind users. Even if it's not terribly compelling, it will be an important first step in establishing an active in-world community. Long-term research goals of finding the best solution would then have a better chance of success because we'd have real users to work with.

minister of enlightenment said...

what would be useful then is a text based subset of SL. The problem with 3D environments like SL's is that they require gestural interaction with that environment, exactly the kind of interaction where a voice guide doesn't help me much. It would be useful if sound were available that would aid me in navigating, and would impart a sense of direction, but i do think that some kind of haptic interface is going to be indispensable, although i see your point about providing "imperfect" access to gain critical mass. SL ported to the wii.... I must confess, I don't see your point about the need for hardware. I use a braille display that costs 1000s of dollars, so the 150k of the Novint Falcon or similar is just another piece of adaptive hardware to acquire.

Gareth R White said...

It's great to have your thoughts on the subject, thanks for coming back!

Also interesting that you mention a text-only interface. I've done some work modifying Linden's client so that chat text is spoken, and am looking into making a lot more of the interface compatible with standard screen readers so that at least that aspect would be usable. There's also a very interesting project called SLeek, a custom-built client that uses normal windows (hence is compatible with screen readers) and no 3D graphics. If you have an SL account (which presumably you don't, as the registration process is notoriously inaccessible) you can use SLeek to log in and chat to other people. There are lots of features missing, but it still looks promising.
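The basic shape of that spoken-chat modification is just a hook that forwards each incoming chat line to a text-to-speech callback. A minimal Python sketch (the message format and the speak() backend are stand-ins, not the actual client code, which is C++):

```python
class SpokenChat:
    """Forward incoming chat lines to a pluggable text-to-speech function.
    In a real client the speak callback would drive a TTS engine or a
    screen-reader API; here it is injected so the routing stays testable."""
    def __init__(self, speak):
        self.speak = speak

    def on_chat(self, sender, text):
        # Prefix each line with the speaker's name so a listener
        # can follow who is talking.
        self.speak(f"{sender} says: {text}")

spoken = []
chat = SpokenChat(speak=spoken.append)  # list stands in for a TTS engine
chat.on_chat("Danielle Damone", "Welcome to the island!")
print(spoken[0])  # Danielle Damone says: Welcome to the island!
```

The hard part in practice is not this routing but deciding what to speak and when, so that overlapping chat, IMs and system messages don't drown each other out.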

You've picked up on one of the important issues with a non-visual interface though: the absence of gestural interaction. I think a lot of communication in Second Life is based on visuals - for instance there's a large range of animations that people use as a kind of virtual body language. I'm concerned about how to communicate this kind of language to a blind user, as I believe the animations can be custom-made by any user, which makes it infeasible to automatically translate them into descriptive text that could be spoken by a screen reader. And although it might be technically feasible for users to manually describe their animations when they're created, I don't think many people would do so.

One of the benefits of being able to develop our own customised clients (as the client source code is publicly available) is that we could actually support haptic devices as an optional extra. You've convinced me that this is an avenue worth investigating. I'll discuss it with some of the other researchers here who've worked with devices like this before. Perhaps we even have some equipment in the university that I could use.

This discussion is very useful for me and I was wondering if you could send me your email address in case I need to get in touch in the future? I'm best reached through my university address which is g.white@sussex.ac.uk

Feel free to continue posting here too though as I know other people read the blog and might be following this thread.