Friday, 9 November 2007

Accessibility SDKs

The Mozilla developers have an extensive article on MSAA, with lots of general advice for developers:
  • Use MSAA whenever you have created custom controls where you're handling the drawing, mouse and keyboard accessibility in your own code. MSAA is the only way to let all ATs know what your code is doing.

  • Don't use MSAA if you're not running on a Windows platform ... hope that one was obvious.

  • Don't use MSAA for exposing documents or other specialized data, if it's important that the user get access to formatting information.

The RNIB, as usual, has some good advice on Effective Keyboard Access:
"All input functionality needs to be keyboard operable but not necessarily all user interface elements. If there are multiple ways to perform a task, only one of them needs to be available from the keyboard, though it is best if all possible forms are keyboard operable."
ISO/FDIS 9241-171:2007, 9.3.2 Enable full use via keyboard, Note 5.

In particular they highlight the following issues pertinent to SL:
We often come across screens that contain a huge number of controls. There are sometimes good reasons for this but less cluttered screens are often more usable.

Tab order is a critical aspect of accessibility for a keyboard user.

It should be possible to move the keyboard focus to non-editable interface elements.
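To make the tab-order point concrete, here is a minimal sketch of a focus manager that honours an explicit tab order and lets non-editable elements (such as labels or read-only fields) receive keyboard focus. All class and method names are hypothetical illustrations, not part of any real toolkit.

```python
class Element:
    """A UI element that may or may not be editable, but can still take focus."""
    def __init__(self, name, editable=False, focusable=True):
        self.name = name
        self.editable = editable
        self.focusable = focusable

class FocusManager:
    """Cycles keyboard focus through elements in a deliberate tab order."""
    def __init__(self, tab_order):
        # tab_order: the sequence a keyboard user should visit, which may
        # deliberately include non-editable elements
        self.tab_order = [e for e in tab_order if e.focusable]
        self.index = 0

    @property
    def focused(self):
        return self.tab_order[self.index]

    def tab(self, backwards=False):
        """Move focus forward (Tab) or backward (Shift+Tab), wrapping round."""
        step = -1 if backwards else 1
        self.index = (self.index + step) % len(self.tab_order)
        return self.focused

# A read-only label placed in the tab order alongside an editable control:
label = Element("Volume label", editable=False)
slider = Element("Volume slider", editable=True)
fm = FocusManager([label, slider])
```

The point of the sketch is that focusability is independent of editability: a screen-reader user can Tab onto the label and have it announced, even though it accepts no input.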

This is followed by a section on The Programmatic Interface, with the following key points:
Access technologies need to be able to identify accurately all the different types of controls and their labels.

Visible focus - This is the 'I-beam', highlight or outline that indicates which screen element has the input focus, i.e. where an action from the keyboard will take place. This is essential information for a keyboard or voice input user who doesn't have the luxury of just moving the mouse and clicking.

Compatibility with access technologies - This is mainly achieved by using standard accessibility services provided by the operating system and software toolkits.
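The "identify all the controls and their labels" requirement boils down to exposing, for every element, a role, an accessible name and a state set. Real APIs work along these lines (MSAA's IAccessible exposes accName, accRole and accState, for instance); the model below is a hypothetical Python illustration of the same idea, not any actual SDK.

```python
# Hypothetical state flags; real APIs define their own constants.
FOCUSED, FOCUSABLE, READONLY = "focused", "focusable", "readonly"

class AccessibleNode:
    """One node in a tree of accessible elements, as an AT would see it."""
    def __init__(self, role, name, states=(), children=()):
        self.role = role            # e.g. "button", "checkbox", "text"
        self.name = name            # the label an AT will announce
        self.states = set(states)
        self.children = list(children)

    def describe(self):
        """Roughly what a screen reader might announce for this node."""
        state = ", ".join(sorted(self.states)) or "no state"
        return f"{self.name}, {self.role} ({state})"

    def find_focused(self):
        """Walk the tree looking for the node with input focus, if any."""
        if FOCUSED in self.states:
            return self
        for child in self.children:
            hit = child.find_focused()
            if hit:
                return hit
        return None
```

If a custom-drawn control fails to populate any one of these three pieces of information, the AT can see that something is there but cannot tell the user what it is or what it will do.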

In terms of Second Life, there are clients/viewers for three different operating systems, which would imply using (at least) three different accessibility SDKs: MSAA on Windows, the Apple accessibility API on OS X, and AT-SPI on Unix (used by GNOME and KDE).

This current pilot project will only attempt a Windows prototype client. To be fully cross-platform, an abstracted accessibility API would need to be implemented in the application (similar to Mozilla's technique), wrapping each OS-specific API.
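The abstraction idea can be sketched very simply: the viewer talks to a single backend interface, and each platform supplies an implementation wrapping its native SDK. Everything below is a hypothetical sketch under that assumption; the class names and methods are invented for illustration, and a real Windows backend would forward these calls to MSAA rather than append to a log.

```python
import sys
from abc import ABC, abstractmethod

class AccessibilityBackend(ABC):
    """The one interface the viewer codes against, regardless of platform."""
    @abstractmethod
    def announce(self, text): ...
    @abstractmethod
    def notify_focus_change(self, element_name): ...

class MsaaBackend(AccessibilityBackend):
    """Windows implementation; would wrap MSAA event notifications.
    Here it just records calls so the sketch is self-contained."""
    def __init__(self):
        self.log = []
    def announce(self, text):
        self.log.append(("announce", text))
    def notify_focus_change(self, element_name):
        self.log.append(("focus", element_name))

class NullBackend(AccessibilityBackend):
    """Fallback when no platform SDK is wrapped yet."""
    def announce(self, text): pass
    def notify_focus_change(self, element_name): pass

def make_backend(platform=None):
    """Pick the backend for the current (or a given) platform."""
    platform = platform or sys.platform
    if platform.startswith("win"):
        return MsaaBackend()
    # OS X and AT-SPI backends would slot in here as they are written.
    return NullBackend()
```

The attraction is that the Windows-only pilot can ship with just the MSAA backend, while the rest of the viewer code never mentions a platform API directly.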

This approach would seem to be appropriate for the user navigating around the window-like elements of SL, but something more is needed to describe the main content of the screen. Whether this is sonification similar to that used in Terraformers, or a Guide-Bot as imagined by Josh Markwodt, or the descriptive and radar techniques prototyped by IBM, is as yet unclear. User testing on a variety of prototypes would need to be conducted to get a better idea of which way to proceed.
