tag:blogger.com,1999:blog-88247782407951327312024-03-13T14:19:24.392+00:00Second Life for the Visually ImpairedGareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.comBlogger36125tag:blogger.com,1999:blog-8824778240795132731.post-91711471455393178902009-09-21T10:47:00.002+01:002009-09-21T10:53:23.984+01:00Audio Game MakerI just came across a game creation tool that I thought readers might be interested in. It's called <a href="http://www.audiogamemaker.com/">Audio Game Maker</a> and is designed for visually impaired users.<br /><br />Sounds interesting. Let me know if you try it out!Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com2tag:blogger.com,1999:blog-8824778240795132731.post-74648203464252535912008-12-19T10:55:00.002+00:002008-12-19T10:59:03.887+00:00KestrelGuido Corona and Bill Carter have posted some more information about their Kestrel project at IBM,<br /><br /><a href="http://services.alphaworks.ibm.com/virtualworlds/">Virtual Worlds User Interface for the Blind</a><br /><br />Looks like really exciting stuff!<br /><br />There's a brief summary on <a href="http://www.virtualworldsnews.com/2008/12/ibm-prototyping-virtual-worlds-interface-for-the-blind.html">Virtual Worlds News</a>.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com1tag:blogger.com,1999:blog-8824778240795132731.post-59502370032186230962008-11-20T12:05:00.002+00:002008-11-20T12:17:34.360+00:00Related WorkI'm not involved in accessibility anymore, but did just come across some really interesting-sounding work.<br /><br />The first two are from the <a href="http://www.icdvrat.rdg.ac.uk/2008/abstracts.htm">International Conference Series on<br />Disability, Virtual Reality and Associated Technologies</a>. 
I don't think the papers are available yet though.<br /><br />Accessible virtual environments for people who are blind – creating an intelligent virtual cane using the Nintendo Wii controller, L Evett, D J Brown, S Battersby, A Ridley and P Smith, Nottingham Trent University, UK<br /><blockquote>People who are blind tend to adopt sequential, route-based strategies for moving around the world. Common strategies take the self as the main frame of reference, but those who perform better in navigational tasks use more spatial, map-based strategies. Training in such strategies can improve performance. Virtual Environments have great potential, both for allowing people who are blind to explore new spaces, reducing their reliance on guides, and aiding development of more efficient spatial maps and strategies. Importantly, Lahav and Mioduser have demonstrated that, when exploring virtual spaces, people who are blind use more and different strategies than when exploring real physical spaces, and develop relatively accurate spatial representations of them. The present paper describes the design, development and evaluation of a system in which a virtual environment may be explored by people who are blind using Nintendo Wii devices, with auditory and haptic feedback. Using this technology has many advantages, not least of which are that it is mainstream, readily available and cheap. The utility of the system for exploration and navigation is demonstrated. Results strongly suggest that it allows and supports the development of spatial maps and strategies. Intelligent support is discussed.</blockquote>Virtual reality rehabilitation – what do users with disabilities want?, S M Flynn, B S Lange, S C Yeh and A A Rizzo, University of Southern California, USA<br /><blockquote>This paper will discuss preliminary findings of user preferences regarding video game and VR game-based motor rehabilitation systems within a physical therapy clinic for patients with SCI, TBI and amputation. 
The video game and VR systems chosen for this research were the Sony PlayStation® 2 EyeToy™, Nintendo® Wii™, and Novint® Falcon™ and an optical tracking system developed at the Institute for Creative Technologies at the University of Southern California. The overall goals of the current project were to 1) identify and define user preferences regarding the VR games and interactive systems; 2) develop new games, or manipulate the current USC-ICT games to address these user-defined characteristics that were most enjoyable and motivating to use; and 3) develop and pilot test a training protocol aimed to improve function in each of the three groups (TBI, SCI and amputation). The first goal of this research will be discussed in this paper.</blockquote>And a presentation called "<a href="http://www.ics.heacademy.ac.uk/events/presentations/669_middlesbroughpres.ppt">Serious Games for People with Physical and Cognitive Impairments</a>"<br /><blockquote>Virtual Cane/Guide Dog<br />WiiMote can be used as a pointing device, and can give auditory, visual and haptic (it rumbles) feedback<br />Virtual cane – uses auditory and haptic feedback<br />Support with an intelligent agent which gives spoken warnings and advice<br />Combine to create a virtual guide dog</blockquote><a href="http://refmanager.ntu.ac.uk/ntu.asp?username=CMP3EVETTLJ">Another publication</a>, though I'm not sure this paper's actually available yet either,<br />EVETT, L., RIDLEY, A., BATTERSBY, S. and BROWN, D., 2008. A Wiimote controlled interface to virtual environments for people who are blind - mental models and attentional demands. In: Interactive Technologies, Nottingham, November 2008.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-83976135713913335082008-10-21T16:42:00.004+01:002008-10-21T16:49:28.210+01:00Paper AvailableOur paper is now available in the ACM Electronic Library. 
If you're unable to access it there you can send me an email or a message here and I'll get a copy to you. Please send an email to G.White at my university, sussex.ac.uk<br /><br />Here are example citation data,<br /><br />White, G. R., Fitzpatrick, G., and McAllister, G. 2008. <a href="http://doi.acm.org/10.1145/1413634.1413663">Toward accessible 3D virtual environments for the blind and visually impaired</a>. In <i>Proceedings of the 3rd international Conference on Digital interactive Media in Entertainment and Arts</i> (Athens, Greece, September 10 - 12, 2008). DIMEA '08, vol. 349. ACM, New York, NY, 134-141. DOI= <a href="http://doi.acm.org/10.1145/1413634.1413663">http://doi.acm.org/10.1145/1413634.1413663</a><br /><br /><br />@inproceedings{1413663,<br />author = {Gareth R. White and Geraldine Fitzpatrick and Graham McAllister},<br />title = {<a href="http://doi.acm.org/10.1145/1413634.1413663">Toward accessible 3D virtual environments for the blind and visually impaired</a>},<br />booktitle = {DIMEA '08: Proceedings of the 3rd international conference on Digital Interactive Media in Entertainment and Arts},<br />year = {2008},<br />isbn = {978-1-60558-248-1},<br />pages = {134--141},<br />location = {Athens, Greece},<br />doi = {<a href="http://doi.acm.org/10.1145/1413634.1413663">http://doi.acm.org/10.1145/1413634.1413663</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />abstract = {3D virtual environments are increasingly used for education, business and recreation but are often inaccessible to users who are visually impaired, effectively creating a digital divide. Interviews with 8 visually impaired expert users were conducted to guide design proposals, and a review of current research into haptics and 3D sound for auditory displays is presented with suggestions for navigation and feedback techniques to address these accessibility issues. 
The diversity and volatility of the environment makes <i>Second Life</i> an unusually complex research object, suggesting the applicability of our work for the field of HCI and accessibility in 3D virtual environments.}<br />}Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-5306474775465294242008-09-23T12:15:00.004+01:002008-09-23T17:17:07.498+01:00Paper PublicationLast week I returned from Athens where I presented our paper at <a href="http://www.dimea2008.org/">DIMEA 2008</a>.<br />I thought the presentation went well, with about 15-20 people in the audience. There were a few good questions at the end where we discussed amongst other things the use of synthetic versus naturalistic sounds for 3D spatialisation, from a semiotic point of view, and the potential of formal, structured data like VRML for environments like <span style="font-style: italic;">Second Life</span>.<br /><br />The presentation itself dealt with 3D virtual environments more generally than the paper, which uses <span style="font-style: italic;">SL</span> as a case study. I demonstrated screen readers and <span style="font-style: italic;">AudioQuake</span> as an example of 3D sonification in an audio game. All presentations in the conference were arranged by theme, and my paper was included in the track called "Social and Collaborative Spaces", so the emphasis of my talk was to raise awareness of the issues for blind and visually impaired users in these environments, and also to call into question just how "social and collaborative" they are when they exclude a sector of society. 
Following the previous day's excellent keynote by Professor Michael Meimaris, in which he talked about "Digital Natives" and "Digital Immigrants", I coined the phrase "Digital Outcasts" in the context of rapidly evolving but inaccessible technology.<br /><br />Please get in touch if you'd like a copy of the paper or presentation slides.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-31732236520208973612008-07-04T15:34:00.002+01:002008-07-04T15:38:28.519+01:00Power Up: The GameAs referred to in a <a href="http://blindsecondlife.blogspot.com/2008/07/publications.html">previous post</a>, some of the IBM folk have developed an accessible game called <a href="http://www.powerupthegame.org/">PowerUp</a>, which is described in the following paper,<br /><br /><pre id="1358752">@inproceedings{1358752,<br />author = {Shari M. Trewin and Mark R. Laff and Anna Cavender and Vicki L. Hanson},<br />title = {<a href="http://doi.acm.org/10.1145/1358628.1358752">Accessibility in virtual worlds</a>},<br />booktitle = {CHI '08: CHI '08 extended abstracts on Human factors in computing systems},<br />year = {2008},<br />isbn = {978-1-60558-012-X},<br />pages = {2727--2732},<br />location = {Florence, Italy},<br />doi = {<a href="http://doi.acm.org/10.1145/1358628.1358752">http://doi.acm.org/10.1145/1358628.1358752</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />abstract = {Virtual worlds present both an opportunity and a challenge to people with disabilities. Standard ways to make such worlds accessible to a broad set of users have yet to emerge, although some core requirements are already clear. 
This paper describes work in progress towards an accessible 3D multi-player game that includes a set of novel tools for orienting, searching and navigating the world.}<br />}<br /></pre>My first impressions suggest that it's quite similar to Second Life in some respects (fixed name lists, Orientation Center).<br /><br />I'd be interested to hear feedback from some blind or VI players.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com5tag:blogger.com,1999:blog-8824778240795132731.post-27741411516696223062008-07-03T15:21:00.003+01:002008-07-06T13:53:01.726+01:00Other Papers<pre id="1358752">Although our project's finished I've just coincidentally come across some interesting papers that seem relevant to our work,<br /><br />Desurvire, H. & Wiberg, C. (2007). Master of the Game: The Crucial Role of Accessibility in Future Game Design. In Wiberg, C & Wiberg, M. (eds.) Proceedings of CMID´07 - The First International Conference on Cross-Media Interaction Design, March 22-25, 2007.<br /><br /><br />@inproceedings{1358752,<br />author = {Shari M. Trewin and Mark R. Laff and Anna Cavender and Vicki L. Hanson},<br />title = {<a href="http://doi.acm.org/10.1145/1358628.1358752">Accessibility in virtual worlds</a>},<br />booktitle = {CHI '08: CHI '08 extended abstracts on Human factors in computing systems},<br />year = {2008},<br />isbn = {978-1-60558-012-X},<br />pages = {2727--2732},<br />location = {Florence, Italy},<br />doi = {<a href="http://doi.acm.org/10.1145/1358628.1358752">http://doi.acm.org/10.1145/1358628.1358752</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />abstract = {Virtual worlds present both an opportunity and a challenge to people with disabilities. Standard ways to make such worlds accessible to a broad set of users have yet to emerge, although some core requirements are already clear. 
This paper describes work in progress towards an accessible 3D multi-player game that includes a set of novel tools for orienting, searching and navigating the world.}<br />}<br /><br />@inproceedings{354375,<br />author = {Jochen Schneider and Thomas Strothotte},<br />title = {<a href="http://doi.acm.org/10.1145/354324.354375">Constructive exploration of spatial information by blind users</a>},<br />booktitle = {Assets '00: Proceedings of the fourth international ACM conference on Assistive technologies},<br />year = {2000},<br />isbn = {1-58113-314-8},<br />pages = {188--192},<br />location = {Arlington, Virginia, United States},<br />doi = {<a href="http://doi.acm.org/10.1145/354324.354375">http://doi.acm.org/10.1145/354324.354375</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />}<br /><br />@inproceedings{274525,<br />author = {Sandy Ressler and Qiming Wang},<br />title = {<a href="http://doi.acm.org/10.1145/274497.274525">Making VRML accessible for people with disabilities</a>},<br />booktitle = {Assets '98: Proceedings of the third international ACM conference on Assistive technologies},<br />year = {1998},<br />isbn = {1-58113-020-1},<br />pages = {144--148},<br />location = {Marina del Rey, California, United States},<br />doi = {<a href="http://doi.acm.org/10.1145/274497.274525">http://doi.acm.org/10.1145/274497.274525</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />}<br /><br />@inproceedings{638263,<br />author = {Chieko Asakawa and Hironobu Takagi and Shuichi Ino and Tohru Ifukube},<br />title = {<a href="http://doi.acm.org/10.1145/638249.638263">Auditory and tactile interfaces for representing the visual effects on the web</a>},<br />booktitle = {Assets '02: Proceedings of the fifth international ACM conference on Assistive technologies},<br />year = {2002},<br />isbn = {1-58113-464-9},<br />pages = {65--72},<br />location = {Edinburgh, Scotland},<br />doi = {<a 
href="http://doi.acm.org/10.1145/638249.638263">http://doi.acm.org/10.1145/638249.638263</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />}<br /><br />@inproceedings{1028657,<br />author = {Andreas Hub and Joachim Diepstraten and Thomas Ertl},<br />title = {<a href="http://doi.acm.org/10.1145/1028630.1028657">Design and development of an indoor navigation and object identification system for the blind</a>},<br />booktitle = {Assets '04: Proceedings of the 6th international ACM SIGACCESS conference on Computers and accessibility},<br />year = {2004},<br />isbn = {1-58113-911-X},<br />pages = {147--152},<br />location = {Atlanta, GA, USA},<br />doi = {<a href="http://doi.acm.org/10.1145/1028630.1028657">http://doi.acm.org/10.1145/1028630.1028657</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />}<br /><br />@inproceedings{634213,<br />author = {Calle Sj\"{o}str\"{o}m},<br />title = {<a href="http://doi.acm.org/10.1145/634067.634213">Using haptics in computer interfaces for blind people</a>},<br />booktitle = {CHI '01: CHI '01 extended abstracts on Human factors in computing systems},<br />year = {2001},<br />isbn = {1-58113-340-5},<br />pages = {245--246},<br />location = {Seattle, Washington},<br />doi = {<a href="http://doi.acm.org/10.1145/634067.634213">http://doi.acm.org/10.1145/634067.634213</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />}<br /><br />@inproceedings{1321285,<br />author = {Mar\'{\i}a J. 
Ab\'{a}solo and Jos\'{e} Mariano Della},<br />title = {<a href="http://doi.acm.org/10.1145/1321261.1321285">Magallanes: 3D navigation for everybody</a>},<br />booktitle = {GRAPHITE '07: Proceedings of the 5th international conference on Computer graphics and interactive techniques in Australia and Southeast Asia},<br />year = {2007},<br />isbn = {978-1-59593-912-8},<br />pages = {135--142},<br />location = {Perth, Australia},<br />doi = {<a href="http://doi.acm.org/10.1145/1321261.1321285">http://doi.acm.org/10.1145/1321261.1321285</a>},<br />publisher = {ACM},<br />address = {New York, NY, USA},<br />}<br /><br /></pre>Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-75142404863772556822008-06-26T10:35:00.002+01:002008-06-26T10:44:29.181+01:00PublicationOur paper's been accepted for a conference in Athens called <a href="http://www.dimea2008.org/">DIMEA 2008</a>: The 3rd ACM International Conference on Digital Interactive Media in Entertainment & Arts.<br /><br />I'll include a citation and link to the paper when it's published.<br /><br />Thanks once again to everyone who participated in this study. Without your input it would have been impossible!Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-67176930475903714482008-06-05T14:45:00.002+01:002008-06-05T14:49:42.639+01:00Study FinishedThank you so much to all of the people who participated in our study, and to those who offered but were not involved.<br /><br />We've now concluded this research project and no longer need any more participants.<br /><br />The result of this work is a paper which has been submitted to an academic conference for publication later this year. We'll post more details when they're available.Gareth R. 
Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com1tag:blogger.com,1999:blog-8824778240795132731.post-80980515965964071482008-02-11T14:03:00.001+00:002008-05-06T20:16:02.494+01:00Blind Volunteers Needed[EDIT: 6th May 2008] This stage of the project is now complete; we're no longer looking for volunteers. Many thanks to those who participated.<br /><br />---<br /><br />For the next stage of our project we will conduct interviews with people who are blind or significantly visually impaired.<br /><br />We'd like to take 30 minutes of your time for a voice chat to hear about your experiences of getting around in the real world, and any experiences you have of doing so in virtual worlds. The aim is to direct our further work developing interaction techniques for blind users in <span style="font-style: italic;">Second Life</span>.<br /><br />For more information please read the <a href="http://www.informatics.sussex.ac.uk/users/gw43/publications/The%20Accessibility%20of%20Second%20Life%20for%20Blind%20Users/Explanatory%20Statement.doc">Explanatory Statement</a> and <a href="http://www.informatics.sussex.ac.uk/users/gw43/publications/The%20Accessibility%20of%20Second%20Life%20for%20Blind%20Users/Consent%20Form.doc">Consent Form.</a><br /><br />If you'd like to participate please send an email to Gareth White, <a href="mailto:G.White@sussex.ac.uk">G.White@sussex.ac.uk</a>, confirming that you've read both of these documents and agree to the conditions listed in the consent form.Gareth R. 
Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com5tag:blogger.com,1999:blog-8824778240795132731.post-15978010636666537872008-01-16T09:46:00.000+00:002008-01-18T14:23:46.341+00:00HapticsFollowing up from an <a href="http://blindsecondlife.blogspot.com/2007/10/haptic-wearables.html">initial thought</a> and a more extensive <a href="http://blindsecondlife.blogspot.com/2007/11/sled-accessibility-threads.html#c5402307325646810143"> conversation on another post</a> I've begun considering a multi-modal approach to this project, mixing spatial audio with haptic feedback from devices such as the <a href="http://home.novint.com/products/novint_falcon.php">Novint Falcon</a>.<br /><br /><div style="text-align: center;"><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://a362.ac-images.myspacecdn.com/images01/100/l_e60f2f3486a02c8d38907baa6aad4829.jpg"><img style="cursor: pointer; width: 200px;" src="http://a362.ac-images.myspacecdn.com/images01/100/l_e60f2f3486a02c8d38907baa6aad4829.jpg" alt="" border="0" /></a><br /></div><br />Briefly, the Falcon is a consumer 3D haptic device targeted at the mass gaming market. Bit-Tech recently published an <a href="http://www.bit-tech.net/gaming/2007/12/07/novint_falcon_limited_edition/1">in-depth review</a> which is worth reading. The product is interesting for this project as a mobility device to complement spatial audio. Think of it as a super-sensitive long cane white stick that could allow <i>SL</i> residents to reach out from the physical world and realistically feel objects in the virtual world.<br /><br />Here's an informative <a href="http://www.gametrailers.com/player/usermovies/47527.html">promotional demo</a>, and a rather <a href="http://www.gametrailers.com/player/usermovies/36783.html">enthusiastic hands-on</a> by CNET's Veronica Belmont from the CES 2007 show. 
Novint also gave an audio <a href="http://cmpmedia.vo.llnwd.net/o1/gdcradio-net/GAMA/Gama_025.mp3">interview</a> with Gamasutra and a <a href="http://home.novint.com/pdf/2007-03-04_GDC_Developer_Presentation.pdf">presentation</a> at the 2007 Game Developers Conference.<br /><br />Optional hardware devices are usually ignored by game developers as the effort required to support them is not justified by the small number of users who own the devices, and as there are few games that support them it's unlikely that many gamers would buy such devices: Catch 22. However as <a href="http://blindsecondlife.blogspot.com/2007/11/sled-accessibility-threads.html#c4273492896031464502%20">pointed out by a reader of this blog</a>, blind users will often spend thousands of dollars on specialist hardware such as Braille keyboards, so the $190 that the Falcon costs is a relatively small investment. Furthermore support for the device in <i>SL</i> and other open source or moddable software can be implemented by the community rather than relying on industry.<br /><br />The Falcon compares favourably to other haptic devices which cost upwards of ten times the price (for example <a href="http://www.sensable.com/products-haptic-devices.htm%20">Sensable Technologies' Phantom</a> range of haptic devices are in the range of <a href="http://inition.co.uk/inition/compare.php?SubCatID=36">multi-thousand pounds Sterling</a>.), and there are already some other academic researchers investigating haptics in <i>Second Life</i> and the Falcon in particular.<br /><br /><a href="http://jeffvandrimmelen.info/">Jeff VanDrimmelen</a>, an Academic Computing Expert in the <a href="http://oasis.unc.edu/">Office of Arts and Sciences Information Services</a>, University of North Carolina at Chapel Hill publishes research on a site called <a href="http://haptic.edutechie.com/research/">Haptic Education - Adding the Tactile Sensation to Virtual Learning</a>.<br /><br /><center><object height="355" width="425"><param name="movie" value="http://www.youtube.com/v/S5h4owxpHcI&rel=1"><param name="wmode" value="transparent"><embed src="http://www.youtube.com/v/S5h4owxpHcI&rel=1" type="application/x-shockwave-flash" wmode="transparent" height="355" width="425"></embed></object></center><br />VanDrimmelen's team have focused on another virtual environment called <a href="http://www.opencroquet.org/index.php/Main_Page">Croquet</a>, but have also considered <span style="font-style: italic;">Second Life</span> and make some interesting observations,<br /><blockquote>The <a href="http://millionsofus.com/blog/archives/24%29" title="The Second Half of Second Life: Haptics">creators of Second Life actually started their project out with a large haptic device</a>, but soon abandoned it for more financially appealing options.<br /><br />In Second Life the only way to navigate with a mouse is to bring up an on screen navigation menu that you have to click to move the avatar. It works okay when the avatar is flying, but otherwise you just end up using the buttons on the handle to move around. 
However, just in case anyone wants to work with the script, <a href="http://haptic.edutechie.com/files/second_life_haptic_USB.PIE" title="GlovePIE Haptic Second Life Script" target="_blank">here it is</a>.<br /></blockquote>In Linden's default client, movement is controlled using the keyboard, but in my own research I have recently been able to control walking and flying using a force feedback joystick (Logitech Wingman Strike Force 3D). This was made possible by using a free 3rd party tool called GlovePIE, which VanDrimmelen's team also employed. The tool works by intercepting output from the joystick and injecting the corresponding keyboard signals, such that moving the joystick left and right turns the <span style="font-style: italic;">Second Life</span> avatar left and right, and moving the joystick forward and backwards moves the avatar forward and back. The same technique is used by VanDrimmelen's team to use the Novint Falcon as an input device for Croquet. This approach appears to offer a very quick and easy way to prototype haptics in <span style="font-style: italic;">Second Life</span>. VanDrimmelen continues, however,<blockquote>It should be noted that about the same time we found the GlovePIE software Novint announced they are working on <a href="http://home.novint.com/games/release_schedule.php" title="Novint Falcon Game Release Schedule">drivers that will work with not only Second Life, but World of Warcraft as well</a>.</blockquote>Currently both of these drivers are "in exploration phase" with no estimated completion date. In their (busy!) <a href="http://home.novint.com/games/release_schedule.php">release schedule</a> Novint also describe another interesting product, "Feelin' It: Blind Games™":<br /><blockquote>Novint will release a number of games that can be played entirely without sight. 
For example, in a bowling game, you will be able to feel the extents of the lane, feel the weight of the ball as it is thrown, and hear the pins crash down. After throwing the ball and hitting the pins, the game will bring up a touchable representation of how the ball traveled down the lane to guide the user's muscle memory for future shots, and the user will be able to feel with a 3D cursor which pins are still standing. All the information needed to play the game and become a true master, will be available without any graphics.</blockquote>Further haptic research in <span style="font-style: italic;">Second Life</span> is being conducted by <a href="http://www.dii.unisi.it/%7Emdepascale">Maurizio de Pascale</a>, <a href="http://www.dii.unisi.it/%7Emulatto">Sara Mulatto</a>, <a href="http://www.dii.unisi.it/%7Edomenico">Domenico Prattichizzo</a> from the <a href="http://sirslab.dii.unisi.it/research/haptic/">Haptics Group</a> of the <a href="http://sirslab.dii.unisi.it/">Siena Robotics and Systems Lab</a>, in the Dipartimento di Ingegneria Informatica at the University of Siena. 
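Incidentally, the GlovePIE technique described earlier, intercepting joystick axes and injecting the corresponding keyboard signals, can be sketched as a small pure function. The deadzone value and key names below are illustrative assumptions of mine rather than anything taken from VanDrimmelen's actual script.<br /><br /><pre>
```python
# Sketch of a GlovePIE-style mapping: joystick axes in, keyboard keys out.
# The deadzone and key names are illustrative assumptions, not values
# from the actual GlovePIE script linked above.

DEADZONE = 0.2  # ignore small stick movements around centre


def axes_to_keys(x, y, deadzone=DEADZONE):
    """Map joystick axes (each in -1.0..1.0) to the set of arrow keys
    that an injection tool like GlovePIE would hold down."""
    keys = set()
    if x < -deadzone:
        keys.add("left")    # turn left
    elif x > deadzone:
        keys.add("right")   # turn right
    if y < -deadzone:
        keys.add("up")      # walk/fly forward (stick pushed forward)
    elif y > deadzone:
        keys.add("down")    # walk/fly back
    return keys
```
</pre>Each polling cycle, the injection tool would compare this set against the keys currently held, sending key-down events for new entries and key-up events for removed ones; that is all it takes to drive an unmodified client from any device the tool can read.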
In particular they have a paper called "<a href="http://sirslab.dii.unisi.it/research/haptic/projects/second_life_haptics/">Bringing Haptics to Second Life: A Haptics-enabled Second Life Viewer for Blind Users</a>", which is due for publication at the <a href="http://hasworkshop.org/orgncomm.shtml">"Haptic in Ambient Systems"</a> conference, which takes place in Quebec City, Canada on February 11-14, 2008.<br /><br /><div style="text-align: center;"><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://sirslab.dii.unisi.it/public/media/screenshot.second_life_h.png"><img style="cursor: pointer; width: 200px;" src="http://sirslab.dii.unisi.it/public/media/screenshot.second_life_h.png" alt="" border="0" /></a><br /></div><br />Judging from the screenshot, I would imagine that the Siena team are not using the Novint, but rather a different haptic device that has a stylus, perhaps one of <a href="http://www.sensable.com/">SensAble Technology</a>'s Phantom range which seem popular in academic research.<br /><br />Another research project that is of interest as inspiration for our <span style="font-style: italic;">Second Life</span> work is the <a href="http://www.isrg.reading.ac.uk/haptictorch/index.htm">Haptic Torch</a> from the <a href="http://www.isrg.reading.ac.uk/home.htm">Interactive Systems Research Group</a> at the University of Reading.<br /><br /><div style="text-align: center;"><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://www.isrg.reading.ac.uk/haptictorch/torch1.jpg"><img style="cursor: pointer; width: 200px;" src="http://www.isrg.reading.ac.uk/haptictorch/torch1.jpg" alt="" border="0" /></a><br /></div><blockquote>"The unique design of the torch allows users to range from sighted individuals in low-light conditions to people who are both deaf and blind. The torch provides a method of alerting users to presence of potentiol hazards using non-contact measurement techniques. 
An subtle tactile (touch) interface conveys relevent information to the user while not interfering with other senses." [sic]<br /></blockquote>Whereas the Haptic Torch is only capable of signifying the presence of objects, the Falcon could be used to reach out and feel their shape, and this immediate physical stimulus will assist the user's construction of a mental map of the virtual space.<br /><br />Of the suppliers that Novint list as selling the Falcon, Fry's looks like the best for delivery outside the USA and Canada, although it should be noted that in the <a href="http://cmpmedia.vo.llnwd.net/o1/gdcradio-net/GAMA/Gama_025.mp3">Gamasutra podcast</a> mentioned earlier, Novint's CEO, Tom Anderson, mentioned that they are providing Falcons to game development studios for free, and that they have also used the Falcon as an interface for medical dental simulators with the Harvard School of Dental Medicine. With their aggressive PR policy perhaps they'd extend this generous offer to other academic research projects too?<br /><br /><br /><a href="http://shop2.outpost.com/template/help/index/FE30/Service3/Assistance/Middle_Topics/A3InternationalShipping">Fry's</a><br />"Through DHL, we quickly ship international orders just about anywhere in the world for very reasonable rates."<br /><br /><a href="http://www.skymall.com/shopping/internationalshipping.htm">SkyMall</a><br />"Many SkyMall products are available for delivery outside the United States."<br /><br /><a href="http://www.tigerdirect.ca/sectors/Help/international.asp">TigerDirect</a><br />"For all international orders, export, and distribution please contact our sales force at: 800-800-8300"<br /><br /><a href="http://www.circuitcity.com/ccd/lookLearn.do?cat=-13415&edOid=105496&c=1">CircuitCity</a><br />"Due to our manufacturer distribution agreements, we are not permitted to ship products to international addresses except APO/FPO or U.S. Territory addresses. 
Circuit City does not ship to Puerto Rico."<br /><br /><a href="https://www.gogamer.com/helppopup.htm?tab=ordering&file=help_shipping_info">GoGamer</a><br />"GoGamer does not ship to International Destinations at this time."<br /><br /><a href="http://www.jr.com/templates/information/pop_shoppingHelp.tem">JR</a><br />"At the present time, we only ship to the Continental U.S., Alaska, Hawaii, U.S. territories, Puerto Rico, and Canada. J&R proudly ships to our Armed Forces APO/FPO customers."Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com5tag:blogger.com,1999:blog-8824778240795132731.post-89425465273916353472007-11-30T11:56:00.000+00:002007-11-30T12:44:10.434+00:00SLED Accessibility ThreadsFollowing are excerpts from the SLED mailing list, which deal with accessibility and visual impairment,<br /><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2007-June/010997.html">SLeek for the vision impaired?</a><br /><br />Sean FitzGerald (seanf at tig.com.au) writes,<br /><br /><blockquote>But the really neat thing about SLeek that leads me to think it may be useful for the vision impaired (if screen readers work with it) is that it has a "Follow" function that lets you nominate to hitch your av to another av in range... a virtual version of a vision impaired person holding the arm of a guide. It works quite well. Then the guide just has to describe the environment.</blockquote><br /><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2006-August/001735.html">Blind people in SL - Idle speculation</a><br /><br />Mike Reddy (mike.reddy at newport.ac.uk) writes,<br /><br /><blockquote>This would be easier with a client SDK that could trap text, use text to speech and allow keyboard macros, but given the existing client could we not have a HUD or head mounted scripted object that 'spoke' information. Location, people's names as they came and went, object IDs. 
Within the current system, these would probably have to be pre-recorded and linked to specific text, say in a notecard. Alternatively, objects in an 'accessible' area could be able to self report, say if someone approached them within a certain distance for a certain time. This area could be made the home location for participants. We could even run a competition to design accessible vending machines that used sound instead/as well as text.<br /><br />To aid people with visual impairments - most people who are blind aren't actually 'blind' - it would be great to have control over field of view in the client, which could effectively allow mouse view of a small angle to be the equivalent of a magnified image, much as PC viewing software allows the whole screen to be enlarged. Sadly, this would not easily include text. However, if we had a HUD object repeating any 'heard' text in the mouselook view, then even this might be possible. This would require chat in the mouselook view...<br /><br />Ah well, maybe when I have a PhD student to throw at it...</blockquote><br /><br />Jeff Hiles (jeffrey.hiles at wright.edu) <a href="https://lists.secondlife.com/pipermail/educators/2006-August/001777.html">wrote</a><br /><br /><blockquote>As Danielle said, right now you would have to pair a blind student with another student or with an assistant who could navigate, read, and describe what's on the screen. That's not unique to people with visual disabilities, though.</blockquote><br /><blockquote>The visually impaired could participate more directly, though, if the SL client was accessible to screen readers. I know blind people who have embraced instant messaging with clients that work with JAWS. So, in theory, it would be possible for people who can't see to carry on their own text conversations in Second Life. 
That degree of independence, I think, would make the experience more immediate and immersive.<br /><br />However, the Second Life client doesn't currently give screen reader access to chat or IM text. In fact, you can't even read the menus with JAWS. If the client did have that most basic accessibility--chat, IM and menus--blind users would still need some assistance getting around.</blockquote><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2007-November/015923.html">Accessibility</a><br /><br />Lisa Dawley (lisadawley at boisestate.edu) writes,<br /><blockquote>I was doing a presentation in our amphitheater one day. A gentleman in a wheel chair asked me if I could make the stadium accessible, because there wasn't a seat large enough for him to "land" his wheelchair and he had to float.</blockquote><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2007-May/009036.html">Second life for the visually impaired</a><br /><br />Roome, Thomas C (thomas.roome at student.utdallas.edu ) writes,<br /><br /><blockquote>In the near future the Internet will make a shift from web sites to a 3D environment spaces. The same information that is on a web site can be available to people in a 3D environment, but the question is how can a 3D environment be accessible for people with disabilities? The UTD Accessibility Island will be trying to find the answers to this question. One of the, island goal is to provide information on video game accessibility and general information on the different disabilities. Another goal is to create a conference center for people to discuss different topics around Accessibility. The last major goal of the island is to provide some land for research and development, and I want to form an in world research team of scripters, programmer, educators and people with disabilities. 
If you would like to become a research team member, then please contact Tom06 Castro or e-mail thomas.roome at student.utdallas.edu</blockquote><br /><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2006-August/001785.html">Further thoughts on people with visual disabilities in Second Life</a><br /><br />Jeff Hiles (jeffrey.hiles at wright.edu) writes,<br /><br /><blockquote>When I work with JAWS users in real life, they sometimes ask me to give them my arm and guide them where they need to go. What if you could "give an arm" in Second Life and lead someone around? Better yet, what if you could do that invisibly so no one else in Second Life knew you were there? The key would be for you to be able to guide someone remotely, without having to be in the same room as the person you were guiding.<br /><br />For example, as a guide, you would have the ability move your friend's avatar through Second Life, and to see what that avatar would see. But your friend would have control of chat and IM. From your computer, you would move the avatar through Second Life wherever your friend asked you to take it. The two of you would communicate by voice, say through Skype, and you would describe everything you saw.</blockquote><br /><br />Danielle Mirliss (dmirliss at yahoo.com, Danielle Damone in SL) also <a href="https://lists.secondlife.com/pipermail/educators/2006-August/001787.html">comments</a>,<br /><br /><blockquote>I also work closely with several students on my campus that are blind and they would be willing to give us feedback on the experience.</blockquote>Gareth R. 
Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com4tag:blogger.com,1999:blog-8824778240795132731.post-60833403769487320202007-11-30T11:00:00.001+00:002007-11-30T11:12:28.814+00:00Disability in SLThe BBC disability website, <i><a href="http://www.bbc.co.uk/blogs/ouch/">Ouch!</a></i>, discusses some of the appeal of <i>SL</i> in <a href="http://www.bbc.co.uk/blogs/ouch/200608/staying_in_is_the_new_going_ou.shtml">Staying in is the New Going Out</a><br /><blockquote>A new nightclub called Wheelies officially opens its doors this Friday, the 1st September, at 9pm UK time.<br /><br />Owned by Simon Stevens, who has cerebral palsy, Wheelies aims to make guests feel comfortable about disability as well as dancing, drinking and just plain having a good time.</blockquote><br />And in the comments, Kopilo Hallard quite rightly says,<br /><blockquote>The point is that he couldn't go out and socialise and SL gives him a platform so that he can meet his needs (ie socialisation) even in his current physical state.<br /><br />This gives him an escape from reality, a breath from being physically unable to do things.<br /><br />Besides that point, SL is a great way to network with people from all over the world. To gain perspectives which may not be abled to be gained in the geographical region due to culture, social or other conforms.<br /><br />Also SL gives developing artists both music, graphic, programming, etc a way of having more exposure which they can not just gain in their day to day life, if you like in a similar way to myspace, except the music can be played live.</blockquote><br />Additionally, on the SLED mailing list, Jeff Hiles (Farley Scarborough in SL) recommends,<br /><blockquote>In addition to the many articles on the Web about Simon Stevens and his<br />Wheelies night club, you may want to look at Fez Rutherford's blog,<br />"<a href="http://2ndisability.blogspot.com/">2nDisability</a>." 
He has created avatar animations that simulate disabilities.<br /><br />Also, Cubey Terra has made three very nice wheelchairs that are available free at the GNUbie Store at Indigo. They are down the ramp and to the left.<br /><br /><a href="http://slurl.com/secondlife/Indigo/195/62/40">http://slurl.com/secondlife/Indigo/195/62/40</a></blockquote>Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-81396659871892073182007-11-29T14:41:00.000+00:002007-11-30T12:38:57.605+00:00Mailing Lists & ForaAt Linden Lab, accessibility issues generally seem to be dealt with by John Lester, AKA <a href="http://zero.hastypastry.net/pathfinder/">Pathfinder Linden</a>.<br /><br /><blockquote>"Pathfinder Linden: well, I'm very interested in things we might be able to do from LL's perspective to make SL more accessible<br /><br />Pathfinder Linden: so please hit me with recommendations :)"</blockquote><br />The following resources might be useful,<br /><br /><ul><li><a href="http://list.academ-x.com/listinfo.cgi/slrl-academ-x.com">Second Life Researchers</a> is a mailing list for researchers (obviously).</li><li><a href="https://lists.secondlife.com/cgi-bin/mailman/listinfo/healthcare">Healthcare Support and Education</a> is "for people interested in or currently using Second Life for Healthcare Support and Education".</li><li><a href="https://lists.secondlife.com/cgi-bin/mailman/listinfo/educators">Second Life Educators</a> [SLED] is a mailing list "for educators interested in or currently using Second Life".</li></ul><br />SLED also has a <a href="http://simteach.com/forum/index.php">forum</a> with a thread called <a href="http://simteach.com/forum/viewforum.php?f=14">Supporting Visually Impaired Users</a>, though it looks like it hasn't had any posts since January 2007.<br /><br />From that forum Jonathon Richter has the following to say,<br /><blockquote>I concur that we ought to frame the problem in terms 
of the various affordances that Second Life as a medium allows its users - indeed, the benefits of SL over other types of learning media are precisely the selling points as to why we want accessibility to these incredible learning environments, yes? So, first - documenting the various affordances and the skills/inputs required to successfully navigate the media and receive said benefits is crucial.</blockquote>And Jeff Farley has this to point out:<br /><blockquote>"There's a slogan in the disability-rights movement that goes 'Nothing about us without us.'"</blockquote><br />Following are some extracts from the SLED mailing list that deal with visual impairment:<br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2007-February/006554.html">Different users of Second Life</a><br /><br />Jeff Hiles writes,<br /><blockquote>But I think the most promising technological aide lies in ways to make SL chat accessible, since the SL client isn't accessible to screen readers.</blockquote><blockquote>If I had to accommodate someone today, I'd approach it like a stage production and provide a skilled audio describer. The describer might join the blind person or communicate through VOIP. The person could then tell the describer where to move the avatar and what to do with it, while the describer summed up the scenes along the way. If it was done right, perhaps no one in SL would know the person was blind.<br /><br />Ideally, the person's screen reader would have access to his avatar's chat so that communication would be direct, not through the describer. 
I'm not sure if that level of accessibility is possible yet.</blockquote><br /><br /><a href="http://simteach.com/wiki/index.php?title=Main_Page">SimTeach</a> hosts the <a href="http://www.simteach.com/wiki/index.php?title=Second_Life_Education_Wiki">Education Wiki</a> and also recently published a transcript of a meeting called "<a href="http://www.simteach.com/wiki/index.php?title=Supporting_students_with_disabilities_Transcript_30_August_2006">Supporting Students With Disabilities</a>", which was about <blockquote>"discussing accessibility issues within Second Life, with a particular focus on how to best accommodate students with disabilities when SL is used for educational purposes"</blockquote><br /><br />This is the summary,<br /><br /><blockquote> * The user interface and software of SL does not currently allow much freedom in regards to how it is manipulated (e.g., mouse versus keyboard). The UI is currently not JAWS-compliant as well. The use of XML-based user interfaces in future versions could provide great flexibility for tuning the software to a user's needs.<br /><br /> * The vast amount of visual information in SL is currently inaccessible to residents with visual disabilities. The addition of metadata (like the ALT and LONGDESC tags used for images in HTML) was suggested. While enforcing the inclusion of helpful metadata is tricky, it was agreed that educational builds at least should adhere to a standard.<br /><br /> * Regarding accommodations for a student with disabilities in SL, it was suggested that equivalent RL practices could be applied. A blind student might have a companion to assist him or her. 
This led to an interesting question regarding whether the companion or the student or both would have avatars in SL.</blockquote><br /><br /><br />The following parts of the conversation have been cut from their context and reassembled without any intervening and off-topic posts,<br /><br /><blockquote>Kate Spatula: have anyone of you had an instance where a person with a disability, say visual issues, was involved in a class using SL?<br />buridan Simon: /not that i know of<br /><br />Ali Andrews: not yet<br /><br />Gus Plisskin: Kate: Not visual issues, but I've build footpedals for those with carpal tunnel who can't use mouse buttons.<br /><br />Kate Spatula: so that's one concern... the interface is very mouse-heavy on here, isn't it?<br /><br />buridan Simon: /mouse heavy as compared to?<br /><br />Ali Andrews: especially when building<br /><br />Janor Slichter: more keyboard commands to drive menus and actions?<br /><br />Gus Plisskin: yes, but SL needs mouse heavy. An alternative would be very tough<br /><br />Janor Slichter: the way gestures work in chat?<br /><br />buridan Simon: i dunno, i find that i use the arrow keys a fair amount<br /><br />buridan Simon: /and the tab<br /><br />otakup0pe Neumann: Hello everyone. I know that lots of builders do just that<br /><br />otakup0pe Neumann: Rez a cube, and use tab/arrow keys / numpad for the specifics<br /><br />buridan Simon: /what would be nice is better proximity detection for friends and colleagues with audio cues.... 
so a friend approaches and a sound could get louder....<br /><br />Rubaiyat Shatner: I think a big issue with accessibility is to somehow expose the data so that it can be read if it is text and translated if not<br /><br />Corwin Carillon: if the cleint was JAWS compliant you would get some of the with HUDs buridan<br /><br />Janor Slichter: Kate, are you referring to being able to add special functions, like with add-ins, to the client, to accomodate certain needs?<br /><br />Kate Spatula: that's one approach that could be taken, or providing hooks for external software to use (like JAWS requires), or these could all be optiosn built in to SL<br /><br />otakup0pe Neumann: I sense this is a direction that LL wants to move in... but i really have no idea<br /><br />Farley Scarborough: JAWS access and keyboard access are both very standard on Windows apps<br /><br />Kate Spatula: so here's the difficult question, if you had a class where SL was a key facet, and one student was blind, what would you do?<br /><br />Ellie Brewster: seems to me that you'd have to get them a companion<br /><br />Ellie Brewster: just as you do in a rl class<br /><br />buridan Simon: /All of the students that i've had that were visually impaird had companion assigned anyway<br /><br />Kate Spatula: so would they have an avatar on here or just the companion or both<br /><br />Farley Scarborough: There are profesional describers we use in RL<br /><br />Krisjohn Twin: @Kate: I just walked into this room, sat down at a pre-defined spot and started typing. How hard could that be to script for someone who is blind? Most of the 3D interface in SL is wasted.<br /><br />buridan Simon: /it is true the 3d doesn't matter as much as proximity<br /><br />Krisjohn Twin: Heck, an IRC bridge to this room would probably be more than enough to participate in this discussion.<br /><br />Ellie Brewster: you can use sound files as cues. 
Tie them to the scenery<br /><br />Farley Scarborough: Ah, but the visual's... they aren't wasted on the blind.<br /><br />otakup0pe Neumann: And scripting movement will get more interesting with libsl.<br /><br />Farley Scarborough: Listen to an audio described movie<br /><br />Kate Spatula: so let's consider this room. could we augment it to make it more accessible beyond just visual<br /><br />Farley Scarborough: the visual description is very important<br /><br />Gus Plisskin: For the person who's visually-impaired, rather than blind<br /><br />Ellie Brewster: what about using a different channel for viz impaired?<br /><br />Gus Plisskin: with description? that'd work<br /><br />otakup0pe Neumann: Do you mean chat channel Ellie ?<br /><br />Ellie Brewster: yes<br /><br />buridan Simon: i think someone has an irc bridge<br /><br />otakup0pe Neumann: there are several<br /><br />otakup0pe Neumann: we have developed one (we being my company)<br /><br />otakup0pe Neumann: and i knwo tehre is one with libsl<br /><br />buridan Simon: /Actually i know irc, and im bridge<br /><br />otakup0pe Neumann: and the #secondlife irc channel runs one<br /><br />Kate Spatula: i'm loolking right now at pictures of some famous philosophers hanging on the walls. the environment could provide a list of tagged objects to the user<br /><br />Kate Spatula: which would be useful to scripters as well<br /><br />otakup0pe Neumann: my company is in the process of developing a "hidden" metadata system for SL object<br /><br />otakup0pe Neumann: uhh. hidden is a bad word.<br /><br />otakup0pe Neumann: ubiquitous ? heh.<br /><br />buridan Simon: /hah good luck with that... tagging perhaps, but object standard metadata...<br /><br />Kate Spatula: the challenge, as it is in web accessibility, is making sure the data is provided<br /><br />buridan Simon: /cidoc is a bugger<br /><br />otakup0pe Neumann: maybe metadata is also a poor word ;)<br /><br />buridan Simon: /metadata is the word... 
it means data about data<br /><br />otakup0pe Neumann: I know. There are many kinds of metadata.<br /><br />otakup0pe Neumann: And as we just saw, only so much room in a script.<br /><br />Kate Spatula: i'm sure i could force rubaiyat to tag Trotsky's, but what about *insert random place* here<br /><br />otakup0pe Neumann: and kate, good point again. tagging the whole grid is a daunting task =O<br /><br />otakup0pe Neumann: let alone both grids !<br /><br />otakup0pe Neumann: and having them all work together.<br /><br />Ali Andrews: but isn't it tagged already, in the edit window?<br /><br />buridan Simon: /tagging is also an area where you will have a good number of people who vary and some who actively resistantly participate by tagging wrongly<br /><br />otakup0pe Neumann: That's a idfferent kind of tagging Ali.<br /><br />Ali Andrews: how is it different? It can list the name, discription... it just needs to be done consistantly as we do when we build web pages<br /><br />buridan Simon: 'everything is a cube'<br /><br />otakup0pe Neumann: This is true Ali<br /><br />otakup0pe Neumann: Consistency is the key.<br /><br />Kate Spatula: there is a difference, ali.<br /><br />Ali Andrews: so at least for our educational builds we can start a standard<br /><br />otakup0pe Neumann: it's metadata ,but not strictly descriptive<br /><br />otakup0pe Neumann: i wonder how many "objects" are around here.<br /><br />Kate Spatula: web pages have a structure that supports the use of those descriptions. 
however, accessing just names and descriptions on here is fairly unstructured<br /><br />otakup0pe Neumann: lack of consistency....<br /><br />buridan Simon points out that there are standards, and it is better to attempt to conform to a standard than to create one anew<br /><br />Bryan Mnemonic: does linden tag any objects with metadata at all?<br /><br />Pathfinder Linden: not really, not in the sense you're thinking about<br /><br />otakup0pe Neumann: object name, description, groups, that is all metadata<br /><br />otakup0pe Neumann: but yeah. not too "descriptive"<br /><br />Kate Spatula: kind of like all the image alt tags that say "image"<br /><br />Kate Spatula: here's a related issue... avatars and disability. aside from pathfinder with his lack of a nose and rubaiyat's inability to store fat, none of our avatars really display signs of disability<br /><br />Bryan Mnemonic: I wonder if that can be added to the "edit" build window so folks ban begin adding specific tags, or a limited number of them based on a drop down menu<br /><br />Kate Spatula: going back to our blind student, would his avatar have a white cane an dsunglasses?<br /><br />otakup0pe Neumann: kate : it's up to them<br /><br />buridan Simon: /it would seem to me that identity is up to them<br /><br />Bryan Mnemonic: for instance, we riright click on this couch, and when we click "more"perhaps there could be an additional tab<br /><br />Bryan Mnemonic: with metadata options<br /><br />buridan Simon: /metadata that can be filled with 'arse' and related words<br /><br />Farley Scarborough: Ah, but of course the blind student couldn't see where to click, so wouldn't have a mouse<br /><br />Kate Spatula: i'm not saying to enforce avatar appearances, but consider the importance of avatar image to reflecting disabilitiy<br /><br />otakup0pe Neumann: do you think other avatars would requrie that visual cue ?<br /><br />Kate Spatula: there are two sides ot the issue. 
one is whether other avtars need the cues, but then there's also the importance of digital identity and one's avatar image<br /><br />Pathfinder Linden: there was a recent paper about social cues and avatars in SL recently...<br /><br />Farley Scarborough: To expand Kate's question... Many people with disabilities you can see in RL can go through SL without anyone knowing. Could someone who cant' see?<br /><br />otakup0pe Neumann: i think it should be the option of the avatar in question<br /><br />Kate Spatula: that's the point... are the avatar options enough to reflect some of these needs<br /><br />otakup0pe Neumann: both sides really. does the blind student want to advertise that they are blind ?<br /><br />otakup0pe Neumann: and do other avatars want those cues ?<br /><br />Pathfinder Linden: aha, here it is: <a href="http://www.stanford.edu/group/vhil/papers/second_life_nonverbal.pdf">http://www.stanford.edu/group/vhil/papers/second_life_nonverbal.pdf</a><br /><br />Kate Spatula: actually, they might want to advertise that they're blind, or in a wheelchair, etc. it's part of their identity and it's an issue of whether that person wants to express that and if they can express that<br /><br />otakup0pe Neumann: but it's up to them.<br /><br />Kate Spatula: RL example: i once roleplayed with a person who insisted that his elf cleric had a wheelchair built by gnomes<br /><br />Farley Scarborough: It's only up to them if they can get about without anyone knowing.<br /><br />Kate Spatula: from a disability critical studies viewoint, SL has an extreme bias towards physical perfection and able-bodiness<br /><br />buridan Simon: /metadata is always a social and organizational issue first<br /><br />Kate Spatula: for me, i think one of the larger open questions is how do people with disabilities view SL... 
both in terms of its usability and its potential<br /><br />Farley Scarborough: There's a slogan among disability rights advocates: "Nothing about us without us."<br /><br />Krisjohn Twin: For me, a larger question is how do people who barely know how to turn on a PC cope with SL at all? I've got plenty of able-bodied staff that have no chance coping with this environment.</blockquote>Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-2634827102336449302007-11-23T17:28:00.000+00:002007-11-23T17:41:57.766+00:00The Acoustic Ecology of the First Person ShooterI posted about Mark Grimshaw's <a href="http://www.wikindx.com/mainsite/phd.html">PhD thesis</a> on <a href="http://game-culture.blogspot.com/2007/06/acoustic-ecology-of-first-person.html">another blog</a> earlier this year, but as it's relevant to this project too I thought it might be useful to repeat the link, especially as he gave a <a href="http://www.digra.org/dl/display_html?chid=http://www.digra.org/dl/db/07311.06195.pdf">cut-down presentation</a> at <a href="http://www.digra2007.jp/">DiGRA '07</a> in Tokyo.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com2tag:blogger.com,1999:blog-8824778240795132731.post-36876426993070190092007-11-23T11:38:00.000+00:002007-11-23T11:55:55.731+00:00Audio RTSIt sounds like an unusual prospect, but Jean-Luc Pontico has created a convincing <a href="http://jlpo.free.fr/soundrts/">Sound RTS</a>. It's cross-platform (Windows, Linux, Mac), localised (French, English, German, Italian, Spanish), and free!<br /><br />I'm particularly impressed with the menu system and the speech samples used. Very clear and easy to use. 
Real-time 3D audio works reasonably well to give the impression of units moving around, and continual reports of building progress are useful and analogous to visually observing the state of a build or progress bar.<br /><br />Apparently it has multiplayer features and allows the players to use their own maps as well. There's an active <a href="http://groups.google.com/group/soundrtschat">forum</a> and <a href="http://soundrts.blogspot.com/">blog</a>.<br /><br />This is an impressive game with high-quality production values and a complete RTS implementation.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-50658499330241203202007-11-20T16:35:00.000+00:002007-11-22T14:32:32.435+00:00Crackdown AudioAt the weekend I was speaking to <a href="http://www.mobygames.com/developer/sheet/view/developerId,238254/">Roland Peddie</a>, one of my old colleagues from the games industry. He mentioned that his last game, Crackdown, received a BAFTA for its use of audio, for which he was the programmer.<br /><br />Team Xbox have an interview with <a href="http://www.mobygames.com/developer/sheet/view/developerId,13330/">Raymond Usher</a>, the audio director, about <a href="http://interviews.teamxbox.com/xbox/1885/The-Audio-of-Crackdown/p1/">The Audio of Crackdown</a> in which he refers to the code Roland wrote:<br /><br /><blockquote>"a revolutionary audio shader that uses the surrounding geometry to dynamically color and time delay the direct and indirect audio reflections."</blockquote><blockquote>"When we heard the results of our complex Reverb/Reflections/Convolution or “Audio-Shader” system in Crackdown, we knew that we could make our gunfights sound like that, only in real-time! 
Because we are simulating true reflections on every 3D voice in the game, with the right content we could immerse the player in a way never before heard."</blockquote><br /><br />Real-time "early-reflection" processing might be useful for our current project as a way of situating the user in a complex and dynamic environment.<br /><br />Funkhouser, Thomas A.; Tsingos, Nicolas; Carlbom, Ingrid; Elko, Gary; Sondhi, Mohan; West, Jim. <i>Interactive Acoustic Modeling for Virtual Environments</i> <<a href="http://www.cs.princeton.edu/~funk/acoustics.html">http://www.cs.princeton.edu/~funk/acoustics.html</a>> (Last accessed 22nd November 2007)<br /><br /><object width="425" height="355"><param name="movie" value="http://www.youtube.com/v/cS0NcLyXXys&rel=1"></param><param name="wmode" value="transparent"></param><embed src="http://www.youtube.com/v/cS0NcLyXXys&rel=1" type="application/x-shockwave-flash" wmode="transparent" width="425" height="355"></embed></object><br /><br /><i>Crackdown</i>. Real Time Worlds (Microsoft Game Studios: 20th February 2007). Xbox 360.<br /><br />See also <a href="http://www.metacritic.com/games/platforms/xbox360/crackdown?q=crackdown">Metacritic</a> and <a href="http://vgchartz.com/games/game.php?id=405">VGChartz</a>Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-1412243949909665952007-11-19T14:57:00.001+00:002007-11-19T15:01:07.450+00:00Gesture Self-VoicingIt's not my intention to continue adding self-voicing capabilities to this viewer, but I did just throw together some code which announces gestures. This currently only works on your own avatar, but clearly similar code could announce (visual-only) gestures and animations from other nearby users.<br /><br />I've committed my code changes to the repository. 
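<br /><br />To give a flavour of what the change does, here is a sketch of the idea in Python pseudocode rather than the viewer's actual C++. The function and strings below are invented for illustration and are not the committed code:

```python
# Sketch of the gesture self-voicing idea. These names are invented for
# illustration; the real viewer is C++ and its API differs.

def gesture_announcement(gesture_name, avatar_name=None):
    """Build the text a self-voicing viewer might speak for a gesture.

    With no avatar_name the gesture is the user's own (the case currently
    implemented); otherwise it is a visual-only gesture played by another
    nearby avatar, which could be announced in the same way.
    """
    if avatar_name is None:
        return "You play gesture " + gesture_name
    return avatar_name + " plays gesture " + gesture_name

# In the viewer the resulting string would be handed to the speech
# engine; here we just print the announcements that would be spoken.
print(gesture_announcement("wave"))
print(gesture_announcement("bow", avatar_name="Pathfinder Linden"))
```

The point is only that the announcement layer is a thin mapping from viewer events to spoken strings; the interesting work is in hooking the gesture event, not in the formatting.<br /><br />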
Please let me know if you're able to use them as I haven't tested getting the code and building from scratch, and I'm also only committing those changes that I <span style="font-style: italic;">think</span> are needed, but I could very well be wrong.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-68437222502977703162007-11-14T11:53:00.000+00:002007-11-14T16:30:20.600+00:00Self Voicing : Proof of conceptWithout wanting to jump the gun and announce this project as a success before it's got started, I just had a nice affirmation that self voicing is useful.<br /><br />I wanted to take some screenshots to illustrate accessibility problems for another post to this blog, but I was dealing with some other applications and still had my headphones on while my self-voicing <span style="font-style: italic;">SL</span> viewer was running in the background with my avatar at the start location on Orientation Island. This is clearly a busy thoroughfare, so it's perhaps not surprising that after a while someone would try to chat to me. What did surprise me was when my viewer spoke the chat text and interrupted me from my other work.<br /><br />I'd only tested it with bots and alts previously, which is a bit contrived. This occasion was less of a technical affirmation and more of a social one - the viewer alerted me to something that needed my response, similar to the way instant message clients use audio to alert the user. In that sense it worked: someone in <span style="font-style: italic;">SL</span> wrote me a message and wanted me to respond. Although I was AFK for the purposes of that application I was still able to keep my presence 'alive' even without needing to <span style="font-style: italic;">see</span> the chat message. 
The same thing happened a little later with a friendship notification.<br /><br />It's also interesting to reflect on the adage "<span style="font-style: italic;">One person's accessibility issue is another's usability issue</span>." I am ostensibly working on an accessibility project, but it turns out to have positive usability results.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com1tag:blogger.com,1999:blog-8824778240795132731.post-37904769659035216782007-11-09T15:54:00.000+00:002007-11-20T12:52:06.845+00:00Audio Game WalkthroughI find it difficult to play audio games. I imagine this is because I do not have the quality of auditory attention that blind players do, and despite having many years of experience playing and developing conventional games, this difference in auditory skill clearly affects my ability to design a game for a blind player.<br /><br />In order to try to immerse myself in the space of an audio game I've been listening to <a href="http://www.sonokids.com/eraygames/index.php?editpage=yes&link=Home&header=Home">Ivan Fegundez's</a> <a href="http://skular.onlinestoragesolution.com/bct950LoneWolfMission2.mp3">walkthrough</a> of <a href="http://www.gmagames.com/">GMA Games</a>' Lone Wolf Mission 2.<br /><br />My initial reactions to this recording and to my own playthrough of Terraformers were similar in that I felt confused and alienated by the audioscape.
I wonder, though, whether this is simply the unfamiliarity of interface, context and meaning that anyone experiences when playing, or watching someone else play, a new game.<br /><br /><object width="425" height="355"><param name="movie" value="http://www.youtube.com/v/G3Dl_nn2fW0&rel=1"></param><param name="wmode" value="transparent"></param><embed src="http://www.youtube.com/v/G3Dl_nn2fW0&rel=1" type="application/x-shockwave-flash" wmode="transparent" width="425" height="355"></embed></object><br /><br />Aside from the mildly comic interruptions from the speaker's mother and the ringing phone, I found it very interesting to listen to this game as it gave me an opportunity to try to get inside the head of an accomplished audio gamer. One of the most interesting aspects was the way I tried to adapt to the audio-only stimulus: by shutting my eyes I found that I could increase my concentration on the sounds of the game, and despite the extremely fast speech announcements, after some time I found that I was filtering for only the relevant information based on pattern recognition. After hearing the spoken announcements from the game I became used to the structure of the sentences, and was able to focus my attention only on those key phrases which contained the variable data. For example "<span style="font-weight:bold;">Island</span> 100 off <span style="font-weight:bold;">port</span> <span style="font-weight:bold;">twenty three</span> hundred yards", and with this data I was able to construct a mental model in real time. Projecting myself into this mental space, I felt my relation to the other game entities in terms of direction and distance, such that when the submarine's engine was running I could imagine myself moving forwards through the space, using the announcements to maintain triangulation between myself and the other objects in the water.Gareth R. 
Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-67401249188283457222007-11-09T13:19:00.001+00:002007-11-09T14:15:15.648+00:00Accessibility SDKsThe Mozilla developers have an <a href="http://www.mozilla.org/access/windows/msaa-server">extensive article on MSAA</a>, with lots of general advice for developers.<br /><ul><li><span class="MSAA_decision">Use MSAA</span> whenever you have created custom controls where you're handling the drawing, mouse and keyboard accessibility in your own code. MSAA is the only way to let all AT's know what your code is doing.</li><br /><li><span class="MSAA_decision">Don't use MSAA</span> if you're not running on a Windows platform ... hope that one was obvious.</li><br /><li><span class="MSAA_decision">Don't use MSAA</span> for exposing documents or other specialized data, if it's important that the user get access to formatting information. </li></ul><br />The RNIB, as usual, has some good advice on <a href="http://www.rnib.org.uk/xpedio/groups/public/documents/PublicWebsite/public_sackeyb.hcsp">Effective Keyboard Access</a><br /><blockquote><strong>"All input functionality needs to be keyboard operable</strong> but not necessarily all user interface elements. If there are multiple ways to perform a task, only one of them needs to be available from the keyboard, though it is best if all possible forms are keyboard operable."<br />ISO/FDIS 9241-171:2007, 9.3.2 Enable full use via keyboard, Note 5.</blockquote><br />In particular they highlight the following issues pertinent to <span style="font-style: italic;">SL</span>:<br /><blockquote>We often come across screens that contain a huge number of controls. 
There are sometimes good reasons for this but less cluttered screens are often more usable.<br /><br />Tab order is a critical aspect of accessibility for a keyboard user.<br /><br />It should be possible to move the keyboard focus to non-editable interface elements</blockquote><br />This is followed by a section on <a href="http://www.rnib.org.uk/xpedio/groups/public/documents/PublicWebsite/public_sacprogram.hcsp">The Programmatic Interface</a> with the following key points<br /><blockquote>Access technologies need to be able to identify accurately all the different types of controls and their labels<br /><br />Visible focus - This is the 'I-beam', highlight or outline that indicates which screen element has the input focus, ie where an action from the keyboard will take place. This is essential information for a keyboard or voice input user who doesn't have the luxury of just moving the mouse and clicking.<br /><br />Compatibility with access technologies - This is mainly achieved by using standard accessibility services provided by the operating system and software toolkits.</blockquote><br />In terms of Second Life, there are clients/viewers for 3 different operating systems which would imply using (at least) 3 different accessibility SDKs: <a href="http://developer.apple.com/documentation/Accessibility/Conceptual/AccessibilityMacOSX/OSXAXModel/chapter_4_section_1.html">OSX</a><br /><a href="http://www.blogger.com/post-create.g?blogID=8824778240795132731">Windows</a><br /><a href="http://accessibility.kde.org/">KDE</a>, <a href="http://developer.gnome.org/projects/gap/">Gnome</a> (Unix)<br /><br />This current pilot project will only attempt a Windows prototype client. 
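To make the RNIB points above concrete, here is a hypothetical model (illustration only, not part of any real accessibility SDK or of the viewer): every element, editable or not, sits in the tab order, and each focus move yields a role-and-label description that an access technology could speak.

```cpp
#include <string>
#include <vector>
#include <cstddef>

// Minimal model of a keyboard-navigable UI element. Note that even
// non-editable elements (static labels) appear in the tab order,
// as the RNIB guidance recommends.
struct UiElement {
    std::string role;   // e.g. "button", "text", "static label"
    std::string label;
};

// Advance focus cyclically through the tab order and return what a
// screen reader would announce for the newly focused element.
std::string tabToNext(const std::vector<UiElement>& elements, std::size_t& focus) {
    focus = (focus + 1) % elements.size();
    const UiElement& e = elements[focus];
    return e.role + ": " + e.label;
}
```

The announced string pairs the control's type with its label, which is exactly the information the RNIB says access technologies must be able to identify.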
In order to be fully cross-platform, something like an abstracted accessibility API would need to be implemented in the application (<a href="http://www.mozilla.org/access/windows/msaa-server#The_Implementations_Behind_IAccessible">similar</a> to Mozilla's <a href="http://www.mozilla.org/projects/ui/accessibility/unix/nsIAccessibleLibrary/nsIAccessibleLibrary.html">technique</a>), wrapping the OS-specific API.<br /><br />This approach would seem to be appropriate for the user navigating around the window-like elements of <span style="font-style: italic;">SL</span>, but something more is needed to describe the main content of the screen. Whether this is sonification similar to that used in <span style="font-style: italic;">Terraformers</span>, or a <a href="http://markwordt.googlepages.com/robotmotionplanningproject">Guide-Bot</a> as imagined by Josh Markwordt, or the descriptive and radar techniques <a href="http://news.bbc.co.uk/1/hi/technology/6993739.stm">prototyped</a> by IBM, is as yet unclear. User testing on a variety of prototypes would need to be conducted to get a better idea of which way to proceed.Gareth R. 
Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-37837690441066717502007-11-09T10:58:00.000+00:002007-11-09T11:09:20.879+00:00Local Services (Brighton, UK)These resources might be useful for contacting visually impaired people in the local area, for interviews and application testing:<br /><br />National Association of Local Societies for Visually Impaired People, region 2 (<a href="http://www.nalsvi.cswebsites.org/default.aspx?page=2046">South East</a>) has a number of local societies, including The <a href="http://www.bsblind.co.uk/full/index.htm">Brighton Society for the Blind</a>.<br /><br />The RNIB has a residential home in Brighton, <a href="http://www.rnib.org.uk/xpedio/groups/public/documents/publicwebsite/public_waverhouse.hcsp">Wavertree House</a><br /><br />Brighton and Hove City Council <a href="http://www.brighton-hove.gov.uk/index.cfm?request=c1105981">Sensory Services</a> team includes Rehabilitation Officers for the Visually Impaired (ROVI).Gareth R. 
Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-7216399771485645902007-11-08T09:30:00.000+00:002008-01-14T15:51:48.048+00:00Accessibility Analysis Literature ReviewI've been considering the work others have already conducted on analysing the inaccessibility of <span style="font-style: italic;">SL</span>:<br /><br /><a href="http://www.it-analysis.com/blogs/Abrahams_Accessibility/2007/1/second_life_open_source_accessible_.html">Abrahams Accessibility</a><br /><blockquote>The client does not run in a browser, it runs in its own window, it does not use HTML to any great extent and therefore the Web Accessibility standards (WAI) are not sufficient and in some cases not relevant.</blockquote><blockquote>Anyone that has a vision impairment and uses a screen reader to access a computer and the web can not access SL, because even the textual information displayed in the client is not accessible by the screen reader.</blockquote><blockquote><ol><li>Include an accessibility section in the help.</li><li>Make the help screens accessible without a mouse.</li><li>Make the text in help sizable.</li><li>Make any text on the client configurable for size and color, including the menus, the avatar names, messages.</li><li>Enable the numbering of objects on the screen so that instead of having to click on an object you can choose the object by number (rather like the 'say what you see feature' in Vista).</li><li>A text-to-voice feature for chat, in stereo so that the avatar's location can be estimated, and the ability to configure the voice to fit the avatar.<br /></li><li>Provide a text list of avatars in the vicinity and voice announcements of entries and exits.</li><li>Simulation of an electronic white stick.</li></ol></blockquote><br /><br /><a href="http://www.it-analysis.com/blogs/Abrahams_Accessibility/2006/11/second_life_class_action.html">Second Life Class Action Suit</a><br /><blockquote>The 
first, one-time barrier, is that the registration process uses a captcha that a blind person cannot use; for a solution to this problem see ‘<a href="http://www.it-analysis.com/business/compliance/content.php?cid=8390">Bloor helps ITA do it better than Google</a>’.</blockquote><blockquote>But the real problem comes with the user interface, which gives a visual representation of the SL terrain, any avatars in your vicinity, any object you can interact with, and any instruction displayed on SL notice displays. None of this information is available via a screen reader and none of it can be pointed at without a mouse. Further, the controls such as chat, search, help can only be activated by a mouse click.</blockquote><br /><br /><a href="http://www.magnifiers.org/news.php?action=fullnews&id=249"> No Second Life For Visually Impaired</a><br /><blockquote>If you access the Second Life Client viewer with a screen reader like Hal, Jaws or Window-Eyes, nothing will be spoken aloud, nothing will appear on your braille line.</blockquote><blockquote>Presently, not only is SL not compatible with screen readers, the SL website itself is largely inaccessible to people with visual impairments. Feedback from an online questionnaire I designed demonstrates that 8 out of 10 visually impaired users were unable to register for an account on the SL website. This is due to the fact that the site does not conform with W3C accessibility guidelines. Linked images have no alt attributes and form fields do not link correctly.<br /><br />After attempting to register for an account one questionnaire participant responded by saying:<br /><br />“I found no easy step by step guide that would say what to expect, or even give me any reason to overcome the obstacles for joining”… their reasons for wanting to join SL - “..an online community to join. But only if it represented a cross-section of real life. 
I’m not interested in anything that so flagrantly excludes disabled people”.</blockquote><br /><br /><a href="http://ets.tlt.psu.edu/gaming/node/212">Accessibility and Second Life (Revised)</a><br /><blockquote>A student relying solely on a screen reader will be shut out from Second Life.<br /><br />What to do if you have a visually impaired student in a course using Second Life? Think about what learning objectives made you choose Second Life. Is it communication? Maybe alternate chatrooms or Skype could be enabled.<br /><br />Is it a visual experience? Then you can treat Second Life as you would other graphics or animation - that is, provide lots of descriptive text.</blockquote><br /><br /><a href="http://kestrell.livejournal.com/343509.html">Accessibility and democracy in Second Life</a><br /><blockquote>It would require a tremendous amount of Alt tagging and/or audio describing to make the rich and evolving virtual world of "Second Life" intelligible,<br />useful and enjoyable to blind and low-vision users.</blockquote><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2006-August/001735.html">[SLED] Blind people in SL - Idle speculation</a><br /><blockquote>This would be easier with a client SDK that could trap text, use text to speech and allow keyboard macros, but given the existing client could we not have a HUD or head mounted scripted object that 'spoke' information. Location, people's names as they came and went, object IDs. Within the current system, these would probably have to be pre-recorded and linked to specific text, say in a notecard. Alternatively, objects in an 'accessible' area could be able to self report, say if someone approached them within a certain distance for a certain time. This area could be made the home location for participants. 
We could even run a competition to design accessible vending machines that used sound instead/as well as text.<br /><br />To aid people with visual impairments - most people who are blind aren't actually 'blind' - it would be great to have control over field of view in the client, which could effectively allow mouse view of a small angle to be the equivalent of a magnified image, much as PC viewing software allows the whole screen to be enlarged. Sadly, this would not easily include text. However, if we had a HUD object repeating any 'heard' text in the mouselook view, then even this might be possible. This would require chat in the mouselook view...<br /><br />Ah well, maybe when I have a PhD student to throw at it...</blockquote><br /><br /><a href="https://lists.secondlife.com/pipermail/educators/2006-August/001777.html">[SLED] Re: Blind people in SL - Idle speculation</a><br /><blockquote>However, the Second Life client doesn't currently give screen reader access to chat or IM text. In fact, you can't even read the menus with JAWS. If the client did have that most basic accessibility--chat, IM and menus--blind users would still need some assistance getting around.</blockquote><br /><br /><a href="http://lists.interactiondesigners.com/pipermail/discuss-interactiondesigners.com/2007-October/021268.html">[IxDA Discuss] Target.com Loses Accessibility Law Suit</a><br /><blockquote>I was part of a discussion of accessibility of virtual worlds like Second Life, for people who "browse with their ears". It turned out that the first problem wasn't even in Second Life itself. It was that the login page was designed inaccessibly. People using a screen reader couldn't even get into the worlds to find out if they could use them or not. Nothing special, new or difficult. Just a login screen. 
But just as much a barrier as any locked door.</blockquote><br /><br /><a href="http://blindconfidential.blogspot.com/2007/09/three-dimensional-web-interfaces.html#7625081911838372691"> Three Dimensional Web Interfaces</a><br /><blockquote>Perhaps we should not focus exclusively on screen readers and haptics to provide access for blind people in 3D virtual reality. If the aim of virtual reality is to become more and more life like, let's think about the actual real life experience of individuals moving about in the real world and how they interact with other people.<br /><br />Blind and low vision people generally are mobile outside familiar surroundings with the aid of a cane, a guide dog or a sighted companion. When more assistance is needed, there is usually a store staff person or a passerby to whom one can ask for directions or other information. This latter is not something that just blind people do. It is natural human behaviour.<br /><br />Why not have a service avatar to provide a similar service. Imagine a humanoid robot like C3PO, the protocol android in Star Wars, who could guide the avatar of a player, give verbal directions, describe scenes and activities, etc. This is rather like a personal tour guide. Add some more services, like language translation for players in other countries, ASL for players who are deaf, information retrieval to answer questions knowledgeably and you broaden the appeal and usefulness of such an avatar. They would serve more than just the sight impaired players.<br /><br />I think there is a lot of technology that is already out there that could be brought to bear on this. In Japan, for example, some stores have robots that can greet customers and even take them to a particular department. Voice and natural language recognition, text to speech and text to ASL engines, language translation software are already very advanced and improving. 
The underlying architecture of the virtual space must have some basic navigation functions that might respond to verbal commands in lieu of a joystick or whatever it is that players use to travel about in Second Life.<br /><br />A service companion avatar should probably become a standard feature in 3D virtual reality in the same way that online help is a ubiquitous feature in Windows.</blockquote>Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-51469067345627191532007-11-06T13:24:00.001+00:002007-11-07T14:58:11.408+00:00Code AvailableI've set up a <a href="http://code.google.com/p/blindsecondlife/">project page</a> on Google code where you can download the source to my viewer. You should follow the instructions on how to download and build the default viewer first, then once you've successfully got that built locally you can try using my indra directory instead.<br /><br />Good luck!<br />Please post on the project page or here if there are any problems.Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com0tag:blogger.com,1999:blog-8824778240795132731.post-39622905850661162072007-11-01T16:00:00.000+00:002007-11-30T13:32:57.468+00:00Self VoicingI've just added self voicing to the Windows viewer.<br />Here's an example.<br /><br />I launch <span style="font-style: italic;">SL</span> from Visual Studio, walk up to an object called "Healthy" who chats to me. 
Everything he and I write in chat is spoken.<br />I also demo clicking the object, to which it responds with a chat and also issues me with a notification which is also spoken.<br /><br /><iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='635' height='527' src='https://www.blogger.com/video.g?token=AD6v5dxOdmxd46bkxyT2QgDjsgEH30G-lTux35ZavHbGREboUGHtha3f98r0nPfxSqT2_iyxjFf6pvMKbjcpvVaokw' class='b-hbp-video b-uploaded' frameborder='0'></iframe><br /><br />I got my inspiration for this test from the following films,<br /><br /><object height="355" width="425"><param name="movie" value="http://www.youtube.com/v/b7tb1wWqGZ8&rel=1"><param name="wmode" value="transparent"><embed src="http://www.youtube.com/v/b7tb1wWqGZ8&rel=1" type="application/x-shockwave-flash" wmode="transparent" height="355" width="425"></embed></object><br /><br /><object height="355" width="425"><param name="movie" value="http://www.youtube.com/v/a209xEeJjL0&rel=1"><param name="wmode" value="transparent"><embed src="http://www.youtube.com/v/a209xEeJjL0&rel=1" type="application/x-shockwave-flash" wmode="transparent" height="355" width="425"></embed></object><br /><br />You can purchase this product from <a href="http://www.slexchange.com/modules.php?name=Marketplace&file=item&ItemID=196663">SL Exchange</a>Gareth R. Whitehttp://www.blogger.com/profile/16484025446664877676noreply@blogger.com1
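A minimal sketch of how the spoken output in a demo like this might be composed (hypothetical helper names, not the viewer's actual code): chat is prefixed with the speaker's name and notifications with a cue word, so a listener can tell the two kinds of event apart by ear.

```cpp
#include <string>

// Format a chat line for speech: lead with the speaker so the
// listener knows who (or what object) is talking.
std::string speakChat(const std::string& speaker, const std::string& text) {
    return speaker + " says: " + text;
}

// Format a notification for speech: a fixed cue word distinguishes
// system events from in-world chat.
std::string speakNotification(const std::string& text) {
    return "Notification: " + text;
}
```

Each formatted string would then be passed to the text-to-speech engine in the order the events arrive.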