{"id":24,"date":"2018-04-09T13:37:32","date_gmt":"2018-04-09T10:37:32","guid":{"rendered":"http:\/\/accessiblerealities.com\/blog\/?p=24"},"modified":"2020-02-05T06:46:27","modified_gmt":"2020-02-05T04:46:27","slug":"accessible-realities-game-accessibility-for-people-who-are-blind-or-have-low-vision","status":"publish","type":"post","link":"http:\/\/accessiblerealities.com\/blog\/accessible-realities-game-accessibility-for-people-who-are-blind-or-have-low-vision\/","title":{"rendered":"Accessible Realities: Making video games and XR accessible for people who are blind or have low vision"},"content":{"rendered":"<h3><span style=\"font-weight: 400;\">Summary<\/span><\/h3>\n<p><em><strong>How can we make it easier for video game developers to add accessibility for people who are blind or have low vision?<\/strong><\/em><\/p>\n<p><span style=\"font-weight: 400;\">In trying to help answer this question I have developed an accessibility software library for the Unreal Engine 4 game engine. This blog post shows the library in action and describes its capabilities.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The main feature of the library is to enable developers to easily provide automated audio cues and descriptions of a 3D scene in front of the user to support their orientation and mobility. The library focuses on scans of the elements (characters and objects) that are in front of the user. Additional features include: first-person view capability, collision prediction, game and area specific audio instructions and slow motion. All of these features are demonstrated in action in the videos below.\u00a0Accessible Realities library is currently in Alpha stage.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Some of Accessible Realities\u2019 features could potentially be reused in other non-game environments, for example in real world narrators like the Microsoft Seeing AI application (see Demo 4 video for details). 
Other intriguing future opportunities are listed in the Future section. My hope is that this library, after additional testing, combined with many other libraries (such as ones that enable accessible menus) and best practices, will help provide better video game and <\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/X_Reality_(XR)\"><span style=\"font-weight: 400;\">XR<\/span><\/a><span style=\"font-weight: 400;\"> experiences for people who are disabled, and specifically for people who are blind or have low vision.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Accessible Realities in its current state is the result of an intensive year of full-time research and development. It was created without any external funding, financed from my own savings. After 20 years in the high-tech industry I decided to take time to contribute to the community via this social impact project.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Game accessibility for the blind is quite a challenge; I am constantly trying to improve the library, and your helpful feedback is very welcome. I wanted to have something to show that is more than an idea: something one can hear and get a feel for, something a tester could try out, rather than hand-waving without a concrete implementation. The library is currently in Alpha, waiting for additional testers and more games. 
I thought that a year into the project, around the time of the GDC, GAConf and CSUN 2018 conferences, would be a good time to share its current status with the community.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The goals of this post are to:<\/span><\/p>\n<ol>\n<li>Share info with the community so others can evaluate it, use it, and build and improve upon it<\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Call for people who are blind or have low vision to join Alpha testing<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Call for video game companies and developers to join Alpha testing<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Ask for feedback from game accessibility experts<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Call for collaborations with social impact intentions (surprise me!)<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">I will be happy to hear from you<\/span><span style=\"font-weight: 400;\">: either via Twitter at <\/span><a href=\"https:\/\/twitter.com\/AccessibleXR\"><b>@AccessibleXR<\/b><\/a> <span style=\"font-weight: 400;\">or via this <\/span><a href=\"http:\/\/bit.ly\/axxr-collaboration\"><b>online form<\/b><\/a><span style=\"font-weight: 400;\">.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Accessible Realities<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">So let\u2019s hear it in action&#8230;<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Demo 1: Platformer Game<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">Platformer game integration: adding a radar that scans which objects are in front of the player.<\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Accessible Realities Demo with Platformer Game\" width=\"840\" height=\"473\" src=\"https:\/\/www.youtube.com\/embed\/rSUJcLQEEAM?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; 
gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p><span style=\"font-weight: 400;\">This is Epic Games\u2019 Platformer sample game. Integration took around 1.5 days.\u00a0<\/span><span style=\"font-weight: 400;\">Note that playing successfully still requires a lot of skill. Clicking immediately as you hear the sound does not guarantee success in the game.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Demo 2: Library Walkthrough in 3D Fighting Game<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">This video walks through the capabilities of the library one by one. It is demonstrated with Epic Games\u2019 Couch Knights sample multiplayer 3D fighting game.<\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Accessible Realities Walkthrough\" width=\"840\" height=\"473\" src=\"https:\/\/www.youtube.com\/embed\/aGffaEXDHAY?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p><span style=\"font-weight: 400;\">Let\u2019s go through the features in the demo:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The main functionality of Accessible Realities is to provide <\/span><a href=\"https:\/\/www.youtube.com\/watch?v=n6kANg1K3nE&amp;feature=youtu.be&amp;t=407\"><span style=\"font-weight: 400;\">locational and instructional accessible audio<\/span><\/a><span style=\"font-weight: 400;\">.\u00a0<\/span><\/p>\n<ul>\n<li><span style=\"font-weight: 400;\"><strong>3D Scene Scans<\/strong> (<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=33\">0:33<\/a>, <a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=97\">1:37<\/a>): as heard in the video, the library enables the developer to easily provide automated audio-based runtime descriptions of the scene in front of the user to support their orientation and mobility. 
\u00a0For cases where high accuracy is required there is a left-to-right 3D Scan of the scene elements that provides, for each element one by one, its detailed location. For example: \u201cTable: far-left, mid-height, near; Couch: center, up, very-far\u201d. Similarly, the library includes another feature (not shown in the video) in which a natural language description is constructed along the lines of \u201cTable is the leftmost element, further to the right and more distant there is a couch which is the most distant element, further to the right there is knight #1 directly in front of you, which is the closest element\u201d.<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Axis Scene Scans:<\/strong> for actions that require fast response times, or for experienced users, there are fast audio scans of the scene elements: first, there is a horizontal left-to-right scan (<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=45\">0:45<\/a>,\u00a0<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=113\">1:53<\/a>). In addition there are a vertical top-to-bottom scan and a distance scan from closest element to most distant (<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=69\">1:09<\/a>). The scans use stereophonic sound to convey each element\u2019s relative left-to-right location<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>First Person View<\/strong> (<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=76\">1:16<\/a>)<strong>:<\/strong> the user can also choose to use a first-person point of view. This could be a great help where third-person or top-down views are too detailed or hard to imagine<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Audio Instructions<\/strong>\u00a0(<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=17\">0:17<\/a>): can be made available to the user for the entire duration of the game by a single keystroke. 
The current granularity is: <strong>Game Instructions<\/strong>; <strong>Accessibility Instructions<\/strong>, which describe accessibility-specific features; and <strong>Scene Instructions<\/strong>, which require a bit more work to set up but provide the user with static audio descriptions of the area they are in (in the videos you can hear areas such as &#8220;on table&#8221;, &#8220;on floor&#8221;, &#8220;on couch&#8221; etc.)<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Radar Zoom <\/strong>(<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=137\">2:17<\/a>)<strong>:<\/strong> when there are many elements in the scene the user can focus by using the radar\u2019s capability to zoom in and out. In addition, the developer can change the active radar tags (which elements are described to the user) at runtime, for example based on the area the player is in<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Collision Prediction <\/strong>(<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=122\">2:02<\/a>)<strong>:<\/strong>\u00a0in many cases the user wants to know what they will bump into if they go straight ahead<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Slow Motion<\/strong>\u00a0(<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=162\">2:42<\/a>) control enables the user to slow down the game in cases of cognitive overload<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Audio Speed<\/strong>\u00a0(<a href=\"https:\/\/youtu.be\/aGffaEXDHAY?t=64\">1:04<\/a>) and <strong>Volume<\/strong> controls are available<\/span><\/li>\n<li><span style=\"font-weight: 400;\"><strong>Choice:<\/strong> the game developer has full control over which features they want to enable for their game<\/span><\/li>\n<\/ul>\n<h4><span style=\"font-weight: 400;\">Demo 3: 3D Fighting Game<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">The following video \u201cspeaks\u201d for itself. 
<\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Accessible Realities Demo with Couch Knights Game\" width=\"840\" height=\"473\" src=\"https:\/\/www.youtube.com\/embed\/06HnNraqzWE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<h4><span style=\"font-weight: 400;\">Demo 4: Real World Audio Augmentation<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">What we see in the video below is a non-game, real-world use case of Accessible Realities. The object recognition tags in the video are created by artificial intelligence in the\u00a0<\/span><span style=\"font-weight: 400;\">cloud. Note: part of the video is played at 2x normal speed.\u00a0It is recommended to watch this video with Closed Captions on (click on the CC icon in the video&#8217;s toolbar). <\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Accessible Realities in Audio-based AR\" width=\"840\" height=\"473\" src=\"https:\/\/www.youtube.com\/embed\/Lsri6037iIE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<p><span style=\"font-weight: 400;\">This is a video recording from a demo Android app I have developed using the Accessible Realities library. However, this is just a proof of concept demonstrating the scene scan in the real world, not the main use case of the library. Real-world audio description is a different problem domain than video games and should be carefully verified, especially with regard to safety concerns. 
Nevertheless, adding a mid-level audio description of all the elements in the scene, one by one with their relative positions, might be a great addition to complement <\/span><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/rich-representations-visual-content-screen-reader-users\/\"><span style=\"font-weight: 400;\">other representation methods of visual content<\/span><\/a><span style=\"font-weight: 400;\">. It could sit alongside methods such as low-level single-element detection (like barcodes or faces) and high-level AI-based scene description. This capability might be useful as an additional channel for the Microsoft <\/span><a href=\"https:\/\/www.microsoft.com\/en-us\/seeing-ai\/\"><span style=\"font-weight: 400;\">Seeing AI<\/span><\/a><span style=\"font-weight: 400;\"> or <\/span><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/product\/soundscape\/\"><span style=\"font-weight: 400;\">Soundscape<\/span><\/a><span style=\"font-weight: 400;\"> mobile apps, enriching the user\u2019s experience. Accessible Realities might also be integrated very easily with technologies that algorithmically scan a real-world scene and translate it into one made of 3D models, as presented in this <\/span><a href=\"https:\/\/www.youtube.com\/watch?v=TDY0KAmspQI\"><span style=\"font-weight: 400;\">video by Resonai<\/span><\/a><span style=\"font-weight: 400;\">. These 3D models could include built-in tags and audio descriptions and so could immediately provide orientation and mobility information for people who are blind or have low vision. 
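<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To make the element-by-element scan a bit more concrete, here is a minimal, self-contained C++ sketch of how an element\u2019s position relative to the viewer could be bucketed into the coarse spoken terms the scans use, such as \u201cTable: far-left, mid-height, near\u201d. The struct, function names and thresholds are purely illustrative assumptions, not the library\u2019s actual API:<\/span><\/p>

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Illustrative sketch (not the library's actual API): bucket an element's
// position relative to the viewer into the coarse spoken terms used by the
// left-to-right scene scan, e.g. "Table: far-left, mid-height, near".
// x: negative = left, positive = right; y: negative = down, positive = up;
// z: distance straight ahead. All thresholds are hypothetical.
struct Element { std::string tag; float x, y, z; };

std::string Horizontal(float x) {
    if (x < -2.0f) return "far-left";
    if (x < -0.5f) return "left";
    if (x <= 0.5f) return "center";
    if (x <= 2.0f) return "right";
    return "far-right";
}

std::string Vertical(float y) {
    if (y < -0.5f) return "down";
    if (y <= 0.5f) return "mid-height";
    return "up";
}

std::string Distance(float z) {
    if (z < 2.0f) return "near";
    if (z < 6.0f) return "far";
    return "very-far";
}

// Left-to-right scan: sort the elements by horizontal position and produce
// one description per element; each description would then be spoken via
// the audio recordings mapped to the element's tag.
std::vector<std::string> ScanLeftToRight(std::vector<Element> elems) {
    std::sort(elems.begin(), elems.end(),
              [](const Element& a, const Element& b) { return a.x < b.x; });
    std::vector<std::string> out;
    for (const Element& e : elems) {
        out.push_back(e.tag + ": " + Horizontal(e.x) + ", " +
                      Vertical(e.y) + ", " + Distance(e.z));
    }
    return out;
}
```

<p><span style=\"font-weight: 400;\">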
A more advanced usage could even provide AI-based navigation instructions for avoiding obstacles and reaching a target, something like a virtual guide dog.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To try out the effectiveness of this scene description method before implementing it, I asked a friend to carefully simulate the scan shown in the video while I was pointing a mobile phone with the camera on and trying (very carefully) to navigate to a certain object. It worked \ud83d\ude42.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Usage<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">Accessible Realities is an infrastructure, not a specific game. Integration is a simple process with five main steps; in Unreal Engine 4 they can all be done using Blueprints, without any coding, if desired:<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Import the library<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Drag and drop a single Blueprint object into the level<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Tag your actors (either manually or programmatically)<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Record (or use existing) audio assets to represent each of your tags<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Override the default configuration parameters using a single key-value editor screen. One of the configuration parameters to edit is the dictionary that maps actor tags to audio assets.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">The library takes it from there&#8230;<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Background and Motivation<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">Digital worlds, realities and games are becoming more and more part of our lives. 
Leveraging the high flexibility of the digital medium we should aim to provide disabled people with equal access to these worlds.<\/span><\/p>\n<blockquote><p><strong><em>\u201cthere are 2.2 billion active gamers in the world\u201d<\/em><\/strong>\u00a0<b><i><span style=\"font-weight: 400;\">&#8211; <\/span><span style=\"font-weight: 400;\"><a href=\"https:\/\/newzoo.com\/insights\/articles\/newzoo-2017-report-insights-into-the-108-9-billion-global-games-market\/\">Newzoo, 2017 report<\/a><\/span><\/i><\/b><\/p>\n<p><b><i>\u201cMore than a billion people in the world today experience disability.\u201d\u00a0<\/i><\/b><span style=\"font-weight: 400;\">&#8211; <\/span><a href=\"http:\/\/www.who.int\/disabilities\/en\/\"><span style=\"font-weight: 400;\">World Health Organization<\/span><\/a><\/p>\n<p><b><i>\u201cAn estimated 253 million people live with vision impairment: 36 million are blind and 217 million have moderate to severe vision impairment\u201d &#8211;\u00a0<a href=\"http:\/\/www.who.int\/mediacentre\/factsheets\/fs282\/en\/\"><span style=\"font-weight: 400;\">World Health Organization<\/span><\/a><\/i><\/b><\/p>\n<p><b><i>\u201cAs well as the numbers making good business sense, there is human benefit. Games are entertainment, culture, socialising, things that mean the difference between existing and living.\u201d &#8211;\u00a0<a href=\"http:\/\/gameaccessibilityguidelines.com\"><span style=\"font-weight: 400;\">GameAccessibilityGuidelines.com<\/span><\/a>\u00a0<\/i><\/b><\/p>\n<p><b><i>\u201cWhenever a game adds an accessibility feature, it feels like it\u2019s made just for me. 
If a game makes an attempt to reach out to me I am going to remember that for the rest of my life.\u201d &#8211;\u00a0<a href=\"https:\/\/twitter.com\/stevesaylor\"><span style=\"font-weight: 400;\">Steve Saylor<\/span><\/a><\/i><\/b><\/p><\/blockquote>\n<p><span style=\"font-weight: 400;\">The goal of this post is not to <\/span><a href=\"http:\/\/www.includification.com\/\"><span style=\"font-weight: 400;\">include a list<\/span><\/a><span style=\"font-weight: 400;\"> of all available <\/span><a href=\"http:\/\/gameaccessibilityguidelines.com\/\"><span style=\"font-weight: 400;\">game accessibility resources<\/span><\/a><span style=\"font-weight: 400;\">, but here are a few links that can be useful if you want to learn more about the subject: many emotional <\/span><a href=\"https:\/\/www.youtube.com\/watch?v=Ls_CD4mB42s\"><span style=\"font-weight: 400;\">stories<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"http:\/\/ian-hamilton.com\/favourite-game-accessibility-quotes-of-2017\/\"><span style=\"font-weight: 400;\">quotes<\/span><\/a><span style=\"font-weight: 400;\"> and <\/span><a href=\"https:\/\/www.youtube.com\/playlist?list=PLVEo4bPIUOsmhxWT181OPVq9Z1P8Qjf19\"><span style=\"font-weight: 400;\">videos<\/span><\/a><span style=\"font-weight: 400;\"> exist that are great sources of inspiration, knowledge and motivation for enabling accessibility in <\/span><a href=\"https:\/\/www.ted.com\/talks\/jane_mcgonigal_gaming_can_make_a_better_world\"><span style=\"font-weight: 400;\">games<\/span><\/a><span style=\"font-weight: 400;\">. 
There are <\/span><a href=\"https:\/\/www.youtube.com\/watch?v=P7n9s7yBlGw\"><span style=\"font-weight: 400;\">videos<\/span><\/a><span style=\"font-weight: 400;\"> available discussing <\/span><a href=\"https:\/\/www.youtube.com\/watch?v=hQD-IgyWFlM\"><span style=\"font-weight: 400;\">game accessibility<\/span><\/a><span style=\"font-weight: 400;\"> for the blind and <a href=\"https:\/\/www.youtube.com\/watch?v=nmmqarQRSSE&amp;list=PLXcBFfRlLcpiGcMcdCpJ_L1Okqm2uQvrL\">ones<\/a>\u00a0<\/span><a href=\"https:\/\/www.youtube.com\/watch?v=lpDoYgGC9QI\"><span style=\"font-weight: 400;\">showing<\/span><\/a> <a href=\"https:\/\/www.youtube.com\/watch?v=F4eAkdXpU_g\"><span style=\"font-weight: 400;\">video<\/span><\/a> <a href=\"https:\/\/www.youtube.com\/watch?v=vivZNuUih7I\"><span style=\"font-weight: 400;\">games<\/span><\/a> <span style=\"font-weight: 400;\"><a href=\"https:\/\/www.youtube.com\/user\/superblindman\">played<\/a>\u00a0<\/span><span style=\"font-weight: 400;\"><a href=\"https:\/\/youtu.be\/THbVXGulDUE\">by<\/a>\u00a0<\/span><a href=\"https:\/\/www.youtube.com\/user\/snowball\/featured\"><span style=\"font-weight: 400;\">blind<\/span><\/a> <a href=\"https:\/\/www.youtube.com\/user\/illegallysighted\"><span style=\"font-weight: 400;\">gamers<\/span><\/a><span style=\"font-weight: 400;\">. 
I have found the online resources of <\/span><a href=\"https:\/\/twitter.com\/ianhamilton_\"><span style=\"font-weight: 400;\">Ian Hamilton<\/span><\/a><span style=\"font-weight: 400;\"> and <\/span><a href=\"http:\/\/www.smashclay.com\/\"><span style=\"font-weight: 400;\">Adriane Kuzminski<\/span><\/a><span style=\"font-weight: 400;\"> especially useful for this project\u2019s purposes.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Use Cases<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">As mentioned, the main functionality of Accessible Realities is to provide <\/span><a href=\"https:\/\/www.youtube.com\/watch?v=n6kANg1K3nE&amp;feature=youtu.be&amp;t=407\"><span style=\"font-weight: 400;\">locational and instructional accessible audio<\/span><\/a><span style=\"font-weight: 400;\">. The library could be integrated with many types of games and applications and help make them accessible for people who are blind or have low vision. For example: sidescrollers, interactive story games, point-and-click adventures, turn-based strategy games, some 3D action games<\/span><span style=\"font-weight: 400;\">\u00a0and more. It could serve as a research platform (as described in the Future section below). 
It could also be used for Virtual Reality and Mixed Reality, where the computer-generated content could be tagged, or even for Augmented Reality with AI-based object recognition, as shown in Demo 4 above.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In the game itself, the library could be used for many purposes:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Identification of and navigation to 3D elements (both characters and objects) in space<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Orientation and mobility in spaces like buildings, using audio tags on elements like doors, windows, corridors etc.<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Orienting and focusing when there are a lot of elements, using the radar zoom feature<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Exploration &#8211; use slow motion to enable the user to switch between regular play and slow or very slow motion for exploring the level<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Very fast games or scenes could use slow motion<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Instructions &#8211; game-level instructions, instructions specific to an area, and instructions describing the accessibility features. 
All available anytime with one keystroke<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Player mood &#8211; use area-specific scene descriptions to encourage players as they proceed through the game<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Translation to multiple languages by using multiple sets of audio recordings<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Create multiple difficulty levels for blind gamers using different numbers and types of audio tags<\/span><\/li>\n<li style=\"font-weight: 400;\"><span style=\"font-weight: 400;\">Create 3D audio-only games<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">Quite a few of the guidelines from the comprehensive list at <\/span><a href=\"http:\/\/gameaccessibilityguidelines.com\/\"><span style=\"font-weight: 400;\">http:\/\/gameaccessibilityguidelines.com\/<\/span><\/a><span style=\"font-weight: 400;\"> could be implemented this way. This could be a topic for a separate post in which a guideline is quoted and a short video demonstrates its realization using Accessible Realities.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Strengths<\/span><\/h3>\n<h4><span style=\"font-weight: 400;\">Semantic Content<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">One of the advantages of the library is its focus on semantic content. The developer tags the game content for what it is, in terms familiar to the user (castle, chair etc.). In addition to tags, by integrating with the game engine the library has full knowledge of the game world that could be made accessible to the user: locations, velocities, colors, animation states (a very powerful future feature) and more. The library currently does not scan pixels one by one, which could take a lot of time and would require the user to identify the content. 
Nevertheless, there are use cases where this type of scan can be useful (for example, a very detailed and realistic scan of landscapes or art, or a pixel-by-pixel scan of the depth of the scene). This capability would require additional development but could fit well and be easily added using the existing infrastructure provided by the library. I wrote about potential uses of semantic content in images back in <\/span><a href=\"http:\/\/blog.techscouter.net\/techscouter\/seed-semantic-editing-encoding-and-decoding\"><span style=\"font-weight: 400;\">a blog post in 2010<\/span><\/a><span style=\"font-weight: 400;\">. This project is sort of a follow-up to one of the ideas described in that post.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Hardware<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">No special hardware is required. There is no need for any special haptic device or even a mouse; only a keyboard is currently used. In cases where a keyboard is not accessible enough, standard keyboard replacements should fit. For mobile phones one could use special gestures or special on-screen buttons to trigger the scans or other actions.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Audio Recordings<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">Because of the lack of solid text-to-speech support in the game engine, the library takes the practical approach of using audio recordings. This is limited in some ways but could be very powerful in others, such as reusing existing game assets instead of voice (like effects or short music clips), recording voice actors unique to the game for a customized experience, or translating to other languages that are not well supported.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Tags<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">The audio recordings are mapped to tags that the developer assigns to elements in the game (like table, couch, knight etc.) 
either manually, in the case of basic games, or programmatically for more dynamic ones. Advanced developers could add creative uses of tags, for example communicating an element\u2019s state via a dynamic audio tag (like \u201crunning rabbit\u201d or \u201cjumping green monster\u201d). Another usage would be to generate virtual entities with audio tags (like the center of a group of spaceships or of a team of opponents); the virtual entity would mark the group\u2019s center and identity for the user.<\/span><\/p>\n<h3><span style=\"font-weight: 400;\">Weaknesses<\/span><\/h3>\n<h4><span style=\"font-weight: 400;\">Text to Speech<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">Arbitrary text, such as score, health and time indications, is currently not supported. There are advantages to using recorded audio, as mentioned above, but it also has major limitations. The main reason for not having text to speech is the lack of solid universal support for it in the game engine. Text to speech in game engines is a challenge all by itself (even though some options do exist).<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Testing<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">The library was designed so that it can be easily modified, so once testers gave feedback it took little time to incorporate it into the library. But I must say it was quite difficult to find testers with low vision, and it took a lot of time and effort. New testers who are blind or have low vision are very welcome to join the Alpha; this is one of the main goals of this blog post. <\/span><\/p>\n<h4><span style=\"font-weight: 400;\">UI<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">Accessibility for UI (<a href=\"https:\/\/docs.unrealengine.com\/en-us\/Engine\/UMG\">UMG<\/a> in Unreal Engine terms), including menu items, is not supported. 
An example of a plugin that supports this functionality (for Unity) is the <\/span><a href=\"http:\/\/www.metalpopgames.com\/assetstore\/accessibility\/doc\/index.html\"><span style=\"font-weight: 400;\">UI Accessibility Plugin<\/span><\/a><span style=\"font-weight: 400;\"> by <a href=\"http:\/\/metalpopgames.com\/\">MetalPop<\/a><\/span><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"font-weight: 400;\">Future<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">While working on the project I have thought about a few future opportunities; here is a list of some of them. I will be happy to discuss these and other collaborations.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Game Engine Built-In Accessibility<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">It would be a great advancement for accessibility in games and XR if the game engines themselves provided built-in accessibility features like colorblind modes, text to speech, an accessible subtitles library, accessible UI components etc. Hopefully, the <\/span><a href=\"https:\/\/twitter.com\/AbleGamers\/status\/968254164202647559\"><span style=\"font-weight: 400;\">new colorblind mode from Epic Games, which got a lot of excited comments,<\/span><\/a><span style=\"font-weight: 400;\"> is a good sign of things to come. Even better would be to develop a standard, for example an addition to the <\/span><a href=\"https:\/\/www.khronos.org\/openxr\"><span style=\"font-weight: 400;\">OpenXR<\/span><\/a><span style=\"font-weight: 400;\"> standard, that would handle accessibility in the same way for all major game engines and platforms. 
Personally, I would be more than happy either to contribute to<\/span><a href=\"https:\/\/www.unrealengine.com\/en-US\/ue4-on-github\"><span style=\"font-weight: 400;\"> Epic Games\u2019 Unreal Engine<\/span><\/a><span style=\"font-weight: 400;\"> directly (via a coordinated Pull Request) or to work with many others, more capable than me, towards designing and implementing a standard accessibility solution.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Research<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">The features shown above are just one possible implementation. The data gathered by the library, like an object\u2019s identity, coordinates, etc., could be represented in many other ways as well. Off-the-shelf game engines could be fantastic research environments for trying out and experimenting with different approaches to accessibility.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">XR Semantic Object Model<\/span><\/h4>\n<p><span style=\"font-weight: 400;\">In the same way that HTML has the Document Object Model, there could be advantages to developing a cross-platform semantic object model for virtual and real-world realities. One of the main advantages could be making this model accessible to tools such as screen readers.<\/span><\/p>\n<h4><span style=\"font-weight: 400;\">Audio-based Interfaces<\/span><\/h4>\n<blockquote><p><b><i>\u201cThe number of mobile phone users in the world is expected to pass the five billion mark by 2019. [&#8230;] the number of smartphone users in the world is expected to reach 2.7 billion by 2019.\u201d<\/i><\/b><\/p>\n<p><span style=\"font-weight: 400;\">&#8211; <\/span><a href=\"https:\/\/www.statista.com\/statistics\/274774\/forecast-of-mobile-phone-users-worldwide\/\"><span style=\"font-weight: 400;\">statista.com<\/span><\/a><\/p><\/blockquote>\n<p><span style=\"font-weight: 400;\">Unique opportunities exist in specifically solving the challenge of game accessibility for the blind. 
For example, a full or partial solution would (with additional work, of course) open up the market of more than two billion (!) basic mobile phone (non-smartphone) users to the game industry via audio-only, cloud-based interfaces (\u201cdial to play\u201d style). Many of these users are located in developing countries. Lower CPU and GPU demands also mean games could offer an option to run on cheaper computers. Audio-only interfaces could also be integrated into smartphones and voice-based digital assistants. An audio mode for a game should not be seen as a replacement for a high-end graphics display, but it could complement one for certain use cases. Providing accessibility for one specific group often benefits many other groups as well.<\/span><\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"font-weight: 400;\">Call to Action<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">Accessible Realities is currently in Alpha stage. If you are a <\/span><b>person who is blind or has low vision<\/b><span style=\"font-weight: 400;\"> or a <\/span><b>video game company or developer<\/b><span style=\"font-weight: 400;\">, I invite you to join the Alpha of Accessible Realities.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In addition, feedback from <\/span><b>game accessibility experts<\/b><span style=\"font-weight: 400;\"> is very welcome. 
Accessibility for the blind is a complex task; I try hard to improve, and your helpful advice is always welcome.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To join the Alpha or a future Beta, to provide feedback, or for <\/span><b>any other collaboration<\/b><span style=\"font-weight: 400;\">, I\u2019m available on Twitter at <\/span><a href=\"https:\/\/twitter.com\/AccessibleXR\"><b>@AccessibleXR<\/b><\/a> <span style=\"font-weight: 400;\">and via <\/span><a href=\"http:\/\/bit.ly\/axxr-collaboration\"><b>this online form<\/b><\/a><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p>&nbsp;<\/p>\n<h3><span style=\"font-weight: 400;\">Finally, Thank You and Credits<\/span><\/h3>\n<p><span style=\"font-weight: 400;\">Many people have helped and contributed a lot to this project, so I will take this opportunity to say thank you. If the result is still a work in progress, that is entirely my responsibility. It\u2019s because of me and my decisions (and, I would hope, also the non-trivial nature of the challenge \ud83d\ude09) and not in any way because of any of the extremely helpful people below.\u00a0<\/span><span style=\"font-weight: 400;\">So without further ado, a very big \u201cThank You!\u201d to:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Orit, my wife, for her huge support on so many levels.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To my family for their great support and continuous feedback &#8211; to my parents Ilana and Yosef, my sister Lilach and her husband Momi, and the rest of the Shuti crew: Dror, Eyal and Yoav. To Sarah and Hagit, Barak, Yonatan, Amit and Rotem. To my uncles, my aunts and their families, and finally to my late grandmothers, who provided me with inspiration for this social impact project. 
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The Migdal Or organization &#8211; a very big thank you to Rita, Doron and Gabi &#8211; for contributing a lot of very valuable feedback from their vast experience in the field.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ofir &#8211; for testing and providing valuable feedback.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ian Hamilton and Adriane Kuzminski for the great online resources they provide on game accessibility, and specifically on accessibility for the blind community. A special &#8220;Thank You!&#8221; to Adriane for her awesome feedback and comments on an early draft of this post.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A very big thank you for their great advice, testing, words of wisdom and encouragement to the following friends, colleagues and awesome people: Moti, Barak, Amir, Adi, Eylon, Iris, Hilla, Roi, Yochai, Dror, Sharon, Dr. Lahav, Eyal, Amir, Omer, Ori, Michael, Ilan, Eldad.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">And finally, a special thanks to Arie Croitoru, Amir Green and the Unreal Engine Israel Meetup group members.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400;\">Thank you all &#8211; this progress could not have been made without you!<\/span><\/p>\n<p>Zohar (<a href=\"https:\/\/twitter.com\/AccessibleXR\"><b>@AccessibleXR<\/b><\/a>)<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Summary How can we make it easier for video game developers to add accessibility for people who are blind or have low vision? In trying to help answer this question I have developed an accessibility software library for the Unreal Engine 4 game engine. 
This blog post shows the library in action and describes its &hellip; <a href=\"http:\/\/accessiblerealities.com\/blog\/accessible-realities-game-accessibility-for-people-who-are-blind-or-have-low-vision\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Accessible Realities: Making video games and XR accessible for people who are blind or have low vision&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[15,4,13,11,17,21,5,18,14,6,20,10,16,12,7,19,9,8],"class_list":["post-24","post","type-post","status-publish","format-standard","hentry","category-uncategorized","tag-a11y","tag-accessibility","tag-accessible-realities","tag-ar","tag-audio","tag-augmented-reality","tag-blind","tag-game-accessibility","tag-gamedev","tag-low-vision","tag-mixed-reality","tag-mr","tag-ue4","tag-unreal-engine-4","tag-video-games","tag-virtual-reality","tag-vr","tag-xr"],"_links":{"self":[{"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/posts\/24","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/comments?post=24"}],"version-history":[{"count":101,"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/posts\/24\/revisions"}],"predecessor-version":[{"id":160,"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/posts\/24\/revisions\/160"}],"wp:attachment":[{"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/media?parent=24"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href
":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/categories?post=24"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/accessiblerealities.com\/blog\/wp-json\/wp\/v2\/tags?post=24"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}