In the second half of March, tens of thousands of industry professionals met in San Francisco for the 33rd edition of the Game Developers Conference. Amongst the attendees were hundreds of accessibility professionals, advocates, academics, and students, all working together and with the broader developer community to remove unnecessary barriers in video games.
Since its inception in 2003, the IGDA Game Accessibility Special Interest Group (IGDA-GASIG) has used GDC as its home base of operations. Each year, the group hosts a roundtable session at the conference to discuss the state of accessibility in games, identify where work needs to be done to increase awareness, and collectively assess the trajectory of the industry in relation to accessibility. For the last three editions, we have also concurrently hosted the Game Accessibility Conference (GAConf) with GDC. (Information, resources and videos related to GAConf 2019 are forthcoming and will be posted to www.gaconf.com.)
There were two planned topics at this year’s roundtable: Microsoft’s new Adaptive Controller for the Xbox One, and the CVAA—or Communications and Video Accessibility Act—which finally took full effect for the games industry this January. However, like any healthy discussion, other topics arose and were explored together as a group. These included: 1) how accessible game design contributes to improved game design; 2) how to effectively implement subtitles in games; and 3) how to incorporate assistive gameplay modes into competitive experiences.
Given the various experience levels of the people in attendance—which ranged from university student to industry veteran—efforts were made to present the subject matter in a way where everyone could participate. This meant laying a groundwork for each subject so that any attendee could ask questions or provide ideas that would contribute to the discussion and influence its direction.
Microsoft Adaptive Controller
Released in September 2018, the Microsoft Adaptive Controller represents a monumental advancement in accessibility for the games industry. While assistive controls have existed for some time in different forms, never before has such a comprehensive accessibility package been provided by a gaming platform. From the controller design and its ability to support additional input devices, to the hardware’s full integration with the Xbox operating system through the Xbox Accessories app, an entire console library became vastly more accessible overnight. Publicity surrounding the controller also increased the general public’s awareness of accessible gaming: if Microsoft’s holiday ad created an explosion, then its Super Bowl ad was the detonation of a megaton bomb. 103.4 million viewers learned about games accessibility that day, and celebrities like Cher continued to increase exposure by tweeting about the device.
The design of the controller is unique and can be confusing to those who are unfamiliar with existing assistive controls and the power of controller remapping. When compared to the standard Xbox One controller, only a handful of features are recognizable: the d-pad, menu button, view button, and home button. Everything else is foreign: two large circular buttons, the USB ports on the left and right sides, and the nineteen 3.5mm jacks on the back. The two circular buttons are built-in adaptive switches that can be pressed with multiple fingers, the palm of the hand, an elbow, or even a chin. The USB ports allow HID controllers, such as analog sticks, to be attached. Finally, the 3.5mm jacks allow up to nineteen third-party adaptive switches to be connected. Each input device is customizable through the Xbox Accessories app, and players can design an entire control scheme that works best for them. They can also create different controller profiles for different games, or even multiple profiles for the same game.
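To make the remapping model concrete, here is a minimal sketch of how a per-game controller profile might be represented. This is purely illustrative Python; the class, field, and input names are assumptions, not Microsoft’s actual data model.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a per-game controller profile, loosely modeled on
# the kind of configuration the Xbox Accessories app exposes. All names
# here are illustrative.
@dataclass
class ControllerProfile:
    name: str                                    # e.g. "One-switch racing"
    game: str                                    # profiles can be tied to a game
    mapping: dict = field(default_factory=dict)  # physical input -> game action

    def remap(self, physical_input: str, action: str) -> None:
        """Assign a logical game action to any physical input."""
        self.mapping[physical_input] = action

    def resolve(self, physical_input: str):
        """Translate a physical press into the action the player chose."""
        return self.mapping.get(physical_input)

# A player might route the large left pad to "accelerate" and a 3.5mm
# switch plugged into jack 4 to "shift":
profile = ControllerProfile(name="One-switch racing", game="ExampleKart")
profile.remap("pad_left", "accelerate")
profile.remap("jack_04", "shift")
```

Because the mapping lives in a profile rather than in the game, the same title can be driven by completely different physical setups without any game-side changes.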
After introducing the device, accessibility specialists in attendance were quick to point out that, powerful as it may be, the Adaptive Controller is not a solution for all accessibility issues; developers still need to focus on the experience of the player and remove as many barriers as possible. While the controller has afforded physically impaired gamers the opportunity to enjoy titles that were previously difficult or impossible to play, it truly excels with accessible games (that is, titles where the developers identified barriers during the design process and avoided or removed them). With the advent of this device, the need for accessibility considerations has only increased—there are now many more players who have progressed past the hardware barriers that previously existed and are now encountering barriers within the games instead.
Some developers look at the controller and erroneously regard it as a source of more work for their project: more programming, more testing, and more retooling. Aside from a possible increase in design considerations, which any accessible or well-designed game requires anyway, no additional effort is needed. To support the Adaptive Controller, do not think about the device or the technology behind it. Instead, think about the people who would be using it: those who have limited reach, limited mobility, limited endurance, limited capacity, and/or limited strength.
An attendee asked if supporting the Adaptive Controller complicates Microsoft’s certification process. The answer is no, it does not, because the first-party Xbox Accessories app provides all of the functionality required to use the controller. In other words, the app drives the controller and does so without creating any additional burden on developers. At a technical level, the Adaptive Controller is no different than a regular Xbox One controller.
Communications and Video Accessibility Act (CVAA)
On January 1st, 2019, the video games industry finally fell under the umbrella of the Communications and Video Accessibility Act (CVAA). The FCC had granted a series of waivers since 2012 to allow for necessary R&D time, but these expired at the end of last year. Now, all video game software and hardware featuring online communication must comply with the accessibility expectations outlined by the CVAA. This marks a significant shift for the industry, as the legislation covers a wide range of accessibility accommodations. Not surprisingly, developers, publishers and hardware manufacturers are now attempting to safely navigate this space and ensure that their products aren’t violating the new communications laws.
The beginnings of the CVAA date back to 1990, when the Americans with Disabilities Act established that communication is a fundamental entitlement for people with disabilities. In just a few short years, the Internet became available to the masses and drastically augmented people’s ability to communicate; this also had the unfortunate effect of creating many loopholes in the landmark legislation. It wasn’t until the passing of the CVAA in 2010 that new laws were instituted for text messaging, voice over IP, and video conferencing.
The CVAA stipulates that any technology or software using the aforementioned methods of communication must be accessible, as much as is reasonably possible, to those who are deaf or hard of hearing, are blind or have low vision, have limited strength and/or mobility, etc. The FCC determines what is “reasonably possible,” taking into account the size of a company, the amount of resources it has, and how far along in development a product was when the compliance deadline hit on January 1st, 2019. (Note that a full exemption does not exist for companies below a certain size.) It’s also worth pointing out that the CVAA isn’t just for video games: any electronic device or software communicating over the Internet must comply, so the games industry is just one of many that are affected.
So what does this all mean? Ambiguity exists because the CVAA does not require a particular list of features to be implemented and instead specifies end results and performance objectives. This affords developers a great deal of flexibility to solve accessibility problems according to what fits their products best, but it also produces unique challenges because there isn’t a simple checklist that can be followed. There are only two procedural requirements: consider accessibility early in the design process, and involve people with disabilities and accessibility experts to provide experiential feedback.
As with any new legislation, there is still a lot of grey area, and it will take time for developers to establish conventions that satisfy the expectations outlined by the CVAA. In general, if your company is actively working towards compliance then you probably don’t have much to worry about. Conversely, if your company is not working accessibility into its development processes then you certainly increase the likelihood of somebody notifying the FCC of your oversight. In the web accessibility space, recent years have seen a spike in ADA Title III lawsuits against companies for not providing accessible websites. Fortunately, the CVAA takes a very different approach and requires that all complaints be filed through the FCC rather than independently. In other words, companies that fall under the CVAA cannot have legal filings brought against them by individuals. After an issue is raised, the FCC acts as a mediator and opens a dialogue between the consumer and the company, with a 30-day window to find a resolution. After 30 days have passed, the consumer can choose to extend the window or file a formal complaint—the latter of which can ultimately lead to the levying of significant fines. For more detailed information on this process, refer to the article Demystifying CVAA.
It is important to emphasize that the CVAA encompasses accessibility requirements for online communication only; the FCC has no jurisdiction over other aspects of the game development process. This means that accessible game design, while incredibly important, is not a part of the CVAA. To summarize, if any form of chat is included as a feature in a game then players with a wide range of disabilities must be able to communicate with others. These same players must also be able to navigate through any menus and interfaces to use these communication features.
An attendee from Sweden stated that the game she is working on is not accessible and then asked a series of questions: 1) What can be done to persuade her company to prioritize accessibility development efforts? 2) How does the CVAA affect her game? 3) Does equivalent legislation exist in Europe? Firstly, generating dialogue about the CVAA serves as an effective starting point. Since consequences now exist for not making a game or gaming device accessible, never before has there been such a strong argument for developers and publishers to make accessibility a priority. Producers can also take an increased role in this process because their day-to-day work is fundamentally about creating solutions. It helps to understand that generally everyone wants to help with accessibility, but people don’t always realize that they actually can help. Creating dialogue within the company establishes that these needs exist, and then it falls on the producer to point each person or group toward the next action. Secondly, even if a company is based in another part of the world, its product must comply with CVAA accessibility requirements if it is to be sold in the United States; being based in another country does not shield a product from complaints filed through the processes established by the FCC. Finally, in 2019 the European Union passed the European Accessibility Act, a broad piece of legislation that includes provisions equivalent to the CVAA. Thus, adjusting your game development pipeline for the CVAA also works to satisfy similar or equivalent laws in other countries.
An attendee from Epic Games asked: “What kinds of strategies exist for meeting the ‘reasonable actions’ towards compliance?” As mentioned above, this depends entirely on how many resources a company has available. For a company the size of Epic, expectations would be high and the FCC would not be very lenient; larger games companies are better equipped to tackle these types of problems and so are held to a much higher standard than small- or medium-sized ones. This is again where the grey area lies, because the FCC treats each case differently and has the final say on what compliance really means. But in general, the more effort you put in, the lower the likelihood of anything getting out of hand.
We concluded our conversation on the CVAA with one more attendee question: “Is it okay to use third party products/services to provide communication functionality?” The answer is straightforward: developers are the entities with whom consumers have relationships. If a developer chooses to integrate a third party communication service, the developer is the one liable for any accessibility issues with that service; liability cannot be delegated.
Accessibility Promotes Improved Game Design
From the Xbox Adaptive Controller, the discussion gravitated toward a more general philosophy: accessible/inclusive design typically results in improved game design. It is a simple yet powerful idea that accessibility considerations can enrich the experience for everyone. The additional time, energy and money invested in making a game more accessible will actually benefit all who play it, not just a smaller demographic of people with permanent physical impairments.
One immediate example is remappable controls. This feature allows every player to create a configuration that best fits their play style. Another example is using foreground colors that contrast well with background objects, which helps everyone discern visual details, visually impaired or not. A more profound example is avoiding controls that require unreasonable physical exertion, such as button mashing. At the very least, these types of controls cause fatigue, but they can also create new physical impairments through repetitive stress injuries. Yes, your game could actually end up hurting someone! Depending on the person, their gaming habits, and their awareness of the physical stress being endured, these injuries may not heal completely.
Developers must ask themselves if it is really necessary to require players to press a button rapidly, or hold down a button for an extended duration of time. Can the same effect be achieved another way? Requiring complicated controller inputs to progress through a game may create an unpleasant experience for your audience, and these negative experiences are only amplified for players who have physical impairments. In some cases, complicated controls are necessary, but more often than not an alternative approach exists that is equally functional and will improve the experience for everyone.
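As an illustration of the alternative-input idea, the sketch below drives the same “fill a meter” effect either by repeated presses or by a simple hold, selectable as an option. All names, rates, and thresholds are hypothetical, chosen only to show the decoupling of effect from input.

```python
# Minimal sketch of decoupling a game effect from the physical input that
# triggers it. The same meter can be driven by rapid presses ("mash") or
# by simply holding the button ("hold"), avoiding repetitive strain.
def meter_fill(events, mode="mash", dt=1/60, press_gain=0.1, hold_rate=0.6):
    """events: per-frame booleans, True if the button is down that frame."""
    level = 0.0
    prev_down = False
    for down in events:
        if mode == "mash":
            # credit only the press edge, so mashing is required
            if down and not prev_down:
                level += press_gain
        else:  # "hold" mode: same meter, no repeated presses needed
            if down:
                level += hold_rate * dt
        prev_down = down
        level = min(level, 1.0)
    return level
```

With these example rates, ten distinct presses and roughly 100 frames (about 1.7 seconds) of holding both fill the meter completely, so neither play style has a mechanical advantage.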
One particular accessibility feature that contributes to a better overall game experience is subtitles. With a little consideration, it’s not hard to imagine how subtitles can benefit everyone: a baby is sleeping in the next room, a person is traveling on a busy train, reading on-screen text helps the player follow the progression of the storyline, and so on. They are widely used by people who are not deaf or hard of hearing. In fact, Assassin’s Creed: Origins tracked subtitle usage and found that 60% of players had them enabled—well more than half of all who played the game!
Perhaps even more important than the inclusion of subtitles is how they are presented on-screen. If the text is too small, if it is relegated to a poor location on the screen, and/or if it does not contrast properly with background graphics, then at best the feature will be a source of frustration, and at worst it will be unusable. Other misimplementations exist, such as having too much text appear at once. An excessive amount of information can distract the player and interrupt the flow of the game. It also allows for the unfortunate possibility of important details being revealed before the associated events or actions occur.
There is an art to the subtitling process, and it can require some finesse to create an optimal experience. On one hand, the television and movie industries have been doing it for decades and have well-established conventions for representing humor, sarcasm, accents, music, etc. through text (the BBC offers thorough subtitle guidelines on GitHub). On the other hand, TV and film have it much easier because both mediums are non-interactive. Games, of course, are highly interactive and can branch in many different directions: a scene can change dramatically from one moment to the next based on a player’s decisions. Fortunately, there are solutions to these problems. For example, parenthesized text can communicate background noises to a player—such as the crackling of a nearby fire or the caterwauling of a creature lurking in the darkness. This text can be displayed simultaneously with dialogue and activated contextually according to the position of the player character in relation to her surroundings, just as voice recordings and sound effects are. Refer to the article How to do subtitles well: basics and good practices for more information on how to effectively implement subtitles in games.
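That kind of contextual activation can be sketched very simply: a sound source emits its parenthesized caption only when the player character is close enough to hear it. The positions, range, and function names below are illustrative assumptions, not any engine’s actual API.

```python
import math

# Sketch of contextually activated captions for background audio: each
# sound source carries a caption, and we surface it only when the player
# is within the (hypothetical) audible range.
def ambient_captions(player_pos, sources, max_range=10.0):
    """Return parenthesized captions for every audible background sound.

    player_pos: (x, y) tuple
    sources: list of dicts with 'pos' and 'caption' keys
    """
    lines = []
    for s in sources:
        dx = s["pos"][0] - player_pos[0]
        dy = s["pos"][1] - player_pos[1]
        if math.hypot(dx, dy) <= max_range:
            lines.append(f"({s['caption']})")
    return lines

sources = [
    {"pos": (3.0, 4.0), "caption": "fire crackling"},
    {"pos": (40.0, 0.0), "caption": "creature snarling in the dark"},
]
```

Run each frame (or on movement events), this produces the same set of ambient cues the audio mix would deliver, so captions stay synchronized with what a hearing player would actually perceive.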
An attendee shared that, in her experience, subtitles are not always the same as what is spoken, and this can be disorienting to a player. These inaccuracies exist because voice actors use the script they are given as a foundation and then ad-lib to get a more organic performance. Because subtitles typically follow the original script, conventional development pipelines result in a natural disconnect between what is spoken and the text displayed on-screen. Not surprisingly, additional team coordination is required to remedy this issue: send the original script to your voice actors, expect that there will be changes through the creative process, then bring back an updated script and reconcile it with the subtitle text database. It is also necessary to involve any cinematics editors who might be cutting lines to make a cutscene more impactful, as they don’t always know that the script has changed.
The same attendee also mentioned that if a character speaks rapidly, subtitles can become difficult to read. Captions frequently incorporate paraphrasing, but it’s important to note that a significant percentage of the people who enable subtitles also have cognitive disabilities; these players may not be able to process speech at normal speed, especially when the speech is coupled with animation. This means that when implementing captions, it is actually possible to create a new barrier while attempting to remove another! If you paraphrase when it isn’t necessary, you may be creating additional challenges for your audience. Typically, it’s best to paraphrase only when there really is too much being communicated in a short amount of time, and to keep captions as close as possible to what is actually being said. Another important consideration: if a character is speaking so rapidly that the associated subtitles are difficult to follow, then it’s quite possible the words themselves are unintelligible. In that case, fast-moving captions effectively replicate the true experience, and this can be an acceptable outcome.
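One way to catch the lines that genuinely need paraphrasing is an automated reading-speed check over the subtitle database. The sketch below uses characters per second as a rough proxy; the 20 cps threshold is an assumption for illustration (published guidelines vary), so tune it for your audience and verify with playtesting.

```python
# Rough sketch of flagging subtitle lines that exceed a comfortable
# reading speed. A flagged line is a candidate for paraphrasing or for a
# longer display time; the threshold here is illustrative, not a standard.
def too_fast(text, duration_seconds, max_cps=20.0):
    """True if this subtitle likely needs paraphrasing or more screen time."""
    if duration_seconds <= 0:
        return True  # zero-length display is always a bug
    return len(text) / duration_seconds > max_cps

# "Get down!" shown for one second reads comfortably; a forty-character
# line crammed into the same second does not.
```

Running a check like this as part of a content build keeps paraphrasing targeted at the lines that actually need it, instead of being applied everywhere by habit.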
Competitive Titles with Assistive Gameplay Modes
A concerned attendee had developed a competitive racing game for mobile phones. After its release, he was contacted by a quadriplegic player requesting that the game be more accessible. He and his team considered implementing an automatic shifting feature, but they were concerned that if it was made available to everyone then too many people would “cheat” and thus spoil the game’s competitive spirit. The developer wanted to somehow accommodate this person and others with physical impairments, but didn’t know how to do so without “breaking” his game. The only workable solution he could think of was to require players to provide proof of their physical disability in order to gain access to this assistive gameplay mode—not a very reasonable solution. He knew that this would be problematic, but could not find another path forward.
An accessibility specialist posited that a feature cannot be considered cheating if it is available to all players, and then referenced the assistive gameplay modes available in Mario Kart 8. For this title, Nintendo implemented auto-steering and auto-acceleration, and allowed players to enable these modes in the game’s options menu. When competing online, a player using one or both of these assistive modes is shown with a small antenna projecting from the back of the kart. For players with impairments, these modes help to level the playing field; for everyone else, they provide an alternative play-style that some may find preferable to the default experience.
It was revealed that the mobile racing game in question provides players with only two inputs—acceleration and shifting—and the cornerstone of the experience is using these two controls with precision to compete with other players. The developer indicated that any further simplification was not possible without fundamentally changing the game. Also, he questioned whether it is okay to reveal to other players that someone has an assistive mode enabled as it may generate unwanted attention.
Ultimately, it was determined that the best solution was to make auto-shift an option for all players, while also allowing competitors to filter out players using a different control scheme during matchmaking. This caters to all parties by preserving the original experience while offering important accessibility accommodations to those who need or want them.
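A minimal sketch of that matchmaking rule, with hypothetical names throughout: auto-shift is open to everyone, and filtering by control scheme is opt-in for either side of a match.

```python
# Sketch of the matchmaking approach described above. Each player record
# carries their chosen control scheme ('auto_shift') and whether they want
# to be matched only against the same scheme ('filter_scheme').
def find_opponents(player, pool):
    """Return the entries in `pool` this player can be matched with."""
    matches = []
    for other in pool:
        # if either side opted into like-for-like matching, the control
        # schemes must agree; otherwise everyone plays together
        if (player["filter_scheme"] or other["filter_scheme"]) and \
                player["auto_shift"] != other["auto_shift"]:
            continue
        matches.append(other)
    return matches

manual_purist = {"name": "A", "auto_shift": False, "filter_scheme": True}
auto_player = {"name": "B", "auto_shift": True, "filter_scheme": False}
manual_open = {"name": "C", "auto_shift": False, "filter_scheme": False}
```

Because the filter is symmetric and opt-in, players who never touch the option are matched exactly as before, while those who care about scheme parity can get it without anyone being excluded from the game.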
Thank you to the following participants for their contributions to such a fantastic discussion on game accessibility: Ian Hamilton (Independent), Cherry Thompson (Independent), Sam Thompson (Sony Worldwide Studios), Douglas Pennant (Creative Assembly), Karen Stevens (Electronic Arts), Chad Philip Johnson (Anacronist Software), and the 40+ additional people who attended this year’s roundtable!