Esports is a growing industry that has begun to reach the mainstream with A-Tier professional scenes such as League of Legends, Counter-Strike: Global Offensive, and Fortnite. Today, with multi-million dollar contracts and international events becoming more common, millions of dollars are being invested in establishing esports as a professional industry. Preceding this success, though, were the grassroots origins of passionate communities that ran LAN tournaments out of houses. No community exemplifies this passion more than the Fighting Game Community (FGC). As the FGC's professional scene grows, the community itself must grow in professionalism and inclusivity as well in order to thrive.
Fighting games are a genre held in high regard for the pure mechanical skill and reactions they showcase. The high skill ceiling they are known for, though, relies heavily on visually recognizing actions on screen in order to decide how to respond. As a result, fighting games are not currently designed for competitors who have visual impairments. Inspired by the YouTuber novriltataki, we have created a “Blind Mode” concept for a more inclusive gaming experience. Some central ideas from his video include:
"While fighting games have many useful sounds that indicate what is going on, most of the information is presented visually. By converting the visual information to audio, blind players will be able to grasp what's going on a lot better."
If we apply the Pareto Principle to this problem, we can assume that 20% of the visual information holds 80% of the overall importance, while the remaining 80% adds only 20%. Based on our knowledge of fighting games, we believe this crucial 20% is your character's position on screen relative to the opponent.
What It Does
The main goal of “Blind Mode” is to give players with visual impairments the most important information in a fighting game. Our “Blind Mode” uses stereo sound to supplement the key visual information in the base game. The information we have translated into audio cues is the player's distance from the opponent, the opponent's vertical distance from the ground (during jumps and juggles), and health point (HP) status.
How We Built It
We built our concept on a Street Fighter III: 3rd Strike ROM running in the FinalBurnNeo emulator, the primary emulator used by today's 3rd Strike players to play online. Using a Lua script, we locate the important values (player positions and health) in the ROM's raw memory. The Lua script writes these values to a text file, which serves as input for our Python script. The Python script then determines when and which sounds are played, and it adjusts the speed and pitch of the sound bites based on the incoming data. Speed adjustment expresses the horizontal (x-axis) distance between players: a metronome-like track, played throughout each round, rises in beats per minute (BPM) as the two players close in. Pitch adjustment reflects changes in the opponent's y-axis position: a note sustains as long as the opponent's y-coordinate is nonzero, and its pitch modulates upward the higher the opponent is, tracking the opponent's jump arc.
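The mapping from game state to audio parameters can be sketched as below. The value ranges, constants, and function names are illustrative assumptions for this sketch, not our exact script:

```python
# Sketch of the cue-mapping logic: horizontal distance -> metronome BPM,
# opponent height -> pitch multiplier. All constants are assumed values.

MAX_DISTANCE = 400.0        # assumed max horizontal gap between players, in pixels
MIN_BPM, MAX_BPM = 60, 240  # assumed metronome range

def distance_to_bpm(p1_x: float, p2_x: float) -> float:
    """Closer players -> faster metronome. Gap is clamped to MAX_DISTANCE."""
    gap = min(abs(p1_x - p2_x), MAX_DISTANCE)
    closeness = 1.0 - gap / MAX_DISTANCE   # 0 = far apart, 1 = touching
    return MIN_BPM + closeness * (MAX_BPM - MIN_BPM)

def height_to_pitch(opponent_y: float, max_jump: float = 120.0) -> float:
    """Opponent airborne -> sustained note pitched up with jump height.
    Returns a pitch multiplier (0.0 = grounded/silent, up to 2.0 = one octave up)."""
    if opponent_y <= 0:
        return 0.0
    return 1.0 + min(opponent_y, max_jump) / max_jump
```

A driver loop would poll the text file each frame, feed the parsed positions through these functions, and hand the resulting BPM and pitch values to the audio layer.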
Challenges We Ran Into
Finding the memory values for the ROM was harder than expected. There is no public memory map, so we relied on example scripts from Street Fighter community leaders to slowly locate the memory addresses for the health values and stage positions. Additionally, when using aupyom, an open-source pitch and speed adjustment library, we discovered that it only supports mono sound. We modified it slightly to support stereo sound in our driver program.
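The gist of the stereo fix can be sketched with NumPy: run a mono-only effect on each channel independently, then re-stack the channels. This is a minimal illustration of the approach, not aupyom's actual internals:

```python
import numpy as np

def process_stereo(samples: np.ndarray, mono_fn) -> np.ndarray:
    """Apply a mono-only effect (e.g. a pitch or speed shifter) to stereo audio.

    `samples` is a (n_frames, 2) array; `mono_fn` maps a 1-D channel to a
    1-D channel. Each channel is processed separately and then re-stacked
    into a (n_frames, 2) stereo array.
    """
    left = mono_fn(samples[:, 0])
    right = mono_fn(samples[:, 1])
    return np.stack([left, right], axis=1)
```

For effects that change the number of frames (like time-stretching), both channels go through the same transform, so their lengths stay matched and stacking remains valid.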
Accomplishments that We’re Proud of
As a part of the competitive gaming community, we are extremely proud to contribute to creating a more inclusive environment for those with visual impairments. Even if this is not a complete product for every fighting game, we believe the fundamentals of our system can be adapted to modern games, such as Street Fighter 5, Guilty Gear Strive, Tekken 7, Mortal Kombat 11, and more. With professional players like SonicFox beginning to promote inclusivity within the FGC, there is no better time than now to curate tools for all demographics to enjoy these games.
What We Learned
Through Lua, we became familiar with how a ROM works with an emulator, and we came to appreciate how cleverly developers worked within extreme memory constraints when creating these games. While solving our stereo sound issue, we learned how sound cards read in data and how a .wav file's raw audio data is stored and processed. Our solution also involved learning some basic NumPy for matrix operations, since stereo audio is stored as two-dimensional NumPy arrays. We also practiced designing for a blind audience by making our controls friendly to the visually impaired.
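As an illustration of how that .wav data is laid out, here is a minimal sketch using Python's standard wave module: 16-bit PCM stereo frames are interleaved on disk (L0 R0 L1 R1 ...), so reshaping the raw buffer recovers one NumPy column per channel. The function name is ours, and the sketch assumes 16-bit samples:

```python
import wave

import numpy as np

def read_wav_stereo(path: str) -> np.ndarray:
    """Read a 16-bit PCM .wav into a (n_frames, n_channels) int16 array.

    Frames are interleaved on disk, one sample per channel per frame,
    so a plain reshape separates the channels into columns.
    """
    with wave.open(path, "rb") as wf:
        assert wf.getsampwidth() == 2, "sketch assumes 16-bit samples"
        raw = wf.readframes(wf.getnframes())
        data = np.frombuffer(raw, dtype=np.int16)
        return data.reshape(-1, wf.getnchannels())
```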
What’s Next for Blind Mode
To continue this project, we foresee four main improvements: more precise stage-position indicators, lower latency, more customization options, and less repetitive sound indicators. As of right now, we do not indicate where on the stage a player is, only their distance from the opponent. One approach would be to divide the stage into “zones” that each play a unique sound bite when crossed. We would also want to port our Python script to a faster language, such as C++, to reduce latency; given this event's time constraints, we chose Python for our familiarity with it, but a C++ rewrite would improve responsiveness. We would also like to give users more options to customize their experience, such as volume sliders for each sound bite, and making those menus blind-accessible would be just as important. Finally, we would want to replace the repetitive distance (x-axis) indicator. While the metronome-like track worked, we believe there are options that could convey the same spatial information in a more appealing way.
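The stage-zone idea could be prototyped as a small lookup that fires a cue only on a boundary crossing. The zone names, zone count, and stage width below are hypothetical:

```python
# Sketch of stage "zones": divide the stage into named regions and
# report a cue only when the player crosses into a new one.
# All constants here are assumed values for illustration.

STAGE_WIDTH = 800.0  # assumed stage width in pixels
ZONES = ["left corner", "left mid", "center", "right mid", "right corner"]

def zone_for(x: float) -> str:
    """Map an x-coordinate to its named zone, clamped to the stage edges."""
    index = min(int(x / STAGE_WIDTH * len(ZONES)), len(ZONES) - 1)
    return ZONES[max(index, 0)]

def zone_cue(prev_zone: str, x: float):
    """Return (current_zone, cue): cue is the zone name on a crossing, else None."""
    zone = zone_for(x)
    cue = zone if zone != prev_zone else None
    return zone, cue
```

Each zone's sound bite would play once on entry rather than continuously, keeping the indicator informative without adding to the constant audio stream.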