Sitting too long lately has worsened my previously improving spinal condition, which prompted me to consider what exactly was hindering my recovery. I’ve also noticed that everyone else in the gaming department is dealing with some form of suboptimal health: some have bad knees, others suffer from frequent migraines, and some have terrible sleeping patterns. It seems that the people around me who work with computer games every day are prone to sub-health.
This phenomenon reminds me of the WHO article on Gaming Disorder. The potential disorders that gaming and the metaverse bring to humanity have been widely identified and acknowledged. I believe that my classmates' and my own sub-health can also be considered a form of Gaming Disorder in a broader sense. Unlike the narrower definition of Gaming Disorder, our condition isn’t necessarily caused by playing too many games, but by exposure to excessive amounts of digitally produced content and virtual-world information while producing those games. I would like to call this broader form of Gaming Disorder "metaverse exhaustion."
As a victim of metaverse exhaustion and a practitioner in the new media industry, I wanted to find a way to address this problem. I believe that the perceptual characteristics of the visually impaired can offer us a brand-new perspective on confronting metaverse exhaustion. When we face a screen, our senses are confined to a virtual world that is predominantly visual and auditory: our five human senses are "flattened" into two. The metaverse does not give us an experience that fully satisfies our sensory needs, so prolonged contact with it leaves us in a state of long-term sensory downgrading. In my opinion, this long-term sensory downgrading contributes significantly to metaverse exhaustion.
Similarly, the visually impaired experience the long-term loss of one sense, namely vision. This forces them, by choice or by necessity, to rely on other senses and abilities, such as memory, hearing, and touch, to carry out their daily activities. This inspired me to adjust the sensory experience in the design of my own metaverse. Would the experience improve if we reduced our over-reliance on sight and enlisted other senses for gameplay in the metaverse?
To explore this, I designed "Oops! Lights Out!," a game based on the perceptual characteristics of the visually impaired. The game tells the story of a little bird octopus who must escape her haunted apartment after the lights go out, avoiding capture by a ghost. In the game, I designed three non-visual interactions:
Auditory: Once the lights go out, the player can no longer see the ghost and must judge its position and evade it using designed 3D sound effects.
Memory: The scenes contain an invisible ghost-fire wall; the bird octopus dies if she touches it. Players must learn the wall's position from memory through repeated trial and error.
Haptic: To avoid overburdening the player's memory, the game ships with an external heating pad. When the player is about to walk into the wall directly ahead, the pad heats up.
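The two sensor-driven interactions above can be reduced to simple feedback rules: the pad's heat scales with proximity to the wall, and the ghost's sound is panned to the side it is on. Here is a minimal sketch of that logic; all names (`HEAT_RADIUS`, `heat_level`, `pan_for_ghost`) and the 2D-coordinate setup are my own illustrative assumptions, not the actual implementation of "Oops! Lights Out!":

```python
import math

# Illustrative sketch only: names and coordinates are hypothetical,
# not taken from the real game's code.

HEAT_RADIUS = 2.0  # distance (in scene units) at which the pad starts warming


def heat_level(player_pos, wall_x):
    """Pad intensity in [0, 1]: hotter as the player nears the wall ahead."""
    distance = abs(wall_x - player_pos[0])
    if distance >= HEAT_RADIUS:
        return 0.0
    return 1.0 - distance / HEAT_RADIUS


def pan_for_ghost(player_pos, ghost_pos):
    """Return (left_gain, right_gain) so the ghost is heard from its side."""
    # Angle of the ghost relative to the player's forward (+y) direction.
    angle = math.atan2(ghost_pos[0] - player_pos[0],
                       ghost_pos[1] - player_pos[1])
    pan = math.sin(angle)  # -1 = fully left, +1 = fully right
    return (1.0 - pan) / 2.0, (1.0 + pan) / 2.0
```

In practice the heat value would drive the pad's controller and the gain pair would feed a stereo (or full HRTF-based 3D) audio engine, but the design idea is just this: continuous, non-visual signals that substitute for sight.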