Virtual Reality was Virtually Real, for About Five Minutes in the Nineties
Every other Wednesday in Fads!Crazes!Panics!, Luke T. Harrington looks at one of the random obsessions to have gripped the public mind in the recent past, and tries, in vain, to make sense of it all.
If you lived through the nineties, odds are very good that when you first heard about the Oculus Rift or PSVR, your response was, “Wait, they’re trying this again?” Those of us currently in our thirties still remember the last time “virtual reality” was predicted to be The Next Big Thing That Would Definitely Set the World on Fire. And then, y’know, it didn’t.
It’s difficult to trace the exact origin of virtual reality since, despite what marketers might tell you, there’s no clear, agreed-upon definition of what virtual reality even is. That didn’t stop local news broadcasts, however, from airing stories announcing, “It’s called ‘virtual reality,’ and it could change the way we work, play, and even live!” Accompanying the typical voiceover would be B-roll footage of the now-iconic setup: people wearing head-mounted displays and staring blindly at their own hands, which were shrouded in gloves that dripped with wires. Even if “What is virtual reality?” was an unanswerable question, there did seem to be a basic layman’s definition: head-mounted displays, haptic feedback, motion controls, polygonal graphics. And for whatever reason, we all thought it was the obvious logical conclusion of computers existing.
That’s just how things were in the nineties. Computer technology was increasing in capability by the minute, and a larger economic boom (note to people under the age of thirty: “economic boom” means a time when people have a lot of money) had left people with piles and piles of cash to spend on really stupid stuff. Accordingly, there were dozens of watchwords: virtual reality, multimedia, cyberspace. None of us really knew what they meant, but we were all certain that they were about to change the world. And, in a perennially future-obsessed culture, the phrase “change the world” always comes with the implied tag “…for the better.”
VR, it’s generally agreed, traces its origins mainly to NASA’s labs, where it was developed for training and simulations. There were other industrial applications as well, including in the military and the automotive industry, but its potential as an amusement device was undeniable. Or, at least, it seemed that way to someone. Accordingly, hardware sold under the name “Virtuality” (it’s a clever portmanteau, you see) stampeded into arcades, showcasing, mainly, a game called Dactyl Nightmare, where up to four players could run around on chessboard-inspired terrain and shoot at pterodactyls, because sure, why not. It was…well, it was a few minutes of motion sickness that you paid five bucks for, but people were excited about it.
It was successful in the arcades—at least for a hot minute—but the holy grail, of course, was bringing it home. In retrospect, this seems insane by its very nature: even if home VR set peddlers could solve obvious problems like motion sickness, the whole thing made almost no sense as a consumer product. Who was this mythical individual who could block out hours of time to be dead to the real world in order to explore virtual ones? This individual who could devote an entire room of his or her house to fumbling around covered in wires, to say nothing of spending hundreds or even thousands of dollars on hardware like head-mounted displays, motion controllers, cameras, and treadmills? This individual who couldn’t wait to make gaming a wholly solitary and extremely awkward experience? No one had any answer to these questions, but in the nineties we made electronics first and asked questions later.
As it turns out, though, while it’s possible to conceive of all sorts of experiences, packing them into affordable, user-friendly consumer products is another matter entirely. Sega announced a VR headset for its (very-two-dimensional) Genesis console, but abruptly cancelled it when it learned that it, like basically all the hardware Sega released in the early nineties, tended to make people physically ill. Atari similarly announced a VR attachment for its Jaguar, but then wisely put the Jaguar out of its misery instead. The only product that made it to market was one that failed and was swept under the rug so quickly that, unless you were ten years old in 1995, you’ve probably never heard of it: the Nintendo Virtual Boy. The Virtual Boy itself was the perfect illustration of how the thinking had outstripped the technology by at least a dozen years: its graphics were stereoscopic, but it couldn’t really do 3D; its “head-mounted display” was actually a table-mounted display; its controller was just a controller and not one of those weird glove things we were all excited about for some reason. And that was to say nothing of the name, which took all the non sequitur of the name “Game Boy” and confused it further. (Is a “virtual boy” a lifelike robot, or…?)
After the debacle of the Virtual Boy, the virtual reality craze seemingly disappeared overnight, as people realized that simulated 3D environments were just as easily explorable on a TV or computer monitor as they were with a dozen wires strapped to your head. Maybe there was a bit of a learning curve to moving the camera with a second joystick, but no more of one than there was to swinging a head-mounted display around without losing your lunch. Accordingly, here in the future, most of us continue to get our videogames through the standard black-plastic-box-hooked-up-to-our-TV, which will surely disappoint any time travelers from the early nineties who visit us. And sure, Oculus and PSVR…exist…but I don’t think I’ve ever met anyone who owns either.
Ultimately, the problem with VR was that it, like so much that happened in the nineties, put technology ahead of content. “Immersion,” after all, comes from high-quality content, not cutting-edge technology. We’ve all had the experience of getting so lost in a movie that we forgot there was a world outside the screen—even if we were watching on a tiny TV screen or a phone. Even a good novel—a medium that you’ll notice makes no attempt to simulate reality at all—can pull you in and “immerse” you in its world. If the content is good, the immersion follows; if the content is bad, no amount of technology can save it.
A human being, after all, is not just a body; nor is he or she merely a mind. Convincing the mind or the body that it actually is getting attacked by pterodactyls is a cool trick, but it’s no substitute for stirring the soul—something all good films, books, and yes, games do. Technology, at its best, can connect people—as we’re all learning now. At its worst, it just makes us flail around with stupid things on our faces.