The following blog post, unless otherwise noted, was written by a member of Gamasutra's community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.
In some ways, The Gallery began in 1992. My first experience in virtual reality was that year, with the aptly named Virtuality—an early VR platform steeped in the cheese of late-80s sci-fi design. It wasn’t a good experience, and honestly it left me feeling nauseous, but it was a bubbly and bulky dream for a future that technology simply wasn’t ready for. That’s when the obsession started: one day, VR would be just like the movies.
By 2012, I was personally experimenting with approximating 3D vision in a virtual space. There were forums where people were just throwing ideas up against the wall, trying to figure out what hardware we would need to build VR the way we saw it. We developed simulators and collimated displays, all hacked together in our garages. Palmer Luckey was there too, and he had figured out a way to make a headset affordable using off-the-shelf components. He promised to send a few of us the parts to test his new schematic.
But then Palmer got quiet on the forums, and John Carmack’s name started floating around. We knew then that the formula was cracked, and it was time to start taking VR more seriously.
That year, I formed Cloudhead with two colleagues, Christopher Roe and Matt Lyon, with a vision to build a game specifically for virtual reality. We decided early on that, for VR to be the VR we imagined, we would need some sort of hand input as well. There was only one device at the time that we thought might work—a flat-game peripheral called the Razer Hydra, which tracked hand position using a weak magnetic field.
In the spring of 2013, we launched a successful Kickstarter, received our first Rift, and got to work.
Starting with DK1 meant that the development of The Gallery became defined by design iteration.
An early experiment in hybrid locomotion
First came overcoming motion sickness during artificial rotation. We introduced snap turns (“VR Comfort Mode”) as a way to skip the perceptual hiccup of seeing movement while the inner ear doesn’t feel it. The only way to skip it was to literally skip it: rotate in discrete increments. (We found our sweet spot to be 10-degree increments.) We consulted with the perceptual psychologist at Oculus, and Oculus ended up including snap turns in its Best Practices guide.
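The idea can be sketched in a few lines. This is a minimal illustration, not Cloudhead's actual code; only the 10-degree increment comes from the text above.

```python
SNAP_INCREMENT = 10.0  # degrees; the comfort sweet spot described above

def snap_turn(current_yaw: float, direction: int) -> float:
    """Rotate the view by one discrete increment instead of smoothly.

    direction: +1 for a right turn, -1 for a left turn.
    The instantaneous jump avoids the visual-vestibular mismatch
    that smooth artificial rotation produces.
    """
    return (current_yaw + direction * SNAP_INCREMENT) % 360.0
```

The key property is that no intermediate frames are ever rendered between the old and new yaw, so the eyes never report motion the inner ear can't confirm.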
Next was iterating on hand interaction systems that just didn’t exist yet. We had to figure out how to manipulate, grab, carry, and use objects in a 3D space with virtual hands in a natural way. We also needed constraint systems. If you grab and turn a door handle in VR, for instance, your meat-space (aka real-world) hand will move independently of that fixed object in virtual space, creating a cognitive disconnect. We found that we could trick the brain by giving the virtual hand some affordance to stick to the handle, even when the meat-space hand isn’t perfectly in place, and then unsnap it if it moves too far away.
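The stick-and-unsnap behaviour described above can be sketched roughly as follows; the snap radius and vector type are hypothetical stand-ins, not values from the game.

```python
import math
from dataclasses import dataclass

SNAP_RADIUS = 0.15  # metres; hypothetical threshold for breaking the grip

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def dist(self, other: "Vec3") -> float:
        return math.dist((self.x, self.y, self.z),
                         (other.x, other.y, other.z))

def constrained_hand_pos(real_hand: Vec3, handle: Vec3) -> Vec3:
    """While gripping a fixed object, pin the virtual hand to it.

    The virtual hand stays stuck to the handle even as the real hand
    drifts slightly; once the real hand moves past the threshold,
    the grip breaks and the virtual hand follows reality again.
    """
    if real_hand.dist(handle) <= SNAP_RADIUS:
        return handle      # sticky affordance: hide the small mismatch
    return real_hand       # too far away: unsnap
```

A production system would blend the hand back smoothly rather than popping, but the core trick is the same: tolerate a small real-to-virtual mismatch instead of exposing it.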
That process repeated with DK2 and positional tracking. Technologies improved, our team grew in size, and every time we were introduced to new hardware capabilities, we had to rethink design fundamentals.
And then there was roomscale
When Valve brought us to a secret summit in 2014, everything changed. The implementation of roomscale was a dramatic shift in design. On one side, it was a huge wave of relief; tracked hand input and full, volumetric movement were going to be ‘a thing’ with commercial hardware. On the other side, we had to reverse-engineer our entire game and reconstruct its framework to fit roomscale VR. Not only that, but as soon as we crossed the 90fps threshold, perceiving VR became like looking into a true representation of reality. Even if you’re holding up something cartoony, the smoothness of the motion makes your brain think, “Oh, that’s a real thing, it’s just painted to look cartoony.” It also meant that smooth, artificial forward traversal could make people feel ill or uncomfortable because peripheral vection was more easily perceived—an issue that had only affected artificial rotation beforehand.
Cloudhead Dan with the V Minus-1 Vive prototype
So, how do we move an entire room through 3D space without artificially pushing it forward? And what happens if the player only has the carpet in front of their TV as a play area? What if they only have the space in front of their desk? What if they have a full living room? All of these questions came before Valve had time to introduce Chaperone as the VR standard.
Our solution was an elastic playspace and teleportation system we called Blink. It began life as a simple teleport—you point to where you want to go, push a button, and suddenly you’re there. We started adding layers of complexity, one by one. A reticle. A preview of your relative orientation. A preview of where your new boundaries will be. The ability to rotate your projected orientation. The ability to rotate your play volume itself. Ideally, players would orient their playspace to take full advantage of however much room they had, so they could comfortably move around in their volume without worrying about boundaries.
Finally, we added a cinematic fade to mask the “blink” between choosing your desired location and arriving at the new one. Along with some naturally timed footfalls, we created a system where the further you teleport, the longer the fade to black lasts and the more footfalls you hear. Added together, it created a form of locomotion that fit the world and the flow of the game, allowed full use of the player’s space, and—most importantly—wouldn’t make anyone sick.
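The distance-scaled fade and footfalls might look something like this. All tuning constants here are hypothetical; only the relationship (longer teleport means longer fade and more footfalls) comes from the text.

```python
# Hypothetical tuning constants, not Cloudhead's actual values.
BASE_FADE = 0.15       # seconds of fade even for the shortest blink
FADE_PER_METRE = 0.05  # extra fade time per metre teleported
STRIDE = 0.75          # metres covered per audible footfall

def blink_params(distance: float) -> tuple[float, int]:
    """Return (fade duration, footfall count) for a teleport.

    Longer teleports fade to black for longer and play more footfalls,
    reinforcing the sense of having actually walked the distance.
    """
    fade = BASE_FADE + FADE_PER_METRE * distance
    footfalls = max(1, round(distance / STRIDE))
    return fade, footfalls
```

Tying the audio to the fade length is what sells the illusion: the blackout reads as eyes closing mid-stride rather than as a loading transition.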
Finding our fit
The entire time we were building these new systems and solving each subsequent problem, we were also trying to ship a game. Valve and HTC had quietly given us access to one of the first Vive devkits with only one stipulation: make something incredible. What we learned as we built our first roomscale demo was that smaller, better-considered spaces with a tight narrative loop fit the format. We stepped back into the whole arc of The Gallery’s narrative and started to sculpt it down into something more intimate.
An early iteration of the beach (Top) and the final beach scene (Bottom) in Call of the Starseed, redesigned for roomscale VR
Despite The Gallery being a fantasy experience at its core, every time we shifted direction, or a new piece of technology came online, we always went back to the beach—the level most grounded in reality. Having that real-world constant helped players better acclimate to the virtual environment, and better learn gameplay interactions without sensory overload. It also made the transition to fantasy that much more wondrous. We ended up redesigning the beach numerous times before it became the opening level you see in Call of the Starseed. And even after launch, we iterated on the scene again to support Valve’s “Knuckles” controllers last fall.
Call of the Starseed
When we launched Call of the Starseed alongside the HTC Vive in April 2016, we weren’t sure what the response would be. We worked countless long nights to meet that release date, and had to scope down many ideas that we just couldn’t make work in time. We knew that for VR to resonate with people, our experience couldn’t make them sick. Everyone within that first VR launch period knew that. But we also knew that we had to make our experience good. And, honestly, we didn’t know if it was.
The final sewer layout was fundamentally changed to better suit roomscale VR
At launch, the reviews were polarizing. There were comments that the experience was too short, or that we had priced it too high at $29.99. Both were completely valid concerns from the public, but it was difficult for us to contextualize those comments, because every developer in VR had worked so hard and taken so many risks (financial and otherwise) to get there in the years prior. In the general Steam landscape, players expect that a game X hours long is worth $X. That left us in a pickle, because we had to find a price that made a good ROI in a very small VR market. Playtime also varied; for many, Starseed was a 2 to 3-hour experience. But players more acclimated to VR were less likely to stop and touch the roses, and could finish it in half the time.
And then there were reviews that said Call of the Starseed was the first VR experience to make them cry, or the first to completely fill them with wonder. And it kept trickling in like that, with comments going so far as to say Starseed was the best gaming experience they’d had in their entire lives. Admittedly, I’m more jaded than most, but when you get reviews like those it’s hard to really believe them.
Still, they kept coming, and keep coming to this day. People approach us at events and reiterate those same sentiments. Eventually we realized that it was having the impact we really hoped it would—not just The Gallery, but the whole promise of virtual reality. Eliciting that sense of wonder in any medium is difficult, but virtual reality takes that up several notches. VR was enabling The Gallery to feel like a true memory of an event. A real moment in people’s lives.
The promise of virtual reality
Four months after launch, Valve and HTC reached out with the tremendous honour of including Call of the Starseed in the second Vive content bundle. By that time, our design goal of an approachable, comfortable, and gradual VR experience no longer felt complex enough for some. That was partly our intention; we designed Starseed for everybody, in the same way that movies are for everybody.
Now, I am a massive, nerdy fan of all the Indiana Jones movies—even the bad ones. So, to me, virtual reality and The Gallery have always been about bringing 80s movies to life. Roomscale VR is about emulating a fantastical sci-fi future, rife with personal Holodecks. It’s about immersion, being taken to new worlds, and eliciting a childlike wonder. All the types of experiences I dreamt of growing up in the 80s, and have been teased with ever since.
With The Gallery, we wanted to create that sense of adventure, fantasy, and freedom. To give people the chance to step into those characters and their journeys. People who have always wanted to participate in an adventure, but never could, for whatever reason.
It’s not often that you get to be there at the birth of a new medium, that you get to influence what it can become. That’s the opportunity that all of us understood—from the blood, sweat, and tears of the Cloudhead team, now nearly 20 strong; to the incredible passion of each and every developer and fan who’s been a part of this past year, forging the industry, and creating memorable experiences. Real moments.
VR isn’t just like the movies quite yet. But it’s becoming something much more important.
This article and its images were originally posted on Gamasutra News
by Denny Unger