Half-Life: Alyx Commentary
a1_01_intro1

Hi, my name is Gabe Newell, and welcome to Half-Life: Alyx.
To say this product was a challenge would be an understatement:
it's our first Half-Life title in over ten years,
and it's our first one in Virtual Reality.
In addition to being a great VR experience that still
felt like a Half-Life game, we wanted Half-Life: Alyx to serve players
who've been waiting to find out what happens after Episode 2,
and at the same time provide an on-ramp to anyone
who hasn't played a Half-Life game at all.
Ultimately, whether we succeeded in any of that is up to you to decide,
but hopefully, this commentary mode will give you some insight into
the set of problems we faced, and how we chose to tackle them.
To listen to a commentary node, pick up the floating hologram of a
radio headset in front of you, and simply put it on your head.
Once you're done listening to the commentary,
you can take the radio headset off, or simply put on the next one.
Please let me know what you think.
I can be reached at [email protected]
While I'm unable to reply to all the emails I receive, I do read them all,
and they're a critical part of the feedback we use to evaluate our work.
Thanks, and have fun!

a1_01_intro2

As with any Half-Life game, the first hour of Alyx must convey a
lot of information to the player, hopefully in a natural and entertaining way.
Players need to know who they are, where they are, what kind of world
they're in, who the other important characters and factions are, and so on.
At the same time, they need to learn how to actually play the game.
How do they move? What can they interact with, and how?
What should they be trying to do? etc.
Figuring out exactly what information we would try to convey in
these first two levels, and how it would be conveyed,
took us over two years of playtesting.
It's tempting to try and include everything we think a player should know,
but all information has a priority, and the more we include,
the more chance there is that a player misses something important
because they're distracted or confused by something less important.
So over the next two levels we'll use commentary nodes to point out some of
the specific places where we grappled with conveying information to the player.
In between these nodes, you'll likely be able to spot many other
examples, if you're interested in paying closer
attention and understanding more.

a1_01_citadel_vista

One of the pieces of important information we struggled to convey was
exactly when Half-Life: Alyx is set within the series' timeline.
Whilst we tried to reflect it carefully in many of the world details,
it was never a hundred percent successful during playtests.
Eventually, we decided it was too important to be left to players
to figure out, and so we put it in our opening text crawl.
The Citadel's incomplete state provided us with an early
opportunity to reinforce the timeline visually.
Half-Life: Alyx takes place five years before the events of Half-Life 2,
in which the towering Citadel represents the Combine seat of power.
As we see here, however, early Combine infrastructure is a cluttered
mess of cables and unusual structures sprawled all over the city,
surrounding the Citadel's construction, and it served as a useful indicator of the timeframe.
We went on to reinforce this impression in other places throughout the following level.
Combine architecture tends to be abstract in scale and form, so we
included some familiar visual cliches of human construction:
the large external scaffolds, some distinction of floor levels,
rows of small lights, and the tidying of the silhouette into more
structured vertical elements. We tried to strike a balance between
capturing the alien nature of the Citadel and grounding its construction
in enough familiarity for players to better understand its size and state.
Large sweeping vistas, such as this one, proved an interesting challenge in virtual reality.
Their distance from the viewer means that they don't benefit from
stereoscopic vision the way that small close-up objects do.
Because of this, the small objects on the balcony often draw the attention
of the players before they've even noticed the imposing structure in the distance.
If we moved the Citadel too close, it felt absurdly huge and difficult to take in.
Yet, placed alone on the horizon, it felt two dimensional and flat.
We found the solution in filling the space in between the viewer
and the distant structure with many scale references.
The city itself, huge supply cables leading toward the vanishing point,
large repeated combine buildings, heavy pollution fog, careful lighting,
and all the animated elements, especially the helicopter, whose sound begins during the fade-in.
All these were carefully composed to help draw the attention of the player
to the Citadel first, before letting their attention wander to take in
the details of the city and settle ultimately on the inviting items here on the balcony.

a1_01_greenhouse

In these first areas of the game, we grappled a lot with the density
and prioritization of the information we were conveying to the player.
Playtesting showed us that players in VR were easily distracted,
often due to VR's wider peripheral vision, such that they'd focus
on one scene element while losing track of others.
This was exacerbated in these first rooms because players were
acclimatizing to being in VR, often for the first time.
In addition, different players progress at different speeds.
We saw some players spend 30 seconds on the opening balcony, and others 30 minutes.
As a result, we did a lot of iteration over this set of opening areas,
each time changing what information we tried to convey, and where.
Initially, the video call from Eli took place in the later refuge room
with the snark and camcorder, but players were often too engrossed in
interacting with all the detail in the room to pay attention to the conversation between Eli and Russell.
We moved the video call out to the starting balcony,
but there it distracted from the establishing shot of City 17 and the Citadel.
It also came at a time when players were still figuring out core game elements,
like movement and hand interaction with the world.
That collision made simple things hard, like deciding when to show the player a
movement tooltip if they immediately started the video call.
Eventually, we settled on inserting the greenhouse between the starting
balcony and the refuge, which allowed us to move the call to an area
with a slightly obscured vista and fewer interactable objects.
This avoided the distraction problems, and ensured players were past the
initial gameplay moments of figuring out their hardware and movement setup.
It also had the bonus of allowing us to add further detail to the opening
balcony and Alyx's refuge without fear of creating distractions from the video call.

a1_01_strider

Beyond conveying critical information about the state of the world,
the first two levels of Half-Life: Alyx also needed to show some of VR's strengths.
Conveying a sense of scale is something VR does much better than flat screens,
and our early playtests showed us that nothing sold that scale quite like seeing a Strider up close.
In addition to the visuals, to fully illustrate the scale of a Strider
at close range we needed to do a lot of work on its audio treatment.
Players had strong expectations about feeling the weight, power and presence
of the Strider through its movements, particularly as it stepped on the balcony the player was occupying.
In order to convey this power through sound, we layered multiple sound effects,
allowing us to address ranges in the audio frequency spectrum and the temporal nature of those elements separately.
We then combined those elements in the Source 2 Audio Engine to control them as one sound.
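The layering idea described above can be sketched in code. This is a minimal, hypothetical model (not Source 2 code): each layer covers a band of the frequency spectrum and has its own temporal envelope, and the engine evaluates all layers together as one composite sound event. The layer names and envelope shapes are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class SoundLayer:
    """One component of a composite sound, covering a band of the spectrum."""
    name: str                            # e.g. "sub-bass thump" (illustrative)
    band_hz: Tuple[int, int]             # frequency range this layer occupies
    envelope: Callable[[float], float]   # gain over time, 0.0 - 1.0

def composite_gain(layers: List[SoundLayer], t: float) -> Dict[str, float]:
    """Evaluate every layer's envelope at time t; an engine would mix
    these per-layer gains into a single output event."""
    return {layer.name: layer.envelope(t) for layer in layers}

# Hypothetical Strider footstep: a short low-frequency impact plus a
# longer, quieter metallic ring, each with its own temporal shape.
footstep = [
    SoundLayer("impact", (20, 120),   lambda t: max(0.0, 1.0 - 4.0 * t)),
    SoundLayer("ring",   (800, 4000), lambda t: max(0.0, 1.0 - t) * 0.5),
]
```

Because each layer's band and envelope are authored separately, the bass weight and the metallic detail can be tuned independently while still being triggered as one sound.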

a1_01_refuge

This room, internally referred to as Alyx's refuge, was initially
the location where the Eli video call took place.
However, once that video call was moved out to the balcony,
this room was freed up to solve other problems.
In the Half-Life series, we've always tried to ensure that there's
generally enough narrative for all players to understand where they're
going and why, but for players who slow down and pay attention,
there's more detail that can lead to a greater understanding.
With the balcony being a place to get familiar with the game's inputs,
the greenhouse being where the story kicks off, and the Strider there
to surprise players with spectacle, we decided that the refuge could
be a quiet moment for players, where they could discover more narrative
detail in the environment.
At the same time, it could be a place where they can play
around a bit more, now that they're acclimated to being in VR.
Spending time in the refuge is meant to bring players into Alyx's
role in the resistance and her place in the world.
We wanted it to feel like a real stake-out spot, where Alyx had been cooped up for weeks or
months while planning the heist, doing research on the Combine.
The camcorder, snark and whiteboards are all toys that respond to detailed
interaction, and reward deeper investigation, giving curious players more
information about the world of City 17 and what Alyx has been doing in it.

a1_01_olga

Two critical pieces of information that we needed players to understand were
simple to state, but not so easy to convey: that players were playing as
Alyx Vance, not the Gordon Freeman they've played in all prior Half-Life games,
and that Alyx is going to speak, unlike Gordon.
These probably seem obvious to you now, but until we focused on them,
they weren't obvious to playtesters.
Ensuring that everyone realized they were playing as Alyx was something we
decided to hit with the biggest hammer we could, and named the product after her.
But even with that, we were very careful in Eli's video call to ensure it's
reinforced immediately.
Olga, the character introduced here, was added to further drive
it home by calling out when the player exits the elevator.
Olga is the first character the player meets in person, and her dialog here,
and again ahead in the alley, is aimed at driving home the fact that she's
conversing with the player, and that the disembodied female
voice the player hears is Alyx speaking back.
This was harder to convey in Eli's video call, because it already features
Russell speaking 'off screen,' whereas in this scene there's clearly no-one
else around in the conversational space other than Olga and the player.
Without Olga, we found that, even though they had already spoken to Eli
and Russell over the video call, playtesters still didn't feel like they were Alyx.

a1_01_music

When we started Half-Life: Alyx, we were very curious to find
out how VR would impact our music design.
The Half-Life series has always used music sparingly,
often leaving the ambient sounds of the world to provide the audio background.
But it is a tool we like to use to highlight moments where the world state
has changed in some important way - in this case, that something's riled up the Combine.
In our early playtests of this scene, without music but with many of the
other visual and audio elements represented, we found some players didn't
understand that they should move along, and that others became
distracted by the interactive objects in the laundry.
We experimented with some very simple 'action' music to communicate the
changing of the Combine threat level, as well as the story's intention for
Alyx to get to safety, and found that players were less likely
to forget their quest and move towards their goal.
The music was kept simple and 'cinematic' sounding to allow it to blend
with the purposely complementary background sounds and gently start
to normalize the use of music in VR in a familiar way.
This success encouraged us to use the same technique in scenes
with similar requirements later in the game.

a1_01_alley

The remainder of this level is designed to continue informing the player about
the world they're in, and to support the increase in the Combine's alertness.
We also wanted detail to make the city feel alive,
something that becomes much harder later on, once the player has a weapon.
This was tricky to design, due to the challenge we're often faced with:
the friction between narrative state and gameplay state.
Here, the narrative implies that Alyx should be making her way
back to Eli as quickly and quietly as possible.
But gameplay wise, we're at the start of the game, and players are
still testing the limits of everything they can see and touch.
Unsurprisingly, playtesting showed us that some players would dedicate
themselves to following the narrative and quickly move through the area, while
others would completely ignore the narrative and explore every nook and cranny.
While it's tempting to simply remove all distractions in an attempt to unify
all players towards following the narrative, it would send the wrong signal
about the density of our world, and how the game intends to reward players for exploring it.
So we spent months refining this section, adding, changing and removing elements.
We tried to find the right level of narrative tension, where players felt like
they should be moving along, but not so quickly that they can't stop to tinker
with something.
We put in enough interactive objects and narrative moments that players
were always rewarded for looking around, but we designed all those distractions
to end quickly, so we could encourage players to continue moving along.

a1_02_crash

In order to limit the possibility of motion sickness that could be
induced by moving the player in a vehicle, we initially wrote
this scene to have Alyx wake up after the van had already crashed.
Players found this underwhelming, however, since the game seemed to be
needlessly holding back an exciting moment that they could have experienced for themselves.
Addressing this meant that we had to tackle the technical problem
of moving the player comfortably in a vehicle.
In this van and in the train at the end of the map, you'll notice that
the outside world is only visible through small windows.
These outside views are necessary to provide motion parallax, but the small
window size ensures that the player's field of view is always dominated by the
vehicle interior, which is naturally in the player's own frame of reference.
This greatly reduces the occurrence of motion sickness.
Aside from the parallax of the outside world, we use animated lighting
and careful sound design to produce the effect of riding in a vehicle.

a1_02_pacing1

Where the last map was focused on conveying a lot of information about
the world state to the player, this map is where we need to start the player's larger quest.
That means the player must leave this map with an understanding
of what's happened to Eli, and what they're going to do about it.
The player also needs to meet Russell, so they have a
face behind all the radio dialog throughout the game.
Finally, we need the player to collect the gravity gloves
and the pistol, and learn how to use them both.
This is a daunting list.
From experience, we know that if we cram all that into a single scene,
players won't be able to remember more than a couple of highlights.
So this map's purpose is essentially narrative pacing:
We need to draw apart the required events, separating them with
enough time and space that players can ingest and remember them.
At the same time, we're a little under the gun.
Playtesting showed us that by now, some players have been playing
for 30 minutes, and they're wondering when they're going to see some enemies.

a1_02_pacing2

One of the first steps the writing team took on this map was to flesh
out a scene between Russell and Alyx that covered all the dialog we needed.
Because the player is meeting Russell for the first time,
and there are a significant number of in-fiction events that Alyx
and Russell need to talk about, the scene ended up containing a lot of dialog.
We knew there was no way we could build a scene that long without boring the
player, and the most important elements would be lost in the noise.
One of the first tools we used to address this was the introduction of Russell's drone.
It allowed us to take chunks of the dialog from that original scene
and deliver them throughout the area preceding the scene.
This meant the scene itself could skip over the in-fiction
preliminaries and get right to the critical goal delivery.

a1_02_pacing3

In addition to delivering narrative at a consumable pace, this map needed to
continue communicating the state of the world outside of the Quarantine zone.
We also wanted players to understand when Half-Life: Alyx takes place in the
larger Half-Life timeline and to see for themselves the massive scale of the Citadel under construction.
The power of the Combine is embodied in their indifference towards humanity.
They have conquered Earth with little effort,
and are now plundering it for their own ends.
They put a modicum of effort into their propaganda, but it's ultimately uncaring.
Later, in Half-Life 2, we see Breen's attempts to talk to humanity with
more seriousness, but it's unclear whether that directive comes
from the Combine or is just Breen's attempt to be useful to them.
Meanwhile, the Combine construction is brutally efficient.
The Citadel's pervasive feeder cables make no attempt to fit into the
lives of the residents as the Combine drain the city of its power.
Within this playground, a nostalgic nod to Half-Life 2, this enormous cable
has been run right through the mural on the wall, demonstrating the
Combine's total lack of interest in the history behind the city.

a1_02_russell_choreo

Russell's laboratory is the first time the player is in the same room as
another character, and it highlights one of the main challenges we
faced when designing choreographed scenes in VR:
keeping the player and characters out of each other's space.
In previous Half-Life games, we could always forcefully move the
player if we really needed them out of the way.
We made several attempts at finding a similar solution in VR, but found
that it often made players feel deeply uncomfortable or disoriented.
So instead, we designed these spaces to allow for a natural
separation between the player and our characters.
Where natural separators couldn't exist, we had to restrict players
from moving into the character's space, removing that
limitation as soon as the character had moved out of the way.
Another challenge was balancing how often characters respond to player
actions while delivering important narrative information.
In early experiments, we allowed characters to stop mid-sentence,
which allowed them to respond immediately to the player,
and then continue on with their dialog.
While technically straightforward, this was incredibly difficult to execute in animation and dialog without feeling unnatural.
The combination of custom reactions necessary to cover the range of
possible player interaction, and their resulting return to
narrative dialog, quickly multiplied out of control.
Eventually, we settled on a solution where we allowed only a minimum of
interactions during important narrative delivery sections, and a wider
range of responses to player interactions after the scene has completed.
In the case of Russell, we layer subtle head facing and natural
eye contact on top of his performance while he is talking with Alyx.
Then, after the scene, when he is typing at the computer, he will
respond to players trying to touch him, or to objects being thrown around.

a1_02_russell_typing

Right from our earliest experiments in VR, we learned that players expect
a high level of audio fidelity, even from minute interactions, particularly
when the sounds are meant to correspond to clearly observable visuals.
For instance, in one of our prior games, it may have been acceptable to
use a generic sequence of computer key sounds when Russell was typing.
In VR, however, we know that a player can observe the animations very
closely and even put their head right next to Russell's keyboard if they choose.
In this case, players would notice a lack of one-to-one
correspondence between the keypress animation and sound.
For Half-Life: Alyx, we authored a set of individual key
press sounds to be driven by animation.
For example...
As Russell's fingers animate to press a key, one of these keypress sounds is
played, resulting in precise synchronization of the animation and sound.
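The event-driven approach described above can be sketched as follows. This is an illustrative model, not the game's actual implementation: frames of the typing animation are tagged with keypress events, and reaching a tagged frame plays exactly one of the authored key sounds, keeping audio and visuals in one-to-one correspondence. All names here are assumptions.

```python
import random

# Hypothetical pool of individually authored keypress sounds.
KEYPRESS_SOUNDS = ["key_press_01.wav", "key_press_02.wav", "key_press_03.wav"]

class TypingAnimation:
    """Minimal sketch: an animation whose frames can carry keypress events."""
    def __init__(self, keypress_frames):
        self.keypress_frames = set(keypress_frames)
        self.played = []  # sounds triggered so far

    def advance_to(self, frame):
        # When the animation reaches a frame where a finger strikes a key,
        # fire exactly one keypress sound, so each visible press is heard.
        if frame in self.keypress_frames:
            self.played.append(random.choice(KEYPRESS_SOUNDS))

# The animator tags the frames where Russell's fingers hit keys.
anim = TypingAnimation(keypress_frames=[3, 7, 12])
for f in range(15):
    anim.advance_to(f)
```

The key property is that the sound is driven by the animation data rather than an independent looping sound, so a player with their head next to the keyboard never hears a press that doesn't match a visible finger strike.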

a1_02_gg_training1

One frustration players encounter in VR is having to physically bend
down or reach out for objects in the virtual world,
especially if they have limited real world space.
A common solution for this is to provide a way to bring objects to
the player instead of forcing the player to go to them.
We knew this would be an important feature in Half-Life: Alyx with its
sprawling levels and players' desire to explore every corner of the world.
We didn't want this mechanic to feel too gamey or magical, so we decided to
integrate it into the fiction of the game by making it a feature of the
gravity gloves, which are a natural precursor to the gravity gun of Half-Life 2.
This allowed us to present the mechanic as a real physical action that's
enabled by a piece of in-fiction technology.
To emphasize the physicality that is unique to the medium of VR,
we used a physical gesture to activate the pull, instead of a button press,
and then required the player to actually catch the object
instead of automatically attaching it to the hand.
This made the mechanic feel more natural and rooted in the fiction of the
game's universe, while allowing the player to feel powerful and accomplished.

a1_02_gg_training2

When the player 'pulls' an object with the gravity gloves, we use the game's
physics engine to apply a launch force to that object with a trajectory that
is partially influenced by the direction of the player's pull gesture.
We set the object's launch velocity to result in a standard flight time from pull
to catch so the player can develop a rhythm that eventually becomes automatic.
We want the object to arrive at the hand as reliably as possible,
so after the initial launch we continually apply small impulse
forces along the trajectory (represented by the white arrows).
Those impulses help counter gravity and compensate for collisions with
obstacles along the way, while continuously steering the object
toward the hand, which is itself a moving target.
A generous catch range and subtle haptic feedback when the object enters
that range virtually guarantee that it will end up in the player's
hand every time, even if their attention is focused elsewhere.
One other benefit of implementing this with physics is that we didn't
need to do custom handling for an object's size, mass, or inertia,
and it automatically compensated for the non-standard gravity
that players encounter inside the vault at the end of the game.
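The corrective-impulse idea described above can be sketched with a simple steering function. This is a hypothetical approximation, not the game's physics code: each tick, a small velocity correction blends the object's current velocity toward the direction of the (possibly moving) hand, rather than teleporting the object or overriding physics outright. The `gain` parameter and function name are assumptions.

```python
import math

def steer_toward_hand(obj_pos, obj_vel, hand_pos, dt, gain=4.0):
    """After the initial launch, apply a small corrective impulse each
    physics tick that nudges the object's velocity toward the hand,
    which may itself be moving."""
    # Unit vector from the object to the hand.
    to_hand = [h - p for h, p in zip(hand_pos, obj_pos)]
    dist = math.sqrt(sum(c * c for c in to_hand)) or 1e-6
    desired_dir = [c / dist for c in to_hand]
    # Preserve the current speed; only redirect it.
    speed = math.sqrt(sum(v * v for v in obj_vel)) or 1.0
    # Blend current velocity toward the desired direction; gain controls
    # how aggressively the trajectory is corrected per second.
    return [v + gain * dt * (speed * d - v)
            for v, d in zip(obj_vel, desired_dir)]
```

Because the correction is applied as ordinary impulses through the physics step, it composes naturally with gravity, collisions, and any non-standard gravity an object happens to be under.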

a2_01_resin

Even in our earliest experiments in bringing Half-Life to VR,
we found that players enjoyed exploring every nook and cranny of the environment.
With the increased fidelity of VR, players not only expected to be able to
interact with a vast array of objects, but they also had a strong desire to find something useful from time to time.
This was so universal among testers that in addition to more searchable
locations and interactive objects, we added a new resource: Resin.
Level designers could place this new resource in fun and interesting locations
that weren’t tied to game progression, rewarding players' wider-ranging exploration.
We also tied this new resource to a longer term goal in the form of weapon
upgrades, further motivating players to explore the environment.
Deep scavenger hunts and keen observation would result in more resin to spend as a reward.

a2_01_toner_intro

Internally, we refer to this type of puzzle as a 'toner' puzzle,
due to its similarity to a real-world cable toner tool that
can be used to locate electrical wires in walls.
The toner mechanic was conceived as a tool for designers to craft player
experiences around knowing that the player had to put their
head and hands in specific locations.
The initial design merely required the player to use their multitool to
push a ball of energy along a path, similar to a child's bead maze toy.
While this simple design was successful at getting the player
to put their head and hands in particular locations, the lack of
branching meant that the player wasn't required to make any choices,
nor was there any possibility of negative consequences.
To address these issues, we added the rotating junction elements,
which allowed for branching paths and became the player's
primary means of interacting with toner puzzles.
While the toner puzzles ramp up in complexity throughout the game,
this first toner puzzle is designed to use a minimal number of elements
to introduce players to the key ideas: power flows through wires,
power can be re-routed by rotating junctions, and power has an effect
on the physical world, such as turning on lights and opening gates.
The sparking electrical outlet that branches down toward the floor
is not strictly necessary for this simple puzzle, but it was added
here to expose players to its behavior since it was causing
confusion in later puzzles when players were seeing
it for the first time in more complex configurations.
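The routing idea behind these puzzles can be modeled very compactly. This is a hypothetical sketch, not the game's code: wires form a directed graph, a rotating junction selects which outgoing wire it feeds, and tracing from the power source tells us what the power ultimately reaches. All names are illustrative.

```python
class Junction:
    """A rotatable junction: power entering it flows out of exactly
    one of its possible outputs, chosen by the current rotation."""
    def __init__(self, outputs):
        self.outputs = outputs   # possible downstream nodes
        self.selected = 0        # which output is currently connected

    def rotate(self):
        self.selected = (self.selected + 1) % len(self.outputs)

def trace_power(start, junctions):
    """Follow power from the source through each junction in turn,
    returning the node it ultimately reaches."""
    node = start
    while node in junctions:
        j = junctions[node]
        node = j.outputs[j.selected]
    return node

# A one-junction puzzle: rotating the junction re-routes power
# from a dead end to the gate the player wants to open.
junctions = {"source": Junction(["dead_end", "gate"])}
path_before = trace_power("source", junctions)   # power reaches "dead_end"
junctions["source"].rotate()                     # the player rotates the junction
path_after = trace_power("source", junctions)    # power now reaches "gate"
```

Branching junctions are what turn the original bead-maze design into a puzzle: the player must now make choices, and a wrong choice can have consequences (like the sparking outlet).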

a2_01_lighting

From our earliest experiments in VR, we realized how critical high-quality,
high-density lighting was to conveying a sense of realism.
In game development, any technology choice involves trade-offs
between quality, runtime performance and development time.
For Half-Life: Alyx, we chose a high quality, high performance light
mapping solution which required a significant amount of offline pre-processing.
We performed the lightmap preprocessing using a render farm,
which could take several hours for a single map at the highest quality,
and we put a lot of effort into mitigating the cost of that
high iteration time for designers on the team.
Specifically, we implemented a real-time approximation to the final lighting
result, which allowed designers to light scenes in real time with
confidence that the high-performance lightmaps would match their intent.
Most lights in Half-Life: Alyx are baked into a single term that
describes distance falloff and shadowing, allowing us to change the
brightness and color of lights in real-time at no cost.
This encoding also allows us to apply shadows from dynamic props and
characters, without having to render any static geometry into shadow maps.
Our lightmaps store directional information using the popular
Ambient Highlight Direction encoding, an improvement over Valve's own
Radiosity Normal Mapping technique pioneered in Half-Life 2.
This new lightmap encoding ensures that bumpy surfaces appear bumpy even when not under direct light.
This blinking light source provides a good illustration of this technique,
as you can perceive the surface detail even during times that the light source is off.
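The "single baked term" idea described above can be illustrated with a toy shading model. This is an assumption-laden sketch, not Source 2's renderer: the expensive part (distance falloff multiplied by shadowing visibility) is precomputed offline per texel, so brightness and color become cheap runtime multiplies, which is what allows lights to blink or change color for free.

```python
def bake_light_term(falloff, shadow):
    """Offline step: combine distance falloff and shadow visibility
    into one scalar per lightmap texel (the expensive precomputation)."""
    return [f * s for f, s in zip(falloff, shadow)]

def shade(baked_term, brightness, color):
    """Runtime step: scale the baked term by the light's current
    brightness and color - changeable every frame at no extra cost."""
    return [[brightness * t * c for c in color] for t in baked_term]

# Three texels: near/lit, mid/lit, far/shadowed (illustrative values).
baked = bake_light_term(falloff=[1.0, 0.5, 0.25], shadow=[1.0, 1.0, 0.0])
lit = shade(baked, brightness=2.0, color=(1.0, 0.8, 0.6))
```

Dimming the light to zero and back (a blink) only changes `brightness`; nothing needs to be re-baked, which is why the surface detail from the directional encoding remains visible even while the light is off.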

a2_01_barnacles

Barnacles, while an iconic Half-Life creature,
presented some interesting VR-specific challenges.
While their tongue is intended to trap hasty or unobservant players,
those caught by them often didn't realize what was happening.
They would become disoriented, or even motion sick,
as the barnacle lifted them up to be eaten.
These unfortunate players almost universally died and failed to
learn what was happening, even after multiple encounters.
To address this issue, we recreated the tongue as a particle system.
This allowed us to both physically simulate it as well as
reliably position it for maximum visibility.
If a player does get grabbed by a barnacle, the tongue always wraps right
in front of their face, no matter which direction they try to look.
This makes the tongue almost impossible to miss and visually guides the
player's eyes up to the location of the barnacle so they know where to shoot to free themselves.
The asphyxiation visuals that kick in as the player is being lifted not
only serve to communicate the damage that the player is taking but they
also help to reduce motion sickness by narrowing the player's field of view.
These changes, combined with sound effects and other visuals like seeing
stars and blacking out, eliminated confusion for players and enabled us to
bring this classic enemy into the world of Half-Life: Alyx.

a2_01_pistol_training1

We got a lot of positive feedback from our initial experiments requiring
players to perform a series of physical actions to reload the pistol.
Specifically: ejecting the clip, retrieving a new clip from their backpack,
inserting the clip into the gun and chambering a round.
Players enjoyed learning and improving at this skill.
We experimented with using a single button for both ejecting a clip and
chambering a round but we found that assigning these operations to two
different buttons was more interesting even though it was more mentally taxing.
In fact, it was more interesting precisely because it was more mentally taxing,
as players would frequently mix up the buttons under pressure.
In the early stages of the game, players would be fumbling for clips,
ejecting the clips when they meant to be chambering a round and generally making a mess.
To those of us observing the playtest, this looked like things had gone horribly
wrong, but players themselves consistently cited it as a high point of their playthroughs.
Because of the physical nature of the reload, and the skill required,
players blamed themselves and not the game for the mistakes made as
they were learning the reloading sequence.
In this area, we provide a number of static targets in the form of barnacles
for the players to shoot at their own pace, increasing the likelihood
that they have at least a few reloads under their belt before moving on.

a2_01_locker_hacking

In prior Half-Life games, while playing as Gordon, the player saw Alyx use
her multitool to hack into Combine security systems in order to unlock doors or perform other acts of sabotage.
This presented an opportunity for us to represent Alyx's hacking skills through VR-centric mini-games.
All of our hacking mini-game designs were built to leverage
the independently-tracked head and hands common to today's VR systems.
We wanted to encourage players to use both hands simultaneously and
move their head to create parallax to understand the
spatial relationships within the puzzles.
This locker hacking puzzle is based upon one of our earliest prototypes,
in which the player was tasked with using a tool in one hand
to paint a path on a sphere held by the other hand.
There was just something fun and unique about using all of the
natural degrees of freedom of both hands to solve a puzzle in VR.
In its simplest form, this puzzle's solution is straightforward,
but the difficulty gradually increases throughout the course of the game by
introducing static or moving obstacles of various shapes and sizes on the sphere.

a2_02_art_pacing

This file has no closed captions!

a2_02_health_injectors

In our early prototypes, players could heal themselves by grabbing a
Half-Life 2 health pack and pressing down on a controller button,
which obviously wasn't a very interesting VR interaction.
During the development of Half-Life: Alyx, the Counter-Strike team
added health injectors into CS:GO as part of the new Danger Zone game mode
they were developing.
When we saw this, we felt that an injector would work as an intuitive
healing item in VR as well.
Players would press a button to arm the injector as a way of indicating their intent to use it.
This prevented unintentional use of the injector and gave us an
opportunity to showcase how physical the interaction was
going to be via haptics, animation and sound.
Some testers even initially responded with dread over
the fact that they were going to have to use a needle on themselves,
and looked away when performing the injection action.
We also had to create approximations of the player's arms and body,
as playtesters all had their own preferences for where they wanted to apply the injection.
We even added the ability to inject into the head
since so many players tried to do it just to see if it would work.

a2_02_pistol_training2

We had to ensure that players would become proficient at reloading their
pistol during the early levels, or later encounters in the game would be overly difficult.
The prior map contained only static targets in the form of barnacles.
Players could engage the barnacles at their own pace,
but there were enough of them that players would need to reload at least a few times.
This first live zombie is locked behind the chain-link fence and cannot
reach the player until the player opts in to the encounter by shooting the lock on the door.
The placement of zombies in the upcoming section of the game is designed to
require plenty of reloading, to ensure that the player builds up this skill.

a2_02_train_cars

Virtual reality provides the opportunity to add physicality to gameplay,
but this needs to be balanced against player fatigue and accessibility.
For example, we experimented with a low ceiling in this area,
requiring players to crouch the whole time that they made
their way across the tops of these train cars.
Playtesters did not enjoy having to crouch for so long,
so we raised the ceiling.

a2_02_breakable_glass

The breakable glass panels in Half-Life: Alyx are yet one more system designed
to convince players that they are present in a living, breathing world.
The panels detect the position of an incoming impact and procedurally generate
new, smaller glass panels in a radial pattern around the impact position.
These new panels can then break into smaller panels themselves, and so on.
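The recursive break-up described here can be sketched in a few lines. This is a minimal illustration only, not Valve's actual Source 2 code; the panel representation, shard count, and size factors are all invented:

```python
import math
import random

def shatter(panel, impact, shard_count=6, min_size=0.05):
    """Split a glass panel into smaller radial shards around an impact point.

    `panel` is (center_x, center_y, size); every name and constant here
    is invented for illustration, not the game's implementation.
    """
    _, _, size = panel
    if size <= min_size:
        # Tiny shards just break away entirely instead of splitting again.
        return []
    shards = []
    for i in range(shard_count):
        angle = 2 * math.pi * i / shard_count
        # Generate each new, smaller panel radially around the impact position.
        sx = impact[0] + math.cos(angle) * size * 0.25
        sy = impact[1] + math.sin(angle) * size * 0.25
        shards.append((sx, sy, size * random.uniform(0.3, 0.5)))
    return shards
```

Because each returned shard is itself a panel, a later impact can call `shatter` on it again, giving the "panels break into smaller panels, and so on" recursion.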
We found that players would often break glass panels accidentally, then
spend time 'testing' just how interactive they really were by trying to
break them more, eventually realizing that, yes, the glass really
did shatter at the appropriate location.
At one point, we experimented with building special puzzles out of breakable
glass, but in the end we found the glass to be more appropriate as a passive
interactive element in the world, one that reacts in a satisfying way to a
variety of impacts, from zombie melee attacks to incoming gunfire
to any other physical interaction in the game.

a2_02_weapon_upgrades

Weapon upgrades are a new feature for the Half-Life franchise and,
fictionally, this mechanic paired well with Alyx’s tinkering nature.
We approached this design challenge first by thinking about what sort of
upgrades could be meaningful and also provide additional value to
players without removing established interactions.
With limited resin to spend, the upgrade system also provided an avenue
for players to personalize and invest in their favorite weapon,
or streamline an interaction on another weapon, increasing its value to them.
For example, the reflex sight still requires players to physically aim but
it improves the feedback on exactly where their shot will hit
and exposes previously hidden enemy weak points.
While some players enjoyed this feedback and new targeting opportunities,
others didn't and would instead focus on upgrades that suited their style of play.
This allowed players to spend their limited resin on the upgrades they cared about the most.
Another aspect of weapon upgrades that we focused on was not necessarily
obvious when we first started to design and test combat in VR.
Due to the more physical nature of combat in VR, it can really push on the
mental and physical capacity of certain players, as they’re required to
utilize both their hands and move their head to take in the environment while dealing with enemies.
While this is a very natural experience, there's also a lot
going on at any given point with reloading and combat.
As we observed players in some of these more intense combat situations,
we started finding opportunities to allow some upgrades to reduce the
number of times a physical interaction was required or reduce
the complexity of a particular interaction.
The autoloader on the shotgun is a good example.
When added to the shotgun, the normally slow, methodical action of feeding
shells into the gun is replaced with a far simpler, faster interaction of
slapping the shells into the autoloader before they get automatically fed into the gun.
This allows players to look around, focus less on the gun while reloading,
and take in more of what’s going on around them.

a2_02_comfort_pacing

At the beginning of the Half-Life: Alyx project, we had the expectation
that players would want to take a break after 30-45 minutes,
since that's what we were seeing with The Lab and other
smaller-scale VR experiences available at the time.
Because of this, we designed the pacing of the game's
levels around play sessions of that duration.
Over the course of development, however, the game itself included more
down time between intense combat encounters and the
comfort of the new crop of HMDs improved significantly.
In fact, as the game progressed, we found that players were happy to play for hours on end.
At the end of a playtest, players would often remark that this
was the longest they had ever played VR in a single session.

a2_02_headcrab_foreshadowing

Until this point in the game, players have only seen
headcrabs attached to the heads of zombies.
This led playtesters familiar with headcrabs from prior
games to anticipate encountering them on their own.
We decided to play up this anticipation by foreshadowing the first headcrab
encounter with a headcrab that drops out of a vent before disappearing,
as well as headcrab sounds in the environment, including a threatening
headcrab sound coming from the partially boarded up doorway ahead.

a2_03_mural

There are two layers to the markings on the walls in this area:
the square and dot patterns that are the functional elements of the
environmental puzzle, and the mural representing the Vortigaunt's
interpretation of the events of the Half-Life saga.
For the first few years of development, this area only contained the
dot patterns supporting the perspective puzzle, but as we fleshed out
the rest of the level, we found that playtesters responded especially
positively to the Vortigaunt's wall markings leading up to the hideout.
It was at this point that we realized that this area could serve as a canvas
for our Vortigaunt's artistic magnum opus.
To do this, we organized the dot patterns into a sort of swirling star field
behind the Vortigaunt's art.
The imagery itself, which was inspired by petroglyphs and other traditional
rock art, portrays many significant events from the Combine invasion,
painted hurriedly by the injured Vortigaunt, almost as if he was trying
to capture what fleeting memories he still
had of the Vortessence from which he had been severed.
The fact that Vortigaunts have a complicated relationship with time gave
us a lot of leeway to avoid drawing this as a strict, linear timeline.
For example, you may notice that the Vortigaunt uses contemporary Combine
troops as shorthand for Combine invasions from the past or that the timeline
is just generally hazy: a little from the past, a little from the
present and a little looking forward to the future events of Half-Life 2.

a2_03_lore_music

There are many musical themes and motifs used throughout the game
to represent unspoken aspects of the story and lore.
For example, the Vortigaunts play an integral role in the sub-plot of the
game and we chose to use music to subconsciously communicate the repeating
nature of their role both in the timeline of Half-Life: Alyx and
in the larger timeline of the Half-Life saga.
Within the music in this scene, you'll hear variations of themes used
throughout the timeline of the game, from the very first sounds
of the game up to the finale in the Vault.
Variations of these motifs represent the imprisoned chanting of the Vortigaunts,
the extraction of their energy to confine the G-Man,
as well as historically familiar sounds representing the Vortessence itself.

a2_03_alyx_reacts

During our earliest attempts to write Alyx, our instinct was
to have her react to absolutely everything.
But what we learned from playtesting was that Alyx reacting
TOO much could actually be off-putting.
Say a player is in a really tense, zombie-filled level, and
they’re having a great time, but suddenly Alyx is in their
head sounding angry or frightened or anxious.
These Alyx reactions were creating a disconnect between how the player
was feeling and how the character they were roleplaying as was feeling.
So we took a lot of those out.
The Alyx reactions we left in tended to be the ones
that were practically universal.
Alyx's reaction here to the surreal entrance to the Vort hideout
is a good example, where it was just so different from everything
the player had seen up until then, and kind of takes you by surprise.
We watched so many playtests where the player would see it
and say, 'Whoa, that's weird.'
So we had Alyx say the same thing a second later, almost like an echo.

a2_03_hideout_interactivity

For as long as we've been making choreographed scenes like this one,
we've found that there are two types of players: those who become
engrossed with the characters and those who tinker with everything
around them, paying little attention to the scene's action.
To make these environments entertaining for the tinkerers,
we usually add objects for them to find and play with.
Since this scene is split into two rooms, we were able to add a large
number of interactions into this first room for players to
discover after the Vortigaunt has moved to the second room.
Some interactive objects are pure physically driven toys,
like the dangling headcrab corpses and the hanging pans.
Others are animations triggered by the player's actions, like the way the
caged headcrabs react to the player approaching or shooting them.
One type of object on the table is a squeezable headcrab heart,
which is a remnant from an older version of this scene, where the player
would revive a wounded Vortigaunt by squeezing nectar from these hearts.
While that scene didn't work out, we found the experience too
compelling and, well, too gross to cut completely from the game.

a2_03_hideout_scene

The fundamental challenge of any choreographed dramatic scene like this
one is making sure that we entertain the player without undermining
the delivery of critical gameplay information.
In this case, the player needed to leave this scene with the knowledge that
the Vortigaunts have been imprisoned and that the player should seek the 'Northern Star.'
This particular scene went through many different variations during
development, from a cooking scene with an extremely traumatized
Vortigaunt to this humorous version.
We even experimented with an interaction where the player was
required to squeeze the nectar of several headcrab hearts
into the Vortigaunt's mouth to revive him.
While this was a memorable and novel VR interaction that ensured the
player was engaged with the scene, the interactivity on the player's
part often resulted in playtesters remembering the actions they took
but not the critical gameplay information.
Eventually, we ended up with this more humorous scene,
which struck an effective balance between entertainment and goal delivery.
Importantly, we also distributed the scene across two separate rooms,
breaking the information delivery into two steps: the first room where the
player meets and learns about the state of the Vortigaunt,
and the second room where the player is given their new goal.
Separating these physically into rooms seemed to help players more reliably
retain the critical information delivered in the second room.
While this scene did undergo a lot of changes, one story beat remained
virtually unchanged from the very first roughed-in version:
the moment the Vortigaunt tosses a headcrab carcass down to the player and yells 'Sustenance!'
Right from the start, this moment always got a great reaction
from playtesters and remained, all the way to the final product.

a2_04_armored_headcrabs1

We spend time on every game design element thinking about how the player
is expected to learn of its existence, and understand its details.
Some design elements are so important that the game experience will 'break'
if the player doesn't understand them, while others may just be fun diversions,
or rewards for exploring the game's systems.
Any essential element requires work on our part to teach the player.
In Half-Life, we try to teach elements in ways that feel as natural as
possible, entwined within the core flow of the game so that you don't feel like
the game just stopped to teach you something.
We don't always succeed, but the attempts usually lead us somewhere interesting.
In this case, we need to teach the players about the armored headcrab.
The key elements that we want them to understand are that the headcrab is
invulnerable on the top, but has a weak spot on its belly.
We would also like the player to learn that the headcrab exposes its weak
spot right before it jumps, but this isn't required; the player can fall back
to shooting the weak spot as the headcrab scrambles around after a leap.

a2_04_armored_headcrabs2

Teaching design elements of enemies is always tricky.
Players have shown us for years that it's really
hard to learn something new while under attack.
We've also seen that invulnerability is a particularly
difficult enemy feature to teach.
If players are blasting away at an enemy, they're often not
paying close attention to the impact feedback.
If the enemy is vulnerable in some areas and not others, players may
kill the enemy with a spray of gunfire, not realizing that some of
their shots did nothing while others hit the critical weak spots.
So when we approached the armored headcrab training, we knew we'd
need to do significant work.
The first step was to avoid placing the player under duress
by containing the enemies inside this wire fence.
Next, we placed headcrabs at eye level facing the player,
exposing their bright, pulsing weak spot.
If the player shoots one immediately, they're rewarded with a
unique death effect where the headcrab explodes into chunks.
This effect is designed to show the player they did something special to kill
this headcrab, as opposed to how they've been killing headcrabs up until now.
There are a couple of other pieces in the setup here that are important.
A second headcrab is placed right behind the lock the player must shoot
to enter the cage, encouraging the player to notice the weak spot and try shooting it.
Two more headcrabs are loose in the cage, scripted to
repeatedly attack invisible points on the wire fence.
This ensures they're constantly demonstrating the way they expose
their weak points before they leap, and this allows the player to
practice hitting the weak spot from behind the safety of the fence.

a2_04_armored_headcrabs3

With any section of natural, entwined training, we've observed that there's
rarely any one thing that succeeds in teaching all players,
nor do 100% of players leave with the perfect understanding we'd like.
If player understanding is absolutely critical, we have to build a more bulletproof
gate that ensures the player demonstrates understanding before moving onwards.
In fact, we do just that when the player acquires the shotgun in the next area.
In the case of the armored headcrab, however, playtesting showed that we
could afford to let a few players through with less than perfect understanding.
This section gave them enough of a chance to experiment that they got a sense of
what was going on, and further encounters with armored headcrabs solidified it.
There was still a lot of iteration on the visual design of the
headcrab to further drive home its armor and weak point,
and we also added a custom response specific to shotgun fire.
In that case, the armored headcrab flips over
and wriggles for a second, exposing its weak point.
This was added to reduce the resource loss we saw happening in
players who were still struggling to understand the armored headcrab
mechanics after they acquired the shotgun.

a2_04_shotgun_gate1

One of the challenges we always face when giving a new weapon or tool
to the player is ensuring that they collect it before moving on.
In these cases, we try to establish some kind of 'gate' that
the player can only get past using the new item.
This can be especially tricky in cases where we are giving the player a
new tool that doesn't fundamentally change their capabilities, as in the
case of the shotgun, where almost everything that can be done
with the shotgun can also be done with the pistol.
In contrast, the pistol gate, where the player had to acquire the pistol
before being able to shoot the lock off of the fence back at Russell's lab,
was simple to design and execute.
For this shotgun gate, we not only needed to ensure that the player collected
the shotgun, but also that they learned how to load it before proceeding.
We playtested a number of approaches to teaching the reload mechanics
during combat, and repeatedly found that it didn't work.
When faced with learning a new, complex task under duress, players would
retreat back to using their pistol, which was a known quantity to them.

a2_04_shotgun_gate2

Once the player has collected the shotgun,
they are only halfway through the shotgun gate.
Next, they need to learn how to reload it.
To drive home the necessity for the player to learn how to use this
new weapon, we added an incoming threat, using a variety
of audio and visual effects to sell it.
While our goal is to make the player feel like they have a
limited amount of time, the time pressure is entirely fake,
and everything proceeds based on the player's progress in loading
the shotgun for the first time.
Once the shotgun has been loaded, a short timer starts,
after which zombies burst through the walls.
If the player happens to look at the walls before that timer is up,
we abort the timer and introduce the zombies.
The combination of a timer and a look trigger is a common trick used
in places where we'd like the player to see a neat visual moment,
but we can't wait forever or the game simply won't progress.
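The combination of a timer and a look trigger amounts to a small piece of state-machine logic. Here is a minimal sketch, where the class and field names are invented for illustration and do not reflect Valve's entity system:

```python
class TimedLookTrigger:
    """Fire an event after a delay, or immediately if the player
    looks at the target first. Illustrative sketch only."""

    def __init__(self, delay, on_fire):
        self.delay = delay
        self.on_fire = on_fire
        self.elapsed = 0.0
        self.fired = False

    def update(self, dt, player_looking_at_target):
        if self.fired:
            return
        self.elapsed += dt
        # Fire early on a look, or when the timer runs out; either way
        # the player sees the moment and the game never stalls.
        if player_looking_at_target or self.elapsed >= self.delay:
            self.fired = True
            self.on_fire()

events = []
trigger = TimedLookTrigger(delay=3.0,
                           on_fire=lambda: events.append("zombies_break_walls"))
trigger.update(1.0, player_looking_at_target=False)  # timer still running
trigger.update(0.5, player_looking_at_target=True)   # look fires it early
```

The key design property is that the event always fires exactly once: a glance triggers it early so the player catches the visual moment, while the timer guarantees forward progress for players who never look.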
The breaking of the walls by the zombies is the
final step in opening the shotgun gate.
Having new enemies open the way forward when they enter the
player's space is a fairly tried and true method,
but it's never one we're particularly happy with.
It's always a bit too convenient, and lacks agency on the player's part.
But in a setup like this, which contains a novel puzzle,
new weapon and its associated mechanics, plus an ambush,
we're already dealing with a lot of design complexity
packed into a small space, and any additions would complicate the scenario even further.

a2_04_shotgun_design1

The visual design of the shotgun included a number of challenges.
Shotguns in games are generally powerful, and we communicate that power with bulk.
As a result, they're big, and held in two hands.
But our experiments with two-handed VR weaponry
didn't reach a level we were satisfied with.
When holding a two-handed object in real life, the object itself provides a
physical connection between the two hands, and that constraint allows
players to stop thinking about it consciously, in addition to providing physical feedback.
In VR, without that connection or feedback, we found that players had to
keep maintaining their own internal sense of how they were holding their
two-handed weapon, or they'd drift their hands into positions that were hard for us to reconcile.
As a result, our two-handed weapons required players to 'play along' more than we liked.
Since one of our core goals with Half-Life: Alyx was to allow players to
fully immerse themselves by ceasing to think about what they were doing
in real life, we decided to focus on single-handed weapons, including the shotgun.

a2_04_shotgun_design2

Our initial instinct with the shotgun design was to use a regular real-world
shotgun, but modified with a sawn-off stock and barrel so it
would seem less cumbersome when held with one hand.
This approach put the handle at the rear of the weapon,
and we ran into the problem that the weapon always felt front-heavy.
Obviously, the VR controller in your hand wouldn't reflect that, leading
to a disconnect between what you were feeling and what you were seeing.
Placing the handle more centrally, and underneath the weapon, helped avoid this disconnect.
This implied center of mass wasn't really maintained well as the weapon
accrued upgrades throughout the game, but the first impression was the most critical.
Since we were going with an odd form, we were very keen to use common shotgun
visual elements wherever we could, to help with player recognition and to ground the design in some reality.
One element we borrowed was a common shotgun tube magazine that stores the
shells horizontally in line with the barrel, and we were just able to fit
the gameplay requirement of 6 shells without the overall length becoming unwieldy.
You can see the effort we had to go to for further shell capacity in the design of our autoloader upgrade.
Our first reloading mechanic was based on the realistic insertion of one shell at a time.
Playtesters found this to be overly tedious, although we did receive some
positive comments that it made them feel like an action movie star.
The simple addition of allowing players to insert two shells at a time solved
this issue, allowing us to keep the good part of the design whilst losing the bad.
We were keen to reinforce the whole action movie star thing and
experimented with a big gesture the player could do to finalize reloading.
The inclusion of a pop-up mechanism let us add an additional
step to the reload that further separated it from our other weapons,
completed with a satisfying flick of the wrist.

a2_04_darkness_gate

As with the shotgun and the pistol, we needed a gate for the flashlight
to ensure the player couldn't proceed without it.
We tried a number of designs around the moment where the player collected
the flashlight, but they all ended up being about the player picking
up an object, and not specific to that object being a flashlight.
Given that the flashlight's primary function is to provide light,
we felt the gate needed to be an area of darkness.
We also really wanted the player to be excited to get the flashlight, and
that meant they had to experience darkness before collecting the flashlight.
This turned out to be an interesting challenge, because we already had many
poorly lit areas prior to this, and now we needed an area that was so dark
that players would correctly interpret it as a gate.
Our initial implementation was simply to ensure there were no lights within the hallway.
Unfortunately, playtesting immediately showed that wasn't enough.
Some VR headsets have extremely low black levels, and on those headsets,
enough light bounced in from this lit area to enable players to move forward.
In addition, our low ground fog provided just enough contrast in the dark
hallway for players to make out the scene and stumble through it.
Even once we removed fog and bounce light in this area,
we saw yet other ways that players tried to move forward.
Some players would fire their pistol repeatedly, using the muzzle flash as a strobing light source.
Others would use the teleport targeting UI to probe the room, because at
that point in the project, it had a small light source attached to it.
In the end, we disabled muzzle flashes inside the hallway, and the teleport UI lost its light.
The final step was the addition of some dialog from Alyx,
commenting on the darkness and her need for a light source.
This was aimed at solving a perceptual problem.
Up until now, our game has taught players that their goal is to continue
moving forward, and any time that forward progress is blocked,
it's because there's something in the local area that needs solving.
Given that structure, it's entirely rational for players to do everything
they can to find some way to see in the darkness ahead.
Alyx's dialog is aimed at telling the player that the game has recognized
their attempts to move forward here, but the solution lies elsewhere.
Subsequent playtesting showed us that this dialog, coupled with the
brightly lit area containing zombies off to the right,
helped tug players past the darkness entrance, and on towards the flashlight.

a2_04_toner_airlock

Like hacking and environmental puzzles, toner puzzles like this one are often
used to relieve tension from combat and engage the player's mind in a refreshing way.
This puzzle introduces a new junction type, which has two inputs
and two outputs, as well as branching circuits.
Functionally, it is just a simple 'airlock.'
Only one of the main Combine gates can be powered on at a time,
with an additional wrinkle that the smaller side gate
on the return path must be powered down separately.
In addition to providing a break between hectic combat encounters,
this puzzle is a gameplay mechanic refresher, since it's been some time since
the player has seen a toner puzzle and the next ones start to ramp up in complexity.

a2_04_flashlight1

In prior Half-Life games, the flashlight was attached to the player's
body, which meant that it always illuminated the world from the same angle,
relative to the player's point of view.
With tracked hands in VR, we had the opportunity to let players manipulate the
flashlight directly and playtesters reacted very positively to this new freedom.
In fact, our earliest implementation of a flashlight in Half-Life: Alyx
was an object that could be picked up in either hand or even set back down.
This allowed the player to put the flashlight down somewhere, providing
illumination for themselves while they accomplished another task.
Unfortunately, this presented the possibility that a player could leave the
flashlight behind, which would have been a game-breaking scenario.
Implementing the flashlight as an object players had to hold also prevented
them from holding something else in that hand,
which was tedious to manage and simply too punishing a trade-off.
In particular, it meant that weapon reloading was a nightmare, because the
player no longer had a free off-hand to perform the reload actions.
In the end, we settled on a flashlight that attaches to the
off hand and turns on and off 'automatically.'
This fixes the issues with reloading in the dark and with potentially losing the
flashlight, and has the nice side effect of not requiring the allocation of a
button to toggle the light, which is especially helpful on controllers with fewer buttons.
Since the automatic on/off behavior is controlled using level logic,
designers could make performance and quality trade-offs in different areas of the game.
After all, if a player could turn the flashlight on at any time,
the rendering overhead of this shadow-casting light source would
have to be accounted for throughout the entire game.
On the other hand, if the flashlight is known to be off in a given area,
a designer can make use of the resulting rendering headroom by
increasing the fidelity of the environment in other ways.
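The trade-off behind the level-logic approach can be illustrated with a toy budget model. The area names and cost numbers below are entirely invented, purely to show the reasoning described here:

```python
# Areas where level logic enables the shadow-casting flashlight.
# All area names and numbers are invented for illustration.
FLASHLIGHT_AREAS = {"dark_hallway", "train_tunnels"}

def flashlight_enabled(area):
    """Level logic, not the player, decides whether the light can be on."""
    return area in FLASHLIGHT_AREAS

def environment_budget(area, base_budget=100, flashlight_cost=30):
    """Rendering headroom left for environment fidelity in this area.

    Where the flashlight is known to be off, the designer gets the
    shadow-casting light's cost back to spend on the world instead.
    """
    if flashlight_enabled(area):
        return base_budget - flashlight_cost
    return base_budget
```

Because the toggle is authored per area rather than left to the player, the cost of the shadow-casting light only has to be reserved in the areas that actually use it.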

a2_04_flashlight2

Shortly after adding the flashlight to the game, we discovered that
playtesters would frequently brace their gun hand with their flashlight hand.
This meant that players had to orient their flashlight hand
awkwardly to align the light with their aim.
We decided to make this an automatic behavior of the off hand
whenever it was brought close to the grip of the gun.
If you do this, you'll see that the off hand grips the gun
in a natural pose and the flashlight aligns with the aim.
In fact, if you look closely at the end of the flashlight,
you'll notice that it has a ball joint.
This allows the flashlight to articulate based on whether
the off hand is gripping a gun or not.
The ball joint has four different orientations:
one for each gun and a neutral orientation for when the player
is not supporting the gun to aim.
These four orientations ensure that the flashlight is always
pointed in an ideal direction for the player.
In one-handed mode, the flashlight attaches to the gun hand
at an orientation that is ideal for all scenarios.

a2_04_sandwiches

When the writers create a new character, we try to write a variety of
material from that character's perspective to help find their voice.
What motivates them? What do they sound like?
How do they relate to other characters?
It's helpful to know the answers to questions like this early in a project,
particularly when working with a team, where different people may end up writing for the character.
The 'Club Sandwich' story that Russell tells in this
section started out as one of these writing exercises.
The writing team wrote pages and pages of Russell telling Alyx stories
about life before the Combine, getting irritated with her,
telling bad jokes, pretty much the whole emotional spectrum.
As we refined Russell's character, almost all of it fell by the wayside.
We’d needed to do that work to get to Russell, but most of that didn't sound like Russell anymore.
Later in the project, we decided to make Alyx scared of the dark.
So we needed Russell to talk a lot in this section to keep Alyx company.
It didn't matter what he was saying, it just had to be this voice in
the darkness that she was grabbing on to.
And then we remembered:
'Hey, didn't we write a rambling sandwich monologue a year ago?'
And the Club Sandwich speech was resurrected.

a2_05_budgeting

When building a Half-Life game, we try to start playtesting as soon
as we can, even when we only have 10 or 15 minutes of gameplay built.
We find it extremely helpful to see how players are reacting to what we've made as soon as possible.
One of the major ways playtesting impacted Half-Life: Alyx was in revealing
how players reacted to the density of interactive elements in the world.
Before we began, we were already aware that players would likely spend
more time paying attention to the details of the world around them in VR,
but we were unprepared for how significant this would be.
Seeing some players spend thirty minutes messing around with a single interactive object really opened our eyes.
As a result of those playtesting observations, we spent much of our
development time focusing on the environment around the player,
and the level of detail and interactivity to be found there.
It might seem surprising that we didn't start with the assumption that
we would need to spend that amount of time on the world, but the reality
of game development is that we always have a finite amount of time
to build the game and we're always making trade-offs.
In the ideal case, we spend our finite time on whatever
contributes the most to the player's experience.
So, playtesting early, and thereby getting a better sense of what elements
are contributing to the player experience, helps us understand where to spend the limited time we have.
In this case, those early playtests told us to significantly increase
the amount we should budget for creating world detail and interactivity.

a2_05_barnacle_sounds

Barnacles are a unique enemy type in Half-Life: Alyx in that they are
static and often situated above the player's natural view direction.
During testing in confined or dimly lit spaces, we often saw players walk into
Barnacle tongues without noticing them.
One way we addressed this was with audio.
In particular, we made idle Barnacles emit subtle breathing,
croaking and grumbling sounds like the following.
These sounds helped alert players to the presence of Barnacles and also made
them seem like living creatures that were always active, not just a mechanical
piece of gameplay that responded only to the player's actions.
The Head Related Transfer Functions available in Steam Audio were especially
helpful, as they made it possible to accurately spatialize Barnacle sounds
so that the players could tell that the sounds were coming from above.

a2_05_manhack_introduction

Early incarnations of Manhacks were frustrating to fight in VR, since
their audio was not providing enough information about their location.
They seemed to appear from nowhere and, even when close to the player,
they lacked a dramatic presence.
We tried a variety of layered sounds with varying distance falloffs but
this often created more confusion, especially when multiple Manhacks were nearby.
Our ultimate solution to the problem consisted of two parts.
The first was to make Manhacks emit the following alarm-like warning sound
when they first came into range or were deployed by a Combine soldier.
The second part of our solution was to add more aggressive looping sounds with short falloff distances.
The short falloff of these sound loops meant that this specific component
of Manhacks' sound treatment would only be audible quite near the player.
Thus, when players heard these sounds, they knew that Manhacks had begun
hunting them, and one or more was getting close enough to be of concern.
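The short-falloff loop idea described above can be sketched as a simple linear distance-gain curve. This is an illustrative sketch only; the function name and the 1.5-meter cutoff are our assumptions, not values from the game.

```python
def loop_gain(distance_m: float, falloff_m: float = 1.5) -> float:
    """Linear gain for a close-range Manhack loop: full volume at the
    source, silent at or beyond the falloff distance, so the loop is
    only audible when a Manhack is close enough to be a threat."""
    if falloff_m <= 0.0:
        return 0.0
    return max(0.0, 1.0 - distance_m / falloff_m)
```

A Manhack two rooms away contributes nothing from this layer, while one at arm's length plays it at nearly full volume.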

a2_05_exterior_pacing

Before final art was added, the cramped, dark areas of this level did not take players very long to traverse.
But as the fidelity of the game increased, players spent more and more time getting to this point.
As we've learned over the years, too much time in dark,
underground areas can become fatiguing to players, and this is even more significant in VR.
This outdoor section provides a change of pace and a brief respite from the dark.
Not only does it give players something visually different and interesting to look at,
it allows them to clearly see and assess threats in an open space,
allowing them to get some easy kills with little stress.
It was also important to provide some variety here,
as players are about to head into a dark area that requires precision shooting and flashlight usage.

a2_05_xen_barrels

This section, with a multitude of explosive barrels embedded in Xen
overgrowth, was designed to switch up player expectations
and change the pacing of the level.
Prior to this point in the game, explosive barrels
and canisters were used as offensive weapons.
In this space, they are deadly to the player;
if a barrel or canister does explode, it sets off a fatal chain reaction.
This change in priority forces players to aim
carefully and adds intensity without combat.
This has the positive side effect of training
players to utilize the pistol even more effectively.
While the automatic alignment of the flashlight with a braced weapon is
useful in direct combat, in this section we observed players taking
advantage of the decoupled weapon and flashlight as they
illuminated and targeted Barnacles in the intricate Xen foliage overhead.

a2_05_volumetric_lighting

We decided early on that the environments we wanted to portray in Half-Life: Alyx
would benefit from an investment in high quality volumetric lighting.
This component of illumination is an indispensable tool not just for creating
atmosphere, but for drawing the player's attention to critical gameplay elements.
Our system works in three phases.
In the first phase, illumination from static light sources is captured in a
three-level hierarchical volumetric clipmap surrounding the player.
The highest-resolution level of the clipmap measures one cubic inch per voxel.
The clipmap is updated incrementally, less than once per frame,
as the player moves throughout the scene.
In the second phase, a compute shader ray-marches in view space throughout
the clipmaps, accumulating static lighting, dynamic lighting and volumetric extinction.
The ray marching is performed every frame and employs temporal anti-aliasing to improve quality.
In the final phase, we re-sample lighting and volumetric scattering
from the prior phase for each eye, once per frame.
The result is a texture that describes, for any given view-space depth,
volumetric scattering and extinction, which we apply during forward rendering.
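The accumulation step of the ray-marching phase can be sketched as follows. This is a heavily simplified, single-channel illustration assuming per-step clipmap lookups have already been done; the function name and the Beer-Lambert attenuation model are our assumptions, not the shipped shader code.

```python
import math

def ray_march(samples, step):
    """Accumulate volumetric scattering and extinction along one view ray.

    `samples` is a list of (in_scattered_light, extinction) pairs looked
    up per step from the lighting clipmap; the result is one (scattering,
    transmittance) entry per depth slice, ready to be applied during
    forward rendering.
    """
    transmittance = 1.0
    slices = []
    for in_scatter, extinction in samples:
        # Light scattered toward the eye in this segment, attenuated by
        # all the medium in front of it.
        segment = in_scatter * transmittance * step
        total = (slices[-1][0] if slices else 0.0) + segment
        # Beer-Lambert attenuation through this segment's medium.
        transmittance *= math.exp(-extinction * step)
        slices.append((total, transmittance))
    return slices
```

The per-depth results correspond to the texture described above: for any view-space depth, the renderer can read off how much scattered light to add and how much the scene behind is dimmed.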

a2_06_combine_sighting

We always introduce new enemies at a distance and never
while the player is under duress.
Here, the player is able to observe the Combine soldiers killing some
wandering zombies, giving the player an idea of how powerful
the soldiers are and how they might operate in combat.
We located the Combine far enough away that it was plausible that they
would not notice the player, but close enough that the player could
choose to engage but remain relatively safe from return fire.

a2_06_single_combine

The first encounter with a Combine soldier
caused a number of issues in playtesting.
The biggest one was that players were often surprised
when the soldier started shooting at them.
Some players who were new to the Half-Life series
had no idea who these soldiers were.
But it's also been hours of game time since players saw another human,
and we found that many players weren't willing to immediately start shooting.
Some players also wondered why Alyx moved so quickly to killing other humans.
This area was designed to demonstrate that Combine soldiers are hostile
and will attack on sight, giving the player a reason to want to shoot back.
It starts with the dialog preceding the encounter, where the player overhears
the Combine discussing the problem of scavengers in the Quarantine Zone.
Then, when the player finally achieves visual contact with
their first Combine soldier, they witness that soldier
in the process of executing a civilian scavenger.

a2_06_first_combine_skirmish

Squad soldier AI has been a mainstay of the Half-Life series, so we were
excited to see how their design would be impacted by the move to VR.
Our early playtesting against the Half-Life 2 Combine soldier
AI in VR immediately made a number of things clear.
First, players did not like to be shot at, and often had a visceral negative response.
The soldiers frequently ran at players at full speed, firing all the while,
and we discovered that the speed of Half-Life 2 combat was simply too overwhelming in VR.
Also, players took longer to do everything in VR, especially the core
action of spotting and aiming their weapon at an enemy while staying in cover.
As a result, the soldiers tended to live much longer in VR than they did in Half-Life 2.
It was clear from these early playtests that the most common reaction
among players was to immediately seek cover from enemy fire.
So we started experimenting with changes to the soldier AI, aimed at
building a combat experience around players' tendency to seek cover.
Early experiments were focused on keeping Combine soldiers at a comfortable
distance from the player, while keeping them moving around enough
to not have combat become too static.
To create lulls in firing for players to peek out from cover,
we prevented soldiers from shooting while on the run in most cases.
To keep players from remaining stationary, we created a concept of 'flushing',
where squads will attempt to flush the player out of cover,
and we gave each class of soldier a different method of flushing out the player.
But even in these early tests, it was also clear that players weren't capable
of fighting soldiers until they'd gotten skilled in a number of basic combat
actions, like reloading their gun without having to focus all of their attention on it.
So we were careful to slowly introduce the various types of soldiers,
and even the individual features within the soldier classes.
For example, these early fights against Combine soldiers only feature the
basic Grunts, and these particular Grunts don't carry grenades.
This means they don't actually have the capability to flush a player out of cover.
We found it was important in these first fights to give players all the
time they need to learn to use cover effectively, before we start
introducing enemies that could try and flush them out.

a2_06_train_crash

At this point in the game, we knew we needed a big set piece: a non-combat
moment to reward players for fighting their way through the dark, scary tunnels.
Not only that, but the pace of the story was leading up to its most
important moment thus far: the rescue of Eli Vance.
Crashing the Combine prison train that was transporting Eli felt like the
exact right thing to accomplish these goals and it gave us the opportunity to create some VR spectacle.
We wanted to achieve the cinematic chaos of causing a train wreck,
while giving the players an active role in the event.
Our first attempts to design the scenario were a challenge.
It turned out that directing the player's attention wasn't as simple as
ensuring they were in the right position through gameplay.
In our initial tests, players frequently failed to see some or all of the spectacular train wreck!
The crash was one of the first destruction sequences we put into the game,
so we had a lot to learn about player behavior in VR and how to
direct the player naturally to immerse them in the chaos.
On top of that, to continually hit our target frame rate, we needed to create a
cascading series of events, so that only some of the destruction was happening at any given time.
Sound design, level design, effects and animation had to be carefully
coordinated to lead the player's attention through each stage of the wreck.
For example, at the start of the crash, we used the sounds of train cars
piling up in the distance, encouraging players to look in that direction
before one of the train cars lifts up and blasts through the concrete connector.
At the end of the sequence, the smokestack collapse is used to direct the
player's attention to their eventual path forward, and we wanted
players to witness its destruction.
Before bringing down the smokestack, we gave players a brief moment of
rest and, again, used sound to draw their attention to the smokestack.
Initially, we used a rumbling sound preceding its collapse, but we found
this to be too subtle in practice and instead added the transient explosion
sound which drew player attention towards the smokestack more reliably.

a2_06_eli_rescue

For the rescue of Eli, we wanted to create a tense 'just out of reach'
moment where the player physically saves him.
But, for the rest of the story to work, Alyx and Eli had to
take different paths at the end of the scene.
We tried a few different approaches, including a version where Eli was hanging
from a ledge just in front of the player and the player
could reach down to attempt to grab Eli’s hand to pull him up.
In practice, however, players' hands would collide with the real world floor.
Not only did this make it impossible to reach Eli,
it took players out of the moment, wrecking the emotion of the scene.
Since Eli and Alyx had to take different paths, we tried an approach
where Alyx rescued Eli, but he was injured and couldn't proceed.
Unfortunately, many players would stick around, not wanting to abandon him.
Eventually, we arrived at the solution that shipped in the final product.
Separating Eli from the player with the Vortigaunt’s rescue made for a much
more meaningful return and fulfillment of the Vortigaunt's promise to help Alyx.
The datapod that Eli found was an addition that not only let us integrate
Eli back into the story later but also gave him something to do so there
was a much stronger reason why he would not accompany the player on their journey going forward.

a3_01_melee

Early on in the project, we spent some time experimenting
with the classic Half-Life crowbar.
Playtests showed us that it was an interesting tool for our player to use in
puzzles and world interactions, but we weren't very happy with how our melee combat efforts turned out.
We felt it was really important that the crowbar was solid to the
world and enemies, and that resulted in some tricky physics challenges.
Players could easily push the crowbar into an enemy in front of them,
because nothing in the real world would stop it.
Indeed, without that real-world physical feedback,
we found playtesters exhibited a wide variety of swing behaviors.
Some would waggle the controller rapidly, others would put their whole body into it.
At the same time, playtest response to our pistol prototype was very positive,
especially around the more complex reloading mechanics.
Faced with a potentially significant R&D problem in melee combat,
we chose to refocus our efforts on ranged combat.
It also didn't help that players all associated the crowbar with Gordon Freeman, not Alyx Vance.

a3_01_vault_view

This entire section of the map, from the start of the keycard puzzle up to
this room, was added specifically to show the player the vault on its own.
Without this little detour, players would otherwise have their first view
of the vault at the entrance to the Northern Star Hotel,
which was also when they would see their first Combine substation.
So much new information all at once led many players to confuse the vault
with the substation or just miss it entirely, so we created this area
of the train station to highlight the vault on its own.
Even though players are required to enter this room to get the
second keycard, they still wouldn't necessarily notice the vault outside,
so we added some dialog with Russell about looking out the window to see it.
This may be a bit heavy-handed, but it was necessary to solve the problem
of players not recognizing the vault when they saw it later in the game.

a3_01_throwing

Although it may seem obvious that throwing would be a trivial interaction to
support in VR, a lot of nuanced work is required to get it to feel right.
And if we were going to build puzzles like this one, where the player must
throw an armed grenade through a high duct, throwing needed to work well.
A typical VR game running at 90 frames-per-second will read the
sensor data from the tracked controllers once per frame.
This means the data is read every 11 milliseconds.
That may seem like a short time step, but when throwing, a user can easily
move their hand 15 cm or more in that amount of time.
Additionally, the player may release the object by opening their hand or
releasing the trigger at any point during their throwing motion.
This makes reconstructing the act of throwing a virtual object
an interesting signal processing problem.
It's up to any given piece of VR software to derive the player's intent
from the discretely sampled controller data and we've learned a lot
over the years about what works and what doesn't.
In The Lab, which we shipped in 2016, we sampled the controller's
positional data every 11ms and used the sampled positions to
compute the controller velocity at each time step.
We then used an average of the computed velocity and direction over several
frames before and after the object release to determine the
velocity and direction to apply to the thrown object.
Although this was functional enough to ship, it didn't always feel consistent,
especially if the game's frame rate ever fell below 90 frames per second.
Later, in the Index Controller Tech Demo Moondust, we expanded on
this algorithm by biasing towards higher velocity data.
We also extended the SteamVR software interface to provide the controller's
instantaneous velocity and acceleration data from the hardware
Inertial Measurement Unit, which is computed on the controller every 2 milliseconds.
This meant that we did not have to calculate an estimate from the positions
sampled by the engine only every 11 milliseconds, as we had done in The Lab.
This was crucial to providing a more responsive feel, as it meant the
throwing calculations were no longer tied to the rendering frame rate.
After further experimentation, we arrived at the solution used in Half-Life: Alyx,
which takes ten frames that preceded the release of the held object and
averages the three frames bracketing the peak controller velocity in that time window.
We have found that this tends to lead to a better approximation of
precisely when the player intended to release a thrown object.
This approach resulted in consistent, comfortable throwing, enabling us to
rely on players being able to accurately throw grenades, or any other objects, with confidence.
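The peak-bracketing average described above can be sketched as follows. The ten-frame window and three-frame average come from the text; the function name, vector format, and handling of edge cases are our illustrative assumptions.

```python
def release_velocity(frames):
    """Estimate throw velocity from tracked-controller samples.

    `frames` is the list of per-frame velocity vectors (vx, vy, vz)
    preceding the release.  Within the last ten frames we find the
    sample with the peak speed and average it with its two neighbors,
    which tends to approximate the moment the player actually intended
    to let go of the object.
    """
    window = frames[-10:]
    speeds = [sum(c * c for c in v) ** 0.5 for v in window]
    peak = speeds.index(max(speeds))
    # The three frames bracketing the peak (clamped at the window edges).
    lo = max(0, peak - 1)
    hi = min(len(window), peak + 2)
    bracket = window[lo:hi]
    n = len(bracket)
    return tuple(sum(v[i] for v in bracket) / n for i in range(3))
```

Averaging around the peak, rather than around the (noisy) moment the trigger was released, is what makes throws feel consistent even when the hand opens slightly early or late.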

a3_01_optional_toner

This optional toner puzzle is intended to train players not to
blindly click on each junction without first understanding the implications.
This particular puzzle is designed to have the side effect of tripping a
fire alarm if the player merely clicks on each toner junction
they encounter without looking ahead.
An observant player can easily solve the puzzle without tripping the alarm,
but a careless player will trip the alarm, causing a Combine soldier
to come storming around the corner to investigate the noise.

a3_01_substation_music

Typical players spend up to two hours progressing through the interior
spaces of the Northern Star Hotel and we found in playtests that they
often had trouble remembering their next goal of disrupting
the Combine substation above.
In earlier tests, the Vault constantly reminded players of its
presence by emitting a repeating abstract musical pattern.
This pattern worked so well at drawing and keeping the player's attention
that we expanded and fictionalized it as a semi-musical sound design
pattern which we changed to emit from the Combine substation representing
it collecting and transmitting Vortigaunt energy to the Vault.
While in the hotel, the player does not have line of sight to the substation,
so we repeat this sonic pattern at varying volumes within the hotel
as a reminder of both the Combine substation and the Vault outside.
In fact, if you look closely at the Xen foliage when you are in the Hotel,
you will notice that it reacts visibly to the overspill of Vortigaunt energy as it flows from the substation.

a3_02_liquid_bottles

Given the large number of bottles throughout the game and the fact that they
feature so significantly in the distillery gameplay, we decided to make
interacting with them as interesting as possible using a procedural liquid effect.
Each bottle is assigned a random fill level, and values for the amount of
liquid agitation and foaming are calculated based on the bottle's movement.
This is stored in the object's tint color, which is otherwise unused.
The shader calculates the liquid level based on the bottle's orientation in game.
Since the amount of liquid visible in a bottle, as a proportion of its height,
is different when the bottle is upright, sideways or upside-down, different values
are used for each orientation in order to preserve the apparent volume of liquid inside.
The liquid's surface is then displaced by a number of sine waves,
modulated with the agitation value to create rippling waves.
The areas determined to contain liquid, as opposed to air,
take on the color and refraction index values of that liquid.
Bubbles and foam effects are added based on the agitation and foaming values.
These are procedural and can be modified to create a variety of different behaviors.
Several layers of bubbles are created in a local view space relative to
the object, so that they're stable from frame to frame and
can always move upwards relative to gravity.
Artists can set a variety of parameters per liquid, such as color, glow,
foaming potential, wave properties and a meniscus at the surface of the liquid
which curls up around the edges of the volume, helping to
communicate the viscosity of the liquid.
The refracted environment seen through the glass and liquid is sampled from
the same cubic environment map of the local area that's used for reflections,
and so the bottles are effectively opaque and do not have to blend with the
frame buffer in the way that transparent objects normally do.
This was a strategic design choice, as along with storing the physics data
in the tint color it means that the bottles can be rendered together as a batch,
in arbitrary order, which is critical for graphics throughput.
In the end, the refractive liquid effect allows for a variety of interactions
to take place at a fidelity that is especially compelling with tracked hand-controllers in VR.
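The orientation-dependent fill level described above can be sketched as a blend between precomputed per-orientation values. This is purely illustrative: the function, the linear blend, and the uniform-bottle assumption are ours, not the actual shader math.

```python
def apparent_fill(upright_fill, sideways_fill, tilt):
    """Blend precomputed fill levels by bottle orientation to keep the
    apparent liquid volume constant.

    `tilt` is the dot product of the bottle's up axis with world up:
    +1 upright, 0 sideways, -1 upside-down.  For a uniform bottle the
    upside-down fill level is simply the complement of the upright one.
    """
    if tilt >= 0.0:
        # Blend from the sideways value toward the upright value.
        return sideways_fill + (upright_fill - sideways_fill) * tilt
    # Blend from the sideways value toward the upside-down value.
    return sideways_fill + ((1.0 - upright_fill) - sideways_fill) * -tilt
```

A bottle 30% full when upright reads as 70% full when inverted, since the liquid now covers most of the bottle's height near the (downward-facing) neck.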

a3_02_physics_sounds

We knew from previous VR projects that players would expect a high level
of fidelity from their interactions with objects in the world,
and during early testing of Half-Life: Alyx we received comments
that objects were not making the sounds that players expected,
given an object's material, size and assumed weight.
This led us to a more granular approach to physics sounds
than we had used in prior products.
For example, in the past, we would have used a single physics
sound for all objects of a certain size and material.
So we would have had one sound to represent collisions of small metal objects,
another sound for large wooden objects and so on.
For Half-Life: Alyx, we built up a new library of object-specific
sounds so that we could distinguish between similar objects
like a piece of silverware and a small metal can.
Not only do we now have a library of almost two thousand object-specific
physics sounds covering over 160 different objects,
but we can further alter the sound at run time based on an object's
impact velocity and the hardness of the material it collides with.
In some cases, we even layer in sounds representing an object's contents,
such as matches in a matchbox or fuel in a jerrycan, which can be heard when
those objects are shaken by the player or collide with something at high velocity.
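The run-time layering described above might be sketched like this. Everything here is an assumption for illustration: the sound identifiers, thresholds, and the particular severity formula are ours; only the inputs (impact velocity, surface hardness, optional contents layer) come from the text.

```python
def impact_sounds(obj, impact_speed, surface_hardness,
                  soft_thresh=1.0, hard_thresh=4.0):
    """Choose sound layers for a physics impact.

    `obj` maps severity names ('soft', 'hard') and an optional
    'contents' key to sound identifiers.  Harder surfaces and faster
    impacts push the event toward the 'hard' variant, and fast impacts
    also rattle the object's contents (matches in a matchbox, fuel in
    a jerrycan).
    """
    layers = []
    # Harder surfaces amplify the perceived severity of the impact.
    severity = impact_speed * (0.5 + 0.5 * surface_hardness)
    if severity >= hard_thresh:
        layers.append(obj['hard'])
    elif severity >= soft_thresh:
        layers.append(obj['soft'])
    if 'contents' in obj and impact_speed >= hard_thresh:
        layers.append(obj['contents'])
    return layers
```

Dropping a matchbox onto concrete would trigger both its hard impact and its rattling-matches layer, while nudging it across a table produces nothing at all.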

a3_02_xen_art

Breaching the quarantine zone revealed a new hidden secret of the city:
alien plant life from Xen has taken root on Earth and has been thriving in the shadows of City 17.
On the development team, the Northern Star Hotel was referred to
as 'Hotel Xen' before it had a proper name in the story, as it was the first
location we used to explore what Xen foliage could look like and
how it might consume any given environment.
One of the challenges the environment team faced when designing the Xen foliage,
was layering a new look onto an existing environment without
having to rebuild Xen infested versions of every prop.
We relied heavily on texture blending and kit bashing of Xen foliage shapes
to handle most of the big environmental changes when Xen-ifying a scene.
We also added some new features to our level editor, Hammer, which allowed
for more creative manipulation of assets to reduce repetition.
For example, we could pose certain types of meshes such as the Xen tentacles.
These could be placed, scaled, stretched and manipulated to snake through an
environment, giving a handcrafted look without every asset having to be built for a specific location.
These elements were editable in Hammer so they could be adjusted at any time,
but were baked into static meshes when the map was compiled to improve their run-time performance.

a3_02_xen_animation

The Xen flora in this area animate in response to the Vortigaunt
energy pulsing from the Combine substation outside.
Since the substation itself is not visible from inside the hotel,
it was important that we continue to remind players of its presence using audio
and visual cues, and the Xen flora provided a perfect opportunity to do this.
The shaders used on the Xen flora help illustrate the
plants' response to the energy pulses in a few different ways.
For example, the villi and cilia elongate and vibrate,
while many of the other plants pulsate or glow.
The intensity and timing of these effects are driven directly by the sound system,
meaning that the audio and visual reinforcement of the Combine substation pulses are always in sync.
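One common way to drive visuals from audio, as described above, is to track an amplitude envelope with a fast attack and slow release. This is a generic sketch of that technique, not the game's actual sound-to-shader plumbing; all names and constants here are our assumptions.

```python
def pulse_intensity(audio_samples, attack=0.5, release=0.1):
    """Derive a visual intensity envelope from audio amplitude so that
    glow and pulsation stay in sync with the substation's sonic pulses.

    A fast attack makes the glow snap on with each pulse, while a slow
    release lets it fade out gradually after the pulse passes."""
    level = 0.0
    envelope = []
    for s in audio_samples:
        amp = abs(s)
        # Chase the signal quickly on the way up, slowly on the way down.
        rate = attack if amp > level else release
        level += (amp - level) * rate
        envelope.append(level)
    return envelope
```

Because the envelope is computed from the same signal that is being played back, the flora's visual response can never drift out of phase with the audio.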

a3_02_reviver_intro

The Reviver, one of the new enemy types introduced in Half-Life: Alyx,
arose from our observation that playtesters enjoyed searching their
environment in VR much more than they had in traditional games.
In addition to rewarding that desire to search with ammunition and resin,
we experimented with how it could be part of fighting an enemy.
We began prototyping with a fast headcrab variant we called the Runner,
modifying its behavior to run away and hide from the player after it attacked.
We found players really enjoyed looking for a small,
hidden enemy they knew was somewhere within a dense environment.
They also liked shooting at a fast moving enemy that was able
to run under and behind everyday objects, like tables and chairs.
Lastly, players talked about liking the way they got to be the aggressor in combat
for once, since everywhere else in the game they felt like they were always on the defensive.
While feedback was overall very positive, there was one specific problem:
how to encourage players to search for the Runner.
After a few Runner encounters, we found players would start to realize they didn't have to look for the Runner at all;
they could simply wait for it to run away, and then leave the room.
So we started prototyping various ways in which the Runner could become more
dangerous if not found, setting up a time pressure where players had to find
the Runner quickly, or the situation would get worse.
After some experimentation, we landed on the prototype where the Runner could
revive a fallen zombie, and become a new kind of foe,
one that was more dangerous than a Runner or Zombie.
After this change, the Runner was renamed internally to the Reviver,
and we began assembling the various encounters around it.
It took a lot of playtesting to balance out all the various elements of the
Reviver fights, because they involve multiple, separate stages:
Combat versus the Reviver, searching for a hidden Reviver, and combat versus a Revived Zombie.
In the end, we still didn't feel like we'd created enough of a foolproof reason
for players to need to search for hidden Revivers, so we turned the
Reviver's heart into a key, requiring players to collect it.

a3_02_reviver_animation

An enemy's animation can elicit a powerful emotional response from players, especially in VR.
While developing the Reviver, we learned that sometimes a character
performance can have unintended consequences.
In our initial Reviver prototypes, we simply used a blue-tinted
headcrab model with rudimentary electrical effects.
The prototype was successful enough in playtesting that we decided to invest in a
new purpose-built model for the Reviver, with electrical skin and improved effects.
Players identified the new Reviver successfully in the game, but we ran into a new problem:
players didn’t want to fight it any more.
While the blue headcrab was easily identified as something dangerous,
the new model was something that players had to evaluate from scratch
and they didn't necessarily see it as threatening.
In fact, the animations authored for the new Reviver model were
interpreted by many players as friendly or even cute.
Players would often approach the creature to try and pet it or, worse,
refuse to kill it to retrieve the required puzzle piece, blocking their progression in the game.
If they did then realize that they were going to have to kill the cute
creature to proceed, they felt horrible having to go through with it.
To fix this, we re-authored most of the Reviver’s animations,
making it significantly more unsettling and menacing.
We also made the Reviver more aggressive by adding a jump attack if the player got too close.
With these changes to animation and behavior, players once again were fighting
the Reviver as intended and they no longer wanted to take it home as a pet.

a3_02_hlpr

This is the room featured in the very first shot of the Half-Life: Alyx announcement trailer.
It's not an obvious choice for that use - the room is dark, not particularly colorful,
and it's a pretty pedestrian corridor.
Cleaning supplies and laundry baskets are not in themselves attention grabbers.
But we chose this scenario because it very effectively demonstrates
the excitement and tension that we thought was typical of the game.
All game trailers require careful design.
Conveying a game’s core experience in just a few moments is tricky even at the best of times.
But we believe the Half-Life: Alyx trailer had to work harder than most,
to meet a particularly difficult and uncommon set of challenges.
We had kept the existence of the game under wraps, so shipping the trailer was
going to be the first time in 13 years that Valve had shown any work on Half-Life.
On top of that, the trailer had to quickly communicate that Valve had
chosen to make the next installment of Half-Life a VR game.
And although we were very confident in the decision to make a VR game
and in the quality of what we had built, we still worried that revealing
those decisions to the world might cause even our most loyal fans
to make their own decision: that we had truly lost our minds.
Those of us working on the trailer knew it had to make people
so excited that they could put those worries aside.
The trailer also had to clearly convey that the game was not Half-Life 3,
meaning that it did not take place after the events of Episode 2.
Half-Life: Alyx was, instead, a prequel of sorts,
occurring between the events of Half-Life and Half-Life 2.
In addition, the trailer had to show depth, variety, familiar characters,
and enough novelty and surprise to get across to the viewer that this
was really and truly a full-fledged installment in the Half-Life series, not a small VR experiment or a spinoff.
This was a tall order, but the entire game team rallied around
the effort because everyone understood the value of the work.
We’re just as proud of the result we achieved in the trailer
as we are of any other part of Half-Life: Alyx.

a3_03_teleport_immersion

Half-Life: Alyx supports three types of player movement:
continuous movement, shift teleport, and blink teleport.
Early on, one of the properties of blink teleport, and to a lesser degree,
shift teleport, was that players would often refer to the actual mechanic
of teleporting when describing their movement through the environment.
'I teleported to the desk' or 'I teleported into cover' was often how
players would describe their actions and it served as an unconscious
reminder that they were interacting with a game in a specific way.
To address this, we made a couple of key changes early on that significantly
altered the way players recounted their movement, and as a result,
pulled them further into a more convincing virtual reality.
One of these changes was the visual indicator that represented the teleport destination.
We swapped the abstract cylindrical indicator used in The Lab for a pair of holographic feet.
The feet helped provide a sense of physical presence and indicated
the direction the player would face upon finishing a teleport.
We also added animated feet between the player's current location and the teleport target.
These feet illustrated the path the player would traverse,
further helping provide a fiction for their movement through the environment.
Along with the visual interface changes, we also added audio of footsteps
and body movements that would play after a teleport to effectively
summarize the player's movement through the space.
This isn’t something everyone necessarily notices at first,
but it is quite obvious when it’s missing or incorrect.
The footstep audio transformed how players described their movement through
the world: players would often describe their movement in the environment
as walking or running, as opposed to using game terminology.

a3_03_reviver_foreshadowing

Compared to our traditional first person experiences, in VR we found
it was sometimes all too easy to create an intense atmosphere.
This can be used to the game’s advantage but it can also be detrimental in
cases where our focus is on training players in some new mechanic.
In intense situations, players frequently fail to notice or retain critical information.
And as a result, we often had to intentionally reduce tension in
areas of the game where we were introducing players to something new.
The audio of the Reviver here, as the player descends beneath the hotel,
is designed to ease players into the idea that they will be encountering
the creature that they had briefly seen as they entered the hotel earlier.
The Reviver's scream and its shadow on the wall are designed to reduce
tension by indicating the position of the Reviver around the corner, removing any potential surprise.
Telegraphing the Reviver's position and giving players the upper hand
helps set up the resurrection scene, where players want to observe
the Reviver instead of shooting at it immediately.
This is critical, because this is the first time the player sees the
Reviver reanimating a zombie and if players are too distracted to take it in,
they won't be successful in the upcoming fight.

a3_03_reviver_training

One of the challenges with the Reviver’s resurrection mechanic was training
players that the Reviver was the target and the zombie was
effectively just an inert vehicle that could not be damaged.
This was a significant departure from what we had been training players to expect.
Up until this point in the game, players would see a zombie,
shoot at it and as long as they hit it enough times it would die.
The Reviver breaks that expectation and, in early tests, players thought the
game was broken when the zombie wouldn’t die or even react to being shot.
We had to make a number of changes to revived zombies to train players that this behavior was intentional.
First, we designed the Revived zombie's introduction animation to use particle
effects and audio to draw attention to the changes that result in this new type of combat.
Second, we substituted in dull particle and impact sounds to make shooting
the zombie feel less impactful than it did in previous encounters with ordinary non-revived zombies.
And, finally, we modified the revived zombie so that it doesn’t play any
flinch animation or respond to bullet impacts, further communicating to
players that its behavior is different.
We also used the Reviver itself to illustrate the new combat experience.
Electrical effects were added to indicate the Reviver’s movement inside
the zombie, with a bright pop and glow as it emerged at a new location on the zombie's body.
With these new visuals, and satisfying impact sounds and screeches from the Reviver when hit,
playtesters were no longer confused, and consistently enjoyed the new combat mechanic.

a3_03_reviver_ground_attack

In our early designs, a revived zombie was just an invincible
zombie with a moving vulnerable spot.
Playtesting revealed this to be less tense and threatening than the Reviver
on its own, and since we wanted the revived zombie to be a higher-stakes
adversary, we needed to enhance its offensive capabilities.
As a natural extension of the electrified nature of the Reviver,
we chose to create an electrical field around the revived zombie
so that players who got too close would take heavy electrical damage.
This worked well for many playtests, until we observed a
player who was a proficient real-world marksman.
This player was able to easily defeat the revived zombie
without ever being threatened by it.
In response, we gave the revived zombie its electrified ground counterattack,
which forces players to change positions to avoid damage.
This created a cadence of attack and retreat that was more tense and satisfying for all players.

a3_03_elevator_comfort

Since a VR game must always be rendered from a virtual camera that matches
the player's own point of view, we have limited opportunities to control what the player sees.
Elevators are one of the few tools we have in VR to direct the player's view.
A rising elevator like this one can be used much like a crane shot in film,
moving from one elevation to another,
revealing a beautiful vista or dramatic story beat.
In VR, elevator rides can be challenging to make comfortable, as the motion
of the virtual elevator is inconsistent with the player's own vestibular system.
We found that we could make the elevator rides in Half-Life: Alyx comfortable
enough for most players by moving the elevator slowly and limiting
how much the player could see out of the elevator.
This particular elevator was initially far more open but we gradually
introduced more and more visual obstruction to the design so that the player
always had some significant portion of the elevator in their field of view.
This meant that we were able to get the dramatic reveal of the hotel
exterior and Combine substation we desired, but players were comfortable
since they could always see the elevator, which was in their own frame of reference.

a3_03_substation

After exiting the elevator, the player gets their closest view yet of
the Combine substation attached to the exterior of the hotel.
While the Combine architecture can veer into the abstract, it's often
at its most successful when contrasted with a human structure.
Beyond this, we always try to drive the visual design with an invented
function, ideally to support the gameplay or story; in this case, the routing
of some unusual power source to the vault from various parts of the city.
The player doesn't know it at this point, but this substation is designed
to essentially torture the song out of a group of Vortigaunts who are housed in the attached pod farms.
The output is collected from the Vortigaunts before being 're-harmonized' by
a single 'conductor' Vortigaunt via the cable tuner apparatus seen at
the root of the transmission cables that run up to the vault itself.
This design gave us the opportunity to subsequently meet another Vortigaunt,
and acted as enough of a constraint to allow us to design an interaction
that would physically compromise the substation as the player exits the hotel in the next map.

a3_04_xen_response

Xen foliage has been designed to respond to the player in a number of ways.
In dark areas, where the flashlight is active, some Xen
flora will animate or expand in response to the light.
These Xen plants respond to the proximity of the player's hand by glowing and pulsing.
The glow effect in particular is rendered with an offset using the correct
binocular perspective for each eye, which gives the impression
that the glow originates in the interior of the organism.
The Xen flora's shader effects that rely on hand proximity take their
input from a dynamic voxel field which follows the player at all times.
The player's hands effectively draw motion trails into this voxel field,
which can be sampled by the foliage's shaders to drive deformations and other shading effects.
If the player moves their hands quickly, they produce motion trails in the voxel field.
If they hold their hands still, they produce a radiating force.
These different types of motion characteristics allow us to create
different behaviors in the foliage to give it a bit of extra personality.
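The voxel-field approach described above might be sketched like this. This is an illustrative reconstruction, not Valve's implementation: the class name, cell size, decay constant and sparse-grid storage are all assumptions.

```python
# Hedged sketch: a sparse voxel field that records hand motion trails,
# which foliage shaders could sample to drive deformation.
# All names and constants are illustrative assumptions.

class VoxelField:
    """Sparse 3D grid of 'energy' values that follows the player."""

    def __init__(self, cell_size=0.05, decay=0.9):
        self.cell_size = cell_size
        self.decay = decay   # per-tick falloff of stored trails
        self.cells = {}      # (i, j, k) -> energy

    def _key(self, pos):
        return tuple(int(c // self.cell_size) for c in pos)

    def inject_hand(self, prev_pos, pos):
        """Deposit energy proportional to hand speed along its motion."""
        speed = sum((a - b) ** 2 for a, b in zip(pos, prev_pos)) ** 0.5
        key = self._key(pos)
        # Fast motion leaves a strong trail; a still hand radiates gently.
        self.cells[key] = self.cells.get(key, 0.0) + max(speed, 0.01)

    def tick(self):
        """Fade old trails so the field tracks only recent motion."""
        self.cells = {k: v * self.decay
                      for k, v in self.cells.items() if v * self.decay > 1e-4}

    def sample(self, pos):
        """What a foliage shader would read at a given world position."""
        return self.cells.get(self._key(pos), 0.0)
```

A fast hand sweep deposits strong values along its path, while a stationary hand keeps topping up a single cell, which matches the two behaviors (motion trails versus a radiating force) the commentary describes.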

a3_04_particle_effects

Particle effects present an interesting challenge in VR due to stereopsis,
which is our ability to discern the depth of objects from two slightly
different views of the world we see from our two eyes.
Traditional particle systems, as seen on the left, were typically
implemented as a series of flipbook images displayed on relatively
large polygons which orient to face the player.
Each series of images would effectively contain a little movie of a more
complex phenomenon, like a wafting puff of smoke, or a flickering flame and so on.
For the monoscopic rendering used in non-VR systems, this approach holds
up well and is widely used across the industry.
Because of the stereo rendering used in VR, however, players can easily
see the flat cards, resulting in the unconvincing effect seen on the left.
To address this, we had to update our particle rendering techniques for VR.
The particle system on the right is an updated VR-friendly version of the older effect seen on the left.
One technique we used on large cards like the ones used in this smoke effect
was to create some drag on the cards' reaction to the player's head movement.
To do this, we created another particle system which lagged the player's
position and was used as the player camera for the facing direction of the smoke cards.
By removing this one-to-one relationship of head motion to sprite
rotation we were able to give the impression that the smoke was volumetric.
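The lagged-facing trick above amounts to smoothing the camera position that the billboard cards orient toward. Here is a minimal sketch of that idea; the function name and smoothing constant are illustrative assumptions, not the shipped particle system.

```python
# Hedged sketch: lag the camera position used for particle billboard
# facing, so smoke cards do not rotate one-to-one with head motion.

def lagged_facing_position(lag_pos, head_pos, smoothing=0.1):
    """Move the virtual 'facing camera' a fraction toward the real head.

    The cards orient toward this lagged position instead of the head
    itself, which removes the one-to-one head-to-sprite rotation that
    gives away flat cards under stereo rendering.
    """
    return tuple(lp + smoothing * (hp - lp)
                 for lp, hp in zip(lag_pos, head_pos))
```

Calling this once per frame makes the facing position trail the head and converge on it only when the head holds still, so quick head motions no longer swivel every card in lockstep.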
In addition to using more 3D elements in the creation of our effects,
we also often used a post-processing bloom on particles to help visually
merge elements that otherwise would appear as a set of flat planes.
This made the cards harder to stereo fuse, which gave them a more diffuse, volumetric feel.

a3_04_darkness

One of the tenets of Half-Life game design is creating crafted
moment-to-moment experiences that players participate in as they progress through the game.
Sometimes, this means playing with emotional responses to turn
something undesirable into a memorable highlight.
This collapsed section of the hotel features this type of experience.
Players are often instinctively pulled to the safest-looking location
in a given area, and here, that would be the well-lit corridor above.
But when they get there, they find their way blocked by a Combine power shield.
The power cable to shut down the Combine forcefield leads them
back down into the poison headcrab-infested darkness below.
A place they really didn’t want to go.
These larger gameplay spaces also need to be resilient to player choice.
So even if a player chooses not to head up to the well-lit corridor,
they likely will still have the same trepidation about heading down into the darkness below.
After the player has braved the darkness, and sheltered themselves in the
lit room where they're safe from the wandering headcrabs, they're faced
with the daunting task of cutting the power to the Combine forcefield,
because this has the side effect of plunging the area into darkness.
Not only does that make the situation worse, since the player must make
their way out of the pitch blackness with only a flashlight,
but the path to the exit is back through all the poison headcrabs.
We spent a lot of time tuning this experience through careful playtesting and evaluation.
And, often, when recounting their travels through the hotel in post-playtest
discussions, players would recall this moment of being plunged into darkness
as a high point, managing to combine a feeling of amusement with the overall sense of dread.

a3_04_ammo_balancing_1

A common game design problem is that of resource placement,
and this spot in the game is a good example.
Players have just left an area where they acquired the Combine Submachine Gun.
Whenever they get a new weapon, we try to give the player the
opportunity to experiment with it right away.
So, right after the SMG, we made sure to give the player a bunch of ammo for it,
and a supply of not-too-threatening enemies, in the form of confined
zombies and wandering headcrabs, to experiment on.
But playtesting has shown us that there's a wide range of ways
that players respond to that setup.
Some shoot a headcrab or two and move on, while others take their time
exploring the weapon, even using up all the ammunition we gave them.
Now, after the headcrabs, the player is heading toward a new area,
which features one of the hardest combat arenas in the game,
where they must fight two Revivers at the same time.
How much ammunition can we expect the player to have here?
How much should we put in this area between the two arenas to ensure all
players will have enough to fight the two Revivers? If we don't put in enough,
a player may be in a position where they cannot succeed at the upcoming encounter,
and reloading their savegames won't help them.
They would need to go back and re-play the previous area,
this time conserving more ammo.
That's not something we want to happen, nor is it something
we can even expect all players to realize.
Instead, many will repeatedly try and fail to beat the two Revivers.
So at its core, the problem with resource placement is that the dynamic
range of resources carried by players can be significant, and it gets even wider as the game progresses.
In Half-Life: Alyx, we would see some players end the game with barely
any ammunition, while other players had over 1000 unused rounds.

a3_04_ammo_balancing_2

So how can we handle resource placement when there's such a wide
dynamic range in player collection and usage patterns?
Like many games, we opted for a dynamic system that looks at the
player's current resource counts, and makes decisions based on them.
But the design of these kinds of systems can be tricky,
because they tend to affect player behavior.
For instance, in Half-Life 2, we placed breakable crates throughout the
game which dynamically spawned resources, based on the player's current health and ammo levels.
This had the advantage of allowing us to not need a huge amount of
playtesting for resource balancing, because as long as there were
enough crates around, the player would always have enough resources.
But because that system exists, we didn't have to spend much time on
resource placement, and as a result, we didn't really craft
challenges around resource collection.
Players also learned that they didn't need to really care about their
resource levels, because the crates were always going to top them up on whatever they needed most.
For Half-Life: Alyx, we wanted to revisit those assumptions.
We knew that searching for resources in a VR environment was something players really enjoyed.
So we wanted resource collection to be something players needed to do throughout the entire game.
And that meant we didn't want them to ever reach the point where
they had so much ammo and health that they didn't need to explore.
We were also able to start playtesting earlier in the product development
cycle than we had been able to when making Half-Life 2, which gave us the
confidence that we'd be able to spend the time we needed to carefully place every resource.

a3_04_ammo_balancing_3

For Half-Life: Alyx, we moved to a system that allowed us to craft a
more specific experience, using a dynamic layer that attempts to
reduce the dynamic range of resource gathering between players.
Instead of placing generic resources that switch types, level designers
carefully placed each resource type throughout the world.
Some are easy to find and some are more difficult.
Then, as the player moves forward in the world, the system opportunistically
removes resources in the path ahead, based on the player's current resource levels.
This had the benefit of allowing level designers to have a lot of
control over the player's resource amounts in different areas of the game.
They were able to craft areas where they wanted the player to
be starved of one ammo type, or encouraged to use another.
For example, in this area, designers could provide enough ammo to ensure
that the player would have enough to fight two Revivers, with the knowledge that
the ammo balancing system would remove much of it if the player was already loaded up.
We could safely give the players a large amount of Combine SMG ammo in the
previous area and know that players who ran past the poison headcrabs without firing a shot wouldn't find more here.
There are a number of additional features to address complexities that
result from this system, such as ensuring players always find something
inside a Combine Locker, or ensuring that players are still rewarded for
searching regularly and carefully, even if they're carrying a lot of ammo.
Ultimately, we want a player who is paying more attention to the environment to come out ahead resource-wise.
We just needed some dials to be able to control how far ahead they got.
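The opportunistic removal pass described above might look something like the following. The thresholds, names and placement format are assumptions for illustration; the shipped system is certainly more involved.

```python
# Hedged sketch of the dynamic layer: remove hand-placed pickups in the
# path ahead when the player is already well stocked for that type.

def prune_resources_ahead(placements, player_ammo, target=30):
    """Decide which designer-placed pickups to keep in the next area.

    placements: list of (ammo_type, amount) tuples in the upcoming area.
    player_ammo: dict of ammo_type -> rounds currently carried.
    target: rough per-type level designers want the player near
            (a tunable 'dial', per the commentary).
    """
    kept = []
    projected = dict(player_ammo)
    for ammo_type, amount in placements:
        if projected.get(ammo_type, 0) >= target:
            continue  # player is loaded up; skip this pickup
        kept.append((ammo_type, amount))
        projected[ammo_type] = projected.get(ammo_type, 0) + amount
    return kept
```

A player arriving empty-handed keeps finding ammo until the projected total reaches the target, while a fully stocked player finds little or none, which narrows the dynamic range between the two.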

a3_04_double_reviver

Up until this point, each Reviver encounter has been focused on teaching
players something new about the Reviver so that they can defeat it.
This arena represents the culmination of that training.
There are no new rules.
There’s nothing to learn about the Reviver itself.
By this point, players are armed with the knowledge required to progress.
The challenge then shifts into a combat problem where strategy and execution are required.
Players have to juggle and prioritize multiple targets, utilize all of their
weapons and manage a constrained supply of ammunition in order to prevail and
make their way back out to the exterior of the hotel so they can deactivate the Combine Substation.

a3_05_vort_energy

One of the core sub-plots of the storyline concerns the Combine's
imprisonment and use of the Vortigaunts to trap the G-Man.
The multi-dimensional power of the Vortigaunts to detain the seemingly
invincible G-Man is established in previous Half-Life lore.
Through the use of their prisoner pods and sub-station technology,
the Combine extract the power of the Vortigaunt's Vortessence chant,
tune that energy and then transmit it to the Vault to contain the G-Man.
To communicate this Combine process to the player without explaining
it through lengthy exposition, we used musical sound patterns
to represent the process implicitly.
If you stand near the pods holding the Vortigaunts,
you can hear them groaning their chant in a slow but semi-musical pattern.
Just past the pods is the 'resonator' within which you can
hear a distilled version of the chant energy.
Finally, the cables are activated with the concentrated Vortigaunt energy,
which is harmonized in varying patterns and intermittently
transmitted to the vault once the energy is sufficiently tuned.
If you stand near the cables while they're tuning, you'll
notice the sheer size and height of the sound.
This was achieved by placing the sound emitters for each cable at
the locations of the actual cables themselves.
The sound of the cables covers a broad frequency range which
emphasizes Steam Audio's 'head related transfer function' processing.
That simulation, combined with the feedback of the observer's inner ear as
she looks upwards, gives the same impression of scale that VR is known to give visual objects.

a3_05_substation_vort

Character performance within a VR game has a unique set of opportunities and challenges.
In our efforts to keep control in the hands of the player, we must create a
performance that is interesting enough to maintain players' attention
but flexible enough to allow for player freedom.
This scene has the most face-to-face interaction in the game and there is no
physical barrier between the player and the Vortigaunt when he first appears.
At one point, we played with the idea of a jump scare when the Vortigaunt is
released, upending the player's expectations of what had been
referred to only as a 'battery' up until that point.
Not only was it a bit ham-fisted, but the Vortigaunt came flailing
out of the pod, frequently into the same space occupied by the player.
To solve this problem, we cut the jump scare and slid the pod back out of
the player's space as it opened, to allow for the Vortigaunt to
emerge at a comfortable distance.
We faced a similar challenge when determining how to get both the player
and Vortigaunt to their next locations without intersecting.
We wanted to get the player and Vortigaunt side-by-side overlooking the
vista while the Vortigaunt delivered critical exposition.
It would have been awkward for the player and the Vortigaunt to take the
same route along the catwalk, so we looked at other options, including
exploring ways that the Vortigaunt may have once moved in its native environment.
With long forearms, it's feasible that the Vortigaunts may have moved
around as quadrupeds, so we chose an agile four-legged scramble up to the
higher platform where he can deliver exposition without contending with the player for space.

a3_05_prefabs

While developing Half-Life: Alyx, we had a large number of people
in many disciplines contributing to a relatively small number of maps.
This meant that we frequently had multiple people working on the same map at the same time.
One way that we managed this complexity was to break our
maps into smaller pieces referred to as 'prefabs.'
A prefab could be built out of the same types of geometry, logic,
enemies, particle systems or any other elements as a regular map.
As long as people were working in separate prefabs, they could
make changes or additions without stepping on each other's work.
A prefab could contain a large physical space like this whole
street scene or it could contain only a handful of elements.
Large prefabs that partitioned a map to allow for collaboration
were essential for a project of this scope, but many small,
reusable prefabs were created for Half-Life: Alyx as well.
A good example of this was the set of reusable vehicle prefabs used here and throughout the game.
Each vehicle was authored as a configurable prefab,
enabling designers to quickly place them around the game,
setting a few parameters to customize the vehicle for the context.
Parameters include things like the vehicle's paint color, the amount of
pre-existing damage, whether the doors were open or simply missing and
the type of ammunition or other resources the vehicle might contain.
This sped up our workflow and also meant that updates to a prefab
were automatically propagated out to all of the maps that used it.

a3_05_procedural_displacement

The tarps used by the Combine to cover buildings in the
quarantine zone are animated procedurally.
This animation was added relatively late in development, so it needed to
be both computationally inexpensive and technologically conservative.
So, no new systems.
Simple overlapping waveforms are used to displace the tarp geometry
where it hangs loosely away from the building.
The displacements are oriented along the wind direction,
and their propagation direction is further modified by the
gradients of a texture we referred to as a 'freedom of motion' map.
This map is derived from the shape of the building geometry under the tarp
and is used to attenuate the displacement so that the tarp won't animate in
places where it is meant to be pulled tight against the building.
The result is a flapping tarp that seems to conform to its surroundings
and responds appropriately to the wind.
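The displacement described above can be summarized in a few lines. This is a hedged sketch under stated assumptions: the wave count, frequencies and amplitudes are invented, and the freedom-of-motion value is treated as an already-sampled scalar per vertex.

```python
# Hedged sketch: overlapping travelling waves along the wind direction,
# attenuated by a baked 'freedom of motion' value so the tarp stays
# still where it is pulled tight against the building.
import math

def tarp_displacement(pos_along_wind, time, freedom):
    """Displacement for one tarp vertex.

    pos_along_wind: vertex position projected onto the wind direction.
    freedom: 0 where the tarp is tight, 1 where it hangs loose
             (sampled from the baked freedom-of-motion map).
    """
    # Two overlapping waves of different frequency and travel speed.
    wave = (0.6 * math.sin(2.0 * pos_along_wind - 3.0 * time)
            + 0.4 * math.sin(5.0 * pos_along_wind - 7.0 * time))
    return freedom * wave
```

Because the attenuation is a simple multiply by a precomputed map, the runtime cost is tiny, which fits the "no new systems" constraint mentioned above.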

a3_05_character_movement

The Combine soldier you see here won't fight you.
It's been programmed to run back and forth along the path shown on the ground,
in order to illustrate the stride retargeting system developed for Half-Life: Alyx.
One of the challenges of animating digital characters is making sure
that their feet don't slide along the ground as they move.
Humans are very good at picking out irregularities in the movements of virtual characters,
especially in VR, and Combine soldiers in particular need to look like their
feet are planting and their weight is shifting as they change direction,
otherwise they appear weightless and unconvincing.
In a preprocessing phase, our stride retargeting system analyzes the steps
of each authored animation and stores them as a direction-independent trajectory.
At run time, our system predicts where each foot will land on the next
step based on the motion of the animation,
the character's path and the height of the ground.
The pairs of colored boxes on the ground illustrate this process.
The red boxes show the previous foot positions.
The green boxes indicate the soldier's current foot position and the
blue boxes are drawn where each foot is predicted to land next.
Once the predicted foot positions are calculated, the system guides the
feet to the predicted locations using the trajectory information to
preserve the style of the original animation.
Once the foot lands, it is locked in place until the next step.
In the end, this system was so effective that we could essentially
make any animation walk in any direction, without foot sliding.
You can experiment with the system by moving around the area,
since this soldier has been set up to always face you while running the specified path.
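The prediction step of the stride retargeting system could be sketched as follows. This is an illustrative reconstruction: the constant-velocity path extrapolation, function name and parameters are assumptions, not the actual system.

```python
# Hedged sketch of the foot-landing prediction: extrapolate the
# character's path over the stride duration and snap the landing
# point to the ground height at that spot.

def predict_foot_landing(foot_pos, char_velocity, stride_time,
                         ground_height):
    """Predict where the swing foot will land on the next step.

    foot_pos: (x, y, z) current foot position.
    char_velocity: (vx, vy) planar velocity from the character's path.
    stride_time: seconds until the foot plants, taken from the
                 preprocessed stride data of the source animation.
    ground_height: callable (x, y) -> z, the terrain query.
    """
    x = foot_pos[0] + char_velocity[0] * stride_time
    y = foot_pos[1] + char_velocity[1] * stride_time
    return (x, y, ground_height(x, y))
```

This predicted point corresponds to the blue boxes in the debug visualization; the runtime then guides the foot along the stored direction-independent trajectory toward it and locks it in place once it lands.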

a3_05_trip_mines

Trip mines have appeared in prior Half-Life games and the act of defusing
them seemed like a natural fit for a VR game, where players can perform complex operations with their hands.
By this point in the game, players have already learned how to hack Combine
technology with their multitool, but we wanted defusing a trip mine to be more tense than the hacking puzzles.
To defuse a trip mine, not only does the player have to place their hand near
the laser that could trigger the mine, but they have to perform the task within just a few seconds.
Early versions of this puzzle were inspired by lock picking,
resembling a series of tumblers that the player had to quickly align.
Unfortunately, the tumbler design required players to rotate their
wrists in an uncomfortable manner, particularly given that they
had to also avoid touching the laser beam.
We removed the rotational component but kept the timing element,
leading to the mechanic you see in the final game.
We also experimented with causing the trip mines to actually explode if the
player failed to defuse them, but most players found this too punishing,
especially in cases where a chain reaction of explosions could occur.
Instead, the mines just stay active and the player can attempt to defuse them again.
We did end up keeping this behavior for the hard difficulty setting
but added a warning sound a few seconds before the mine actually explodes,
to give players some time to move away.
We also experimented with allowing defused mines to be removed and
redeployed by the player, but we hadn't designed enough combat scenarios around
this mechanic for it to be satisfying in practice, so this was left out of the final game.

a3_05_combat_pacing

This combat arena that loops back over itself as the player winds through
the tenement buildings was part of one of the earliest test maps built
when we began adding Combine soldiers to Half-Life: Alyx.
One of the lessons we learned from this series of combat encounters was
that fighting Combine soldiers was physically taxing on players,
so we needed to pay careful attention to pacing.
Taking cover, reloading, switching weapons and acquiring targets really
pushed on all of the skills that players had learned up until this point.
To help cut down on fatigue, we added gaps between the Combine encounters
and removed some of the combat altogether to better pace the experience.
The trip mines in the stairwell following this battle are a good example
of one of the pacing elements we use to slow the player down
and let them recuperate before the combat ramps back up again.

a3_05_xen_world_building

It was an interesting challenge for artists and level designers to employ
environmental storytelling to paint a picture of what may
have happened inside the walls of the quarantine zone.
Just like the hotel, these apartments have been overrun by Xen foliage
and are filled with peoples' belongings, abandoned in haste.
In this area, the walls have been torn down, either by destructive
portal storms or perhaps the Combine's cleanup effort.
Barrels of fluid used by workers to try and treat or contain the Xen growth
have been left behind and it would appear that the cleanup efforts have either failed or been abandoned.
By not explaining every facet of the world, we leave players with the
opportunity to imagine for themselves the bigger picture of life in City 17.

a3_05_mannequin_headcrab

Once we had the spine of the game complete, teams went through the maps
looking for additional opportunities to add the character and tone
that players associate with Half-Life games.
This mannequin headcrab was a fun opportunity to hit on
the sci-fi / horror / B-movie vibe.
We had a few conversations about how long to leave the headcrab atop
the mannequin, and decided the experience should be analogous
to catching a small dog in the middle of some mischief.
We wanted players to feel like they had walked in on something they
weren't supposed to see, and for them to go away feeling like every inch
of the game contained something for them to find, if they chose to explore it.
After an early update to the headcrab sounds, where details like breathing
and grunting were added for the first time, we received feedback that the
new sounds were too cute and familiar, moving the headcrab too far in the
direction of a cuddly creature rather than an alien threat.
We updated the sounds to be more threatening, but for this headcrab
we brought back some of the cuter sounds to sell the
headcrab's frustration with the mannequin.

a3_06_pistol_hopper

In order to reduce the amount of reloading required in the heat of battle,
we wanted players to be able to upgrade the pistol by increasing its ammo capacity.
That may seem like a small change, but implementing it in a way that was
intuitive to players and that didn't have a lot of negative side
effects was surprisingly tricky to get right.
Our initial mechanic for increasing pistol ammo capacity was to use double-sized clips.
Once the pistol was upgraded, the player's backpack held clips with twice
as many bullets, and those were represented by a clip model that was twice as long.
But, unfortunately, the longer clips presented a lot of distracting fictional problems.
For one, they looked ridiculous protruding out of the bottom of the pistol.
Players were also left wondering what was going on in their backpack to
transform the single-capacity clips that they put in there into the
double-capacity versions that they were pulling out.
We were also left with the sort of absurd implication that the pistol clips found
in the world now magically all had to be the upgraded double-sized versions.
While we did scrap the double-sized clips, the concept was still enticing.
We just needed to find a better metaphor.
So then we started looking at the hoppers used on paintball guns,
which can store large numbers of paintballs.
In fact, the final upgrade that resulted from this whole line of
thinking is still internally referred to as the bullet hopper.
So, with this in mind, we next needed to design an intuitive interaction model.
Players were already satisfied with our existing pistol interaction loop which was:
eject clip, insert the next one, chamber and then shoot,
so we wanted to preserve these skills that players were already beginning to master.
So, players would still reload the gun as before, but small mechanical prongs
would steal bullets from the inserted clip, pull them into the hopper and
then place them into the firing chamber as needed.
Audio cues were also added to clearly communicate the state of the hopper loading process.
The hopper also impacted the visual design that we used to convey the state of the pistol.
Originally, the clips themselves had a numerical counter indicating the number
of bullets remaining and, on its own, this was fine and players understood it easily.
But the readout on the bullet hopper was on the opposite side of the pistol
and had a pattern of blue dots indicating the amount of bullets remaining.
So these two representations in two different locations made it difficult
for players to quickly determine the total number of bullets in the upgraded gun.
To solve this, we designed a single visual readout on the side of the pistol grip.
It structurally matched the state of a pistol clip,
the pistol's bullet chamber and, if the upgrade was acquired, the bullet hopper.

a3_06_soundscapes

For Half-Life: Alyx, we wanted to expand the functionality of our
soundscape system, which is used to play ambient sounds.
In prior games, we could generally only play one soundscape at a time.
The player would be hearing either soundscape 'A' or soundscape 'B'
with a brief crossfade from one to the other.
In this game, the soundscapes are much more flexible and can be
overlapped in ways that enable more realistic transitions.
For example, the volume of different soundscapes can be controlled by the
player's position and the open or closed state of particular doors in the world.
We frequently use this functionality in areas like this transition between indoor and outdoor spaces.
If you pause outside for a moment, you'll notice the crickets, birds,
distant dog barks and other environmental sounds that help paint the
picture of the quarantine zone late in the day.
As you move into the indoor area, you'll notice the outdoor soundscape
diminish as you move away from the door.
In fact, if you close the door, the volume of the outdoor soundscape is reduced even further.
And as you progress into the building, you'll begin to notice the
indoor soundscape, consisting of the indistinct hum of machinery and buzz of electrical fixtures.
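A minimal sketch of how overlapping soundscape volumes might be driven by the player's distance to a doorway and the door's open/closed state. The falloff distance and closed-door attenuation are invented for illustration; the shipped system supports many soundscapes blending at once.

```python
# Hedged sketch: blend an outdoor and an indoor soundscape for a player
# who has moved inside. Attenuation numbers are assumptions.

def soundscape_volumes(dist_to_door, door_open, falloff=10.0, closed_gain=0.2):
    """Return (outdoor, indoor) volumes in [0, 1] for a player indoors.

    The outdoor soundscape fades with distance from the door, and is
    reduced further when the door is closed; the indoor soundscape
    ramps up as the outdoor one fades.
    """
    outdoor = max(0.0, 1.0 - dist_to_door / falloff)
    if not door_open:
        outdoor *= closed_gain
    indoor = 1.0 - outdoor
    return outdoor, indoor
```

Standing in the open doorway gives full outdoor ambience; five units inside with the door closed, the crickets and dog barks are nearly inaudible while the machinery hum dominates.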

a3_06_suppressor_intro

It's generally a requirement that all new enemy types have a
section of the game designed to introduce them.
We rarely build the introduction early on in the process though,
because it's often through playtesting that we learn exactly what aspects
of the new enemy need to be highlighted in the introduction.
In addition to understanding the capabilities of a new enemy,
we also find introductions help solidify the very existence of
the new enemy type in the player's mind.
Without an explicit introduction, we sometimes find that playtesters
confuse multiple enemy types, conflating them in their minds.
This was the case with the Suppressor, the class of soldier that's
designed to pin a player behind cover while other soldiers advance on them.
The Suppressor is an inversion of the previous soldier classes,
who've all focused on trying to push players out of cover.
The Suppressor introduction stretches across the next two rooms.
In the first, players see the Suppressor firing at zombies,
giving the player a moment to safely observe its firing behavior.
They also get to hear its associated sounds, which are important to learn for future encounters.
In the second room, players learn to fight the Suppressor themselves.
This is relatively straightforward when the Suppressor is alone,
but will become more involved in later arenas, where we combine the Suppressor with other soldier types.

a3_06_environment_art

It isn't practical to spend development time or art resources evenly
across every part of the game, so we rely upon strategic re-use of
resources to maintain fidelity and specificity across a game's environments.
This space was initially constructed using industrial models and
textures seen throughout many previous environments.
To set it apart, we added the large cables and Vortigaunt pods to imply a
makeshift Combine power plant whose purpose was to transmit energy to the substations.
The juxtaposition of this abandoned industrial space with the large Combine
cables created enough visual interest to make the space feel meaningful at relatively low cost.
The cable motif was subsequently added to other parts of the game to
increase visual interest and give the player a sense of being led toward the Vault.
It may not be obvious, but the cables exiting through the ceiling here
continue on to the exterior of the building, across the large construction
courtyard, and over the roof of the Distillery building,
presumably continuing on to a substation or other Combine infrastructure.
Players may not notice that continuation, but such details help guide
us in building a world that feels connected and consistent.

a3_06_combine_ai_in_vr

One problem we faced with the VR movement styles that Alyx supported was
keeping the combat experience from diverging based upon whether
players were using teleport or continuous movement.
If the combat was significantly different depending on the player's
movement setting, our playtesting requirements could explode combinatorially.
One way that we prevented this combinatorial explosion was by ensuring that AIs
could sense player movement no matter which locomotion style the player was using.
To do this, we created a visual proxy for players using teleport locomotion.
Imagine a scenario where the player is behind cover, hidden from a soldier's sight.
If the player used continuous movement to run to another piece of
nearby cover, the soldier would see them while they were out in the open.
But a player who instantly teleported from one piece of cover to the next would not be seen.
To solve this discrepancy, when a player teleports, we leave a breadcrumb
trail of invisible visual proxies, which the enemy AI can see.
These proxies pass information to the AI, similar to what it would have
gathered from seeing the player perform the movement directly.
In this case, the soldier would know the teleporting player had left the
original cover, and run across open ground to the new cover, just like a player using continuous motion.
While obviously not identical, in that the soldier didn't have a chance to
take a shot at the teleporting player, these kinds of features did allow us to
reduce the number of ways our AI logic needed to take player movement options into consideration.
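The breadcrumb idea can be illustrated with a small sketch that interpolates invisible proxy positions along the teleport path, so the AI's normal line-of-sight checks have something to see. The spacing value and the 2D coordinates are assumptions; the shipped sensing code is of course far richer.

```python
# Illustrative sketch: when the player teleports, generate a trail of
# invisible proxy points between the start and end positions. Enemy AI
# can then test visibility against each proxy, as if the player had run
# the path with continuous locomotion. Spacing is an assumed value.

def breadcrumb_proxies(start, end, spacing=1.0):
    """Return proxy points from start to end (inclusive)."""
    (x0, y0), (x1, y1) = start, end
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    steps = max(1, int(dist / spacing))
    return [(x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
            for i in range(steps + 1)]
```

A four-unit teleport at one-unit spacing yields five proxies, including both endpoints, so a soldier watching the open ground between two pieces of cover registers the movement either way.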

a3_06_ladders

For our teleport and continuous locomotion modes, we provide corresponding ladder climbing methods.
By default, players using teleport locomotion can target a ladder and,
after a short timer, be teleported to the top or the bottom, depending on where they started.
The timer was necessary to ensure that players were intentionally using the ladder.
We didn't want them to unintentionally target a ladder and become disoriented when they found themselves at the other end.
Players using continuous locomotion can directly grab ladders to climb up or down.
In this mode, they can grab a ladder rung and move their hand down to raise their body or move their hand up to lower it.
This allows players to move up or down the ladder, rung by rung in a natural way.
This mode turned out to be so popular that we added it as an option even for players using teleport locomotion.
As it happens, most of the ladders in our game do not extend very far beyond the upper landing area.
This made dismounting the top of a ladder challenging for players using continuous ladder climbing since there is very little ladder to grab onto above the upper landing area.
To solve this, we detect when the players stop holding the ladder with both hands and automatically teleport them to either the top or bottom, based on the direction they were climbing.
Unfortunately, we don’t have a lot of explicit ladder training in the game, so some players are surprised by this teleport behavior if they accidentally let go with both hands while climbing.
Nevertheless, this was a better alternative to leaving the players hanging or forcing them to fall down if they let go with both hands, especially since some of our ladders are rather long and it would be tedious to have to re-climb just because a player happened to let go shortly before reaching the top.
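The double-release rule can be sketched as a tiny decision function; the names, and the idea that climb direction is tracked from the last hand motion, are hypothetical.

```python
# Minimal sketch of the ladder double-release rule, assuming the game
# tracks how many hands are currently gripping the ladder and which
# direction the player was last climbing. Names are hypothetical.

def resolve_ladder_release(hands_on_ladder, climbing_up, ladder_bottom, ladder_top):
    """If the player lets go with both hands, snap them to the end of the
    ladder in the direction they were climbing; otherwise keep climbing."""
    if hands_on_ladder > 0:
        return None  # still holding on; no teleport
    return ladder_top if climbing_up else ladder_bottom
```

This mirrors the behavior described above: release both hands while climbing up, and you land at the upper landing rather than falling back down.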

a3_07_distillery_intro

Some of our earliest experiments on the project involved placing old
Half-Life 2 models in a level and walking around them in VR.
These included static models of enemies like Combine soldiers or zombies.
We found that players really enjoyed being able to inspect these models,
because VR let them get much closer than they'd been able to in the past.
Players reported getting a sense of the true size of the characters and
would often point out details that they had never noticed before, even
though they had seen the models many times throughout Half-Life 2.
These reactions led us to the idea of creating an enemy that took advantage
of that experience, one that required players to spend long
periods of time in close physical proximity to it.
But enemies typically don't let the player get near them without attacking,
so we felt it made fictional sense for this enemy to
be blind, and to feature non-combat gameplay as its focus.
Early experiments with guiding a blind creature using sounds were immediately
interesting and gave us a way to create scenarios unique to the product.
Thus, the concept for Jeff, known internally as the 'blind zombie', was born.

a3_07_xen_ears

The Xen flora you see on the wall here, which we refer to as 'Xen Ears'
for obvious reasons, animate in reaction to the same sounds as Jeff.
Early in development, we would observe situations where playtesters
would fail to notice Jeff becoming enraged by a sound.
If the player died as a result, they felt frustrated and wronged.
We worked to make Jeff's animations and sounds communicate his level
of aggression to the player as clearly as possible, but if
players weren't looking at Jeff, they could still be caught unaware.
Adding these Xen Ear growths to the environment was a way for
us to help players be more aware of Jeff's reactions to sound.
The Xen Ears work as an extra layer of visual feedback for Jeff's behavioral state;
one that lives in the environment instead of on Jeff himself.

a3_07_xen_clams

Right from the beginning of development, we intended to feature a
variety of visually distinct Xen flora and fauna in Half-Life: Alyx.
But we didn't yet know how those visual elements would
be used to craft interesting gameplay.
Referred to internally as 'Xen Clams,' the Xen plants you see here were originally
built to add visual flavor to the world and were only lightly interactive.
They could track the player's head and hands,
and would close tight if the player got too close.
When we began working on Jeff, we wanted to find surprising
new ways to use sound as a core gameplay mechanic.
In particular, we wanted to make players wary of the environment around them.
We realized we could use the Xen Clams as a type of noise hazard by having
them repeatedly snap open and shut when approached by the player.
We had already placed these clams in earlier parts of the game,
which meant that players would reach the distillery
with an understanding of how they functioned.
The introduction of Jeff changed the player's relationship with the Xen Clams,
as they went from being harmless but annoying to a dangerous hazard
that players would need to take care to avoid.

a3_07_larry

Early versions of the distillery explored a variety of methods for subtly
teaching players how Jeff works without any explicit exposition.
Though some of the training techniques were quite successful, anything less
than a complete understanding of Jeff's behavior resulted in players bungling
their way through this section of the game and having a terrible time doing it.
Eventually, we determined that explicit training was necessary,
and that's where the character of Larry came in.
Larry acts as a gate, delivering critical information about
Jeff's behavior before allowing the player to proceed.
It's a dense learning environment and a delicate balancing act, ensuring that we
deliver the rules of Jeff while still making Larry an interesting character.
To spread out the exposition, you'll notice that it's distributed over a few
different story beats as Larry first introduces Jeff, then demonstrates Jeff's
predictable reaction to a breaking bottle and finally explains that the
player can prevent their own coughing on Xen spores by covering their mouth.

a3_07_coughing

A core element of both the environment and Jeff himself are the Xen
plants seen here spewing toxic gas that makes the player cough.
These were not created to be merely ornamental or to support the
novel VR interaction of players covering their mouth.
They were added specifically to solve a long-standing challenge
that we had with illustrating Jeff's personal space.
One of the trickier aspects of designing Jeff was striking the
right balance between perceived threat and actual threat.
Many scenarios in the level force the player into close
proximity with Jeff in order to create tension.
We found that these moments could be exciting, but they could also become
frustrating when players accidentally got too close to Jeff and were killed unexpectedly.
Players reported not understanding 'how close is too close.'
We tried a number of solutions, such as adding glowing tentacles to Jeff
that acted as a visual representation of the perimeter of his personal space.
Touch the tentacles, and you die.
Getting the look and feel of these tentacles to a satisfactory place
proved challenging, however, and we backed away from the idea.
Another problem we encountered was that a subset of our players quickly
became comfortable around Jeff once they became adept at keeping their
distance from him and avoiding sources of noise.
For these players, dealing with Jeff felt trivial.
He simply didn't feel like a threat to them.
We discussed a variety of potential solutions to these problems,
even wacky concepts such as giving Jeff the ability to teleport
to sources of noise, but it was difficult to come up with ideas
that didn't feel like we were giving Jeff the ability to cheat.
Eventually, we came up with the idea of having the player cough.
This was appealing, as it gave us a way to make the player themselves
a source of noise, thus causing Jeff to pursue them.
We were also interested in making the environment more dangerous
generally, and so our first experiments with coughing involved
littering the level with Xen plants spewing toxic gas.
Initial tests showed a lot of promise.
Players now felt like Jeff was a dangerous pursuer, and that they
had to carefully observe the environment to avoid the gas.
We observed that players had a natural instinct to cover their
mouth with their hand to suppress their cough.
This was an affordance that VR with tracked hand controllers
supported very naturally, so we implemented this mechanic right away.
This still left us with the problem of players being unsure
of how close they could safely get to Jeff.
We realized that the earlier experiment of adding tentacles to Jeff could be
replaced with the much simpler approach of attaching a toxic cloud directly to him.
There were multiple benefits to this.
First, it gave players visual feedback for how far they
needed to stay away from Jeff to be safe.
Second, it made those times when Jeff did get close all the more tense,
with players now needing to commit a hand to covering their mouth.
Finally, we didn't need to teach the players anything new.
They could carry forward what they'd already learned about
toxic gas in the environment, and apply those lessons to Jeff himself.

a3_07_single_controller

One of our major goals in the design of Half-Life: Alyx was
to reach the widest possible audience.
This meant developing for all of the major VR hardware in the market as well
as accommodating different sized play spaces, traversal preferences and so on.
The accessibility option that had the most knock-on effects into the design
of the game, from controller bindings to game logic to level design,
was the support of a single controller mode,
for players who don’t use two controllers.
In this mode, reloading is performed by moving the weapon to the shoulder
and pressing a button, much like arcade games which
require the player to fire their weapon off screen to reload.
The flashlight was modified to attach to the primary hand and gas masks
were added so that players could muffle their coughs hands free.
Even the gun used in the final Strider battle, with its two-handed
aiming and reloading controls was modified to be operable with one hand.

a3_07_jeff_attack

As we refined Jeff's behavior and visual design, we thought
a lot about how we wanted players to feel about him.
The goal was to have playtesters describe Jeff with words
such as 'menacing,' 'threatening' and 'deadly.'
And for that reason, it became important that players not
survive if they got too close to Jeff, so we decided that his
attack should result in instant death.
This would teach players unambiguously that Jeff was too dangerous to
trifle with, but it left us with the challenge of teaching this
to players without necessarily killing them.
The approach we took was to give the player opportunities to
witness Jeff attack and kill other characters.
Early versions of the level included scenarios where Jeff and
the player would encounter zombies and Combine soldiers.
The player could attack these enemies, or use sound to lure
Jeff into place so that he could pummel them with his massive arms.
This gave players an opportunity to witness Jeff's power and his
method of attack, without dying at his hands themselves.
Unfortunately, this outcome wasn't guaranteed, as some players
would opt to kill those enemies themselves.
And having these other characters present was also quite expensive
in terms of animation, sound and level content.
We were already grappling with the challenge of keeping this
level well-paced and within scope, and so we cut these sections.
But this still left us with the original problem of how to
demonstrate Jeff's ability to kill, without killing the player.
We chose to address this by adding beats to existing animated sequences.
For example, when the player opens this two-handed roller door,
we added a headcrab for Jeff to munch on in full view of the player.
We also added a similar beat to the later elevator ride, with Jeff
smashing a headcrab against the wall before melting it down with acid.

a3_07_training_alcove

Even though Larry instructs the player on the basics of how Jeff works,
we found that we needed to let players put those rules to
the test without having Jeff in their immediate personal space.
If we made players first interact with Jeff face-to-face,
they often fell to pieces and forgot the rules.
They would die over and over again, and their fear of Jeff
quickly turned into frustration and annoyance.
This area was created to serve as a training playground, where players can
observe Jeff reacting to various sounds while remaining safe from his attacks.
It contains many of the elements that are encountered throughout the rest
of the level including falling bottles, a padlock that
needs to be shot, and various types of Xen plants.
Jeff remains close enough to feel threatening, but he cannot attack the player.
Players can spend as much time as they need here,
and move on when they feel like they have a good grip on how Jeff works.

a3_07_jeff_design

Jeff's character design went through several significant
revisions over the course of development.
Just by looking at him, we wanted players to understand
certain things about the character.
Our first priority was to communicate his blindness.
Second, we wanted the character to feel physically imposing.
After all, the entire concept of Jeff had grown out of our observation that
just being close to a character in VR could make for a powerful experience.
Finally, we wanted players to understand the blind zombie's behavioral state at a glance.
Is he feeling angry?
Did he just hear something and is feeling curious?  And so on.
The first design that we modeled and animated was a sort of Combine worker-robot.
The robot's face featured a holographic display that could flip
between different icons that represented its state.
A large piece of rebar jutted out the back of the robot's head,
sparking electricity and spewing smoke.
This was meant to demonstrate that its visual sensors were impaired, making it blind.
We made the robot tall and hulking.
It looked heavy, unpredictable and dangerous.
However, player reactions to this robotic design showed that it had problems.
Players would initially focus on the head's holographic display.
Parsing the face and understanding the meaning of the icons proved too
challenging for players, and even worse, this element distracted from the
rebar poking through the robot's head and their understanding that it was blind.
Players found the robot imposing, but we weren't getting the
deeply visceral reaction of fear that we desired.
A trip back to the drawing board landed us with the more
organic character you see in front of you.
Drawing inspiration from the idea of coal miners succumbing to the black lung,
we focused on the concept of a member of the Combine's Xen
cleanup crew that had succumbed to Xen infection.
This gave us the visceral reaction from players which we were looking for.
A human face is still visible, but the eyes are clearly decayed, communicating blindness.
The organic nature of this design allowed us to lean more heavily on audio
to feed back the behavioral state, with animalistic grunts and roars
that came to include more human-sounding elements over time.

a3_07_freezer_toner

Toner puzzles are a useful design tool since they force players to move around
the environment and also occupy their primary hand via use of the multitool.
In this area, the player has to aim their flashlight to navigate,
point their off hand to teleport, cover their mouth to prevent coughing,
follow wires with the multitool and potentially pull bottles with their gravity gloves.
The player has to make tradeoffs as they proceed, since their two hands are overcommitted by design.
One-handed players are even more overwhelmed by these many tasks.
This is why we littered the distillery with gas masks the player can
attach to their mouth so they won't have to use a hand to muffle their cough.
This toner puzzle went through several revisions, many of which
left Jeff free to roam while players worked on the puzzle.
But this resulted in players having to juggle too many elements at once,
and their fear of Jeff quickly turned into frustration.
The solution was to keep Jeff locked up while players solved the puzzle,
with his distant roars maintaining a level of tension.
We were then left with the problem of how to release
Jeff so that the level could continue.
Initial ideas involved having Jeff break out from the freezer on his
own when the puzzle was complete, but this felt contrived and scripted,
compared to having the player free Jeff from the freezer.
The toner puzzle mechanic gave us a logical solution to this problem,
with the added benefit of creating a moment of high tension for the
player when they realize they're going to have to open
the freezer door and release Jeff in order to proceed.

a3_07_elevator

Many of our ideas for Jeff involved coming up with plausible ways to force
Jeff and the player to occupy an enclosed space at the same time.
An elevator ride was on the table from the beginning, but we
were unsure of how successful it would be in execution.
There was also a concern that such a sequence could be technically expensive to pull off.
Our first prototype was extremely rudimentary.
Players would get into an elevator with Jeff, the ride would last
around thirty seconds, and then Jeff and the player would exit.
There was no complicated level scripting, no bespoke animations for Jeff,
and no special audio or lighting treatment.
Despite the relative lack of polish, players had a strong positive reaction to the sequence,
even naming it a highlight of the game when we ran company-wide playtests.
This convinced us that it was worth putting more work into the
elevator ride to make it something really special.
In the end, a lot of different disciplines came together to create
the final sequence, which combines sound, animation, scripted lighting,
and careful choreography of Jeff's movements and the player's actions.

a3_07_cheeky_headcrab

Most of the time, the noises that attract Jeff are created by the player,
who is either actively trying to lure him to a new location,
or has perhaps made a noisy mistake that could have been avoided.
The key point being that the player feels some amount of responsibility for the noises being made.
The goal of the upcoming hallway was to flip that idea on its head, if only for a moment.
Like a misbehaving pet, this cheeky headcrab appears to surprise and infuriate the player,
clumsily knocking over a series of bottles before escaping into a vent.
Classic headcrab nonsense!
We were surprised to find that many players wanted to check
and see if the headcrab had indeed escaped.
They would climb up onto the boxes and crane their head up into the vents.
This prompted the idea of adding a small easter egg to reward those players.
Try it, and you'll find that the cheeky crab has indeed escaped to be annoying another day.
The cost of adding these sorts of secrets needs to be weighed
against the potential for players to actually discover them.
In this case, our playtest observations made the cost appear worth it,
in particular for the impact it would have in making the world feel more real:
a full reality where living creatures go about their lives
even if the player isn't there to witness them.

a3_07_combine_forcefields

Since the entire premise of Jeff hinged on keeping Jeff and the player
in close proximity to each other, the distillery map was constructed
as a series of chambers connected by one-way transitions.
Early on, we used two techniques in concert to keep Jeff and
the player together throughout the level.
First, each new area was gated with a special locked door.
These took the form of impenetrable 'Xen Membranes' that only Jeff could tear through.
The player would lure Jeff to the membrane using sound, and Jeff
would break it open so that both Jeff and the player could proceed.
We had always had a goal of having players describe Jeff in adversarial terms,
as a constant pursuer or 'thorn in their side.'
The problem with the Xen Membrane concept was that it led to players
describing Jeff as a useful tool, or worse, 'angry co-op partner,' rather than something they feared.
Another way that we would keep Jeff and the player together
was via the use of one-way drops.
Jeff would fall off of catwalks, platforms and ladders into each new area.
These drops functioned as intended, but just like the Xen Membranes,
elicited an undesirable reaction from players, especially when repeated multiple times.
Watching Jeff do these pratfalls left players feeling like he was less dangerous.
They felt sorry for him and even laughed at him.
Eventually, we realized that we could use Combine force-fields,
which the player cannot pass through, as a way to create 'one-way doors.'
These force-fields allowed Jeff to move through them,
but required the player to follow via another one-way path.
This allowed us to build thresholds where Jeff and the player could
both move forward while eliminating the awkwardness of the Xen Membranes
and avoiding the repeated falls that had previously made Jeff look so clumsy.
Combine force-fields also had a few other benefits.
For one, they gave Jeff a convenient way to take a 'backstage' route
that is inaccessible to the player, like the one at the bottom of this ladder.
They also tied in nicely with Jeff's fictional background as a former member of the Combine Xen cleanup crew.

a3_07_sounds

Creating an experience with such a heavy focus on audio
presented some unique design problems.
The primary challenge was determining which sounds Jeff should be able
to hear and making sure that players understood the rules.
Half-Life: Alyx contains a vast array of physics objects.
However, through playtesting, we found it best to use
only a small set of them in the distillery.
We needed to pay close attention to each object that we did include,
particularly to audio design for collisions between the object and
the various surface types used in the distillery.
It was important that what players heard matched their
expectations for how Jeff would react.
In early testing, we would find individual cases where players would say
things like 'I don't know why Jeff got angry' or
'I would have expected that noise to attract Jeff.'
Over time, we identified these cases of confusion, and addressed them one by one.
It was vital that players always be able to identify the cause
and effect of a noise that might attract Jeff.
Failing this meant players were left feeling like Jeff
acted randomly, and they'd grow frustrated.
Some experiments failed in this area.
At one point, parts of the level were littered with piles of broken glass
that would make noise if the player walked or teleported over them.
Few players would notice the broken glass and were then surprised
when Jeff attacked, so this concept was cut to avoid confusion.
We had to be wary of the opposite case too, where players would
expect purely aesthetic elements to present a noise hazard.
For example, at one point we had placed shallow pools of water seeping in among the floor tiles.
It looked really cool, but it confused some of our playtesters who thought
that the sound of their footsteps splashing through these puddles would
be audible to Jeff, even though we never intended that to be the case.

a3_07_jeff_death

Figuring out how to end the player's journey through the distillery involved a
lot of discussion about how players would ultimately end up feeling about Jeff.
Observing playtests told us that most players simply wanted
him dead, usually from the moment they met him.
But there were some who felt kind of sorry for him and
didn't necessarily want to see him killed.
Our first version of the ending involved the player's actions unintentionally
resulting in Jeff being attacked and dragged away by antlions.
Part of the goal was to demonstrate that the antlions are even more
powerful than Jeff, to amp up the tension just before players have
to square off against antlions themselves.
However, this ending left players feeling a little empty; they felt
detached from Jeff's demise and wanted to be more directly involved.
Dark stuff, to be sure.
Plus, we weren't really servicing those players who felt sorry
for Jeff and didn't necessarily want to see him killed.
There were other problems too.
The antlion attack felt random to many players, which reduced its impact.
And because Jeff presumably died off-screen, players were left without a sense of closure.
This prevented players from being able to switch gears and
settle in for their encounter with the antlions.
The trash compactor was the solution to these problems.
It gave players the opportunity to decide Jeff's fate and be
left with a sense of closure, whether they decide to squash him or just leave him imprisoned.
Even so, it has to be admitted that most players seem to crush Jeff with zero hesitation.
Monsters.

a4_01_antlion_combat

In some of our early experiments bringing Half-Life 2 enemies to VR,
we found that antlions in particular showed promise and were interesting
to fight, but tended to quickly overwhelm playtesters.
They were able to get into the player's physical space too quickly and
applied so much pressure that players just ended up endlessly
teleporting around, trying to get away.
Around the same time, we were discovering that all players,
even relative novices, have surprisingly good aim in VR.
The variability of aiming ability amongst players is simply
much lower in VR than it is in mouse-and-keyboard games.
To take advantage of the idea that players would consistently be able to
make precise shots, we started testing an antlion design which
players could slow down by shooting off the legs.
Playtesters reacted positively to both the need for accuracy and to the
organic feeling that this gave the antlions when their legs flew off
and the injured antlion continued to limp toward them.
The removal of the legs was clear to players, but presenting the fact that the
abdomen was invulnerable until the legs were removed was a longstanding challenge.
There was no single solution to communicating this effectively to
playtesters but, rather, a series of small additions that communicated the behavior.
For example, we made the abdomen dull and black when invulnerable,
and changed it to match the bright orange legs as it became vulnerable.
Shooting armored parts of the antlion would generate sparks and hard
ricochet sounds, while shooting the vulnerable body parts resulted
in orange blood spurts and soft impact sounds.
We also designed the death animations to emphasize the abdomen exploding,
and ensured that an exploded abdomen chunk was left behind after the antlion was dead.
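A hedged sketch of the damage rules described above, with an assumed rule that losing any leg exposes the abdomen; the part names, leg count, and feedback strings are illustrative stand-ins for the game's audio-visual cues.

```python
# Illustrative sketch of the antlion damage rules: the abdomen rejects
# damage until a leg has been shot off, and armored vs. vulnerable hits
# return different feedback cues (sparks vs. blood). Details are assumed.

class Antlion:
    def __init__(self, legs=4):
        self.legs = legs

    @property
    def abdomen_vulnerable(self):
        # Assumed rule for this sketch: losing any leg exposes the abdomen.
        return self.legs < 4

    def shoot(self, part):
        """Return a feedback cue, mirroring the spark/ricochet vs.
        blood-spurt distinction used to communicate vulnerability."""
        if part == "leg" and self.legs > 0:
            self.legs -= 1
            return "leg flies off"
        if part == "abdomen":
            return "blood spurt" if self.abdomen_vulnerable else "sparks, ricochet"
        return "no effect"
```

Shooting the abdomen first yields only sparks; after a leg comes off, the same shot draws blood, which is the cause-and-effect chain the visual and audio cues were built to teach.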

a4_01_foliage

All of the foliage in the game animates in response to player touch and environmental wind conditions.
The foliage animation is entirely procedural and is computed on the graphics processor using vertex shader deformation.
A voxel field that travels with the player's body is used to track the location of the player's hands in space.
The hands effectively draw motion trails into this voxel field, which can be
sampled by the foliage's vertex shader and used to drive deformations,
allowing the player to bat the foliage around with their hands.
The foliage also interacts with wind, allowing us to give ambient motion to the plants.
In order to accommodate a range of environmental design, we tuned the wind
deformations at different speeds, from slight drafts to hurricane force winds.
It turned out, though, that the ambient wind was never required to be more
than a moderate breeze, and for most of development, the hurricane effect was unused.
This particular zoo habitat was added late in development and required a
critical extension to the procedural foliage animation system.
Prior to adding this habitat, the foliage didn't need to deform in response
to enemy animation or grenade explosions, but of course this area is
all about throwing grenades at zombies.
To address this, we added a low-resolution voxel field that can be
sampled by the foliage shader to apply additional deformations.
This low resolution voxel field, which is about one cubic foot per voxel,
is fixed in space and spans the entire habitat.
For comparison, the voxel field that travels with the player is about two cubic inches per voxel.
Enemies and grenades within this space can write motion trails and impacts
into the low resolution voxel field, which are ultimately interpreted by the foliage shader as wind forces.
This is when preparation met opportunity.
When grenades explode, they create localized hurricane-force winds.
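The two voxel fields described above can be sketched as a sparse grid that moving objects write into and foliage vertices read from. This is a minimal sketch, not Valve's Source 2 code; the decay behavior and sparse-dictionary storage are assumptions.

```python
import math

class VoxelWindField:
    """Minimal sketch of a wind voxel field (not Valve's Source 2 code).

    `cell` is a voxel's edge length in world units; the commentary cites
    roughly one foot for the fixed habitat field and about two inches for
    the field that travels with the player.
    """
    def __init__(self, origin, cell):
        self.origin = origin   # world-space corner of the field
        self.cell = cell
        self.wind = {}         # sparse map: voxel index -> wind vector

    def _key(self, pos):
        return tuple(math.floor((p - o) / self.cell)
                     for p, o in zip(pos, self.origin))

    def write_impulse(self, pos, velocity, decay=0.5):
        """A moving hand, an enemy, or a grenade blast writes a motion
        trail into the field as it passes through."""
        k = self._key(pos)
        old = self.wind.get(k, (0.0, 0.0, 0.0))
        self.wind[k] = tuple(o * decay + v for o, v in zip(old, velocity))

    def sample(self, pos):
        """The vertex-shader side: each foliage vertex reads the local
        wind force and bends accordingly."""
        return self.wind.get(self._key(pos), (0.0, 0.0, 0.0))
```

In the real system the sampling happens on the GPU in a vertex shader; the dictionary lookup here just stands in for that texture fetch.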

a4_01_antlion_sounds

There were very few existing Antlion sounds to work with coming
into Half-Life: Alyx, so unlike most of the familiar Half-Life
creatures, which often had legacy sounds layered in, nearly all of the
sounds for the Antlions were completely new material.
Starting with an almost blank canvas, we drew inspiration from
the behavior of the Antlions to inform their sound design.
Since the Antlions attack in waves, it seemed reasonable to us
that they communicated and organized into groups for attack.
This idea led us to incorporate complex insect-like clicking sounds,
the final version of which began as recordings of European Starlings,
which turned out to provide more variation and character than insect recordings.
Here, you can hear the original European Starling sounds.
We took those recordings and layered them with designed elements to create the Antlion sounds used in the game.

a4_01_anim_interactables

In traditional games, analog controls like levers, dials and crank wheels
usually play simple animations in response to a player's mouse click or key press.
In VR, however, players with tracked hand controllers expect to be able
to control analog interfaces by grabbing and moving them directly.
To support this, we developed a new system that we call 'Anim Interactables.'
The opening mechanism of this health station is a good example of an Anim Interactable.
To your left and right, you can see some more examples, such as a large
railroad switch, the Vortigaunt's fire alarm doorbell, a crank wheel,
a rolling cabinet door and both the antenna and tuning dial on the radio.
Anim Interactables like these allow us to give the player control of analog
interfaces while constraining their input to an authored range of motion.
The system uses the position and orientation of the player's hand to
drive the animation of a model which defines the analog interface.
As the player moves their tracked controller, the Anim Interactable
continuously performs a search to determine if playing the animation
forward or backward will place the interaction point closer to the
tracked controller's position in space.
This allows the player to drag the animation forward, backward, or hold it stationary.
The result is an intuitive correspondence between the player's body and the virtual interface.
Furthermore, these interactive animations don't necessarily have to make
physical sense; artists simply animate the range of motion they want the player to be able to control.
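The per-frame search described above can be sketched as follows. The function signature and step size are hypothetical; this is an illustration of the technique, not the shipped Source 2 implementation.

```python
def advance_anim(anim_pos, t, hand, dt=0.01):
    """One tick of the forward/backward search described above.

    `anim_pos` maps a normalized animation time in [0, 1] to the world
    position of the interaction point; `hand` is the tracked controller
    position. Returns the new animation time.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Candidate times: hold, step forward, step backward (clamped to range).
    candidates = [t]
    if t + dt <= 1.0:
        candidates.append(t + dt)
    if t - dt >= 0.0:
        candidates.append(t - dt)
    # Drag the animation toward whichever candidate puts the interaction
    # point closest to the player's hand.
    return min(candidates, key=lambda c: sq_dist(anim_pos(c), hand))
```

Calling this every frame lets the player drag a lever along its authored path, run it backward, or hold it stationary by keeping their hand at the current interaction point, regardless of how non-linear the authored path is.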
These Anim Interactables have their debug visualizations turned on,
so you can see the authored range of motion, including the non-linear paths
used on the railroad switch, the health station and the crank wheel.
Anim Interactables allowed us to make all kinds of objects in the world
interactive, from mundane man-made devices to intricate Combine mechanisms,
without the need to run complex physical simulations.
Feel free to play around with the Anim Interactables in this area before moving on.

a4_01_storage_toner1

Sometimes large chunks of a level are shifted around or re-ordered as we work on the game.
That's the case for this entire area and the toner puzzle it's centered around.
In fact, for much of its existence, this area didn't even have a home anywhere in the game;
it was originally built just to help prototype the toner puzzle mechanic.
Over the course of a few months, the raw gameplay mechanics of toner puzzles
were developed and playtested in a series of small test levels.
Each test level explored a single gameplay concept.
Those were things like using the toner puzzle to draw the player's
eyes to something interesting, such as Xen flora and fauna;
creating moments of surprise when an enemy headcrab popped out of a vent;
or requiring the player to thread their arm behind pipes and into other tight spaces.
Once we figured out which of these ideas were working well, we combined
them into a single test level with one toner puzzle that strung
all of these concepts together.
Further playtesting in this level made us confident that toner puzzles
should be included in the game and we started adding them in to other areas,
but we didn't include the test level itself anywhere in the game.
It wasn't until later that it became apparent that the zoo was in need of a pace break.
We pay careful attention to player fatigue when we playtest and look for
cases where a puzzle or some exploration and resource gathering
can be used to give the player a break from combat.
There was such a need toward the end of the zoo level, and thankfully in this
case we were able to make use of the toner prototype level that had already been built.

a4_01_storage_toner2

When we decided to resurrect our toner prototype level for use as a pacing
break in the zoo, we had the opportunity to make some improvements and
integrate elements that hadn't existed when we had first experimented with the toner mechanic.
For example, the newly-developed explosive Xen bloaters were added at various
points along the puzzle to surprise players and remind them that they need to keep an eye out for these hazards.
Also, in the intervening months, the toner mechanic itself had evolved
from one where players merely push a ball of energy along a path,
to one which included branching paths and rotating junctions.
This junction here in the breaker box is the linchpin of the whole puzzle,
as the player must interact with it three times: once to open the roller door
to this back area, once to switch power away from the roller door to the last
leg of the puzzle and once to re-open the roller door in order to exit.
Because this toner puzzle covers such a large physical area,
including some zombie combat in the middle, players frequently lost track of
the correspondence between the state of this junction and the roller door.
This meant that they would trap themselves back here and not understand how to proceed.
To address this, we made a lot of changes to the presentation of this junction over the course of development.
The unique breaker box and tangle of conduit are designed to make
the junction recognizable as the same junction from both sides of the wall.
The shape of the hole in the wall and the orientation of the breaker
box are designed to make sure that the player stands in the right location
to have line of sight to the roller door and see that it closes when
they route power away from it to the last leg of the puzzle.
The door itself also makes a lot of noise and an unreasonable amount
of sparks to draw attention to itself each time it moves.
This particular puzzle twist was in danger of being cut for a long time,
but with this series of refinements, enough playtesters understood its
behavior that we were comfortable with shipping it in the final game.

a4_02_tractor_beam

The emergence from the zoo gave us an opportunity to present players with a
dramatic vista and to show them how much progress they had made in their journey to the vault.
This was challenging because the vault's position in the world was
already determined to be above the parking garage,
which the player would reach at the end of the map.
But placing the vault correctly in support of that location left the vault
looking too distant to create the dramatic view we wanted here.
We also intended to portray the cutting of the vault's power cables by the
Vortigaunts at this point, but having this take place at such a
distant location was unclear and confused playtesters.
The solution was to move everything closer.
The vault looked more imposing, looming above the player, and the effects
of the Vortigaunts' cable cutting efforts were much clearer.
All we needed then was a conceit to justify the two vault locations:
the location here at the start of the map and the eventual location at the end of the map.
Around this same phase of development, we were in the process of defining
exactly how the player would bring down the vault, and our solution to
that problem also helped justify the two vault locations.
Tractor beams are a familiar cliché in science fiction and often share a
common visual style of an enveloping force field, emitted net-like from a single point.
Using a tractor beam as a Combine failsafe device to catch the vault served us well here.
It allowed us to prevent Alyx from being crushed by the vault dropping out
of the sky when the Vortigaunts cut the final power cable and it gave us a
satisfactory mechanism for moving the vault to the location we needed it to be at the end of the map.
Throughout the map, the beam itself also serves as a beacon, ensuring that
the players are always able to recognize their new goal: the tractor beam control station.

a4_02_collaborator

The final twist of the game presented quite a few story-telling challenges for the writers.
The game is long – much longer than a movie – and the surprise reveal of the
G-Man at the end felt almost impossible to sustain over the entire experience.
Everyone on the team assumed that through basic deduction,
most players were going to figure it out along the way.
So the solution, like in most good stories, was to just add more stuff:
More details, more questions and more nooks and crannies to the plot.
The writers quickly realized that there was one big,
distracting name – Gordon Freeman – that would pull the player's attention
away from any theories they may be cooking up on their own.
We then realized that we could use the absence of the G-Man in the story
and the eerie parallels between the G-Man and Gordon's experiences at
Black Mesa to redirect player attention and propel the story towards the final act.
If the player thought they had uncovered a new mission, to save
Gordon Freeman from stasis, then the twist at the end could be secure.
To pull it off, the writers created a new, mysterious character:
A scientist collaborating with The Combine.
This addition came late in the project, so the writers relied on another
tried-and-true tactic: base the character on an actor we're fond of and
then do our absolute best to cast and record with that person.
The chosen actor had struck a particular tone with the writers: she has an easy,
in-command confidence and a smooth southern Missouri drawl.
This not only sets her apart from the other characters in the Half-Life
universe but makes the character easy to hear on the page,
which is incredibly important when time is short at the end of the project.
And once she was in the studio, her pitch-perfect performance assured us that we'd made the correct decision.
Of course, then the scene still had to be built.
Level designers figured out what existing chunk of level track could be bumped out to fit the scene.
The area we chose previously held some barnacles and a few light
puzzle elements which the designers weren't upset to lose.
The level track had to be altered to funnel you into a place where the scene
could unfold and the player could eavesdrop without getting a good look at the character.
This served two purposes. One was that it would benefit the mystery for this
scene to feel furtive and for the identity of the character to remain secret.
But it also let us shroud the performance in shadow, sparing the animators
and choreographers the intricate facial animation necessary to bring a character to life.
As in a lot of cases, not doing the work was the correct thing
for both the story and our tight shipping schedule.

a4_02_scavenger_hideout

The purpose of this hideout was to give players a nice
reward for being observant and exploring the environment.
We wanted to highlight the story of scavengers living behind the quarantine wall,
what their lives might look like, and why they might choose to live there.
The person who lived here was referred to internally as the 'Combine Killer,'
imagined as a person with a grudge against the Combine,
who has chosen to eke out a living in the quarantine zone,
biding their time exacting revenge on unsuspecting Combine patrols,
perhaps eventually joining the organized resistance
movement portrayed in Half-Life 2.

a4_02_spectator_hud

In recent years, video game streaming has become one of the main ways that
players engage with the culture of games, and this is even more true for
VR titles, where players who do not have VR equipment still want to participate.
To address this audience, we created a spectator HUD, which displays the
VR player's health, ammunition, resin and the contents of their wrist pockets.
This information is overlaid on the 3D scene in the desktop window and
lets viewers track the state of the player's resources, without having to
rely on the player to look at their wrist or open up a menu.
The spectator HUD started off as a tool used by the development team to
display resources, as well as the state of controller buttons
and other diagnostic information.
An interesting side effect of this HUD was that it made playtests way more
exciting to watch, because we as an audience had information that
the player wasn't necessarily aware of.
For example, playtest observers might realize that a player had barely
survived an encounter, when the players themselves had never even checked their health.
We even saw cases where players killed an enemy with their very last bullet without realizing it themselves.
The highs and lows we felt watching our own playtests were something that
we wanted to share with the streaming audience and it's been exciting to see streamers using it in the wild.

a4_02_tanker_combat

Creating a climactic battle that capped off the antlion combat
proved to be a challenging design problem.
We wanted to create a sustained combat experience that was substantial enough
to challenge players who had refined their skills against the antlions.
But simply throwing a large number of antlions at the player all at once
was not a particularly engaging experience,
nor was it compatible with our established combat balancing.
Rather than containing the player in a restricted space and spawning
waves of enemies, we wanted to keep the player engaged and moving
through a gauntlet of constant pressure toward a goal.
To do this, we needed an arena with a clear linear flow and a porous
layout that would allow the player to maintain sight of their goal
while being flanked by swarms of antlions as they moved toward it.
This tanker yard, which we had originally built for a narrative scene
that was cut from the game, had the properties we were looking for.
The long tanker cars lead toward the intended direction of travel,
and their positioning requires the player to pay careful
attention to their sides and rear as they advance.
The Combine tractor beam tower is framed at the end of the tracks,
providing a clear goal for the player.
The placement of the glowing red sign in the direction of the tractor beam
was also a deliberate decision, a technique used by the environment artists
to subconsciously pull players in the desired direction via visual contrast,
even in the event that they've momentarily lost their bearings in the chaos of the battle.
The pacing of the combat here was carefully balanced to leave most
playtesters on the brink of being overwhelmed, and feeling like they made
it through by just the skin of their teeth.
It took many iterations of enemy placement, ammo and item placement,
and player path adjustments to achieve this balance.
We ultimately settled on a design that featured a steady stream of
advancing antlions, with antlion spitters placed at the far end of the
space applying enough pressure to prevent players from rushing toward the goal.
Combine soldiers up on the raised platforms divert player attention
to the right hand flank and provide verticality in target prioritization,
adding to the sense of frenzy we sought to create.

a4_03_chasm_puzzle

The goal of this puzzle was to create a moment of heightened stress by
requiring players to deal with a sudden threat after being
placed in a precariously disadvantageous position.
In the original design of the puzzle, players would stand on a platform
which was suspended from a cable and pull themselves across the chasm.
Halfway across, antlion spitters would emerge and start attacking.
The platform provided limited cover and players would have to make
trade-offs between taking cover and continuing to propel themselves forward.
This succeeded in creating the desired spike in intensity,
but unfortunately, the lateral movement of the platform caused
considerable motion sickness for most playtesters.
Even players who were comfortable with the vertical movement of elevators
elsewhere in the game were adversely affected here.
The puzzle that we shipped retains the same sense of tension, due to being
stranded in the middle of a chasm under attack, but does not rely on physically moving the player.
The upper portion of the wooden platform is deliberately devoid of any
substantial cover, which encourages players to jump onto the platforms below.
On the lower platforms, we provide enough cover to protect players from the
antlions, but also leave them feeling vulnerable, out in the middle of the chasm.
The cover here is porous and breakable, increasing the sense of urgency
and requiring players to engage the antlions from their compromised position.

a4_03_combine_combat

In cases where the player had to face off against Combine in relatively
open spaces like this one, we often found that playtesters would backtrack
to safety and attempt to pick away at the soldiers from a distance.
The playtesters would ultimately be successful in clearing the combat space,
but would feel unsatisfied with the encounter due to the
repetitive and static nature of the tactics they employed.
To address this, we used a few design tools to discourage this type of player behavior.
The placement of the Combine suppressor standing on the bed of the truck at the
far end of the courtyard here is the key to the designed flow of this battle.
The suppressor does not move from his planted position on the far
side of the courtyard while the player is on the near side.
His positioning, combined with his unrelenting wall of suppressing
fire, forces the player to advance in order to close the distance
required to take him out, all while under intense pressure from a
carefully crafted combination of other Combine units.
Additionally, if the player backtracks far enough from the fight,
the more mobile Combine soldiers in the encounter are told to retreat
and hold their ground until the player once again advances into the combat.
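The anti-backtracking rule can be sketched as a simple positional check. The function, distances, and order names here are all hypothetical illustrations of the behavior described above.

```python
def soldier_orders(player_advance, combat_line=0.0, retreat_margin=10.0):
    """Hypothetical sketch of the courtyard flow rules described above.

    `player_advance` is the player's progress along the courtyard's long
    axis. The suppressor always holds its planted position; the mobile
    soldiers fall back and hold whenever the player retreats well behind
    the combat line, and only press again once the player re-engages.
    """
    if player_advance < combat_line - retreat_margin:
        return {"suppressor": "hold_and_suppress",
                "soldiers": "retreat_and_hold"}
    return {"suppressor": "hold_and_suppress",
            "soldiers": "engage"}
```

The effect is that picking away from the back of the space yields nothing, while advancing restores the pressure.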
These elements together keep players moving forward into locations that
provide for more interesting opportunities for both the player and the enemy soldiers.

a4_03_vertical_tower

After the intense combat that precedes it, this tower initially serves as
a pacing break before once again ramping up the intensity on the player's
final fight up to the Combine tractor beam controls.
While the lower portion of the tower does contain combat, it has been
designed to cause players to engage with the space in a
different way than the preceding courtyard fight.
The focus here is on navigating the chaotic verticality of the tower,
which we found to be particularly successful for playtesters to experience in VR.
The manhacks that initially engage the player from above fly a path that
deliberately has them crashing around the pipes and boards overhead,
causing sparks and splinters to rain down on the player.
Combined with the disorienting nature of the space, the Combine suppressor
that emerges at the top of the stairwell is used to reintroduce an appropriate
level of intensity and to create a sense of a struggle as the player continues to scramble closer to their goal.

a4_03_vault_panel

At the start of the previous map, we introduced the tractor beam,
a fail-safe device used to catch the falling vault when the
last substation was knocked offline.
In this location, the console looks like it could be used to control the
tractor beam to dock the Vault for entry via the nearby gantry on the left.
We decided to have the vault crash to the ground spectacularly,
but for this to work emotionally, we needed the scene to increase in tension beforehand.
The whole vault docking console is absurd by design.
If the console appeared too much like a legitimate puzzle,
playtesters could take too long and let the air out of the scene,
unsure if there was something that they could actually solve.
The more inscrutable the machine looked, the more likely they were to
interact with the levers and play along with the unfolding disaster.
The first phase of the puzzle extends the catwalk and unlocks the main
lever bank which responds to the player’s input, engaging them in the solution.
Every subsequent lever thrown does nothing except increase
the tension as things continue to go wrong.
The back and forth radio dialog between Alyx and Russell completes
the scene with growing panic and keeps players in the moment,
flipping levers in an attempt to save the Vault.

a4_04_strider_intro

Five years ago, we placed playtesters beneath an idling Half-Life 2 Strider.
It was a really simple prototype, but the feedback was overwhelmingly positive.
This experience convinced us we should pursue a Strider encounter in VR.
And as a result, we built dozens of prototypes over the next few years.
Some designs pressed playtesters into very close contact with the Strider.
Some had a Strider chase them through a simple environment or shoot at them through porous cover.
In other cases, the Strider would destroy a wall the player was
hiding behind or give a performance the player was meant to observe.
A prototype built in 2017 squared the player directly off with the Strider
in a quarry, combining many of our ideas into one experience.
Based on the success of this prototype, the team moved on to building the
shipping experience ahead of you, although it would take another three years of refinement to complete.
Each iteration brought us a new set of rules and player expectations.
The Strider encounter ahead is a carefully crafted collection of beats shaped
by our own ideas of what would be exciting and fun, refined over a long period of playtesting.

a4_04_lone_combine

Through many iterations of the strider encounter, we would find ourselves having
functionally solved a problem only to realize that we also had to fictionally
contextualize the solution so that it made sense.
For example, we designed this entire map to include combat with Combine soldiers,
but how should we fictionally place them in this destroyed space?
How could we put a disabled strider close to the player?
What were the Combine doing here prior to the vault collapse?
This soldier was placed here to communicate to the player that some Combine were
in the area investigating an antlion infestation before the vault fell on them.
In fact, you can see some dead antlions nearby.
If you listen closely, you'll hear the soldier trying to contact the rest of his squad,
letting the player know that he is not alone and that there is likely to be more combat ahead.

a4_04_disabled_strider

Although we could excuse a few random Combine soldiers in this area left over from
investigating the antlion infestation, we needed a way to justify the large
squads of Combine in the combat encounters that occur later in the level.
These two Combine soldiers are part of a squad sent down looking for survivors after
the vault collapse and their dialog is designed to inform the player
that this strider they have found may not be dead.
Fictionally, it is the player's combat with these two soldiers that alerts
the rest of the Combine in this area to the player's presence.

a4_04_lion_and_mouse

For many players, Striders are the most feared and powerful enemies in
the Half-Life franchise, and we thought it would be interesting to
flip player expectations on their head and experience what it would feel like to team up and cooperate with a Strider.
Our initial concept was inspired by the Aesop's fable of The Lion and the Mouse,
with the player freeing a disabled Strider, who would later help the player escape by clearing the way forward.
Initially, the parking garage was a Combine silo, and it was much easier
fictionally to have some type of Combine device which could constrain a Strider without injury.
When the location changed to a collapsed parking garage, we constrained the
Strider by trapping it in the elevator shaft, held down by the elevator car.
After being freed, the injured Strider would lash out at anything
nearby, Combine and player alike, as it tried to escape.
This was a strong story when told aloud, but expressing it in a way
that was understood by players turned out to be difficult.
Keeping the Strider around throughout the level, alternating between injured
and angry states as the player slowly picked their way through the garage,
was confusing to playtesters and we had a very hard time trying to find the right balance.
Playtesting feedback was really consistent around this, so although we
loved the idea, we let it go in favor of a more straightforward and
achievable dynamic of Hunter and Prey that eventually shipped.
The year we spent pursuing the Lion and Mouse idea wasn't wasted, however,
as we created and refined many of the different story beats that shipped.
We also crafted some performances and technology that led to the success of the final experience.

a4_04_ai_animation

The Strider encounter boils down to a set of performances, and the Strider's AI-driven minigun.
Maintaining the illusion that the Strider was aware of, and hunting, the
player while delivering these performances required a delicate coordination between Animation and AI systems.
We used a variety of approaches to make the Strider feel like an intelligent creature, hunting the player.
Players in VR move very slowly by first person shooter standards,
so we have the Strider alternate between 'chasing' performances, following
the player through the level, and then 'hunting' performances, focused
on tracking and firing at the player, or looking for them.
Triggering the different stages of this pursuit took a lot of playtesting.
We paid close attention to how playtesters would move through the level and
then built the triggers and iterated on the animations to support the most common behaviors.
It was very much a back-and-forth conversation, evolving each beat until
most players believed the Strider was truly hunting them.
Early playtesters would grow weary of being shot at without any downtime
or contrasting experiences, so we identified the beats that would create
that contrast and layered them in when we saw playtesters become fatigued.
Things like the Strider bashing the wall, walking over you, or chasing you up
an elevator shaft all acted as ways to break up the experience and surprise the player.
Whenever we had feedback that a playtester was growing bored or fatigued,
we would brainstorm different ways to insert these beats,
and then test them to verify that they were paced correctly.
One of the tools we used was a sophisticated lookat system on the Strider.
This system tracked the player and would procedurally orient different parts of the Strider to face them.
The body, minigun and gauss cannon all had separate lookats to
break up the effect when the Strider lost or acquired the player.
These could be independently enabled or disabled at specific times,
to make it feel like the Strider was directly addressing the player.
In each animation of the sequence, the lookats on the minigun and body
would be enabled or disabled based on whether the Strider should be able to see the player at that point.
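One such independently toggled lookat channel can be sketched as below. The turn rate and yaw-only tracking are simplifying assumptions, not the shipped behavior.

```python
import math

class LookAtChannel:
    """Sketch of one independently toggled lookat channel; the commentary
    gives the body, minigun and gauss cannon each their own."""
    def __init__(self, turn_rate=math.radians(90.0)):
        self.enabled = False
        self.yaw = 0.0
        self.turn_rate = turn_rate   # radians per second (hypothetical)

    def update(self, part_pos, player_pos, dt):
        if not self.enabled:
            return self.yaw          # the authored animation keeps control
        target = math.atan2(player_pos[1] - part_pos[1],
                            player_pos[0] - part_pos[0])
        # Shortest signed angle to the player, clamped by the turn rate.
        delta = (target - self.yaw + math.pi) % (2.0 * math.pi) - math.pi
        step = max(-self.turn_rate * dt, min(self.turn_rate * dt, delta))
        self.yaw += step
        return self.yaw
```

Because each channel carries its own `enabled` flag, the body can keep tracking while the minigun deliberately loses the player, which is what breaks up the effect.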

a4_04_elevator

It was important to communicate clearly to the player that
they were being hunted by the Strider at this point.
We wanted players to see the Strider dust itself off and fixate on
them so that there would be no doubt that they were now the prey.
This elevator ride was a useful tool to achieve this, as it is one of
the few moments of the game that we have a captive audience.
The elevator door facing the Strider was carefully designed to
provide enough protection so that players would feel safe,
but enough visibility to see the Strider's performance.

a4_04_destructible_cover

During development, we talked a lot about the experience of the Strider
minigun blasting through the walls and floors of the parking garage
while hunting the player, but weren't sure we would be able to deliver
destructible cover until around six months prior to shipping.
To address this, we designed the destructible cover to visually
communicate that it was being broken down prior to actually
fracturing and breaking.
Initially, the cover would look like an unremarkable piece of
concrete or other building material, but when it took sufficient damage,
it would start to visibly crack and throw off dust particles.
When the same piece of cover took further damage, it would then fracture
into pre-cut physics objects and generate more dust and debris particles.
Through playtesting, we determined how much damage the cover should be
able to take before breaking down and forcing the player to move.
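The three damage stages described above amount to a small threshold-driven state machine. The thresholds below are placeholders; per the commentary, the shipped values were tuned through playtesting.

```python
class DestructibleCover:
    """Three-stage damage model sketched from the description above:
    intact -> cracked (visible warning) -> fractured into pre-cut chunks.
    Threshold values are hypothetical."""
    CRACK_THRESHOLD = 50
    BREAK_THRESHOLD = 100

    def __init__(self):
        self.damage = 0
        self.state = "intact"

    def take_damage(self, amount):
        if self.state == "fractured":
            return self.state              # already reduced to physics chunks
        self.damage += amount
        if self.damage >= self.BREAK_THRESHOLD:
            self.state = "fractured"       # spawn pre-cut pieces, dust, debris
        elif self.damage >= self.CRACK_THRESHOLD:
            self.state = "cracked"         # visible cracks + dust particles
        return self.state
```

The intermediate "cracked" stage is the key design point: it warns the player that their cover is about to fail before it actually does.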
In the end, the destructible cover system allowed us to deliver on our
goal of giving players the impression that the Strider was capable of
tearing through the parking garage with its minigun as it pursued them.

a4_04_minigun

In this area of the map, we began hearing from some playtesters
that they found the minigun firing monotonous.
Originally, the gun would fire at a set rate, regardless of the player's behavior.
To address the monotony, we made several improvements
to the Strider AI running just the minigun.
The minigun will fire if it sees the player, but it takes a moment to acquire them,
plays an acquired sound to warn them it has locked on,
and then 'stitches' the bullets toward the player in its first volley.
The minigun volleys themselves were broken up to become more staccato,
with random intervals in between volleys.
If the player ducks behind cover, the Strider will keep firing
where it last saw the player for a short period.
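The acquisition delay, warning cue, and last-seen-position memory described above can be sketched as a small per-tick state update. The timings and state names here are hypothetical, not the shipped values.

```python
class MinigunAI:
    """Sketch of the minigun behaviors described above."""
    def __init__(self, acquire_time=0.8, memory_time=1.5):
        self.acquire_time = acquire_time   # delay before locking on
        self.memory_time = memory_time     # how long to shoot at lost targets
        self.acquire_timer = 0.0
        self.since_lost = None
        self.last_seen_pos = None

    def update(self, dt, sees_player, player_pos=None):
        if sees_player:
            self.since_lost = 0.0
            self.last_seen_pos = player_pos
            if self.acquire_timer < self.acquire_time:
                self.acquire_timer += dt
                # Warn the player the moment the lock completes.
                if self.acquire_timer >= self.acquire_time:
                    return "play_acquired_sound"
                return "acquiring"
            return "fire_volley"   # volley length and gaps randomized elsewhere
        if self.since_lost is not None and self.since_lost < self.memory_time:
            self.since_lost += dt
            return "fire_at_last_seen"   # briefly suppress the player's cover
        self.acquire_timer = 0.0         # target fully lost: re-acquire later
        return "idle"
```

Ducking behind cover thus buys the player a breath, but the fire keeps landing where they just were.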
These changes to the minigun behavior eliminated the monotony and
gave players the impression that they were fleeing a living creature.
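The acquire-warn-fire-remember sequence above can be sketched as a small state machine. This is a hedged reconstruction, not Valve's code: the class name, timings, and state labels are all illustrative assumptions.

```python
import random


class MinigunAI:
    """Toy model of the Strider minigun behavior: acquire, warn,
    stitch the first volley, then fire staccato volleys, and keep
    shooting at the last known position briefly after losing sight."""

    LOCK_ON_DELAY = 1.0     # pause while 'acquiring' the player (assumed)
    MEMORY_DURATION = 2.0   # keep firing where the player was last seen

    def __init__(self):
        self.state = "idle"
        self.state_timer = 0.0
        self.memory_timer = 0.0
        self.last_seen_pos = None

    def update(self, dt, sees_player, player_pos=None):
        if sees_player:
            self.last_seen_pos = player_pos
            self.memory_timer = self.MEMORY_DURATION
        else:
            self.memory_timer -= dt

        if self.state == "idle" and sees_player:
            self.state = "acquiring"
            self.state_timer = self.LOCK_ON_DELAY
            self.play_sound("target_acquired")  # warn the player first
        elif self.state == "acquiring":
            self.state_timer -= dt
            if self.state_timer <= 0:
                self.state = "firing"
                self.fire_volley(self.last_seen_pos, stitch=True)
        elif self.state == "firing":
            if self.memory_timer <= 0:
                self.state = "idle"  # lost the player for too long
            else:
                self.state_timer -= dt
                if self.state_timer <= 0:
                    self.fire_volley(self.last_seen_pos, stitch=False)
                    # staccato: random gap before the next volley
                    self.state_timer = random.uniform(0.3, 1.0)

    def play_sound(self, name):
        pass  # placeholder audio hook

    def fire_volley(self, pos, stitch):
        pass  # placeholder firing hook
```

Note how ducking behind cover does not immediately stop the gun: the memory timer keeps volleys landing at the last seen position for a short period, which is the behavior the node describes.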

a4_04_doorframe_bash

To keep the Strider engagement interesting throughout the level,
we knew that we wanted to challenge the player's understanding
of what the Strider was capable of.
We identified this upcoming room as a space where most players
felt they understood the Strider's capabilities and
built this next beat to surprise them.
To create the doorframe bash moment, we used the Strider's
animation to drive an offline destruction simulation that
could be played back by the game engine in real time.
We coupled this with particle effects and physics impulses on
interactive physics objects in the room to make the player feel unsafe.
To fully sell the idea that the Strider has been searching for and has
found the player, we gave the Strider a moment to pause after tearing
away the wall to look at the player before it started to fire again.

a4_04_strider_sounds

When playtesting the Strider encounter, we learned that players enjoyed
fighting a challenging enemy but responded negatively if they
took damage in a way that seemed unfair.
To address this, we used audio to communicate the state of the Strider.
For example, when the Strider did not have line of sight to the player,
it would emit a periodic hunting sound.
This sound was meant to communicate to the player that they were currently safe,
but that the Strider was still there.
Once the Strider had seen the player, it would switch to
its 'chasing' state and make corresponding chasing sounds.
When the Strider was ready to start firing at the player,
it would emit a 'target acquired' sound before firing.
Players quickly learned this pattern of sound cues and no longer
complained about being caught unaware of the Strider's targeting state.
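The state-to-sound mapping described above can be written out directly. The state names and sound event names below are hypothetical stand-ins for whatever the game actually uses:

```python
# Hypothetical mapping of Strider AI states to audio cues.
STRIDER_STATE_SOUNDS = {
    "hunting":  "strider.hunt_loop",         # no line of sight: player is safe
    "chasing":  "strider.chase",             # player spotted, closing in
    "acquired": "strider.target_acquired",   # about to open fire
}


def strider_state(has_line_of_sight, ready_to_fire):
    """Derive the audio state from what the Strider can currently see and do."""
    if not has_line_of_sight:
        return "hunting"
    if ready_to_fire:
        return "acquired"
    return "chasing"


def sound_for(has_line_of_sight, ready_to_fire):
    return STRIDER_STATE_SOUNDS[strider_state(has_line_of_sight, ready_to_fire)]
```

The fairness fix is entirely in this ordering: the player can never take fire without first hearing the 'target acquired' cue, because "acquired" is only reachable after line of sight is established.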

a4_04_gauss_cannon

We needed to familiarize the player with the Strider's Gauss cannon firing
sequence so they'd be able to recognize it when they saw it again later in the map.
This particular area of the map went through a lot of iteration,
as we searched for ways to reliably place the player in the right spot
at the right time looking at the right thing:
the Gauss Cannon as it charged up and fired.
To achieve this, we ultimately settled on using a two-handed roller door.
It requires the player to stand in a specific position and orientation and
occupies their hands so we know they aren't teleporting or interacting with anything else.
We even know the precise moment that the door clears their eye line,
so we can be assured that they're going to see what we want them to see.

a4_04_destroy_strider

Earlier in development, when the central theme of this map was
'The Lion and the Mouse', the encounter with the Strider ended at this point.
The Strider simply exited off into the sunset.
We intended for the Strider to reappear in a subsequent fight,
perhaps even attacking the Combine alongside the player,
but this idea was only pursued briefly before being cut.
The problem was that playtesters felt cheated that they didn't get to
destroy the Strider after being hunted and harassed by it for so long.
In the final map, once the player steps into the rat maze below,
the Strider reappears and begins hunting the player again.
Because Alyx does not possess any weapons that can damage the Strider,
we explored a variety of ways that the player could take
out the Strider using something found in the environment.
These included ideas like triggering missiles on a downed Combine helicopter,
using a Combine mortar emplacement, or kicking off some
type of physics event to sweep the Strider's legs out and kill it.
We built prototypes of many of these ideas and eventually
narrowed down to the mounted gun solution that shipped.
It was simple, repeatable, and one of the easier solutions to teach players while under fire.

a4_04_mounted_gun

In this sequence, we wanted the player to end the encounter by killing the
Strider in a dramatic boss fight using a weapon found in the environment.
This meant that we had to introduce a new weapon which
was more powerful than the player's equipped weapons and
which had to remain in the environment after the fight.
A mounted gun on an abandoned Combine vehicle made the most sense, and
after some initial tests with a downed Combine helicopter repurposed
from Half-Life 2, we settled on the armored personnel carrier,
since one had already been built for use in another section of the game.
Since the gun mechanism itself would be seen for the first time
while under pressure from the Strider, it had to have an aiming and
firing mechanism that the player would understand immediately.
We prototyped a variety of designs, from a machine gun that auto-fired as
soon as the player grabbed the handle, to a gun that fired a sticky bomb
that had to be detonated by the player after it hit its target.
The auto-firing machine gun was too simple to use and lacked dramatic effect,
and the extra interaction required to detonate the sticky bomb
was too much to teach the player while under duress.
We settled on a design that requires one hand to aim
and another to both reload and fire.
Requiring the player to push the firing mechanism all the way forward to
reload was necessary to prevent the player from firing too rapidly.
The gun's shell reloading animation is directly driven by the player's
motion and communicates to the player that they must move their
hand all the way forward to reload before being able to pull it back to fire.
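The push-forward-to-reload, pull-back-to-fire cycle can be sketched as a check on a normalized slide position driven by the player's hand. The class, thresholds, and method names are illustrative assumptions:

```python
class MountedGun:
    """Toy model of the two-handed mounted gun: one hand aims, the other
    drives a slide that must travel fully forward to chamber a shell
    before pulling fully back fires it."""

    FORWARD_LIMIT = 0.95  # normalized slide position: fully pushed forward
    BACK_LIMIT = 0.05     # fully pulled back

    def __init__(self):
        self.loaded = False
        self.shots_fired = 0

    def update_slide(self, position):
        """position: 0.0 (pulled back) .. 1.0 (pushed forward),
        driven directly by the player's hand each frame."""
        if position >= self.FORWARD_LIMIT and not self.loaded:
            self.loaded = True          # full forward stroke chambers a shell
        elif position <= self.BACK_LIMIT and self.loaded:
            self.loaded = False         # full back stroke fires it
            self.shots_fired += 1
```

Because firing flips `loaded` back to false, pulling the handle back repeatedly does nothing until the player completes another full forward stroke, which is exactly how the design caps the fire rate.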

a5_01_broken_vault_music

As the player enters the vault and is immersed in its
warped reality, they lose contact with Russell.
With Russell no longer available to explain what the player was seeing,
we had to communicate the important concepts of the vault through
the use of visual effects, physics, sound design and music.
The most important concept was that the errant bursts of uncontrolled
energy causing anti-reality events were spillover from the Vortigaunt energy
being used to keep the prisoner in the vault, escaping as his influence
occasionally gained the upper hand through cracks in the field.
Starting from the vault crash in the prior map, we used this piece
of musical sound design to represent the newly uncontrolled
Vortigaunt energy being released from the vault.
And this piece
to accompany the reality-bending bursts caused
and swayed by the G-Man's increasing reach.
Within the vault, we repeat these motifs to reinforce the Vortigaunt
energy mechanic that the player ultimately uses as a weapon against the
Combine and to short-circuit the prison, releasing the prisoner, saving Eli, and completing the game.

a5_01_sideways_music

Right from the earliest days of the story design, the audio team
established a corresponding aesthetic plan that employed increasingly
strange sound design, soundscapes and music, culminating in a soundscape
appropriate to the broken reality of the vault and Alyx's encounter with the G-Man.
While the vault level was being designed, it became apparent that the
accompanying music would be crucial to selling the surreal nature of the vault.
Most music in the vault is based on the musical palindrome you hear in the sideways room.
By utilizing variations of elements in this piece and warping them with
multiple types of pitch and time bending, we created an appropriately abstract sonority.
While strange and dislocating, it was still consistent unto itself as
well as conceptually relevant to the multi-dimensional nature
of the G-Man and the Vortigaunt energy imprisoning him.

a5_01_surreal_vault

Half-Life: Alyx was set in a realistic world with familiar rules.
The G-Man's presence in the vault gave us fictional justification to
experiment with different distortions of reality
and embrace surrealism in these spaces.
Early experiments with gravity anomalies and distorting spatial expectations
were promising, as the immersiveness of VR worked to exaggerate their impact.
The surreal environments proved to be fertile ground for puzzle solving as well,
but they tended to stall the pacing at a point in the game where we needed
it to be ramping up, so they weren't included in the final release.
The most notable of these was in the mirrored apartment.
Players originally had to spot and correct differences between the top and
bottom apartment spaces in order to escape that area.
Instead of surreal interactive puzzles, the first half of the vault became a journey
through an increasingly bizarre set of spaces that were once
the apartment building the G-Man was captured in.

a5_01_vort_lightning

Supercharging the Russells with Vortigaunt energy helped us
achieve the goal of having an unarmed player overpower the last
few Combine soldiers guarding the G-Man.
Initially, we achieved this with grenades and a zero-gravity hallway.
Separately, we'd been experimenting with using the Russells to pull
and throw energy bolts from Vortigaunt energy nodes.
This was a natural fit for the vault combat section, since it evokes the
concept of supercharging the gravity gun at the end of Half-Life 2
and works with the player's well-mastered Russells throwing mechanic.
This new mechanic was introduced at a point where pacing was critical,
so it had to be intuitive to pick up and quick to master.
Giving the player the Vortigaunt energy attack not only reinforced
that it was Vortigaunt energy containing the G-Man,
but also helped further solidify Alyx's deep connection to the Vortigaunts.
It also gave the player a chance to briefly feel like they had super powers,
and be able to use that ability in a visceral,
deeply physical way that's impossible in real life.

a5_02_coetaneous

As the player first views and approaches the cage, the object of their
long quest, we wanted to give them a dramatic musical sense of the moment,
yet have it remain unsettling and consistent with the experience of the rest of the vault.
We continued to utilize variations of previously heard musical elements but
this time more clearly and slightly less abstracted.
As in previous pieces, the swirling string elements are taken from the
musical palindrome heard in the sideways room, but these pieces are
dislocated, out of sync and juxtaposed against each other to
represent multiple clashing realities and timelines.
As the player approaches the cage, a piano piece is introduced that works
with all three, almost binding them together but also standing apart,
just as the G-Man binds the various threads and eras of the Half-Life series.

a5_02_gman_room

The more surreal things got in the vault, the harder it became to design
puzzles that felt appropriately abstract but still understandable.
We found through testing that, at this stage of the game, players were hungry
for a narrative payoff and further puzzles ran the risk of becoming frustrations.
After some experimentation with more complex interactions, we settled on a
more theatrical presentation of the moment that the player frees the G-Man.
We simplified the design to a single interaction: the player grips two
handles to send Vortigaunt energy back into the system, destroying
the structure that kept the G-Man imprisoned.
The ending came together rather late in the process, since all of the
elements of art, story, level design, interaction, and music
had to be in place for it to succeed.
Nevertheless, we were happy with the outcome, and the response from players has been overwhelmingly positive.