2/22/2014 – Creature AI

Status

I’ve stressed again and again that the AI in FRONTIERS is simple. But even simple AI is pretty complex. I’m overdue for a long-form devlog so here’s a big huge update about how the creature AI works in FRONTIERS.

Note: I’ll just say it once and get it out of the way – I’m not an AI guy. There are probably better ways to put a system like this together. So if you stumble across this devlog from the outside world don’t mistake it for an expert opinion.

The Components

Creatures are made up of two main parts, CreatureBody (body) and CreatureBase (brain). Here’s a list of their components.

  • Body – The creature’s avatar. Each creature has its own body object, which is set up by hand. Apart from collision information, all data flows from the CreatureBase object to the Body object. It has the following components:
    • Skinned mesh / Skinned mesh renderer
    • CreatureBody – inherits from WorldBody. Simple MonoBehaviour that keeps track of BodyPart objects, which are attached to hand-picked bones in the transform hierarchy. Also handles the conversion to a ragdoll on creature death.
    • AnimationController – All creatures use the same animation controller combined with custom animation avatars.
    • CreatureAnimator – inherits from BodyAnimator. Simple MonoBehaviour that makes pushing animation states simpler. Animation substitutions for each creature are handled here on startup. Also used to sync up the avatar’s network state.
    • Rigid body

  • CreatureBase object – this object is shared among all creatures. It’s a single WorldItem that has the following WIScripts by default.
    • Creature – Has a lot in common with the Character script. Early on both used the same script (Alive) but their roles eventually became different enough to warrant splitting it off. The creature is the central hub of the CreatureBase object. It stores references to the body object, the creature den and all WIBehaviors. (More on those later.) It also stores many of the Actions used to broadcast messages from the den and from other WIScripts.
    • Container – All creatures are containers, and on death these containers are filled with goodies from the DeadAnimalParts category. Bones, morsels of meat, etc.
    • FillStackContainer – Instructions on when to fill the container (on death) and what to fill it with.
    • Flammable – Yes, you can set creatures on fire.
    • Damageable – Durability will vary from creature to creature, but all can be killed by damage.
    • Motile – This is the script that drives motion. It’s like the creature’s motor.
    • Looker – Looks for items of interest broken down by category. Players, Friendlies, Threats, Shinies, Tasties. This script has several Actions for dispatching messages when something important is spotted. (OnSeeShiny, etc.)
    • Listener – A simple script that listens for noises within a range. If noises are heard it tells the creature to focus on it.
    • HunterGatherer – The source of every creature’s ‘base’ behavior. If nothing else is going on it will tell the creature to wander idly until Looker spots something. If that something is a Tasty it will seek it out and eat it.

      It also has a few MonoBehaviour scripts:

    • AIPath – Pathfinding script, follows the path given. This is where the Motile script pushes most of its instructions.
    • Seeker – Pathfinding script, asks for / receives paths. Used by AIPath.
    • Network object / Network sync – Keeps the object synced in multiplayer

Spawning the Creature

Creatures are defined in something called a CreatureTemplate, which is just a set of State classes for each of the WIScripts in the CreatureBase object, as well as a few simple instructions for building the body with the correct mesh and texture. The template is where all the variation between creature types is stored – eg, in the Wolf template’s FillStackContainerState I might add the ‘WolfTeeth’ category, and in the BrownBear template’s FillStackContainerState I might add the ‘BearClaws’ category.

When a creature is spawned the Creatures class instantiates a copy of CreatureBase, sets the state of each WIScript from the creature’s template, instantiates a separate body object, makes sure everything is properly linked and then dumps it into the world at the specified location.
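The spawn flow above can be sketched in a few lines. This is an illustrative Python analogue, not the actual FRONTIERS code – class names, field names and the dict-based "state" are all assumptions standing in for the real WIScript state classes:

```python
# Sketch of template-driven spawning: a shared base object gets its
# per-script state copied in from a creature template (names hypothetical).

class CreatureTemplate:
    def __init__(self, name, script_states, body_mesh):
        self.name = name
        self.script_states = script_states   # per-WIScript state overrides
        self.body_mesh = body_mesh           # instructions for the body object

class CreatureBase:
    def __init__(self):
        # every creature shares the same default set of WIScripts
        self.scripts = {"Creature": {}, "Container": {}, "FillStackContainer": {},
                        "Flammable": {}, "Damageable": {}, "Motile": {},
                        "Looker": {}, "Listener": {}, "HunterGatherer": {}}
        self.body = None
        self.position = None

def spawn_creature(template, position):
    creature = CreatureBase()
    # apply the template's state to each matching WIScript
    for script_name, state in template.script_states.items():
        creature.scripts[script_name].update(state)
    # build the body separately and link it back to the base object
    creature.body = template.body_mesh
    creature.position = position
    return creature

wolf = CreatureTemplate("Wolf",
    {"FillStackContainer": {"categories": ["DeadAnimalParts", "WolfTeeth"]}},
    "wolf_mesh")
spawned = spawn_creature(wolf, (10.0, 0.0, 4.0))
```

The point of the shape is that the base object never changes; only the data poured into it does, which is what makes new creature variations cheap to add.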

Great, so how does all that add up to AI?

The goal of AI is to create the illusion of agency by getting an object to do the right things at the right times. 99% of the time the ‘right thing’ involves moving the object from point A to point B. Go from where you are to where the player is. Go from where you are to away from where the fire is. Go from where you are to where the player told you to go. Etc.

There are a thousand ways to go from A to B. I broke them down to a handful of atomic actions called MotileActions, which are stored in and interpreted by the Motile WIScript. The Motile script stores a queue of MotileActions and other scripts push actions onto that queue to tell the creature where to go.

Here are the important bits in a MotileAction:

  • Type – what kind of A to B we’re looking at (see below)
  • Target – Usually the ‘B’ in the A to B equation.
  • State – whether it’s waiting to start, started, finished, etc.
  • Method – pathfinding, straight-shot, flocking, etc. Different creatures use different methods to find the best path from A to B.
  • Expiration – when do we STOP trying to move from A to B? We have the following options:
    • Duration
    • TargetInRange / TargetOutOfRange
    • NextNightfall / NextDaybreak - useful for nocturnal / diurnal routine behaviors
    • Never!
  • YieldBehavior – What to do when we’re interrupted. Our options are:
    • YieldAndFinish
    • YieldAndWait
    • DoNotYield
  • Range / OutOfRange / Duration and other expiration settings
  • CustomAnimation – Overrides whatever the default BodyAnimator is doing with something custom. Usually used in combination with ‘Wait’ to make creatures do something unique.
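The expiration options are probably the least obvious part of the list, so here's a rough Python sketch of how they might be evaluated each update. The function signature and the string-keyed action are illustrative assumptions, not the actual FRONTIERS API:

```python
# Sketch of checking a MotileAction's expiration settings (names hypothetical).

def has_expired(action, elapsed, dist_to_target, is_daytime, was_daytime):
    exp = action["expiration"]
    if exp == "Duration":
        return elapsed >= action["duration"]
    if exp == "TargetInRange":
        return dist_to_target <= action["range"]
    if exp == "TargetOutOfRange":
        return dist_to_target > action["range"]
    if exp == "NextNightfall":        # nocturnal routines kick off at dusk
        return was_daytime and not is_daytime
    if exp == "NextDaybreak":         # diurnal routines kick off at dawn
        return not was_daytime and is_daytime
    return False                      # "Never!"

# a flee action that expires once the threat is more than 25 units away
flee = {"expiration": "TargetOutOfRange", "range": 25.0}
```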

Here are the different motile action types (ie, the different flavors of ‘A to B’):

  • Wait – Yup.
  • FollowGoal – Find a path to the goal object and move along that path.
  • FleeGoal – Find a path away from the goal object and move along that path.
  • FocusOnGoal – Orient the body object so it’s facing the goal object, but otherwise don’t move.
  • WanderIdly – Randomly place the goal object in the world, then follow it. Occasionally stop and play an idle animation. (This used to be handled by a script that pushed MotileActions periodically but I was using it so often I turned it into a separate action type.)
  • FollowTargetHolder – Attach your goal object to a TargetHolder (a script which holds goal objects and moves them in patterns around the target). This action type has four patterns:
    • Stalker – Move in a long-range circle around the target holder
    • Companion – Move alongside the target holder
    • Follower – Tag along behind the target holder
    • Attacker – Occupy one of the open ‘attacker’ spots in the target holder (N,E,S,W)
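The Attacker pattern is worth a sketch. Here's a hypothetical Python version of a TargetHolder handing out its four compass slots – the class name, slot layout and `claim_slot` method are all illustrative assumptions about how such a holder could work:

```python
# Sketch of a TargetHolder assigning the open N/E/S/W attacker slots
# around its target (names and layout are assumptions).

class TargetHolder:
    OFFSETS = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}

    def __init__(self, target_pos, radius):
        self.target_pos = target_pos
        self.radius = radius
        self.occupied = {}            # slot name -> attacker

    def claim_slot(self, attacker):
        # hand the attacker the first open compass slot, or None if all taken
        for slot in ("N", "E", "S", "W"):
            if slot not in self.occupied:
                self.occupied[slot] = attacker
                dx, dz = self.OFFSETS[slot]
                x, z = self.target_pos
                return (x + dx * self.radius, z + dz * self.radius)
        return None

holder = TargetHolder(target_pos=(0.0, 0.0), radius=2.0)
wolf_goal = holder.claim_slot("wolf_1")   # first attacker takes the N slot
```

Because the holder moves the goal objects itself, the creatures just follow their assigned goal and the circling/flanking emerges for free.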

Every so often the Motile script checks the action at the top of the action queue to see what the creature should be doing. If it’s supposed to be moving it sets the creature’s target movement speed & rotation, then tells the pathfinding scripts to find a route. The Creature script uses that route to actually translate itself through the world, then lets the CreatureBody know where it is so it can catch up.

The movement produced by MotileActions is customized a bit by properties defined in the creature’s template. Running speed, walking speed, turning speed, etc. can vary a MotileAction’s output from swift and precise to slow and lumbering.

This could have been so much simpler…

Creating a basic AI script without going through the trouble of creating MotileActions is trivially easy. When I was prototyping AI behaviors I wrote something like ten in an afternoon – Wolf, Rabbit, Turtle, etc. They weren’t very sophisticated and they broke easily, but they worked. They were also ‘simple’ in the sense that you could glance at them and see what was happening within 30 seconds. It would take you a lot longer to follow a MotileAction’s life cycle from start to finish.

So why bother? Well, a couple of reasons. The first is moddability, as always. You can save, load, modify and push MotileActions from new scripts with ease. The second is modularity and code re-use – I find that staying on top of 10 simple scripts is harder than staying on top of one complex script, especially when there’s a lot of redundant code in those 10 scripts.

The third is short-term memory. Without some kind of action queue it’s impossible to create AI that reliably remembers what it was doing after an interruption.

A to… wait what was I doing?

This is more of a problem for Characters than animals – I think most people are willing to forgive a bit of short-term memory loss from a rabbit or a bear. But say an AI creature is on its way to do something, and you interrupt it for a moment. Once the interruption is over it should get back to whatever it was doing before the interruption, right? One interruption deep isn’t too tricky to hack together without an action queue. But what if you interrupt the interruption? And what if you interrupt that interruption? How many interruptions can the creature handle before it won’t remember what it was up to before it all began?
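A minimal sketch of why a queue solves this: each interruption sits on top of the thing it interrupted, and finishing it reveals what was underneath, no matter how deep the nesting goes. (This is an illustrative Python toy, not the real Motile queue.)

```python
# Nested interruptions as a stack: finishing each one restores the last.

class ActionQueue:
    def __init__(self):
        self._stack = []

    def interrupt(self, action):
        self._stack.append(action)    # new action usurps the top spot

    def finish_current(self):
        self._stack.pop()             # expired action reveals what it usurped

    def current(self):
        return self._stack[-1] if self._stack else None

q = ActionQueue()
q.interrupt("walk to den")
q.interrupt("flee fire")              # first interruption
q.interrupt("dodge player")           # interrupt the interruption
q.finish_current()                    # player gone...
resumed_once = q.current()            # ...back to fleeing the fire
q.finish_current()                    # fire out...
resumed_twice = q.current()           # ...and the original errand survives
```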

Whenever you push a MotileAction onto the queue you specify a priority:

  • Normal – Add it to the back of the queue (but above the base action). It will be seen to in the order received.
  • ForceTop – This is so important it can’t wait.
  • ForceBase – This is the new base action. From now on whenever the creature has nothing to do, it will do this.


This priority combines with its YieldBehavior to give us a final order. Actions that are told to YieldAndWait can be temporarily usurped by other behaviors, then bubble back up to the top of the queue after those actions have expired. Actions that are told not to yield (eg ‘Run the hell away from fire’) can prevent other actions from taking the top spot until they’re done. The interactions between YieldBehavior and Priority aren’t always perfect – every once in a while you’ll get a ‘forgotten’ action that temporarily makes a creature act slightly batty – but overall it’s pretty solid and behaves as expected.
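Here's a rough sketch of how the three priorities and DoNotYield could interact when actions are pushed. The class and field names are hypothetical stand-ins for the real Motile implementation:

```python
# Sketch of Normal / ForceTop / ForceBase pushes, with a DoNotYield
# action defending the top spot (illustrative, not the actual Motile code).

class MotileQueue:
    def __init__(self, base_action):
        self.base = base_action       # fallback when nothing else is queued
        self.queue = []               # index 0 is the active action

    def push(self, action, priority):
        if priority == "ForceBase":
            self.base = action        # new 'nothing to do' behavior
        elif priority == "ForceTop":
            if self.queue and self.queue[0]["yield"] == "DoNotYield":
                self.queue.insert(1, action)   # wait until the top finishes
            else:
                self.queue.insert(0, action)
        else:                          # "Normal": back of the queue
            self.queue.append(action)

    def top(self):
        return self.queue[0] if self.queue else self.base

q = MotileQueue(base_action={"name": "wander", "yield": "YieldAndWait"})
q.push({"name": "flee fire", "yield": "DoNotYield"}, "ForceTop")
q.push({"name": "investigate noise", "yield": "YieldAndWait"}, "ForceTop")
top_name = q.top()["name"]            # fleeing the fire still wins
```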

A to B to C to D…

OK great. We have a creature that can go from A to B when you tell it to. But who’s telling it to go from A to B? And what about C & D? This is where things start to get trickier.

The Hostile script is a good example of using MotileActions to produce (relatively) complex behavior. Here’s the workhorse function, UpdateHostile:

Code: Select all
public IEnumerator UpdateHostile ()
{
    //give a chance for others to update target
    yield return null;

    if (!HasPrimaryTarget) {
        //something has gone wrong
        //stop being hostile for now
        Finish ();
        yield break;
    }
    //stage 1 - stalk the target
    //set the mode for the benefit of other scripts
    State.Mode = HostileMode.Stalking;
    yield return StartCoroutine (StalkOverTime ());

    //we're done stalking - do we still have a primary target?
    //if not then stop being hostile
    if (!HasPrimaryTarget || State.Mode != HostileMode.Stalking) {
        Finish ();
        yield break;
    }
    //stage 2 - warn the target
    State.Mode = HostileMode.Warning;
    yield return StartCoroutine (WarnOverTime ());
    //we're done warning - do we still have a primary target?
    //if not then stop being hostile
    if (!HasPrimaryTarget || State.Mode != HostileMode.Warning) {
        Finish ();
        yield break;
    }
    //stage 3 - attack the target
    //keep going until we no longer have a target
    State.Mode = HostileMode.Attacking;
    yield return StartCoroutine (AttackOverTime ());
    //we're done here
    Finish ();
    yield break;
}

Once a creature is Hostile – ie, once the Hostile WIScript has been added to CreatureBase – the progression is really simple. First the creature stalks you, then warns you a few times, then attacks you. Stalking is done by adding a MotileAction of type FollowTargetHolder, with a FollowType of Stalker. The motile action is set to expire after a time specified in the creature’s template. Here is the StalkOverTime function:

Code: Select all
protected IEnumerator StalkOverTime ( )
{
    //start stalking our target
    Motile motile = worlditem.Get <Motile> ();
    MotileAction stalkAction = new MotileAction ();
    stalkAction.Type = MotileActionType.FollowTargetHolder;
    stalkAction.Method = MotileGoToMethod.Pathfinding;
    stalkAction.FollowType = MotileFollowType.Stalker;
    stalkAction.LiveTarget = PrimaryTarget;
    stalkAction.Expiration = MotileExpiration.Duration;
    stalkAction.YieldBehavior = MotileYieldBehavior.YieldAndWait;
    stalkAction.RTDuration = State.StalkTime;
    motile.PushMotileAction (stalkAction, MotileActionPriority.ForceTop);

    //wait for the action to start before waiting
    yield return StartCoroutine (stalkAction.WaitForActionToStart (0.1f));//(waits for intervals of 0.1 seconds until the action starts)

    //wait for the action to finish
    //if the stalk action is finished
    //that means our stalk time is up
    while (State.Mode == HostileMode.Stalking && !stalkAction.IsFinished) {
        yield return null;
    }
    //if our state isn't stalking
    //and we're not finished
    //we don't need this any more
    //so cancel just in case
    stalkAction.Cancel ();

    yield break;
}

Since stalkAction can be interrupted by another script the Hostile script is freed up from having to check whether events [x,y,z] are more important than being hostile. And since it will wait for stalkAction to finish before moving on to other stages, a whole mess of interruptions can happen without causing the creature to forget that it’s hostile.

Here’s the next stage, warning:

Code: Select all
protected IEnumerator WarnOverTime ( )
{
    //start warning our target
    Motile motile = worlditem.Get <Motile> ();
    WorldBody body = motile.Body;
    //create & add the warn action
    MotileAction warnAction = new MotileAction ();
    warnAction.Type = MotileActionType.FollowTargetHolder;
    warnAction.FollowType = MotileFollowType.Attacker;
    warnAction.LiveTarget = PrimaryTarget;
    warnAction.YieldBehavior = MotileYieldBehavior.YieldAndFinish;
    warnAction.Expiration = MotileExpiration.Never;//only expire manually
    motile.PushMotileAction (warnAction, MotileActionPriority.ForceTop);
    //Debug.Log ("HOSTILE: Starting warnings");
    yield return StartCoroutine (warnAction.WaitForActionToStart (0.1f));

    while (NumTimesWarned <= State.NumWarnings) {
        //even if the action is cancelled warn at least once before leaving
        //start the warning
        yield return new WaitForSeconds (State.Attack1.RTPreAttackInterval);
        NumTimesWarned++;
        //send this message so the character / creature / whatever can emit sounds and effects and set animation
        OnHostileWarnTarget();
        body.Animator.Warn = true;
        //finish the warning
        yield return new WaitForSeconds (State.Attack1.RTPostAttackInterval);
        NumTimesWarned++;
        yield return null;
    }
    warnAction.Cancel ();

    yield break;
}

This action tells the creature to occupy an attacker space in the target object’s target holder. Then it issues a warning – for a wolf this would be barking. The number of times it warns the player before actually attacking is specified in the creature’s template. Finally, this action is set to YieldAndFinish – meaning that if it’s interrupted, it goes straight to the attack stage instead of resuming its warnings.

The warnings are meant to give the player a chance to get away before things get really hairy. But they’re not very useful if the creature always attacks after the warnings are finished. And we don’t want to clutter up the hostile script with a bunch of checks for whether or not the player is a threat – the purpose of the Hostile script is just to perform hostile actions, not handle a ton of other logic. So how do we slip that interruption in there?

This is where we get into WIBehaviors. You might have been wondering what this was all about in the UpdateHostile function:

Code: Select all
if (!HasPrimaryTarget || State.Mode != HostileMode.Warning) {
    Finish ();
    yield break;
}

The hostile script checks the state between each stage because it can be set externally by other scripts using the CoolOff function:

Code: Select all
public void CoolOff ()
{
    State.Mode = HostileMode.CoolingOff;
}

There are many, many reasons why a creature might no longer feel hostile towards its target. The main one is because the target has left its den. So we need a way for the Hostile script to be told when a target object has left its den.

Again, this would be trivially easy if every creature behaved exactly the same and if I didn’t care about moddability. At each stage I could include a check:

Code: Select all
Creature creature = worlditem.Get <Creature> ();
if (Vector3.Distance (PrimaryTarget.transform.position, creature.Den.transform.position) > creature.Den.Radius) {
    State.Mode = HostileMode.CoolingOff;
}



But what if a creature doesn’t cool off when you leave its den? There are several that don’t. I could include a flag in the creature’s template:

Code: Select all
Creature creature = worlditem.Get <Creature> ();
if (creature.State.CoolOffWhenTargetLeavesDen && Vector3.Distance (PrimaryTarget.transform.position, creature.Den.transform.position) > creature.Den.Radius) {
    State.Mode = HostileMode.CoolingOff;
}


Okay, but there are some creatures that cool off during the day but not at night. What now, another flag? Oh, and what if we want to keep using the Hostile script for Characters and not just creatures? Ugh. I don’t mind flags but add more than a handful and suddenly every script becomes exactly the sort of mess that Motile was created to help us avoid.

Code: Select all
Creature creature = null;
if (worlditem.Is (out creature)) {
    if (WorldClock.IsDaytime) {
        if (creature.State.CoolOffWhenTargetLeavesDen && creature.State.CoolOffDuringDay) {
            //Blecchh
        }
    } else {
        //Blecchh
    }
}
Character character = null;
if (worlditem.Is (out character)) {
    //character stuff
    //you get the picture...
}



Do A/B/C when D/E/F

My solution is something called a WIBehavior. It’s a really simple class that acts as a bridge between the different scripts in a Creature or Character object. WIBehaviors are text-based, so they’re easy to modify and change up from creature to creature. They can do four things and only four things:

  1. Subscribe to an Action owned by a WIScript
  2. Add WIScripts to an object
  3. Remove WIScripts from an object
  4. Invoke a method on a WIScript

Here’s the entire class:

Code: Select all
[Serializable]
public class WIBehavior
{
    public void OnAction ()
    {
        if (!mInitialized || !WorldClock.IsTimeOfDay (TimeOfDay))
            return;

        WIScript targetScript = null;
        bool scriptExists = false;
        //do we have a script to add?
        if (!string.IsNullOrEmpty (WIScriptMessageRecipient)) {
            if (ScriptAction == BehaviorScriptAction.GetOrAdd) {
                //if we're supposed to get or add it, do so
                targetScript = Initializer.worlditem.GetOrAdd (WIScriptMessageRecipient);
                scriptExists = true;
            }
            else {
                //if we don't get or add it, check if it exists
                if (Initializer.worlditem.Is (WIScriptMessageRecipient, out targetScript)) {
                    //if it exists...
                    if (ScriptAction == BehaviorScriptAction.Remove) {
                        //if we're supposed to remove it, do so now
                        targetScript.Finish ();
                        scriptExists = false;
                    } else {
                        //otherwise just report that it exists
                        scriptExists = true;
                    }
                }
            }
        }
        if ((!scriptExists) || string.IsNullOrEmpty (MethodName)) {
            //not all behaviors invoke a method
            return;
        }

        MethodInfo methodInfo = targetScript.ScriptType.GetMethod (MethodName);
        if (methodInfo != null) {
            methodInfo.Invoke (targetScript, new System.Object [] { });
        }
    }

    public bool Initialize (WIScript initializer)
    {
        bool isValid = false;
        Initializer = initializer;
        //subscribe to the appropriate action
        WIScript dispatcher = null;
        if (initializer.worlditem.Get (WIScriptActionOwner, out dispatcher)) {
            FieldInfo actionField = dispatcher.GetType ().GetField (ActionName);
            if (actionField != null) {
                Action action = actionField.GetValue (dispatcher) as Action;
                if (action != null) {
                    action += OnAction;
                    //delegates are immutable - write the combined delegate back to the field
                    actionField.SetValue (dispatcher, action);
                    isValid = true;
                }
            }
        }
        mInitialized = true;
        return isValid;
    }

    public string Description = string.Empty;
    [XmlIgnore]
    [NonSerialized]
    public WIScript Initializer;
    public BehaviorTOD TimeOfDay = BehaviorTOD.All;
    public string WIScriptActionOwner;
    public BehaviorScriptAction ScriptAction = BehaviorScriptAction.Get;
    public string ActionName;
    public string WIScriptMessageRecipient;
    public string MethodName;
    [NonSerialized]
    protected bool mInitialized = false;

    public enum BehaviorScriptAction
    {
        Get,
        GetOrAdd,
        Remove,
    }
}

In the past few months I’ve moved away from using SendMessage to communicate between WIScripts on a single gameObject. It’s sometimes too convenient not to use, especially when doing something that affects the entire object, but it’s inefficient when you’re only trying to target one or two scripts. WIBehaviors are filling that gap. They’re defined in a list owned by a single WIScript and initialized on startup. The only non-serialized field is their WIScript owner, so they’re saved in their owner’s state on save / load. (In the case of Creature this lets me define behaviors in the creature’s template.)

On initialization the WIBehavior uses its owner reference to look up an existing WIScript on the owner object, then uses reflection to find a System.Action to subscribe to. (If either of these things doesn’t happen the behavior is marked as invalid and removed.)

This approach hinges on gathering and distributing useful information with Actions. The Looker class has the following:

Code: Select all
public Action OnSeePlayer;
public Action OnSeeSomethingFriendly;
public Action OnSeeSomethingShiny;
public Action OnSeeSomethingThreatening;
public Action OnSeeSomethingTasty;

A WIBehavior could subscribe to any one of these and add a script or invoke a method on an existing script as a result. If we wanted an extremely hostile creature we could add the following behavior to a creature’s template:

  • Description: Attack player on sight!
  • WIScriptActionOwner: Looker
  • ActionName: OnSeePlayer
  • ScriptAction: GetOrAdd
  • WIScriptMessageRecipient: Hostile
  • MethodName: AttackImmediately

Now whenever this particular creature sees the player it will get or add the Hostile script (ensuring that it exists) and then invoke the AttackImmediately method, which forces the Hostile script to bypass stalking and warning and go straight for the kill. The Hostile script is kept nice and clean while producing very different behavior.
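The whole 'attack on sight' wiring can be mimicked in a few lines of Python, with string-keyed events and getattr standing in for the C# reflection lookups. Everything here – the class names, the get_or_add/subscribe/fire methods – is an illustrative analogue, not the real WorldItem API:

```python
# Sketch of string-driven behavior wiring: an event name mapped to a
# get-or-add plus a method invocation (names are hypothetical).

class Hostile:
    def __init__(self):
        self.mode = "Stalking"

    def AttackImmediately(self):
        # bypass stalking and warning, go straight for the kill
        self.mode = "Attacking"

class WorldItem:
    def __init__(self):
        self.scripts = {}             # script name -> script instance
        self.actions = {}             # action name -> subscriber callbacks

    def get_or_add(self, script_cls):
        return self.scripts.setdefault(script_cls.__name__, script_cls())

    def subscribe(self, action_name, callback):
        self.actions.setdefault(action_name, []).append(callback)

    def fire(self, action_name):
        for callback in self.actions.get(action_name, []):
            callback()

def make_behavior(item, recipient_cls, method_name):
    # GetOrAdd the recipient script, then invoke the named method on it
    def on_action():
        getattr(item.get_or_add(recipient_cls), method_name)()
    return on_action

wolf = WorldItem()
wolf.subscribe("OnSeePlayer", make_behavior(wolf, Hostile, "AttackImmediately"))
wolf.fire("OnSeePlayer")              # Looker spots the player
```

Note that the behavior itself never imports Hostile's logic; it only knows two strings and a lookup, which is what keeps the scripts decoupled.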

Downsides

This approach is clean in some ways but dangerous in others. Lots and lots of behaviors lead very quickly to spaghetti and good luck debugging it. Using Actions helps a bit because it limits what information the behaviors can listen for, which constrains their use. A small set of highly specific actions tends to keep things in line. I also make sure not to use them unless they’re absolutely necessary for customizing the behavior of a single, clean script in one or two specific ways. When the usage of a script has diverged to the point where different creatures need ten WIBehaviors to get the right result I opt to branch the script and axe the behaviors.

Making use of WorldItems

One last way I make creatures do A when C is by making use of WorldItems, WIScripts and WIGroups. WorldItems cache all their WIScripts so it’s very inexpensive to check whether a WorldItem is a Creature or Hostile or whatever. WIGroups provide a way to search the world for WorldItems that have a particular script over time – sort of like GetAllComponentsOfType, except using a Coroutine. The purpose of these functions was to save and load data but I’ve found them useful for other things as well.
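The 'search over time' idea maps naturally onto a generator: scan a batch, yield back to the game loop, repeat. This is a Python sketch of the pattern, not the actual WIGroup code, and the dict-shaped world items are an assumption:

```python
# Time-sliced search for world items carrying a particular WIScript:
# yield after each batch so the scan spreads across frames.

def find_items_with_script(items, script_name, batch_size=2):
    matches = []
    for i, item in enumerate(items):
        if script_name in item["scripts"]:
            matches.append(item["name"])
        if (i + 1) % batch_size == 0:
            yield None                # hand control back to the game loop
    yield matches                     # final yield delivers the results

world = [
    {"name": "wolf",   "scripts": {"Creature", "Hostile"}},
    {"name": "carrot", "scripts": {"Edible"}},
    {"name": "bear",   "scripts": {"Creature"}},
    {"name": "rock",   "scripts": set()},
]

results = None
for step in find_items_with_script(world, "Creature"):
    results = step                    # in a game, each None step spans a frame
```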

The Looker script is a good example. Different creatures react differently to the same object – a rabbit will eat a carrot while a wolf will eat a rabbit. The purpose of the Looker script is to inform the Creature when it sees an object of a specific type (eg, ‘Tasty’ or ‘Threatening’) and the easiest way to pre-filter objects is to categorize them based on their scripts. When a WorldItem enters the Looker’s field of view it doesn’t need to know anything about the creature using it other than which script names belong in which categories. The Looker also uses WIGroups to look up targets that aren’t immediately visible but which are in the general vicinity (useful when creatures can’t physically see a target that would be pretty obvious in real life). The same WIScript names can be used during this search. And of course a list of script names is easy to modify in the creature’s template.
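The pre-filtering boils down to set intersection over script names. Here's a minimal sketch, assuming per-creature category tables like the ones a template might define (the specific categories and script names below are made up for illustration):

```python
# Sketch of Looker-style filtering: map a seen item's script names
# into the creature's categories (table contents are hypothetical).

def categorize(seen_scripts, categories):
    hits = []
    for category, script_names in categories.items():
        if seen_scripts & script_names:      # any overlap puts it in the bucket
            hits.append(category)
    return sorted(hits)

rabbit_looker = {
    "Tasty":       {"Edible", "Plant"},
    "Threatening": {"Creature", "Hostile"},
}
wolf_looker = {
    "Tasty":       {"Creature"},             # a wolf will eat a rabbit
    "Threatening": {"Hostile"},
}

carrot_scripts = {"Edible", "Plant"}
rabbit_scripts = {"Creature"}
```

Swapping the table is all it takes to change what a creature notices, which is why the same Looker script serves every creature type.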

Alright that’s it for now.

Read more here: 2/22/2014 – Creature AI

2/19/2014

Status

Let’s see, what did I finally make moddable today?

  • Skills – You can now modify existing skills or even create new skills from scratch. This includes new skill categories and sub-categories. New skills can even use existing skills as prerequisites or replace them outright. Best of all, if people want to get *really* advanced new skills can even use custom scripts!
  • Status Meters – Health, hunger, thirst etc. and all the effects they have on one another are all moddable. You could also add new status meters like ‘Mana’ or ‘Fabulousness’ or whatever and link them to new skills, items and abilities.
  • Color Schemes – All the colors for the interface (including skill category, missions, status meters etc.) can be changed. Custom colors can be added and referenced by name.
  • Creature Templates – You can create new creature variations. All behaviors – hostility, day/night behavior, preferred foods, etc. – can be specified, and body types / textures can be mixed and matched at will.

Productive day!

Read more here: 2/19/2014

2/18/2014

Status

It’s been pretty quiet on the devlog front, which tends to happen when I’m working the hardest. We’ve had a number of pretty cool breakthroughs over the past few weeks.

The first is that all our chunks are imported. The very first chunk I created was 1300×1300 – it’s the one alpha testers got to wander around in the Benneton alpha. Until recently I’ve been stuck working with a maximum of 2-3 1500×1500 chunks loaded on top of that. Any more and we’d run out of memory so there was no reason to import more of Given’s work as actual game content. But as we’ve expanded gameplay beyond the borders of Benneton I’ve had to find and squash inefficiencies so I could playtest it – not an easy process in Unity, which assumes you’ll be loading and unloading content in big lumps (ie levels) and not trying to load and unload it in tiny lumps as you move through an open world. At this point I’ve bumped the number of concurrently loaded chunks to around 12 and nearly all of our 30 1500×1500 chunks (not counting the Uncharted Lands) are fully imported. We still get out of memory errors on occasion and there’s a lot more to do with level of detail generation but on the whole we’re doing well.

That first chunk took weeks to set up because I was feeling my way along and building the system as I went. Yesterday I set up two in an hour. That’s a good sign.

The next breakthrough is that every city, village and landmark has been blocked out (literally blocked, usually as blue cubes, though they’re slowly being replaced.) Many have yet to be properly named – you’ll see a lot of ‘RiverbogWoods1’ locations on the current world map – but the naming of things is blissfully simple compared to the setup. And since we can finally load all the chunks at once instead of one at a time I’ve been able to walk from one major landmark to another and really get a feel for the distances between everything for the first time since Given overhauled the terrain.

Read more here: 2/18/2014

2/15/2014 – Steam Dev Days PT3

Status

Let’s see where did I leave off?

The next demo was a fully lit, fully textured assembly line from Portal 2. (I did a search for ‘portal 2 assembly line’ and found this, which is very similar to if not exactly the same as what I saw.) It was loaded while I was facing in the opposite direction so I heard it before I saw it, and when I turned to look a robotic arm nearly took my head off. On the spectrum that exists between an involuntary flinch and a completely rational decision to step out of the way this fell somewhere in the middle. This was another brush with ‘presence’ – I didn’t just duck the way you might wearing the Oculus DK1, and I didn’t duck because that’s what you’re ‘supposed to do’ – I ducked because I saw a giant metal arm moving really, really fast and I knew - I knew - that something that ‘big’ and ‘heavy’ could seriously damage me. I’ve been near industrial machinery before and this made me uncomfortable in exactly the same way.

But what’s interesting is how quickly that feeling went away once I started paying attention to the textures. When I noticed that certain features like rivets were ‘painted on’ I couldn’t stop thinking of it as cardboard covered with stickers with printed textures on them. Once I had ‘decided’ that the robot arms were made of cardboard they suddenly ‘felt’ much less ‘heavy’ and I stopped feeling uncomfortable around them. (I have a feeling that if they had been using denser geometry and a plain white texture like the white factory demo this wouldn’t have been a problem.) Forgive the over-use of quotes in this paragraph; I’ve never had to describe this sort of thing before.

I’m going to skip a few similar demos (mostly static scenes with Portal 2 geometry and environments) and get to my favorite one. You might think that seeing Portal 2 stuff would be mind-blowing, and it was to a point, but these environments also reminded me how limited my movement was – I couldn’t climb the walls or go down any hallways. I have a feeling that’s why my memories of those demos don’t stick as much – it’s mostly a memory of limitations, and that’s not very exciting.

The demo that really opened my eyes to what a VR game could be was the tiny office building. I found myself in a dark void with a knee-high office building in front of me. The building lacked a roof, sort of like a dollhouse, and inside there were dozens of abstract little people typing at computers and walking down hallways and drinking coffee and so on. (Imagine that the silhouettes from street signs peeled themselves off and started a company). Ambient sounds of phones ringing and people typing and chit-chat filled my ears as I got closer to it. The detail was fantastic, and if I wanted to see something in more detail I could just lean in and look closer. It’s hard to express how freeing it is to navigate without rotating a mouse or tapping a key – it takes you to a completely different head space.

Aha! I didn’t realize it at the time but these were actually more Portal 2 assets. I just did a search for a visual aid and found exactly what I was looking at. Just imagine this repeated a few dozen times end to end, with a bit more variety in the props and actions:

I loved this demo because when I knelt down and peered over these little people I suddenly ‘got’ the VR experience.

VR will be about intimacy.

Unlike my experiences of the bigger spaces, I felt no limitations here, only endless possibility. Small environments. Tiny movements. Spaces opening inward, not outward. Secrets within secrets, details within details. A hundred possibilities for games that no one has ever played – games no one ever could play except through this medium – flashed instantly through my head. I’ve written several down but I doubt I could forget a single one.

Everyone (including myself) kept thinking VR was going to be about making epic experiences even more epic – exploring endless dungeons, flying through vast stretches of space, battling gigantic robots, whatever, except you’re really there!!11!!. But we had it backwards. Epic is not VR’s strength and probably won’t be for a while.

The final demo drove this point home. It was the epitome of epic and should have left me breathless but the experience was flat next to the smaller, more intimate moments. Now I know I’ve seen a music video by the same artist using the same rendering technique in the past, but for the life of me I can’t find it – just know that they flew me through a crazy undulating world of isosurfaces rendered using a ray-marching algorithm that samples time in a strange way, and that the result is like zooming around the MCP’s brain in TRON. (Side note: viewing this using the DK1 would have made me instantly nauseated but after a full five minutes I didn’t even feel light-headed. VR sickness really does look to be a solvable problem.) This was a real trip and I enjoyed the hell out of it… but watching it on a 2D screen is only about 25% less cool than seeing it in VR. I know that sounds insane – I can hardly believe I’m typing it – but the little things felt like the big things should have and vice versa.

Aha, found it! Now I want you to watch this video. Then you’ll be able to imagine how goddamn impressive that little office scene must have been if it made a greater impact than a mind-blowing cosmic journey like this:

(By the way, if you’re interested in watching any of the dev days talks you can catch them all here.)

Read more here: 2/15/2014 – Steam Dev Days PT3

2/10/2014

Status

The dreaded out of memory error has been appearing in earnest since I began bringing all the new chunks online. With up to 9 Benneton-sized chunks visible in detail (and even more visible in low quality) texture memory gets eaten up fast. Given has been pretty good about texture re-use – using the same texture with overlay detail when we need rocks for a different region, for example – but with this much variety in the locations we always knew we’d have to deal with this eventually.

My first target is structure textures. My goal is to reduce the texture size (currently a massive 4096×4096 per variation per pack) by half through more efficient UV mapping. Next are world item packs, many of which aren’t atlased and most of which use their own material. I’m going to put a 1024×1024 cap on those and see where that leaves us. If we’re still hurting then I’m coming for the trees. We use a lot of varieties of trees even within a species and Unity’s tree system uses one texture per tree type. But I strongly suspect I can force some more texture re-use there.
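For a rough sense of what's at stake, here's the back-of-the-envelope math, assuming uncompressed 32-bit RGBA with no mipmaps (real numbers depend on compression and mip chains, and note that halving each *dimension* cuts memory to a quarter, while halving the *area* halves it):

```python
# Back-of-the-envelope texture memory, assuming uncompressed 32-bit RGBA.

def texture_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

mb = 1024 * 1024
print(texture_bytes(4096, 4096) / mb)   # 64.0 MB per 4096x4096 variation
print(texture_bytes(2048, 2048) / mb)   # 16.0 MB with both dimensions halved
print(texture_bytes(1024, 1024) / mb)   # 4.0 MB at the proposed world-item cap
```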

Read more here: 2/10/2014

2/5/2014

Status

Where have I been, you ask? Working on a new gameplay video and a new alpha. Hang tight, it’ll be worth the wait.

Across the Rift, the Tower looms…

Read more here: 2/5/2014