Welcome back, part 2: Code


In the first post I talked about the changeover to this new site, but what has become of the game since you last saw it?

As I write this, on 9/25/2024, you can’t actually download it to find out, because I have to tidy it up and write a short piece of code to tie it into this new website, now that I’ve settled on how that’s all going to work. It won’t take long (I promise!!!), but I can only do one thing at a time. So for now, let me give you the big picture:

The last build I was able to put out was in late October 2022. That’s so long ago that I don’t even have a copy I can look at! If I remember right, the game opened with us standing in a field, near a farm. There have been at least three new worlds since then, and two changes of storyline, so it’s all a bit blurry. I can’t remember what the creatures looked like, but they may have been brown, horsey-looking things. Dunno.

Anyway, at that point a rabbit hole opened and I fell down it! The biggest code change, as far as the virtual world is concerned, was that I decided it was time to change rendering pipelines, from the rapidly dying standard pipeline to one of the two fancy new ones Unity had started to provide. I chose HDRP, the High Definition Render Pipeline.

HDRP allows for far better graphics and lighting, but it was a big change, with a lot of side-effects. We don’t actually have all that much in the way of fancy graphics yet, but at least we can, now. And it’s future-proof. Meanwhile, better weather effects and vegetation are steadily being implemented, partly as a result of HDRP and partly the adoption of other tools that this change opened up.

So, since I had to rebuild the entire world for HDRP, I built a new one. And then another. And another… Gradually, this resolved into the new scenario, which I’ll talk about in the next post.

As for the creatures, I don’t think they had a ‘complete’ brain yet, and there were many problems with getting the physics engine to support real, muscle-driven locomotion. Finally, I think I’ve pretty much solved that enormous problem. Not only can the new creatures walk without falling over or exploding, but they can trot, too. They can even gallop, after a fashion, and sometimes it’s hilarious to watch them try. I don’t think anyone has ever done this before, at least, not without secretly cheating, so I think it’s actually quite a big achievement that the creatures can really move around under their own steam, trip over obstacles, struggle up slopes, etc. This is not key frame animation; it’s not ‘the hand of God’, dragging them around, with their legs merely stumbling along in reaction; they really walk by pushing their feet against the ground, and they go literally where they want to go. Phew!
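
To give a feel for what ‘real, muscle-driven locomotion’ means, here’s a toy sketch in Python. It isn’t the actual code (the real thing lives inside Unity and drives dozens of muscles at once, and every number below is invented); the point is just that torque at a joint comes from a simulated muscle reacting to a brain signal, and the motion falls out of the physics:

```python
# Toy sketch (not the real code): one "muscle-driven" joint.
# A rhythmic brain signal becomes torque at the hip, and the leg's
# motion emerges from simulated physics, not key-frame animation.
import math

DT = 0.01          # physics timestep, seconds
INERTIA = 0.5      # leg moment of inertia (invented value)
DAMPING = 0.8      # passive joint damping
MAX_TORQUE = 20.0  # the muscle can only pull so hard

def muscle_torque(activation, angle, velocity):
    """Crude muscle: the brain's activation sets a target angle and a
    spring-damper pulls toward it, clamped to the muscle's strength."""
    target = activation * math.pi / 4
    torque = 60.0 * (target - angle) - 6.0 * velocity
    return max(-MAX_TORQUE, min(MAX_TORQUE, torque))

angle, velocity = 0.0, 0.0
for step in range(300):
    t = step * DT
    activation = math.sin(2 * math.pi * 1.5 * t)   # gait rhythm, 1.5 Hz
    torque = muscle_torque(activation, angle, velocity) - DAMPING * velocity
    velocity += (torque / INERTIA) * DT            # integrate the physics
    angle += velocity * DT

print(f"leg angle after 3 seconds: {angle:.2f} rad")
```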

I’ve now completed their whole brain structure. Not finished, just completed – there are literally hundreds of parameters to tweak, yet, and probably some better ideas will occur to me at some point. But the sensorimotor loop is now fully closed at all levels of the hierarchy, to put it in technical terms.
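
In code terms, ‘closing the loop at all levels’ means something like the toy sketch below: goals cascade down the hierarchy, getting more concrete at each level, and sensory surprise climbs back up. The class, the gains and the numbers are invented for illustration; the real hierarchy is far richer:

```python
# Toy sketch of a sensorimotor loop closed across a hierarchy.
# Goals cascade downward into ever more concrete predictions;
# sensory error climbs back up. Names and numbers are invented.

class Level:
    def __init__(self, gain):
        self.gain = gain        # how strongly this level commits
        self.prediction = 0.0   # what it expects to happen

    def down(self, goal):
        """Turn the level above's goal into this level's prediction."""
        self.prediction = goal * self.gain
        return self.prediction  # becomes the goal for the level below

    def up(self, error_below):
        """Re-express the error from below in this level's terms."""
        return error_below / self.gain

levels = [Level(0.9), Level(0.7), Level(0.5)]  # abstract -> motor
signal = 1.0                                   # top-level intention
for lv in levels:                              # intentions cascade down
    signal = lv.down(signal)

actual = 0.25                                  # what the body really did
error = actual - levels[-1].prediction         # bottom-level surprise
for lv in reversed(levels[:-1]):               # surprise climbs back up
    error = lv.up(error)
print("error as seen at the top:", round(error, 3))
```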

I improved the structure of their emotional systems and added social emotions. They can now infer what other creatures are feeling, by watching their expressions and how they behave, and this allows them to change their own behavior as a result. So, for instance, if they think their mother is sad, they might feel bad themselves and decide to bring her a present, to cheer her up. Whereas, if they think their sworn enemy is sad, they might feel something very different and choose to respond accordingly. Obviously, this is an extremely complex system in real life, so I can only hope to approximate such things yet, but the set of mechanisms I have is quite advanced, by the standards of simulated emotions. If indeed there are any standards out there to aspire to. They’ll just need a lot of tweaking before the creatures react emotionally enough not to be psychopaths, yet not so emotionally that they all get terminally depressed.
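
Stripped down to a cartoon, the mechanism looks something like this sketch. The expression lookup, the weights and the action names are all invented for illustration; the real system runs on simulated chemistry and is far more nuanced:

```python
# Toy sketch of a social emotion: read another creature's expression,
# weight it by how we feel about them, and adjust mood and behavior.
# The lookup table, weights and actions are invented for illustration.

EXPRESSION_TO_EMOTION = {            # a crude 'theory of mind' table
    "drooping":     ("sad",   0.8),
    "baring_teeth": ("angry", 0.9),
    "bouncing":     ("happy", 0.7),
}

def react_to(expression, relationship):
    """relationship: +1.0 for a loved one ... -1.0 for a sworn enemy."""
    emotion, confidence = EXPRESSION_TO_EMOTION.get(expression, ("unknown", 0.0))
    if emotion != "sad":
        return "carry_on", 0.0
    # Empathy: their sadness makes a friend feel bad, an enemy feel good.
    my_mood_shift = -confidence * relationship
    if relationship > 0.5:
        return "bring_present", my_mood_shift
    if relationship < -0.5:
        return "gloat", my_mood_shift
    return "carry_on", my_mood_shift

print(react_to("drooping", relationship=0.9))   # mum: roughly ('bring_present', -0.72)
print(react_to("drooping", relationship=-0.9))  # enemy: roughly ('gloat', 0.72)
```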

In the process, I’ve changed the way they learn, meaning that they can now learn by observation. Before, if one creature eventually learned how to light a fire, for instance, the others would still have to discover this skill entirely for themselves. Given how much trial and error is involved in learning anything, this meant that very little progress could be made. But now, if one creature learns that pressing a certain button lights the fire, any other creatures who see them do it, may get the hint and be able to copy the trick next time they feel cold. In real life, observational learning is hugely important, especially for creatures without language, so I’m glad I was able to find a way to implement something similar.
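
In outline, the trick is that a skill learned by watching is stored much like one learned first-hand, just with less confidence behind it, so it still gets tested in practice. A toy sketch, with invented names and numbers:

```python
# Toy sketch of learning by observation: a skill learned second-hand
# is stored like a first-hand one, but with lower confidence, so it
# still needs trying out. Illustrative only -- not the real code.

class SkillMemory:
    def __init__(self):
        self.skills = {}   # (context, action) -> (outcome, confidence)

    def learn_firsthand(self, context, action, outcome):
        self.skills[(context, action)] = (outcome, 0.9)

    def learn_by_watching(self, context, action, outcome):
        # Watching is weaker evidence than doing it yourself.
        key = (context, action)
        if key not in self.skills:
            self.skills[key] = (outcome, 0.3)

    def suggest(self, need):
        """Pick the best remembered action whose outcome meets the need."""
        best = None
        for (context, action), (outcome, conf) in self.skills.items():
            if outcome == need and (best is None or conf > best[2]):
                best = (context, action, conf)
        return best

watcher = SkillMemory()
watcher.learn_by_watching("near_fireplace", "press_button", "fire_lit")
print(watcher.suggest("fire_lit"))  # ('near_fireplace', 'press_button', 0.3)
```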

Obviously, interpreting what another creature is doing from your viewpoint as an observer, and then being able to do the same thing for the same reasons yourself, from your own viewpoint, is incredibly sophisticated in real life (it may be what ‘mirror neurons’ are all about). So I had to ‘cheat’ somewhat to do it on a personal computer, but it’s potentially a big step forward and I’ll doubtless be able to improve on it over time.

On the subject of language, they used to be able to learn human words and speak them, up to a point, but I’ve decided this is too ‘fake’, and spoils the sense of immersion, so I’ve removed it and haven’t yet replaced it with anything. If anyone has to learn a language, we should learn theirs, not them ours. They do now ‘mutter’ to themselves, as they go about their day, and we can probably distinguish angry muttering from distress, or tell when they’re feeling really cold. But I have to do a lot more work on this, yet.

They do go blue with cold, flush red when they’re hot, and go green when they’re sick, but I think they may have done that already. I definitely improved on this, but I forget exactly how. They shiver when cold, wobble when drunk; that sort of thing. Oh, and you can see their breath in cold weather – a tiny touch, I know, but it helps. There’s so much going on under the hood, and it’s good for some of that to appear on the surface so that we can see it, even if it’s a bit stereotyped.
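
The surface part, at least, is simple enough to sketch: blend a tint from a few internal variables. The thresholds and colours below are invented, and the real shading happens inside Unity, but the principle is the same:

```python
# Toy sketch of surfacing internal state on the skin: blend a tint
# from body temperature and toxin level. Thresholds and colours are
# invented for illustration.

def skin_tint(body_temp_c, toxin_level):
    """Return an (r, g, b) tint, 0..1 per channel."""
    r, g, b = 1.0, 1.0, 1.0                 # neutral skin
    if body_temp_c < 35.0:                  # cold -> bluish
        chill = min(1.0, (35.0 - body_temp_c) / 10.0)
        r -= 0.5 * chill; g -= 0.3 * chill
    elif body_temp_c > 39.0:                # hot -> flushed red
        flush = min(1.0, (body_temp_c - 39.0) / 4.0)
        g -= 0.4 * flush; b -= 0.4 * flush
    if toxin_level > 0.2:                   # sick -> greenish
        sick = min(1.0, toxin_level)
        r -= 0.4 * sick; b -= 0.4 * sick
    return (max(r, 0.0), max(g, 0.0), max(b, 0.0))

print(skin_tint(30.0, 0.0))   # shivering cold: bluish
print(skin_tint(41.0, 0.0))   # overheated: reddish
print(skin_tint(37.0, 0.8))   # poisoned: greenish
```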

The creatures you last saw had short necks, which meant looking around was a simple matter of moving a joint. The new creatures are loosely based on alpacas, which means they have very long necks. That made merely looking at things a fair bit more complicated, but I was able to fix it using the ‘instincts’ mechanism I use throughout their brains – they now decide which head and neck muscles should do what by generating a whole pattern of motor activity at once. It’s not important – it’s just another way in which my basic ‘theory’ is working out to be more powerful than I’d expected. Of course, like every other step it came with new problems – they’d catch a glimpse of something almost behind them, decide the best way to see it was to look back between their legs, and then spend the next hour walking around dragging their heads along the ground, unaware that they could just look forward now!
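
Here’s the flavour of the idea as a toy sketch: one request (‘look over there’) produces a coordinated pattern of targets for every neck joint at once, instead of steering a single joint. The joint count and weights are invented:

```python
# Toy sketch of the 'whole pattern at once' idea: an instinct emits a
# coordinated set of joint targets for the entire neck, given where
# the creature wants to look. Joint count and weights are invented.
import math

NECK_JOINTS = 5   # alpacas have long necks!

def gaze_pattern(target_bearing_rad):
    """Distribute the total turn across all neck joints in one go,
    easing off toward the head so the curve looks natural."""
    weights = [1.0 / (i + 1) for i in range(NECK_JOINTS)]
    total = sum(weights)
    return [target_bearing_rad * w / total for w in weights]

# Look 90 degrees to the left: every joint gets its share of the turn.
for i, angle in enumerate(gaze_pattern(math.pi / 2)):
    print(f"joint {i}: {math.degrees(angle):5.1f} deg")
```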

A surprising amount of the past two years has been spent rewriting and re-rewriting their basic visual system – how and why they move their eyes, how things attract their attention, what features enable them to recognize objects and the state of those objects, and so on. There’s still a lot wrong with how it works, but it works a lot better than it did.
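
To give a rough idea of the attention side of it, here’s a toy sketch: every visible object earns a salience score from a few features, weighted by the creature’s current needs, and the eyes go to the winner. The features and weights are invented:

```python
# Toy sketch of visual attention: each visible object gets a salience
# score from a few features (motion, novelty, relevance to current
# needs), and the gaze goes to the winner. Everything here is invented.

def salience(obj, needs):
    score = 1.5 * obj["motion"] + 1.0 * obj["novelty"]
    score += 3.0 * needs.get(obj["affords"], 0.0)  # hungry? food pops out
    return score

objects = [
    {"name": "butterfly", "motion": 0.9, "novelty": 0.6, "affords": "play"},
    {"name": "carrot",    "motion": 0.0, "novelty": 0.1, "affords": "food"},
    {"name": "rock",      "motion": 0.0, "novelty": 0.0, "affords": None},
]
needs = {"food": 0.8, "play": 0.1}   # a fairly hungry creature

target = max(objects, key=lambda o: salience(o, needs))
print("look at:", target["name"])    # carrot: hunger beats the flashy butterfly
```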

Were their genes still stored in ‘DNA’ form, as mysterious strings of characters? I forget. Anyhow, this just got increasingly in the way, for very little benefit, so I dropped it. Now their genetics are human-readable. It’s not as ‘realistic’, but it’s a heck of a lot easier to debug and understand!

There’s not a lot of genetic variation in the population at the moment, because getting them to work at all has been more important than getting some to work better than others. But it’s now a simple matter of adding more ‘alleles’ to the parameters inside each gene. I can still create standardized creatures for testing, but ‘natural-born’ creatures will be able to vary in literally billions of subtle ways, right down to how quickly an individual enzyme decays, or how sensitive a cluster of neurons is to reinforcement. And these changes interact to produce much more abstract consequences – how pessimistic a creature tends to be, for instance.
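
To show what human-readable genes with alleles might look like, here’s a toy sketch. The gene names and values are invented examples, but the principle is the one described above: standardized test creatures take the defaults, and natural-born ones sample from the alleles:

```python
# Toy sketch of human-readable genetics: each gene is a named parameter
# with a set of alleles. Gene names and values are invented examples.
import random

GENOME_SPEC = {
    "enzyme_decay_rate":          {"default": 0.50, "alleles": [0.35, 0.50, 0.65]},
    "reinforcement_gain":         {"default": 1.00, "alleles": [0.80, 1.00, 1.25]},
    "neuron_cluster_sensitivity": {"default": 0.70, "alleles": [0.55, 0.70, 0.90]},
}

def make_genome(standardized=False, rng=random):
    """Standardized creatures are identical; natural-born ones vary."""
    if standardized:
        return {gene: spec["default"] for gene, spec in GENOME_SPEC.items()}
    return {gene: rng.choice(spec["alleles"]) for gene, spec in GENOME_SPEC.items()}

print(make_genome(standardized=True))
print(make_genome())   # a 'natural-born' individual, different every time
```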

I’ve improved the way tools work in the user interface, but most of this hasn’t surfaced yet and will come soon. One of the consequences is that creatures can learn by observing us, as well as observing each other. This mostly gives us a somewhat simplistic way to teach creatures things – we can just show them by doing it ourselves. But if we drop an object in front of a creature, they may think we are giving them a gift, for instance. I’ve no idea how far we can go with this, but it’s a step towards being able to train creatures and interact with them at an emotional level.

What else? The way the world is simulated has improved somewhat. For instance, I can create patches of ground where creatures can graze, or if a creature decides to lie down in a patch of nettles, it is more uncomfortable than a patch of long grass. There’s been a major change in the deep structure of the simulation – I can now define ‘payloads’, which integrate all the different responses to a state change – emitting sounds, injecting chemicals, modulating emotions, etc. In many cases, objects in the world require little to no hard code, and most of their behavior can be defined in the payloads through which they react to messages. This mostly helps me, but it will probably help modders, too.
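
As a toy sketch of the payload idea (with invented message and effect names): the nettle patch below needs no hard code of its own, just data describing how it responds when a creature lies down on it or grazes it:

```python
# Toy sketch of 'payloads': an object's reactions to messages are data,
# not hard code. Each payload bundles all the responses to one state
# change. Message and effect names are invented for illustration.

NETTLE_PATCH = {
    "on_lie_down": [
        {"emit_sound": "rustle"},
        {"inject_chemical": ("histamine", 0.4)},
        {"modulate_emotion": ("discomfort", +0.6)},
    ],
    "on_graze": [
        {"inject_chemical": ("nutrient", 0.1)},
        {"modulate_emotion": ("discomfort", +0.2)},   # nettles sting!
    ],
}

def deliver(creature, payloads):
    """Apply every response in a payload to the creature."""
    for response in payloads:
        for effect, value in response.items():
            if effect == "emit_sound":
                print("sound:", value)
            elif effect == "inject_chemical":
                chem, dose = value
                creature["chemicals"][chem] = creature["chemicals"].get(chem, 0) + dose
            elif effect == "modulate_emotion":
                emotion, delta = value
                creature["emotions"][emotion] = creature["emotions"].get(emotion, 0) + delta

creature = {"chemicals": {}, "emotions": {}}
deliver(creature, NETTLE_PATCH["on_lie_down"])
print(creature)
```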

Okay, this is a really long post already, so I’ll stop. All I can do is give you a rough idea of what I’ve been doing – the rest you’ll have to figure out by experiment! I’m so sorry I went silent for so long, but as you can see, I wasn’t wasting my time! It just got too hard to keep you up to date with these hundreds of changes (not to mention changes of mind) at the same time as actually implementing them. It’s been hard, but I think I’ve broken the back of it now.

This post was on the ontological level that I call ‘physicality’ – code-related stuff and progress with the project. The next post will be on the ‘virtuality’ level, where I can talk about the new virtual world and what I have in mind for it…


25 Comments
danielmewes
1 month ago

Having the ability for social relationships, empathy and learning from another is incredibly exciting and I think could create an entirely deeper level of relating to the creatures as well for the human player!

That’s also quite the feat. I can only imagine that there must have been some very smart insights and many details that you had to figure out to make it all work within the brain. Putting oneself into another individual’s shoes sounds straightforward at first (“just run your circuitry but replace state and sensory input by what you know or can infer about the other being’s state, duh”). But the actual mechanics of how to do this, and the “coordinate transform” between what you can extrinsically observe about another being into the other’s intrinsic “self”, must be very nontrivial to get working? Even with a code-level cheat, it can’t have been *that* obvious how to do this while transferring between different brain variations and layouts?

I’m looking forward to hearing more about how you do it in the future!

On another subject:
With the absence of human-interpretable language in the game, do you worry that it might make it difficult for the player to grasp the complexity of what’s going on in the creatures’ minds? Just thinking of how we humans struggle to attribute intelligence and sometimes even sentience to non-human animals, just because they don’t express their internal state to us in a way we’re familiar with (e.g. dolphins, crows, etc.).

Mabus
1 month ago

About what the creatures see – how do they see? Always wondered that with norns and grendels, if they saw “numbers” for objects, their size, their position or whatever? Same question obviously goes for those new creatures – how do their eyes work? Pixels or other meta information?

And about the language, most Creatures players started to speak/understand some basic norn language anyway. After some time you just know what they mean and say – however, it was neat as a springboard for new players to understand them better.

Midnight
1 month ago

“Of course, like every other step it came with new problems – they’d catch a glimpse of something almost behind them, decide the best way to see it was to look back between their legs, and then spend the next hour walking around dragging their heads along the ground, unaware that they could just look forward now!”

I had a thought when reading this: I think probably the main thing stopping animals (or humans, for that matter) from doing the same thing is discomfort. When you’re upside down, all the blood rushes to your head. Also, probably your neck would get stiff from doing that. Dunno if that’s a reasonable thing to simulate or not, though.

Zach the Cat Guy
1 month ago

“Not only can the new creatures walk without falling over or exploding…”

That’s a problem for me, Steve. I was hoping to watch simulated creatures fail, to feel better about myself.

Zach the Cat Guy
1 month ago

“the set of mechanisms I have is quite advanced, by the standards of simulated emotions. If indeed there are any standards out there to aspire to”

The only real standards out there are set by Chris Crawford, and nobody has really met them, because nobody has bothered to try beyond the bare minimum, except maybe The Sims.

BoB3K
1 month ago

Great info, thanks!

“And it’s future-proof”

Ha ha ha… oh yeah, I’m sure.
