❛THE BASIN❜💦— Creeks, Rivers & Waterfalls

That is very easy to answer… The path of clouds is not determined by the terrain. The problem is not that water moves. Procedurally generated motion is no problem. The problem is that water moves along the terrain, i.e. its motion is determined by context. And context is what you don’t have, what you cannot have.

Erosion is easier that way, because nobody will look very closely at whether the direction of your erosion is consistent. It doesn’t even need to be, because it’s not the only geological process at work, so a dry river bed that goes uphill every once in a while is not particularly notable. If your river does that… well, there’s a problem.

I’m not sure what you mean by 2D blocks. As far as I can tell, NMS terrain is a 3-dimensional slice out of a 4-dimensional function. But yes, a system of rivers that doesn’t take context into account is certainly possible. Not even that hard to do. But is that really desirable? Are the occasional gorgeous visuals really enough to offset the jarring feeling you get every time you see a river flow uphill (which will be lots of times), or potentially even hanging in the air?

Again, not sure what you mean by 2D block. But yes, features strictly isolated to a small area of terrain would be possible, but again I’m not sure it’s worth the effort. You’d have pretty nasty pop-in too, because the entire area needs to be known before the ever-so-small river can actually be generated, which means that you might suddenly find yourself submerged when you were walking on dry land just a second before…

Hmmm… there might be a misconception about voxels here. Voxels are not directly related to the 3D rendering process (the rendering process itself happens on the GPU anyway; there’s no information coming back from there). They are the information that the terrain will be rendered from, true, but they need to go through polygonisation first. If my assumption is correct that they’re slicing a 4D function, then the voxels are the terrain. There’s no 2D representation of the terrain like there is, for example, in Elite Dangerous, where terrain is indeed a 2-dimensional map in which every pixel value is the height of that point. In NMS, I would think that the voxels are the output of the function.
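To make the distinction concrete, here’s a toy sketch (both functions are invented purely for illustration, not HG’s or FDev’s actual formulas) of a 2D height map versus a 3D density function sampled per voxel:

```python
import math

def height_2d(x, z):
    """Elite-Dangerous-style map: each (x, z) yields exactly one height,
    so overhangs and caves are impossible by construction."""
    return 10.0 * math.sin(x * 0.1) * math.cos(z * 0.1)

def density_3d(x, y, z):
    """NMS-style (as assumed above): a function sampled per voxel.
    Solid where the value is positive, so overhangs and caves can occur."""
    return (10.0 * math.sin(x * 0.1) * math.cos(z * 0.1)
            - y + 3.0 * math.sin(y * 0.5))

# The 2D map gives one surface height per column...
h = height_2d(4.0, 7.0)
# ...while the 3D field must be queried at every y to find solid/air cells.
column = [density_3d(4.0, y, 7.0) > 0.0 for y in range(-15, 15)]
```

Note that nothing in `density_3d` knows what its neighbours look like; every voxel is computed in isolation, which is exactly the context problem under discussion.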

I think I finally see what you’re getting at. This is basically what they do for volcanoes and buildings, although they don’t do that in “pre”; they’re just running a different seed and function to place them, in order to be able to query them on demand when you’re triggering a scan or something like that. The trouble with rivers is that even small ones are comparatively large features next to these, and buildings and volcanoes can just stomp the terrain flat if they need to, so that isn’t a problem. But for a river, you’d again need to know the terrain it is flowing through, so you have to generate that swath at some point before placing the river.

At least, this would be a solution that is indeed theoretically possible. The question then becomes: how much CPU does NMS have left to spare to generate the necessary terrain swath when you’re getting close, without freezing up the simulation and without producing the pop-ins I mentioned above? I could very well see small lakes being generated this way; all they’d need to do is dig out some terrain and fill it up with water. Although even that is already much more expensive than the flattening for buildings, because you now have to trace the circumference to find the right point for the local water level.

But yes, in this form, it is “merely” a question of available resources, and not entirely impossible.
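For the lake idea, the circumference tracing could look something like this hypothetical sketch (a priority-flood over a tiny height map; the function name, the grid, and all numbers are made up to show the principle):

```python
import heapq

def lake_level(heights, start):
    """heights: dict[(x, z)] -> elevation. Returns the water level for the
    depression containing `start`: the lowest elevation at which water
    escapes the basin (the spill point on the rim)."""
    visited = {start}
    frontier = [(heights[start], start)]
    level = heights[start]
    while frontier:
        h, (x, z) = heapq.heappop(frontier)
        if h > level:
            level = h  # water must rise this high to pass this cell
        for n in ((x + 1, z), (x - 1, z), (x, z + 1), (x, z - 1)):
            if n not in heights:
                return level  # reached the map edge: water spills out here
            if n not in visited:
                visited.add(n)
                heapq.heappush(frontier, (heights[n], n))
    return level

# A tiny 3x3 bowl: rim at height 5, pit at height 1, lowest rim cell at 3.
grid = {(x, z): 5 for x in range(3) for z in range(3)}
grid[(1, 1)] = 1   # the pit
grid[(2, 1)] = 3   # the lowest point on the rim (the spill point)
print(lake_level(grid, (1, 1)))  # -> 3
```

The cost grows with the size of the depression, which is why even this is already much more expensive than flattening a building pad.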


I would like to reply to the wonderful responses:

@gunnar thinks that perhaps a vast procedural basin would require “storing a huge amount of [data].” @stryker99 feels that “with the insane amount of [data already] on every planet,” the concept of rivers feels like “a drop in the ocean in comparison.”

I personally remain undecided as to how much data would be involved, as the actual riverbeds would likely be a layer of terrain, like the dried riverbeds we already have (see OP). If the water is just a ‘static’ (unmoving) layer of terrain (rendered to look like water, the way the oceans look, with a space you can swim in), and that’s all there is to it, then I wouldn’t think this would involve much more data than what we already have. But IF they managed to actually pull off a complex procedural system of direction (process), intensity (process) and flow (process), I would think that the more processes are added to the procedural system, the more data would be involved. — Of course, as @gunnar states, “the whole thing about procedural generation [is that] the formula does not change, so the result is always the same. The place every part gets is calculated from that formula. That’s why this game is [so much smaller when compared with other games.]” — So if the entire water basin is just based on ‘one super formula,’ the actual data involved may not be all that big, and the resulting file size increase may not be, either. Granted, how complex a system? How intricate, exactly? What would the overall ‘processing impact’ be on gameplay?
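The ‘one super formula’ point can be illustrated with a minimal, made-up sketch: the value of any voxel is a pure function of the seed and its coordinates, so nothing has to be stored, and re-querying always reproduces the same result:

```python
import hashlib

def voxel_value(seed, x, y, z):
    """Hypothetical sketch of the 'one formula' idea: every voxel's value is a
    pure function of the galaxy seed and its coordinates. Nothing is stored;
    querying the same coordinates always reproduces the same result."""
    digest = hashlib.sha256(f"{seed}:{x}:{y}:{z}".encode()).digest()
    return digest[0] / 255.0  # a repeatable pseudo-random value in [0, 1]

a = voxel_value(1234, 10, 20, 30)
b = voxel_value(1234, 10, 20, 30)  # same seed, same coordinates
assert a == b  # identical every time, on every machine
```

That determinism is what keeps the install size small: the data cost is the formula, not the output.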

Will the GPU / CPU / FPS / etc. spike, trying to load, render, and maintain its process(es)? :thinking: IDK

It does appear that it would need to be a permanent fixture in the terrain; normally these render in post, such as the dried riverbeds, which render in post (as a terrain layer) and are then superimposed onto the terrain. Sean describes this process in his video, when referring to those Perlin Worm squiggles above ground. He also states that complex cave systems are based on Perlin Worm terrain generation too, but underground. Therefore, the dried riverbeds are also based on Perlin Worms, but they actually snake along the terrain.

I tell you, all those dried riverbeds need is some water. :sweat_drops: So close, Sean. :pinching_hand: So close!

I agree with you 100% ! — It’s just that Sean and Innes stated clearly (in their GDC 2017 videos) that they run into this issue of voxels not being able to communicate, unable to get the context of each other.

Why? Because… The dry riverbed is only hugging close to the terrain due to rendering in post and then being superimposed onto the terrain. And you might be able to superimpose ‘stagnant’ (non-moving) water into the riverbed, but due to the problem of voxels not being able to communicate with one another, you can’t get the context required to animate flowing water that logically follows the terrain.

Dear Sean and HG, couldn’t these Perlin Worm riverbeds be filled with (at least) ‘stagnant’ water? — In a later step, could the ‘derivative’ (which looks at the angle) be used for ‘intensity’ of flow? Maybe even ‘direction,’ in the sense of ‘around a river bend?’ — Granted, for flow, we still have this issue of voxels.


It would appear to me that we need a ❛next gen voxel,❜ for next gen apps, that fixes this problem!

Can HG (themselves) fix this ‘problem with voxels?’ :face_with_monocle: Can someone else fix it?

You’re still not understanding the gravity of the problem. The question is not where the river is now. That’s easy to determine. The question is where it comes from and where it goes.

So you have a nice bed here and you fill it with water. Even stagnant water. Great. But 500m further down, that bed peters out into… well, into what? A ditch? A valley? You don’t know. Soooo… you have that water, even stagnant water, in there, and now it just… ends where the bed ends? That might work if the bed peters out softly, but what if the terrain just breaks open into a wide valley?

Now either the water that is in the bed just stops, forming a Mosaic wall, or it just continues flowing as if it were still in the bed, in which case you get an Abyssian waterhead, or it does what water is expected to do… it fills the entire valley up to the level it had when it left the bed. But… this water is now a product of you having walked along the river bed that was continuously filled up as you went. If you approach the valley from the other side without encountering that river, the valley might be dry. Or it might have water even higher.

A river is the result of cause and effect, and you just don’t have these in an engine of this kind. You either know where it starts, where it ends and what it does in the meantime, or you end up with a hot mess.


Are you perhaps referring to the tile-based method of procedural map generation?

That exists as well, but I’m not sure that is what Sean and Innes are talking about.

In games like Diablo, each 2D tile of the map was pre-created by developers, and some elements (crates, plants, lights, debris, rocks) were randomly added into pre-created slots. (To use your PRE terminology.) But in NMS, only the elements (crates, plants, lights, debris, rocks, space stations, trade stations, etc.) are pre-created; the rest is generated. That’s what they are talking about.
I haven’t fully understood it myself yet (I can generate hilly Perliny landscapes with Perliny trees and a fixed water table, but I haven’t figured out yet how I would add rivers to that. Or even how I can simply place a building so it faces a beautiful view or a street, and does not have its door obstructed by a hill, or its driveway ending in front of an abyss) :joy:
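For what it’s worth, the fixed-water-table part can be sketched in a few lines (a toy elevation function stands in for real Perlin octaves; everything here is invented for illustration):

```python
import math

def hills(x, z):
    """Toy elevation function standing in for layered noise octaves."""
    return (4.0 * math.sin(x * 0.05) * math.cos(z * 0.05)
            + 1.5 * math.sin(x * 0.21 + 2.0) * math.cos(z * 0.17))

WATER_TABLE = 0.0  # a single global water level, like NMS oceans

def classify(x, z):
    """Everything below the table is water; no flow, no context needed."""
    return "water" if hills(x, z) < WATER_TABLE else "land"

kinds = {classify(x, z) for x in range(0, 200, 5) for z in range(0, 200, 5)}
```

A single global water level works precisely because it needs no context: every column can decide land-or-water on its own. Rivers break that independence, which is the whole problem.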

I also considered going back to the old 2D tiles method, it’s easier to handle.


That’s one more of those things that you don’t think about much in advance, but then suddenly realise that it’s impossible :rofl:
I think it’s mostly realisations like these that gave HG such a hard time during initial development (Oh, we thought we’d cross that bridge when we get there, but now it turns out there’s no bridge…). It just takes a while to develop an intuition for procedural systems…


2D Terrain Generation

I’m referring to whatever Sean Murray and Innes McKendrick (((THE DEV))) were referring to. :man_shrugging:

Sean Murray (a key developer of NMS) states in his GDC talk that the terrain in NMS is made up of not just one terrain layer, but numerous math(s) and terrain layers all working together at the same time. I’m hearing, “I do not know what you are talking about,” when I describe 2D terrain gen existing in NMS. So what I’m echoing is the words of Innes McKendrick (a key developer of NMS) from his 2017 GDC talk…


Innes McKendrick states:

So, to break that down, to how we actually do it, let’s start with 2D terrain generation. And so our first stage, as an optimization, more than anything else, but also because it simplifies things conceptually, is to block out some shapes in 2D. So, we’ll split areas into mountains, into maybe some smoother plains down there, there’s a river on there, and this doesn’t come out as voxel data, this is just a series of values that are saying the extent to which this voxel is a mountain, or is smooth, or is a river. So it’s a much larger amount of data, but we’re only retaining it for a short period of time, until we go on into 3D generation. And we do this essentially for each voxel column. So if you imagine we were generating on a flat plane, it would be across the x-edge, but because we’re generating on a sphere, then it’s across the surface of our sphere, but without any height. So what comes out of this is like generating a height map, you know, without any overhangs or caves, we have the heights of hills, things like that.

So according to him (and he should know best, as a key developer of NMS), what he describes sounds like a 2D layer of ‘blocks,’ not small like a voxel, but instead large sections (or “areas”) where terrain will span out, or render. And he states that these 2D blocks (or “areas,” or “shapes”) load before the voxels and 3D gen load. So we have to factor in that stuff like this is at play as well. And this also means that this is another ability that can (potentially) be utilized for our (experimental) concept of procedural rivers.
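A hypothetical sketch of what such a 2D block-out stage might look like (the weighting functions and constants are entirely made up, just to show the shape of the data Innes describes: per-column terrain-type weights, no heights yet):

```python
import math

def region_weights(x, z):
    """For each voxel column (x, z), return soft weights saying how much the
    column is 'mountain', 'plain', or 'river'. No elevation is involved;
    this is only a classification, discarded once 3D generation runs."""
    m = (math.sin(x * 0.01) * math.cos(z * 0.013) + 1.0) / 2.0     # 0..1 'mountainness'
    r = max(0.0, 1.0 - abs(math.sin(x * 0.004) * 50.0 - z) * 0.2)  # narrow river band
    return {"river": r,
            "mountain": m * (1.0 - r),
            "plain": (1.0 - m) * (1.0 - r)}  # the three always sum to 1

w = region_weights(120.0, 45.0)
```

The 3D stage would then blend different noise functions per column according to these weights, which is why the block-out simplifies things conceptually without ever knowing actual heights.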

I don’t know @jedidia, impossible or improbable? :thinking::sweat_drops: What could they have done differently?

What makes it ‘procedurally’ possible, across 18 quintillion planets?


Ah, you’re referring to the initial area mapping, fair enough. Unfortunately, that’s too coarse for this kind of work, I’m afraid. It’s great for modifying or injecting functions based on what “terrain type” they’re on, but it doesn’t tell you what the terrain will look like exactly, which is kind of what you need to determine water flow. In fact, this part of the generation doesn’t even involve elevation.

Now, you could do something like rivers with this by forcing a function operating on the “river terrain” to dig under the global water level. It’s likely they’re even doing something like that on some planets that have quite a “rivery” look to them. But it won’t give you any flow. It will give you a couple of weird canyons every now and then, but that’s survivable.
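A made-up sketch of that trick: wherever the block-out stage says “river terrain”, blend the elevation down below the global water level and let the ordinary ocean fill do the rest (no flow is computed; the names and numbers are illustrative only):

```python
def carve_rivers(height, river_weight, water_level=0.0, depth=6.0):
    """Where the block-out stage marked a column as 'river terrain'
    (river_weight near 1), push the elevation below the global water level
    so the engine's ordinary ocean fill makes it look wet."""
    # Blend toward a channel floor `depth` units under the water level.
    target = water_level - depth
    return height * (1.0 - river_weight) + target * river_weight

# A column at elevation 4.0 that is fully river terrain ends up submerged:
print(carve_rivers(4.0, 1.0))   # -> -6.0
print(carve_rivers(4.0, 0.0))   # -> 4.0 (untouched)
```

Since the water surface is just the global ocean level, these “rivers” are flat and still, and where the band crosses high terrain you get the weird canyons mentioned above.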

Ironically, the two amount to the same thing in procedural generation… If you cannot solve a certain problem consistently, but only probabilistically, it is certain not to work in some of the circumstances that your engine produces. And if your solution is improbable to produce the right result, it’s no solution at all: while the engine is guaranteed to produce some good results every now and then, most of them will be garbage, so you can’t use the solution. Unless the garbage ones don’t matter so much, in which case it’s not really a solution either; it’s just something that produces “a cool thing” every now and then…

Also, of course, every problem in procgen can be solved with the liberal application of CPU power and storage. But then, that would defeat the point of procedural generation…




That’s gorgeous! Is it the back of your place? Some nefarious artistry?

Speaking of artistry, I have my oily mess drying before I DARE try anything with acrylics, and a study for another painting started. One more after that. For now. :sweat_smile:


The lake in front of my base. Going to make some changes. Will update the pic sometime today.


The Voxel: Context vs. Derivative

What do we know about the ‘Analytical Derivative’ maths function?

I ask this question because Sean Murray and Innes McKendrick, at GDC 2017, clearly describe this problem with rivers of water flowing, that while you can’t get ‘voxel: context’ (due to it leading to an “intractable problem”), you can get ‘voxel: derivative,’ which already enables caves, erosion and rivers.

Nobody’s words could be more important than Hello Games, the very developers of No Man’s Sky.

So notice what Sean says: “It’s a real problem if you’re generating caves, or you’re generating erosion, or if you’re generating rivers, because one feature needs to flow into another.” — But, Sean… We do have caves with one feature (terrain) flowing into another (terrain), and we do have erosion with one feature (terrain) flowing into another (terrain), and we do have rivers with one feature (terrain) flowing into another (terrain). — So then…

Why not have rivers with one feature (water) flowing into another (water)?

Consider his words carefully…

At 30:08…

❛Analytical Derivative.❜ This though, this is the good stuff. Right? So this is really important, anyone that’s doing noise and is not using this absolutely should. Right? So, what we want, is to have our features change in relation to different octaves of noise. Like, that’s what happens in the real-world.

So, I’m going to explain a thing, but it’s probably really obvious: We have this massive array of voxels, like billions of them, and we are going to generate one voxel. So we’re just going to pick an x, y, z position, somewhere in the universe, and we’re going to say, “Fill in this voxel.” And when you do that, you cannot, as part of the equation, ever query the voxels round about you. Because if you had to do that, to generate this voxel here, like in this world, to generate this one, you needed to ask this one what was in it, then that cascades. Because to generate that one, you needed to ask that one, to generate that one, and it goes on and on and on, and it’s an intractable problem. So because of that it’s really hard to have erosion, which is what you really want, because in order - it would be simple to have erosion if you knew that when you were generating a voxel, that it was on a mountain, or that it was high up, or low down, or that it was perhaps at the bottom of like a lake or something like that, but we can’t find that out because you can’t ask any of the voxels around you.

And it seems like such an unsolvable problem, and it’s a real problem if you’re generating caves, or you’re generating erosion, or if you’re generating rivers, because one feature needs to flow into another, and it needs context, and you can’t get that context. But what you can get is the derivative. If you had that, then you would know the rate of change of noise. At that point, when you have that, you know that you’re on a slope, or you’re on a flat, or you’re turning upwards, or you’re turning downwards. And if you have that knowledge, and you can return from Perlin Noise the Analytical Derivative, then you’re going to be able to have certain features much more prominent on slopes, so you can make them much more noisy, or much less noisy. You know, if you were in a desert, then slopes are very smooth; if you were, you know, in the Alps or something like that, then slopes are very craggy and noisy. And that’s exactly what we need, and it’s an awesome trick. And once you have that, then there are so many things you are enabled to do. We also, from this Uber Noise function, we need the Analytical Derivative returned, from Uber Noise. So even after it’s calculated, all this Billow and stuff like that and Ridged, we need to know that Analytical Derivative, so that we can feed it in further into other things.

Notice that Sean used the ‘Analytical Derivative’ maths function, as a workaround solution to caves and erosion, and places rivers in the same statement, as if hinting at this being the solution to water flowing.
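For anyone curious what “returning the analytical derivative” means in practice, here is a minimal 1D value-noise sketch (not Perlin noise and certainly not HG’s Uber Noise; just the same trick in its simplest form): the derivative falls out of differentiating the interpolation formula directly, with no neighbour queries at all:

```python
import math

def _hash(i):
    """Deterministic pseudo-random value in [0, 1] for lattice point i."""
    i = (i * 374761393 + 668265263) & 0xFFFFFFFF
    i = (i ^ (i >> 13)) * 1274126177 & 0xFFFFFFFF
    return ((i ^ (i >> 16)) & 0xFFFF) / 65535.0

def value_noise_1d(x):
    """1D value noise returning (value, analytical derivative).
    The derivative comes from differentiating the smoothstep blend,
    not from sampling any neighbouring points."""
    i, f = math.floor(x), x - math.floor(x)
    a, b = _hash(i), _hash(i + 1)
    u = f * f * (3.0 - 2.0 * f)       # smoothstep: 3f^2 - 2f^3
    du = 6.0 * f * (1.0 - f)          # its exact derivative: 6f - 6f^2
    value = a + (b - a) * u
    deriv = (b - a) * du
    return value, deriv

# Sanity check against a numerical derivative at an interior point:
x = 2.3
_, d_analytic = value_noise_1d(x)
eps = 1e-6
d_numeric = (value_noise_1d(x + eps)[0] - value_noise_1d(x - eps)[0]) / (2 * eps)
assert abs(d_analytic - d_numeric) < 1e-4
```

The same idea extends to 2D/3D Perlin noise, where the derivative becomes a gradient vector: that is the slope information Sean says they feed onward into other features.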

I believe they are so close to figuring this out. Being that these 2017 GDC videos are now 4 years old, which was only 1 year after the game released, I wonder if they may have even figured it out by now.

According to Sean…

The derivative allows you to, “know the rate of change of noise.”

“At that point, when you have that, you know that you’re on a slope, or you’re on a flat, or you’re turning upwards, or you’re turning downwards.”

And that’s exactly what we need, and it’s an awesome trick. And once you have that, then there are so many things you are enabled to do.


That really was a stroke of genius they had there, but unfortunately it’s not enough for making rivers.

To give you an idea: to generate the terrain, you run x/y (usually longitude/latitude) through a mathematical function that will then give you a single value back, which you usually interpret as your elevation. (I think HG actually queries X/Y/Z and the function just returns a boolean of whether there is a voxel there or not, since the “2D version” doesn’t give you caves or overhangs or floating islands and such, but I’m also pretty sure they are using a traditional elevation function for the bedrock.)
In any case, calculating the derivative of the function at these points will give you the slope of the graph at that point. In other words, without looking at surrounding voxels, you get an idea of how steep your terrain is at this specific point.
This is obviously very useful for a number of reasons, and when hearing it first, you might think that this solves the issue for rivers. Since you know the slope, you know what direction the water would be flowing at this voxel, right?

Well, yes, but that was never actually the problem. Knowing which way the water flows on that voxel still doesn’t tell you where the river goes down the line, and it also doesn’t tell you how wide it is, or whether the current voxel is in a channel or on a ridge. You may be able to tell which way water would be flowing if there was water there, but it still doesn’t tell you whether it makes any sense for water to actually be there.
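A tiny made-up experiment shows the gap (toy terrain and all constants invented for illustration): greedily following the local gradient downhill strands the “water” in the first local pit, even though the terrain keeps descending elsewhere:

```python
import math

def elevation(x, z):
    """Toy terrain: a broad slope downhill toward -x, plus a small local
    pit centred at (5, 5)."""
    return 0.1 * x - 2.0 * math.exp(-((x - 5.0) ** 2 + (z - 5.0) ** 2))

def gradient(x, z, eps=1e-5):
    """Local slope: the only thing the derivative gives you at a point."""
    dx = (elevation(x + eps, z) - elevation(x - eps, z)) / (2 * eps)
    dz = (elevation(x, z + eps) - elevation(x, z - eps)) / (2 * eps)
    return dx, dz

def follow_downhill(x, z, steps=2000, rate=0.01):
    """Greedy flow: always move against the local gradient."""
    for _ in range(steps):
        dx, dz = gradient(x, z)
        x, z = x - rate * dx, z - rate * dz
    return x, z

# Water starting near the pit drains into it and stops there, even though
# the terrain keeps sloping down toward -x far beyond the pit's rim.
end = follow_downhill(6.0, 5.5)
```

Local slope answers “which way would water move here?”, but only global context answers “should a river exist here at all, and where does it end up?”, which is exactly the part the engine cannot query.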