- Choice 1: Procedurally generated textures are created in other software and baked into texture files to be shipped with the game. Geometry is likewise baked into a 3D file format, and normal maps are baked and saved.
- Choice 2: Procedural textures and geometry are generated on the fly, in-game.
Despite the savings of work, procedural content generation isn't all sunshine and roses. You can't just take a Simplex noise function, stick it on a texture and call it done. Sure, if you seed it you'll end up with different-looking planets, and sure, you could randomly combine it with other noise functions to get even more variation, but the problem is that this is a _Simplex_ noise pattern, not a _Planet_ or _Asteroid_ or _Star_ noise pattern, so the results often look very different to the pictures of real planets, asteroids and stars we're all familiar with. The next logical step is to create our own planet noise functions. Since I'm not going to implement a simulation of geology, planetary collisions and the formation of the universe, each planet noise function will be limited to imitating a particular style of planet, or perhaps even a particular style of geology or biome. So, in the end, to create a convincing and large universe for the player, the developer needs to create a lot of these noise functions.
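For illustration, here's a minimal Java sketch of the kind of thing I mean: combining octaves of simplex noise (fractal Brownian motion plus a ridged reshape) into something that reads more like terrain than raw noise. The Noise3 interface and all the constants here are placeholders, not code from the project:

```java
/** Placeholder for whichever simplex noise implementation is actually used. */
interface Noise3 {
    double sample(double x, double y, double z); // returns roughly [-1, 1]
}

class PlanetNoise implements Noise3 {
    private final Noise3 simplex;
    private final int octaves;

    PlanetNoise(Noise3 simplex, int octaves) {
        this.simplex = simplex;
        this.octaves = octaves;
    }

    /** Fractal Brownian motion: sum octaves of simplex noise, each at double
     *  the frequency and half the amplitude of the last. Low octaves give
     *  continent-scale shapes, high octaves give surface roughness. */
    @Override
    public double sample(double x, double y, double z) {
        double sum = 0, amplitude = 1, frequency = 1, max = 0;
        for (int i = 0; i < octaves; i++) {
            sum += amplitude * simplex.sample(x * frequency, y * frequency, z * frequency);
            max += amplitude;
            amplitude *= 0.5;
            frequency *= 2;
        }
        double n = sum / max; // normalise back to [-1, 1]
        // Ridged reshape: the absolute value reads as mountain ridges
        // rather than random blobs.
        return 1.0 - 2.0 * Math.abs(n);
    }
}
```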
So we come to choice 1 versus choice 2. I think the decision comes down to the tools available, the release size, and the level of detail. If we go with choice 1, fantastic tools for creating noise functions already exist, with real-time feedback. Here is an image of a moon which took me all of 15 minutes to create using Blender's Cycles renderer, which can define materials using procedural textures and maths. If I spent a few days creating planets with this method, I could produce several hundred decent-quality planet textures.
The problem with this is that it requires baking the result into textures to be distributed with the game, and these textures can be huge. They're also limited in detail: excursions to the surface of a planet would require new textures.
Choice 2, my preferred choice at the moment, is to implement procedural noise functions in Java and allow them to be combined and operated on to produce planet noise functions using a scripting language (Lua), and perhaps in the future using nodes in an in-game editor. It would also be possible to create levels of detail for these procedural textures, where components of the noise, or smaller features like roads or cities, are swapped out or ignored based on their frequency (higher frequencies are ignored at greater distances). Textures can be cached on the device for better in-game performance. This would also dramatically reduce the download size, which is certainly an issue for mobile devices.
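To make the frequency-based level-of-detail idea concrete, here's a rough sketch reusing the Noise3 interface from the earlier sketch. The visibility cutoff is a made-up placeholder; the real threshold would depend on field of view and texture resolution:

```java
class LodNoise {
    /** Skip octaves whose features would be smaller than a pixel at the
     *  current viewing distance: fewer octaves far away, same look up close. */
    static double sample(Noise3 simplex, double x, double y, double z, double distance) {
        double sum = 0, amplitude = 1, frequency = 1, max = 0;
        // Assumption: features of wavelength 1/frequency stop being visible
        // once the camera is further than ~1000x that wavelength.
        double maxVisibleFrequency = 1000.0 / Math.max(distance, 1.0);
        for (int i = 0; i < 16 && frequency <= maxVisibleFrequency; i++) {
            sum += amplitude * simplex.sample(x * frequency, y * frequency, z * frequency);
            max += amplitude;
            amplitude *= 0.5;
            frequency *= 2;
        }
        return max == 0 ? 0 : sum / max;
    }
}
```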
I created a little prototype of this in libgdx: a procedural texture for a planet which dynamically increases in detail as you approach it. One issue I ran across is that the texture is generated on a rectangle, but the planet is a sphere; when the texture wraps around, a seam appears where the sides of the rectangle join. One approach to solving this would be blending the texture across the seam. My current plan is instead to generate the noise in three dimensions and map it onto the texture. I'll need to decide on a mapping scheme which doesn't distort the poles too much.
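Here's a minimal sketch of the 3D approach in libgdx terms: map each texel onto the unit sphere (via an equirectangular mapping, which does still stretch the poles) and sample a 3D noise function there, so the left and right edges of the texture sample the same sphere points and meet seamlessly. Noise3 is the placeholder interface from earlier:

```java
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.Pixmap;

class PlanetTexture {
    /** Bake a seam-free planet texture by sampling 3D noise on the sphere. */
    static Pixmap bake(Noise3 noise, int width, int height) {
        Pixmap pixmap = new Pixmap(width, height, Pixmap.Format.RGBA8888);
        for (int py = 0; py < height; py++) {
            double lat = Math.PI * (py / (double) height - 0.5); // -pi/2..pi/2
            for (int px = 0; px < width; px++) {
                double lon = 2.0 * Math.PI * px / width;          // 0..2pi
                // Point on the unit sphere for this texel.
                double x = Math.cos(lat) * Math.cos(lon);
                double y = Math.sin(lat);
                double z = Math.cos(lat) * Math.sin(lon);
                float shade = (float) (noise.sample(x, y, z) * 0.5 + 0.5);
                pixmap.drawPixel(px, py, Color.rgba8888(shade, shade, shade, 1f));
            }
        }
        return pixmap; // u=0 and u=1 sample identical sphere points: no seam
    }
}
```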
Another thing which might be cool is to work out a function of the texture coordinates which represents the distance from the pole (is it just the y pixel?), which could be used to vary climate by latitude.
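With the equirectangular mapping in the sketch above, it is indeed just the y pixel: latitude is linear in the pixel row. A hypothetical helper, which would slot into the PlanetTexture sketch, might be:

```java
/** 0 at the equator, 1 at either pole, given the equirectangular mapping in
 *  the PlanetTexture sketch above. Name and range are illustrative only. */
static double latitudeFactor(int py, int height) {
    return Math.abs(py / (double) height - 0.5) * 2.0;
}
```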
I actually really like the pixelated feel. It wasn't what I was originally going for, but it just feels great. It's also made me think: high-fidelity textures and appearance are going to be hard to achieve in this environment, because the scales are so huge and mobile devices are so limited. If you're having trouble tricking someone into thinking something is real, why not abandon the scheme entirely and embrace the unreal?
It's always been a dream of mine to be able to fly ships down to the surface with the atmosphere screaming past, managing the descent trajectory to arrive at the target on the surface without burning up. I've seen a couple of games which do this, or purport to do this. There are a number of key issues I've noted so far in the ones I've seen working...
Gameplay Issues...
Detail on the planets is extremely bland and limited: they've created a procedural height map and some textures and left it at that. This leaves no recognisable landmarks for navigation close to the surface. Games like Minecraft prove that even extremely simple procedurally generated landmarks can be unique and recognisable. Tall tree-like things standing on a blocky cliff overlooking a blocky sandy beach are technically extremely simple to create, yet very recognisable; there are many scenes from that game I won't forget. One needs to provide these recognisable, unique landmarks for the player to have an immersive experience.
At the moment I've just got a rough idea of having some form of extremely simple procedurally generated objects on the surface, like buildings and plant distributions: extremely simple randomly generated building and plant shapes.
Technical Issues...
Breaking the planet surface up into chunks for terrain mesh and texture generation will be a bit tricky.
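One possible shape for that chunking, sketched as a quadtree over the faces of a cube-sphere. Nothing here comes from real code, and the distance test is a stub:

```java
class TerrainChunk {
    final int face;          // which of the six cube-sphere faces this chunk covers
    final double u, v, size; // the chunk's square region in face-local [0,1] space
    TerrainChunk[] children; // null while this chunk is a leaf

    TerrainChunk(int face, double u, double v, double size) {
        this.face = face; this.u = u; this.v = v; this.size = size;
    }

    /** Stub: real code would project the chunk centre through the
     *  cube-to-sphere mapping and measure the distance to the camera. */
    double distanceToCamera() { return Double.MAX_VALUE; }

    void update() {
        // Split when the camera is close relative to the chunk's size, so mesh
        // and texture resolution roughly track screen-space detail.
        boolean split = distanceToCamera() < size * 4.0; // placeholder threshold
        if (split && children == null) {
            double h = size / 2;
            children = new TerrainChunk[] {
                new TerrainChunk(face, u,     v,     h),
                new TerrainChunk(face, u + h, v,     h),
                new TerrainChunk(face, u,     v + h, h),
                new TerrainChunk(face, u + h, v + h, h)
            };
        } else if (!split) {
            children = null; // merge back when the camera recedes
        }
        if (children != null) for (TerrainChunk c : children) c.update();
    }
}
```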
I think I've probably created enough prototypes for now. The next stage planned is to work on the basic 3D engine in libgdx, creating my own scenegraph so that the world coordinates of the mesh instances can move independently of the camera. This allows the origin to be placed on the nearest planet, or even to follow the ship, rather than being locked in one place and limited by how far the camera/player can move from the actual (double-precision) world origin before OpenGL's floating-point inaccuracies show up. The other task is to implement two rendering passes, one for close objects and another for far objects, to get double the precision out of the depth buffer; this allows very close and very far objects to be rendered in a single scene without z-fighting.
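The two-pass depth trick might look something like this in libgdx; the split distance and the farScene/nearScene lists are assumptions for the sketch, not existing code:

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;

class TwoPassRenderer {
    /** Draw far objects with a far depth range, clear the depth buffer, then
     *  draw near objects with a tight range, so each pass gets the full
     *  precision of the depth buffer. */
    void render(PerspectiveCamera cam, ModelBatch batch,
                Iterable<ModelInstance> farScene, Iterable<ModelInstance> nearScene) {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

        // Pass 1: far objects (planets, moons, stars).
        cam.near = 10_000f;
        cam.far = 100_000_000f;
        cam.update();
        batch.begin(cam);
        for (ModelInstance m : farScene) batch.render(m);
        batch.end();

        // Clear depth so the near pass always draws in front of the far pass.
        Gdx.gl.glClear(GL20.GL_DEPTH_BUFFER_BIT);

        // Pass 2: near objects (ship, nearby terrain chunks).
        cam.near = 0.1f;
        cam.far = 10_000f;
        cam.update();
        batch.begin(cam);
        for (ModelInstance m : nearScene) batch.render(m);
        batch.end();
    }
}
```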
Then comes the next part: building a procedural texture generation engine, along with Lua integration and real-time updating of the scene.
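A rough sketch of the idea, assuming LuaJ as the Lua binding on the JVM; every script and binding name here is invented for illustration:

```java
import org.luaj.vm2.Globals;
import org.luaj.vm2.LuaValue;
import org.luaj.vm2.lib.ThreeArgFunction;
import org.luaj.vm2.lib.jse.JsePlatform;

public class NoiseScripting {
    public static void main(String[] args) {
        Globals globals = JsePlatform.standardGlobals();

        // Expose a Java-side noise function to scripts under an invented name.
        globals.set("simplex3", new ThreeArgFunction() {
            @Override
            public LuaValue call(LuaValue x, LuaValue y, LuaValue z) {
                double n = 0.0; // placeholder: call the real noise implementation here
                return LuaValue.valueOf(n);
            }
        });

        // A planet definition script combines noise calls however it likes.
        String script =
            "function planetHeight(x, y, z)\n" +
            "  return simplex3(x, y, z) + 0.5 * simplex3(2*x, 2*y, 2*z)\n" +
            "end";
        globals.load(script).call();

        // Call back into the script for a single sample point.
        LuaValue height = globals.get("planetHeight").call(
            LuaValue.valueOf(0.1), LuaValue.valueOf(0.2), LuaValue.valueOf(0.3));
        System.out.println(height.todouble());
    }
}
```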
After that, bring in gravity and some basic controls, and we've got version 0.01 of the game :)