Tuesday, 18 November 2014

Procedural Planets & Detail & Art Style

Right from the early days of computer games, procedural generation has been a popular way to create content for space games. In my case it's a rather obvious choice: as a one man team, I don't have time to paint planets by hand. But how do we implement procedural generation of content? As far as I can tell there are two main ways to do it:


  1. Procedural textures are created using other software and baked into texture files for use in the game. Geometry is also baked into a 3D file format and saved, and normals are baked and saved alongside it.
  2. Procedural textures and geometry are generated on the fly.
Despite the savings of work, procedural content generation isn't all sunshine and roses. You can't just take a simplex noise function, stick it on a texture and call it done. Sure, if you seed it you'll end up with different looking planets, and sure, you could randomly combine it with other noise functions to get even more variation, but the problem is that this is a _Simplex_ noise pattern, not a _Planet_ or _Asteroid_ or _Star_ noise pattern, so the results often look very different to the pictures of real planets, asteroids and stars we are familiar with. So the next logical step is to create our own planet noise functions. Because I'm not going to implement a simulation of geology, planetary collisions and the formation of the universe, each planet noise function will be limited to imitating a particular style of planet, or perhaps even a particular style of geology/biome. So, in the end, to create a convincing and large universe for the player, a lot of these noise functions need to be created by the developer.
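To make that a bit more concrete, here's a minimal sketch (not the game's actual code) of what I mean by turning a generic noise function into a more planet-like one: sum several octaves of the base noise (fBm) and ridge the result for continent-style shapes. The baseNoise() here is a throwaway hash-based value noise standing in for a proper simplex implementation, and all the constants are just illustrative.

```java
// Sketch only: a generic noise function shaped into a "planet" height function.
public class PlanetNoiseSketch {

    // Cheap lattice value noise: hash the integer corners, interpolate between them.
    static double baseNoise(double x, double y, double z, long seed) {
        int xi = (int) Math.floor(x), yi = (int) Math.floor(y), zi = (int) Math.floor(z);
        double xf = x - xi, yf = y - yi, zf = z - zi;
        double result = 0;
        for (int dx = 0; dx <= 1; dx++)
            for (int dy = 0; dy <= 1; dy++)
                for (int dz = 0; dz <= 1; dz++) {
                    double corner = hash(xi + dx, yi + dy, zi + dz, seed);
                    double wx = dx == 0 ? 1 - xf : xf;
                    double wy = dy == 0 ? 1 - yf : yf;
                    double wz = dz == 0 ? 1 - zf : zf;
                    result += corner * wx * wy * wz;
                }
        return result * 2 - 1; // remap [0,1] to [-1,1]
    }

    static double hash(int x, int y, int z, long seed) {
        long h = seed ^ (x * 374761393L) ^ (y * 668265263L) ^ (z * 2147483647L);
        h = (h ^ (h >>> 13)) * 1274126177L;
        return ((h ^ (h >>> 16)) & 0xffffff) / (double) 0x1000000;
    }

    // Fractional Brownian motion: sum octaves of rising frequency, falling amplitude.
    static double fbm(double x, double y, double z, int octaves, long seed) {
        double sum = 0, amp = 1, freq = 1, norm = 0;
        for (int i = 0; i < octaves; i++) {
            sum += amp * baseNoise(x * freq, y * freq, z * freq, seed + i);
            norm += amp;
            amp *= 0.5;
            freq *= 2;
        }
        return sum / norm;
    }

    // A "planet" function: ridged large-scale shapes with finer fbm detail on top.
    static double planetHeight(double x, double y, double z, long seed) {
        double continents = 1 - Math.abs(fbm(x, y, z, 4, seed));        // ridged continents
        double detail = 0.15 * fbm(x * 8, y * 8, z * 8, 4, seed + 100); // high-frequency detail
        return continents + detail;
    }

    public static void main(String[] args) {
        System.out.println(planetHeight(0.3, 0.7, 0.1, 42L));
    }
}
```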

So we come to choice 1 or choice 2. I think the decision comes down to the tools available, the release size, and the level of detail. If we go with choice 1, fantastic tools for creating noise functions already exist, with real-time feedback. Here is an image of a moon which took me all of 15 minutes to create using Blender's Cycles renderer, which can define materials using procedural textures and maths. If I spent a few days creating planets using this method, I could get several hundred decent quality textures for planets.




The problem with this is that it requires you to bake the result into textures to be distributed with the game. These textures can be huge. They are also limited in detail: excursions to the surface of the planet would require new textures.
Choice 2, my preferred choice at the moment, is to implement procedural noise functions in Java and allow them to be combined and operated upon to produce planet noise functions using a scripting language (Lua), and perhaps in the future using nodes in an in-game editor. It would also be possible to create a level of detail scheme for these procedural textures, where components of the noise or smaller features (like roads or cities) are swapped out or ignored based on their frequency (higher frequencies are ignored with distance). Textures can be cached on the device for improved in-game performance. This would also dramatically reduce the distribution download size, which is certainly an issue for mobile devices.
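As a very rough sketch of that level of detail idea, the number of noise octaves actually evaluated could be chosen from the viewing distance, so that frequencies too fine to show up at that distance are skipped entirely. The numbers here are illustrative guesses, not tuned values from the game.

```java
// Illustrative sketch: pick how many fBm octaves to evaluate from the viewing
// distance, skipping frequencies too fine to be visible. Constants are made up.
class NoiseLod {
    static int octavesForDistance(double distanceToSurface, double planetRadius, int maxOctaves) {
        // closeness grows as the camera approaches the surface
        double closeness = planetRadius / Math.max(distanceToSurface, planetRadius * 0.001);
        int octaves = 2 + (int) Math.ceil(Math.log(closeness + 1) / Math.log(2));
        return Math.max(1, Math.min(maxOctaves, octaves));
    }
}
```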

I created a little prototype of this in libgdx: a procedural texture for a planet which dynamically increases in detail as you approach it. One issue I ran across is that the texture is generated on a rectangle, but the planet is a sphere; where the texture wraps around, a seam appears where the two sides of the rectangle join. One approach to solving this is blending the texture across the seam. My current plan is instead to generate the noise in three dimensions and map it onto the texture. I'll need to decide on a mapping scheme which doesn't distort the poles too much.

Another thing which might be cool is to work out a function of the texture map which represents the distance from the pole (is it just the y pixels?), which could be used to vary climate by latitude.
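Here's a rough sketch of both ideas, assuming an equirectangular mapping: each texel's (u, v) is mapped onto the unit sphere and the 3D noise is sampled there, so the texture's left and right edges sample adjacent points in space and the seam disappears. The latitude falls straight out of the v coordinate, which also answers the climate question above. This reuses the fbm() sketch from earlier and is illustrative only.

```java
// Illustrative sketch: sample 3D noise on the unit sphere instead of in 2D
// texture space, using an equirectangular (u, v) -> (lat, lon) mapping.
class SphericalSampling {
    static double sampleSphericalNoise(double u, double v, long seed) {
        double lon = u * 2 * Math.PI;       // u in [0, 1) -> longitude
        double lat = (v - 0.5) * Math.PI;   // v in [0, 1] -> latitude, 0 at the equator
        double x = Math.cos(lat) * Math.cos(lon);
        double y = Math.sin(lat);
        double z = Math.cos(lat) * Math.sin(lon);
        return PlanetNoiseSketch.fbm(x, y, z, 6, seed);
    }

    // "Distance from the pole" for climate: with this mapping it really is just the
    // v (y pixel) coordinate, rescaled so 0 is a pole and 1 is the equator.
    static double distanceFromPole(double v) {
        return 1.0 - Math.abs(2.0 * v - 1.0);
    }
}
```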

I actually really like the pixelated feel; it wasn't what I was originally going for, but it just feels great. It's also made me think: high fidelity textures and appearance are going to be hard to achieve in this environment, because the scales are just so huge and mobile devices are so limited. If you're having trouble trying to trick someone into thinking something is real, why not abandon the scheme entirely and embrace the unreal.

It's always been a dream of mine to be able to fly ships down to the surface with the atmosphere screaming past, managing the descent trajectory to arrive at the target on the surface without burning up. I've seen a couple of games which do this, or purport to do this. There are a number of key issues I've noted so far in the ones I've seen working...
Gameplay Issues...
Detail on the planets is extremely bland and limited. They've created a procedural height map, some textures, and left it at that. This leaves no recognisable landmarks for navigation close to the surface. Games like Minecraft prove that even extremely simple procedurally generated landmarks can be unique and recognisable. Tall tree looking things standing on a blocky cliff overlooking a blocky sandy beach is technically very simple to do, and is very recognisable; there are many scenes I won't forget in that game. One needs to provide these recognisable and unique landmarks for the player to have an immersive experience.

At the moment I've just got a rough idea of having some form of extremely simple procedurally generated objects on the surface, like buildings and plant distributions: extremely simple randomly generated building and plant shapes.
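As a tiny illustrative sketch of what I mean, the landmark type in each surface cell could be chosen deterministically from a hash of the cell coordinates and the planet seed, so the same cell always grows the same landmark. The names and thresholds here are made up.

```java
// Sketch only: deterministic, seed-based placement of very simple landmark types.
class Landmarks {
    enum Landmark { NONE, TALL_TREES, ROCK_SPIRE, RUINED_BUILDING }

    static Landmark landmarkFor(int cellX, int cellY, long planetSeed) {
        long h = planetSeed ^ (cellX * 341873128712L) ^ (cellY * 132897987541L);
        h = (h ^ (h >>> 31)) * 0x9E3779B97F4A7C15L;
        int roll = (int) ((h >>> 40) & 0xFF); // 0..255, deterministic per cell
        if (roll < 8)  return Landmark.RUINED_BUILDING;
        if (roll < 40) return Landmark.TALL_TREES;
        if (roll < 56) return Landmark.ROCK_SPIRE;
        return Landmark.NONE;
    }
}
```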

Technical Issues...
Breaking the planet surface up into chunks for terrain mesh and texture generation will be a bit tricky.

I think I've probably created enough prototypes for now. The next stage planned is to work on the basic 3D engine in libgdx, creating my own scenegraph that lets the world coordinates of the mesh instances in the scene move independently of the camera, and allows the origin to be placed on the nearest planet, or even to follow the ship, rather than being locked in one place and limited by how far we can move the camera/player from the actual (double precision) world origin due to the floating point inaccuracies of OpenGL. The other task will be to implement two rendering passes, one for close objects and another for far objects, to get double the precision out of the depth buffer and allow both very close and very far objects to be rendered in a single scene without z-fighting (a rough sketch of both ideas follows below).
Then the next part: building a procedural texture generation engine, along with Lua integration and realtime updating of the scene.
After that, bring in the gravity, some basic controls and we've got version 0.01 of the game :)
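Coming back to the scenegraph and depth-buffer plans above, here's a minimal sketch of both ideas: double precision world positions re-based onto a floating origin before being handed to OpenGL, and two render passes with different near/far planes separated by a depth clear. The libgdx classes used (PerspectiveCamera, ModelBatch) are real, but the surrounding structure and the numbers are just an assumption of how it might look, not the actual engine.

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import java.util.List;

class SceneRenderer {
    // In a real libgdx app these would be created in ApplicationListener.create().
    PerspectiveCamera camera = new PerspectiveCamera(67, 1280, 720);
    ModelBatch batch = new ModelBatch();

    // Re-base a double precision world position onto the floating origin,
    // so only small (float-safe) numbers ever reach the GPU.
    static void setRelativePosition(ModelInstance instance,
                                    double[] world, double[] origin) {
        instance.transform.setTranslation(
            (float) (world[0] - origin[0]),
            (float) (world[1] - origin[1]),
            (float) (world[2] - origin[2]));
    }

    void render(List<ModelInstance> far, List<ModelInstance> near) {
        // Pass 1: distant objects (planets, stars) with a huge far plane.
        camera.near = 1000f;
        camera.far = 1e9f;
        camera.update();
        batch.begin(camera);
        for (ModelInstance m : far) batch.render(m);
        batch.end();

        // Clear only the depth buffer, then draw close objects with tight planes.
        Gdx.gl.glClear(GL20.GL_DEPTH_BUFFER_BIT);
        camera.near = 0.1f;
        camera.far = 2000f;
        camera.update();
        batch.begin(camera);
        for (ModelInstance m : near) batch.render(m);
        batch.end();
    }
}
```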

Touch Prototype Orbital Mechanics

Over this past year, I created a little touch prototype game using the Kivy framework for Python.
One of the major advantages of using Python is that it makes for a great feasibility test of the motion integration algorithm.

If I can get realtime floating point calculations of the RK4 algorithm running in notoriously slow Python on a mobile device, then there should be plenty of CPU headroom for the game after the switch to Java.

As outlined in the previous post, I decided to go with a calculation buffer, with linear interpolation between the points of the buffer for display/playback of the orbital motion. Firing the engines causes a recalculation of the buffer, as shown in the screenshots below.
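For anyone curious what that buffer looks like in code, here's a rough sketch of the idea, in Java rather than the prototype's Python, and with illustrative names rather than the actual implementation: positions are precalculated at a fixed timestep, playback linearly interpolates between them, and firing the engines throws away everything after the current time so it gets recalculated.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch only: a precalculated trajectory buffer with linear interpolation.
class TrajectoryBuffer {
    final double timestep;
    final List<double[]> positions = new ArrayList<>(); // positions.get(i) is the state at i * timestep

    TrajectoryBuffer(double timestep) { this.timestep = timestep; }

    // Linear interpolation between the two buffered points surrounding time t.
    double[] sample(double t) {
        int i = (int) (t / timestep);
        i = Math.max(0, Math.min(i, positions.size() - 2));
        double alpha = (t - i * timestep) / timestep;
        double[] a = positions.get(i), b = positions.get(i + 1);
        return new double[] {
            a[0] + alpha * (b[0] - a[0]),
            a[1] + alpha * (b[1] - a[1])
        };
    }

    // Firing the engines (or any new force) invalidates everything after "now".
    void invalidateFrom(double t) {
        int keep = (int) (t / timestep) + 1;
        while (positions.size() > keep) positions.remove(positions.size() - 1);
    }
}
```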






Control of the ship is by pressing and dragging on the screen to create an acceleration vector which is used to orient the ship and determine the engine power.

The github repository for this prototype is here - https://github.com/kellpossible/KivyOrbiter

For a bit of fun, I also implemented a randomly generated star background. It might also be fun to use some Perlin/simplex noise as a distribution function (or brightness function) to get some patterns/coherence in the stars.

I got the orbit calculations running in a separate thread, but Python's threading sucks because of the GIL (Global Interpreter Lock). I would have been better off keeping it in the same thread and using my own scheduling/resuming. Anyway, after some profiling, I got the performance to acceptable levels on the Samsung Galaxy SII.

Take-home points:
I'm really not sure what combat would be like with this orbital physics; I'm thinking probably very confusing. We'll have to see, I guess. It's certainly fun, and for someone who's played Kerbal Space Program or Orbiter it's very intuitive to manipulate the orbital trajectory using this control scheme. Being able to enable a mode which counters gravity (anti-gravity) at the expense of fuel should solve this problem and allow a gradual progression. Also, slowing down game speed/decreasing the gravity, and allowing the user to control the speed using warp, might be a good option to give the player more time to think about combat or navigation before impending collisions.
I might actually give this a go in the prototype and see what kind of a difference there is in fuel use.

Next up is the discussion about procedural content generation for the planet and star surfaces. I had a go at building a prototype for this in Java using LibGDX, which I've chosen as the platform for the actual game.

Monday, 28 July 2014

Orbital Motion Integration

I've put quite a lot of thought into what method should be used to calculate the motion of the ships and bodies for this game.

Initial Thoughts

Speed 

The output of the motion integration system needs to be fast enough for realtime physics calculation, on limited hardware (phones/tablets). It also needs to be fast enough for realtime orbit prediction, with some form of larger/adaptive timestep.

Stable Orbits

Some bodies require very stable orbits and only have forces applied to them by the body they are orbiting: for example planets, space stations, and moons, which could all be simplified to two body gravity problems to provide a form of artificial stability in the simulation, so long as the integration method is accurate enough to reproduce the underlying stability of the orbit.

Unstable/Variable Orbits

Some bodies can have many different forces applied to them, for example ships, asteroids, debris.

Conservation of Energy

If an orbit trajectory calculated in a two body problem is unstable, this might be slightly off-putting at worst, and could even be put down to instability of the underlying system (nearby moons, or the sun, changing the orbit), but if there is little conservation of energy, the body in orbit will soon crash into the surface.

Summary

The perfect motion integration algorithm would be one which is very fast, very accurate in both the two body problem, and the n body problem, and finally fairly simple and easy to extend (with atmospheric, propulsion, collisions, explosions and other interesting effects, not just gravity).

Testing Results

With that in mind, I set out to find out which method (or combination of methods) I should use. I did this by writing a Python program as a demonstration and benchmark for the different popular motion integration algorithms. For the initial parameters of this benchmark, I used planet Earth, with the ISS orbiting on a slightly unusual and catastrophic elliptical orbit, like something out of the movie "Gravity". Credit goes to matplotlib for helping me plot these pretty graphs.
For all but the Kepler example, the orbital trajectories have been calculated and plotted over a period of 100000 seconds, with the same timestep of 100 seconds.

Euler


Verlet


Time Corrected Verlet


Runge Kutta 4 (RK4)


Kepler's Formula




Performance Benchmarks

Method name: result
Euler: 450884 function calls in 0.366 seconds
Verlet: 454897 function calls in 0.387 seconds
Time Corrected Verlet: 1835832 function calls in 1.638 seconds
Runge Kutta 4: 1799879 function calls in 1.548 seconds
Kepler's Formula: 106710 function calls in 0.1112 seconds


Discussion

Speed

As expected, both the Euler and Verlet numeric integration methods are much faster than the Runge Kutta 4 method. Kepler's formula requires an expensive precalculation step for each change in the orbit parameters. It is the fastest here because the benchmark only plots out the orbit given the current orbit parameters. It is around 14 times faster than RK4, the only numeric method with comparable accuracy.

Accuracy and Stability

For the two body problem, there is little doubt that Kepler's formula will be the most accurate, because it solves the problem algebraically, and there is no propagation of error between orbits. 
My attempt at a time corrected Verlet integration method is obviously the worst, and given that it also takes the longest, I see little reason to use it, although there is a good chance I've made a mistake in its implementation.
RK4 as shown here is very stable and quite accurate, but it is about 4 times slower than Euler and Verlet. The available literature suggests RK4 is still more efficient in terms of speed versus accuracy (otherwise why would anybody use it, rather than just decreasing the timestep on Euler or Verlet).
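For reference, here's roughly what a single step of the two extremes looks like for the two-body case, sketched in Java (the game's target language) rather than the Python the benchmark was actually written in; the state layout and constants are illustrative, not the benchmark code.

```java
// Sketch only: one Euler step and one classic RK4 step for two-body gravity.
// State is {x, y, vx, vy}; the central body sits at the origin.
class TwoBodyIntegrators {
    static final double GM = 3.986004418e14; // Earth's gravitational parameter, m^3/s^2

    // Acceleration due to gravity at position (x, y).
    static double[] accel(double x, double y) {
        double r = Math.sqrt(x * x + y * y);
        double a = -GM / (r * r * r);
        return new double[] { a * x, a * y };
    }

    static double[] eulerStep(double[] s, double dt) {
        double[] a = accel(s[0], s[1]);
        return new double[] {
            s[0] + s[2] * dt, s[1] + s[3] * dt,
            s[2] + a[0] * dt, s[3] + a[1] * dt
        };
    }

    // Classic RK4: four derivative evaluations, combined with weights 1, 2, 2, 1.
    static double[] rk4Step(double[] s, double dt) {
        double[] k1 = deriv(s);
        double[] k2 = deriv(add(s, scale(k1, dt / 2)));
        double[] k3 = deriv(add(s, scale(k2, dt / 2)));
        double[] k4 = deriv(add(s, scale(k3, dt)));
        double[] out = new double[4];
        for (int i = 0; i < 4; i++)
            out[i] = s[i] + dt / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]);
        return out;
    }

    static double[] deriv(double[] s) {
        double[] a = accel(s[0], s[1]);
        return new double[] { s[2], s[3], a[0], a[1] };
    }

    static double[] add(double[] a, double[] b) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++) r[i] = a[i] + b[i];
        return r;
    }

    static double[] scale(double[] a, double f) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++) r[i] = a[i] * f;
        return r;
    }
}
```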

Conservation of Energy

They all seem to do a reasonable job at this I think (apart from time corrected Verlet), reasonable enough for the purposes of the game. Unfortunately, while this works out well for gameplay, the projection of orbits is heavily affected by accuracy. Stable orbits look a lot neater and are a lot easier to comprehend. If unstable (or inaccurately calculated) orbits are to be commonplace, then something like colouring the projected path by time, or letting the user select how far ahead they want to predict the orbit, will be necessary.

Conclusion

For static scenery objects such as planets or moons, which can be treated as simple two body problems, the obvious choice is to use Kepler's formula. This requires an expensive precalculation step, but seeing as the orbital parameters are unlikely to change during gameplay, this shouldn't be a problem.
Static scenery objects are expected to be on extremely stable orbits, which makes Kepler another good match. There could be many static scenery objects, yet another reason to use Kepler's formula to quickly plot and calculate the motion of these objects.
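To make "expensive precalculation, cheap evaluation" concrete, here's a sketch of the elliptical, planar case: the precalculation derives the orbital elements once, and any later position then only needs Kepler's equation M = E - e*sin(E) solved for the eccentric anomaly. This is illustrative Java, not the benchmark's code.

```java
// Sketch only: evaluate an elliptical two-body orbit from its elements.
class KeplerOrbit2D {
    final double a, e, meanAnomaly0, meanMotion; // semi-major axis, eccentricity, etc.

    KeplerOrbit2D(double a, double e, double meanAnomaly0, double mu) {
        this.a = a;
        this.e = e;
        this.meanAnomaly0 = meanAnomaly0;
        this.meanMotion = Math.sqrt(mu / (a * a * a)); // n = sqrt(mu / a^3)
    }

    // Position in the orbital plane (periapsis along +x) at time t.
    double[] positionAt(double t) {
        double M = meanAnomaly0 + meanMotion * t;
        double E = solveKepler(M);
        double x = a * (Math.cos(E) - e);
        double y = a * Math.sqrt(1 - e * e) * Math.sin(E);
        return new double[] { x, y };
    }

    // Newton's method on f(E) = E - e*sin(E) - M.
    private double solveKepler(double M) {
        double E = M;
        for (int i = 0; i < 10; i++) {
            double f = E - e * Math.sin(E) - M;
            double fPrime = 1 - e * Math.cos(E);
            E -= f / fPrime;
        }
        return E;
    }
}
```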

For dynamic objects such as ships, debris, and asteroids, we have a much harder problem. Not only is this potentially an N body problem, or at least a two body gravity problem with extra applied forces (from thrusters, collisions, friction etc), but the orbit predictions are likely to be unstable no matter which numeric method I choose to calculate them. This means that unless I use something like patched conics (which I'd like to avoid if possible), it's impossible to use Kepler's formula to predict the orbit of this body when it is in freefall (only gravity forces being applied).

Calculating orbit trajectories using larger than realtime timesteps is required not only for showing the player where their ship or other bodies are headed, but also as a form of precalculation for bodies in freefall, whether the body is away from the player, or the player is using the warp drive on their ship.

Large timesteps can be used to plot a course over a large period of time, but some form of interpolation will be required between the calculated points to get a continuous position of the object over time. Linear interpolation might be good enough for positioning the object on the screen, but it probably won't be good enough for the initial velocity required when calculation is resumed (the object enters the player's influence, or the player exits time warp).
One possible solution is to use some form of polynomial/bezier interpolation between the points and apply it to velocity as well. Another solution, which I think might be better, is that when a body exits interpolated motion, it recalculates its current position and velocity starting from the previous plotted point, but using a much smaller timestep.
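A small sketch of that second option, reusing the rk4Step() from the earlier integrator sketch; the 1/100 refinement factor is an arbitrary example, not a tuned value.

```java
// Sketch only: re-integrate from the last buffered point with a much finer
// timestep to recover an accurate current position and velocity.
class ResumeSketch {
    static double[] resumeFromBuffer(double[] lastBufferedState, double bufferTimestep,
                                     double timeSinceThatPoint) {
        double fineStep = bufferTimestep / 100.0;
        double[] state = lastBufferedState;
        double t = 0;
        while (t + fineStep <= timeSinceThatPoint) {
            state = TwoBodyIntegrators.rk4Step(state, fineStep);
            t += fineStep;
        }
        // one final partial step so we land exactly on the requested time
        if (timeSinceThatPoint - t > 0) {
            state = TwoBodyIntegrators.rk4Step(state, timeSinceThatPoint - t);
        }
        return state;
    }
}
```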

So based on these results, I've decided I'd like to use Kepler's formula for static scenery objects, and some form of numeric integration for everything else; whether that will be RK4, Euler or Verlet remains to be seen, but if performance isn't an issue, then preferably RK4.

In order to deal with the numeric calculations efficiently, each body will have a required timestep accuracy, which is based upon the player's position, whether the player is using the warp drive, and various other influences. This timestep is used to precalculate an interpolation buffer, which is reset if the required timestep shrinks, as described in the previous paragraph.
Each set of dynamic objects which can interact gravitationally with each other should be self contained, and it should be possible to put each set into a separate thread. Most dynamic objects don't interact with each other, so it should be fairly simple to make this suitable for parallel processing.

Most modern phones and tablets these days are quad core. A single thread/core will be dedicated to input, Bullet physics, graphics, and game logic. This leaves three other threads to be split over motion integration and other system simulations.

Making the motion integration algorithm play ball with a general purpose physics engine such as Bullet will be the subject of future posts, but I'll record here a few thoughts on what I've been planning to try. The main thread will feed values out of the interpolation buffer into Bullet, calculating the linear velocity required to reach the next interpolation point at the correct time. If a collision occurs, the interpolation buffer is reset, and the resulting change in velocity is fed back to the motion integrator to be used as the initial parameters for the buffer recalculation.
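The velocity calculation itself is trivial; here's a sketch of what that hand-off might look like. How the velocity actually gets applied to the rigid body depends on the Bullet binding, so that part is left out.

```java
// Sketch only: the linear velocity that carries a body from its current position
// to the next interpolation point in the given amount of time.
class BulletHandoff {
    static double[] velocityToNextPoint(double[] currentPos, double[] nextPoint,
                                        double timeUntilNextPoint) {
        return new double[] {
            (nextPoint[0] - currentPos[0]) / timeUntilNextPoint,
            (nextPoint[1] - currentPos[1]) / timeUntilNextPoint,
            (nextPoint[2] - currentPos[2]) / timeUntilNextPoint
        };
    }
}
```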

Scheduling will be tricky. When the main thread goes through looking for values from the interpolation buffers, if any buffer is empty, a thread is spawned and that buffer is immediately calculated out to a reasonable distance (a couple of timesteps perhaps), pausing all the continuous threads operating on the buffers. This is because if the first values of a buffer are required but not yet available, the main thread will block, so filling them takes priority over all other calculations.

Anyway, that's enough for now. Next post I'm hoping to have a simple 2d touch example on Android using Verlet or RK4 in realtime, ready to show off.