I must warn you, this is a long post, and it's mostly about how I got to this point. If that doesn't sound interesting to you, you most likely won't want to read it. Do look at the images though 🙂
Generating things procedurally is addictive
The first time I saw something generate from my code, it looked terrible. I mean really bad. It was a cube that grew other cubes out of the top of it. It made no sense and it didn't look pretty, but it did act as a test bed for the ideas I had about generation. Then I discovered Lindenmayer systems (or L-systems for short), and once I understood the concept I rushed to add it to my cube generator; for some reason I wanted to make cube trees at the time. I wrote a really rudimentary system that worked well enough and produced okay-ish results, but it really wasn't what I wanted. At the time I was working on my GOAP-based AI system for my end-of-year project, so I shelved the generative project for a while to focus on that.
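If you've never seen one, an L-system is just a string-rewriting scheme: you start from an axiom and repeatedly replace symbols using production rules, then interpret the final string as drawing (or cube-placing) instructions. Here's a minimal sketch of that core loop; the axiom, rules and symbol meanings below are made-up examples for illustration, not the ones from my cube generator.

```csharp
using System.Collections.Generic;
using System.Text;

// A minimal L-system: repeatedly rewrite every symbol in the string
// using a set of production rules. Symbols without a rule are copied as-is.
public static class LSystem
{
    public static string Expand(string axiom, Dictionary<char, string> rules, int iterations)
    {
        var current = axiom;
        for (int i = 0; i < iterations; i++)
        {
            var next = new StringBuilder();
            foreach (char symbol in current)
                next.Append(rules.TryGetValue(symbol, out var replacement) ? replacement : symbol.ToString());
            current = next.ToString();
        }
        return current;
    }
}

// Example (placeholder symbols): 'F' = place a cube and move up, '[' / ']' = push/pop
// the turtle state, '+' = rotate. Expanding "F" with the rule F -> "F[+F]F" a few times
// gives a branching string you can walk to spawn cubes.
```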
Since then I've toyed with generating things, but nothing major. I met Tom Betts (aka Nullpointer) when I moved down to Brighton; he was finishing up Sir, You Are Being Hunted, which features quite a lot of procedural generation, and after talking to him for a little bit I rediscovered a passion for procedural code.
Water – Procedural landscapes
My partner (@RobertaSaliani) pushed an image of Monument Valley (the water scene, mostly) into my hands and asked if I could make it. I wanted a challenge at the time, so I said why not and got to work. Everything in it looked low-poly, and I had been meaning to learn how to procedurally generate meshes, so it seemed like a good excuse to make something while learning. I set out to create a 100×100 cell mesh, with each vertex's Y coordinate driven by a simple Perlin noise function: all it needed was an X and a Z coordinate to return a height.
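For anyone curious what that looks like in practice, here's a minimal sketch of the grid-building step in Unity, using Mathf.PerlinNoise as the height function. The component name and the scale/amplitude values are placeholders, not my actual settings.

```csharp
using UnityEngine;

// Builds a flat grid mesh in the XZ plane and displaces each vertex's Y with Perlin noise.
[RequireComponent(typeof(MeshFilter))]
public class NoiseTerrain : MonoBehaviour
{
    public int cells = 100;          // 100x100 cells -> 101x101 vertices
    public float noiseScale = 0.05f; // placeholder
    public float heightScale = 8f;   // placeholder

    void Start()
    {
        int verts = cells + 1;
        var vertices = new Vector3[verts * verts];
        for (int z = 0; z < verts; z++)
        {
            for (int x = 0; x < verts; x++)
            {
                // The height function only needs X and Z to return Y.
                float y = Mathf.PerlinNoise(x * noiseScale, z * noiseScale) * heightScale;
                vertices[z * verts + x] = new Vector3(x, y, z);
            }
        }

        // Two triangles per cell, wound so the surface faces upward.
        var triangles = new int[cells * cells * 6];
        int t = 0;
        for (int z = 0; z < cells; z++)
        {
            for (int x = 0; x < cells; x++)
            {
                int i = z * verts + x;
                triangles[t++] = i;     triangles[t++] = i + verts; triangles[t++] = i + 1;
                triangles[t++] = i + 1; triangles[t++] = i + verts; triangles[t++] = i + verts + 1;
            }
        }

        var mesh = new Mesh();
        mesh.vertices = vertices;
        mesh.triangles = triangles;
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```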
For the water itself, I used that same noise value to define each vertex's phase in a sine wave 🙂
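A rough sketch of that water trick, assuming the noise value is turned into a per-vertex phase offset and the Y coordinate is re-evaluated every frame (again, the names and numbers are placeholders):

```csharp
using UnityEngine;

// Animates an existing mesh's vertices with a sine wave,
// using Perlin noise as a per-vertex phase offset.
[RequireComponent(typeof(MeshFilter))]
public class SineWater : MonoBehaviour
{
    public float amplitude = 0.3f;  // placeholder
    public float speed = 2f;        // placeholder
    public float noiseScale = 0.1f; // placeholder

    Mesh mesh;
    Vector3[] baseVertices;
    Vector3[] displaced;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        baseVertices = mesh.vertices;              // copy of the original vertices
        displaced = new Vector3[baseVertices.Length];
    }

    void Update()
    {
        for (int i = 0; i < baseVertices.Length; i++)
        {
            Vector3 v = baseVertices[i];
            // Noise sampled at (x, z) gives this vertex its phase in the wave.
            float phase = Mathf.PerlinNoise(v.x * noiseScale, v.z * noiseScale) * Mathf.PI * 2f;
            v.y = Mathf.Sin(Time.time * speed + phase) * amplitude;
            displaced[i] = v;
        }
        mesh.vertices = displaced;
        mesh.RecalculateNormals();
    }
}
```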
Moving on to generating cities from Open Street Map data
Open Street Map is a fantastic initiative. The data is entirely crowd-sourced by people in the local areas, and it's completely open too: if you go onto the map and hit Export, you can define a region you'd like and the server will point you at a zip download with all of the information in it. Be careful with large regions, though; those datasets tend to get larger than 2 GB fairly quickly.
Their data format is very well documented too, so once I had downloaded the data I set about making a renderer for it in Unity. Parsing a huge XML file was a little tricky, but it ended up being easier than I thought; it left me with Nodes (vector coordinates) and Ways (each a series of references to Nodes). All I had to do then was render lines for each Way by following along its Node coordinates. Initially I used line renderers, but they ate up far too much of the FPS, so I ended up settling on particles with different sizes depending on the Way type.
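To give an idea of that parsing step, here's a minimal sketch that pulls Nodes and Ways out of an .osm XML file with System.Xml. It loads the whole document for brevity (a really huge file would want a streaming XmlReader instead), it ignores relations and bounds, and the class and field names are my own.

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.Xml;

// Minimal OSM reader: Nodes are points, Ways are ordered lists of Node references plus tags.
public class OsmNode
{
    public long Id;
    public double Lat, Lon;
}

public class OsmWay
{
    public long Id;
    public List<long> NodeRefs = new List<long>();
    public Dictionary<string, string> Tags = new Dictionary<string, string>();
}

public static class OsmParser
{
    public static void Parse(string path, Dictionary<long, OsmNode> nodes, List<OsmWay> ways)
    {
        var doc = new XmlDocument();
        doc.Load(path);

        foreach (XmlNode n in doc.SelectNodes("/osm/node"))
        {
            var node = new OsmNode
            {
                Id  = long.Parse(n.Attributes["id"].Value),
                Lat = double.Parse(n.Attributes["lat"].Value, CultureInfo.InvariantCulture),
                Lon = double.Parse(n.Attributes["lon"].Value, CultureInfo.InvariantCulture)
            };
            nodes[node.Id] = node;
        }

        foreach (XmlNode w in doc.SelectNodes("/osm/way"))
        {
            var way = new OsmWay { Id = long.Parse(w.Attributes["id"].Value) };
            foreach (XmlNode nd in w.SelectNodes("nd"))
                way.NodeRefs.Add(long.Parse(nd.Attributes["ref"].Value));
            foreach (XmlNode tag in w.SelectNodes("tag"))
                way.Tags[tag.Attributes["k"].Value] = tag.Attributes["v"].Value;
            ways.Add(way);
        }
    }
}
```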
After the roads were set up and rendering fairly well, I decided to tackle the buildings. They are laid out in a similar way to the roads: Nodes and Ways. The only difference is that for buildings the Ways are closed loops of Nodes, meaning that if you generate a mesh which follows the Nodes and then extrude it up (based on the Way's metadata defining the height) you end up with a fairly nice footprint of the building. It's not great looking, but it works. Now our maps have buildings!
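Here's a rough sketch of that extrusion idea: walk the closed loop of footprint points and build a quad of wall for each edge. I've left the roof out, since triangulating an arbitrary footprint polygon needs a proper triangulator, and the class name and parameters are placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Extrudes a closed 2D footprint (in the XZ plane) upward into walls.
// footprint: the building's node positions in order, without repeating the first point.
public static class BuildingExtruder
{
    public static Mesh ExtrudeWalls(IList<Vector2> footprint, float height)
    {
        var vertices = new List<Vector3>();
        var triangles = new List<int>();

        for (int i = 0; i < footprint.Count; i++)
        {
            Vector2 a = footprint[i];
            Vector2 b = footprint[(i + 1) % footprint.Count]; // wrap to close the loop

            int start = vertices.Count;
            vertices.Add(new Vector3(a.x, 0f, a.y));     // bottom A
            vertices.Add(new Vector3(b.x, 0f, b.y));     // bottom B
            vertices.Add(new Vector3(a.x, height, a.y)); // top A
            vertices.Add(new Vector3(b.x, height, b.y)); // top B

            // Two triangles per wall quad. If the walls render inside-out,
            // the footprint is wound the other way: flip the triangle order.
            triangles.Add(start);     triangles.Add(start + 2); triangles.Add(start + 1);
            triangles.Add(start + 1); triangles.Add(start + 2); triangles.Add(start + 3);
        }

        var mesh = new Mesh();
        mesh.SetVertices(vertices);
        mesh.SetTriangles(triangles, 0);
        mesh.RecalculateNormals();
        return mesh;
    }
}
```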
Now what?
From this point I really don't know what to add to the generator: it uses all the data it has, you can pathfind on the roads themselves and click on each building to show its data, and it works exactly as I wanted it to. What I really wanted, though, was to build entire cities from scratch, using only code and no predefined data, and that's why I started building our City Generation tech.
At the time of writing, the generator can do three things: take a chunk of land and separate it into councils, generate a road network that best suits the population density map, and render it all to screen in real time. I plan on adding more features like traffic, buildings (exteriors and interiors) and zoning, and on extending the planning AI with a more realistic feature set.
Initially the code was a simple branching algorithm: road generators followed the highest density of population and split into two, heading in different directions, whenever they encountered a large population. It looked fairly natural, but it really didn't look like cities; it looked more like slime mould. Then I found a paper by Parish and MĂĽller which describes a system that extends the L-system to create close approximations of real city layouts. It was a really great read and gave me some ideas for my own system, so I'd really recommend it if you're interested in learning about this sort of thing.
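For context, the original branching approach was along these lines: growers step toward whichever nearby direction has the highest density and spawn new growers when the local density crosses a threshold. The density function, step length, angles and thresholds below are made-up placeholders, not my actual values.

```csharp
using System.Collections.Generic;
using UnityEngine;

// A minimal "grow toward density, split on high density" road sketch.
public class RoadGrower
{
    public Vector2 Position;
    public Vector2 Direction;

    const float StepLength = 10f;      // placeholder
    const float SplitThreshold = 0.8f; // placeholder density value
    const float SampleAngle = 30f;     // degrees to look left/right each step

    // Placeholder density map: Perlin noise standing in for population density.
    static float Density(Vector2 p) => Mathf.PerlinNoise(p.x * 0.01f, p.y * 0.01f);

    // Advances one step, records the new segment, and returns any growers spawned by a split.
    public List<RoadGrower> Step(List<(Vector2 from, Vector2 to)> segments)
    {
        // Sample straight ahead, left and right; steer toward the densest sample.
        Vector2 best = Direction;
        float bestDensity = Density(Position + Direction * StepLength);
        foreach (float angle in new[] { -SampleAngle, SampleAngle })
        {
            Vector2 dir = Rotate(Direction, angle);
            float d = Density(Position + dir * StepLength);
            if (d > bestDensity) { bestDensity = d; best = dir; }
        }

        Vector2 next = Position + best * StepLength;
        segments.Add((Position, next));
        Position = next;
        Direction = best;

        var spawned = new List<RoadGrower>();
        if (bestDensity > SplitThreshold)
        {
            // Split: spawn two growers heading off perpendicular to the parent.
            spawned.Add(new RoadGrower { Position = Position, Direction = Rotate(best,  90f) });
            spawned.Add(new RoadGrower { Position = Position, Direction = Rotate(best, -90f) });
        }
        return spawned;
    }

    static Vector2 Rotate(Vector2 v, float degrees)
    {
        float r = degrees * Mathf.Deg2Rad;
        return new Vector2(v.x * Mathf.Cos(r) - v.y * Mathf.Sin(r),
                           v.x * Mathf.Sin(r) + v.y * Mathf.Cos(r));
    }
}
```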
My implementation was completely broken for a while, producing things that looked like suburban towns in the countryside, but nothing like an actual city. Funnily enough, I didn't notice it at all until someone pointed it out. I found the bug and changed two lines of code, which then allowed the system to generate fairly nice townscapes, but still no cities.
Roads intersecting each other was starting to become a problem, so I implemented a quadtree that allowed me to quickly look up nearby roads (instead of having each road check against every other road in the area) and do intersection tests before joining roads together. That solved the overlapping-road issue.
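By way of illustration, here's a minimal point quadtree of the kind I mean: you insert road endpoints as you build the network, then query a small rectangle around a new segment to get candidate roads worth intersection-testing, instead of testing against everything. The capacity and names are placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal point quadtree: insert road endpoints, then query a rectangle
// to find candidate roads for intersection tests.
public class QuadTree
{
    const int Capacity = 8;          // points per node before subdividing (placeholder)
    readonly Rect bounds;
    readonly List<Vector2> points = new List<Vector2>();
    QuadTree[] children;             // null until subdivided

    public QuadTree(Rect bounds) { this.bounds = bounds; }

    public bool Insert(Vector2 p)
    {
        if (!bounds.Contains(p)) return false;

        if (children == null)
        {
            if (points.Count < Capacity) { points.Add(p); return true; }
            Subdivide();
        }
        foreach (var child in children)
            if (child.Insert(p)) return true;
        return false;
    }

    // Collects every stored point that falls inside the query rectangle.
    public void Query(Rect area, List<Vector2> results)
    {
        if (!bounds.Overlaps(area)) return;
        foreach (var p in points)
            if (area.Contains(p)) results.Add(p);
        if (children == null) return;
        foreach (var child in children)
            child.Query(area, results);
    }

    void Subdivide()
    {
        float w = bounds.width / 2f, h = bounds.height / 2f;
        children = new[]
        {
            new QuadTree(new Rect(bounds.xMin,     bounds.yMin,     w, h)),
            new QuadTree(new Rect(bounds.xMin + w, bounds.yMin,     w, h)),
            new QuadTree(new Rect(bounds.xMin,     bounds.yMin + h, w, h)),
            new QuadTree(new Rect(bounds.xMin + w, bounds.yMin + h, w, h)),
        };
        // Push existing points down into the children.
        foreach (var p in points)
            foreach (var child in children)
                if (child.Insert(p)) break;
        points.Clear();
    }
}
```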
So that's everything right now. I love making this sort of thing, as it's both a creative and a technical challenge that is completely in the hands of the developer. I'll keep improving this tool and may release it to the public (though I'll need to figure out the kinds of use cases for it before creating tools to interact with it).
I hope you've enjoyed the read; it was a long post, but I haven't written one in a while.
Have a great day!
Danny