Culling, normals and face winding

The internet is an amazing resource for programmers, especially when using a language or platform that has a massive community behind it. It’s almost guaranteed that someone has experienced your current problem and found a solution to it, a solution that’s shared on a forum post or mailing list somewhere. Unity3D is one such platform.

Unfortunately though, if you don’t use the correct search terms you can end up going in circles for a while, and exactly that happened to me last week when trying to extrude my OSM buildings into 3D. I kept telling myself that it was not important for gameplay (and it isn’t), but I was so curious to see what it looked like that I’ve spent just over a week’s worth of free time trying to implement it.

Inevitably trying something new (or even doing something that was done ages ago but is now being reimplemented in some new manifestation) results in all sorts of problems, puzzles, frustrations, mindless hacking and misinterpretations.

Mesh extrusion should be easy, right? Just duplicate the initial mesh, offset it, then create new triangles between the two layers to form the walls?


Fun with normals

It turns out that doing a ‘sandwich’ style join between two layers of the 2D mesh (one at a y offset to the other) was a bad idea because of the way shading (from lighting) is done in the graphics pipeline. If there is only one normal per vertex, the buildings look like they have rounded edges instead of flat shaded (hard) edges. What’s required is multiple copies of each shared vertex, one copy to hold the normal of each face that meets there. Consider a cube. There aren’t 8 vertices; there are really 24 vertices with 24 corresponding normals (3 per corner, one for each adjoining face).
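The idea is easy to see in a tiny standalone sketch (plain C#, no Unity types — FlatCube is just a name I made up): build the cube face by face, duplicating the shared corners so each copy can carry its own face’s normal.

```csharp
using System;
using System.Collections.Generic;

static class FlatCube
{
    public static (int vertexCount, int normalCount) Build()
    {
        var vertices = new List<(float x, float y, float z)>();
        var normals  = new List<(float x, float y, float z)>();

        // The 8 unique corner positions of a unit cube.
        var corners = new (float x, float y, float z)[]
        {
            (0,0,0),(1,0,0),(1,1,0),(0,1,0),
            (0,0,1),(1,0,1),(1,1,1),(0,1,1)
        };

        // Each face: four corner indices plus that face's normal.
        var faces = new ((int a, int b, int c, int d) idx, (float x, float y, float z) n)[]
        {
            ((0,1,2,3),( 0, 0,-1)), // back
            ((5,4,7,6),( 0, 0, 1)), // front
            ((4,0,3,7),(-1, 0, 0)), // left
            ((1,5,6,2),( 1, 0, 0)), // right
            ((3,2,6,7),( 0, 1, 0)), // top
            ((4,5,1,0),( 0,-1, 0)), // bottom
        };

        foreach (var (idx, n) in faces)
        {
            // Duplicate the shared corners for this face so the normal
            // is constant across the face -> hard (flat shaded) edges.
            foreach (int i in new[] { idx.a, idx.b, idx.c, idx.d })
            {
                vertices.Add(corners[i]);
                normals.Add(n);
            }
        }
        return (vertices.Count, normals.Count);
    }
}
```

Each corner position appears three times (once per adjoining face), which is exactly where the 24-vertex count comes from.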

I had to break the code down into its most basic form and try to extrude just a box into a cube so that I could figure out where my code was breaking.

Fun with normals part 2

Oh, so Unity uses a left-hand coordinate system? The right-hand rule for the cross product that I learnt at uni, and that’s all over the internet, doesn’t actually apply here? Grrr. If I’d been more diligent I’d have noticed that the Unity3D docs clearly state the use of the left-hand rule when using Vector3.Cross, but it took a while to ‘get it’. My Vector3 normal = Vector3.Cross(b-a, d-a).normalized; should have been Vector3 normal = Vector3.Cross(d-a, b-a).normalized; (I didn’t want to multiply by -1).
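The component formula for the cross product is the same in either convention; what changes is which argument order gives the direction you want. A standalone sketch (plain C#, no UnityEngine — CrossDemo is my own toy type) shows that swapping the arguments simply negates the result, which is exactly the one-line fix above:

```csharp
using System;

static class CrossDemo
{
    public static (float x, float y, float z) Cross(
        (float x, float y, float z) u,
        (float x, float y, float z) v)
    {
        // Standard component formula; the handedness of the coordinate
        // system only changes how you *interpret* the result's direction.
        return (u.y * v.z - u.z * v.y,
                u.z * v.x - u.x * v.z,
                u.x * v.y - u.y * v.x);
    }

    public static void Main()
    {
        // A quad corner a with neighbours b and d, lying flat in the XZ plane.
        var ba = (x: 1f, y: 0f, z: 0f); // b - a
        var da = (x: 0f, y: 0f, z: 1f); // d - a

        Console.WriteLine(Cross(ba, da)); // (0, -1, 0): points down
        Console.WriteLine(Cross(da, ba)); // (0, 1, 0): points up, what I wanted
    }
}
```

In Unity’s y-up, left-handed world, the second ordering is the one that makes a floor-plan polygon’s normal point up.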

With the normals facing the wrong way, the walls would look dark despite the intensity of the lighting.

Fun with normals and culling

Upon investigating the ‘why are the walls dark’ issue, I experimented with some Unity3D provided shaders, but oh no!

Why oh why are some of the walls of my buildings being culled out?! The normals are definitely outward facing!

I stumbled across this forum post (see, someone had a similar issue). Normals are only used for shading and colour; the triangle winding order is what determines which triangles are back-facing.

*sigh* I felt like I’d had that very issue before when my collision meshes didn’t work; the triangles had to be drawn with CCW winding instead of CW (or was it vice versa) for the raycasts to ‘hit’. Anyway, duh! I felt stupid for wasting an evening.
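For the record, reversing the winding of an index buffer is a one-swap-per-triangle job: exchange any two of its three indices. A small sketch (WindingUtil/FlipWinding are my own names, not a Unity API):

```csharp
using System;

static class WindingUtil
{
    // Swapping two indices of each triangle reverses its winding,
    // which flips which side the GPU treats as front-facing.
    public static int[] FlipWinding(int[] triangles)
    {
        var flipped = (int[])triangles.Clone();
        for (int i = 0; i < flipped.Length; i += 3)
        {
            (flipped[i + 1], flipped[i + 2]) = (flipped[i + 2], flipped[i + 1]);
        }
        return flipped;
    }
}
```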

Fun with normals and culling part 2

But holy crap! It still didn’t work! Some of the buildings had walls disappearing but others were fine.

Think, think…

Oh, the OSM data has some polygons defined in CCW order and others in CW order. I had to make them use the same ordering. More searching and I found this. Yay for pseudo code. It works. Ship it.
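I won’t reproduce that pseudo code here, but the usual winding test (and likely what it amounted to) is the signed-area / shoelace formula: the sign of the polygon’s signed area tells you whether it’s CW or CCW. A sketch, assuming simple (non-self-intersecting) 2D footprints in a conventional y-up plane:

```csharp
using System;

static class Winding
{
    // Shoelace-style sum over the polygon's edges; for a simple polygon
    // in a y-up plane, a negative sum here means counter-clockwise.
    public static bool IsCounterClockwise((float x, float y)[] poly)
    {
        float sum = 0f;
        for (int i = 0; i < poly.Length; i++)
        {
            var (x1, y1) = poly[i];
            var (x2, y2) = poly[(i + 1) % poly.Length];
            sum += (x2 - x1) * (y2 + y1);
        }
        return sum < 0f;
    }
}
```

Any polygon that comes back with the wrong winding just gets its vertex list reversed before triangulation.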

And now for a screenshot.

Oh look, I’ve got placeholder art from Shenandoah Studio’s Battle of the Bulge; I hope they don’t mind. Their games look amazing.


Sphere Casts

There are so many instances where I write a pleading forum post asking for suggestions/solutions to a problem, only to discover the answer soon after I post it. I’m hedging my bets in a way, hoping that someone else has a solution in case I don’t eventually find one myself.

This almost happened again with collision detection in Unity3D. I wanted to do something that seemed really basic: find out when two collision meshes collided (overlapped). You’d think this would be simple, right? Nah, maybe… kind of…

After the initial hunting around the documentation I tried implementing the OnCollisionEnter/Exit approach. Mmmm, didn’t work. So I hunted around the forums and found this.

Oh, so I need rigid bodies to detect a collision? Seriously? That’s a bit overkill when I’m not using any physics.

Oh, wait, it turns out I should have been using raycasts.

That I can do because I’d been using them for touch selection (mouse picking).

…but oh, wait… What’s this sphere cast?


SphereCasts are just fat raycasts, but they feel a little heavy-handed for detecting collisions. Every time a unit moves (such as a tank or infantry) I’m having to sphere cast its bounds against the 2D world to see if it hits anything. I’d rather the engine had been smart enough to tell me there was a collision rather than me having to ask. I feel like I have to hand-hold it.
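For the curious, the per-move check looks roughly like this (a sketch only: the Unit component, field names and method are mine, but Physics.SphereCast is the actual Unity call):

```csharp
using UnityEngine;

public class Unit : MonoBehaviour
{
    public float radius = 0.5f; // rough bounding radius of the unit

    // Sweep a sphere of 'radius' along the unit's movement and report
    // whether it hit any collider; a sphere cast is a raycast with thickness.
    bool HitsAnything(Vector3 from, Vector3 to)
    {
        Vector3 dir = to - from;
        return Physics.SphereCast(from, radius, dir.normalized,
                                  out RaycastHit hit, dir.magnitude);
    }
}
```

This block depends on UnityEngine, so it only runs inside a Unity project.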

In my case I’m trying to determine if a unit is inside one of the OpenStreetMap-defined areas such as a building. If I had to build this system from scratch I’d use the fantastic Sort and Sweep algorithm that is described in Real-Time Collision Detection by Christer Ericson, but I have to trust that Unity is using some cleverly optimised collision detection algorithm that quickly determines collisions between the sphere cast and the collision meshes.
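For reference, the core of Sort and Sweep is pleasantly simple. Here’s a hedged 1D sketch (SortAndSweep is my own helper, working on intervals along a single axis; the full version in Ericson’s book sweeps one axis and then tests the remaining axes on each candidate pair):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class SortAndSweep
{
    // Each box is an interval [min, max] on the sweep axis.
    // Returns index pairs whose intervals overlap.
    public static List<(int, int)> OverlappingPairs((float min, float max)[] boxes)
    {
        // Sort box indices by where their interval starts.
        var order = Enumerable.Range(0, boxes.Length)
                              .OrderBy(i => boxes[i].min)
                              .ToArray();

        var pairs = new List<(int, int)>();
        for (int i = 0; i < order.Length; i++)
        {
            for (int j = i + 1; j < order.Length; j++)
            {
                // Once the next interval starts after this one ends,
                // no later interval can overlap it either -> stop early.
                if (boxes[order[j]].min > boxes[order[i]].max) break;
                pairs.Add((order[i], order[j]));
            }
        }
        return pairs;
    }
}
```

The early break is the whole trick: after sorting, most pairs are never even looked at.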

That’s why I’m using an engine. It took quite a while to make that algorithm work for Mobile Assault.

Version Control in Unity – RTFM

I could probably create an entire RTFM series because of my terrible track record of ‘getting it’ the first time around. I could have sworn I’d read this Unity article about version control, especially considering I’d changed the settings so that the .meta files showed up in my project. Strangely though, I somehow never actually committed those .meta files to version control despite the article explicitly stating to do so. Perhaps I’d read some misinformed post somewhere else that said not to.
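For anyone in the same boat: the important part is that Assets/ (including every .meta file) and ProjectSettings/ get committed, while the generated folders stay out. A typical ignore file looks something like this (a sketch; adjust to your VCS and Unity version):

```gitignore
# Generated by Unity; safe to leave out of version control
Library/
Temp/
obj/
# IDE project files are regenerated too
*.csproj
*.sln
# Everything under Assets/ and ProjectSettings/ stays committed,
# including the .meta file that sits next to every asset.
```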

The realisation came when I decided to clone my repository just to make sure I’d been committing all the files I was supposed to have. I’ve had some misadventures with getting Unity command line builds to work and had given up on my continuous integration aspirations; the consequence being that I never got to the part where I could make the project build from the files submitted to the repo…

…It turns out the repo build was pretty broken.

The primary culprit was all the missing .meta files, but there were a few other little tweaks required. All fixed now. Unfortunately no one wants to answer my post about whether or not command line builds are possible using the free version of Unity. At this rate I’ll have to apply for a 30 day Pro trial to see if I can get it working and thus answer my own forum question.