Owen Powell creates beautiful, accurate digital 3D maps using Blender and GIS data. Today, he explains his workflow in great detail. This article originally appeared on BlenderNation and is republished here with permission.
I am a 36-year-old GIS Analyst/Cartographer living in the UK. I studied Fine Art at art school, and did a postgraduate course in 3D modelling and animation using Maya. A friend of mine got me into Blender a couple of years ago and I haven’t been able to put it down since.
Following on from the revolution in open source software, open data has revolutionised the Geographic Information Systems (GIS) industry. As well as crowd-sourced mapping data such as OpenStreetMap, countries such as the UK have been driving innovation and transparency of information by releasing government-sourced data to the public, including a wealth of map data.
As a result, an archive of incredibly detailed digital terrain models has been released by the Environment Agency and Natural Resources Wales, captured from aerial lidar. Data such as this would have cost thousands only a couple of years ago, and to now have access to such a large amount of quality data inspires me to make things with it.
One of my recent images, of Beddgelert in North Wales, combines lidar and topographic data from the Ordnance Survey. (Beddgelert, meaning ‘Gelert’s Grave’, takes its name from a 13th-century tribute by a prince to his dog.)
To prepare the model for Sketchfab, I UV unwrapped the terrain and baked the light and materials onto a texture. I then reduced the density of the mesh, which was pretty high poly to begin with, being based on lidar with a point every metre on the ground. I dropped the thousands of trees as this would’ve made the model too large.
To enhance the detail I used the sharpness filter, which really helps to pick out the buildings and roads. In the model I turned off the direct lights, relying on an environment map and the baked light from Blender.
I’d like to develop these 3D maps further by adding text and annotations. In GIS you have an abundance of attribute data stored with the geometry; it would be really cool to import annotations based on that data, or to upload 3D formats such as KML and ESRI geodatabases to enhance usability.
The model itself is fairly simple: a to-scale representation of the source map data. Most of the work is done in Blender with Cycles materials, lighting and compositing. The main light in the scene is a sun lamp, with a cooler area lamp plus a third light just to reflect in the glossy water.
The main material consists of a diffuse colour gradient based on height, and a glossy shader applied using a texture map to control the mix factor.
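The height-based colour gradient can be sketched outside Blender as a plain colour-ramp lookup: normalise each point's elevation, then interpolate between gradient stops, which is essentially what a Cycles ColorRamp driven by height does. The stop positions and colours below are illustrative assumptions, not the ones in the actual material.

```python
def lerp(a, b, t):
    """Linear interpolation between two RGB triples."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def height_to_colour(z, z_min, z_max, stops):
    """stops: list of (position 0..1, (r, g, b)), sorted by position.
    Elevation z is normalised and clamped to 0..1, then looked up."""
    t = 0.0 if z_max == z_min else (z - z_min) / (z_max - z_min)
    t = min(max(t, 0.0), 1.0)
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            local = 0.0 if p1 == p0 else (t - p0) / (p1 - p0)
            return lerp(c0, c1, local)
    return stops[-1][1]

# Illustrative stops: green valley floor, brown slopes, grey summits.
stops = [(0.0, (0.20, 0.45, 0.20)),
         (0.5, (0.45, 0.35, 0.20)),
         (1.0, (0.70, 0.70, 0.70))]
```

In the real material the result feeds the diffuse shader, and a separate texture map controls the mix factor with the glossy shader.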
In a similar way, I use a texture of the Ordnance Survey woodland layer to control the trees, made using a hair emitter. The scene has 200,000 instances of a single tree, with a colour ramp to vary the colour, and a random size.
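The scattering logic behind the hair emitter can be approximated in plain Python: trees are only placed where the woodland mask is set, and each instance gets a random scale and a random factor that would feed the colour ramp. The mask layout, scale range and colour factor here are assumptions for illustration, not values from the actual scene.

```python
import random

def scatter_trees(mask, n, seed=0):
    """mask: 2D grid of 0/1 woodland cells. Place n tree instances in
    woodland cells only, each with a jittered position inside its cell,
    a random scale, and a colour-variation factor in 0..1."""
    rng = random.Random(seed)
    cells = [(r, c)
             for r in range(len(mask))
             for c in range(len(mask[0]))
             if mask[r][c]]
    trees = []
    for _ in range(n):
        r, c = rng.choice(cells)
        x = c + rng.random()        # jitter within the cell
        y = r + rng.random()
        scale = rng.uniform(0.7, 1.3)
        colour_factor = rng.random()  # drives per-instance colour variation
        trees.append((x, y, scale, colour_factor))
    return trees
```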
I used FME to process the ASCII lidar data and shapefiles into 3D features and textures. The surface is a triangulated mesh, with roads, railways and buildings draped on it, then extruded. Being part of a single workflow, all features and images are clipped exactly to the extent of the surface, and are to scale.
FME is not free but is worth its weight in gold, as it allows you to integrate data from over 345 applications, web services, and file formats. The FME authoring environment is similar to the node-based Cycles interface, but each transformer performs a different function, such as clipping, spatial queries (intersect, contains, etc.) and coordinate transformations. Other approaches include the Blender GIS add-on and mesh displacement, but FME is my tool of choice because it lets me combine multiple sources of data, create textures, and automate a single workflow rather than countless steps.
Finally, I use the compositor in Blender to merge a mist pass and background with the render.
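The mist merge amounts to a per-pixel linear blend: where the mist pass is low (near camera) the render dominates, and where it is high (far away) the background shows through. A minimal sketch, assuming the mist pass is normalised to 0..1:

```python
def composite_mist(render_px, background_px, mist):
    """Blend one RGB pixel of the render over the background.
    mist = 0.0 keeps the render; mist = 1.0 fades fully to background."""
    return tuple(r * (1.0 - mist) + b * mist
                 for r, b in zip(render_px, background_px))
```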