About
My name is Louis du Mont; I’m a 3D artist working at Formation, a company based in London, UK. I work across many different mediums, from VFX to interactive digital projects.
The ‘SphereBot’ project was inspired by the stocky structure of a poodle moth; I thought it would make a nice robot mech. I’m also in the process of learning Blender; I’ve explored a fair amount of the modeling and surfacing tools and thought this would be a good opportunity to explore the dope sheet and curves editor and produce a little animation.

Sketching and Blocking
After an initial sketch, I started blocking out the form, built mostly from modified spheres and cylinders. Eager to try out the rigging tools in Blender, I jumped right in and started adding bones, assigning weight maps and setting up IK and constraints. I found the process pleasantly smooth, being able to feel out a lot of the settings via trial and error. Thankfully, the test rigging setups helped inform how I was to model the hi-res mesh, as certain parts had to be reshaped to allow more rotation of the leg elements without causing mesh clipping.
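For anyone curious what a basic IK setup like this looks like in Blender’s Python API, here’s a minimal sketch; the object and bone names (“Armature”, “shin”, “IK_target”) are hypothetical stand-ins, not the names from my scene:

```python
import bpy

# Minimal IK setup on a two-bone leg chain. The empty acts as the IK goal.
arm = bpy.data.objects["Armature"]
shin = arm.pose.bones["shin"]

ik = shin.constraints.new('IK')
ik.target = bpy.data.objects["IK_target"]
ik.chain_count = 2  # solve over the thigh and shin only
```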
Modeling with Modifiers
Modelling the final mesh in Blender started off slow for me. In the past, I’ve modelled in Lightwave 3D’s modeller and, while the process is quite destructive, it’s very fast for me to put together Sub-D geometry. However, the time I lost was soon made up by being able to mark edges for bevelling with a non-destructive modifier, allowing quick and easy tweaks without compromising the integrity of the bevel.
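As a rough sketch of that weighted-bevel workflow in the 2.8-era Python API (edge bevel weights moved to generic attributes in Blender 4.0), assuming an active mesh object with the edges to bevel already selected:

```python
import bpy

obj = bpy.context.active_object
me = obj.data

# Ensure the bevel weight layer exists, then give every selected
# edge full weight (run from Object Mode after selecting in Edit Mode).
me.use_customdata_edge_bevel = True
for e in me.edges:
    if e.select:
        e.bevel_weight = 1.0

# One non-destructive Bevel modifier, limited to the weighted edges,
# keeps the bevel editable without committing it to the geometry.
bev = obj.modifiers.new(name="EdgeBevel", type='BEVEL')
bev.limit_method = 'WEIGHT'
bev.width = 0.01
bev.segments = 2
```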
UV Unwrapping
The next step was UV unwrapping, which, similar to bevelling, involved marking edge seams to define the UV islands. I’m still not sure it’s the best method, but I had to apply the bevel modifier before choosing the seams, as the bevel would otherwise split edges and cause errors. I also found the ‘radivarig’ add-on for straightening UV edges very handy when creating UVs that would respond well to having thin texture lines applied.
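The seam-based unwrap can also be scripted. A small sketch of the equivalent operator calls, assuming the seam edges are already selected on the active mesh (operator calls are context-sensitive, so this is best run from the 3D viewport):

```python
import bpy

# Mark the currently selected edges as seams, then unwrap the
# whole mesh using them to define the UV islands.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.mark_seam(clear=False)
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.02)
bpy.ops.object.mode_set(mode='OBJECT')
```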
Rigging
With things having worked out so well while testing the rigging features, I jumped straight into rigging now that I had all the geometry built out. I worked on one leg first, setting up the IK chain and pole. I created an IK goal empty parented to a ‘ground’ empty, which in turn had four child empties, each with rotation constraints on two axes, acting as aim targets for the toe bones. This setup allows the toes to splay automatically as they near the ‘ground’ empty. I then duplicated and mirrored that leg setup for the remaining legs, which, apart from some normals issues, worked surprisingly well.
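One way to wire the toe bones to those aim empties is with Damped Track constraints. A hedged sketch, with hypothetical names (“toe.0”–“toe.3” bones aimed at “toe_goal.0”–“toe_goal.3” empties parented under the ‘ground’ empty):

```python
import bpy

arm = bpy.data.objects["Armature"]

# Each toe bone tracks its own child empty under the 'ground' empty,
# so lowering the foot towards the ground splays the toes automatically.
for i in range(4):
    toe = arm.pose.bones[f"toe.{i}"]
    con = toe.constraints.new('DAMPED_TRACK')
    con.target = bpy.data.objects[f"toe_goal.{i}"]
    con.track_axis = 'TRACK_Y'  # bones point along their local Y axis
```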
Surfacing
Time to create some PBR texture sets. I exported an OBJ from Blender, which handily collapsed the modifiers and armature pose, leaving a posed, reasonably high-poly model. To begin with, I wanted to create a faceted, brushed-metal-like texture as the base, so I used noise and angled directional blur in Photoshop, masked with a cloud-generated layer that was put through the ‘stained glass’ filter to create the facets. While in Photoshop, I did a quick paint-over of some of the detailing I thought would look nice, then briefly jumped back into Blender to model and create depth maps to use as brushes and stencils when painting.
I find painting textures massively rewarding, having not previously had access to tools with such an instant level of feedback. I imported my brushes and stencils into Substance Painter and started drawing out lines to signify the separate panels from which the robot is constructed. I used the base faceted map to affect the roughness and slight height of the material, and the stencils to add the more designed details, as well as using smart masks and hand-painted parts for some ageing. Finally, I exported all the texture sets for import into Blender.
Back in Blender, setting up the material was mainly a matter of plugging the matching image channels into the inputs of the Principled BSDF shader, with an emission shader added on occasion.
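A minimal sketch of that hookup in Python, assuming a typical Substance Painter export; the file paths are hypothetical. The one detail worth remembering is that data maps (roughness, metallic, normal) need to be read as Non-Color, while the base colour stays sRGB:

```python
import bpy

mat = bpy.data.materials.new("SphereBot")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]

def add_tex(path, non_color=False):
    # Load one exported map as an image texture node.
    tex = nodes.new('ShaderNodeTexImage')
    tex.image = bpy.data.images.load(path)
    if non_color:
        tex.image.colorspace_settings.name = 'Non-Color'
    return tex

links.new(add_tex("//tex/base_color.png").outputs['Color'],
          bsdf.inputs['Base Color'])
links.new(add_tex("//tex/roughness.png", True).outputs['Color'],
          bsdf.inputs['Roughness'])
links.new(add_tex("//tex/metallic.png", True).outputs['Color'],
          bsdf.inputs['Metallic'])

# Normal maps go through a Normal Map node rather than straight in.
nmap = nodes.new('ShaderNodeNormalMap')
links.new(add_tex("//tex/normal.png", True).outputs['Color'],
          nmap.inputs['Color'])
links.new(nmap.outputs['Normal'], bsdf.inputs['Normal'])
```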
Photogrammetry
While creating this character, I was using the photogrammetry software ‘Reality Capture’ for another job and thought it would be quite nice to be able to capture the terrain, in part to save time, but also to see how much detail could be achieved on a relatively flat surface that might have significant occlusion issues around smaller rocks.
After capturing 20 or so images of a somewhat rocky area around the base of a tree on the way to work (I’m sure people thought I was a bit odd), I loaded the images into Reality Capture and ran through the fairly automated process for aligning and reconstructing geometry, which spat out a detailed, if a little too heavy, mesh, with accompanying texture and UV coordinates.
The resulting mesh was heavy but still loaded fine into Blender and just about rendered on the GPUs, coming very close to filling the 8 GB of VRAM on the GTX 1080s. Even so, I created a comparatively low-poly mesh to bake the texture and normal maps into, so I’d have something more lightweight to view while animating and to ultimately use for an Eevee render and a Sketchfab export.
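The high-to-low bake can be set up with Cycles’ selected-to-active baking. A sketch with hypothetical object names; it assumes the low-poly mesh has a material with an image texture node selected as the bake target:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.render.bake.use_selected_to_active = True
scene.render.bake.cage_extrusion = 0.05  # push the cage past small rocks

# Select the heavy photogrammetry mesh, make the low-poly retopo
# mesh active, then bake from the former into the latter's image.
high = bpy.data.objects["terrain_scan"]
low = bpy.data.objects["terrain_low"]
high.select_set(True)
low.select_set(True)
bpy.context.view_layer.objects.active = low

bpy.ops.object.bake(type='NORMAL')  # repeat with type='DIFFUSE' for colour
```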
Animation
Animation was immensely fun, treading the line between insect-like and robot-like movement. I planned to have it find something in the terrain, but it wasn’t until quite late that this became what is supposed to resemble the skull of an advanced humanoid being. The skull was very quickly sculpted and surfaced within Blender using Smart UV unwrapping. I found the f-curves editor a little tricky to navigate initially, but was soon happy to find that many of the viewport navigation methods were mirrored in the f-curve editor.
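The f-curves are also reachable from Python, which can be handy when the editor itself feels unfamiliar. A small illustrative sketch (the object name is hypothetical) that keys a vertical move and then eases it by editing the curve directly:

```python
import bpy

obj = bpy.data.objects["SphereBot"]

# Key Z location at two frames: down at frame 1, up at frame 12.
obj.location.z = 0.0
obj.keyframe_insert(data_path="location", index=2, frame=1)
obj.location.z = 0.3
obj.keyframe_insert(data_path="location", index=2, frame=12)

# Find the Z-location f-curve and soften the motion.
fcu = obj.animation_data.action.fcurves.find("location", index=2)
for kp in fcu.keyframe_points:
    kp.interpolation = 'BEZIER'
    kp.easing = 'EASE_IN_OUT'
```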
Rendering
After refining the materials a little, I set up the lighting and imported a 360° cloud environment as well as a cloud background plate that I had photographed previously (these are available on supertextures.co.uk). They’re not amazingly sharp images, but feel free to use them for anything, with or without credit. I switched colour management to ‘Filmic’, increased the strength of the ‘Sun’ light and started producing test renders. I was particularly interested in getting a multilayer EXR sequence into the newly released DaVinci Resolve 15, which now has a Fusion compositing tab.
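Those two output settings sit next to each other in the scene properties; as a quick sketch (the output path is hypothetical):

```python
import bpy

scene = bpy.context.scene

# Filmic view transform, plus a multilayer EXR sequence for
# grading and compositing in Resolve/Fusion.
scene.view_settings.view_transform = 'Filmic'
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
scene.render.image_settings.color_depth = '16'  # half float is plenty here
scene.render.filepath = "//renders/spherebot_"
```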
I was able to import the EXRs, separate the channels and, with a little help from Maxime Roz’s Filmic .ocio file, reproduce a Filmic-like curve within Resolve. Ultimately, I ended up exporting 16-bit PNGs, as the EXRs were quite slow in Resolve. In Resolve I added a bit of atmospheric mist, a few lens effects and some colour grading, then exported.
Realtime and Sketchfab
As a final workflow exploration, I adapted the animation into a loop and baked FK animation into the rig. I removed unnecessary geometry and converted the PNG textures to a more manageable JPG format, apart from the normal map, which tends to show compression artefacts quite readily. After exporting FBX and Collada files, I eventually realised that Sketchfab accepts Blender ‘.blend’ files directly, which worked perfectly. Ah well, lesson learnt.
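Baking the IK-driven motion down to plain FK keys can be done with Blender’s action bake operator. A sketch, with a hypothetical armature name and loop length:

```python
import bpy

# Bake the constraint-driven motion to plain keyframes so the
# exported file no longer depends on the IK/empty setup.
arm = bpy.data.objects["Armature"]
bpy.context.view_layer.objects.active = arm
bpy.ops.object.mode_set(mode='POSE')
bpy.ops.pose.select_all(action='SELECT')

bpy.ops.nla.bake(
    frame_start=1,
    frame_end=120,        # length of the loop; adjust to the animation
    only_selected=True,
    visual_keying=True,   # capture the constraint-evaluated pose
    clear_constraints=True,
    bake_types={'POSE'},
)
```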
Within the Sketchfab settings I set a custom HDRI environment, which was also used for the lighting. I created a transparency map for the ground plane to give a falloff that, along with the custom background and subtle vignette, I hoped would create a sense of atmospheric dust.
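A radial falloff map like the one used on the ground plane can also be generated procedurally. A minimal sketch in Blender’s Python API (image name, size and the squared falloff are my own illustrative choices, not the values from the project):

```python
import bpy, math

# Generate a radial alpha falloff: white in the centre,
# fading to fully transparent at the edges of the plane.
size = 512
img = bpy.data.images.new("ground_falloff", width=size, height=size, alpha=True)

px = [0.0] * (size * size * 4)
half = size / 2
for y in range(size):
    for x in range(size):
        d = math.hypot(x - half, y - half) / half  # 0 at centre, 1 at edge
        a = max(0.0, 1.0 - d) ** 2                 # squared for a softer tail
        i = (y * size + x) * 4
        px[i:i + 4] = (1.0, 1.0, 1.0, a)

img.pixels = px
img.filepath_raw = "//ground_falloff.png"
img.file_format = 'PNG'
img.save()
```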