A recent modelling project I worked on: creating 14 high resolution models from photo reference, plus some of my own design, modelled in 3ds Max and Maya. The most complex models were the trains and the robot heads. The robots had to look like the era they depicted, but they couldn't look exactly like anything else for copyright reasons. Some of the curves on the bullet train really tested my subdiv skills. Overall it was fun to get back to modelling for a bit, but I'm not sure I could do it forever.
Here's a recent project I was involved with at Analog, only for the last few days mind you, but it brushed up some useful skills. I tracked the robot's face for the live action sections so the digital eyes could be added on. I also object-tracked the robot arm and added CG missiles, so fast you'll miss them! Anyway, it was fun re-learning PFTrack now that it's become node based.
Here's a recent project I worked on at NineteenTwenty. I helped with the clouds in the opening shot, created in Houdini using a pretty comprehensive set of custom tools I built, which helped get the correct shapes and cloud types. I created a mini library of cumulus, cirrus, and various trailing clouds. The shot was rendered in Mantra.
I also lit and rendered the opening of the roof, again in Mantra.
Here are three videos of a few custom tools I've completed for Houdini FLIP simulations. The first isn't a tool as such, but a method for getting custom slip fields into the solver. I've also written a surface tension OP, which blends the nearest surface into the calculations so you don't get complete spheres. Lastly, a Curvature Attract OP, which helps the fluid stick when flowing around curved objects; the result is a sticky feel on the underside of objects.
I've been knocking this shader around for quite a while, so now it's time to release it into the wild. MicroBevel is a fancy rounded-edges shader, very useful for getting extra detail onto the sharp edges of models. If you already have modelled bevels, it won't work. It works pretty well on CAD meshes, though sometimes all the weird flipped normals can cause issues.
It works by raycasting and returning an averaged normal of the hit polygons, then blending between the current surface normal and the found normal within a radius, creating a rounded edge.
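Roughly, the blend step looks like this in Python terms. This is a toy sketch of the idea, not the shader's actual VEX, and the function name and inputs are mine:

```python
import math

def blend_rounded_normal(surf_n, hit_normals, hit_dists, radius):
    """Average the normals of nearby ray hits, then blend between the
    surface normal and that average based on how close the hits are
    within the radius. (Illustrative only; not the real shader code.)"""
    if not hit_normals:
        return surf_n
    # Average the hit normals.
    avg = [sum(n[i] for n in hit_normals) / len(hit_normals) for i in range(3)]
    # Blend weight: closer hits round more, fading to zero at the radius edge.
    w = 1.0 - min(min(hit_dists) / radius, 1.0)
    blended = [surf_n[i] * (1.0 - w) + avg[i] * w for i in range(3)]
    # Renormalise so the result is still a valid unit normal.
    length = math.sqrt(sum(c * c for c in blended)) or 1.0
    return [c / length for c in blended]
```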
As with everything, it has its limitations. Six samples will be fine for small bevels. If you go too large, you will find you need to use RandomSamples and increase Samples upwards of 6 to get a smoother result. But then it's supposed to be micro, so a smaller radius works much better.
To use it with the Principled and Classic shaders, you will need to plug nN into the base and coat normals.
There are some debug exports you can use to see where the edges and normals are.
I've left the HDA unlocked so you can root around inside. These are not bulletproof, and I'm not 100% certain it will work in all scenarios.
Here's my new Mantra edges shader. I've left the HDA unlocked so you can root around inside. These are not bulletproof, so understand that if you use them and they break, you'll have to dig around yourself.
If you want to vary the edge width using a map, you will have to activate the map toggle. I couldn't get the 'if connected' trick to work for auto-plugging the map in; missing some voodoo, I think.
Settings are pretty simple. As standard, the shader takes regularly spaced samples in a circle, so 6 samples is ample. If you want solid lines, tick the box. If not, it returns a gradient, and then you might need more samples, plus random samples, to get a clean gradient.
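For anyone curious, regularly spaced circle sampling is just evenly distributed angles. A quick Python sketch of the pattern (my own illustration, not the shader's source):

```python
import math

def circle_samples(n=6):
    """Return n regularly spaced unit directions around a circle,
    i.e. the default sampling pattern described above."""
    return [(math.cos(2.0 * math.pi * i / n),
             math.sin(2.0 * math.pi * i / n)) for i in range(n)]
```

With random sampling you would jitter those angles per shading point instead, which is why more samples are needed for a clean gradient.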
The angle limit clips the returned convex and concave angles.
To make a dented edge, just plug a turbulent noise through a fit range to vary the radius. On the returned edge, you can then use the float mask to add smaller dots/dents for more detail.
Surface offset plays an important part: if it's very low, you'll get all the tiny polygon angle changes returned. These can be clipped slightly with the angle limit, but sometimes it's easier to just push the offset further away.
Here's a compilation of updated Mantra shaders. I've left the HDA unlocked so you can root around inside. These are not bulletproof, so understand that if you use them and they break, you'll have to dig around yourself.
Technically similar to the ones I've done before, but now I've swapped the crazy matrix maths for the Sample Sphere VEX function, which makes it a bit more consistent and easier to modify.
Also, when you increase Pixel samples in the Mantra ROP, you can lower the shader samples.
A little tip, if like me you hate these ugly new VOP loops. Not sure what SideFX were thinking. Yes, they aren't buried in a sub node, but damn they are messy. When it comes to laying out VOPs I get extreme OCD; coming from Softimage, this is a mess. Plug anything in incorrectly and it goes berserk, creates extra channels, then just dies.
In my head, this is much cleaner and easier to copy to other networks. The old loops:
These are still accessible. As with most things, SideFX have kept all the legacy nodes in the background for compatibility, and thankfully we can reveal them.
Open a textport, type opunhide, and hit return.
Find the one you want, then type it in like this: opunhide Vop for
It will now show up in the tool search under the Digital Asset section, so you can re-activate all the good old loop tools and escape spaghetti-window hell...
Here's a very useful VOP/VEX node available in Houdini. Sample Sphere creates samples on a sphere. Before now, to sample spherically, I was making a direction and then rotating it, but using this is far simpler. With a usample input it takes a Vec2: put simply, one 0-1 value controls rotation around the circumference of the sphere, and the other 0-1 value controls the up/down of the sample.
Just put this into a For Loop and randomise the two usample Vec2 inputs, and you can sample randomly and spherically from a position.
The direction specifies the overall direction that the sphere is on, so if you rotate this, then the sample sphere rotates.
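The underlying mapping is easy to sketch outside Houdini. Here's a Python version of the Vec2-to-sphere idea; note this uses a standard uniform-area mapping, and the exact formula inside the VEX function may differ:

```python
import math
import random

def sample_sphere(u, v):
    """Map a 0-1 Vec2 onto a direction on the unit sphere:
    u spins around the circumference, v moves from pole to pole.
    (Uniform-area sketch; not necessarily the exact VEX mapping.)"""
    theta = 2.0 * math.pi * u            # rotation around the circumference
    z = 1.0 - 2.0 * v                    # 0 -> top pole, 1 -> bottom pole
    r = math.sqrt(max(0.0, 1.0 - z * z)) # radius of the circle at height z
    return (r * math.cos(theta), r * math.sin(theta), z)

# Randomising u and v in a loop scatters directions over the whole sphere,
# just like feeding random values into the usample input inside a For Loop.
dirs = [sample_sphere(random.random(), random.random()) for _ in range(8)]
```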
Here's an example fed into an Add Point, so you can see the result:
What's the point? I hear you cry. Well, you can add this to a raycast/rayhit and use it in the shader context to sample the scene at every pixel hit from Mantra. So think AO, thickness, edges, and much more. Here's a demo file: SampleSphereDemo Hip
Recently I've been obsessed with melting statues; it seemed like a fun way to get to grips with FLIP in Houdini. Here are a few of the results. Out of the box there are some issues melting rigid shapes, the main one being that the amount of detail in the original mesh never really transfers through the VDB, particle, and remesh workflow. To help this, I calculated the nearest position on the start mesh for every FLIP particle, then transferred that data back to the remeshed fluid surface. Using that, I could snap the new mesh to the old and blend into the fluid by distance etc. It works pretty well in the end. Using the nearest-surface XYZ distance was sufficient, but it could be expanded to use a raycast approach in the future.
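The snap-and-blend step can be sketched like this in Python. This is a brute-force toy version of the idea (in Houdini you'd use xyzdist/primuv against the original surface rather than a point loop), and the function name and weights are my own illustration:

```python
import math

def snap_and_blend(fluid_pts, orig_pts, blend_dist):
    """For each remeshed fluid point, find the nearest point on the
    original mesh and blend toward it: points right on the statue snap
    to it, points beyond blend_dist stay pure fluid."""
    out = []
    for p in fluid_pts:
        # Nearest original point (brute force; xyzdist in Houdini).
        near = min(orig_pts,
                   key=lambda q: sum((p[i] - q[i]) ** 2 for i in range(3)))
        d = math.sqrt(sum((p[i] - near[i]) ** 2 for i in range(3)))
        # Weight: 1 at the original surface, falling to 0 at blend_dist.
        w = max(0.0, 1.0 - d / blend_dist)
        out.append(tuple(p[i] * (1.0 - w) + near[i] * w for i in range(3)))
    return out
```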