Choosing 3D Software
I want to make cool 3D shit. I had heard of some programs like Maya, Unreal, and Blender, and I just assumed I would take a quick peek at each, read a review or two, and be on my way. It didn’t turn out to be quite that simple.
Prerendered
The workflow of these tools requires doing all of the modeling, materials, texturing, and lighting up front, then rendering, and finally displaying the result, generally as a video.
Modeling
Polygons
Blender, Maya or Cinema 4D
Sculpting
ZBrush
Node Based
Houdini
Parametric
Rhino
Animating
Rendering
Arnold or Redshift
Realtime
Realtime renderers, as discussed below, still involve quite a bit of upfront work in terms of modeling, texturing, and lighting, but they are not pre-rendered. This has one advantage and one disadvantage. The disadvantage is that the quality will be lower than what is possible with prerendering; the math involved in tracing light around a scene as it bounces off objects and into the camera is expensive, and doing it in real time forces compromises. The advantage is that if all of the rendering happens in real time, it can dynamically react to outside inputs such as a keyboard, mouse, or joystick, or even sound, Kinect 3D camera data, etc.
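To make the trade-off concrete, here is a minimal sketch (no particular engine’s API, all names are illustrative): a realtime renderer has a fixed per-frame time budget, and because the scene is computed every frame, each frame can react to whatever input just arrived instead of being baked ahead of time.

```python
# Illustrative sketch of the realtime constraint: at 60 fps, all input
# handling, simulation, and rendering must fit within one frame budget.
FPS = 60
FRAME_BUDGET_MS = 1000 / FPS  # roughly 16.7 ms per frame

def run_frames(inputs):
    """Advance a trivial 'scene' one frame per input sample.

    `inputs` stands in for whatever arrives each frame (keyboard,
    mouse, sound level, etc.); the state updates immediately, which
    is what a prerendered workflow cannot do.
    """
    camera_x = 0.0
    frames = []
    for sample in inputs:
        camera_x += sample       # react to this frame's input
        frames.append(camera_x)  # "render" using the updated state
    return frames

print(round(FRAME_BUDGET_MS, 1))  # 16.7
print(run_frames([1, -0.5, 2]))   # [1.0, 0.5, 2.5]
```

The point is not the arithmetic but the structure: the loop body is the whole renderer, so anything it can read, it can respond to on the very next frame.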
Game Engines
Game engines are the first step in moving towards real time. They are, however, primarily designed for making games and similar interactive content. Their input options are typically limited to game controllers, mice, and keyboards.
Unreal
Unity
Interactive Video
Finally we take a look at a set of tools designed for real time motion graphics and interactive VFX. While game engines were designed with gamers in mind, these tools were designed with performers in mind: DJs, rock stars, live TV, events, and art installations. There are tradeoffs in every approach, and these tools tend to be more general purpose than game engines in terms of what they can show, so there is a further falloff in the polish of the output. But a game engine won’t have a built-in mechanism to project a rendered scene across a screen or wall, to react to how people are dancing in the audience, or to take music as a real time input and dynamically shape an animation with it. It is for this flexibility that these are the tools I ultimately settled on to explore in my art practice.
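The core trick behind audio-reactive visuals can be sketched in a few lines. This is not any specific tool’s API (TouchDesigner and Notch each have their own audio-analysis nodes); the function names and the gain parameter here are my own illustrative choices. The idea is simply to extract a feature from the live audio, such as the RMS loudness of the latest buffer, and map it onto an animation parameter:

```python
import math

# Illustrative sketch: map a live audio feature onto a visual parameter.
# All names here are hypothetical; real tools expose this as analysis nodes.

def rms(samples):
    """Root-mean-square loudness of one audio buffer."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def audio_to_scale(samples, base=1.0, gain=2.0):
    """Map loudness onto the scale of some on-screen object."""
    return base + gain * rms(samples)

quiet = [0.0] * 4
loud = [0.5, -0.5, 0.5, -0.5]
print(audio_to_scale(quiet))  # 1.0 (silence leaves the object at base scale)
print(audio_to_scale(loud))   # 2.0 (rms 0.5 times gain 2.0 on top of base)
```

Run this per audio buffer inside the render loop and the object pulses with the music; swap RMS for a frequency-band level and it reacts to bass or treble instead.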
There are a handful of pieces of software that fit the bill, but the two established heavyweights in the field are TouchDesigner and Notch.