
Manmade (WIP)

Data-Driven Procedural Environments


Project Description

Manmade is an exploration in creating data-driven procedural environments. It uses recorded CO2 levels from the beginning of the Industrial Revolution through 2021 to drive the generation of unique landscape compositions. Each year from 1760 to 2021 gets an entirely unique landscape that is procedurally generated, surfaced, lit, and rendered, so that the gestalt serves as a reflection of the data used in its creation. It's interesting that even through this process of creating complicated compositions and environments, the values and interpreted meaning behind those source numbers can still be conveyed. When the project is finished, collections of images will be rendered exploring the landscapes generated for each of the 262 years of CO2 data.

Just as the bars in a bar graph map their heights to the minimum and maximum values being plotted, certain aspects of the scene are mapped to the CO2 data in a similar way. CO2 levels drive the scene lighting, the count and scale of manmade objects, and environmental conditions like the amount of smoke, fog, and wind. For example, higher CO2 levels correlate to greater numbers of manmade objects. Taken together, these results produce scenes that take on a very different character at the lowest and highest levels of CO2. So, where a bar graph maps height to the numbers, this project maps scene composition, character, and content to those numbers.
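To make the mapping concrete, here is a minimal sketch of that kind of linear remapping in plain Python. The CO2 range, parameter names, and output ranges are illustrative placeholders, not the values used in the project.

```python
def fit(value, old_min, old_max, new_min, new_max):
    """Remap value from [old_min, old_max] to [new_min, new_max], like Houdini's fit()."""
    t = (value - old_min) / (old_max - old_min)
    return new_min + t * (new_max - new_min)

# Hypothetical CO2 range for the 1760-2021 dataset (ppm).
CO2_MIN, CO2_MAX = 280.0, 420.0

def scene_parameters(co2_ppm):
    """Derive a handful of scene controls from a single CO2 value."""
    return {
        "object_count": int(fit(co2_ppm, CO2_MIN, CO2_MAX, 5, 400)),  # more objects as CO2 rises
        "object_scale": fit(co2_ppm, CO2_MIN, CO2_MAX, 0.5, 2.0),
        "smoke_density": fit(co2_ppm, CO2_MIN, CO2_MAX, 0.0, 1.0),
        "sun_intensity": fit(co2_ppm, CO2_MIN, CO2_MAX, 1.0, 0.3),    # dimmer, hazier light at high CO2
    }

print(scene_parameters(315.0))
```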

The manmade objects used in this work are not chosen to depict the worst of humanity but rather to represent a gamut of human intention. They exist due to our need to control, to protect, to build, to provide convenience, to destroy, to kill, to provide food, to foster progress, to evoke caution, etc. Increased carbon emissions come as a result of the growth of manmade processes (energy production, industry/manufacturing, transportation, etc.) that we've been able to exploit to accomplish both incredible and horrific things. The effects of the processes that have produced higher and higher CO2 levels still wreak havoc on the environment regardless of our good or bad intentions. Thematically, this is the tie between the subject matter of the images and the data being used.


Landscape
The landscape is procedurally generated through a number of steps: starting with very simple geometry, converting it to heightfields, running multiple terrain simulations to shape the landscape with patterns of erosion, and finally converting back to geometry and reducing the mesh. The large rocks start as basic cubes and spheres scattered across the landscape. They then go through a process of subdivision and shaping to give them a more organic form, are shattered and broken in various ways, converted to detailed volumes to add variation with various noise patterns, and finally converted back to geometry, where the mesh is reduced to make the models easier to work with.
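As a rough illustration, here is a bare-bones sketch of a heightfield chain of this shape built with Houdini's Python (hou) module. The node type names and the absence of any tuned parameters are assumptions for illustration; this is not the project's actual network.

```python
import hou

terrain = hou.node("/obj").createNode("geo", "terrain")

# Start from a simple heightfield rather than dense geometry.
hf = terrain.createNode("heightfield", "base_heightfield")

# Layer in large-scale shape and smaller detail with noise.
noise = terrain.createNode("heightfield_noise", "shape_noise")
noise.setInput(0, hf)

# Run erosion to carve the drainage and sediment patterns described above.
erode = terrain.createNode("heightfield_erode", "erosion")
erode.setInput(0, noise)

# From here the result would be converted back to polygons and the mesh reduced;
# those nodes are omitted since their names and versions vary between Houdini releases.
terrain.layoutChildren()
```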

Manmade Objects
The charred wood objects are all geometry created procedurally from basic source geometry. A process was created to chop each object up into regular little pieces similar to the patterns found in burnt wood. Layers of masks were then generated and used to drive a mixture of material characteristics like glowing hot embers, blackened wood where the grain is still visible, and white ash on thinner parts of the model that have burned the longest.
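As a toy illustration of how layered masks can drive a mixture of material characteristics, the sketch below blends ash, ember, and char weights from two hypothetical normalized per-point values. The names and thresholds are made up for the example and are not the project's actual masks.

```python
def burn_layers(thickness, heat, ash_threshold=0.15, ember_threshold=0.8):
    """Toy mask layering: thin areas read as white ash, the hottest areas glow as
    embers, and everything else stays blackened wood. Inputs are assumed to be
    normalized 0-1 per-point values; all names and thresholds are illustrative."""
    ash = 1.0 if thickness < ash_threshold else 0.0
    ember = max(0.0, (heat - ember_threshold) / (1.0 - ember_threshold)) * (1.0 - ash)
    char = max(0.0, 1.0 - ash - ember)
    return {"ash": ash, "ember": ember, "char": char}

# Example: a thick, very hot piece reads as a mix of glowing ember and blackened char.
print(burn_layers(thickness=0.6, heat=0.9))
```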

Plants
All the non-grass plants were derived from simple single-surface plant assets. The planar source geometry was run through a process to give the models thickness and convert them to watertight meshes. This creates geometry with variable thickness that catches light much better than simple planes and produces more desirable subsurface scattering results.

Grass
A palette of 150 unique blades of grass was generated and scattered across the terrain.

Simulation
Once all the natural and manmade objects were chosen, they were imported into the scene and dropped onto the terrain with a rigid body simulation. After the objects were placed, each got its own pyro simulation to create the wispy smoke rising from the charred objects.

Houdini TOPs
Every aspect of the creation of each landscape for every year was derived from a collection of CO2 levels and seed values contained within a single CSV file. This file was loaded via a Houdini TOPs network, where the values contained within it were used to generate each landscape and its constituent parts. Every piece of geometry and volume used to compose the landscapes was generated individually and saved to disk, then collected into a scene for final rendering.
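Below is a small stand-in for what that network consumes from the file: one record per year, each carrying the CO2 level and seed that drive that year's landscape. The column names are assumptions about the CSV layout rather than the actual ones.

```python
import csv

def load_year_records(csv_path):
    """Read the per-year CSV into a list of dicts, one work-item-worth of data per year.
    Column names ("year", "co2_ppm", "seed") are assumed; the real file may differ."""
    records = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            records.append({
                "year": int(row["year"]),
                "co2_ppm": float(row["co2_ppm"]),
                "seed": int(row["seed"]),
            })
    return records

# Each record would become one TOPs work item, driving the generation, caching to
# disk, and final render for that year's landscape.
```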

Rendering
Camera placement is done by hand, as it gives me an opportunity to explore what unique compositional happy accidents emerge with each year. I began the project using Redshift (my preferred render software) but ultimately settled on Arnold 7, as the large amounts of geometry and smoke volumes exceeded the VRAM of my poor ol' 2080 Ti's 😢.

Python
I created a dozen or so tools to help with some of the more time-consuming aspects of the project. One example that saved me an enormous amount of time was a tool for picking a folder containing .bgeo.sc files, recursively finding and loading those files, creating a subnetwork to place them in, deleting unwanted attributes from the geometry, creating output nulls, and finally giving me the option to choose a material to apply to all the imported files.
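A condensed sketch of that kind of batch-import tool is below, assuming it runs inside a Houdini session (the hou module). The attribute list being deleted and the material assignment via the container's shop_materialpath parameter are simplifications for illustration, not the real tool's behavior.

```python
import os
import re
import hou

def import_bgeo_folder():
    # Ask for a folder of caches.
    folder = hou.ui.selectFile(title="Choose a folder of .bgeo.sc files",
                               file_type=hou.fileType.Directory)
    if not folder:
        return

    # Container that will hold one File SOP chain per cache.
    container = hou.node("/obj").createNode("geo", "imported_bgeo")

    for root, _dirs, files in os.walk(hou.expandString(folder)):
        for filename in sorted(files):
            if not filename.endswith(".bgeo.sc"):
                continue
            safe_name = re.sub(r"[^0-9A-Za-z_]", "_", os.path.splitext(filename)[0])

            file_sop = container.createNode("file", safe_name)
            file_sop.parm("file").set(os.path.join(root, filename))

            # Drop attributes that aren't needed downstream (names are illustrative).
            clean = container.createNode("attribdelete", safe_name + "_clean")
            clean.setInput(0, file_sop)
            clean.parm("ptdel").set("rest __* uv2")

            out = container.createNode("null", "OUT_" + safe_name)
            out.setInput(0, clean)

    # Optionally pick a material and assign it to everything in the container.
    material_path = hou.ui.selectNode()
    if material_path:
        container.parm("shop_materialpath").set(material_path)

    container.layoutChildren()
```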

This project is still a work in progress. However, all the difficult work is complete. It's now just a matter of rendering when time allows.

Roles

Houdini: TOPs | CHOPs | SOPs | Simulation
Arnold: Rendering | Lighting | Materials
VEX & Python Tools
Procedural Environment

WIP Renders

year: 1760

year: 2021

LookDev
