Tool overview: Nonlinear Deformers

I made this video to supplement my in-class lecture on nonlinear deformers. I found that once I’d outlined a lecture, it didn’t take very long to record a succinct version. It’s a little more work than just recording the lecture in class, but I think there’s value in it for students.

It’s available in HD so that the Maya interface is legible.


Landscapes of skin and coal, pt. 2

This is probably the image I’m happiest with from this semester’s work. It’s based on terrain data from the Black Mountain mine, on the Virginia-Kentucky border.


Rendered in Maya, with some post in Photoshop. Getting this to 8-bit for saving to web does some bad things to the contrast – the print looks substantially better, I think.

Bringing real-world elevation data to Maya

EDIT: As of September 14, 2014, this method doesn’t work: Zonum’s terrain data exporter is returning elevations of -9999 for every data point.



The USGS has massive amounts of elevation data available through the National Map Viewer. There’s got to be a way to get that into Maya, right?

One way is to take the ArcGrid data from the USGS server, convert it to a GeoTIFF, then use that as a displacement map in Maya. (See USGS tools here, and other free tools here.) This method works well for large tracts of land, but if you’re trying to isolate a particular ridge or valley, it can be a bit cumbersome.

Here’s the method I use, which gets elevation data at a slightly more local level. Briefly, the method is Zonum Solutions > .csv/spreadsheet > .asc > MeshLab > .obj > Maya.

1. Go to Zonum Solutions’ terrain data exporter.

This works basically like the Google Maps interface, because it is one. You can specify an area to extract data from either by latitude/longitude, or by navigating with the map and clicking the Get Extent From Map button.

As an example, let’s say you wanted elevation data for Mt. Hood. Navigate there by zooming and positioning the map interface.



2. Now click on Get Extent From Map. You should see your map window turn pink.


3. Now you can choose how many samples you want for the area you’ve chosen. Zonum will automatically get data from the highest-resolution data set available through the USGS, and will allow you to sample up to 5000 data points. If you’ve got a few minutes to kill, put 5000 in the box, leave “Randomly” selected, and click the “Get Elevations >>” button.


4. You will now have the option to watch Zonum draw the points as it samples them, or just wait for the dataset, which will pop up in a window. Either way, it will take several minutes to sample 5000 points. Click the “Click here to Start” button to…well, start. Go get some coffee. Or make a sandwich.

5. Although the map area looks like it changed since the last step, Zonum will sample from the original extent you chose – the area outlined in red from Step 3.

When it’s done sampling, you’ll see something that looks like this:


(I got impatient and took a screencap before it finished.)

6. Now you have to choose what separator you want. Choose comma, because the .csv file we’ll use in the next step uses a comma as a standard separator.

When you’ve chosen your separator, Zonum will pop up a window with 5000 lines of data. Select all and copy this data, EXCEPT for the first line (you don’t need the labels).


7. Unfortunately, Excel isn’t smart enough to let you paste this straight into a spreadsheet. To work around this, paste it into an empty Notepad document, then save the file as yourFileName.csv. You can save this anywhere, as long as you can find it.


8. In your spreadsheet software, you need to convert the longitude and latitude to meters. To do that, you need to know how many meters a degree of latitude/longitude is at your location. If you don’t convert, your point cloud will just be a narrow line. Luckily, there’s a handy tool here to help you figure it out. Mt. Hood is at about 45 degrees north latitude.

A degree of latitude there is about 111,000 meters. A degree of longitude there is about 79,000 meters.
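Those figures come from the standard spherical-Earth approximation: a degree of latitude is roughly constant everywhere, while a degree of longitude shrinks with the cosine of the latitude. Here’s a quick Python sanity check (the constants are common approximations, not exact WGS84 values):

```python
import math

# Approximate meters per degree at a given latitude (spherical-Earth estimate).
# The constants below are common approximations, not exact WGS84 values.
def meters_per_degree(lat_deg):
    lat_m = 111132.0                                     # meters per degree of latitude
    lon_m = 111320.0 * math.cos(math.radians(lat_deg))   # shrinks toward the poles
    return lat_m, lon_m

lat_m, lon_m = meters_per_degree(45.0)
print(round(lat_m), round(lon_m))  # prints: 111132 78715
```

At 45 degrees north that works out to roughly the 111,000 m and 79,000 m used below.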

9. In column D, multiply the value in column A by 111000. In column E, multiply the value in column B by 79000.


Copy these new values, then paste the values only over the old ones in columns A and B. You can now delete columns D and E. Your spreadsheet should now look something like this:


I know those numbers look crazy, but they’re all in the same units, and that’s the important thing.

These screenshots are from Excel, but you can use Google Spreadsheets or really any spreadsheet software that handles .csv files.

10. Save out your .csv file. Excel will warn you that the .csv format doesn’t have the same features as the native Excel format; just hit “Yes” and save the file.

11. Now we have a list of x, y, and z coordinates in a file, one point per line. Luckily, that matches the .asc file format, which MeshLab can read. All we need to do to get that to work is change the extension from .csv to .asc.
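If you’d rather script steps 7–11 than do them in a spreadsheet, the whole conversion fits in a few lines of Python. This is a sketch, not part of the original workflow: it assumes the export is ordered latitude, longitude, elevation (swap the two factors if yours is longitude-first), and the file names are placeholders.

```python
import csv

# Convert a Zonum-style CSV (lat, lon, elevation per line) into a MeshLab-readable
# .asc point cloud by scaling degrees to meters. The 111000/79000 factors are the
# approximate meters-per-degree values at ~45 degrees north used in this post;
# adjust them for your own latitude. Column order is an assumption -- swap the
# factors if your export puts longitude first.
METERS_PER_DEG_LAT = 111000.0
METERS_PER_DEG_LON = 79000.0

def csv_to_asc(csv_path, asc_path):
    with open(csv_path, newline="") as src, open(asc_path, "w") as dst:
        for row in csv.reader(src):
            # skip blanks and the header line of labels
            if not row or not row[0].strip().lstrip("-").replace(".", "", 1).isdigit():
                continue
            lat, lon, elev = (float(v) for v in row[:3])
            dst.write(f"{lat * METERS_PER_DEG_LAT} {lon * METERS_PER_DEG_LON} {elev}\n")
```

Running `csv_to_asc("yourFileName.csv", "yourFileName.asc")` also takes care of dropping the label row, so you can paste the whole Zonum window into the file.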

12. Open up MeshLab (which you can download from this page). MeshLab has a lot of great features for dealing with point clouds and large meshes, but we just need it to set normals, create a mesh, and export it as an .obj.

Go to File > Import Mesh, and navigate to your yourFileName.asc.


You’ll end up with something that looks roughly like this:


13. Now we need to give these points normals, so that MeshLab can do its thing and construct a mesh.

To do that, go to Filters > Point Set > Compute normals for point set. I’ve used the following settings with good results. You can check your normals at Render > Vertex Normals. The normals should all be pointing up.


14. Now we need to get MeshLab to build a mesh from these points. To do this, go to Filters > Point Set > Surface Reconstruction: Poisson. I use an Octree Depth of 12 and a Solver Divide of 7, because those settings have worked pretty well for me.

Make sure your layers panel is visible, and that your view is set to flat lines. Then hide the point cloud layer, and you should see your mesh. Almost done!



15. Export this object as an .obj by going to File > Export Mesh and choosing .obj from the list.


Save it somewhere you can find it. Be sure to include Normals when you export (it should be checked for you in the dialog box).


Now, in Maya, go to File > Import, and choose your .obj file. Your mesh will appear WAY the hell away from the origin – remember those crazy coordinates from before?
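If you’d rather not go hunting for the mesh, one option (hypothetical, not part of the original workflow) is to recenter the point cloud before the MeshLab step by subtracting the centroid, so the terrain imports near the origin. A plain-Python sketch, with placeholder file names:

```python
# Recenter an .asc point cloud on the origin: subtract the centroid of the x/y
# columns and shift elevations so the lowest point sits at z = 0. Run this on the
# .asc file before importing into MeshLab, and the reconstructed mesh will land
# near Maya's origin instead of hundreds of kilometers away.
def recenter_asc(in_path, out_path):
    points = []
    with open(in_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 3:
                points.append(tuple(float(v) for v in parts[:3]))
    cx = sum(p[0] for p in points) / len(points)   # centroid x
    cy = sum(p[1] for p in points) / len(points)   # centroid y
    min_z = min(p[2] for p in points)              # lowest elevation
    with open(out_path, "w") as f:
        for x, y, z in points:
            f.write(f"{x - cx} {y - cy} {z - min_z}\n")
```

Alternatively, you can just select the imported mesh in Maya and zero out its translation by eye; the script only saves you the hunt.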


I’m still playing around with what the ideal scale/map extent for 5000 data points is. At the map extent I chose, Mt. Hood gets a bit rounded off, but that’s where some nice bump/displace/normal mapping and texturing comes in.


Landscapes of skin and coal: procedural skin displacement map

In this semester, I’ve been working with mapping coal mining data. As much of my work deals with materials, I’m playing with the combination of coal and mineral materials with human skin. I’m also referencing pictorial landscapes and trying to pair the aspirational feel of some of that style of photography with the visceral and emotional impact of familiar human skin.

This work has taken quite a bit of development time, as the technical aspects of subsurface scattering, map creation, render optimization, and rendering at print sizes have been a challenge.

One of the first problems to solve for print was optimizing map size and texture space. Even using 8k maps, printing at gallery sizes (30″, ~200dpi) means pixelation for the compositions I was choosing. So I developed a procedural skin bump/displacement map. Here’s a render: Maya Lambert, one spotlight, depth map shadows.



As a flat render, it’s not going to wow – but put it on a modeled form, or use it as one part of a good skin shader, and it works pretty well. It actually performs better with a bit of UV distortion, which for my current project was perfect.
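For the curious, here’s a generic sketch of the idea behind a procedural map – a few octaves of value noise, bilinearly interpolated and summed. This is not the actual Maya shader network, just an illustration of why procedural maps can be evaluated at any resolution without pixelating:

```python
import random

# Generic value-noise heightmap: sum several octaves of a random lattice,
# each at double the frequency and half the amplitude of the last. Because
# the lattice is interpolated, the map can be sampled at ANY width/height
# without pixelation -- the core advantage of a procedural bump/displacement
# map over a fixed-resolution texture.
def value_noise(width, height, octaves=4, seed=1):
    rng = random.Random(seed)
    lattices = []
    for o in range(octaves):
        n = 2 ** (o + 2)  # lattice resolution doubles each octave
        lattices.append([[rng.random() for _ in range(n + 1)] for _ in range(n + 1)])
    img = []
    for y in range(height):
        row = []
        for x in range(width):
            v, amp, total = 0.0, 1.0, 0.0
            for o, lat in enumerate(lattices):
                n = 2 ** (o + 2)
                fx, fy = x / width * n, y / height * n
                x0, y0 = int(fx), int(fy)
                tx, ty = fx - x0, fy - y0
                # bilinear interpolation between the four surrounding lattice points
                a = lat[y0][x0] * (1 - tx) + lat[y0][x0 + 1] * tx
                b = lat[y0 + 1][x0] * (1 - tx) + lat[y0 + 1][x0 + 1] * tx
                v += (a * (1 - ty) + b * ty) * amp
                total += amp
                amp *= 0.5
            row.append(v / total)  # normalize to [0, 1] grayscale
        img.append(row)
    return img
```

The same function renders a 512-pixel map or an 8k map from identical inputs, which is what makes this approach print-friendly.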