Using a Displacement Map to Build a Scene With SceneKit Under SwiftUI

Continuing the adventure into SceneKit’s capabilities

A displacement map in SceneKit

Around the same time the Alien movie was being made, a second sci-fi film was in production, one that would become a $10 billion franchise that Disney recently rebooted. That movie came out in 1977, and it was called Star Wars.

Behind it was a company that would go on to change the way computers were used in movies: a company in a warehouse in Van Nuys, California, calling itself ILM, or Industrial Light and Magic.

It is also the company behind an open-source graphics standard still used in the industry to this day: the OpenEXR multi-channel raster file format, a standard Apple partly adopted in SceneKit in 2017.

Join me in this article to learn a little more about it, and how you can use it to create both artificial and real-world meshes for your games and the like.

In a previous article on this subject, I started by building a mesh purely with SceneKit. I then used some of Apple’s prebuilt shapes to make the process a little faster. Although, as you know, nothing is ever fast enough, so join me to learn how to do this on an industrial scale, quicker still.

OK. We start, then, with the only mention of any substance of displacement maps, which is in the WWDC 2017 What’s New in SceneKit presentation.

Some nineteen minutes into the video, Amaury, the presenter, mentions displacement maps, explaining that a displacement map is an image in which each pixel stores the height, or elevation, of a point.

He then mentions terrain rendering, which is a mesh to you and me. In his next slide, he shows us a plane that appears on screen as a vast collection of polygons, which he then tells us we should tessellate. We’ll talk more about that later.

A tessellated plane that we deform using a height map, something you and I can do using just two lines of code. I think not. So, having spent three minutes covering the new functionality, he starts talking about vector maps, an extension, no doubt confident that he has completely covered the subject.

So, be prepared. We got three minutes of video made five years ago to point us in the right direction. Here goes.

Join me on a journey to implement displacement maps.

I was excited; this looked like the way forward, the opportunity to take my custom geometry to the next level. I set about creating a plane of polygons, something to look like that Apple slide.

As I did in a previous piece, this article is as much about the journey as the destination. If you just want the answer now, fast forward to the code.

Here’s the code where I added the two magic lines shown on the Apple slide and ran it on the Xcode simulator. Sadly, the result was an unspectacular failure. Absolutely nothing happened. All I got was a square of custom polygons.
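The heart of that attempt was little more than this. It is a sketch rather than the full listing; customGeometry stands in for the hand-built grid of polygons from the previous article, and the asset name is a placeholder:

```swift
import SceneKit
import UIKit

// The two "magic" displacement lines, bolted onto my custom geometry.
// `customGeometry` is the hand-built grid of polygons; "displacement"
// is a placeholder asset name for the map I downloaded.
func addDisplacement(to customGeometry: SCNGeometry, in scene: SCNScene) {
    let material = SCNMaterial()
    material.displacement.contents = UIImage(named: "displacement")
    material.displacement.intensity = 1.0

    customGeometry.firstMaterial = material
    scene.rootNode.addChildNode(SCNNode(geometry: customGeometry))
}
```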

It didn’t work.

I rewatched the video. Was there a link between the number of polygons in the custom shape and the map’s resolution? I built a polygon with more vertices, the same number as I had pixels [some 256 x 256], and tried again.

It didn’t work.

In the back of my mind, this seemed a little nuts. I downloaded some more maps and tried again, increasing the number of polygons as I did so. Of course, I was dead in the water with a map of 1024×1024 pixels. I now needed a custom geometric shape of one million polygons.

It didn’t work, and the whole idea was starting to fall apart, as was I.

I took the dog for a walk, returned to rewatch Amaury’s talk, and did some more searches on Stack Overflow, which I should perhaps have mentioned earlier.

Apple makes almost no mention of this capability. Either everyone else has got it to work easily, and I am stupid, or it is so badly documented most people have never tried.

I found a mention of tessellation in one of the two or three questions on the subject from the past five years. Jeez. I needed to add tessellation. I thought about it a little more: why would I need a custom geometry if SceneKit is chopping up my node into polygons anyway?

I commented out the polygon code and created a single plane node to which I added a tessellator. This code is the result:
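Or rather, a sketch of it; the plane size and tessellation factors here are illustrative, not the exact values I used:

```swift
import SceneKit
import UIKit

// A single plane that SceneKit tessellates for us, replacing the
// hand-built grid of polygons.
func makeTessellatedTerrain() -> SCNNode {
    let plane = SCNPlane(width: 16, height: 16)

    let tessellator = SCNGeometryTessellator()
    tessellator.edgeTessellationFactor = 64     // subdivide along the edges
    tessellator.insideTessellationFactor = 64   // subdivide the interior
    plane.tessellator = tessellator

    let material = SCNMaterial()
    material.displacement.contents = UIImage(named: "heightMap") // placeholder asset
    material.isDoubleSided = true
    plane.firstMaterial = material

    let node = SCNNode(geometry: plane)
    node.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0) // lay the plane flat
    return node
}
```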

And ran the code.

It didn’t work.

I looked up the Apple documentation on this, hoping there might be a clue, and downloaded the demo code from WWDC 2017 too; neither helped. I looked up an article I wrote on analyzing image assets and ran it on my “displacement map.” It contained data, although evidently not the right flavor. I watched the video again and sent Amaury a request to connect on LinkedIn. This was a desperate measure.

As I rewatched it, I realized he also called displacement maps height maps, so I googled some. They looked different from my map. I also caught him mentioning that height maps are black and white, which my map wasn’t; sure, it had some odd colours, but not black and white. I looked at the dog, who gave me that “I told you so” look back. I found a height map and reran it.

It didn’t work.

I was starting to go doolally at this point. I returned to Stack Overflow and read all the comments on the posts I could find on the subject.

About an hour before I needed to meet my client for whom I was investigating this, I found the answer. SceneKit uses the Metal framework to do this job, and to use that, you need to run your code on an actual device. I ran my code on a real device, and well…

It worked! Saved by the bell, as they say.

So, here you go, you have the complete code to do displacement maps, which you should search for on Google as height maps. For the most part, these maps should look like black and white smudges, or very faint white smudges.

Finally, a reward if you got to the end of this article: a one-minute video of a run through the valley with some suitable music in the background. A valley that is, in fact, a reversed height map. A map built with more than two lines of code, Amaury.

A displacement map that I have inverted to create a valley that I fly through.

Of course, I am not quite finished yet, because as good as this is, I may want to build my own scenes as well as landscapes I upload from around the planet. To do that, I referenced this article I wrote about image processing and added to it the formula to create height maps SceneKit would understand. I then built this as a demo: a man-made mountain that uses all 256 possible height values available.

The formula you need to make this work comes down to a single line of code.
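In essence, it maps each elevation sample onto the 0 to 255 range a grayscale height map can hold. Here is a minimal sketch of that mapping; the names are mine, and minElevation and maxElevation come from scanning the source data first:

```swift
// Map a raw elevation sample onto the 0...255 range a height map can hold.
// minElevation and maxElevation come from a first pass over the source data.
func grayValue(for elevation: Double,
               minElevation: Double,
               maxElevation: Double) -> UInt8 {
    // Do the arithmetic in Double; integer division here would silently
    // truncate and flatten the terrain.
    let range = max(maxElevation - minElevation, .ulpOfOne)
    let normalized = (elevation - minElevation) / range
    return UInt8((min(max(normalized, 0), 1) * 255).rounded())
}
```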

Here’s the code into which you feed the grayscale image I built to create the video I showed you.
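Again as a sketch: it reuses the makeTessellatedTerrain() helper from earlier, with the grayscale image below assigned to the material’s displacement contents, and wraps the scene in SwiftUI’s SceneView. Remember, it only displaces on a real device:

```swift
import SwiftUI
import SceneKit

// A sketch of the SwiftUI side. makeTessellatedTerrain() is the helper
// sketched earlier; point its displacement contents at the grayscale
// mountain image shown below.
struct TerrainView: View {
    private let scene: SCNScene = {
        let scene = SCNScene()
        scene.rootNode.addChildNode(makeTessellatedTerrain())
        return scene
    }()

    var body: some View {
        // Displacement only happens on a real device, not the simulator.
        SceneView(
            scene: scene,
            options: [.allowsCameraControl, .autoenablesDefaultLighting]
        )
        .ignoresSafeArea()
    }
}
```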

The height-map image used to create the mountain video, some 255 grayscale values

As you’ve seen, it wasn’t an easy win. Here is a summary of the findings:

  • Use a real device. This won’t work on the simulator.
  • Be sure you’re using height/EXR image maps, the ones that look like black and white smudges.
  • Forget custom geometry; it was a red herring. Tessellation is the name of the game.
  • Not all maps are made equal; tune the result with the intensity field (see the sketch after this list).
  • Be careful with the mappingChannel if you’re using EXR images; they can contain multiple sets of data [it’s intentional].
  • If you’re planning to build your own maps and need to start fooling around with UInt32s, be very careful with multiplication and division; the latter, in particular, may not work the way you imagine it would.
  • Be warned: height maps can only represent areas with no more than 256 distinct heights. If your data has greater variation, you’ll need to do more processing or come up with some creative code.
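To make those last points concrete, the per-map tuning lives on the material’s displacement property; the asset name and values below are examples only:

```swift
import SceneKit
import UIKit

// Per-map tuning on the displacement property. The asset name and
// numbers here are examples, not recommendations.
func tuneDisplacement(on material: SCNMaterial) {
    material.displacement.contents = UIImage(named: "terrain")
    material.displacement.intensity = 0.4    // scale the displacement per map
    material.displacement.mappingChannel = 0 // which texture-coordinate set to sample
}
```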

All of which brings me to the end of this article.

I hope you enjoyed reading it as much as I did writing it.

Stay tuned for more journeys into SceneKit — and ARKit, its close cousin.
