Getting Started with RealityKit: Shaders


With the release of RealityKit 2, arriving with iOS 15 and macOS 12, we can at last add a geometry or surface shader to an Entity in a RealityKit scene!

What Is a Shader?

A shader is code passed to the renderer, which runs it at specific stages of the rendering pipeline.

The two shaders we now have in RealityKit are geometry and surface shaders. Both are demonstrated in Apple's sample code: Building an Immersive Experience with RealityKit.


A geometry shader (GeometryModifier) alters the position of the geometry’s vertices. In the sample provided by Apple, this is used to add a wavy effect to the seaweed. Previously, the only way to do this would be to bake an animation into a USDZ file, which means you would need to re-create the USDZ every time you tweak the animation (rather than changing a parameter in your shader), and all the seaweed in the scene would either look extremely uniform or need their own USDZ animations.

A SurfaceShader, on the other hand, alters the material applied to a mesh: it changes the shading of each pixel of a material as it is rendered on the screen. Surface shaders can take into consideration the models around or behind them, and can be used to make interesting effects such as glass warping the object behind it, or transitioning between colours.

If you want to play around with shaders outside of RealityKit to gain a better understanding, I’d encourage you to seek out some of the many resources out there, as shaders are by no means a new technology.

Creating a Basic Shader With RealityKit

When creating any kind of shader in RealityKit, you must use a Metal file.

Let’s make a basic looping animation shader to apply to a cube. For this example, the logic is as follows:

  • Take all vertices above the xz-plane (positive y values)
  • Animate them to shrink in toward the y-axis, then expand away from the y-axis by the same amount

To make an oscillating motion like this, we will use a sinusoidal function such as sine or cosine.

With a cube, there are typically four vertices above the xz-plane and four below. The four 3D points above are:

[-0.5, 0.5, 0.5], [0.5, 0.5, 0.5],
[-0.5, 0.5, -0.5], [0.5, 0.5, -0.5]

Notice that all the y values are positive 0.5 (above the xz-plane).

From a top-down view, this is what our cube’s four upper vertices will be doing:

You can explore the above graph in Desmos.

If each point on the cube has a starting point at (±0.5, ±0.5) and we want to:

  • Move out to the expanded (±1, ±1)
  • Move back in to (0, 0)
  • Then back to the starting point
  • Repeat the process

Based on the above, we offset each point’s x and z world-space coordinates by a value oscillating between +0.5 and -0.5, with the sign depending on the sign of the starting coordinate.
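Putting those steps together as a single expression (taking x as an example; z behaves identically):

```latex
x(t) = x_0 \, (1 + \sin t)
```

For a starting coordinate x₀ = 0.5, x(t) sweeps 0.5 → 1 → 0.5 → 0 → 0.5 over one period, so the offset x(t) − x₀ = x₀ sin t stays between -0.5 and +0.5.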

Let’s take a look at a basic example of a geometry shader now:
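Here is a minimal sketch of what such a geometry modifier could look like. It assumes the function name simpleStretch used later in this post; the structure follows RealityKit's custom material shader API, and the exact form of the oscillation is illustrative:

```metal
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void simpleStretch(realitykit::geometry_parameters params)
{
    float3 pos = params.geometry().model_position();
    // Only move vertices above the xz-plane
    if (pos.y > 0) {
        // sin(time) oscillates between -1 and +1, so a coordinate
        // starting at ±0.5 receives an offset between -0.5 and +0.5
        float phase = sin(params.uniforms().time());
        params.geometry().set_model_position_offset(
            float3(pos.x * phase, 0, pos.z * phase)
        );
    }
}
```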

Place the above code in a new Metal file in your Xcode project.

This geometry shader will give us the desired output, but there are a few steps needed to apply this to a model in our RealityKit scene.

// Create the cube
let cubeModel = ModelEntity(
    mesh: .generateBox(size: 1),
    materials: [SimpleMaterial(color: .red, isMetallic: false)]
)

// Fetch the default metal library
let mtlLibrary = MTLCreateSystemDefaultDevice()!
    .makeDefaultLibrary()!

// Fetch the geometry modifier by name
let geomModifier = CustomMaterial.GeometryModifier(
    named: "simpleStretch", in: mtlLibrary
)

// Take each of the model's current materials,
// and apply the geometry shader to it.
cubeModel.model?.materials = cubeModel.model?.materials.map {
    try! CustomMaterial(from: $0, geometryModifier: geomModifier)
} ?? [Material]()

In the above code snippet, we fetch the default Metal device with MTLCreateSystemDefaultDevice() and its default shader library with makeDefaultLibrary(). The default device will be the GPU, as all these shaders run on the GPU by default (which is what you want!).

Once that cube is added to the scene, we will get a result like this:

Creating a Slightly More Advanced Shader

In addition to a geometry modifier, RealityKit also offers a way to alter the rendered colour on a material.

In this second example, we’ll take a simple plane made up of 100x100 vertices and make it look like a slowly waving body of water.

To create the base geometry, I’m generating my own mesh using the techniques outlined in my previous post. The method to create it is included in the RealityGeometries package.

import RealityGeometries

var oceanMat = SimpleMaterial(color: .blue, isMetallic: false)
oceanMat.metallic = .float(0.7)
oceanMat.roughness = 0.9

let modelEnt = ModelEntity(
    mesh: try! .generateDetailedPlane(
        width: 2, depth: 2, vertices: (100, 100)
    ), materials: [oceanMat]
)
Currently all we see is a flat plane with the basic material. Let’s add some waves to it with a geometry shader.

The geometry shader here looks more complex, but it was created by a bit of trial and error:
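The listing below is a sketch of such a wave modifier, assuming the function name waveMotion used later in this post; the variable names match the discussion that follows, but the specific constants are illustrative stand-ins for the values I settled on:

```metal
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

constant float maxAmp = 0.03;

[[visible]]
void waveMotion(realitykit::geometry_parameters params)
{
    float3 pos = params.geometry().model_position();
    float time = params.uniforms().time();

    // Distance between waves along x and z, varying with time
    // and position so the waves are not perfectly uniform
    float xPeriod = 2.0 + sin(time * 0.4 + pos.z * 1.5);
    float zPeriod = 2.0 + cos(time * 0.3 + pos.x * 1.5);

    // Height contributions from the x and z directions
    float xOffset = sin(pos.x * xPeriod + time);
    float zOffset = sin(pos.z * zPeriod + time * 1.3);

    // Combined vertical offset, kept within ±maxAmp
    params.geometry().set_model_position_offset(
        float3(0, maxAmp * 0.5 * (xOffset + zOffset), 0)
    );
}
```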

The two values xPeriod and zPeriod dictate the distance between waves along the x and z axes. I set them to be non-constant, varying based on time and position in the xz-plane; without this, the waves would be far too uniform.

xOffset and zOffset are the wave offsets at a given position, based once again on time and position along the x and z axes. I would encourage you to play around with these values if they seem alien.

Once again, here’s how to apply that to our material:

// Fetch the default metal library
let mtlLibrary = MTLCreateSystemDefaultDevice()!.makeDefaultLibrary()!
let geometryShader = CustomMaterial.GeometryModifier(
    named: "waveMotion", in: mtlLibrary
)

modelEnt.model?.materials = modelEnt.model?.materials.map {
    try! CustomMaterial(from: $0, geometryModifier: geometryShader)
} ?? [Material]()

We can see from the edges that there are waves happening, but because the lighting is so uniform, we can’t properly see any movement in the middle.

The next shader we will apply to this geometry is a surface shader. The logic will be very simple: depending on the height of each point on the mesh, we will set the base colour to a value moving from a nice ocean blue up to white. Here’s the shader I’ve ended up with:
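What follows is a sketch of that surface shader, assuming the function name waveSurface used later in this post; the height-to-colour mapping matches the explanation below, while the exact blue tint and maxAmp value are illustrative:

```metal
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

constant float maxAmp = 0.03;

[[visible]]
void waveSurface(realitykit::surface_parameters params)
{
    // Vertex height ranges from -maxAmp to +maxAmp
    float y = params.geometry().model_position().y;

    // Map the height to the range 0...1
    float brightness = (y + maxAmp) / (maxAmp * 2);
    // Raise to the power of 8 so only the wave crests turn white
    brightness = pow(brightness, 8.0);

    // Blend from ocean blue up to white
    half3 oceanBlue = half3(0.02, 0.3, 0.5);
    half3 colour = mix(oceanBlue, half3(1.0), half(brightness));

    params.surface().set_base_color(colour);
    params.surface().set_roughness(half(0.9));
}
```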

The value of params.geometry().model_position().y ranges from -maxAmp to +maxAmp, so I add maxAmp and then divide by maxAmp * 2 to get a value ranging from 0 to 1. I then raise this result to the power of 8 to make it non-linear, so that only the highest values ramp up quickly to white.

To apply both the geometry and the surface shader to our model, we must apply them at the same time like so:

// Fetch the default metal library
let mtlLibrary = MTLCreateSystemDefaultDevice()!.makeDefaultLibrary()!
// Fetch the "waveMotion" geometry modifier
let geometryShader = CustomMaterial.GeometryModifier(
    named: "waveMotion", in: mtlLibrary
)
// Fetch the "waveSurface" surface shader
let surfaceShader = CustomMaterial.SurfaceShader(
    named: "waveSurface", in: mtlLibrary
)
// Apply both to the material
modelEnt.model?.materials = modelEnt.model?.materials.map {
    try! CustomMaterial(
        from: $0,
        surfaceShader: surfaceShader,
        geometryModifier: geometryShader
    )
} ?? [Material]()


This has been a very brief introduction to some of the things that are possible with RealityKit shaders. I hope you find some inspiration in these examples, and that they help you when implementing shaders in your own applications.

For more information, follow me here on Medium, Twitter, or GitHub, as I’m frequently posting new content on those platforms, including open-source Swift packages specifically for RealityKit!

Also leave some claps if you’re feeling excited by WWDC’s new features in RealityKit this year!



