ARKit — Geometry, Materials, Nodes, Gestures. Oh My!

This chapter is part of my book “ARKit for iOS Developers”. Get the book now and start building amazing augmented reality applications.

In the last chapter you learned about the components involved in creating the default ARKit application. You also witnessed a spaceship floating right in front of you. Hopefully, you have informed NASA about their missing spaceship!

In this chapter you are going to learn how to create virtual objects with different geometries, decorate them with materials, and finally add them to the real world as SceneKit node objects. You will also learn how to interact with your virtual objects using gestures and hit testing. Let’s learn some Pythagorean theorem… no, not really!

Geometry:

Geometry represents the wireframe of an object, which corresponds to its shape. The SceneKit framework supports many geometries out of the box, some of which are shown here.

SCNPlane: A rectangular one sided geometry of specified width and height

SCNSphere: A sphere (or ball or globe) geometry

SCNBox: A six-sided polyhedron geometry whose faces are all rectangles, optionally with rounded edges and corners

SCNPyramid: A right rectangular pyramid geometry

SCNTube: A tube or pipe geometry. A right circular cylinder with a circular hole along its central axis

Each class represents a particular shape, and geometries can be combined to create more complicated shapes.
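As a quick sketch of what these look like in code (the dimensions below are placeholder values, not from this chapter), each geometry class has an initializer that takes its defining measurements:

```swift
import SceneKit

// A flat, one-sided plane, 10 cm wide and 20 cm tall (SceneKit units are meters).
let plane = SCNPlane(width: 0.1, height: 0.2)

// A sphere with a 15 cm radius.
let sphere = SCNSphere(radius: 0.15)

// A pyramid with a rectangular base.
let pyramid = SCNPyramid(width: 0.1, height: 0.15, length: 0.1)

// A tube: a cylinder with a circular hole along its central axis.
let tube = SCNTube(innerRadius: 0.03, outerRadius: 0.05, height: 0.2)
```

Each of these geometries can then be wrapped in a node, as you will see in the next sections.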

Material:

By default, a geometry is just a wireframe. You can think of geometry as a human skeleton; the material is how you decorate the skeleton with flesh. A material can be a color, an image, or even a video. Make sure you are not sitting close to cannibals while reading this section, or they might get the wrong idea.

Node:

A node represents an object that can be added to the scene. Anything added to the scene can be seen by the user, provided that it is not explicitly hidden. You can think of a node as a single element on the screen. If you are building a car racing game, then a node can represent a single car. If you are building a spaceship game, then one node can be the spaceship and a separate node can represent a missile. Nodes can also be composite, meaning they can contain other nodes. You can think of a node representing a large truck, with child nodes representing the trailer, tires, load, etc.
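The truck example can be sketched as a small node hierarchy (the geometry sizes and positions here are placeholder values, not from this chapter). Child node positions are relative to the parent, so moving the parent moves everything inside it:

```swift
import SceneKit

// Parent node representing the whole truck. It has no geometry of its own.
let truckNode = SCNNode()

// Child node for the trailer, positioned relative to the truck.
let trailerNode = SCNNode(geometry: SCNBox(width: 0.4, height: 0.2, length: 0.2, chamferRadius: 0))
trailerNode.position = SCNVector3(0, 0.1, 0)
truckNode.addChildNode(trailerNode)

// Child node for one tire, also positioned relative to the truck.
let tireNode = SCNNode(geometry: SCNTube(innerRadius: 0.01, outerRadius: 0.04, height: 0.02))
tireNode.position = SCNVector3(0.15, 0, 0.1)
truckNode.addChildNode(tireNode)

// Moving the parent node moves the trailer and tire along with it.
truckNode.position = SCNVector3(0, 0, -1)
```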

Adding Shapes to the Real World:

Let’s start by adding a few shapes to our augmented reality world. Create a new Augmented Reality project in Xcode 9 and add the code in Listing 1 to the viewDidLoad function.

override func viewDidLoad() {
    super.viewDidLoad()
    // Set the view's delegate
    sceneView.delegate = self
    // Create a new scene
    let scene = SCNScene()
    let box = SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0)
    let boxNode = SCNNode(geometry: box)
    boxNode.position = SCNVector3(0, 0, -0.5)
    scene.rootNode.addChildNode(boxNode)
    // Set the scene to the view
    sceneView.scene = scene
}

Listing 1: Adding a Box to the Scene

We start by creating an SCNBox geometry, which represents the cube shape we are building. We specified the width, height, length and chamferRadius of the box. The chamfer radius represents the corner radius of the box; we have set it to 0, indicating that we don’t want rounded corners.

NOTE: The unit of measure when working with SceneKit is meters. This means that the dimensions and the positions you are going to set will be calculated in terms of meters.

Next, we create an SCNNode object, which takes the geometry as a parameter. This means that when we add our node to the scene it will act like a box, since we have specified a box geometry. We set the value of the z-axis to -0.5 meters, which represents how far away from us the object will be placed. Finally, we add the boxNode as a child of the scene’s rootNode. This allows the node to be visible on the screen. Figure 1 shows the box being displayed in the real world.

Figure 1: Adding a Box to the Scene

Amazing and simple right!


At present our box is not using any material and that is why it appears to be white in color. In order to change the appearance of the box we need to decorate it with a material. Listing 2 shows the implementation where we have created a material using the SCNMaterial class and applied it to all the sides of the box.

override func viewDidLoad() {
    super.viewDidLoad()
    // Set the view's delegate
    sceneView.delegate = self
    // Create a new scene
    let scene = SCNScene()
    let box = SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0)
    let material = SCNMaterial()
    material.diffuse.contents = UIColor.red
    box.materials = [material]
    let boxNode = SCNNode(geometry: box)
    boxNode.position = SCNVector3(0, 0, -0.5)
    scene.rootNode.addChildNode(boxNode)
    // Set the scene to the view
    sceneView.scene = scene
}

Listing 2: Applying Material to the Box

The diffuse property of the material controls the base color of the surface — the color and texture you see where light falls on the object.
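The diffuse property is only one of several visual properties available on SCNMaterial. As a brief sketch (this goes beyond what the chapter uses), the specular property controls the color of shiny highlights:

```swift
import SceneKit
import UIKit

let material = SCNMaterial()
// Base color of the surface where light falls on it.
material.diffuse.contents = UIColor.red
// Color of the shiny highlight reflected toward the viewer.
material.specular.contents = UIColor.white
```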

Figure 2: Setting Geometry Material

This is great! But usually in augmented reality applications you want to make the virtual items look more like real items, so they can easily blend into the real world. A red color is great, but replacing it with a real-world texture would be even better.

The following snippet shows how an image can be used as a material instead of a color.

material.diffuse.contents = UIImage(named: "brick.png")

Instead of specifying a color we are specifying an image which will be used as the material for the box. Figure 3 shows the result of using an image as the material.

Figure 3: Setting Image as Geometry Material

When applying an image as a texture, make sure that the texture blends in nicely with the environment, giving the illusion that it is part of the real world.

Adding Touch Events:

In this section we are going to learn how to add texture to the SCNSphere shape and allow the user to cycle through a list of textures when they touch the sphere.

Similar to the box example discussed earlier, we are going to start with a geometry. This time we are going to use the SCNSphere class, which allows us to create virtual objects with a sphere geometry. Listing 3 shows the implementation to create a sphere node and add it to the scene.

let sphere = SCNSphere(radius: 0.3)
let sphereNode = SCNNode(geometry: sphere)
sphereNode.position = SCNVector3(0, 0, -0.5)
scene.rootNode.addChildNode(sphereNode)

Listing 3: Creating a Sphere

We specified the radius of the sphere in meters and then assigned the geometry to the sphere node. Finally, the sphereNode object was added to the scene. Since we have not set a material on the sphere node, it comes out as a white object. We are going to decorate our sphere with a texture that represents our beautiful planet, Earth. Listing 4 shows how to decorate the sphere with an image used as a material.

let sphere = SCNSphere(radius: 0.3)
let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "earth.jpg")
sphere.materials = [material]
let sphereNode = SCNNode(geometry: sphere)
sphereNode.position = SCNVector3(0, 0, -0.5)
scene.rootNode.addChildNode(sphereNode)

Listing 4: Creating a Sphere Geometry with Earth Material

Go ahead and run the app and be amazed!

Figure 4: Setting Texture for Sphere Geometry

Next, we will add a tap gesture to our planet. This will allow us to cycle through different textures and update the sphere to reflect a new planet. The only reason we are doing this is to keep our neighbors happy so they don’t invade our beautiful planet.

The first step is to register the tap gesture with the scene view. This can be accomplished using the UITapGestureRecognizer class, as shown in Listing 5.

private func registerGestureRecognizers() {
    let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(tapped))
    self.sceneView.addGestureRecognizer(tapGestureRecognizer)
}

Listing 5: Registering a Tap Gesture for the Scene View

The registerGestureRecognizers function is called from inside the viewDidLoad function, which makes sure that the gestures are registered as soon as the view is loaded. For every tap event, the tapped function will be called. The tapped function is where we will write the code to check whether the user has tapped on the virtual object or not.
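The call site can be sketched as follows. This is a fragment of the same view controller shown in Listing 2, not a standalone program, with the unrelated scene-setup lines omitted:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    // ... scene setup from Listing 2 ...
    // Register gestures once the view has loaded.
    registerGestureRecognizers()
}
```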

The implementation in Listing 6 shows how to detect whether the user’s touch has intersected with a node in the scene view.

@objc func tapped(recognizer: UITapGestureRecognizer) {
    let sceneView = recognizer.view as! ARSCNView
    let touchLocation = recognizer.location(in: sceneView)
    let hitResults = sceneView.hitTest(touchLocation, options: [:])
    if !hitResults.isEmpty {
        // this means a node has been touched
    }
}

Listing 6: Hit Test Detection

We begin by getting the touch location in 2D space. This is accomplished by the recognizer.location(in:) function, which takes the view as an argument and returns a CGPoint structure as the touch location. Next, we use the ARSCNView class’s hitTest method, which takes the touch location as a CGPoint and returns results representing the objects found along the line segment extending from that point.

Next, we need to access the node that intersected with the line segment. This is easy, because the SCNHitTestResult instance has a property called node, which refers to the intersected node. Once we have the node we can simply change the material it uses, and boom — we are done. Listing 7 shows the updated implementation of the tapped function.

@objc func tapped(recognizer: UITapGestureRecognizer) {
    let sceneView = recognizer.view as! ARSCNView
    let touchLocation = recognizer.location(in: sceneView)
    let hitResults = sceneView.hitTest(touchLocation, options: [:])
    if !hitResults.isEmpty {
        if index == self.textures.count {
            index = 0
        }
        guard let hitResult = hitResults.first else { return }
        let node = hitResult.node
        node.geometry?.firstMaterial?.diffuse.contents = UIImage(named: textures[index])
        index += 1
    }
}

Listing 7: Changing Virtual Object Material on Touch Gesture
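Listing 7 references an index counter and a textures array that are not declared in the chapter; they are presumably stored properties on the view controller. A minimal sketch, with hypothetical texture file names — replace them with image assets in your own project:

```swift
// Hypothetical texture file names bundled with the app.
let textures = ["earth.jpg", "mars.jpg", "venus.jpg"]
// Tracks which texture will be applied on the next tap;
// Listing 7 wraps it back to 0 once it reaches textures.count.
var index = 0
```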

Go ahead and run the app! The app will start by displaying our beautiful planet Earth in the real world. If you touch the planet, you will notice that the material changes to reflect the next planet in the textures array. Make sure you don’t touch Venus — you might burn your hand because of its blazing hot temperature.

Conclusion:

In this chapter we learned how to use geometry to create different shapes. We decorated the wireframe represented by the geometry with materials to give it a more realistic feel. Finally, we used gestures and hit testing to interact with our virtual objects in the real world.

iOS Developer, speaker and educator. Top Udemy and LinkedIn instructor. Lead instructor at DigitalCrafts. https://www.udemy.com/user/mohammad-azam-2/
