Before diving into the wonderful world of augmented reality, let’s make sure that we have all our prerequisites in place.

  • Xcode 9 (beta or above): ARKit is available in Xcode 9 (beta) or above, so make sure to download the latest available version of Xcode 9.
  • Physical Device: You will need a physical device to test your ARKit applications. It is recommended that the testing device has at least an A9 processor, which means an iPhone 6S or above, or the latest iPad.
  • iOS 11 (beta or above): Since ARKit is part of the iOS 11 SDK, your device must be running iOS 11 beta or later.

Depending on your internet connection, it might take anywhere from 10 minutes to 45 days to download and install everything. I am kidding, 30 days maximum :)

Now that you have all the prerequisites downloaded and installed, you are ready to get your first taste of building an augmented reality application using ARKit for iOS. By the end of this chapter you will know the basic concepts behind implementing ARKit applications.

Launch Xcode 9 and create a new project. You will notice that Xcode 9 comes with a new project template specifically designed for ARKit applications as shown in Figure 1.

Figure 1: Augmented Reality Project Template in Xcode 9

After selecting the “Augmented Reality App” template, press the “Next” button. This will take you to the project options screen, which allows you to configure different attributes of the project. The most important option for us is “Content Technology”, which lets developers create ARKit applications using either the SpriteKit, SceneKit, or Metal framework.

Figure 2: Support for Multiple Content Technologies

Make sure you select SceneKit, as most of the examples in this book use SceneKit as the content technology. Next, specify a location for your project and finally press the “Create” button to bring your ARKit project to life.

You are only moments away from witnessing the magic of augmented reality. Make sure that your physical iPhone is plugged in and connected, then run the app. Figure 3 shows the ARKit default application in action. Your garden might not be as green as mine :)

Figure 3: ARKit Default App in Action

Houston! We have ARKit!


Congratulations on running your first ARKit application. In the next section we are going to understand the code that makes up the default ARKit project.

Understanding the Project:

In this section we are going to look at the different parts of the default ARKit project and examine the classes involved in making the augmented reality experience possible.

Let’s dive right into the code that Xcode adds by default. Listing 1 shows the implementation of the viewDidLoad function of the ViewController.

override func viewDidLoad() {
    super.viewDidLoad()

    // Set the view's delegate
    sceneView.delegate = self

    // Show statistics such as fps and timing information
    sceneView.showsStatistics = true

    // Create a new scene
    let scene = SCNScene(named: "art.scnassets/ship.scn")!

    // Set the scene to the view
    sceneView.scene = scene
}

Listing 1: viewDidLoad function of the ViewController

Inside viewDidLoad we first set the scene view’s delegate to the instance of the controller. Then we enable the statistics property, which allows us to view debugging information such as the frame rate.

After that we load the SceneKit scene called “ship.scn”, which is contained in the assets folder. This scene contains the actual model of the spaceship. Finally, we set the scene view’s scene property to the loaded scene.
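Note that the template force-unwraps the result of SCNScene(named:). If you prefer a defensive variant, a minimal sketch of the same loading step could look like this:

// Load the scene defensively instead of force-unwrapping
if let scene = SCNScene(named: "art.scnassets/ship.scn") {
    sceneView.scene = scene
} else {
    print("Could not load art.scnassets/ship.scn")
}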

The most important responsibility of the view controller is to configure and initialize ARKit world tracking. World tracking follows the device’s position and orientation in the real world so that virtual items stay anchored where we place them, and it is considered the brains behind the workings of the ARKit framework.

World tracking is enabled inside the viewWillAppear function as shown in Listing 2.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a session configuration
    let configuration = ARWorldTrackingConfiguration()

    // Run the view's session
    sceneView.session.run(configuration)
}

Listing 2: Running world tracking
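The generated project also pauses the session when the view controller goes off screen. A minimal sketch of that counterpart, assuming the same sceneView outlet:

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the view's session while the view is not visible
    sceneView.session.pause()
}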

NOTE: Apart from ARWorldTrackingConfiguration, Apple also provides ARConfiguration, which is meant to provide a less immersive AR experience for devices below the iPhone 6S that do not have an A9 processor. Unfortunately, at the time of this writing ARConfiguration does not work as expected, and the app crashes when run with that configuration on non-A9 devices.
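If you want to guard against that crash, one option (not part of the default template) is to ask ARKit whether the configuration is supported before running the session. A minimal sketch, assuming it replaces the body of viewWillAppear from Listing 2:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Only run world tracking when the current device supports it (A9 or later)
    if ARWorldTrackingConfiguration.isSupported {
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    } else {
        // Fall back gracefully, for example by showing a message to the user
        print("World tracking is not supported on this device.")
    }
}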

At this point you might be wondering where in the world sceneView comes from. Open your Main.storyboard file and take a look at your view controller, as shown in Figure 4.

Figure 4: ARSCNView as the Root View of the Controller

The root view of your view controller is not a plain UIView but an ARSCNView. ARSCNView is a special kind of view that displays augmented reality content for SceneKit-based applications. The sceneView instance is connected to the storyboard through an outlet. If you were using SpriteKit as the content technology for your ARKit project, it would be an ARSKView instead of an ARSCNView.
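For reference, here is roughly how the outlet and delegate conformance are declared in ViewController.swift of the default template (a sketch; your generated file may differ slightly):

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    // Connected to the ARSCNView in Main.storyboard via Interface Builder
    @IBOutlet var sceneView: ARSCNView!
}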

In the next chapter we are going to look into creating virtual objects using geometry, decorating them using materials and finally inserting them into the world using nodes.
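As a small taste of what is coming, here is a minimal sketch of my own (not part of the default project) that creates a box geometry, decorates it with a material, wraps it in a node, and places it half a meter in front of where the session starts:

// geometry -> material -> node, then add the node to the scene
let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)

let material = SCNMaterial()
material.diffuse.contents = UIColor.red
box.materials = [material]

let boxNode = SCNNode(geometry: box)
boxNode.position = SCNVector3(0, 0, -0.5) // half a meter in front of the session origin
sceneView.scene.rootNode.addChildNode(boxNode)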

Conclusion:

In this chapter we learned about running our first augmented reality application using ARKit. We looked at the different components of the default ARKit application and learned about the purpose each component serves in an ARKit app.
