Overview
- In iOS 12, Apple introduced ARKit 2.0 with exciting AR features such as multiuser AR experiences, 2D image detection, and 3D object detection.
- In this tutorial, I will show you how to scan a real-world object using Apple's demo app and create an object reference file, then use that object file in our own app to detect the object.
- In the first part, Image recognition and tracking using ARKit 2, we covered image detection and tracking, which lets you detect and track images that you add to the app. That is limited to images in two dimensions; in this tutorial we demonstrate how to detect a 3D real-world object in your ARKit app.
Prerequisites:
- Xcode 10 (beta or above)
- iOS 12 (beta or above)
- iPhone 6S (Apple A9 chip or above)
- Object reference files (.arobject files of your real objects)
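- Before jumping in, you can also verify support at runtime. The small sketch below is my own addition (not part of Apple's demo); it checks ARWorldTrackingConfiguration.isSupported, which is false on devices older than the A9 chip.
import ARKit

// Minimal runtime check: world tracking (and therefore 3D object detection)
// is unavailable on devices without an A9 chip or newer.
func verifyDeviceSupport() -> Bool {
    guard ARWorldTrackingConfiguration.isSupported else {
        print("This device does not support ARKit world tracking.")
        return false
    }
    return true
}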
How to get or create an object reference file for your real-world object:
- There are two ways to create an .arobject file:
- Create a separate app for scanning real-world objects; Apple provides an API for this (see the sketch at the end of this section).
- Use Apple's demo app to quickly scan your object and export the file.
- Apple's demo app, Scanning and Detecting 3D Objects, lets you quickly scan your real-world object and use the result in your own app.
- Download and run this demo on a real device (iPhone 6S or above).
- Before scanning an object, you should know which objects can be scanned easily; below are examples of good and poor objects.
- Metallic, transparent, refractive, and glass-like materials do not work well.
- Rigid, texture-rich, non-reflective, non-transparent objects are good candidates for detection. Also keep in mind that your environment needs good lighting for scanning and detecting objects.
- How to scan an object using Apple's demo app:
- See the video below.
- After scanning the object, test it in the demo, then share the .arobject file to your Mac via AirDrop and use it in your 3D object detection app.
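- If you prefer the first option above and want to build the scanning step into your own app, the rough flow is sketched below. It assumes an ARSCNView is already on screen, and the bounding-box values (scanTransform, scanCenter, scanExtent) are hypothetical placeholders that a real scanner would let the user adjust; only ARObjectScanningConfiguration, createReferenceObject, and export(to:previewImage:) are ARKit API.
import ARKit

// A rough sketch of the "build your own scanner" option. The bounding-box values
// (scanTransform, scanCenter, scanExtent) are hypothetical placeholders; a real
// scanner would let the user position and resize the box interactively.
func startScanning(in sceneView: ARSCNView) {
    let configuration = ARObjectScanningConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration, options: .resetTracking)
}

func exportScannedObject(from session: ARSession,
                         scanTransform: simd_float4x4,
                         scanCenter: simd_float3,
                         scanExtent: simd_float3) {
    // Ask ARKit to build an ARReferenceObject from the scanned region.
    session.createReferenceObject(transform: scanTransform,
                                  center: scanCenter,
                                  extent: scanExtent) { referenceObject, error in
        guard let referenceObject = referenceObject else {
            print("Scan failed: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        // Write the .arobject file, then share it (e.g. via AirDrop).
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("MyObject.arobject")
        do {
            try referenceObject.export(to: url, previewImage: nil)
            print("Exported reference object to \(url)")
        } catch {
            print("Export failed: \(error)")
        }
    }
}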
How to use the object reference file in your app?
- Create a new project and select the Augmented Reality App template.
- Add the ARObject file to your app: select Assets.xcassets and tap the (+) plus button at the bottom of the screen.
- Now select New AR Resource Group, rename it to gallery, then drag and drop your ARObject reference file into it.
- You can also add multiple ARObject files to one AR Resource Group.
- See the video below (video file name: add_object_file).
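- As an alternative to the asset catalog, you can load an .arobject file directly from a URL, for example one bundled with the app or downloaded at runtime. The file name gallery_object below is just an example, not something the demo produces; ARReferenceObject(archiveURL:) is the ARKit initializer for this.
import ARKit

// Sketch: load a reference object straight from a bundled .arobject file
// instead of an AR Resource Group. "gallery_object" is an assumed file name.
func loadReferenceObjectFromBundle() -> ARReferenceObject? {
    guard let url = Bundle.main.url(forResource: "gallery_object", withExtension: "arobject") else {
        return nil
    }
    return try? ARReferenceObject(archiveURL: url)
}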
Detect the real-world object:
- Create an ARWorldTrackingConfiguration object and load your gallery asset group.
- Assign the reference objects to the configuration's detectionObjects property and run the session.
let configuration = ARWorldTrackingConfiguration()

// Load the reference objects from the "gallery" AR Resource Group.
guard let refObjects = ARReferenceObject.referenceObjects(inGroupNamed: "gallery", bundle: nil) else {
    fatalError("Missing expected asset catalog resources.")
}

// Tell ARKit which objects to look for, then start the session.
configuration.detectionObjects = refObjects
sceneView.session.run(configuration)
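- One thing the snippet above assumes: the delegate callback used in the next step only fires if the view controller adopts ARSCNViewDelegate and is set as the scene view's delegate. The Augmented Reality App template already does this; a minimal version looks like the sketch below.
import UIKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Without this, renderer(_:didAdd:for:) below is never called.
        sceneView.delegate = self
    }
}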
- Now it is time to scan for your object. When the object is successfully detected, the delegate method below is called.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let objectAnchor = anchor as? ARObjectAnchor {
        // Object successfully detected.
    }
}
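- If your resource group contains more than one .arobject, the anchor also tells you which reference object was matched. The properties used below (referenceObject.name and referenceObject.extent) are ARKit's own; the logging is just for illustration.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let objectAnchor = anchor as? ARObjectAnchor {
        // The name assigned in the asset catalog (nil if none was set).
        let detectedName = objectAnchor.referenceObject.name ?? "unknown object"
        // The physical size of the scanned object, in meters.
        let extent = objectAnchor.referenceObject.extent
        print("Detected \(detectedName), extent: \(extent)")
    }
}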
Create an AR interaction:
- Now let's interact with the detected object: we will show an arrow on top of the object to indicate "here is your object."
- Add the arrow .scn file to the art.scnassets folder and load it when the object is detected.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let objectAnchor = anchor as? ARObjectAnchor {
        // Take the anchor's world position from the last column of its transform.
        let translation = objectAnchor.transform.columns.3
        let pos = float3(translation.x, translation.y, translation.z)

        // Load the arrow and place it at the detected object's position.
        let nodeArrow = getArrowNode()
        nodeArrow.position = SCNVector3(pos)
        sceneView.scene.rootNode.addChildNode(nodeArrow)
    }
}

func getArrowNode() -> SCNNode {
    let sceneURL = Bundle.main.url(forResource: "arrow_yellow", withExtension: "scn", subdirectory: "art.scnassets")!
    let referenceNode = SCNReferenceNode(url: sceneURL)!
    referenceNode.load()
    return referenceNode
}
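- A variant worth considering: instead of adding the arrow to the scene's root node at the anchor's world position, you can attach it to the node ARKit created for the anchor, so it keeps following the object if ARKit refines the anchor over time. The sketch below reuses the same getArrowNode() helper and uses the reference object's center and extent to float the arrow just above the bounding box; the 0.05 m offset is an arbitrary choice.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let objectAnchor = anchor as? ARObjectAnchor else { return }

    let nodeArrow = getArrowNode()
    // Position the arrow above the top of the object's bounding box,
    // in the anchor's local coordinate space (meters).
    let center = objectAnchor.referenceObject.center
    let extent = objectAnchor.referenceObject.extent
    nodeArrow.position = SCNVector3(center.x, center.y + extent.y / 2 + 0.05, center.z)
    // Child nodes of the anchor's node move with the anchor automatically.
    node.addChildNode(nodeArrow)
}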