A Weekend of ARKit Hacking
This post is way overdue. Whoops.
I spent the 4th of July weekend trying to mash two of my favorite things together:
- Programming 💻
- Running 🏃
There’s a “Race Condition” pun somewhere in there. The end result was a “running pacer”: a virtual object placed on a real track to help you maintain your speed while running.
tl;dr ARKit is 🔥
How is pacing tracked now?
As a non-professional runner, you rely heavily on your watch/phone to report your current performance.
I use a Garmin Forerunner 735xt to keep tabs on my performance. It uses GPS to determine my distance traversed, and as a result, pace. That information is helpful, especially mile to mile, but what does an increase in pace actually look like?
Nike and Puma have taken a swing at this:
This seemed like a cool place to try out ARKit, and here’s the end result. Disclaimer: It’s insane to run while staring through your phone.
How it was done in ARKit
There are three critical components in a basic ARKit experience.
- ARSession: Coordinates the AR processing and holds the configuration for how ARKit “sees the world.”
- ARPlaneAnchor: A plane (flat surface) that ARKit recognizes.
- SceneKit: Rendering/Animating 3D Objects
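Wiring these together is only a few lines. As a minimal sketch (assuming an `ARSCNView` already in the view hierarchy), a world-tracking configuration with horizontal plane detection tells ARKit to look for flat surfaces like a track:

```swift
import ARKit

// ARSCNView bundles an ARSession with a SceneKit renderer.
let sceneView = ARSCNView(frame: .zero)

// World tracking gives full 6-degrees-of-freedom motion tracking;
// .horizontal asks ARKit to detect flat, level surfaces.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal

// Start the session with that configuration.
sceneView.session.run(configuration)
```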
After reading through the docs and seeing some examples, I was able to assemble this fairly quickly.
ARKit basically creates an X, Y, Z coordinate system in the real world. It fires events for when it “detects” an ARAnchor, a fixed point and orientation. An ARPlaneAnchor inherits from this, and represents a flat surface that ARKit has detected.
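The detection events arrive through the scene view’s delegate. A sketch of that callback (assuming a `ViewController` set as the `ARSCNView`’s delegate):

```swift
import ARKit

extension ViewController: ARSCNViewDelegate {
    // ARKit calls this when it adds a node for a newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        // Filter down to flat surfaces.
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected a plane at \(planeAnchor.center), extent \(planeAnchor.extent)")
    }
}
```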
When the application detects an anchor, it places an object via SceneKit relative to that anchor. This means the object is parallel to the plane, and any X, Y, Z transformations to the object occur relative to this plane.
Since a running track is flat, this would enable the “pacer” to run parallel to the track, rather than above or into it.
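Attaching the pacer as a child of the node ARKit created for the plane anchor gets this behavior for free, since child transforms are relative to the parent. A sketch (the geometry and dimensions here are illustrative, not the app’s actual model):

```swift
import ARKit

// Place a roughly person-sized box on the detected plane.
func placePacer(on node: SCNNode, anchor: ARPlaneAnchor) {
    let height: Float = 1.7
    let pacer = SCNNode(geometry: SCNBox(width: 0.3,
                                         height: CGFloat(height),
                                         length: 0.3,
                                         chamferRadius: 0))
    // Center it on the plane, lifted by half its height so it
    // rests on the surface instead of intersecting it.
    pacer.position = SCNVector3(anchor.center.x,
                                height / 2,
                                anchor.center.z)
    node.addChildNode(pacer)
}
```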
I plan to spend more time on the actual experience, but for now, when the user hits “Start,” the pacer moves at a fixed speed a total of 100m.
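The movement itself is a single SceneKit action. A sketch, assuming `pacerNode` is the pacer placed earlier and using an illustrative 4:00/km pace (about 4.17 m/s):

```swift
import SceneKit

let paceMetersPerSecond: Float = 4.17
let distance: Float = 100

// Constant speed over 100m: duration = distance / speed.
let duration = TimeInterval(distance / paceMetersPerSecond)
let run = SCNAction.move(by: SCNVector3(0, 0, -distance),
                         duration: duration)
pacerNode.runAction(run)
```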
Misc Thoughts on ARKit
- The ease of integration with SceneKit stood out the most. Apple knows how big its game ecosystem is, and the easier it makes ARKit for that community to leverage, the more adoption it’ll see.
- This will be 10x better on the future “Apple Glasses.” Holding a phone is wonky, especially when it demands significant movement. But as iOS 11 matures and more apps begin using ARKit, the feedback will help improve ARKit so it’s ready by the time the glasses arrive.
This was originally posted to Medium.