Haptrix — Core Haptics Designer


Chris Davis

Apple introduced Core Haptics for iOS 13 back at WWDC 2019. It allows you to create haptic patterns for your apps.

TL;DR
In this blog post I’ll cover how & why I built an editor for it.

WWDC

I’ve never been to WWDC; one day I’d like to go, but in the meantime the videos are great. One that caught my eye this year was Core Haptics. I’d tried to work with custom vibrations on the iPhone before, but all I ended up with was hacks (more on that at the bottom of this post).

I was curious, so I downloaded the betas and tried to create a custom pattern. You had two choices:

  1. Use the API
  2. Use a JSON file that describes the pattern (more on that here)

I played with creating a simple pattern with the API. It works well, but I couldn’t imagine myself using the API to create a pattern every time I wanted to make a change.

I looked at the JSON format and created a simple pattern. Again, there was no way I was going to hand-edit this file; some of Apple’s example files are hundreds of lines long.
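For a sense of what that JSON looks like, here’s a sketch of a minimal pattern file: a single transient tap. The key names follow Apple’s AHAP file format documentation; the values are illustrative.

```json
{
  "Version": 1.0,
  "Pattern": [
    {
      "Event": {
        "Time": 0.0,
        "EventType": "HapticTransient",
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 1.0 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.5 }
        ]
      }
    }
  ]
}
```

Even this tiny one-tap pattern is fifteen lines; you can see how a real pattern balloons quickly.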

What should I do?

Core Haptics can be used by user-experience designers, accessibility experts, and game developers.

They shouldn’t have to know the API inside-out to create something.

Making an Editor

It would be great if there was a tool where you could draw the pattern you wanted on screen, and then test it out…

Requirements:

  • Obvious to pick up and use
  • Patterns should be fast to put together
  • Fast to edit properties
  • Easy to visualise what a pattern is doing without running it
  • Able to experience the pattern running on iOS

Controls

Core Haptics supports multiple types of commands: Haptic, Audio, Curves, and Dynamic Parameters. All of them are supported in Haptrix.

Haptic Events

With haptic events, you can experience a long vibration or a short kick.
These are the HapticContinuous and HapticTransient event types.
The main difference between the two is that HapticContinuous has a duration.
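In AHAP terms, the two event types look like this; note that only the continuous event carries an EventDuration. A sketch with illustrative values:

```json
{
  "Version": 1.0,
  "Pattern": [
    { "Event": {
        "Time": 0.0,
        "EventType": "HapticTransient",
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 1.0 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.8 }
        ]
    } },
    { "Event": {
        "Time": 0.25,
        "EventType": "HapticContinuous",
        "EventDuration": 1.0,
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 0.6 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.3 }
        ]
    } }
  ]
}
```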


Audio Events

AudioCustom is the event I use all the time. You can use it to play an audio clip, and because of the other tools Haptrix supports, you can craft the sound.

For example, you can play a file but then suddenly change its pitch. Because Haptrix uses a layer system, we can also add haptic events that happen at the same time.

A “knock-knock, who’s there” sound effect can easily be accompanied by HapticTransient events so you can feel the “knocks”.
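Layered in AHAP, that might look like the sketch below: an AudioCustom event referencing a clip, plus two transients lined up with the knocks. The file name `knock.wav` and the timings are made up for illustration.

```json
{
  "Version": 1.0,
  "Pattern": [
    { "Event": {
        "Time": 0.0,
        "EventType": "AudioCustom",
        "EventWaveformPath": "knock.wav",
        "EventParameters": [
          { "ParameterID": "AudioVolume", "ParameterValue": 1.0 }
        ]
    } },
    { "Event": {
        "Time": 0.0,
        "EventType": "HapticTransient",
        "EventParameters": [ { "ParameterID": "HapticIntensity", "ParameterValue": 1.0 } ]
    } },
    { "Event": {
        "Time": 0.2,
        "EventType": "HapticTransient",
        "EventParameters": [ { "ParameterID": "HapticIntensity", "ParameterValue": 1.0 } ]
    } }
  ]
}
```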

Dynamic Parameters

Dynamic Parameters change playback instantly at a point in time. I haven’t really found a need for these yet, but they’re there if you need them.
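In the AHAP file, a dynamic parameter is its own entry in the Pattern array; it steps a value at that moment rather than ramping it. A sketch (illustrative values) that drops overall haptic intensity to 30% half a second in:

```json
{ "Parameter": {
    "ParameterID": "HapticIntensityControl",
    "Time": 0.5,
    "ParameterValue": 0.3
} }
```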

Parameter Curves

Curves are a fun tool to use; you have to use them in conjunction with either a haptic or audio event.

A few examples would be:

  • When playing an audio file, fade the sound from the left to the right
  • Change the pitch of a sound (I’m guessing it uses AVAudioEngine for this)
  • Change the volume over time

It’s worth noting here that parameter curve values go from -1.0 to +1.0.
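The left-to-right fade above is a good illustration of that range: written as a curve on AudioPanControl, -1.0 is fully left and +1.0 is fully right. A sketch (it would sit alongside an audio event in the Pattern array; timings are illustrative):

```json
{ "ParameterCurve": {
    "ParameterID": "AudioPanControl",
    "Time": 0.0,
    "ParameterCurveControlPoints": [
      { "Time": 0.0, "ParameterValue": -1.0 },
      { "Time": 2.0, "ParameterValue": 1.0 }
    ]
} }
```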


Attack, Decay, Sustain, Release

Audio engineers will be familiar with ADSR envelopes, and Apple has made Core Haptics support a similar system. Again, Haptrix supports these options.

In this example, you can feel the sound fade in, and then drop off.
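In AHAP, the envelope is expressed as extra event parameters on the event itself. A sketch of a fade-in and drop-off on a continuous haptic event, assuming the AttackTime, DecayTime, Sustained, and ReleaseTime parameter IDs from Apple’s AHAP documentation (values illustrative):

```json
{ "Event": {
    "Time": 0.0,
    "EventType": "HapticContinuous",
    "EventDuration": 1.5,
    "EventParameters": [
      { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
      { "ParameterID": "AttackTime",  "ParameterValue": 0.3 },
      { "ParameterID": "DecayTime",   "ParameterValue": 0.3 },
      { "ParameterID": "Sustained",   "ParameterValue": 0 },
      { "ParameterID": "ReleaseTime", "ParameterValue": 0.4 }
    ]
} }
```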

[Figure: a continuous event with an ADSR envelope applied]

Run on device, macOS to iOS

The app features a ‘Run on device’ command: if you download the Haptrix app on iOS and have the Mac app open, you will be able to connect them together.

This is great because if you edit your pattern, you can instantly run it and experience it on your device.

Countless times I’ve sat there making patterns with my right hand while holding my iPhone in my left: creating, editing, and tweaking patterns, then instantly experiencing the result to see if it feels the way I want.

Syncing is so easy: just open up the app and it will automatically search for Haptrix; pick your Mac, and you’re connected.

When you press Run from the Mac app (the Space bar works too), it syncs to the iOS app and plays automatically. You can replay the pattern by pressing the large button.

Mixing events

I ran into a situation that I could only feel, and I wanted a way to visualise it.

If you have two continuous haptic events with different sharpness, you will feel something in the middle; that’s where the haptics engine is interpolating between the two values over time.

I wanted to show that on screen, as you won’t know it does that unless you feel it in your hand.

For example, the figure below has two non-overlapping events. It’s easy to see that they have different sharpness, but what would it feel like if you overlapped them?

[Figure: two non-overlapping continuous events with different sharpness]

Well, that depends on the sharpness and the duration of the overlap. In this example you’d feel a quick rumble sensation.

Increasing the sharpness affects the feeling; in this example you would get a fast rumble of haptic events just after the start of the pattern.
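If you want to feel the interpolation yourself, a sketch of such an overlap: two continuous events half a second apart, one dull (sharpness 0.1) and one sharp (0.9). The 0.5–1.0 second window is where the engine blends between the two. Values are illustrative.

```json
{
  "Version": 1.0,
  "Pattern": [
    { "Event": {
        "Time": 0.0,
        "EventType": "HapticContinuous",
        "EventDuration": 1.0,
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.1 }
        ]
    } },
    { "Event": {
        "Time": 0.5,
        "EventType": "HapticContinuous",
        "EventDuration": 1.0,
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 0.8 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.9 }
        ]
    } }
  ]
}
```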

Beat Detection

I knew from the start that one feature I wanted was beat detection.

(I knew I wanted it, because that’s what I tried to replicate years before.)

I wanted to be able to import a file and have the app automatically work out where the beat points are in the track.

I spent a few days working on different approaches and found one that works well, using a fast Fourier transform with a sliding window.

For my testing I’ve been using a *.wav file of the Indiana Jones intro, results below:

[Figure: detected beat points for the test track]

General UI

The general UI is in four sections:

  • The top main navigation bar contains a toolbar with mainly drawing controls; these can also be switched with keyboard shortcuts
  • The main prominent view is the editor, where you can add events
  • The bottom is a list of all events, so you can easily scroll through
  • The right is the Inspector, which changes whenever you select an item


and the new coolness: dark mode is supported:


AHAP Files

Files created with Haptrix are *.ahap files; there’s no wrapper required, so you can embed them into your app as-is.
There’s also an export command, if required, with a pretty-print JSON option.

Hardware support

Core Haptics is supported on the iPhone 8 and newer. Of course, I didn’t have a test device that I could use to install iOS 13, but I managed to pick up a second-hand unit for my testing from iOutlet; it works great.

OS Support

Core Haptics is new in iOS 13, and some features I’m using for the communication between iOS 13 and macOS require Catalina (10.15). I tried to support Mojave, but the code is so much simpler on Catalina.

Previous Tools

I like making things, and I’ve made tools like this before: Phonique for AVAudioEngine, VectorFlow for procedural vector editing, and WarpTool for SKWarpGeometry, among many others. Check out my personal site for more: Chris Davis.

How you could do it before Core Haptics

Prior to this, if you wanted a custom vibration, you had to resort to hacks. These aren’t allowed in the App Store, but are mentioned here as an example.

First you had to add a header to access some closed-off APIs.

Then you constructed a dictionary of the intensity and vibration pattern. It’s pretty basic: on for 1000 milliseconds, off for 250, on again, and so on.

It’s great to see something like this opened up.

Conclusion

It’s great to work with new technology. Core Haptics was a challenge to begin with, but playing with it I’ve learnt a lot, and I’ll use it in my next iOS project for sure!

According to my GitHub, I spent 135 days of my spare time working on this app: some weekends, but mostly a couple of hours here and there in the evenings after work. I couldn’t say for certain how many hours it took.

More videos can be found on www.haptrix.com

Haptrix for macOS and iOS are available on the App Store now!