Machine learning can do a lot of nifty things these days, and it's only getting better. With pretty much everyone and their mother sporting a smartphone, machine learning on smartphones is most certainly a match made in heaven.
There has been a lot of talk about machine learning on the “edge”: smartphones and IoT devices running machine learning models directly, without having to ship data to servers and wait for inference results to come back.
Think of an IoT smart camera that can detect the presence of strangers without sending data to servers. There are several wins here: video is a lot of data, so keeping it local takes way less bandwidth; running machine learning locally can mean faster detection; and most importantly, since data never leaves the device, it’s a big win for privacy. If the device happens to be battery operated, then avoiding external network communication can also save a lot of juice.
In this video, we take a look at how to run machine learning inference directly on iOS using CoreML. We will use Keras to build and train a simple model capable of recognizing handwritten digits from the MNIST data set. We will then use the coremltools Python module to convert the saved Keras model into a CoreML model, which we import into Xcode and use in a simple app that lets the user recognize handwritten digits. Although the app is very simple, the important point is the process it demonstrates: a model trained on an external machine is converted and used in an iOS app.
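To make the train-then-convert workflow concrete, here is a rough Python sketch. This is not the repo's actual code: the network architecture, the output file name, and the coremltools call are all illustrative assumptions. The real tutorial trains on the full MNIST set; a tiny random batch stands in here just so the sketch runs quickly, and the conversion step is shown in comments since it needs the coremltools package installed.

```python
import numpy as np
from tensorflow import keras

# A small convolutional classifier for 28x28 grayscale digit images
# (architecture is an assumption, not the one from the repo).
model = keras.Sequential([
    keras.layers.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),  # one class per digit 0-9
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# In the real workflow you would train on MNIST for several epochs:
#   (x_train, y_train), _ = keras.datasets.mnist.load_data()
#   model.fit(x_train[..., None] / 255.0, y_train, epochs=5)
# A tiny random batch stands in here to keep the sketch fast.
x = np.random.rand(16, 28, 28, 1).astype("float32")
y = np.random.randint(0, 10, size=16)
model.fit(x, y, epochs=1, verbose=0)

# Conversion to a CoreML model for Xcode (requires coremltools;
# the file name is illustrative):
#   import coremltools as ct
#   mlmodel = ct.convert(model)
#   mlmodel.save("MNISTClassifier.mlmodel")
```

Once the `.mlmodel` file is dragged into Xcode, Xcode generates a Swift class for it, and the app calls that class to classify what the user draws.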
GitHub Repo: https://github.com/shuveb/iOS-CoreML-MNIST