Iryss: an app that uses Machine Learning to identify objects

There are countless new features to take advantage of in iOS 11, but many of them are under-the-hood APIs that need App Store apps to show their full effect.

Iryss is a new app developed by Florian Hebrard, better known to the jailbreak community as Ziph0n, that takes advantage of Apple’s brand-new CoreML (Machine Learning) API to recognize objects around you in real time.

Using Iryss is as simple as pointing your camera at something. Within milliseconds, the app displays its best guess at what’s in the frame as text on the screen. Alternatively, you can configure it to announce detections aloud.
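For a sense of what happens behind that camera view, here’s a minimal Swift sketch of the Vision + CoreML classification flow that iOS 11 makes possible. The `MobileNet` model class is an assumption for illustration; the actual model Iryss ships with isn’t public.

```swift
import Vision
import CoreML
import CoreVideo

// Minimal sketch: classify one camera frame with a bundled Core ML image model.
// "MobileNet" is a stand-in for whatever model the app actually uses (an assumption).
func classify(pixelBuffer: CVPixelBuffer) {
    guard let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let observations = request.results as? [VNClassificationObservation],
              let top = observations.first else { return }
        // Top guess, e.g. "pill bottle (0.72)" — shown as text or spoken aloud.
        print("\(top.identifier) (\(top.confidence))")
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

In a real-time app, a function like this would be called for each frame coming off an AVCaptureSession, with only the highest-confidence label shown to the user.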

Iryss can recognize both animate and inanimate objects. In the case of animals, the app is smart enough to discern between common breeds or species.

There are a couple of settings you can configure within Iryss to make it work the way you want.

In the app’s preference pane, you can force Iryss to announce what it detects aloud rather than just displaying it as text, and you can also enable haptic feedback to signal a successful identification. Because the app is ad-supported, you’ll also find an in-app purchase here to buy out the ads.
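To give an idea of how those two toggles could plausibly be wired up, iOS provides AVSpeechSynthesizer for spoken output and UINotificationFeedbackGenerator for haptics. The snippet below is only a sketch of that approach, not the app’s actual code.

```swift
import AVFoundation
import UIKit

let synthesizer = AVSpeechSynthesizer()
let haptics = UINotificationFeedbackGenerator()

// Speak a detected label aloud and confirm it with a haptic tap —
// one plausible implementation of these settings, not Iryss's real code.
func announce(_ label: String) {
    synthesizer.speak(AVSpeechUtterance(string: label))
    haptics.notificationOccurred(.success)
}
```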

So how well does the app work? It’s okay, but not perfect. Perhaps more often than you’d like, the app misidentifies what it’s looking at. As a prime example, a gumball machine I pointed it at was detected as a “pill bottle,” likely because of the colored dots inside.

One significant hindrance I found is poor photographic conditions. For the most accurate results, make sure the lighting is as good as possible and that you’re far enough away from an object for the whole thing to fit in the frame.

The developer tells me that while object detection can’t be right all the time, it can always be improved. Thanks to how simple CoreML makes it for developers to create apps of this nature, regular updates should be able to improve the app’s accuracy over time.

Iryss certainly seems gimmicky to most users at first glance, but off the top of my head, I can think of at least two real-world scenarios where the app could be helpful:

  1. When someone has impaired vision and needs the iPhone to do the seeing for them, Iryss can make voice-driven announcements to tell them what’s around them.
  2. When parents want to help a young child learn the names of objects, the child can play with Iryss to discover what the things around them are called.

For other users, Iryss is really just a proof-of-concept app to toy around with for fun.

If you’re interested in giving Iryss a try for yourself, it’s a free download from the App Store. It’s a great way to see what Machine Learning is capable of in iOS 11, and you can expect that this API will make many other incredible apps possible in the future.

Do you like the concept behind Iryss? Share your thoughts in the comments below.