An Icon Classifier Android App

AI for Art and Design


Written by Margaret Maynard-Reid, ML GDE

In Part 1 of the Icon Classifier tutorial, I shared how to create an icon classifier model with TensorFlow Lite Model Maker. This is Part 2 of the tutorial, in which I go over how to build an Android app that implements the TensorFlow Lite model with just a few lines of code. Please follow along with the Android code on GitHub here.

The model implementation on Android takes just a few lines of code, thanks to the TensorFlow Lite metadata, the Android code generator and the new ML Model Binding feature in Android Studio Preview.

By default, the .tflite model created by TensorFlow Lite Model Maker already contains metadata with useful information about the model; most importantly, this metadata can be used to auto-generate model inference code for the Android app.

An Icon Classifier Android app

Download Android Studio 4.1 Preview

We will be using the new ML Model Binding feature from Android Studio 4.1 Beta 1, which you can download from here. You should be able to run the Preview version side by side with your stable version. Make sure to update your Android Gradle plugin to at least 4.1.0; otherwise the ML Model Binding menu File > New > Other > TensorFlow Lite Model will not be accessible.
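
If you do need to bump the plugin version, the change goes in the project-level build.gradle. This is just a minimal sketch; use whichever 4.1.x (or 4.1.0-betaXX) version string matches your installed Android Studio Preview:

buildscript {
    dependencies {
        // Android Gradle plugin 4.1.0 or newer is required for ML Model Binding.
        // With Android Studio 4.1 Beta 1, this would typically be 4.1.0-beta01.
        classpath 'com.android.tools.build:gradle:4.1.0'
    }
}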

Android Studio Preview 4.1 — ML Model Binding

Android application

Create a new project

Open Android Studio and create a new Android Project with the “Empty Activity” template.

Create UI with a GridView

The model icons-50.tflite is a simple image classifier that recognizes icons. I put a few icon images under the drawable folder and display them, along with the model predictions, in the UI using a simple GridView. The UI code is a fairly standard Android implementation, so I won't go over it in detail here.

Here is a summary of the steps:

  • Create a layout xml file called list_item.xml with an ImageView for the icon and a TextView for the model prediction.
  • Create a simple data model class called Icon, corresponding to the icon drawable ids and prediction.
  • Add a GridView to activity_main.xml to display a grid of icons plus predictions.
  • Create an adapter class extending BaseAdapter, which binds the data to the UI elements (a minimal sketch of the Icon class and adapter follows).
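
Here is a minimal sketch of what the Icon data class and the adapter could look like. The class, field, and view id names (icon_image, icon_prediction) are my own placeholders and may differ from the code in the GitHub repo:

import android.content.Context
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.BaseAdapter
import android.widget.ImageView
import android.widget.TextView

// Simple data model holding a drawable resource id and the model's prediction text.
data class Icon(val iconId: Int, val prediction: String)

// Adapter that binds each Icon to the ImageView / TextView in list_item.xml.
class IconAdapter(private val context: Context, private val icons: List<Icon>) : BaseAdapter() {

    override fun getCount(): Int = icons.size
    override fun getItem(position: Int): Any = icons[position]
    override fun getItemId(position: Int): Long = position.toLong()

    override fun getView(position: Int, convertView: View?, parent: ViewGroup?): View {
        // Reuse the recycled view if available; otherwise inflate a new list item.
        val view = convertView
            ?: LayoutInflater.from(context).inflate(R.layout.list_item, parent, false)
        val icon = icons[position]
        // Hypothetical view ids; use whatever ids are defined in list_item.xml.
        view.findViewById<ImageView>(R.id.icon_image).setImageResource(icon.iconId)
        view.findViewById<TextView>(R.id.icon_prediction).text = icon.prediction
        return view
    }
}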

Import the TensorFlow Lite Model

Now let’s import the icons-50.tflite model created in the previous tutorial.

  • Open menu File > New > Other > TensorFlow Lite Model. If you are working in the Project view instead of the Android view, make sure to position your cursor at or below the app module; otherwise you may not find the “Other” menu option.
  • Set the model location to the path of the icons-50.tflite file.
  • Leave “Auto add build feature and dependencies to gradle” checked (the default). This saves you from having to configure the TensorFlow Lite dependencies in your build.gradle files yourself.
  • Leave “Auto add TensorFlow Lite gpu dependencies to gradle (optional)” unchecked (the default), since the model is a very simple classifier.

You will notice that a few steps have been automatically completed for you:

  • The build.gradle has been updated as follows:
android {
    ...
    buildFeatures {
        mlModelBinding true
    }
}

dependencies {
    ...
    implementation 'org.tensorflow:tensorflow-lite:2.1.0'
    implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly'
    implementation 'org.tensorflow:tensorflow-lite-metadata:0.1.0-rc1'
}
  • A folder called “ml” was created with the model file added:
icons-50.tflite model file added to /ml folder
  • The model file opens to display the model metadata, as well as Kotlin / Java sample code snippets that you can use in the Android app. This makes the Android implementation so much easier!
tflite model metadata & sample code snippets

Putting it all together

Now I’m going to use the model to make class predictions and then display both the icon and prediction in the UI.

First I create an array to hold the icon drawable ids —

private val icons = arrayOf(
    R.drawable.alarm_clock,
    R.drawable.beach_with_umbrella,
    R.drawable.cloud_with_rain,
    R.drawable.island,
    R.drawable.mailbox,
    R.drawable.mountain,
    R.drawable.taxi,
    R.drawable.tulip
)

Then I copy/paste the sample code snippets (generated after importing the .tflite model) into the Android app, with a few minor changes:

  • Get the predicted class by finding the max of the model prediction probabilities.
  • Get the class label and also the probability score.
  • Concatenate the two strings into one prediction string.
val model = Icons50.newInstance(this)

// Creates inputs for reference.
val image = TensorImage.fromBitmap(bitmap)

// Runs model inference and gets result.
val outputs = model.process(image)

// Finds the category with the highest score and builds the prediction string.
val maxProbability = outputs.probabilityAsCategoryList.maxBy { it.score }
prediction = maxProbability!!.label + " " + maxProbability.score
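
For context, this snippet lives inside a small predict() helper that takes a Bitmap and returns the prediction string consumed by addIconData() below. The exact signature here is my assumption, not necessarily what the repo uses:

// Minimal sketch of the predict() helper called from addIconData(); signature is an assumption.
private fun predict(bitmap: Bitmap): String {
    // Loads the auto-generated Icons50 model wrapper (this = Activity context).
    val model = Icons50.newInstance(this)

    // Converts the Bitmap into the TensorImage input expected by the model.
    val image = TensorImage.fromBitmap(bitmap)

    // Runs inference and picks the category with the highest score.
    val outputs = model.process(image)
    val maxProbability = outputs.probabilityAsCategoryList.maxBy { it.score }

    return maxProbability!!.label + " " + maxProbability.score
}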

Then I create a list of Icon data by adding the iconId and prediction to each new Icon object.

fun addIconData() {
    var iconBitmap: Bitmap
    var prediction: String

    for (iconId in icons) {
        iconBitmap = BitmapFactory.decodeResource(resources, iconId)
        prediction = predict(iconBitmap)
        iconList?.add(Icon(iconId, prediction))
    }
}

Finally, in onCreate() I set up the GridView adapter and pass the icon list to it.

iconAdapter = iconList?.let { IconAdapter(this, it) }
iconGrid.adapter = iconAdapter
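
Putting those pieces together, onCreate() might look roughly like this. The iconList initialization and the icon_grid view lookup are my assumptions (the repo may use Kotlin synthetics or view binding instead):

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)

    // Builds the list of icons plus their model predictions.
    iconList = mutableListOf()
    addIconData()

    // Binds the list to the GridView defined in activity_main.xml (hypothetical id).
    val iconGrid = findViewById<GridView>(R.id.icon_grid)
    iconAdapter = iconList?.let { IconAdapter(this, it) }
    iconGrid.adapter = iconAdapter
}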

The adapter then takes care of binding the icon list data to the UI. That’s it!

Altogether, it takes just a few lines of code to load the TFLite model and run inference.

In the past, I’d have to manually place the .tflite model and labels.txt files under the assets folder, configure the build.gradle file, and write a bunch of boilerplate code to load the model and run inference, not to mention the tedious pre-processing of the input images and post-processing of the prediction results.

Now Android developers can finally focus on building the Android application itself, instead of spending a ton of time figuring out how to write code to run inference with a TensorFlow Lite model.

Many thanks to Khanh LeViet, Lu Wang, Shuangfeng Li, Lei Yu and Shiyu Hu from the TensorFlow Lite and ML Kit team for your help, review and feedback!

For a list of TensorFlow Lite models, samples and learning resources, make sure to check out the awesome-tflite list that I have compiled on GitHub.
