Create an Android App to Recognize Face Contours With Firebase ML

Firebase ML Kit, a collection of local and cloud-based APIs for adding machine learning capabilities to mobile apps, has recently been enhanced to support contour detection. Thanks to this powerful feature, you no longer have to limit yourself to approximate rectangles while detecting faces. Instead, you can work with a large number of coordinates that accurately describe the shapes of detected faces and facial landmarks, such as eyes, lips, and eyebrows.

This allows you to easily create AI-powered apps that can perform complex computer vision tasks, such as swapping faces, recognizing emotions, or applying digital makeup.

In this tutorial, I’ll show you how to use ML Kit’s face contour detection feature to create an app that can highlight faces in photos.


To make the most of this tutorial, you must have access to the following:

  • The latest version of Android Studio
  • A device running Android API level 23 or higher

1. Configuring Your Project

Because ML Kit is a part of the Firebase platform, you’ll need a Firebase project to be able to use it in your Android Studio project. To create one, fire up the Firebase Assistant by going to Tools > Firebase.

Next, open the Analytics section and press the Connect button. In the dialog that pops up, type in a name for your new Firebase project, select the country you are in, and press the Connect button.

Connect to Firebase dialog

Once you have a successful connection, press the Add analytics to your app button so that the assistant can make all the necessary Firebase-related configuration changes in your Android Studio project.

At this point, if you open your app module’s build.gradle file, among other changes, you should see the following implementation dependency present in it:
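It should look similar to this (the exact version number depends on when the assistant adds it, so yours may differ):

```groovy
// Added automatically by the Firebase Assistant;
// the version number may be newer in your project
implementation 'com.google.firebase:firebase-core:16.0.1'
```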

To be able to use ML Kit’s face contour detection feature, you’ll need two more dependencies: one for the latest version of the ML Vision library and one for the ML Vision face model. Here’s how you can add them:
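The versions shown below were current at the time of writing; check the Firebase release notes for the latest ones before adding them to your app module's build.gradle file:

```groovy
// ML Kit Vision APIs
implementation 'com.google.firebase:firebase-ml-vision:18.0.1'

// Model used for face contour detection
implementation 'com.google.firebase:firebase-ml-vision-face-model:17.0.2'
```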

In this tutorial, you’ll be working with remote images. To facilitate downloading and displaying such images, add a dependency for the Picasso library:
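The following line, added to the dependencies section of your app module's build.gradle file, does the job:

```groovy
// Picasso, for downloading and displaying remote images
implementation 'com.squareup.picasso:picasso:2.71828'
```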

ML Kit’s face contour detection always runs locally on your user’s device. By default, the machine learning model that performs the face contour detection is automatically downloaded the first time the user opens your app. To improve the user experience, however, I suggest you start the download as soon as the user installs your app. To do so, add the following <meta-data> tag to the AndroidManifest.xml file:
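Per the Firebase ML Kit documentation, the tag must be placed inside the <application> element of the manifest:

```xml
<!-- Goes inside the <application> element of AndroidManifest.xml.
     Tells Firebase to download the face detection model at install time. -->
<meta-data
    android:name="com.google.firebase.ml.vision.DEPENDENCIES"
    android:value="face" />
```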
