ESRC provides the vision API and SDK for your mobile app, enabling real-time recognition of facial expression, heart response, and emotion using a camera.
To use our Android samples, you should first install ESRC SDK for Android 2.6.0 or higher on your system and obtain a License Key by sending a request to our email: esrc@esrc.co.kr.
The minimum requirements for ESRC SDK for Android are:
- Android 4.2.0 (API level 17) or later
- Java 8 or later
- Gradle 4.0.0 or later
// build.gradle (app)
android {
    defaultConfig {
        minSdkVersion 17
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
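How you add the SDK artifact to your project depends on how it was delivered with your License Key; the snippet below is only a sketch assuming a local .aar file, and the file name is hypothetical.

// build.gradle (app) — a sketch only; the artifact name is a placeholder.
dependencies {
    implementation files('libs/esrc-sdk.aar') // hypothetical file name
}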
If you would like to try the sample app tailored to your usage, you can do so by following the steps below.
Initialization binds the ESRC SDK to Android's context, thereby allowing it to use a camera on your mobile device. Pass the App ID of your ESRC application to the init() method to initialize the ESRC SDK, along with an ESRCLicense.ESRCLicenseHandler to receive callbacks for validation of the App ID.
ESRC.init(APP_ID, getApplicationContext(), new ESRCLicense.ESRCLicenseHandler() {
    @Override
    public void onValidatedLicense() {
        …
    }

    @Override
    public void onInvalidatedLicense() {
        …
    }
});
Note: The ESRC.init() method must be called only once across your Android app. It is recommended to initialize the ESRC SDK in the onCreate() method of the Application instance.
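For example, a minimal sketch of that recommendation might look like the following; the class name BaseApplication and the APP_ID value are placeholders for illustration.

import android.app.Application;

public class BaseApplication extends Application {
    private static final String APP_ID = "YOUR_APP_ID"; // placeholder App ID

    @Override
    public void onCreate() {
        super.onCreate();
        // Initialize the ESRC SDK once for the whole app.
        ESRC.init(APP_ID, getApplicationContext(), new ESRCLicense.ESRCLicenseHandler() {
            @Override
            public void onValidatedLicense() {
                // The App ID is valid; the SDK is ready to be started.
            }

            @Override
            public void onInvalidatedLicense() {
                // The App ID is invalid; disable analysis or notify the user.
            }
        });
    }
}

Remember to register the class on the <application> element in AndroidManifest.xml via android:name=".BaseApplication".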
If you don't want to develop a layout that uses the camera, you can use the ESRC Fragment provided by the ESRC SDK. Include a container to bind the ESRC Fragment in your layout .xml file, and skip Step 4: Feed the ESRC SDK; the ESRC Fragment will feed the image to our SDK itself.
<FrameLayout
    android:id="@+id/container"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
Note: FrameLayout is just one example. You can change it to another layout type to suit your app.
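Because the SDK analyzes camera images, your app also needs the standard Android camera permission; this is ordinary Android setup, not an ESRC-specific API. Declare it in AndroidManifest.xml (and request it at runtime on API 23 and later):

<!-- AndroidManifest.xml -->
<uses-permission android:name="android.permission.CAMERA" />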
Bind the ESRC Fragment to display the image taken with the camera on the screen. The ESRC Fragment sends the image to the ESRC SDK in real time so that it can recognize facial expression, heart response, and emotion, and automatically displays the image to fit the size of your custom layout.
// Bind LAYOUT.xml on your Activity.
setContentView(R.layout.LAYOUT);
// Replace the container in LAYOUT with the ESRC Fragment.
getSupportFragmentManager().beginTransaction()
.replace(R.id.container, ESRCFragment.newInstance())
.commit();
Start the ESRC SDK to recognize your facial expression, heart response, and emotion. Pass an ESRCType.Property to the start() method to select analysis modules, and an ESRC.ESRCHandler to handle the results. You should implement the callback methods of the ESRC.ESRCHandler interface so that you can receive the results for face, facial landmark, head pose, attention, facial expression, heart rate, heart rate variability, and engagement. Please refer to the sample app.
ESRC.start(
    new ESRCType.Property(
        true,   // Whether to visualize results or not. Only valid if you bind the ESRC Fragment (i.e., Step 2).
        true,   // Whether to analyze the measurement environment or not.
        true,   // Whether to detect the face or not.
        true,   // Whether to detect facial landmarks or not. If enableFace is false, it is also automatically set to false.
        true,   // Whether to analyze facial action units or not. If enableFace or enableFacialLandmark is false, it is also automatically set to false.
        true,   // Whether to recognize basic facial expression or not. If enableFace is false, it is also automatically set to false.
        true,   // Whether to recognize valence facial expression or not. If enableFace is false, it is also automatically set to false.
        true,   // Whether to estimate remote HR or not. If enableFace is false, it is also automatically set to false.
        true,   // Whether to analyze HRV or not. If enableFace or enableRemoteHR is false, it is also automatically set to false.
        true),  // Whether to recognize engagement or not. If enableRemoteHR and enableHRV are false, it is also automatically set to false.
    new ESRC.ESRCHandler() {
        @Override
        public void onDetectedFace(ESRCType.Face face, ESRCException e) {
            if (e != null) {
                // Handle error.
            }
            // The face is detected.
            // Through the "face" parameter of the onDetectedFace() callback method,
            // you can get the location of the face from the result object
            // that the ESRC SDK has passed to onDetectedFace().
            …
        }

        // Please implement the other callback methods of the ESRC.ESRCHandler interface.
        @Override public void onAnalyzedMeasureEnv( … ) { … }
        @Override public void onDetectedFacialLandmark( … ) { … }
        @Override public void onAnalyzedFacialActionUnit( … ) { … }
        @Override public void onRecognizedBasicFacialExpression( … ) { … }
        @Override public void onRecognizedValenceFacialExpression( … ) { … }
        @Override public void onEstimatedHeadPose( … ) { … }
        @Override public void onRecognizedAttention( … ) { … }
        @Override public void didChangedProgressRatioOnRemoteHR( … ) { … }
        @Override public void onEstimatedRemoteHR( … ) { … }
        @Override public void didChangedProgressRatioOnHRV( … ) { … }
        @Override public void onAnalyzedHRV( … ) { … }
        @Override public void onRecognizedEngagement( … ) { … }
    });
Feed Mat images to the ESRC SDK. Pass the Mat image received from the camera in real time to the feed() method. Please call it at 10 fps. You can skip this step if you follow Step 2: Bind the ESRC Fragment.
ESRC.feed(mat);  // mat: the OpenCV Mat image captured from the camera
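As a sketch of that loop, assuming frames are captured with OpenCV's CameraBridgeViewBase (the throttling interval and the Activity wiring here are illustrative assumptions, not part of the ESRC SDK):

import androidx.appcompat.app.AppCompatActivity;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.core.Mat;

public class CameraActivity extends AppCompatActivity
        implements CameraBridgeViewBase.CvCameraViewListener2 {

    private long lastFeedMillis = 0L;

    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        Mat frame = inputFrame.rgba();
        long now = System.currentTimeMillis();
        if (now - lastFeedMillis >= 100) { // throttle to roughly 10 fps
            ESRC.feed(frame);
            lastFeedMillis = now;
        }
        return frame; // the returned Mat is rendered by the camera view
    }

    @Override public void onCameraViewStarted(int width, int height) { }
    @Override public void onCameraViewStopped() { }
}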
When your app no longer uses the camera or is destroyed, stop the ESRC SDK.
ESRC.stop();
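For example, assuming the SDK was started from an Activity, you might stop it in onDestroy():

@Override
protected void onDestroy() {
    // Stop the ESRC SDK before the Activity goes away.
    ESRC.stop();
    super.onDestroy();
}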
You can clone the project from the Sample repository.
# Clone this repository
git clone git@github.com:esrc-official/ESRC-Android.git

# Move to the sample
cd ESRC-Android
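From there you can open the project in Android Studio, or build and install the sample from the command line (assuming the repository ships the standard Gradle wrapper):

# Build and install the debug variant on a connected device
./gradlew installDebug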
For further detail on ESRC SDK for Android, refer to the ESRC SDK for Android README.