
Problems using Google YAMNet for environmental sound classification in Flutter #54

@ghost

Description

I ran into an error when testing on a real device. My model is a modified version of the sample from

https://www.tensorflow.org/tutorials/audio/transfer_learning_audio

which I then converted to TFLite. I followed the required setup (XML, build.gradle, YAML) and included labels.txt and the .tflite model as assets, but when I test it on a real device it fails to run.
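
A minimal sketch of the conversion step, assuming the retrained model was exported as a SavedModel as in the tutorial (the SavedModel path and output file name are placeholders, not my exact script; the last lines only show how the converted model's input shape can be inspected):

```python
import tensorflow as tf

# Convert the exported SavedModel to TFLite; "saved_model_dir" is a placeholder.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_model = converter.convert()

with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)

# Inspect what input the converted model actually expects. The plugin log
# below reports inputShape: [1], i.e. a dynamic/scalar waveform input rather
# than a fixed number of samples.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
print(interpreter.get_input_details()[0]["shape"])
```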

Device log:

Syncing files to device SM A7050...
I/flutter (21612): Initializing state...
I/flutter (21612): Loading model...
D/TfliteAudio(21612): model name is: assets/converted_model.tflite
I/flutter (21612): Building widget...
I/InterpreterApi(21612): Loaded native library: tensorflowlite_jni
I/InterpreterApi(21612): Didn't load native library: tensorflowlite_jni_gms_client
I/tflite (21612): Initialized TensorFlow Lite runtime.
I/tflite (21612): Created TensorFlow Lite XNNPACK delegate for CPU.
D/TfliteAudio(21612): inputShape: [1]
D/TfliteAudio(21612): label name is: assets/labels.txt
D/TfliteAudio(21612): Labels: [crickets, crying_baby]
D/TfliteAudio(21612): loadModel parameters: {isAsset=true, outputRawScores=false, model=assets/converted_model.tflite, inputType=decodedWav, label=assets/labels.txt, numThreads=1}
I/SurfaceView(21612): applySurfaceTransforms: t = android.view.SurfaceControl$Transaction@3ede20b surfaceControl = Surface(name=SurfaceView - com.example.baby_cryr/com.example.baby_cryr.MainActivity@991d082@0)/@0x6aac8e8 frame = 1
I/ample.baby_cry(21612): WaitForGcToComplete blocked RunEmptyCheckpoint on ProfileSaver for 9.483ms
I/ViewRootImpl@d891c9cMainActivity: MSG_WINDOW_FOCUS_CHANGED 1 1
D/InputMethodManager(21612): prepareNavigationBarInfo() DecorView@9c71922[MainActivity]
D/InputMethodManager(21612): getNavigationBarColor() -16711423
D/InputMethodManager(21612): prepareNavigationBarInfo() DecorView@9c71922[MainActivity]
D/InputMethodManager(21612): getNavigationBarColor() -16711423
V/InputMethodManager(21612): Starting input: tba=com.example.baby_cryr ic=null mNaviBarColor -16711423 mIsGetNaviBarColorSuccess true , NavVisible : true , NavTrans : false
D/InputMethodManager(21612): startInputInner - Id : 0
I/InputMethodManager(21612): startInputInner - mService.startInputOrWindowGainedFocus
D/InputTransport(21612): Input channel constructed: 'ClientS', fd=107
D/InputMethodManager(21612): prepareNavigationBarInfo() DecorView@9c71922[MainActivity]
D/InputMethodManager(21612): getNavigationBarColor() -16711423
V/InputMethodManager(21612): Starting input: tba=com.example.baby_cryr ic=null mNaviBarColor -16711423 mIsGetNaviBarColorSuccess true , NavVisible : true , NavTrans : false
D/InputMethodManager(21612): startInputInner - Id : 0
I/SurfaceControl(21612): nativeRelease nativeObject s[-5476376610795376448]
I/SurfaceControl(21612): nativeRelease nativeObject e[-5476376610795376448]
I/SurfaceControl(21612): nativeRelease nativeObject s[-5476376610795376272]
I/SurfaceControl(21612): nativeRelease nativeObject e[-5476376610795376272]
I/ViewRootImpl@d891c9cMainActivity: ViewPostIme pointer 0
I/ViewRootImpl@d891c9cMainActivity: ViewPostIme pointer 1
I/flutter (21612): Requesting microphone permission...
I/flutter (21612): Microphone permission granted.
I/flutter (21612): Starting audio recognition...
D/TfliteAudio(21612): Parameters: {detectionThreshold=0.3, minimumTimeBetweenSamples=0, method=setAudioRecognitionStream, numOfInferences=1, averageWindowDuration=0, audioLength=0, sampleRate=16000, suppressionTime=0, bufferSize=2000}
D/TfliteAudio(21612): AudioLength has been readjusted. Length: 1
E/EventChannel#AudioRecognitionStream(21612): Failed to open event stream
E/EventChannel#AudioRecognitionStream(21612): java.lang.ArrayIndexOutOfBoundsException: length=1; index=1
E/EventChannel#AudioRecognitionStream(21612): at flutter.tflite_audio.TfliteAudioPlugin.determineAudio(TfliteAudioPlugin.java:296)
E/EventChannel#AudioRecognitionStream(21612): at flutter.tflite_audio.TfliteAudioPlugin.onListen(TfliteAudioPlugin.java:244)
E/EventChannel#AudioRecognitionStream(21612): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler.onListen(EventChannel.java:218)
E/EventChannel#AudioRecognitionStream(21612): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler.onMessage(EventChannel.java:197)
E/EventChannel#AudioRecognitionStream(21612): at io.flutter.embedding.engine.dart.DartMessenger.invokeHandler(DartMessenger.java:295)
E/EventChannel#AudioRecognitionStream(21612): at io.flutter.embedding.engine.dart.DartMessenger.lambda$dispatchMessageToQueue$0$io-flutter-embedding-engine-dart-DartMessenger(DartMessenger.java:322)
E/EventChannel#AudioRecognitionStream(21612): at io.flutter.embedding.engine.dart.DartMessenger$$ExternalSyntheticLambda0.run(Unknown Source:12)
E/EventChannel#AudioRecognitionStream(21612): at android.os.Handler.handleCallback(Handler.java:938)
E/EventChannel#AudioRecognitionStream(21612): at android.os.Handler.dispatchMessage(Handler.java:99)
E/EventChannel#AudioRecognitionStream(21612): at android.os.Looper.loop(Looper.java:246)
E/EventChannel#AudioRecognitionStream(21612): at android.app.ActivityThread.main(ActivityThread.java:8653)
E/EventChannel#AudioRecognitionStream(21612): at java.lang.reflect.Method.invoke(Native Method)
E/EventChannel#AudioRecognitionStream(21612): at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:602)
E/EventChannel#AudioRecognitionStream(21612): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1130)

======== Exception caught by services library ======================================================
The following PlatformException was thrown while activating platform stream on channel AudioRecognitionStream:
PlatformException(error, length=1; index=1, null, null)

When the exception was thrown, this was the stack:

#0 StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:652:7)
#1 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:310:18)
#2 EventChannel.receiveBroadcastStream.<anonymous closure> (package:flutter/src/services/platform_channel.dart:652:9)
====================================================================================================

I was hoping you could help me understand whether the issue is with my trained model or whether the platform simply does not support it. Any guidance would be greatly appreciated. Thank you for your time and assistance.
