[Flutter App Dev] – Read a Barcode

This tutorial demonstrates how to use the camera plugin in combination with Firebase’s ML Kit vision library to read any type of barcode. The example below runs on the Android emulator with the virtual scene option selected as the emulated camera.

Using the Android Emulator Virtual Scene

If you do not wish to use the virtual scene, skip this section. Otherwise, start by creating an Android emulator and select the virtual scene option for the camera of your choice.

Download any barcode image you can find via a Google image search, run the emulator, and click the ellipsis menu in the emulator’s toolbar.

Finally, under the Camera option in the left nav, set the Wall image to point to this barcode file.

Project Setup

I’m not going to detail the steps to create a Flutter project. Instead, I will assume you already have your project ready and running. However, you will need to set up a Firebase Project and add it to your Flutter application project.

You may wonder why Firebase is used. Firebase has a service called ML Kit, to which we can pass an image and retrieve the values of any barcodes it reads. We can also rest assured that ML Kit has been trained to read all types of barcodes!

Setup Camera Preview

Luckily, there is a Flutter plugin conveniently called camera that gives us a camera preview along with the ability to acquire an image and pass it to Firebase ML Vision for barcode results.

Simply add the camera plugin (at its current version) to your pubspec.yaml:

dependencies:
  flutter:
    sdk: flutter

  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^0.1.2
  camera: 0.3.0+3
pubspec.yaml

We’ll take the code example straight from the plugin’s documentation as a basis to work with.

import 'dart:async';
import 'dart:io';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';

// The cameras available on the device, fetched before the app starts.
List<CameraDescription> cameras;

Future<void> main() async {
  cameras = await availableCameras();
  runApp(App());
}

class App extends StatelessWidget {

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: CameraApp(),
    );
  }
}

class CameraApp extends StatefulWidget {
  @override
  _CameraAppState createState() => _CameraAppState();
}

class _CameraAppState extends State<CameraApp> {
  CameraController controller;

  @override
  void initState() {
    super.initState();
    // Use the first camera (usually the back camera) at medium resolution.
    controller = CameraController(cameras[0], ResolutionPreset.medium);
    controller.initialize().then((_) {
      if (!mounted) {
        return;
      }
      // Rebuild once the controller has initialised so the preview appears.
      setState(() {});
    });
  }

  @override
  void dispose() {
    controller?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    // Render nothing until the camera controller has initialised.
    if (!controller.value.isInitialized) {
      return Container();
    }

    return Stack(
      alignment: Alignment.center,
      children: <Widget>[
        AspectRatio(
          aspectRatio: controller.value.aspectRatio,
          child: CameraPreview(controller)
        )
      ],
    );
  }
}
Full screen camera preview example

** For those using the Android virtual scene for the camera preview: hold Alt + WASD to move around (the wall with the barcode is in the room behind you) **

Read a Barcode

Now that we have a camera preview to work with, we can start taking an image and passing it to Firebase’s vision detection API. As the camera plugin is still in preview, there is currently no way to stream the camera’s preview into ML Kit. Although there is now functionality to acquire the byte buffer of the preview, the pixel data is not in the format the FirebaseVisionImage class expects, and converting it is out of scope for this tutorial.

Instead, we will create a timer that fires every 3 seconds, takes an image, saves it, and has ML Kit load and read it.

First, let us set up the timer code.

class _CameraAppState extends State<CameraApp> {
  CameraController controller;
  Timer _timer;

  @override
  void initState() {
    super.initState();
    controller = CameraController(cameras[0], ResolutionPreset.medium);
    controller.initialize().then((_) {
      if (!mounted) {
        return;
      }
      setState(() {});

      _startTimer();
    });
  }

  void _startTimer() {
    _timer = Timer(Duration(seconds: 3), _timerElapsed);
  }

  void _stopTimer() {
    if (_timer != null) {
      _timer.cancel();
      _timer = null;
    }
  }

  Future<void> _timerElapsed() async {
    _stopTimer();

    // Code to capture image and read barcode here...

    _startTimer();
  }
}
Adding a callback timer

Now that we have the callback function ticking every 3 seconds (safeguarded against the barcode detection overrunning: the timer is stopped at the start of the callback and restarted at the end), let’s take an image! It is also worth calling _stopTimer() in dispose() so the timer cannot fire after the widget has been removed.

Future<void> _timerElapsed() async {
  _stopTimer();

  File file = await _takePicture();

  _startTimer();
}

Future<File> _takePicture() async {
  // Save captures to <app documents>/Pictures/barcode/barcode.jpg.
  final Directory extDir = await getApplicationDocumentsDirectory();
  final String dirPath = '${extDir.path}/Pictures/barcode';
  await Directory(dirPath).create(recursive: true);
  final File file = File('$dirPath/barcode.jpg');

  // Delete any previous capture first; takePicture will not overwrite
  // an existing file.
  if (await file.exists()) {
    await file.delete();
  }

  await controller.takePicture(file.path);
  return file;
}
Take and save photo example

Every 3 seconds the image will be overwritten and passed to the ML Kit API, as described below:

class _CameraAppState extends State<CameraApp> {
  CameraController controller;
  Timer _timer;
  String _barcodeRead = "";  // Add this ...

  // Rest of _CameraAppState’s methods ...

  Future<void> _timerElapsed() async {
    _stopTimer();

    File file = await _takePicture();

    await _readBarcode(file);

    _startTimer();
  }
	
  Future<void> _readBarcode(File file) async {
    FirebaseVisionImage firebaseImage = FirebaseVisionImage.fromFile(file);
    final BarcodeDetector barcodeDetector = FirebaseVision.instance.barcodeDetector();

    final List<Barcode> barcodes = await barcodeDetector.detectInImage(firebaseImage);

    // Concatenate the raw values of every barcode found in the image.
    _barcodeRead = "";
    for (Barcode barcode in barcodes) {
      _barcodeRead += barcode.rawValue + ", ";
    }
  }
}
Read the barcode example

For the above code to compile, you will need to add the Firebase ML Vision plugin to the pubspec.yaml (along with path_provider, to get folder locations on the system).

dependencies:
  flutter:
    sdk: flutter

  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^0.1.2
  camera: 0.3.0+3
  firebase_ml_vision: 0.5.0+1
  path_provider: 0.5.0+1
firebase_ml_vision and path_provider added to pubspec.yaml

And add the necessary imports at the top of the main.dart file:

import 'dart:async';
import 'dart:io';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:path_provider/path_provider.dart';
import 'package:firebase_ml_vision/firebase_ml_vision.dart';

So… now the app can take a photo, read all the barcodes detected in the image, and store them in the member variable “_barcodeRead”. All that is left is to display it!

Display the Barcodes

Add a Text element to the Stack inside the “build” method – we can wrap it in a Container so that it can be anchored to the bottom of the screen.

@override
  Widget build(BuildContext context) {
    if (!controller.value.isInitialized) {
      return Container();
    }

    return Stack(
      alignment: Alignment.center,
      children: <Widget>[
        AspectRatio(
          aspectRatio: controller.value.aspectRatio,
          child: CameraPreview(controller)
        ),

        Container(
          alignment: Alignment.bottomCenter,
          child: Text(
            _barcodeRead.length > 0 ? _barcodeRead : "No Barcode",
            textAlign: TextAlign.center
          ),
        )
      ],
    );
  }
Display barcode string

Finally, we need to ‘redraw’ the widget whenever we update the barcode variable. To do this in Flutter, all we need to do is call “setState”.

Future<void> _readBarcode(File file) async {
    FirebaseVisionImage firebaseImage = FirebaseVisionImage.fromFile(file);
    final BarcodeDetector barcodeDetector = FirebaseVision.instance.barcodeDetector();

    final List<Barcode> barcodes = await barcodeDetector.detectInImage(firebaseImage);

    // Updating the barcode string inside setState triggers a rebuild,
    // so the new value is displayed immediately.
    setState(() {
      _barcodeRead = "";
      for (Barcode barcode in barcodes) {
        _barcodeRead += barcode.rawValue + ", ";
      }
    });
  }
Update the widget to display barcode

[Flutter App Dev] – Setting Up Firebase

If you are just getting into mobile development with Flutter (or mobile development in general), let me introduce you to Firebase. It is a service that offers a tonne of features, such as free push notifications, analytics, and authentication. It even has some paid services (the free tier is still quite impressive) for database hosting, file storage, and an SMS service for phone authentication.

In this post, I will show you how to set this up for a Flutter project. To keep this post short and to the point, I will assume you have already created your Flutter project.

Firebase Setup

As a prerequisite, all you need to do is go to the Firebase website, sign up for an account, and log in.

Step 1 – Add a Firebase Project

Go to the Firebase Console and click “Add Project”

Step 2 – Give the Firebase Project a Name

The name you enter here is used to identify your project on Firebase. You can optionally edit the “Project ID” field, as this is used to reference your Firebase endpoint.

Accept the terms and then press “Create Project”

Step 3 – Create Application Projects in Firebase

This step is broken up into two sections: Android project setup and iOS project setup.

Setup a Firebase Android Project

On your Firebase Console, click the Android button to start the process of adding an Android application to your project.

Step 1 of this process is the most important: your “Android package name” must match the identifier you set on your Android project, which is normally in the format com.companyname.applicationname.
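If you are unsure of your package name, it is the applicationId in your Flutter project’s android/app/build.gradle file (com.companyname.applicationname below is just a placeholder):

android {
    defaultConfig {
        // Must match the "Android package name" entered in Firebase.
        applicationId "com.companyname.applicationname"
    }
}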

The app nickname is, once again, only used to identify your Android project within your Firebase project.

Please refer to this post on how to find the SHA-1 fingerprint of your debug key. It is not so important for now, but when you come to release your application, you will need to add the SHA-1 fingerprint of the keystore file you use to sign and distribute your app.
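For reference, assuming the default debug keystore location and passwords (macOS/Linux shown; on Windows the keystore lives under %USERPROFILE%\.android), the debug SHA-1 can be printed with keytool:

keytool -list -v -keystore ~/.android/debug.keystore -alias androiddebugkey -storepass android -keypass android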

Once you click “Register App” you will be taken to step 2. Simply download the google-services.json file and place it within the android -> app folder of your Flutter application.

google-services.json file placed in the correct location for a Flutter app.

Step 3 is where you add the dependencies to your Android application. This step in the Firebase example is a bit misleading, as the line for including Firebase core is not needed. When you add one of the Firebase libraries, it will manage this include for you; including the line manually will just cause compiler errors.

Therefore, setup is as follows:
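A minimal sketch of the required Gradle changes (the google-services version below is illustrative):

// android/build.gradle
buildscript {
  dependencies {
    classpath 'com.google.gms:google-services:4.2.0'
  }
}

// android/app/build.gradle (at the very bottom of the file)
apply plugin: 'com.google.gms.google-services'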

Of course, use the version number for google-services as described by the Firebase project setup wizard that is on your screen.

Now refer to “Add Firebase to the Flutter Application” below or continue to “Setup a Firebase iOS Project”.

Setup a Firebase iOS Project

Click the iOS icon or the “Add App” button (if you followed the above Android steps). In my case, I have the “Add App” button (be sure to select the iOS option that appears).

Fill out the details for step 1. Note that the “iOS bundle ID” must match exactly what you set in your iOS project. Your “App Store ID” requires you to have enrolled in the Apple Developer Program (which has a cost!) and to have your app uploaded to iTunes Connect.

This is optional for now, but remember to add it to your project’s settings before you release your application.

Next, in step 2, download the GoogleService-Info.plist file and place it in the iOS project of your Flutter project. Below is an example of where I placed mine (in the ios -> Runner folder).

Placement of the GoogleService-Info.plist file for a Flutter iOS app.

You can skip step 3, as including a FlutterFire library in your Flutter application will already do this.

Follow the instructions for step 4 precisely, as described by the Firebase steps for your chosen language (Objective-C / Swift).

Now refer to “Add Firebase to the Flutter Application” below.

Step 4 – Add a Firebase library to the Flutter Application

Providing you followed the steps for placing the Google service files in the previous two sections, all you need to do here is add any FlutterFire plugin to your Flutter project’s “pubspec.yaml” file, as shown below.
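A minimal example (the version number is illustrative; use the latest release):

dependencies:
  flutter:
    sdk: flutter

  firebase_ml_vision: 0.5.0+1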

FlutterFire plugin “firebase_ml_vision” included in the “dependencies” section of the pubspec.yaml file example.

Finally, deploy and run your application on a device and you should see a success message stating that your app has communicated with the Firebase servers. Note that this message appears on the last step of each application setup of your Firebase project, within the Firebase Console.

[Flutter App Dev] – Camera Plugin – Dark Preview Fix

As promised… this blog is changing direction to focus on Flutter development… First up is how to fix the Flutter camera plugin’s dark preview (a fix that can be used in any Android project utilising the Camera2 API)!

When developing my (Xamarin) app Prog I ran into a rather complex bug on Android with the Camera2 API. Upon starting the camera preview, it would appear correctly lit for a split second and promptly become dark. This rendered the camera preview useless, as seen below.

Dark Camera Preview Example with Flutter Camera Plugin

I spent a good month reading over endless Stack Overflow posts and attempting to translate the Android documentation into Xamarin’s C# wrapper equivalent. Just as I was about to give up, I gave it one last attempt, and I got it!

It turned out the camera FPS was too high for the auto exposure to keep up, causing it to fail miserably and seemingly ‘give up’. Note that on high-end devices this didn’t seem to be a huge problem. I tested on a “Samsung Galaxy Tab A”, which is fairly low end – but should still be more than capable of running Prog.

The solution turned out to be pretty simple… You can query the list of FPS ranges that the auto exposure can handle via the CameraCharacteristics API. A range in this instance has a lower and an upper bound – the lower end meaning slower FPS and the upper meaning faster FPS.

The ranges returned come in two forms: (x, x), where the lower and upper bounds are the same (i.e. constant FPS), or (x, y), where x < y and the FPS can vary. From personal experience, a (x, y) range appears to drop to the lower FPS when the exposure is struggling, but remains at the higher FPS on high-end devices. Thus, picking the FPS range with the biggest difference between the x and y components turned out to be the ‘sweet spot’.

Enough Background… This is For Flutter!

Or, more specifically, the Flutter camera plugin. I have already created a pull request to fix this – but have been told they’re favouring quality over features before approving it (it became a ‘feature’, as choosing a slower FPS in favour of better exposure may not be desired by all apps, so an option needs to be implemented).

I’m sharing this as a workaround for those running into this problem who require a usable camera preview in all scenarios.

Within the Android native class of the camera plugin source code (CameraPlugin.java), create a method called “setBestAERange” as shown below. It reads all the available FPS ranges, finds the one with the biggest difference between its lower and upper bounds, and assigns it to the member variable “aeFPSRange”.

private void setBestAERange(CameraCharacteristics characteristics) {
  Range<Integer>[] fpsRanges =
      characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);

  if (fpsRanges == null || fpsRanges.length <= 0) {
    return;
  }

  int idx = 0;
  int biggestDifference = 0;

  // Find the range with the biggest gap between its lower and upper bounds.
  for (int i = 0; i < fpsRanges.length; i++) {
    int currentDifference = fpsRanges[i].getUpper() - fpsRanges[i].getLower();

    if (currentDifference > biggestDifference) {
      idx = i;
      biggestDifference = currentDifference;
    }
  }

  aeFPSRange = fpsRanges[idx];
}

This means you will need to add a “Range<Integer>” member variable called “aeFPSRange” to the “Camera” class. Then call this method inside the Camera constructor, just above “computeBestCaptureSize”, making sure you pass in the characteristics variable.

// Member variable holding the chosen auto-exposure FPS range.
private Range<Integer> aeFPSRange;

Camera(final String cameraName, final String resolutionPreset, @NonNull final Result result) {
    ...

    setBestAERange(characteristics);
    computeBestCaptureSize(streamConfigurationMap);

    ...
}

The final piece of the puzzle is to set this FPS range on the capture request for the camera preview. Add the following code to the “createCaptureSession” method, just before the call to “setRepeatingRequest” on the capture request.

if (Camera.this.aeFPSRange != null) {
    captureRequestBuilder.set(
        CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Camera.this.aeFPSRange);
}
Fixed camera preview for the Flutter camera plugin

Happy usable camera preview!

New Blog Direction – Flutter Mobile Development

It has been ~10 months since my last post… In that time I have started a new job, bought a house, learned a great amount of new skills at that job (one of which is mobile development), created and released my own Xamarin app called Prog and, finally, contributed (currently an open pull request) to my first ever open source project – the Flutter camera plugin.

It’s safe to say that this last year (2018) has been quite the roller-coaster, and it has led me away from Raspberry Pi development. I believe my interest in that technology peaked when I saw my code interacting with live machinery via the Pi at my previous employment. Since then, I have not had any project worth investing time into where the Raspberry Pi is concerned. Therefore, this blog is taking a new direction – mobile development.

Flutter

My first entry into mobile development was using Xamarin Forms 3.0. While I did battle through and eventually release my own app, I don’t think I will be using it again. The ecosystem on Windows was broken. As an example: making a small change would result in a 5 minute wait for a compile and deployment to take place. If the compile didn’t fail, and you didn’t have to restart Visual Studio, clean and rebuild, you would often just find out that the small change didn’t work – and you had to repeat the process until it finally ‘worked’.

Flutter is a game changer. I mean really… It has hot reload out of the box and works on all major platforms! It can even hot restart when you make logic changes. On my machine, a hot reload takes 800ms, and a hot restart takes 1.2s.

The layout and widget system makes perfect sense, and the documentation is great. Animations – something developers usually shy away from – are so simple that I can’t wait to post about them. Native functionality integration is super easy too (until you get to the Objective-C / Swift code – but that isn’t Flutter’s issue). And finally, it comes with amazing Visual Studio Code integration.

Anyhow – I just wanted to make a post to say that there will likely never be any more Raspberry Pi content here. I’m going to post solutions to problems I encounter and solve during my time developing mobile apps with Flutter.