This tutorial shows a basic usage example of the camera plugin (https://pub.dev/packages/camera).
We will create a widget that displays a camera preview.
Other tutorials I found pass the camera object into the widget, but I wanted to make the widget self-contained and do all the camera initialization inside it.
One challenge I faced was the asynchronous initialization of the camera, which is why I am using a FutureBuilder.
Here is the complete source code of the widget:
// livePic.dart
import 'package:camera/camera.dart';
import 'package:flutter/widgets.dart';

class LivePic extends StatefulWidget {
  @override
  LivePicState createState() => LivePicState();
}

class LivePicState extends State<LivePic> {
  // Both fields are assigned during initialization, hence `late`
  // (required under Dart null safety).
  late CameraController _controller;
  late Future<void> _initializeControllerFuture;

  // Return the first camera reported by the plugin.
  Future<CameraDescription> getCamera() async {
    final cameras = await availableCameras();
    return cameras.first;
  }

  // Create the controller and initialize it; the controller must be
  // initialized before it can be used.
  Future<void> initializeController() async {
    final camera = await getCamera();
    _controller = CameraController(camera, ResolutionPreset.max);
    return _controller.initialize();
  }

  @override
  void initState() {
    super.initState();
    _initializeControllerFuture = initializeController();
  }

  @override
  void dispose() {
    // Dispose of the controller when the widget is disposed.
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return FutureBuilder<void>(
      future: _initializeControllerFuture,
      builder: (context, snapshot) {
        if (snapshot.connectionState == ConnectionState.done) {
          // If the Future is complete, display the preview.
          return CameraPreview(_controller);
        } else {
          // Otherwise, display a placeholder.
          return Placeholder();
        }
      },
    );
  }
}
The getCamera method returns the first available camera. I think it is possible to choose the front or back camera explicitly, but I haven't tried it yet.
Future<CameraDescription> getCamera() async {
  final cameras = await availableCameras();
  return cameras.first;
}
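For reference, each CameraDescription exposes a lensDirection field (CameraLensDirection.front, back, or external), so picking a specific lens could look like the following sketch. It is untested, and getCameraByDirection is just my own helper name:

Future<CameraDescription> getCameraByDirection(CameraLensDirection direction) async {
  final cameras = await availableCameras();
  // Return the first camera that faces the requested direction,
  // falling back to the first available camera if none matches.
  return cameras.firstWhere(
    (camera) => camera.lensDirection == direction,
    orElse: () => cameras.first,
  );
}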
When the initializeController method is called (from initState), it sets the _controller field and initializes it. The plugin documentation says that the camera controller needs to be initialized before it can be used.
Future<void> initializeController() async {
  final camera = await getCamera();
  _controller = CameraController(camera, ResolutionPreset.max);
  return _controller.initialize();
}
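Initialization can also fail, for example when the camera permission is denied. The plugin reports such failures as a CameraException, so a variant of initializeController with basic error handling could look like this sketch (I just log and rethrow here):

Future<void> initializeController() async {
  final camera = await getCamera();
  _controller = CameraController(camera, ResolutionPreset.max);
  try {
    await _controller.initialize();
  } on CameraException catch (e) {
    // e.code and e.description come from the camera plugin.
    print('Camera initialization failed: ${e.code} ${e.description}');
    rethrow;
  }
}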
The build method returns a FutureBuilder, which in this case rebuilds the widget once the camera initialization has finished:
@override
Widget build(BuildContext context) {
  return FutureBuilder<void>(
    future: _initializeControllerFuture,
    builder: (context, snapshot) {
      if (snapshot.connectionState == ConnectionState.done) {
        // If the Future is complete, display the preview.
        return CameraPreview(_controller);
      } else {
        // Otherwise, display a placeholder.
        return Placeholder();
      }
    },
  );
}
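Note that if the initialization future fails, the builder above still reaches ConnectionState.done and tries to show the preview. As a possible refinement (not part of the widget above), the builder body could check snapshot.hasError and show a spinner while waiting. This sketch needs the material library for CircularProgressIndicator, and buildPreview is just an illustrative helper name:

import 'package:camera/camera.dart';
import 'package:flutter/material.dart';

Widget buildPreview(AsyncSnapshot<void> snapshot, CameraController controller) {
  if (snapshot.hasError) {
    // Surface initialization errors instead of showing a broken preview.
    return Center(child: Text('Camera error: ${snapshot.error}'));
  }
  if (snapshot.connectionState != ConnectionState.done) {
    // Still initializing: show a progress indicator instead of a Placeholder.
    return Center(child: CircularProgressIndicator());
  }
  return CameraPreview(controller);
}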
That's it. Here is my main.dart:
import 'package:flutter/material.dart';
import 'package:poseapp/livePic.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        // This is the theme of your application.
        //
        // Try running your application with "flutter run". You'll see the
        // application has a blue toolbar. Then, without quitting the app, try
        // changing the primarySwatch below to Colors.green and then invoke
        // "hot reload" (press "r" in the console where you ran "flutter run",
        // or simply save your changes to "hot reload" in a Flutter IDE).
        // Notice that the counter didn't reset back to zero; the application
        // is not restarted.
        primarySwatch: Colors.blue,
        // This makes the visual density adapt to the platform that you run
        // the app on. For desktop platforms, the controls will be smaller and
        // closer together (more dense) than on mobile platforms.
        visualDensity: VisualDensity.adaptivePlatformDensity,
      ),
      home: LivePic(),
    );
  }
}
Conclusion
This was a basic proof of concept. My goal is to do some gesture recognition, so I will try to process the image data that comes from the camera. The next steps would be to try OpenCV and/or TensorFlow Lite.
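For that next step, the camera plugin offers startImageStream on the controller, which delivers raw CameraImage frames (typically YUV420 on Android and BGRA8888 on iOS). A rough sketch of how I imagine hooking into it, where processFrame is only a placeholder for the future recognition code:

// Inside LivePicState, to be called after _controller.initialize() completes.
void startProcessing() {
  _controller.startImageStream((CameraImage image) {
    processFrame(image);
  });
}

void stopProcessing() {
  _controller.stopImageStream();
}

void processFrame(CameraImage image) {
  // Placeholder: this is where the OpenCV / TensorFlow Lite processing would go.
}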