
facetagr 0.0.13

At FaceTagr, we are pioneers in advanced face recognition technology, delivering solutions that are accurate, reliable, and scalable. Our NIST-tested algorithms, with 99.91% accuracy, ensure that our technology meets the highest global standards for identity verification and security.

FaceTagr Flutter Package #

The FaceTagr Flutter package allows third-party teams to integrate face recognition capabilities into their applications. This package provides two primary functions: initialization (init) and face matching (fnFaceMatch).

📦 Installation #

Run flutter pub add facetagr, which adds a line like this to your package's pubspec.yaml (and runs an implicit flutter pub get):

dependencies:
  facetagr: ^0.0.13
  

Alternatively, add the dependency manually and run flutter pub get.

Setup the config file #

Run the following command to create a new config automatically:


dart run facetagr:facetagr_init --clientID <clientID> --clientKey <clientKey> --apiURL <apiURL> --path <path>

--path examples:

for Windows: C:\Users\<USERNAME>\AppData\Local\Pub\Cache\hosted\pub.dev

for macOS/Linux: typically ~/.pub-cache/hosted/pub.dev (the default pub cache location)

📥 Import Package #

import 'package:facetagr/facetagr.dart';
import 'package:camera/camera.dart';

🛠️ Initialization #

Use init inside your main page or login flow:

class _HomePageState extends State<HomePage> {
  // Also requires: dart:async, dart:convert, and package:uuid/uuid.dart.
  StreamSubscription<String>? _initSub;
  final Facetagr _faceTagr = Facetagr();
  bool _isProcessing = false;

  @override
  void initState() {
    super.initState();
    _initializeFacetagr();
    _listenToBroadcast();
  }

  void _listenToBroadcast() {
    _initSub = Facetagr.initStream.listen((message) {
      final decoded = jsonDecode(message);
      if (!mounted) return;
      // StatusCode 1001 means init succeeded; any other code is an error.
      ScaffoldMessenger.of(context).showSnackBar(
        SnackBar(content: Text(decoded['StatusMessage'].toString())),
      );
      setState(() => _isProcessing = false);
    });
  }

  void _initializeFacetagr() {
    String apiURL = "https://yourapiurl.com";
    String clientID = "yourClientID";
    String externalID = "yourExternalID";
    String requestID = const Uuid().v4();
    String utcTime = DateTime.now().toUtc().toString();
    String hashcode = "hashcode"; // see Hash Logic below for how to build this
    _faceTagr.init(apiURL, clientID, externalID, hashcode, utcTime, requestID);
  }

  @override
  void dispose() {
    _initSub?.cancel();
    super.dispose();
  }
}

🔑 Hash Logic #

FaceTagr uses a SHA-512 hash for request signing.


import 'dart:convert';
import 'package:crypto/crypto.dart';

String fn_get_hash(String clientID, String utcTime, String requestID, String clientKey) {
  String input = clientID + utcTime + requestID + clientKey;
  var bytes = utf8.encode(input);
  var hash = sha512.convert(bytes);
  return hash.toString();
}

Example Flow

String requestID = const Uuid().v4();
String utcTime   = DateTime.now().toUtc().toString();
String hash      = fn_get_hash(clientID, utcTime, requestID, clientKey);

_faceTagr.init(apiURL, clientID, externalID, hash, utcTime, requestID);

🔒 Best practice: Generate the hash server-side (so the clientKey never sits inside the app).
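One way to follow this practice is to have your own backend compute the hash and return it to the app. The sketch below assumes a hypothetical /facetagr/sign endpoint on your server; the endpoint name, payload, and response shape are illustrative, not part of the FaceTagr API.

```dart
// Sketch only: your server holds clientKey and returns the SHA-512 hash,
// so the key never ships inside the app. Requires the dio package in your
// app's pubspec.yaml.
import 'package:dio/dio.dart';

Future<String> fetchRequestHash({
  required String signingBase, // your backend's base URL (hypothetical)
  required String clientID,
  required String utcTime,
  required String requestID,
}) async {
  final response = await Dio().post(
    '$signingBase/facetagr/sign', // hypothetical signing endpoint
    data: {
      'clientID': clientID,
      'utcTime': utcTime,
      'requestID': requestID,
    },
  );
  // Server computes sha512(clientID + utcTime + requestID + clientKey).
  return response.data['hash'] as String;
}
```

The returned hash is then passed to init(...) exactly as in the example flow above.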

📷 Open FaceTagr Camera #

Navigator.of(context).push(
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
).then((_) => mounted ? setState(() => _isProcessing = false) : null);

This launches the built-in FaceTagrLivePreview widget with face recognition.

🔐 Logout #

await _faceTagr.fnLogout();

This clears local tokens and resets the session.
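As a sketch of how this might be wired into a screen (the button, navigation, and error handling here are illustrative; only fnLogout() comes from the package):

```dart
// Inside a State class that holds a Facetagr instance (_faceTagr).
IconButton(
  icon: const Icon(Icons.logout),
  onPressed: () async {
    try {
      await _faceTagr.fnLogout(); // clears local tokens, resets the session
      if (!mounted) return;
      Navigator.of(context).popUntil((route) => route.isFirst);
    } catch (e) {
      if (!mounted) return;
      ScaffoldMessenger.of(context).showSnackBar(
        SnackBar(content: Text('Logout failed: $e')),
      );
    }
  },
)
```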

🎧 Listening to Events #

• Initialization → Facetagr.initStream.listen(...)
• Face Match → Facetagr.faceMatchStream.listen(...)

Events are returned as JSON:


{
  "StatusCode": 1001,
  "StatusMessage": "Success"
}
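Both streams deliver this JSON as a raw string, so a small decoding helper keeps the listeners tidy. The helper below is a sketch, not part of the package API:

```dart
import 'dart:convert';

/// Decoded FaceTagr event (sketch; field names follow the JSON above).
class FaceTagrEvent {
  final int statusCode;
  final String statusMessage;
  const FaceTagrEvent(this.statusCode, this.statusMessage);

  /// Returns null when the payload is not a valid JSON object.
  static FaceTagrEvent? tryParse(String message) {
    try {
      final decoded = jsonDecode(message) as Map<String, dynamic>;
      return FaceTagrEvent(
        int.tryParse(decoded['StatusCode'].toString()) ?? -1,
        decoded['StatusMessage']?.toString() ?? '',
      );
    } catch (_) {
      return null; // not JSON (or not an object); caller can ignore
    }
  }

  bool get isSuccess => statusCode == 1001;
}
```

A listener can then branch cleanly: Facetagr.initStream.listen((m) { final e = FaceTagrEvent.tryParse(m); if (e == null) return; ... }).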

🖼️ Live Preview Widget #

Navigator.push(
  context,
  MaterialPageRoute(builder: (_) => const FaceTagrLivePreview()),
);

The widget provides:

• Front camera stream
• Face bounding box overlays
• Spinner while matching
• Dialogs on success/failure

FaceTagr Camera (FaceTagrLivePreview.dart) #

import 'dart:async';
import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:facetagr/facetagr.dart';
import 'package:wakelock_plus/wakelock_plus.dart';
import 'main.dart'; // provides HomePage used after a successful match; adjust for your app

const bool kDetectorMirrorsFront = true;
const bool kDetectorReturnsPreviewOrientedBoxes = true;
class FaceTagrLivePreview extends StatefulWidget {
  const FaceTagrLivePreview({Key? key}) : super(key: key);

  @override
  State<FaceTagrLivePreview> createState() => _FaceTagrLivePreviewState();
}

class _FaceTagrLivePreviewState extends State<FaceTagrLivePreview> {
  StreamSubscription<String>? _matchSub;
  CameraController? _controller;
  bool _isDetecting = false;
  Rect? _faceBox;
  String _status = "Initializing camera...";
  late bool _isFrontCamera;
  bool _showSpinner = false;
  String _deviceType = "";

  @override
  void initState() {
    super.initState();
    _deviceType = Platform.isIOS ? "ios" : "android";
    WakelockPlus.enable();
    _listenToBroadcast();
    initializeCamera();
  }

  void _listenToBroadcast() {
    _matchSub = Facetagr.faceMatchStream.listen((message) {
      try {
        final decoded = jsonDecode(message);
        final int statusCode = int.tryParse(decoded['StatusCode'].toString()) ?? -1;
        final String statusMessage = decoded['StatusMessage'].toString();
        if (!mounted) return;
        if (statusCode < 5000) {
          _showPopup(statusCode, statusMessage);
        } else {
          setState(() {
            _status = statusMessage;
            _faceBox = null;
          });
        }
      } catch (_) {
        // not JSON; ignore
      }
    });
  }
  void _showPopup(int statusCode, String message) {
    showDialog(
      context: context,
      barrierDismissible: false, // prevent closing by tapping outside
      builder: (BuildContext context) {
        return AlertDialog(
          shape: RoundedRectangleBorder(
            borderRadius: BorderRadius.circular(12),
          ),
          backgroundColor: Colors.white,
          title: const Text(
            "FaceTagr",
            style: TextStyle(color: Colors.blue),
          ),
          content: Text(
            message,
            style: const TextStyle(color: Colors.lightBlue),
          ),
          actions: [
            TextButton(
              child: const Text("OK", style: TextStyle(color: Colors.blue)),
              onPressed: () {
                Navigator.of(context).pop();
                if (statusCode == 1001) {
                  Navigator.of(context).pushAndRemoveUntil(
                    MaterialPageRoute(builder: (_) => const HomePage()),
                        (route) => false,
                  );
                } else {
                  setState(() {
                    _showSpinner = false;
                  });
                  initializeCamera();
                }
              },
            ),
          ],
        );
      },
    );
  }
  Future<void> initializeCamera() async {
    try {
      final cameras = await availableCameras();
      final camera = cameras.firstWhere(
            (c) => c.lensDirection == CameraLensDirection.front,
        orElse: () => cameras.first,
      );
      _isFrontCamera = camera.lensDirection == CameraLensDirection.front;

      _controller = CameraController(
        camera,
        ResolutionPreset.medium,
        imageFormatGroup: ImageFormatGroup.nv21,
        enableAudio: false,
      );

      await _controller!.initialize();
      if (!mounted) return;

      int frameCount = 1;
      const int frameSkip = 5;

      await _controller!.startImageStream((CameraImage image) async {
        if (!mounted) return;
        if (_isDetecting || (frameCount++ % frameSkip != 0)) return;

        _isDetecting = true;
        try {
          final int width = image.width;
          final int height = image.height;
          Map<String, dynamic>? result;
          if (_deviceType == "android") {
            final yuv = _concatenatePlanes(image.planes);
            result = await Facetagr.detectFace(yuv, width, height, 8);
          } else if (_deviceType == "ios") {
            final yuv = _bgraToYUV420(image); // convert BGRA to YUV420
            result = await Facetagr.detectFace(yuv, width, height, 1);
          }

          if (!mounted) return;

          if (result != null && result["status"] != null) {
            final int status = result["status"];
            final String msg = (result["message"] ?? "").toString();
            final double left = (result['x1'] ?? 0).toDouble();
            final double top = (result['y1'] ?? 0).toDouble();
            final double w = (result['width'] ?? 0).toDouble();
            final double h = (result['height'] ?? 0).toDouble();
           
            if (status == 1001 || status == 1002) {
              setState(() {
                _showSpinner = true;
                _status = ""; // clear message text while matching
                _faceBox = Rect.fromLTWH(left, top, w, h);
              });
              await _controller?.stopImageStream();
            } else if (status == 1000) {
              setState(() {
                _status = msg;
                _faceBox = Rect.fromLTWH(left, top, w, h);
              });
            } else {
              setState(() {
                _faceBox = null;
                _status = msg;
              });
            }
          } else {
            setState(() {
              _status = "Error";
              _faceBox = null;
            });
          }
        } catch (e) {
          if (mounted) {
            setState(() {
              _status = "Error: $e";
              _faceBox = null;
            });
          }
        } finally {
          _isDetecting = false;
        }
      });
    } catch (e) {
      if (!mounted) return;
      setState(() => _status = "Camera error: $e");
    }
  }

  // Concatenates the camera plugin's NV21 planes into a single byte buffer.
  Uint8List _concatenatePlanes(List<Plane> planes) {
    final WriteBuffer allBytes = WriteBuffer();
    for (Plane plane in planes) {
      allBytes.putUint8List(plane.bytes);
    }
    return allBytes.done().buffer.asUint8List();
  }

  Uint8List _bgraToYUV420(CameraImage image) {
    final int width = image.width;
    final int height = image.height;
    final int frameSize = width * height;
    final Uint8List yuv = Uint8List(frameSize + (frameSize ~/ 2));
    final Uint8List bgra = image.planes[0].bytes;

    int yIndex = 0;
    int uvIndex = frameSize;

    for (int j = 0; j < height; j++) {
      for (int i = 0; i < width; i++) {
        final int index = (j * width + i) * 4;

        final int b = bgra[index];
        final int g = bgra[index + 1];
        final int r = bgra[index + 2];

        final int y = ((((66 * r + 129 * g + 25 * b + 128) >> 8) + 16).clamp(0, 255)).toInt();
        final int u = ((((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128).clamp(0, 255)).toInt();
        final int v = ((((112 * r - 94 * g - 18 * b + 128) >> 8) + 128).clamp(0, 255)).toInt();

        yuv[yIndex++] = y;

        if (j % 2 == 0 && i % 2 == 0) {
          yuv[uvIndex++] = v;
          yuv[uvIndex++] = u;
        }
      }
    }

    return yuv;
  }

  @override
  void dispose() {
    WakelockPlus.disable();
    _controller?.dispose();
    _matchSub?.cancel();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    final controller = _controller;
    if (controller == null || !controller.value.isInitialized) {
      return const Scaffold(body: Center(child: CircularProgressIndicator()));
    }

    final bool previewMirrored = _isFrontCamera;
    final Widget basePreview = CameraPreview(controller);
    
    final Widget previewWidget = previewMirrored
        ? Transform(
      alignment: Alignment.center,
      transform: Matrix4.identity()..rotateY(3.1415926535),
      child: basePreview,
    )
        : basePreview;

    final Size sensorSize = controller.value.previewSize!;
    final Size screenSize = MediaQuery.of(context).size;

    final bool isPortrait = screenSize.height > screenSize.width;
    final Size orientedSensor =
    isPortrait ? Size(sensorSize.height, sensorSize.width) : sensorSize;

    final int overlayRotation = kDetectorReturnsPreviewOrientedBoxes
        ? 0
        : controller.description.sensorOrientation;

    final Size imageSpaceSize = kDetectorReturnsPreviewOrientedBoxes
        ? orientedSensor
        : Size(sensorSize.width, sensorSize.height);
    
    final bool overlayMirror = _deviceType == "ios"
        ? true
        : previewMirrored ^ kDetectorMirrorsFront;

    return Scaffold(
      appBar: AppBar(title: const Text('FaceTagr')),
      body: Stack(
        children: [
          Positioned.fill(
            child: FittedBox(
              fit: BoxFit.cover,
              child: SizedBox(
                width: orientedSensor.width,
                height: orientedSensor.height,
                child: Stack(
                  fit: StackFit.passthrough,
                  children: [
                    previewWidget,

                    if (_faceBox != null)
                      CustomPaint(
                        size: orientedSensor,
                        painter: FaceBoxPainter(
                          faceBoxImageSpace: _faceBox!,
                          imageSize: imageSpaceSize,
                          mirrorHorizontally: overlayMirror,
                          rotationDegrees: overlayRotation,
                          label: "",
                        ),
                      ),
                  ],
                ),
              ),
            ),
          ),
          
          if (_showSpinner)
            Positioned.fill(
              child: Container(
                color: Colors.blue,
                child: const Center(
                  child: CircularProgressIndicator(),
                ),
              ),
            ),
          
          if (_status != "")
            Positioned(
              left: 16,
              right: 16,
              bottom: 16,
              child: Container(
                padding: const EdgeInsets.symmetric(horizontal: 12, vertical: 8),
                decoration: BoxDecoration(
                  color: Colors.white,
                  borderRadius: BorderRadius.circular(8),
                ),
                child: Text(_status, style: const TextStyle(color: Colors.blue)),
              ),
            ),
        ],
      ),
    );
  }
}

class FaceBoxPainter extends CustomPainter {
  final Rect faceBoxImageSpace; 
  final Size imageSize; 
  final bool mirrorHorizontally;
  final int rotationDegrees;
  final String? label;

  FaceBoxPainter({
    required this.faceBoxImageSpace,
    required this.imageSize,
    required this.mirrorHorizontally,
    required this.rotationDegrees,
    this.label,
  });

  @override
  void paint(Canvas canvas, Size size) {
    final _Rotated r = _rotateRect(faceBoxImageSpace, imageSize, rotationDegrees);
    final double sx = size.width / r.rotatedImageSize.width;
    final double sy = size.height / r.rotatedImageSize.height;

    Rect box = Rect.fromLTWH(
      r.rect.left * sx,
      r.rect.top * sy,
      r.rect.width * sx,
      r.rect.height * sy,
    );

    if (mirrorHorizontally) {
      box = Rect.fromLTWH(size.width - (box.left + box.width), box.top, box.width, box.height);
    }

    _drawCornerTicks(canvas, box, color: Colors.green, length: 28, thickness: 4);
    
    if ((label ?? '').isNotEmpty) {
      _drawLabel(canvas, size, box, label!);
    }
  }

  void _drawCornerTicks(Canvas canvas, Rect box,
      {required Color color, double length = 22, double thickness = 3}) {
    final Paint p = Paint()
      ..color = color
      ..strokeWidth = thickness
      ..strokeCap = StrokeCap.round;

    final tl = box.topLeft;
    final tr = box.topRight;
    final bl = box.bottomLeft;
    final br = box.bottomRight;
    
    canvas.drawLine(tl, tl + Offset(length, 0), p);
    canvas.drawLine(tl, tl + Offset(0, length), p);
    canvas.drawLine(tr, tr + Offset(-length, 0), p);
    canvas.drawLine(tr, tr + Offset(0, length), p);
    canvas.drawLine(bl, bl + Offset(length, 0), p);
    canvas.drawLine(bl, bl + Offset(0, -length), p);
    canvas.drawLine(br, br + Offset(-length, 0), p);
    canvas.drawLine(br, br + Offset(0, -length), p);
  }

  void _drawLabel(Canvas canvas, Size screenSize, Rect box, String text) {
    final TextPainter tp = TextPainter(
      text: TextSpan(
        text: text,
        style: const TextStyle(
          color: Colors.white,
          fontSize: 14,
          fontWeight: FontWeight.bold,
        ),
      ),
      textDirection: TextDirection.ltr,
    )..layout(maxWidth: screenSize.width * 0.8);
    
    Offset textOffset = Offset(box.left, box.top - tp.height - 6);
    if (textOffset.dy < 0) {
      textOffset = Offset(box.left, box.bottom + 6);
    }

    tp.paint(canvas, textOffset);
  }

  @override
  bool shouldRepaint(covariant FaceBoxPainter old) =>
      old.faceBoxImageSpace != faceBoxImageSpace ||
          old.imageSize != imageSize ||
          old.mirrorHorizontally != mirrorHorizontally ||
          old.rotationDegrees != rotationDegrees ||
          old.label != label;
}

class _Rotated {
  final Rect rect;
  final Size rotatedImageSize;
  _Rotated(this.rect, this.rotatedImageSize);
}

_Rotated _rotateRect(Rect r, Size img, int deg) {
  switch (deg % 360) {
    case 0:
      return _Rotated(r, img);
    case 90:
      return _Rotated(
        Rect.fromLTWH(
          img.height - (r.top + r.height),
          r.left,
          r.height,
          r.width,
        ),
        Size(img.height, img.width),
      );
    case 180:
      return _Rotated(
        Rect.fromLTWH(
          img.width - (r.left + r.width),
          img.height - (r.top + r.height),
          r.width,
          r.height,
        ),
        img,
      );
    case 270:
      return _Rotated(
        Rect.fromLTWH(
          r.top,
          img.width - (r.left + r.width),
          r.height,
          r.width,
        ),
        Size(img.height, img.width),
      );
    default:
      return _Rotated(r, img);
  }
}

🔄 Flow Diagram #

sequenceDiagram
    participant App
    participant SDK as FaceTagr SDK
    participant API as Backend API

    App->>SDK: init(apiURL, clientID, externalID, hash, time, reqID)
    SDK->>API: Validate credentials
    API-->>SDK: Auth success
    SDK-->>App: Init success (1001)

    App->>SDK: Open Camera
    SDK->>API: Stream frames
    API-->>SDK: Match success (1001)
    SDK-->>App: FaceMatch event

✅ Quick Recap #

  1. Add package to pubspec.yaml.
  2. Generate SHA-512 hash.
  3. Call init() with credentials.
  4. Open camera (FaceTagrLivePreview).
  5. Listen for initStream and faceMatchStream.

License #

FaceTagr
