This project simplifies fish measurement and monitoring during catch-and-release operations.
We’re developing an iOS app that uses the device’s camera and LiDAR to detect fish, identify
head and tail positions, and calculate length automatically. With machine learning, the app
will also identify species and streamline data collection for marine or citizen scientists.
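Once the head and tail keypoints are detected and lifted into 3D through the LiDAR depth map, the length reduces to the Euclidean distance between the two points. Below is a minimal sketch of that core computation, assuming the detector already yields world-space positions in meters (all names here are illustrative, not taken from the repository):

```swift
import simd

/// Estimated fish length (meters), given the 3D positions of the detected
/// head and tail keypoints. The positions are assumed to come from
/// unprojecting the 2D detections through the LiDAR depth map.
func fishLength(head: simd_float3, tail: simd_float3) -> Float {
    simd_distance(head, tail)
}

// Example: keypoints roughly 42 cm apart in world space.
let head = simd_float3(0.10, -0.05, -0.60)
let tail = simd_float3(0.52, -0.03, -0.58)
print(String(format: "Estimated length: %.1f cm",
             fishLength(head: head, tail: tail) * 100))
```

Note that this gives the straight-line distance; a curved fish would need more than two keypoints along the spine to measure accurately.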
We use two repositories; follow the README in each to replicate our setup:

- Swift code for real-time fish detection using the iPhone's camera and LiDAR. See its README for setup and usage (a depth-lookup sketch follows below).
- Rust library for fish length estimation from sensor data. See its README for build and integration steps.
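For the depth lookup mentioned in the Swift item above, here is a hedged sketch of how a 2D keypoint can be combined with ARKit's LiDAR depth: read the frame's sceneDepth map and unproject through the camera intrinsics. This illustrates the general technique rather than the repository's actual code; in production, the resolution mismatch between the depth map and the camera image must be handled, and depth confidence should be checked.

```swift
import ARKit
import simd

/// Look up LiDAR depth (meters) at a pixel of the ARFrame's depth map and
/// unproject that pixel into camera-space coordinates via the intrinsics.
/// Assumes the pixel is already expressed in the depth map's resolution
/// and that the intrinsics have been rescaled to match (not shown here).
func unproject(pixel: simd_float2, in frame: ARFrame) -> simd_float3? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let x = Int(pixel.x), y = Int(pixel.y)
    guard x >= 0, x < width, y >= 0, y < height,
          let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // The LiDAR depth map is 32-bit float, one depth value per pixel.
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    let depth = base.advanced(by: y * rowBytes)
        .assumingMemoryBound(to: Float32.self)[x]

    // Pinhole unprojection with +z forward; flip the z sign for ARKit's
    // camera space, which looks down -z.
    let K = frame.camera.intrinsics
    let fx = K[0][0], fy = K[1][1], cx = K[2][0], cy = K[2][1]
    return simd_float3((pixel.x - cx) * depth / fx,
                       (pixel.y - cy) * depth / fy,
                       depth)
}
```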
Sample screenshot of the app measuring fish length:
Our algorithm has a length-measurement error of less than 10% on iPhone Pro and less than 16% on iPad Pro; for a 50 cm fish, that means within roughly 5 cm and 8 cm respectively.
For more details, refer to the final report below.
We use a classification model based on Fishial, exported from TorchScript to ONNX for efficient iOS deployment via ONNX Runtime.
The model is available on Hugging Face: ONNX Fish Classifier.
This enables real-time, on-device fish species recognition.
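For orientation, a minimal sketch of what on-device inference with the onnxruntime-objc Swift bindings can look like. The tensor names ("input", "logits"), the 1x3x224x224 input shape, and the bundled file name fish_classifier.onnx are assumptions; check the model card on Hugging Face for the actual values.

```swift
import Foundation
import onnxruntime_objc

/// Run the bundled ONNX classifier on a preprocessed NCHW float tensor
/// and return the raw output scores. Names and shapes are assumptions.
func classify(pixels: [Float]) throws -> [Float] {
    let env = try ORTEnv(loggingLevel: .warning)
    guard let modelPath = Bundle.main.path(forResource: "fish_classifier",
                                           ofType: "onnx") else {
        return []  // model not bundled
    }
    let session = try ORTSession(env: env, modelPath: modelPath,
                                 sessionOptions: nil)

    // Wrap the input as a 1 x 3 x 224 x 224 float tensor.
    let data = NSMutableData(bytes: pixels,
                             length: pixels.count * MemoryLayout<Float>.size)
    let input = try ORTValue(tensorData: data, elementType: .float,
                             shape: [1, 3, 224, 224])

    let outputs = try session.run(withInputs: ["input": input],
                                  outputNames: ["logits"],
                                  runOptions: nil)
    guard let logits = outputs["logits"] else { return [] }
    let raw = try logits.tensorData() as Data
    return raw.withUnsafeBytes { Array($0.bindMemory(to: Float.self)) }
}
```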