Detecting with a local ONNX model deployed on the device
In the previous recipes, we worked with models running in the cloud or on a server. But what if your device isn't always connected to the internet, or you need faster response times? You can run AI models directly on the device, even on a mobile phone, and you don't need to be a machine learning expert to do it, because pre-trained models are readily available. With ONNX models and ML.NET, you can do all of this locally.
ONNX (Open Neural Network Exchange) is an open-source format designed to represent machine learning models and make them interoperable across frameworks. In practice, this means that a model created and trained in a framework such as TensorFlow or PyTorch can be converted to the ONNX format. ONNX models are executed by ONNX Runtime, which provides libraries for .NET.
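To make this concrete, here is a minimal sketch of loading and running an ONNX model directly from .NET with the Microsoft.ML.OnnxRuntime package. The model file name, the input name "input", and the tensor shape are placeholders; your model will define its own input and output names and shapes.

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Load the model from local storage and create an inference session.
// "model.onnx" and the input name "input" are placeholders.
using var session = new InferenceSession("model.onnx");

// Build a dummy input tensor with the shape the model expects,
// for example a single 224x224 RGB image.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });

var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input", input)
};

// Inference runs entirely on the device; no network call is involved.
using var results = session.Run(inputs);
float[] scores = results.First().AsEnumerable<float>().ToArray();
Console.WriteLine($"Model returned {scores.Length} values.");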
Another important part of the system is ML.NET. This is a machine learning framework for .NET that can, among other things, load an ONNX model and score it as part of a regular pipeline.
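To give a flavor of how ML.NET ties into this, the following is a minimal, hypothetical sketch of scoring a local ONNX model with the Microsoft.ML and Microsoft.ML.OnnxTransformer packages. The file name, column names, and tensor shape are assumptions and must match whatever your model actually exposes.

using System.Collections.Generic;
using Microsoft.ML;
using Microsoft.ML.Data;

var mlContext = new MLContext();

// The ONNX model is applied as a transform in an ML.NET pipeline.
// "model.onnx", "input", and "output" are placeholders for your model.
var pipeline = mlContext.Transforms.ApplyOnnxModel(
    outputColumnNames: new[] { "output" },
    inputColumnNames: new[] { "input" },
    modelFile: "model.onnx");

// Fit on an empty data view, then create a prediction engine
// for one-at-a-time, on-device scoring.
var emptyData = mlContext.Data.LoadFromEnumerable(new List<OnnxInput>());
var model = pipeline.Fit(emptyData);
var engine = mlContext.Model.CreatePredictionEngine<OnnxInput, OnnxOutput>(model);

// Hypothetical input/output schemas; the column names and shape
// must match the ONNX model's actual input and output nodes.
public class OnnxInput
{
    [VectorType(1, 3, 224, 224)]
    [ColumnName("input")]
    public float[] Input { get; set; }
}

public class OnnxOutput
{
    [ColumnName("output")]
    public float[] Scores { get; set; }
}

Note that Fit doesn't train anything here; the ONNX model is already trained, so fitting simply wires the transform into the pipeline so predictions can be made locally.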