Snips NLU is a Python library that can be used to easily train models and make predictions on new queries. Snips has also open-sourced Snips NLU-rs, a Rust implementation focused on the prediction (inference) side.
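For illustration, here is a minimal training-and-parsing sketch with the Python library, following its documented usage. The dataset file and example query are placeholders; the Snips NLU documentation describes the exact JSON dataset format.

```python
import io
import json

from snips_nlu import SnipsNLUEngine
from snips_nlu.default_configs import CONFIG_EN

# Load a dataset in the Snips JSON format (intents + entities).
# "dataset.json" is a placeholder path.
with io.open("dataset.json") as f:
    dataset = json.load(f)

# Train the NLU engine on the dataset.
engine = SnipsNLUEngine(config=CONFIG_EN)
engine.fit(dataset)

# Parse a new query: the result contains the detected intent and its slots.
parsing = engine.parse("Turn the lights on in the kitchen")
print(json.dumps(parsing, indent=2))
```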
Snips NLU-rs relies on a traditional flat model, a linear-chain Conditional Random Field (CRF), rather than CNNs or bi-LSTMs. The Snips team has replaced heavy word embeddings with a carefully crafted set of features that capture semantic and structural signals from the sentence. As a result, the Snips NLU-rs inference engine can run practically anywhere, from a $5 Raspberry Pi Zero to an AWS EC2 free-tier instance. The library can be used on most modern architectures: on small connected devices, on mobile, on a desktop, or on a server. It currently handles 5 languages (English, French, German, Spanish and Korean), with more to be added regularly.
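To make the approach concrete, the sketch below shows what a linear-chain CRF slot tagger with handcrafted features can look like in general. It uses sklearn-crfsuite with a toy feature set and toy data; this is an illustration of the technique, not Snips' actual features or implementation.

```python
import sklearn_crfsuite


def token_features(tokens, i):
    """Handcrafted lexical/structural features for one token,
    standing in for word embeddings (illustrative feature set only)."""
    word = tokens[i]
    features = {
        "word.lower": word.lower(),
        "word.istitle": word.istitle(),
        "word.isdigit": word.isdigit(),
        "prefix3": word[:3],
        "suffix3": word[-3:],
    }
    if i > 0:
        features["prev_word.lower"] = tokens[i - 1].lower()
    else:
        features["BOS"] = True  # beginning of sentence
    if i < len(tokens) - 1:
        features["next_word.lower"] = tokens[i + 1].lower()
    else:
        features["EOS"] = True  # end of sentence
    return features


# Tiny toy training set: tokenized queries with BIO slot labels.
sentences = [["Turn", "on", "the", "kitchen", "lights"],
             ["Dim", "the", "bedroom", "lights"]]
labels = [["O", "O", "O", "B-room", "O"],
          ["O", "O", "B-room", "O"]]

X = [[token_features(s, i) for i in range(len(s))] for s in sentences]
y = labels

# A linear-chain CRF learns per-token feature weights plus transition
# weights between adjacent slot labels.
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, y)

test = ["Switch", "off", "the", "garage", "lights"]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))
```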
Unlike other chatbots and voice assistants, which mostly rely on cloud services for their NLU, Snips NLU can run on the edge or on a server. Moreover, the platform is the first ‘Private by Design’ alternative to traditional voice assistants: user data is not touched, processed, or collected, unlike with most voice assistants.
Researchers at Snips compared their NLU engine with other leading voice assistants/chatbots, including API.ai (now DialogFlow, Google), Wit.ai (Facebook), Luis.ai (Microsoft), and Amazon Alexa, by training them all on the same dataset and testing them on the same out-of-sample test set. Experimental results showed that Snips NLU is as accurate as or better than the cloud solutions at slot extraction tasks, regardless of how much training data was used.
If you want to know more, check out the GitHub repository. To start building your own Snips NLU assistant, head over to the Snips console.