Data forms an integral part of every business and organization, driving valuable decisions as circumstances change. Natural Language Processing (NLP) is a widely adopted set of techniques that lets machines understand and communicate with humans in natural language. This enables people to access, analyze, and extract information more intelligently from huge amounts of unstructured data.
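To make the idea of "extracting structured data from unstructured text" concrete, here is a minimal, illustrative sketch in plain Python. It uses only regular expressions (not NLP Architect's API, and far simpler than the learned models the library provides), but the goal is the same: turning raw text into structured, queryable data. The function name and patterns are hypothetical.

```python
import re

def extract_entities(text):
    """Toy information extraction: pull email addresses and ISO dates
    out of unstructured text using regular expressions. Real NLP
    systems replace these hand-written patterns with learned models,
    but the output shape -- structured fields -- is the same."""
    return {
        "emails": re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text),
        "dates": re.findall(r"\d{4}-\d{2}-\d{2}", text),
    }

report = "Contact sales@example.com before 2018-06-30 for the Q2 numbers."
print(extract_entities(report))
```

Deep learning approaches generalize this pattern matching: instead of brittle regexes, a trained model recognizes entities it has never seen verbatim.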
Intel AI Lab’s team of NLP researchers and developers has introduced NLP Architect, a new open-source Python library. It serves as a platform for future research and for developing state-of-the-art deep learning techniques for natural language processing and natural language understanding.
Rapid recent advances in deep learning and neural network paradigms have driven growth in the NLP domain. The new library offers flexibility in implementing NLP solutions, packaging the past and ongoing NLP research and development work of Intel AI Lab.
The current version of NLP Architect offers noteworthy features that form a backbone for both research and practical development. Each of the following models is provided with the required training and inference processes:
[Image: list of supported models — source: Intel AI Blog]
This library of NLP components provides the functionality needed to extend NLP solutions to a broad range of audiences. It also serves as an excellent medium for analyzing and optimizing Intel software and hardware on NLP workloads.
In addition to these models, the library includes new features that are commonly needed when deploying models: data pipelines, common functional calls, and other utilities related to the NLP domain.
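A "data pipeline" in this sense is a chain of preprocessing steps applied to text before it reaches a model. The sketch below is a generic illustration in plain Python, not NLP Architect's actual API; all function names here are hypothetical.

```python
from functools import reduce

# Each step takes a list of tokens and returns a transformed list.
def lowercase(tokens):
    return [t.lower() for t in tokens]

def strip_punct(tokens):
    return [t.strip(".,!?") for t in tokens]

def drop_stopwords(tokens, stop=frozenset({"the", "a", "an", "of"})):
    return [t for t in tokens if t not in stop]

def make_pipeline(*steps):
    """Chain preprocessing steps into a single callable that
    tokenizes raw text and threads it through each step in order."""
    return lambda text: reduce(lambda data, step: step(data), steps, text.split())

preprocess = make_pipeline(lowercase, strip_punct, drop_stopwords)
print(preprocess("The State of the Art in NLP!"))
```

Packaging such steps as reusable components is what lets a library share one preprocessing path across many models.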
To learn more about the updates, refer to the official Intel AI blog.
This repository supports several open-source deep learning frameworks, such as:
Note: The list of models is expected to grow in the future. All of these models run with Python 3.5+.
If you want to download the open-source Python library, or to contribute to the project by providing valuable feedback, get the code from GitHub. Complete documentation for all core modules, with end-to-end examples, can be found on the official documentation page.