
F8 PyTorch announcements: PyTorch 1.1 releases with new AI tools, open sourcing BoTorch and Ax, and more

  • 4 min read
  • 03 May 2019


Despite Facebook’s frequent appearances in the news for all the wrong reasons, we cannot deny that its open source contributions to AI have been one of its redeeming qualities. At F8, its annual developer conference, Facebook showcased its AI prowess, sharing how the production-ready PyTorch 1.0 is being adopted by the community and announcing the release of PyTorch 1.1.

Facebook introduced PyTorch in 2017, and it has since been well received by developers. Facebook partnered with the AI community for PyTorch’s further development and released the stable version last December. Along with optimizing and fixing other parts of PyTorch, the team introduced just-in-time (JIT) compilation for production support, which allows seamless transitions between eager mode and graph mode.

PyTorch 1.0 in leading businesses, communities, and universities


Facebook is leveraging the end-to-end workflows of PyTorch 1.0 to build and deploy translation and NLP at large scale. These NLP systems deliver a staggering 6 billion translations for applications such as Messenger. PyTorch has also enabled Facebook to iterate quickly on its ML systems, helping accelerate its research-to-production cycle.

Other leading organizations and businesses are also now using PyTorch for speeding up the development of AI features. Airbnb’s Smart Reply feature is backed by PyTorch libraries and APIs for conversational AI. ATOM (Accelerating Therapeutics for Opportunities in Medicine) has come up with a variational autoencoder that represents diverse chemical structures and designs new drug candidates. Microsoft has built large-scale distributed language models that are now in production in offerings such as Cognitive Services.

PyTorch 1.1 releases with new model understanding and visualization tools


Along with showcasing how the production-ready version is being adopted by the community, the PyTorch team announced the release of PyTorch 1.1. This release focuses on improved performance and brings new model understanding and visualization tools for better usability, among other changes.

Following are some of the key features PyTorch 1.1 comes with:

  • Support for TensorBoard: TensorBoard, a suite of visualization tools, is now natively supported in PyTorch. You can use it via “from torch.utils.tensorboard import SummaryWriter” (a short sketch follows this list).
  • Improved JIT compiler: Along with some bug fixes, the team has expanded TorchScript’s capabilities with support for dictionaries, user classes, and attributes (a brief sketch also follows this list).
  • New APIs: This release introduces new APIs to support Boolean tensors and custom recurrent neural networks.
  • Distributed training: This release brings improved performance for common models such as CNNs, adds support for multi-device modules, and adds the ability to split a model across GPUs while still using Distributed Data Parallel.
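
To make the TensorBoard integration concrete, here is a minimal sketch of logging a scalar with the new SummaryWriter; the log directory, tag name, and the random value standing in for a real training loss are all illustrative.

```python
import torch
from torch.utils.tensorboard import SummaryWriter

# Writes event files that TensorBoard can read (run: tensorboard --logdir runs)
writer = SummaryWriter(log_dir="runs/pytorch_1_1_demo")

for step in range(100):
    loss = torch.rand(1).item()  # placeholder for a real training loss
    writer.add_scalar("train/loss", loss, global_step=step)

writer.close()
```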

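And to give a rough idea of the expanded TorchScript support, the sketch below scripts a small function that builds and returns a dictionary of tensors; the function name and the statistics it computes are just examples, and the annotation style follows current PyTorch releases.

```python
import torch
from typing import Dict

@torch.jit.script
def tensor_stats(x: torch.Tensor) -> Dict[str, torch.Tensor]:
    # TorchScript can now construct and return dictionaries
    return {"mean": x.mean(), "std": x.std()}

print(tensor_stats(torch.randn(4, 4)))
```
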
Ax, BoTorch, and more: Open source tools for Machine Learning engineers


Facebook announced that it is open sourcing two new tools, Ax and BoTorch, aimed at solving large-scale exploration problems in both research and production environments. Built on top of PyTorch, BoTorch leverages features such as auto-differentiation, massive parallelism, and deep learning to support research in Bayesian optimization. Ax is a general-purpose ML platform for managing adaptive experiments. Both Ax and BoTorch use probabilistic models that efficiently use data and meaningfully quantify the costs and benefits of exploring new regions of the problem space.
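
To give a flavor of what managing an adaptive experiment with Ax looks like, here is a minimal sketch modeled on Ax’s quickstart-style optimize loop; the parameter names, bounds, and toy objective are purely illustrative, and the snippet assumes Ax is installed (pip install ax-platform).

```python
from ax import optimize

# Toy objective: Ax adaptively proposes (x1, x2) points and uses the
# returned values to decide where to evaluate next.
def booth(parameters):
    x1, x2 = parameters["x1"], parameters["x2"]
    return (x1 + 2 * x2 - 7) ** 2 + (2 * x1 + x2 - 5) ** 2

best_parameters, best_values, experiment, model = optimize(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    evaluation_function=booth,
    minimize=True,
    total_trials=20,
)
print(best_parameters)
```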

Facebook has also open sourced PyTorch-BigGraph (PBG), a tool that makes it easier and faster to produce graph embeddings for extremely large graphs with billions of entities and trillions of edges. PBG comes with support for sharding and negative sampling and also offers sample use cases based on Wikidata embedding.

As a result of its collaboration with Google, AI Platform Notebooks, a new hosted JupyterLab service from Google Cloud Platform, now comes preinstalled with PyTorch. It also integrates with other GCP services such as BigQuery, Cloud Dataproc, Cloud Dataflow, and AI Factory.

The broader PyTorch community has also come up with some impressive open source tools. BigGAN-Torch is a full PyTorch reimplementation of BigGAN that uses gradient accumulation to provide the benefits of big batches while using only a few GPUs. GeomLoss is a Python API that defines PyTorch layers for geometric loss functions between sampled measures, images, and volumes, with efficient GPU implementations for kernel norms, Hausdorff divergences, and unbiased Sinkhorn divergences. PyTorch Geometric is a geometric deep learning extension library for PyTorch consisting of various methods for deep learning on graphs and other irregular structures.
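
As a rough sketch of what PyTorch Geometric code looks like, the snippet below builds a tiny three-node graph and runs it through a single graph convolution layer; the graph, feature sizes, and layer choice are arbitrary and assume torch_geometric is installed.

```python
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# A tiny undirected graph with 3 nodes and edges 0-1 and 1-2
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 16)                  # 3 nodes, 16 input features each
data = Data(x=x, edge_index=edge_index)

conv = GCNConv(in_channels=16, out_channels=32)
out = conv(data.x, data.edge_index)     # -> tensor of shape [3, 32]
print(out.shape)
```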

Read the official announcement on Facebook’s AI blog.

Facebook open-sources F14 algorithm for faster and memory-efficient hash tables

“Is it actually possible to have a free and fair election ever again?,” Pulitzer finalist, Carole Cadwalladr on Facebook’s role in Brexit

F8 Developer Conference Highlights: Redesigned FB5 app, Messenger update, new Oculus Quest and Rift S, Instagram shops, and more