
Developers from the Swift for TensorFlow project propose adding first-class differentiable programming to Swift

  • 5 min read
  • 09 Sep 2019


After more than a year and a half of work, Richard Wei, a developer at Google Brain, and his team submitted the Differentiable Programming Mega-Proposal on the Swift Evolution forum on Thursday last week.

This proposal aims to “push Swift's capabilities to the next level in numerics and machine learning” by introducing differentiable programming as a new language feature. It is part of the Swift for TensorFlow project, under which the team is integrating TensorFlow directly into the language to offer developers a next-generation platform for machine learning.

What is differentiable programming?


With the increasing sophistication of deep learning models and the introduction of modern deep learning frameworks, many researchers have started to realize that building neural networks is very similar to programming.

Yann LeCun, VP and Chief AI Scientist at Facebook, calls differentiable programming “a little more than a rebranding of the modern collection of Deep Learning techniques, the same way Deep Learning was a rebranding of the modern incarnations of neural nets with more than two layers.” He compares it to regular programming, with the only difference being that the resulting programs are “parameterized, automatically differentiated, and trainable/optimizable.”

Many also say that differentiable programming is a different name for automatic differentiation, a collection of techniques to numerically evaluate the derivative of a function. It can be seen as a new programming paradigm in which programs can be differentiated throughout.
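
To make the idea concrete, here is a minimal sketch of forward-mode automatic differentiation in Swift using dual numbers. The Dual type and the function f below are purely illustrative and are not part of the proposal.

    // A minimal dual-number type: `value` carries f(x), `derivative` carries f'(x).
    struct Dual {
        var value: Double
        var derivative: Double

        static func + (lhs: Dual, rhs: Dual) -> Dual {
            return Dual(value: lhs.value + rhs.value,
                        derivative: lhs.derivative + rhs.derivative)
        }

        static func * (lhs: Dual, rhs: Dual) -> Dual {
            // Product rule: (uv)' = u'v + uv'
            return Dual(value: lhs.value * rhs.value,
                        derivative: lhs.derivative * rhs.value + lhs.value * rhs.derivative)
        }
    }

    // f(x) = x * x + x, written once; its derivative comes along for free.
    func f(_ x: Dual) -> Dual {
        return x * x + x
    }

    // Seeding the derivative with 1 differentiates with respect to x.
    let result = f(Dual(value: 3.0, derivative: 1.0))
    print(result.value)       // 12.0, since f(3) = 9 + 3
    print(result.derivative)  // 7.0, since f'(x) = 2x + 1

Differentiable programming, as proposed, pushes this bookkeeping into the language itself, so that ordinary Swift functions can be differentiated without wrapping every value in a custom type.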

Check out the paper “Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator” to get a better understanding of differentiable programming.

Why differentiable programming is proposed in Swift


Swift is an expressive, high-performance language, which makes it a perfect candidate for numerical applications. According to the proposal authors, first-class support for differentiable programming in Swift will allow safe and powerful machine learning development.

The authors also believe that this is a “big step towards high-level numerical computing support.” With this proposal, they aim to make Swift a “real contender in the numerical computing and machine learning landscape.”

Here are some of the advantages of adding first-class support for differentiable programming in Swift:

  • Better language coverage: First-class differentiable programming support will enable differentiation to work smoothly with other Swift features. This will allow developers to code normally without being restricted to a subset of Swift.
  • Enable extensibility: This will provide developers an “extensible differentiable programming system.” They will be able to create custom differentiation APIs by leveraging primitive operators defined in the standard library and supported by the type system.
  • Static warnings and errors: The compiler will be able to statically identify functions that cannot be differentiated, or that will always give a zero derivative, and raise a non-differentiability error or warning. This improves productivity by surfacing, at compile time, mistakes that would otherwise show up as runtime errors buried behind library boundaries (see the sketch after this list).
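
As a rough illustration of that last point, a non-differentiable operation could be rejected when the code is compiled rather than when a model is trained. The snippet below is a hypothetical sketch; the exact diagnostics and rules are defined by the proposal, not by this example.

    // Differentiable: composed only of differentiable operations on Float.
    @differentiable
    func square(_ x: Float) -> Float {
        return x * x
    }

    // Not differentiable: converting through Int discards derivative information,
    // so the compiler could flag it statically, e.g.
    // error: function is not differentiable (illustrative wording)
    @differentiable
    func truncate(_ x: Float) -> Float {
        return Float(Int(x))
    }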


Some of the components that will be added in Swift under this proposal are:

  • The Differentiable protocol: This is a standard library protocol that will generalize all data structures that can be a parameter or result of a differentiable function.
  • The @differentiable declaration attribute: This will be used to mark all the function-like declarations as differentiable.
  • @differentiable function types: These are a subtype of normal function types, with a different runtime representation and calling convention. Differentiable function types have differentiable parameters and results.
  • Differential operators: These are the core differentiation APIs that take ‘@differentiable’ functions as inputs and return derivative functions or compute derivative values.
  • @differentiating and @transposing attributes: These attributes are for declaring custom derivative functions for some other function declaration.
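
Put together, these pieces would let developers write code along the following lines. This is a sketch based on the API shapes described in the proposal; the module name and the gradient(at:) operator are assumptions and may change as the proposal evolves.

    import _Differentiation  // module name assumed; the final spelling may differ

    // A model whose stored properties can all receive derivatives,
    // via the proposed Differentiable protocol.
    struct Perceptron: Differentiable {
        var weight: Float = 1.0
        var bias: Float = 0.0

        // Function-like declarations are marked with @differentiable.
        @differentiable
        func applied(to x: Float) -> Float {
            return weight * x + bias
        }
    }

    // A differential operator takes a @differentiable closure and
    // returns derivatives with respect to the model's parameters.
    let model = Perceptron()
    let grads = gradient(at: model) { m in m.applied(to: 2.0) }
    print(grads.weight, grads.bias)  // 2.0 1.0, i.e. d/dweight = x and d/dbias = 1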


This proposal sparked a discussion on Hacker News. Many developers were excited about bringing differentiable programming support into the Swift core. A user commented, “This is actually huge. I saw a proof of concept of something like this in Haskell a few years back, but it's amazing to see it (probably) making it into the core of a mainstream language. This may let them capture a large chunk of the ML market from Python - and hopefully, greatly improve ML APIs while they're at it.”

Some felt that a library could have served the purpose. “I don't see why a well-written library could not serve the same purpose. It seems like a lot of cruft. I doubt, for example, Python would ever consider adding this and it's the de facto language that would benefit the most from something like this - due to the existing tools and communities.

“It just seems so narrow and not at the same level of abstraction that languages typically sit at. I could see the language supporting higher-level functionality so a library could do this without a bunch of extra work (such as by some reflection),” a user added.

Users also discussed another effort along the same lines as this project: Julia’s Zygote, a working prototype for source-to-source automatic differentiation. A user commented, “Yup, work is continuing apace with Julia’s next-gen Zygote project. Also, from the GP’s thought about applications beyond DL, my favorite examples so far are for model-based RL and Neural ODEs.”

To know more in detail, check out the proposal: Differentiable Programming Mega-Proposal.

Other news in programming


Why Perl 6 is considering a name change?

The Julia team shares its finalized release process with the community

TypeScript 3.6 releases with stricter generators, new functions in TypeScript playground, better Unicode support for identifiers, and more