Google TensorFlow: Updates & Lessons

TensorFlow came out today, and like the rest of the ML world, I buried myself in it. I have never been more excited about a new open-source release. There are actionable tutorials on the home page worth checking out, but I wanted to know if this was yet another computational graph framework — we already have Theano and CGT (CGT is fast; Theano is the most popular).

There’s a detailed white paper outlining TensorFlow, which I highly recommend reading, but here is a summary of the items that caught my eye:

  1. Apache 2.0 license. Well done Google!
  2. Google-quality code, and reasonably good docs for day 0
  3. Out of the box multiple device and distributed execution
  4. Fancy placement algorithms for multi-node scheduling
  5. Autograd (as in Theano and Torch), but possibly faster (I have yet to profile that)
  6. Fault tolerance & checkpointing (ability to resume interrupted execution)
  7. Finer grained control on concurrency
  8. Support for multiple devices (mobile to GPU arrays) and multiple language interop
  9. Fancypants optimizations of the computational graph

    Together, these efforts produced a 6-fold speed improvement in training time over DistBelief

  10. TensorBoard, a tool to visualize computation graphs and monitor network parameters during training. This is indispensable. In most of my DL work for clients, I have spent a lot of time instrumenting existing code to get monitoring stats. This by itself feels like a major boon.
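Item 5 (autograd) is the core trick shared by all of these graph frameworks: build a graph of operations, then walk it backwards to get gradients. As a rough illustration of the idea — a toy sketch of reverse-mode automatic differentiation in plain Python, not TensorFlow's actual implementation — consider:

```python
class Node:
    """A node in a computational graph: a value plus links to its inputs."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_node, local_gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

def backward(output):
    """Accumulate gradients of `output` w.r.t. every node in the graph."""
    # Post-order DFS gives a topological order of the graph.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)
    # Sweep in reverse, applying the chain rule at each node.
    output.grad = 1.0
    for node in reversed(order):
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad

# z = x*y + x at x=2, y=3:  dz/dx = y + 1 = 4,  dz/dy = x = 2
x, y = Node(2.0), Node(3.0)
z = x * y + x
backward(z)
print(x.grad, y.grad)  # 4.0 2.0
```

Real frameworks add tensors, kernels, and graph optimizations on top, but the reverse sweep over a recorded graph is the same idea.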

This is a lot for a single release. If you were using Theano, upgrading to TensorFlow should feel like moving from a Honda Civic to a Ferrari. What excites me most is that all this work improves not only DL but also most other ML algorithms that can be implemented atop TensorFlow. I’m getting back to playing with mine.

PS: Seriously, read the white paper. It’s a great read in its own right, and it also offers solid advice on porting ML platforms.

6 Responses to Google TensorFlow: Updates & Lessons

  1. knowlengr November 10, 2015 at 7:22 pm #

    Thx for the curation

  2. rrtucci November 11, 2015 at 3:49 pm #

    What is Microsoft using instead of TensorFlow? Is it, or something else? Whatever it is, do you think MS will make a response in kind by open sourcing some of their Bing AI code?

    • Delip Rao November 27, 2015 at 8:35 pm #

      DMTK from MSR was open-sourced recently.

      • rrtucci November 27, 2015 at 9:46 pm #

        Thanks Delip.



  1. Simple end-to-end TensorFlow examples | Bcomposes - November 26, 2015

    […] TensorFlow to much general excitement. So, I figured I’d give it a go, especially given Delip Rao’s enthusiasm for it—he even compared the move from Theano to TensorFlow feeling like changing from “a […]

  2. TensorFlow and Monetizing Intellectual Property | ASKX Blog and News Service - January 16, 2016

    […] qualified to judge the technical worth of TensorFlow, but I feel pretty safe in assuming that it is excellent and likely far beyond what any other company could produce. Machine learning, though, is about a […]


© 2016 Delip Rao. All Rights Reserved.