Recent Discussion on Next-Generation DL Toolkits

[1] The Next Generation of Machine Learning Tools, Jan. 22, 2020.



In [1], the author first reflects on AlexNet as essentially a modified LeNet with more layers, better weight initialization, different activation functions, and data augmentation.

“In fact, Alex Krizhevsky’s former colleagues recall that many meetings before the competition consisted of Alex describing his progress with the CUDA quirks and features.”

“I believe a significant factor was the emergence of the second generation of general-purpose ML frameworks such as TensorFlow and PyTorch.”

The author then lists several requirements for the next-generation ML toolkits he envisions (the first three are illustrated in the sketch after the list):

  1. Fine-grained use of control flow;
  2. Non-standard optimization loops;
  3. Higher-order differentiation as a first-class citizen;
  4. Probabilistic programming as a first-class citizen;
  5. Support for multiple heterogeneous accelerators in one model;
  6. Seamless scalability from a single machine to gigantic clusters.
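
To make requirements 1-3 concrete, here is a minimal sketch of my own (not from the article), using JAX purely as a stand-in for such a toolkit: data-dependent Python control flow inside a differentiated function, nested `grad` calls for higher-order derivatives, and a hand-rolled optimization loop with an ad-hoc, value-dependent learning-rate rule. The function names and constants are illustrative only.

```python
# Illustrative sketch only; JAX stands in for a toolkit that satisfies
# requirements 1-3 with composable autodiff transformations.
import jax
import jax.numpy as jnp

# Requirement 1: fine-grained, data-dependent Python control flow
# inside a function that is then differentiated.
def piecewise(x):
    if x > 0.0:            # branch depends on the runtime value of x
        return x ** 3
    return jnp.sin(x)

print(jax.grad(piecewise)(2.0))    # 3 * x**2 = 12.0
print(jax.grad(piecewise)(-1.0))   # cos(-1.0) ~ 0.5403

# Requirement 3: higher-order differentiation as a first-class citizen;
# grad simply composes with itself.
third_derivative = jax.grad(jax.grad(jax.grad(lambda x: x ** 4)))
print(third_derivative(2.0))       # 24 * x = 48.0

# Requirement 2: a non-standard optimization loop driven by ordinary
# Python, rather than a built-in fit()/Optimizer abstraction.
quad_grad = jax.grad(lambda w: (w - 1.0) ** 2)
w = 3.0
for step in range(10):
    g = quad_grad(w)
    lr = 0.5 if abs(g) > 1.0 else 0.1   # ad-hoc, value-dependent schedule
    w = w - lr * g
print(w)  # converges toward the minimum at w = 1.0
```

The design point being illustrated: when differentiation is an ordinary program transformation, plain Python remains the "glue" for branching, looping, and update rules, instead of forcing those patterns through framework-specific graph constructs.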