Differentiable programming
Programming paradigm

Differentiable programming is a programming paradigm in which a numeric computer program can be differentiated throughout via automatic differentiation. This allows for gradient-based optimization of the program's parameters, often via gradient descent, as well as other learning approaches that rely on higher-order derivative information. Differentiable programming has found use in a wide variety of areas, particularly scientific computing and machine learning. One of the early proposals to adopt such a framework systematically in order to improve learning algorithms was made by the Advanced Concepts Team at the European Space Agency in early 2016.
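
As a minimal illustration of gradient-based optimization of a program parameter, the sketch below uses PyTorch's automatic differentiation; the toy program, data, and learning rate are assumptions made for the example, not taken from the sources cited here.

```python
import torch

# A tiny numeric "program" with one tunable parameter k (hypothetical example):
# it maps x to k * x**2. Gradient descent is used to fit k to target data.
k = torch.tensor(0.5, requires_grad=True)      # parameter to optimize

xs = torch.tensor([1.0, 2.0, 3.0])
targets = 3.0 * xs ** 2                        # data generated with the "true" value k = 3

for step in range(200):
    preds = k * xs ** 2                        # run the differentiable program
    loss = ((preds - targets) ** 2).mean()     # scalar objective
    loss.backward()                            # reverse-mode AD fills in k.grad
    with torch.no_grad():
        k -= 0.01 * k.grad                     # one gradient-descent step
        k.grad.zero_()

print(float(k))  # converges to approximately 3.0
```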

Approaches

Most differentiable programming frameworks work by constructing a graph containing the control flow and data structures in the program.[8] Attempts generally fall into two groups:

  • Static, compiled graph-based approaches, such as TensorFlow,[9] Theano, and MXNet. They tend to allow for good compiler optimization and easier scaling to large systems, but their static nature limits interactivity and the types of programs that can be created easily (e.g. those involving loops or recursion), as well as making it harder for users to reason effectively about their programs.[10] A proof-of-concept compiler toolchain called Myia uses a subset of Python as a front end and supports higher-order functions, recursion, and higher-order derivatives.[11][12][13]
  • Operator-overloading, dynamic graph-based approaches, such as PyTorch, NumPy's autograd package, and Pyaudi. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they lead to interpreter overhead (particularly when composing many small operations), poorer scalability, and reduced benefit from compiler optimization.[14][15] A minimal sketch of the operator-overloading mechanism is given after this list.
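
The sketch below illustrates the operator-overloading, dynamic-graph idea in plain Python. It is a self-contained toy in the spirit of packages such as autograd, not code from any of the cited frameworks: each arithmetic operation records itself in a graph as the program runs, and a backward pass then applies the chain rule over the recorded graph.

```python
class Value:
    """A scalar that records the operations applied to it (a dynamic-graph node)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def backward():
            self.grad += out.grad            # d(a + b)/da = 1
            other.grad += out.grad           # d(a + b)/db = 1
        out._backward = backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def backward():
            self.grad += other.data * out.grad   # d(a * b)/da = b
            other.grad += self.data * out.grad   # d(a * b)/db = a
        out._backward = backward
        return out

    def backward(self):
        # Topologically order the recorded graph, then apply the chain rule in reverse.
        order, seen = [], set()

        def visit(v):
            if v not in seen:
                seen.add(v)
                for parent in v._parents:
                    visit(parent)
                order.append(v)

        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()


# Usage: differentiate f(x, y) = x * y + x written as ordinary Python arithmetic.
x, y = Value(2.0), Value(3.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # 4.0 (= y + 1) and 2.0 (= x)
```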

Just-in-time (JIT) compilation has recently emerged as a possible way to overcome some of the bottlenecks of interpreted languages. The C++ library heyoka and the Python package heyoka.py make extensive use of this technique to offer advanced differentiable programming capabilities (including at high orders). A package for the Julia programming language, Zygote, works directly on Julia's intermediate representation.[16][17][18]
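
As an illustration of the JIT idea, the sketch below uses JAX (a library not discussed above, chosen here only as an example; the function f is an assumption): the derivative produced by automatic differentiation is itself compiled, so the many small operations are fused rather than dispatched one by one by the interpreter.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Many small elementwise operations; dispatching each one eagerly is
    # exactly where interpreter overhead appears in dynamic approaches.
    return jnp.sum(jnp.sin(x) ** 2 + x * jnp.cos(x) ** 2)

grad_f = jax.grad(f)             # reverse-mode derivative of f
fast_grad_f = jax.jit(grad_f)    # JIT-compile the whole derivative computation

x = jnp.linspace(0.0, 1.0, 1000)
print(fast_grad_f(x)[:3])        # same values as grad_f(x), computed from compiled code
```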

A limitation of earlier approaches is that they are only able to differentiate code written in a suitable manner for the framework, limiting their interoperability with other programs. Newer approaches resolve this issue by constructing the graph from the language's syntax or IR, allowing arbitrary code to be differentiated.[19][20]

Applications

Differentiable programming has been applied in areas such as combining deep learning with physics engines in robotics,[21] solving electronic structure problems with differentiable density functional theory,[22] differentiable ray tracing,[23] image processing,[24] and probabilistic programming.[25]

Multidisciplinary application

Differentiable programming is also making inroads in fields beyond its traditional applications. In healthcare and the life sciences, for example, it is being used for deep learning in biophysics-based modelling of molecular mechanisms, in areas such as protein structure prediction and drug discovery. These applications illustrate its potential to contribute to the understanding of complex biological systems and to improved healthcare solutions.[26]

References

  1. Izzo, Dario; Biscani, Francesco; Mereta, Alessio (2017). "Differentiable Genetic Programming". Genetic Programming. Lecture Notes in Computer Science. Vol. 10196. pp. 35–51. arXiv:1611.04766. doi:10.1007/978-3-319-55696-3_3. ISBN 978-3-319-55695-6. S2CID 17786263.

  2. Baydin, Atilim Gunes; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark (2018). "Automatic Differentiation in Machine Learning: a Survey". Journal of Machine Learning Research. 18 (153): 1–43. https://jmlr.org/papers/v18/17-468.html

  3. Wang, Fei; Decker, James; Wu, Xilun; Essertel, Gregory; Rompf, Tiark (2018). "Backpropagation with Callbacks: Foundations for Efficient and Expressive Differentiable Programming" (PDF). In Bengio, S.; Wallach, H.; Larochelle, H.; Grauman, K. (eds.). NIPS'18: Proceedings of the 32nd International Conference on Neural Information Processing Systems. Curran Associates. pp. 10201–10212. http://papers.nips.cc/paper/8221-backpropagation-with-callbacks-foundations-for-efficient-and-expressive-differentiable-programming.pdf

  4. Innes, Mike (2018). "On Machine Learning and Programming Languages" (PDF). SysML Conference 2018. Archived from the original (PDF) on 2019-07-17. Retrieved 2019-07-04. https://web.archive.org/web/20190717211700/http://www.sysml.cc/doc/2018/37.pdf

  5. Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Viral B Shah; Tebbutt, Will (2019). "A Differentiable Programming System to Bridge Machine Learning and Scientific Computing". arXiv:1907.07587 [cs.PL].

  6. Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Viral B Shah; Tebbutt, Will (2019). "A Differentiable Programming System to Bridge Machine Learning and Scientific Computing". arXiv:1907.07587 [cs.PL].

  7. "Differential Intelligence". October 2016. Retrieved 2022-10-19. https://www.esa.int/gsp/ACT/projects/differential_intelligence/

  8. Innes, Michael; Saba, Elliot; Fischer, Keno; Gandhi, Dhairya; Marco Concetto Rudilosso; Neethu Mariya Joy; Karmali, Tejan; Pal, Avik; Shah, Viral (2018). "Fashionable Modelling with Flux". arXiv:1811.01457 [cs.PL].

  9. TensorFlow 1 uses the static graph approach, whereas TensorFlow 2 uses the dynamic graph approach by default.

  10. Innes, Michael; Saba, Elliot; Fischer, Keno; Gandhi, Dhairya; Marco Concetto Rudilosso; Neethu Mariya Joy; Karmali, Tejan; Pal, Avik; Shah, Viral (2018). "Fashionable Modelling with Flux". arXiv:1811.01457 [cs.PL].

  11. Merriënboer, Bart van; Breuleux, Olivier; Bergeron, Arnaud; Lamblin, Pascal (3 December 2018). "Automatic differentiation in ML: where we are and where we should be going". NIPS'18. Vol. 31. pp. 8771–81. https://papers.nips.cc/paper/2018/hash/770f8e448d07586afbf77bb59f698587-Abstract.html

  12. Breuleux, O.; van Merriënboer, B. (2017). "Automatic Differentiation in Myia" (PDF). Archived from the original (PDF) on 2019-06-24. Retrieved 2019-06-24. https://web.archive.org/web/20190624180156/https://www.sysml.cc/doc/2018/39.pdf

  13. "TensorFlow: Static Graphs". Tutorials: Learning PyTorch. PyTorch.org. Retrieved 2019-03-04. https://pytorch.org/tutorials/beginner/examples_autograd/tf_two_layer_net.html

  14. Breuleux, O.; van Merriënboer, B. (2017). "Automatic Differentiation in Myia" (PDF). Archived from the original (PDF) on 2019-06-24. Retrieved 2019-06-24. https://web.archive.org/web/20190624180156/https://www.sysml.cc/doc/2018/39.pdf

  15. "TensorFlow: Static Graphs". Tutorials: Learning PyTorch. PyTorch.org. Retrieved 2019-03-04. https://pytorch.org/tutorials/beginner/examples_autograd/tf_two_layer_net.html

  16. Innes, Michael; Saba, Elliot; Fischer, Keno; Gandhi, Dhairya; Marco Concetto Rudilosso; Neethu Mariya Joy; Karmali, Tejan; Pal, Avik; Shah, Viral (2018). "Fashionable Modelling with Flux". arXiv:1811.01457 [cs.PL].

  17. Innes, Michael (2018). "Don't Unroll Adjoint: Differentiating SSA-Form Programs". arXiv:1810.07951 [cs.PL].

  18. Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Viral B Shah; Tebbutt, Will (2019). "A Differentiable Programming System to Bridge Machine Learning and Scientific Computing". arXiv:1907.07587 [cs.PL].

  19. Innes, Michael; Saba, Elliot; Fischer, Keno; Gandhi, Dhairya; Marco Concetto Rudilosso; Neethu Mariya Joy; Karmali, Tejan; Pal, Avik; Shah, Viral (2018). "Fashionable Modelling with Flux". arXiv:1811.01457 [cs.PL].

  20. Breuleux, O.; van Merriënboer, B. (2017). "Automatic Differentiation in Myia" (PDF). Archived from the original (PDF) on 2019-06-24. Retrieved 2019-06-24. https://web.archive.org/web/20190624180156/https://www.sysml.cc/doc/2018/39.pdf

  21. Degrave, Jonas; Hermans, Michiel; Dambre, Joni; wyffels, Francis (2016). "A Differentiable Physics Engine for Deep Learning in Robotics". arXiv:1611.01652 [cs.NE].

  22. Li, Li; Hoyer, Stephan; Pederson, Ryan; Sun, Ruoxi; Cubuk, Ekin D.; Riley, Patrick; Burke, Kieron (2021). "Kohn-Sham Equations as Regularizer: Building Prior Knowledge into Machine-Learned Physics". Physical Review Letters. 126 (3): 036401. arXiv:2009.08551. Bibcode:2021PhRvL.126c6401L. doi:10.1103/PhysRevLett.126.036401. PMID 33543980. https://doi.org/10.1103%2FPhysRevLett.126.036401

  23. Li, Tzu-Mao; Aittala, Miika; Durand, Frédo; Lehtinen, Jaakko (2018). "Differentiable Monte Carlo Ray Tracing through Edge Sampling". ACM Transactions on Graphics. 37 (6): 222:1–11. doi:10.1145/3272127.3275109. S2CID 52839714. https://people.csail.mit.edu/tzumao/diffrt/

  24. Li, Tzu-Mao; Gharbi, Michaël; Adams, Andrew; Durand, Frédo; Ragan-Kelley, Jonathan (August 2018). "Differentiable Programming for Image Processing and Deep Learning in Halide". ACM Transactions on Graphics. 37 (4): 139:1–13. doi:10.1145/3197517.3201383. S2CID 46927588. https://cseweb.ucsd.edu/~tzli/gradient_halide

  25. Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Viral B Shah; Tebbutt, Will (2019). "A Differentiable Programming System to Bridge Machine Learning and Scientific Computing". arXiv:1907.07587 [cs.PL].

  26. AlQuraishi, Mohammed; Sorger, Peter K. (October 2021). "Differentiable biology: using deep learning for biophysics-based and data-driven modeling of molecular mechanisms". Nature Methods. 18 (10): 1169–1180. doi:10.1038/s41592-021-01283-4. PMC 8793939. PMID 34608321. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8793939