The simple essence of automatic differentiation

April 02, 2018 · Entered Twilight · 🏛 Proc. ACM Program. Lang.

🌅 TWILIGHT: Old Age
Predates the code-sharing era, a pioneer of its time

"No code URL or promise found in abstract"
"Code repo scraped from project page (backfill)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, Figures, Makefile, essence-of-ad.lhs, formatting.fmt, macros.tex, mit-categories.lhs, notes.md, readme.md, todo.md

Authors: Conal Elliott
arXiv ID: 1804.00746
Category: cs.PL (Programming Languages)
Citations: 117
Venue: Proc. ACM Program. Lang.
Repository: https://github.com/conal/talk-2018-essence-of-ad (⭐ 201)
Last Checked: 9 days ago
Abstract
Automatic differentiation (AD) in reverse mode (RAD) is a central component of deep learning and other uses of large-scale optimization. Commonly used RAD algorithms such as backpropagation, however, are complex and stateful, hindering deep understanding, improvement, and parallel execution. This paper develops a simple, generalized AD algorithm calculated from a simple, natural specification. The general algorithm is then specialized by varying the representation of derivatives. In particular, applying well-known constructions to a naive representation yields two RAD algorithms that are far simpler than previously known. In contrast to commonly used RAD implementations, the algorithms defined here involve no graphs, tapes, variables, partial derivatives, or mutation. They are inherently parallel-friendly, correct by construction, and usable directly from an existing programming language with no need for new data types or programming style, thanks to use of an AD-agnostic compiler plugin.
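To give a flavor of the idea, here is a minimal Haskell sketch, not the paper's actual construction (which generalizes to arbitrary linear maps and relies on a compiler plugin), assuming everything is scalar so a derivative at a point is just a slope. The names D, compD, and withDeriv are ad hoc illustrations, not the paper's API.

-- Minimal sketch: a differentiable function is an ordinary function that
-- also returns its derivative at the input point (here just a Double slope).
newtype D = D { unD :: Double -> (Double, Double) }

-- Composition is the chain rule: evaluate g, then f, multiply the slopes.
compD :: D -> D -> D
compD (D f) (D g) = D $ \a ->
  let (b, g') = g a
      (c, f') = f b
  in  (c, f' * g')

-- Pair an ordinary function with a hand-written derivative.
withDeriv :: (Double -> Double) -> (Double -> Double) -> D
withDeriv f f' = D $ \a -> (f a, f' a)

sqr, sinD :: D
sqr  = withDeriv (^ 2) (2 *)
sinD = withDeriv sin cos

-- d/dx sin (x^2) at x = 3: prints (sin 9, 6 * cos 9)
main :: IO ()
main = print (unD (compD sinD sqr) 3)

Roughly, the reverse-mode algorithms in the paper arise by swapping in different representations of that derivative (e.g. continuation- and dual-style linear maps) while the composition scheme stays the same.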
