Differentiate Everything with a Reversible Embedded Domain-Specific Language

March 10, 2020 · Entered Twilight · 🏛 arXiv.org

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 5.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitattributes, .gitignore, README.md, data.txt, works.jl, 可微编程一.ipynb, 可微编程二.ipynb (the notebook names translate to "Differentiable Programming I/II")

Authors: Jin-Guo Liu, Taine Zhao
arXiv ID: 2003.04617
Category: cs.PL (Programming Languages)
Cross-listed: cs.LG
Citations: 1
Venue: arXiv.org
Repository: https://github.com/GiggleLiu/NiLang.jl
Last checked: 2 months ago
Abstract
Reverse-mode automatic differentiation (AD) suffers from excessive space overhead, because intermediate computational states must be traced back for back-propagation. The traditional way to trace back states is checkpointing, which stores intermediate states on a global stack and restores them either by popping the stack or by re-computing. The overhead of stack manipulation and re-computation makes general-purpose (not tensor-based) AD engines unable to meet many industrial needs. Instead of checkpointing, we propose to trace back states by reverse computing, designing and implementing a reversible programming eDSL in which a program can be executed bi-directionally without implicit stack operations. The absence of implicit stack operations makes the program compatible with existing compiler features, including existing optimization passes and compilation as GPU kernels. We implement AD for sparse matrix operations and several machine learning applications to show that our framework achieves state-of-the-art performance.
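The central idea in the abstract, recovering intermediate states by running the program backwards instead of restoring them from a checkpoint stack, can be sketched with the linked NiLang.jl repository. The snippet below is a minimal sketch following NiLang's documented `@i` macro and `~f` inverse-call convention; the function name and values are illustrative assumptions, not code from the paper, and the package's gradient interface (NiLang.AD) is omitted.

```julia
using NiLang  # reversible eDSL from the linked repository

# A reversible function: each statement (here a `+=` of a product) has a
# well-defined inverse, so the program can be run backwards to restore
# its inputs instead of popping them from a checkpoint stack.
@i function multiply(out!, x, y)
    out! += x * y
end

out, x, y = multiply(0.0, 2.0, 3.0)   # forward run: out == 6.0
(~multiply)(out, x, y)                # reverse run: returns (0.0, 2.0, 3.0)
```

Because the reverse call reconstructs the earlier state itself, no implicit stack traffic is generated, which is what the abstract credits for compatibility with ordinary compiler optimization passes and GPU-kernel compilation.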
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Programming Languages