Title
Optimization in the Sparse Tensor Algebra Compiler
Abstract
The Sparse Tensor Algebra Compiler (taco) is the first to automatically generate code for any tensor algebra expression on many sparse tensor data structures. This frees the programmer from choosing between hand-optimized libraries that support only some operations on some data structures and general tensor compute engines with poor performance. In this talk, I will present recent work on a new intermediate representation and program transformations to optimize the generated sparse tensor code. I will show how these transformations generalize optimizations that are common in hand-written kernels for specific cases, such as the sparse accumulator in sparse matrix multiplication, how they can lead to asymptotically better performance for some expressions, and how they can transform code to fit execution patterns required by different hardware architectures. The performance of the resulting sparse code is comparable to that of hand-optimized libraries, but it generalizes to many more expressions and data structures.
The sparse tensor algebra compiler is available under the MIT license. The source code and related publications can be found at http://tensor-compiler.org.
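To give a concrete sense of the programming model described above, the sketch below writes a sparse tensor-vector contraction with taco's C++ API, loosely following the introductory example on tensor-compiler.org; the type and method names (Format, Tensor, IndexVar, compile/assemble/compute) reflect the documented API, but exact spellings and the example data are illustrative and should be checked against the current release.

```c++
#include "taco.h"
using namespace taco;

int main() {
  // Declare storage formats: CSR for the result, a compressed (CSF-like)
  // format for the 3-tensor, and a sparse vector.
  Format csr({Dense, Sparse});
  Format csf({Sparse, Sparse, Sparse});
  Format sv({Sparse});

  // Declare tensors with their dimensions and formats.
  Tensor<double> A({2, 3},    csr);
  Tensor<double> B({2, 3, 4}, csf);
  Tensor<double> c({4},       sv);

  // Insert a few nonzero entries (illustrative values) and pack them
  // into the chosen sparse formats.
  B.insert({0, 0, 0}, 1.0);
  B.insert({1, 2, 1}, 3.0);
  c.insert({0}, 4.0);
  B.pack();
  c.pack();

  // State the computation as an index expression; taco generates a
  // sparse kernel specialized to the formats chosen above.
  IndexVar i, j, k;
  A(i, j) = B(i, j, k) * c(k);

  // Compile the generated kernel, assemble A's sparse index structure,
  // and compute its values.
  A.compile();
  A.assemble();
  A.compute();
}
```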
Bio
Fredrik Kjolstad is a PhD candidate at MIT and will join Stanford as an Assistant Professor in March 2020. He works on topics in compilers, programming languages, and performance engineering. His projects include the Sparse Tensor Algebra Compiler (taco) and the Simit language for physical simulation. He received his master's degree from the University of Illinois at Urbana-Champaign and his bachelor's degree from the Norwegian University of Science and Technology in Gjøvik. He has received the Eureka and Rosing prizes for his bachelor's project, the Adobe Fellowship, a best poster award, and two best paper awards.