HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
2019 EuroLLVM Developers’ Meeting: T. Shpeisman & C. Lattner “MLIR: Multi-Level Intermediate Repr..”

LLVM · YouTube · 17 HN points · 2 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention LLVM's video "2019 EuroLLVM Developers’ Meeting: T. Shpeisman & C. Lattner “MLIR: Multi-Level Intermediate Repr..”".
YouTube Summary
http://llvm.org/devmtg/2019-04/

MLIR: Multi-Level Intermediate Representation for Compiler Infrastructure - Tatiana Shpeisman (Google), Chris Lattner (Google)

Slides: http://llvm.org/devmtg/2019-04/slides/Keynote-ShpeismanLattner-MLIR.pdf

This talk will give an overview of MLIR (Multi-Level Intermediate Representation), a new intermediate representation designed to be unified, flexible, extensible, and language-agnostic, and to serve as a base compiler infrastructure. MLIR shares similarities with traditional CFG-based three-address SSA representations (such as LLVM IR or SIL), but it also introduces notions from the polyhedral domain as first-class concepts. Dialects are the core concept behind MLIR's extensibility, allowing multiple levels of abstraction to coexist in a single representation. MLIR supports continuous lowering from dataflow graphs to high-performance target-specific code through partial specialization between dialects. We will illustrate in this talk how MLIR can be used to build an optimizing compiler infrastructure for deep learning applications.
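To make "multiple levels in a single representation" concrete, here is a minimal sketch in MLIR's textual IR. It uses today's upstream dialect names (func, arith, affine), which postdate the 2019 talk, and is meant only to show how a high-level tensor form and a lower-level buffer-and-loop form of the same computation can sit in one module.

  // High level: value-semantic tensors and elementwise arithmetic.
  func.func @scale(%x: tensor<128xf32>) -> tensor<128xf32> {
    %c2 = arith.constant dense<2.0> : tensor<128xf32>
    %y = arith.mulf %x, %c2 : tensor<128xf32>
    return %y : tensor<128xf32>
  }

  // Lower level: explicit buffers and affine loops (the polyhedral-flavored dialect).
  func.func @scale_lowered(%x: memref<128xf32>, %y: memref<128xf32>) {
    %c2 = arith.constant 2.0 : f32
    affine.for %i = 0 to 128 {
      %v = affine.load %x[%i] : memref<128xf32>
      %r = arith.mulf %v, %c2 : f32
      affine.store %r, %y[%i] : memref<128xf32>
    }
    return
  }

Progressive lowering means transforming the first form toward the second (and eventually toward the LLVM dialect) one dialect at a time, rather than jumping straight from a dataflow graph to LLVM IR.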

MLIR supports multiple front ends and back ends and uses LLVM IR as one of its primary code generation targets. MLIR also relies heavily on design principles and practices developed by the LLVM community. For example, it depends on LLVM APIs and programming idioms to minimize IR size and maximize optimization efficiency. MLIR uses LLVM testing utilities such as FileCheck to ensure robust functionality at every level of the compilation stack, uses TableGen to express IR invariants, and leverages LLVM infrastructure such as dominance analysis to avoid reimplementing core compiler functionality from scratch. At the same time, it is a brand-new IR, both more restrictive and more general than LLVM IR in different aspects of its design. We believe that the LLVM community will find in MLIR a useful tool for developing new compilers, especially in machine learning and other high-performance domains.
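As an illustration of the FileCheck-based testing style mentioned above, an MLIR regression test is typically a .mlir file whose RUN line invokes mlir-opt with a pass pipeline and whose CHECK lines describe the expected output. The sketch below uses the upstream --convert-arith-to-llvm conversion; the exact lowered form it checks for is an assumption for illustration, not a test copied from the repository.

  // RUN: mlir-opt %s --convert-arith-to-llvm | FileCheck %s

  // CHECK-LABEL: @add
  func.func @add(%a: f32, %b: f32) -> f32 {
    // CHECK: llvm.fadd
    %0 = arith.addf %a, %b : f32
    return %0 : f32
  }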

Videos Filmed & Edited by Bash Films: http://www.BashFilms.com

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
This project has similar goals to the MLIR project:

https://github.com/tensorflow/mlir

https://www.youtube.com/watch?v=qzljG6DKgic

Exciting times for the future of parallel computing!

Jul 06, 2019 · 16 points, 0 comments · submitted by pjmlp
100% agree. If starting now, I think you should consider targeting MLIR [1]. It fixes many problems you will have with LLVM, and it lets you create language-specific dialects and transformations (a minimal sketch of what that looks like follows the links below). Impressive new work by a top-notch team led by LLVM creator Chris Lattner. It has LLVM as an optional target, so you get that for free.

For a parser, you should consider using tree-sitter [2]. Tree-sitter gives you live editor support for free. Impressive work by Max Brunsfeld.

[1] https://www.youtube.com/watch?v=qzljG6DKgic [2] https://www.youtube.com/watch?v=a1rC79DHpmY
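For readers curious what a "language-specific dialect" with LLVM as an optional target looks like in practice, here is a minimal sketch in MLIR's textual IR. The mylang dialect and its increment op are hypothetical and written in MLIR's generic op syntax; only the arith op has a real upstream lowering to the LLVM dialect.

  func.func @demo(%n: i64) -> i64 {
    %one = arith.constant 1 : i64
    // Hypothetical language-specific op, written in generic op syntax.
    %m = "mylang.increment"(%n) : (i64) -> i64
    // Standard arithmetic; upstream passes lower this to llvm.add when targeting LLVM.
    %r = arith.addi %m, %one : i64
    return %r : i64
  }

A real frontend would register the dialect, define its ops (typically in TableGen), and write conversion patterns to lower them; because LLVM is just one target among others, the same dialect could instead be lowered toward a different backend.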

Apr 26, 2019 · 1 point, 0 comments · submitted by matt_d
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.