Open Deep Learning Compiler Stack

Documentation | Contributors | Community | Release Notes

Apache TVM is a compiler stack for deep learning systems. It is designed to close the gap between productivity-focused deep learning frameworks and performance- and efficiency-focused hardware backends. TVM works with deep learning frameworks to provide end-to-end compilation to different backends.
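
As a rough sketch of that end-to-end flow, the example below imports an ONNX model through the Relay frontend, compiles it for a CPU target, and runs it with the graph executor. It assumes a TVM build that includes Relay and the LLVM backend plus the `onnx` package; the file name `model.onnx` and the input name and shape are hypothetical placeholders, and exact API details can differ between TVM releases.

```python
# Rough sketch of TVM's end-to-end flow (not the only entry point).
# Assumptions: TVM built with Relay + LLVM, `onnx` installed, and a
# hypothetical "model.onnx" whose input is named "input" with shape
# (1, 3, 224, 224). API details vary across TVM releases.
import numpy as np
import onnx
import tvm
from tvm import relay
from tvm.contrib import graph_executor

onnx_model = onnx.load("model.onnx")
shape_dict = {"input": (1, 3, 224, 224)}

# Import the framework model into Relay, then compile it for the target.
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
target = "llvm"  # e.g. "cuda" or "opencl" for other backends
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# Run the compiled module on random input data.
dev = tvm.device(target, 0)
runtime = graph_executor.GraphModule(lib["default"](dev))
runtime.set_input("input", np.random.rand(1, 3, 224, 224).astype("float32"))
runtime.run()
output = runtime.get_output(0).numpy()
```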

License

TVM is licensed under the Apache-2.0 license.

Getting Started

Check out the TVM Documentation site for installation instructions, tutorials, examples, and more. The Getting Started with TVM tutorial is a great place to start.
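
For a quick taste before diving into the tutorials, here is a minimal sketch of the low-level tensor-expression path: define a vector addition, compile it with the LLVM backend, and run it on the local CPU. It assumes an LLVM-enabled TVM build with the tensor-expression (`te`) API available; names and details may vary between releases.

```python
# Minimal sketch of the low-level flow (assumes an LLVM-enabled TVM build;
# API names may differ between TVM releases).
import numpy as np
import tvm
from tvm import te

# Describe a vector addition as a tensor expression.
n = 1024
A = te.placeholder((n,), name="A", dtype="float32")
B = te.placeholder((n,), name="B", dtype="float32")
C = te.compute((n,), lambda i: A[i] + B[i], name="C")

# Lower the expression to a TensorIR PrimFunc and compile it for the CPU.
prim_func = te.create_prim_func([A, B, C])
ir_mod = tvm.IRModule({"main": prim_func})
rt_mod = tvm.build(ir_mod, target="llvm")

# Execute the compiled kernel and check the result.
dev = tvm.cpu(0)
a = tvm.nd.array(np.random.rand(n).astype("float32"), dev)
b = tvm.nd.array(np.random.rand(n).astype("float32"), dev)
c = tvm.nd.array(np.zeros(n, dtype="float32"), dev)
rt_mod["main"](a, b, c)
np.testing.assert_allclose(c.numpy(), a.numpy() + b.numpy(), rtol=1e-5)
```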

Contribute to TVM

TVM adopts the Apache committer model; we aim to create an open-source project that is maintained and owned by the community. Check out the Contributor Guide.

Acknowledgement

We learned a lot from the following projects when building TVM.

  • Halide: Parts of TVM's TIR and its arithmetic simplification module originate from Halide. We also learned from and adapted parts of the lowering pipeline from Halide.
  • Loopy: the use of integer set analysis and its loop transformation primitives.
  • Theano: the design inspiration for the symbolic scan operator used for recurrence.
