# TIM-VX - Tensor Interface Module

TIM-VX is a software integration module provided by VeriSilicon to facilitate deployment of neural networks on VeriSilicon ML accelerators. It serves as the backend binding for runtime frameworks such as Android NN, Tensorflow-Lite, MLIR, TVM, and more.

## Main Features
- Over 150 operators with rich format support for both quantized and floating point
- Simplified C++ binding API to create Tensors and Operations (see the sketch after this list)
- Dynamic graph construction with support for shape inference and layout inference
- Built-in custom layer extensions
- A set of utility functions for debugging
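
As a quick orientation to the C++ binding API, here is a minimal sketch of building and running a one-operation graph, modeled on the patterns used in the samples directory; the tensor shape, the ReLU operation, and the input data are illustrative assumptions, not values taken from this README.

```cpp
#include <vector>

#include "tim/vx/context.h"
#include "tim/vx/graph.h"
#include "tim/vx/ops/activations.h"
#include "tim/vx/tensor.h"

int main() {
  // Create a context and a graph to hold tensors and operations.
  auto context = tim::vx::Context::Create();
  auto graph = context->CreateGraph();

  // Describe a small float tensor (shape chosen only for illustration).
  tim::vx::ShapeType shape({4, 1});
  tim::vx::TensorSpec input_spec(tim::vx::DataType::FLOAT32, shape,
                                 tim::vx::TensorAttribute::INPUT);
  tim::vx::TensorSpec output_spec(tim::vx::DataType::FLOAT32, shape,
                                  tim::vx::TensorAttribute::OUTPUT);
  auto input = graph->CreateTensor(input_spec);
  auto output = graph->CreateTensor(output_spec);

  // Add a ReLU operation and bind its input/output tensors.
  auto relu = graph->CreateOperation<tim::vx::ops::Relu>();
  (*relu).BindInput(input).BindOutput(output);

  // Compile the graph for the target, feed data, and run.
  if (!graph->Compile()) return -1;
  std::vector<float> in_data = {-1.0f, 0.0f, 2.0f, -3.0f};
  input->CopyDataToTensor(in_data.data(), in_data.size() * sizeof(float));
  if (!graph->Run()) return -1;

  // Read the result back from the output tensor.
  std::vector<float> out_data(in_data.size());
  output->CopyDataFromTensor(out_data.data());
  return 0;
}
```

The same create/bind/compile/run flow scales to full networks such as the LeNet sample described below.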
## Framework Support
- Tensorflow-Lite (External Delegate)
- Tengine (Official)
- TVM (Fork)
- MLIR Dialect (In development)
Feel free to raise a GitHub issue if you wish to add TIM-VX support for other frameworks.

## Get started

### Build and Run
TIM-VX can be built with either Bazel or CMake; the steps below use Bazel. Install Bazel to get started.

TIM-VX needs to be compiled and linked against the VeriSilicon OpenVX SDK, which provides the required header files and pre-compiled libraries. A default linux-x86_64 SDK containing the PC simulation environment is included under prebuilt-sdk; platform-specific SDKs can be obtained from the respective SoC vendors.
To build TIM-VX:

```sh
bazel build libtim-vx.so
```
To run the LeNet sample:

```sh
# Set VIVANTE_SDK_DIR so the runtime can find the compilation environment
export VIVANTE_SDK_DIR=`pwd`/prebuilt-sdk/x86_64_linux
bazel build //samples/lenet:lenet_asymu8_cc
bazel run //samples/lenet:lenet_asymu8_cc
```
To build and run Tensorflow-Lite with TIM-VX, please see the corresponding README.
To build and run TVM with TIM-VX, please see the TVM README.
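
The Tensorflow-Lite integration listed above is exposed as an external delegate, so applications load it through TensorFlow Lite's generic external-delegate API. The sketch below illustrates that loading pattern; the delegate library name libvx_delegate.so and the model path are assumptions for illustration, so treat the linked README as the authoritative reference.

```cpp
#include <memory>

#include "tensorflow/lite/delegates/external/external_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load a TFLite model (path is a placeholder).
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return -1;

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Point the external-delegate loader at the TIM-VX delegate library
  // (library name assumed here; see the delegate README for the actual name).
  auto options = TfLiteExternalDelegateOptionsDefault("libvx_delegate.so");
  TfLiteDelegate* delegate = TfLiteExternalDelegateCreate(&options);

  // Hand supported subgraphs over to the delegate, then run as usual.
  if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) return -1;
  interpreter->AllocateTensors();
  interpreter->Invoke();

  TfLiteExternalDelegateDelete(delegate);
  return 0;
}
```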
## Support

Create an issue on GitHub or email ML_Support@verisilicon.com.