README change to add info on using prebuilt Docker images (#201)

* Add description of prebuilt docker images

* example Dockerfile using prebuilt container

* vscode config files for Docker example

* vscode files for Docker example

* vscode files for Docker example

* add Dockerfile info

* typo

* fix bad name for example file

doc check failed because file name was incorrect

Co-authored-by: Tian Jin <tjingrant@gmail.com>
Kevin O'Brien 2020-07-02 01:54:38 -04:00 committed by GitHub
parent cf96d635cc
commit 5c6d85e6f3
5 changed files with 135 additions and 0 deletions

README.md

@@ -7,6 +7,40 @@ The Open Neural Network Exchange implementation in MLIR (http://onnx.ai/onnx-mli
| s390-Linux | [![Build Status](https://yktpandb.watson.ibm.com/jenkins/buildStatus/icon?job=ONNX-MLIR-Linux-s390x-Build)](https://yktpandb.watson.ibm.com/jenkins/job/ONNX-MLIR-Linux-s390x-Build/) |
| x86-Windows | [![Build Status](https://dev.azure.com/onnx-pipelines/onnx/_apis/build/status/MLIR-Windows-CI?branchName=master)](https://dev.azure.com/onnx-pipelines/onnx/_build/latest?definitionId=9&branchName=master) |
## Prebuilt Container
An easy way to get started with ONNX-MLIR is to use a prebuilt Docker image. These images are produced by each successful merge build on the trunk,
so the latest image always reflects the tip of the trunk. Images are currently available for amd64, ppc64le, and IBM System Z (s390x), published on
Docker Hub as onnxmlirczar/onnx-mlir-build:amd64, onnxmlirczar/onnx-mlir-build:ppc64le, and onnxmlirczar/onnx-mlir-build:s390x.
To use one of these images, either pull it directly from Docker Hub and launch a container with an interactive bash shell in it, or use it as the
base image in a Dockerfile. The container holds the full build tree, including the prerequisites and a clone of the source code. The source can be
modified and onnx-mlir rebuilt from within the container, so it can also serve as a development environment.
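For example, assuming Docker is installed locally, a session with the amd64 image might look like the following minimal sketch (the container name
`onnx-mlir-dev` is only illustrative, and the trailing `/bin/bash` may need adjusting if the image defines its own entrypoint):
```
# Pull the prebuilt image (use the ppc64le or s390x tag for other architectures)
docker pull onnxmlirczar/onnx-mlir-build:amd64

# Start a container and open an interactive bash shell inside it
docker run -it --name onnx-mlir-dev onnxmlirczar/onnx-mlir-build:amd64 /bin/bash
```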
It is also possible to attach vscode to a running container. An example Dockerfile and vscode configuration files can be found in the
docs/docker-example folder. The Dockerfile is shown below.
[same-as-file]: <> (docs/docker-example/Dockerfile)
```
# Start from the prebuilt onnx-mlir build image (amd64 variant)
FROM onnxmlirczar/onnx-mlir-build:amd64
# The prebuilt build tree lives in /build
WORKDIR /build
ENV HOME=/build
# Use the pyenv-managed Python 3.7.0 provided by the base image
ENV PYENV_ROOT=$HOME/.pyenv
ENV PATH=$PYENV_ROOT/shims:$PYENV_ROOT/bin:$PATH
RUN pyenv global 3.7.0
RUN pyenv rehash
ENV PATH=$PATH:/build/bin
# Install additional development and debugging tools
RUN apt-get update
RUN apt-get install -y python-numpy
RUN apt-get install -y python3-pip
RUN apt-get install -y gdb
RUN apt-get install -y lldb
RUN apt-get install -y emacs
# Copy the vscode configuration into the image
WORKDIR /build/.vscode
ADD .vscode /build/.vscode
WORKDIR /build
```
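As a rough sketch of how this example can be built and started (the image tag `my-onnx-mlir` is arbitrary, and the build context is assumed to be the
docs/docker-example directory so that the `.vscode` files are visible to the `ADD` instruction):
```
# Build a development image from the example Dockerfile
docker build -t my-onnx-mlir docs/docker-example

# Run it with an interactive shell; vscode can then be attached to the running container
docker run -it my-onnx-mlir /bin/bash
```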
## Prerequisites
```

docs/docker-example/.vscode/c_cpp_properties.json (new file)

@@ -0,0 +1,33 @@
{
"configurations": [
{
"name": "Linux",
"includePath": [
"${workspaceFolder}/**"
],
"defines": [],
"compilerPath": "/usr/bin/gcc",
"cStandard": "c11",
"cppStandard": "gnu++14",
"intelliSenseMode": "clang-x64"
},
{
"name": "onnx-mlir-linux",
"includePath": [
"${workspaceFolder}/**"
],
"forcedInclude": [
"${default}"
],
"defines": [
"ONNX_ML=1"
],
"compilerPath": "/usr/bin/gcc",
"compilerArgs": ["-I${workspaceFolder}/llvm-project/mlir/include", "-I${workspaceFolder}/llvm-project/build/tools/mlir/include","-I${workspaceFolder}/llvm-project/include", "-I${workspaceFolder}/llvm-project/build/ßinclude"],
"cStandard": "c11",
"cppStandard": "c++14",
"intelliSenseMode": "${default}"
}
],
"version": 4
}

docs/docker-example/.vscode/launch.json (new file)

@@ -0,0 +1,44 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Debug onnx-mlir",
"type": "cppdbg",
"request": "launch",
"program": "${workspaceFolder}/onnx-mlir/build/bin/onnx-mlir",
"args": ["--EmitONNXIR","exampleop.onnx"],
"stopAtEntry": true,
"cwd": "${workspaceFolder}",
"environment": [],
"externalConsole": false,
"MIMode": "gdb",
"setupCommands": [
{
"description": "Enable pretty-printing for gdb",
"text": "-enable-pretty-printing",
"ignoreFailures": true
}
],
"preLaunchTask": "",
"miDebuggerPath": "/usr/bin/gdb"
},
{
"name": "(gdb) Attach",
"type": "cppdbg",
"request": "attach",
"program": "${workspaceFolder}/onnx-mlir/build/bin/onnx-mlir",
"processId": "${command:pickProcess}",
"MIMode": "gdb",
"setupCommands": [
{
"description": "Enable pretty-printing for gdb",
"text": "-enable-pretty-printing",
"ignoreFailures": true
}
]
}
]
}

docs/docker-example/.vscode/settings.json (new file)

@@ -0,0 +1,5 @@
{
"files.associations": {
"*.inc": "cpp"
}
}

docs/docker-example/Dockerfile (new file)

@@ -0,0 +1,19 @@
# Start from the prebuilt onnx-mlir build image (amd64 variant)
FROM onnxmlirczar/onnx-mlir-build:amd64
# The prebuilt build tree lives in /build
WORKDIR /build
ENV HOME=/build
# Use the pyenv-managed Python 3.7.0 provided by the base image
ENV PYENV_ROOT=$HOME/.pyenv
ENV PATH=$PYENV_ROOT/shims:$PYENV_ROOT/bin:$PATH
RUN pyenv global 3.7.0
RUN pyenv rehash
ENV PATH=$PATH:/build/bin
# Install additional development and debugging tools
RUN apt-get update
RUN apt-get install -y python-numpy
RUN apt-get install -y python3-pip
RUN apt-get install -y gdb
RUN apt-get install -y lldb
RUN apt-get install -y emacs
# Copy the vscode configuration into the image
WORKDIR /build/.vscode
ADD .vscode /build/.vscode
WORKDIR /build