This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Commit

[MXNET-1083] Add the example to demonstrate the inference workflow using C++ API (#13294)

* [MXNET-1083] Add the example to demonstrate the inference workflow using C++ API

* [MXNET-1083] Add the example to demonstrate the inference workflow using C++ API

* Updated the code to address the review comments.

* Added the README file for the folder.

* Addressed the review comments

* Addressed the review comments to use argmax and default mean values.
leleamol authored and sandeep-krishnamurthy committed Dec 15, 2018
1 parent 3626fd1 commit ed2cb76
Showing 5 changed files with 573 additions and 2 deletions.
5 changes: 3 additions & 2 deletions cpp-package/example/README.md
@@ -2,7 +2,8 @@

## Building C++ examples

The examples in this folder demonstrate the **training** workflow. The examples related to the **inference workflow** can be found in the [inference](</~https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference>) folder.
The examples in this folder are built while building the MXNet library and cpp-package from source. However, they can be built manually as follows:

From the cpp-package/example directory:

@@ -18,7 +19,7 @@ The examples that are built to be run on GPU may not work on non-GPU machines.
The makefile will also download the necessary data files and store them in a data folder. (The download will take a couple of minutes, but will be done only once on a fresh installation.)


## Examples demonstrating training workflow

This directory contains the following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS-specific environment variable, viz. **LD\_LIBRARY\_PATH** for Linux, Mac and Ubuntu OS and **PATH** for Windows OS. For example, `export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/home/ubuntu/incubator-mxnet/lib` on Ubuntu using GPU.

40 changes: 40 additions & 0 deletions cpp-package/example/inference/Makefile
@@ -0,0 +1,40 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.


CPPEX_SRC = $(wildcard *.cpp)
CPPEX_EXE = $(patsubst %.cpp, %, $(CPPEX_SRC))
OPENCV_CFLAGS=`pkg-config --cflags opencv`
OPENCV_LDFLAGS=`pkg-config --libs opencv`

CXX=g++


CFLAGS=$(COMMFLAGS) -I../../../3rdparty/tvm/nnvm/include -I../../../3rdparty/dmlc-core/include -I ../../include -I ../../../include -Wall -O3 -msse3 -funroll-loops -Wno-unused-parameter -Wno-unknown-pragmas
CPPEX_EXTRA_LDFLAGS := -L../../../lib -lmxnet $(OPENCV_LDFLAGS)

all: $(CPPEX_EXE)

debug: CPPEX_CFLAGS += -DDEBUG -g
debug: all


$(CPPEX_EXE):% : %.cpp
	$(CXX) -std=c++0x $(CFLAGS) $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, $^) $(CPPEX_EXTRA_LDFLAGS)

clean:
	rm -f $(CPPEX_EXE)
41 changes: 41 additions & 0 deletions cpp-package/example/inference/README.md
@@ -0,0 +1,41 @@
# MXNet C++ Package Inference Workflow Examples

## Building C++ Inference examples

The examples in this folder demonstrate the **inference** workflow.
To build the examples, use the following commands:

- Release: **make all**
- Debug: **make debug all**


## Examples demonstrating inference workflow

This directory contains the following examples. In order to run the examples, ensure that the path to the MXNet shared library is added to the OS-specific environment variable, viz. **LD\_LIBRARY\_PATH** for Linux, Mac and Ubuntu OS and **PATH** for Windows OS.

### [inception_inference.cpp](</~https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/inception_inference.cpp>)

This example demonstrates the image classification workflow with pre-trained models using the MXNet C++ API. The command-line parameters the example accepts are shown below:

```
./inception_inference --help
Usage:
inception_inference --symbol <model symbol file in json format>
                    --params <model params file>
                    --image <path to the image used for prediction>
                    --synset <file containing labels for prediction>
                    [--input_shape <dimensions of input image e.g. "3 224 224">]
                    [--mean <file containing mean image for normalizing the input image>]
                    [--gpu]  Specify this option if workflow needs to be run in gpu context
```
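Before prediction, the input image is normalized by subtracting a mean image (from the `--mean` file) or default mean values. This is not the example's source; a minimal standalone sketch of per-channel mean subtraction for an image in CHW layout, with illustrative mean values:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Subtract a per-channel mean from an image stored in CHW order.
// The mean values passed in are illustrative, not taken from the example.
std::vector<float> subtract_mean(const std::vector<float>& chw,
                                 std::size_t height, std::size_t width,
                                 const std::array<float, 3>& mean) {
    std::vector<float> out(chw.size());
    const std::size_t plane = height * width;  // pixels per channel
    for (std::size_t c = 0; c < 3; ++c) {
        for (std::size_t i = 0; i < plane; ++i) {
            out[c * plane + i] = chw[c * plane + i] - mean[c];
        }
    }
    return out;
}
```

The actual example normalizes with the mean image loaded from `mean_224.nd` when one is supplied; the per-channel fallback above only illustrates the default-mean path.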
The model symbol (json), params, and synset files are required to run this example. A sample command line is as follows:

```
./inception_inference --symbol "./model/Inception-BN-symbol.json" --params "./model/Inception-BN-0126.params" --synset "./model/synset.txt" --mean "./model/mean_224.nd" --image "./model/dog.jpg"
```
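The example maps the network's output scores to a human-readable label by taking the argmax and looking it up in the synset file. The snippet below is not the example's source, only a self-contained sketch of that final step:

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// Pick the label whose score is highest (argmax over the output vector),
// mirroring the last step of the classification workflow.
std::string predict_label(const std::vector<float>& scores,
                          const std::vector<std::string>& synset) {
    const auto it = std::max_element(scores.begin(), scores.end());
    const std::size_t idx = static_cast<std::size_t>(it - scores.begin());
    return synset[idx];
}
```

In the real example the scores come from the network's softmax output and the labels from `synset.txt`; here both are supplied directly for illustration.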
Alternatively, the script [unit_test_inception_inference.sh](</~https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/inference/unit_test_inception_inference.sh>) downloads the pre-trained **Inception** model and a test image. Users can invoke the script as follows:

```
./unit_test_inception_inference.sh
```