
inference-enable-engine 0.1

Past due by over 6 years; 100% complete.

Support inference on third-party engines, for example TensorRT. Use subgraphs to support all Fluid models.

We will fully support any model on TensorRT with the subgraph engine design and enable its full feature set.

With Anakin, we will support the full-graph mode, and the subgraph feature if its C++ parser proves valid.
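The subgraph design above can be sketched roughly as follows: ops that the third-party engine supports are grouped into maximal consecutive runs, each of which can be fused into a single engine node, while unsupported ops stay on the default Fluid executor. This is an illustrative sketch only; the names `TRT_SUPPORTED` and `partition_into_subgraphs` are assumptions, not the actual Fluid or TensorRT API.

```python
# Ops assumed supported by the third-party engine (illustrative list).
TRT_SUPPORTED = {"conv2d", "relu", "pool2d", "fc"}

def partition_into_subgraphs(ops):
    """Split a linear op sequence into segments, tagging each segment
    with the engine that should execute it."""
    segments = []
    for op in ops:
        engine = "tensorrt" if op in TRT_SUPPORTED else "fluid"
        if segments and segments[-1][0] == engine:
            # Extend the current run of same-engine ops.
            segments[-1][1].append(op)
        else:
            # Start a new segment for the other engine.
            segments.append((engine, [op]))
    return segments

# Example: only the supported runs are offloaded to the engine.
model = ["feed", "conv2d", "relu", "pool2d", "lookup_table", "fc", "fetch"]
print(partition_into_subgraphs(model))
```

Each `tensorrt` segment would then be replaced by one fused engine op, so unsupported layers no longer block the whole model from using the engine.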


There are no open issues in this milestone.
