All data members of a cloned plugin must be set in IPluginV2::clone. TensorRT calls IPlugin*::initialize only once, when the ICudaEngine is created, and does not call IPlugin*::initialize on the cloned plugin after the IExecutionContext is created; however, TensorRT may clone the plugin while creating the IExecutionContext and use that clone to run inference. Read https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#ipluginext to learn more.
I fixed one plugin as an example, see #24106. At least the following plugins have the same bug:
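For illustration, here is a minimal sketch of the pattern against the pre-TensorRT 8 IPluginV2 interface (no noexcept qualifiers). The plugin name and its data members (alpha_, axis_) are hypothetical and not taken from any Paddle-TRT plugin; the point is that clone() hands the new object every member it needs at enqueue time, since TensorRT may run inference with a clone that never goes through initialize() again:

```cpp
#include <cstring>
#include <string>
#include <cuda_runtime_api.h>
#include <NvInfer.h>

// Hypothetical plugin used only to illustrate the clone() requirement.
class MyPaddlePlugin : public nvinfer1::IPluginV2 {
 public:
  MyPaddlePlugin(float alpha, int axis) : alpha_(alpha), axis_(axis) {}

  // Called once while building the ICudaEngine. State prepared only here is
  // not recreated for clones made during IExecutionContext creation.
  int initialize() override { return 0; }
  void terminate() override {}

  // The clone must receive every data member it needs at enqueue time,
  // because TensorRT may clone the plugin while creating the
  // IExecutionContext and run inference with that clone without calling
  // initialize() on it again.
  nvinfer1::IPluginV2* clone() const override {
    auto* p = new MyPaddlePlugin(alpha_, axis_);
    p->setPluginNamespace(namespace_.c_str());
    return p;
  }

  const char* getPluginType() const override { return "my_paddle_plugin"; }
  const char* getPluginVersion() const override { return "1"; }
  int getNbOutputs() const override { return 1; }
  nvinfer1::Dims getOutputDimensions(int, const nvinfer1::Dims* inputs,
                                     int) override {
    return inputs[0];  // identity shape, just for the sketch
  }
  bool supportsFormat(nvinfer1::DataType type,
                      nvinfer1::PluginFormat format) const override {
    return type == nvinfer1::DataType::kFLOAT &&
           format == nvinfer1::PluginFormat::kLINEAR;
  }
  void configureWithFormat(const nvinfer1::Dims*, int, const nvinfer1::Dims*,
                           int, nvinfer1::DataType, nvinfer1::PluginFormat,
                           int) override {}
  size_t getWorkspaceSize(int) const override { return 0; }
  int enqueue(int, const void* const*, void**, void*,
              cudaStream_t) override {
    return 0;  // a real plugin would launch its CUDA kernel here using alpha_ and axis_
  }
  size_t getSerializationSize() const override {
    return sizeof(alpha_) + sizeof(axis_);
  }
  void serialize(void* buffer) const override {
    std::memcpy(buffer, &alpha_, sizeof(alpha_));
    std::memcpy(static_cast<char*>(buffer) + sizeof(alpha_), &axis_,
                sizeof(axis_));
  }
  void destroy() override { delete this; }
  void setPluginNamespace(const char* ns) override { namespace_ = ns; }
  const char* getPluginNamespace() const override { return namespace_.c_str(); }

 private:
  float alpha_;            // copied in clone()
  int axis_;               // copied in clone()
  std::string namespace_;  // copied in clone() via setPluginNamespace()
};
```

A clone() that copies only some members, or relies on initialize() to fill in the rest, leaves the cloned plugin with uninitialized state when TensorRT runs inference through it.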
@NHZlX
This is a serious bug. Please assign someone to fix it. Otherwise, all Paddle-TRT applications will crash after you upgrade TensorRT to the next version, because newer versions of TensorRT always use the cloned plugin to run inference.
Since you have not replied for more than a year, we are closing this issue/PR.
If the problem is not solved or you have a follow-up question, please reopen it at any time and we will continue to follow up.