Possible to load model from memory? #44
Support for frozen models (i.e. all-in-one models, with variables and structure converted to a single optimized .pb file) has been deprecated in TensorFlow 2.0 and is going to be removed soon - that's why tfgo supports only loading from disk. But maybe I misunderstood your question: do you have a SavedModel loaded in memory? If instead, as I thought, you have a frozen model: that feature was present, and you can go back to a specific commit (0663583) and load your frozen model using that version of tfgo (you also need an old TensorFlow C runtime installed, like 1.x).
Loading models from a []byte would have a few advantages.
Unfortunately, it looks like the suggested workaround of going back to an older TensorFlow version isn't really viable.
Unfortunately, the SavedModel serialization format is a folder. Maybe some kind of abstraction could be designed to accept a byte array and let tfgo present it as a path, since tfgo just invokes the standard TensorFlow C API for loading saved models, and that API wants a location on disk. But honestly, I don't know whether this is feasible, or how complex it would become to design and implement.
The tfgo.LoadModel() method requires the path to the model file on disk. Suppose I have a []byte that contains a model file's content, just downloaded from the internet. Is it possible to load a model directly from this []byte?