Is it possible to convert the saved model to a TFLite model?
I downloaded the single-person SMPL model from
https://omnomnom.vision.rwth-aachen.de/data/metrabs/metrabs_singleperson_smpl.zip
and wanted to convert the SavedModel to a TFLite model with the TFLite converter:
model = tf.saved_model.load("./metrabs_singleperson_smpl/")
converter = tf.lite.TFLiteConverter.from_concrete_functions(model.__call__.concrete_functions)
tfmodel = converter.convert()
It then crashed with the following error:
InvalidArgumentError: Input 2 of node StatefulPartitionedCall was passed float from unknown:0 incompatible with expected resource.
Did I miss something during conversion? Thanks for your great work ~~~
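Note: the snippet below is only a minimal sketch of a commonly suggested variant of this conversion, not a confirmed fix for this model. It assumes TF 2.7+ (where from_concrete_functions also accepts the trackable object, so the model's variables are captured as resources rather than showing up as loose inputs) and that falling back to SELECT_TF_OPS is acceptable; the output filename is made up.

import tensorflow as tf

model = tf.saved_model.load("./metrabs_singleperson_smpl/")

# In TF 2.7+ the converter also takes the trackable object, so the
# model's variables are captured instead of appearing as dangling
# resource/float inputs (a common source of the "expected resource" error).
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    model.__call__.concrete_functions, model)

# The graph may use ops outside the builtin TFLite set, so allow a
# fallback to regular TensorFlow ops (larger binary, but more likely to convert).
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

tflite_model = converter.convert()
with open("metrabs_singleperson.tflite", "wb") as f:
    f.write(tflite_model)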
With the new model versions (TF2), are the mentioned operations still incompatible?
I'm new to this world and don't understand much, but I tried to convert like this:

converter = tf.lite.TFLiteConverter.from_saved_model("./models/metrabs_eff2l_y4")

and it raised an error:

ValueError: Only support at least one signature key.

It seems that there isn't any signature, but I checked the code (src/models/metrabs.py) and I saw a signature (and it isn't the only one). Am I missing something, or is this intentional behavior? Thanks for your hard work!

Maintainer reply:
Can you let me know what your use case would be? The big model needs a powerful GPU anyway, so it's not like you can deploy it easily on some lightweight embedded device. I'll try looking into it though.
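For anyone hitting the same ValueError, a minimal sketch of how one might check which signature keys the SavedModel actually exposes and pass one explicitly, assuming the directory path from the comment above; the signature name used below is only a placeholder, and if model.signatures turns out to be empty the model would first need to be re-exported with tf.saved_model.save(..., signatures=...).

import tensorflow as tf

model_dir = "./models/metrabs_eff2l_y4"

# List whatever serving signatures the SavedModel actually exposes.
model = tf.saved_model.load(model_dir)
print(list(model.signatures.keys()))

# If the list is non-empty, pass one of the printed keys explicitly;
# "detect_poses" is a placeholder, not a confirmed signature name.
converter = tf.lite.TFLiteConverter.from_saved_model(
    model_dir, signature_keys=["detect_poses"])
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()

Allowing SELECT_TF_OPS here is a trade-off: conversion is more likely to succeed, but the resulting .tflite file needs the TF ops runtime and will be larger, which may defeat the purpose of deploying on a lightweight device.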