xiakshay/Google-Text2Text-Flan-With-Docker

Google Text2Text generation

Used libraries

fastapi==0.74.*
requests==2.27.*
uvicorn[standard]==0.17.*
sentencepiece==0.1.*
torch==2.3.*
transformers==4.*

FastAPI is used to load the google/flan-t5-small model and serve it as a /generate endpoint in the deployed application.
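A minimal sketch of what such a service could look like, assuming the app lives in app.py (the file name and response shape are assumptions, not taken from the repository):

```python
# app.py -- hypothetical layout of the FastAPI service (a sketch, not the repo's actual code)
from fastapi import FastAPI
from transformers import pipeline

app = FastAPI()

# Load the text2text pipeline once at startup with the flan-t5-small checkpoint
generator = pipeline("text2text-generation", model="google/flan-t5-small")

@app.get("/generate")
def generate(text: str):
    # Run the prompt through flan-t5-small and return the generated text
    output = generator(text)
    return {"output": output[0]["generated_text"]}
```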

Deployment on HuggingFace

  • Drop the files into a new HuggingFace Space
  • Go to App
  • Go to Embed this Space
  • Click on the URL
  • To check the running app, go to /generate and try it out (see the example request below)
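An example request against the running Space; the URL below is a placeholder, so substitute the embed URL copied from your own Space:

```python
import requests

SPACE_URL = "https://<your-space>.hf.space"  # placeholder, replace with your Space's embed URL

# Call the /generate endpoint with a prompt and print the model's response
response = requests.get(f"{SPACE_URL}/generate", params={"text": "Translate English to German: How are you?"})
print(response.json())
```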

This is how you can deploy using Docker.
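A minimal Dockerfile sketch for such a Space; the file names and base image are assumptions, and port 7860 is the default port HuggingFace Docker Spaces expect the app to listen on:

```dockerfile
FROM python:3.9

WORKDIR /code

# Install the pinned dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

# Serve the FastAPI app with uvicorn on the port HuggingFace Spaces expects
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```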

About

Text2Text generation using google flan-t5-small, containerization using Docker, and deployment on HuggingFace.
