# Train and deploy a SOTA model
See great-ai in action by going over the lifecycle of a simple service.
- You will see how `great_ai.utilities` can integrate into your data science workflow.
- You will use `great_ai.large_file` to version and store your trained model.
- You will use `GreatAI` to prepare your model for a robust and responsible deployment.
You will train a field-of-study (domain) classifier for scientific sentences. The task was proposed in the SciBERT paper, in which SciBERT achieved an F1-score of 0.6571. We are going to outperform it using a much simpler text classification model: a linear SVM.
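As a preview of the modelling step, a linear SVM over TF-IDF features can be sketched with scikit-learn as below. The sentences, labels, and hyperparameters here are illustrative stand-ins, not the tutorial's actual dataset or configuration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

# Toy stand-in data; the real tutorial trains on the SciBERT
# field-of-study sentences.
sentences = [
    'The patients were administered a placebo in the control group.',
    'We prove the theorem by induction on the graph diameter.',
    'Cell cultures were incubated at 37 degrees for 48 hours.',
    'The algorithm converges in O(n log n) comparisons.',
]
labels = ['medicine', 'computer science', 'biology', 'computer science']

# TF-IDF features fed into a linear SVM: a simple but strong
# text-classification baseline.
model = Pipeline([
    ('tfidf', TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True)),
    ('svm', LinearSVC()),
])
model.fit(sentences, labels)
print(model.predict(['We benchmark the sorting algorithm on random inputs.']))
```

Despite its simplicity, this kind of pipeline is a competitive baseline for sentence classification, which is what lets it beat the reported SciBERT score on this task.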
You are ready to start the tutorial. Feel free to return to the summary section once you're finished.
After training and evaluating your model, export it with `great_ai.save_model`.
To store your model remotely, you must set your credentials before calling `save_model`.
For example, to use AWS S3:
```python
from great_ai.large_file import LargeFileS3

LargeFileS3.configure(
    aws_region_name='eu-west-2',
    aws_access_key_id='MY_AWS_ACCESS_KEY',
    aws_secret_access_key='MY_AWS_SECRET_KEY',
    large_files_bucket_name='my_bucket_for_models',
)

from great_ai import save_model

save_model(model, key='my-domain-predictor')
```
For more info, check out the configuration how-to page.
We create an inference function that can be hardened by wrapping it in a `GreatAI` instance.
- `@use_model` loads and injects your model into the `model` argument. You can freely reference it, knowing that the function is always provided with it.
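Put together, the wrapped inference function might look like the sketch below. The decorator names follow the great-ai documentation, but treat the exact signature and the function body as assumptions rather than the tutorial's verbatim code:

```python
from great_ai import GreatAI, use_model

@GreatAI.create  # hardens the function: validation, logging, dashboard
@use_model('my-domain-predictor')  # injects the model saved earlier
def predict_domain(sentence: str, model) -> str:
    # `model` is the trained pipeline loaded for us by @use_model,
    # so the function body only has to run the prediction itself.
    return model.predict([sentence])[0]
```

Because the decorators handle loading, caching, and tracing, the inference function stays a plain, easily testable Python function.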
Finally, we test the model's inference function through the GreatAI dashboard. The only thing left is to deploy the hardened service properly.