# How to configure GreatAI
GreatAI aims to provide reasonable defaults wherever possible. The current configuration is always prominently displayed (and updated) on the dashboard and in the command-line start-up banner.
You can override any of the default settings by calling `great_ai.configure`. If you don't call
`configure`, the default settings are applied on the first call to most of the library's functions.
For example (a sketch: the exact keyword names are assumptions, so check the dashboard or the start-up banner for the authoritative ones):

```python
from great_ai import configure, RouteConfig

configure(
    prediction_cache_size=0,  # completely disable caching
    route_config=RouteConfig(docs_endpoints_enabled=False),
)  # the unspecified routes are enabled by default
```
## Using remote storage
The only aspect that cannot be automated is choosing the backing store for the tracing database and for your large files.
Without explicit configuration, LargeFileLocal is selected by default. This implementation still version-controls your files, but it stores them only under a local path (which can, of course, be a remote volume mounted via NFS, HDFS, etc.).
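To opt into S3-backed storage instead, the credentials can be supplied explicitly. A minimal sketch, assuming `LargeFileS3` exposes the same `configure_credentials_from_file` class method as `MongoDbDriver`, and that `configure` accepts a `large_file_implementation` argument (both are assumptions to verify against the library reference):

```python
from great_ai import configure
from great_ai.large_file import LargeFileS3

# Explicitly load the S3 credentials from an INI file; without this call,
# an s3.ini found in the current working directory is picked up automatically.
LargeFileS3.configure_credentials_from_file("s3.ini")

configure(large_file_implementation=LargeFileS3)
```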
- This line isn't strictly necessary: if a correspondingly named config file (`s3.ini` or
`mongo.ini`) is available in the current working directory, it is automatically used to configure the respective LargeFile implementation/database.
**Departing from AWS**

By setting the `aws_endpoint_url` argument, it is possible to use any other S3-compatible service, such as Backblaze.
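In that case, the override could live in your `s3.ini`. The key names below are illustrative assumptions; only `aws_endpoint_url` comes from the text above:

```ini
; hypothetical s3.ini for an S3-compatible, non-AWS provider
aws_access_key_id = <your key id>
aws_secret_access_key = <your application key>
aws_endpoint_url = https://s3.eu-central-003.backblazeb2.com
```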
GridFS specifies how to store files in MongoDB. The official MongoDB server and many compatible implementations support it.
**Simplifying config files**

You can combine `mongo.ini` and
`s3.ini` with your application's config file, because the unneeded keys are ignored by each credentials loader.
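For instance, a single combined file could serve your application and both drivers. This is a sketch; apart from the merging idea, every key name here is an assumption:

```ini
; hypothetical combined config.ini
; consumed by the S3 LargeFile backend (assumed key names)
aws_access_key_id = <your key id>
aws_secret_access_key = <your application key>

; consumed by the MongoDB tracing driver (assumed key names)
mongo_connection_string = mongodb://localhost:27017
mongo_database = great_ai_traces

; your own application's settings are simply ignored by both loaders
app_log_level = info
```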
## Using a database
By default, a thread-safe version of TinyDB is utilised for saving the prediction traces into a local file. Unfortunately, this is not suitable for most production deployments.
Currently, only MongoDB is supported as a production-ready
`TracingDatabase`. To use it, either place a file named
`mongo.ini` in your working directory, or explicitly call `MongoDbDriver.configure_credentials_from_file` or `MongoDbDriver.configure_credentials`.
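A minimal `mongo.ini` might look like the following; both key names are assumptions, so check the start-up banner or the library reference for the authoritative ones:

```ini
; hypothetical mongo.ini
mongo_connection_string = mongodb://localhost:27017
mongo_database = my_great_ai_traces
```

Alternatively, pass the same values programmatically via `MongoDbDriver.configure_credentials`.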