Real-time Models (Postgres)
For real-time models with mid-level throughput (e.g. models behind an HTTP endpoint such as POST /predict), you can insert predictions into a database such as Postgres, MySQL, or even Elasticsearch.
Dealing with billions of predictions?
If you are dealing with billions of predictions, this solution might not be sufficient for you.
Please refer to the guide on real-time models with Kafka.
Example: FastAPI + SQLAlchemy
If you are serving models with Flask or FastAPI and don't have extremely high throughput, you can simply insert predictions into a standard database.
Here, we'll use SQLAlchemy, a Python ORM that lets us avoid writing raw SQL INSERT statements in favor of something a bit nicer. Please see the FastAPI + SQLAlchemy tutorial for more details.
First, we can define the structure of our database table with a SQLAlchemy model, and the structure of the request body with a Pydantic schema:
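The sketch below is one possible way to do this; the table name, feature columns, and the DATABASE_URL connection string are illustrative and should be adapted to your own schema.

```python
# A minimal sketch, assuming a Postgres database reachable at DATABASE_URL
# and a model with two numeric input features; all names are illustrative.
from datetime import datetime

from pydantic import BaseModel
from sqlalchemy import Column, DateTime, Float, Integer, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

DATABASE_URL = "postgresql://user:password@localhost:5432/predictions"

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()


class Prediction(Base):
    """SQLAlchemy model -- one row per prediction served."""
    __tablename__ = "predictions"

    id = Column(Integer, primary_key=True, index=True)
    feature_1 = Column(Float, nullable=False)
    feature_2 = Column(Float, nullable=False)
    prediction = Column(Float, nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)


class PredictRequest(BaseModel):
    """Pydantic schema for the request body of POST /predict."""
    feature_1: float
    feature_2: float


# Create the table if it doesn't exist yet
Base.metadata.create_all(bind=engine)
```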
And here is a sample implementation of the POST /predict endpoint:
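The following is a minimal sketch, assuming the Prediction model, PredictRequest schema, and SessionLocal factory defined above; the model file path and the scikit-learn-style model.predict() call are placeholders for your own inference code.

```python
# A minimal sketch, assuming the Prediction, PredictRequest, and SessionLocal
# objects defined above; the model path and predict() call are placeholders.
import joblib
from fastapi import Depends, FastAPI
from sqlalchemy.orm import Session

app = FastAPI()
model = joblib.load("model.joblib")  # illustrative path to a trained model


def get_db():
    """Open a database session per request and close it afterwards."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


@app.post("/predict")
def predict(request: PredictRequest, db: Session = Depends(get_db)):
    # Run the model on the incoming features (sklearn-style call, adapt as needed)
    y_pred = float(model.predict([[request.feature_1, request.feature_2]])[0])

    # Insert the prediction into Postgres via SQLAlchemy
    row = Prediction(
        feature_1=request.feature_1,
        feature_2=request.feature_2,
        prediction=y_pred,
    )
    db.add(row)
    db.commit()
    db.refresh(row)

    return {"id": row.id, "prediction": y_pred}
```

Each request opens its own session through the get_db dependency, so a failed insert only affects that request, and the prediction is returned together with its row id for later lookup.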