To get started as quickly as possible, see the environment preparation tutorial, which shows how to set up a full environment in minutes.
Step 1: Kafka Broker
First, we will need a Kafka broker to collect all KServe inference requests and responses:
```yaml
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: sklearn-iris-broker
  namespace: default
  annotations:
    eventing.knative.dev/broker.class: Kafka
spec:
  config:
    apiVersion: v1
    kind: ConfigMap
    name: inferencedb-kafka-broker-config
    namespace: knative-eventing
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: inferencedb-kafka-broker-config
  namespace: knative-eventing
data:
  # Number of topic partitions
  default.topic.partitions: "8"
  # Replication factor of topic messages
  default.topic.replication.factor: "1"
  # A comma-separated list of bootstrap servers (they can be inside or outside the k8s cluster)
  bootstrap.servers: "kafka-cp-kafka.default.svc.cluster.local:9092"
```
Step 2: InferenceService
Next, we will serve a simple sklearn model using KServe:
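A minimal sketch of such an InferenceService follows. The `storageUri` points at KServe's public sklearn iris example model, and the `logger.url` uses the Knative Kafka broker ingress address for the `sklearn-iris-broker` created in Step 1 (the `http://kafka-broker-ingress.knative-eventing.svc.cluster.local/<namespace>/<broker>` pattern); adjust both to match your cluster.

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
  namespace: default
spec:
  predictor:
    # Log every request and response to the Kafka broker from Step 1
    logger:
      mode: all
      url: http://kafka-broker-ingress.knative-eventing.svc.cluster.local/default/sklearn-iris-broker
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
```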
Note the logger section - you can read more about it in the KServe documentation.
Step 3: InferenceLogger
Finally, we can log the predictions of our new model using InferenceDB:
```yaml
apiVersion: inferencedb.aporia.com/v1alpha1
kind: InferenceLogger
metadata:
  name: sklearn-iris
  namespace: default
spec:
  # NOTE: The format is knative-broker-<namespace>-<brokerName>
  topic: knative-broker-default-sklearn-iris-broker
  events:
    type: kserve
    config: {}
  destination:
    type: confluent-s3
    config:
      url: s3://aporia-data/inferencedb
      format: parquet
  # Optional - Only if you want to override column names
  schema:
    type: avro
    config:
      columnNames:
        inputs: [sepal_width, petal_width, sepal_length, petal_length]
        outputs: [flower]
```
Step 4: Send requests
First, we will need to port-forward the Istio service so we can access it from our local machine:
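A sketch of the port-forward and a sample prediction request, assuming the Istio ingress gateway runs as `istio-ingressgateway` in the `istio-system` namespace (the KServe default) and that the model accepts the standard four iris features; the `Host` header follows KServe's default `<name>.<namespace>.example.com` pattern:

```shell
# Forward the Istio ingress gateway to localhost:8080
kubectl port-forward -n istio-system svc/istio-ingressgateway 8080:80

# In another terminal, send a prediction request to the InferenceService
curl -H "Host: sklearn-iris.default.example.com" \
  -H "Content-Type: application/json" \
  http://localhost:8080/v1/models/sklearn-iris:predict \
  -d '{"instances": [[6.8, 2.8, 4.8, 1.4]]}'
```

Each request and its response will be logged to the Kafka broker, picked up by InferenceDB, and written to S3 as Parquet.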