
Commit 5b91333

fix: include total batch size in inference completion message

1 parent 82ab02c · 1 file changed · 1 addition & 1 deletion

src/ds_platform_utils/metaflow/batch_inference_pipeline.py
```diff
@@ -234,7 +234,7 @@ def inference_worker():
             with _timer(f"🔹 Generated predictions for file {file_id}"):
                 predictions_df = predict_fn(df)
             inference_queue.put((file_id, predictions_df), timeout=timeout_per_batch)
-            print(f"🔘 Inference completed for batch {file_id}")
+            print(f"🔘 Inference completed for batch {file_id} of {len(file_batches)}")

     def upload_worker():
         while True:
```
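For context, the changed line sits inside a producer worker that pushes predictions onto a bounded queue and logs progress. The sketch below is a hypothetical reconstruction of that pattern, not the actual pipeline code: `file_batches`, `predict_fn`, `inference_queue`, and `timeout_per_batch` are taken from the diff, while `run_batch_inference` and the return value are invented for illustration. It shows why the fix helps: reporting `{file_id} of {len(file_batches)}` turns an opaque log line into a progress indicator.

```python
import queue
import threading


def run_batch_inference(file_batches, predict_fn, timeout_per_batch=60):
    """Hypothetical sketch of the worker pattern around the changed log line."""
    inference_queue = queue.Queue()
    messages = []  # collected instead of printed, so the behavior is testable

    def inference_worker():
        # Iterate over all batches; file_id is 1-based to match a human-readable log.
        for file_id, df in enumerate(file_batches, start=1):
            predictions_df = predict_fn(df)
            # put() blocks up to timeout_per_batch seconds if the queue is full.
            inference_queue.put((file_id, predictions_df), timeout=timeout_per_batch)
            # After the fix: progress is reported against the total batch count.
            messages.append(
                f"🔘 Inference completed for batch {file_id} of {len(file_batches)}"
            )

    worker = threading.Thread(target=inference_worker)
    worker.start()
    worker.join()
    return messages
```

A one-element change to the f-string, but in a long-running pipeline it is the difference between "batch 37 done" and "batch 37 of 412 done".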
