Image batch processing on roboflow dedicated cloud server

Hi
I am trying to process a batch of images with a custom model and a workflow that is already created and ready for deployment.

I am following the documentation at Running Workflows - Roboflow Inference.

However, I am missing how the command refers to a specific input directory within my workspace. Can I point it to a directory of images that I have uploaded into my workspace? Should I create a dataset? Or will all images be uploaded sequentially from my client notebook to the cloud inference server?

inference workflows process-images-directory \
  -i {your_input_directory} \
  -o {your_output_directory} \
  --workspace_name {my-workspace} \
  --workflow_id {my-workflow} \
  --api-key {my-api-key}

Please kindly help. I have more than 10,000 images per day to be inferred, so there must be a reasonable way to do this in batch mode.

thx & cheers
Thomas

Hi Thomas, thank you for your message. Do you currently have a directory of images on your client machine? In this case, the input directory and output directory refer to local directories in the execution environment for the inference SDK.

In other words, your workflow defines the steps to take for each image in the batch, and the folders refer not to your Roboflow workspace but to the development machine that is executing the command.

