Request payload too large (413) when running inference on Roboflow

Hello,

My name is Boris Sekachev. I am a developer at CVAT.ai.
We have an integration with Roboflow that allows our users to perform automatic labeling on their tasks.

To send an image for inference, we encode the file as base64 and use it as the POST request body.
Sometimes this leads to a failed request with code 413. The reason is clear, but I am trying to sort out the current limitations on POST requests to detect.roboflow.com. Are they documented anywhere?
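For reference, a minimal sketch of what our request looks like (the model id/version and API key below are placeholders):

```python
import base64

import requests

# Placeholder model id/version and API key.
MODEL_ID = "my-project/1"
API_KEY = "YOUR_API_KEY"

# Read the image and base64-encode it for the request body.
with open("image.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("ascii")

# Send the base64 string as the POST body to the hosted detection endpoint.
response = requests.post(
    f"https://detect.roboflow.com/{MODEL_ID}",
    params={"api_key": API_KEY},
    data=encoded,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(response.status_code, response.json())
```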

In another topic I’ve seen 10MB mentioned as the body limit, but that does not seem to match my experiments: for example, a 4.66MB body results in a 413, while a 2.9MB body works fine.

Thank you in advance.

Try sending the image as a multipart form request instead of base64 in the request body. 10MB is a hard limit in AWS API Gateway, but I think the request body may get duplicated internally, so the effective limit drops to about 5MB with certain request formats.
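Roughly something like this with the Python requests library; the `file` field name is an assumption on my part, and the model id and API key are placeholders:

```python
import requests

MODEL_ID = "my-project/1"   # placeholder
API_KEY = "YOUR_API_KEY"    # placeholder

# Send the raw bytes as multipart/form-data instead of base64 in the body.
# This avoids the ~33% base64 size overhead; the "file" field name is an
# assumption, since the inference docs don't show a multipart example.
with open("image.jpg", "rb") as f:
    response = requests.post(
        f"https://detect.roboflow.com/{MODEL_ID}",
        params={"api_key": API_KEY},
        files={"file": ("image.jpg", f, "image/jpeg")},
    )
print(response.status_code, response.json())
```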

An alternative is to use the image param and send a URL if the image is already hosted elsewhere (e.g. on S3), or to resize the image before sending it across the wire (which is also desirable for improving network latency). Rough sketches of both options below.
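Both options sketched in Python; the model id, API key, and image URL are placeholders:

```python
import base64
import io

import requests
from PIL import Image

MODEL_ID = "my-project/1"   # placeholder
API_KEY = "YOUR_API_KEY"    # placeholder

# Option 1: the image is already hosted somewhere, so pass its URL via the
# "image" query parameter instead of uploading the bytes.
response = requests.post(
    f"https://detect.roboflow.com/{MODEL_ID}",
    params={"api_key": API_KEY, "image": "https://example.com/image.jpg"},
)
print(response.status_code, response.json())

# Option 2: downscale the image locally before encoding so the request body
# stays well under the gateway limit.
img = Image.open("image.jpg")
img.thumbnail((1024, 1024))
buf = io.BytesIO()
img.save(buf, format="JPEG", quality=90)
encoded = base64.b64encode(buf.getvalue()).decode("ascii")

response = requests.post(
    f"https://detect.roboflow.com/{MODEL_ID}",
    params={"api_key": API_KEY},
    data=encoded,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(response.status_code, response.json())
```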

Hi @brad
Thank you for the clarification.

As I can see from the documentation, “multipart/form-data” may be used for uploading an image file, but it does not appear to be supported for inference (see the cURL tab).

Am I looking in the wrong place, or is this feature just not documented?
