- Project Type: Object Detection
- Operating System & Browser: not relevant
- Project Universe Link or Workspace/Project ID: not relevant
Hello,
I am trying to use the Hosted API to send an image from my React Native app and then receive the annotated image back. However, I am running into some issues with the response.
Here’s what I have:
const sendFrameToServer = async (base64Image: string) => {
  setLoading(true);
  axios({
    method: "POST",
    url: "xxxx",
    params: {
      api_key: "xxxx",
      labels: true,
      format: "image",
    },
    data: base64Image,
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
    },
  })
    .then(function (response) {
      setLoading(false);
      const blob = new Blob([response.data], { type: "image/jpeg" });
      const imageUrl = URL.createObjectURL(blob);
      setImage(imageUrl);
      console.log(imageUrl);
    })
    .catch(function (error) {
      console.log(error.message);
    });
};
When I log the imageUrl I get this:
WARN Received data was not a string, or was not a recognised encoding.
LOG blob:337b3060-7593-4121-9367-6a60148f9bbc?offset=0&size=0
I understand from the docs that the API returns the image with annotated predictions as a binary blob with a Content-Type of image/jpeg, but I'm not sure how to actually consume that response and display the image.
I’m trying to use it like this:
{image && (
  <Image
    source={{ uri: image }}
    style={{
      position: "absolute",
      top: 0,
      left: 0,
      width: "100%",
      height: "100%",
      zIndex: 1,
    }}
  />
)}
but to no avail. What am I missing here? I just want to display the returned image.
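In case it clarifies what I'm attempting, here is the direction I've been considering instead, based on my reading that the response body is raw JPEG bytes: ask axios for an arraybuffer and turn it into a base64 data URI for <Image>. This is only a sketch, not something I have working; it assumes the buffer npm package is installed for the base64 conversion, and setLoading/setImage are the same state setters as in my code above.

import axios from "axios";
import { Buffer } from "buffer"; // assumption: the 'buffer' npm package is installed

const sendFrameToServer = async (base64Image: string) => {
  setLoading(true);
  try {
    const response = await axios({
      method: "POST",
      url: "xxxx",
      params: {
        api_key: "xxxx",
        labels: true,
        format: "image",
      },
      data: base64Image,
      headers: {
        "Content-Type": "application/x-www-form-urlencoded",
      },
      // Ask axios for the raw bytes instead of letting it decode the body as text
      responseType: "arraybuffer",
    });
    // Convert the raw JPEG bytes into a base64 data URI that <Image> can render
    const base64 = Buffer.from(response.data).toString("base64");
    setImage(`data:image/jpeg;base64,${base64}`);
  } catch (error) {
    console.log((error as Error).message);
  } finally {
    setLoading(false);
  }
};

My understanding is that source={{ uri: image }} also accepts a data URI, so the <Image> block above would stay the same. Is this the right direction, or is there a simpler way to display the blob directly?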