w/ @p.peczek
Sorry, I’m new to the inference package, so I’m sure I’m confused about something.
I installed the inference server on my own personal server, not hosted in the cloud (Roboflow, AWS, etc.).
So my thought is that a Roboflow API key has nothing to do with the 3 workspaces captured above, but then how do I match up the key… The model itself seems to be downloaded inside the Docker container.
In other words, which API_KEY would be associated with a workspace inside the local Docker container?
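For reference, I’m calling the endpoint roughly like this (a sketch of my client from memory; the port, key, and file name are placeholders, and I’m guessing the body shape from the traceback below):

    import base64

    import requests

    # Read an image and base64-encode it; b64encode keeps the '=' padding.
    with open("example.jpg", "rb") as f:
        img_b64 = base64.b64encode(f.read()).decode("ascii")

    # The api_key goes in the query string, as in the POST line of the log below.
    resp = requests.post(
        "http://localhost:9001/sam/embed_image",
        params={"api_key": "MY_ROBOFLOW_API_KEY"},
        json={"image": {"type": "base64", "value": img_b64}},
    )
    print(resp.status_code, resp.text)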
Here is the Docker log:
INFO: 172.17.0.1:35882 - "GET / HTTP/1.1" 304 Not Modified
Traceback (most recent call last):
  File "/app/inference/core/interfaces/http/http_api.py", line 163, in wrapped_route
    return await route(*args, **kwargs)
  File "/app/inference/core/interfaces/http/http_api.py", line 1079, in sam_embed_image
    model_response = await self.model_manager.infer_from_request(
  File "/app/inference/core/managers/decorators/fixed_size_cache.py", line 91, in infer_from_request
    return await super().infer_from_request(model_id, request, **kwargs)
  File "/app/inference/core/managers/decorators/base.py", line 69, in infer_from_request
    return await self.model_manager.infer_from_request(model_id, request, **kwargs)
  File "/app/inference/core/managers/active_learning.py", line 147, in infer_from_request
    prediction = await super().infer_from_request(
  File "/app/inference/core/managers/active_learning.py", line 35, in infer_from_request
    prediction = await super().infer_from_request(
  File "/app/inference/core/managers/base.py", line 95, in infer_from_request
    rtn_val = await self.model_infer(
  File "/app/inference/core/managers/base.py", line 152, in model_infer
    return self._models[model_id].infer_from_request(request)
  File "/app/inference/models/sam/segment_anything.py", line 134, in infer_from_request
    embedding, _ = self.embed_image(**request.dict())
  File "/app/inference/models/sam/segment_anything.py", line 110, in embed_image
    img_in = self.preproc_image(image)
  File "/app/inference/models/sam/segment_anything.py", line 181, in preproc_image
    np_image = load_image_rgb(image)
  File "/app/inference/core/utils/image_utils.py", line 42, in load_image_rgb
    np_image, is_bgr = load_image(
  File "/app/inference/core/utils/image_utils.py", line 81, in load_image
    np_image, is_bgr = load_image_with_known_type(
  File "/app/inference/core/utils/image_utils.py", line 164, in load_image_with_known_type
    image = loader(value, cv_imread_flags)
  File "/app/inference/core/utils/image_utils.py", line 255, in load_image_base64
    value = pybase64.b64decode(value)
binascii.Error: Incorrect padding
INFO: 172.17.0.1:40558 - "POST /sam/embed_image?api_key=2pKBv3TsNPQAmJm6AMzm HTTP/1.1" 500 Internal Server Error
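If I’m reading the traceback right, the request dies inside pybase64.b64decode before the API key or model is even consulted. Here’s a minimal repro of that exact error using the stdlib base64 module, which I assume behaves the same as pybase64 here:

    import base64
    import binascii

    encoded = base64.b64encode(b"test image bytes").decode("ascii")
    base64.b64decode(encoded)  # fine: b64encode keeps the '=' padding

    stripped = encoded.rstrip("=")  # what a client that drops the padding would send
    try:
        base64.b64decode(stripped)
    except binascii.Error as e:
        print(e)  # -> Incorrect padding

So maybe my image string is losing its '=' padding somewhere in transit, but I still don’t understand how the API key is supposed to map to a workspace on a self-hosted server.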
Thank you