Things were working earlier in the day, but now the same code fails.
My REST requests are getting 401 Auth errors even though the API key is the same one shown in my settings. The “Visualize” page’s “Try on my machine” option locked up and is still stuck even after refreshing multiple times.
Wondering if maybe something is broken?
Edit: Just revoked my key and generated a new one; same issue. No code has changed since earlier when it worked.
It’s funny… yesterday I was struggling to get inference offline on devices working, and today I thought “Well, maybe I can just do it over REST for now.”
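For reference, the failing requests look roughly like this (a minimal sketch; the model ID, version, and key below are placeholders, not my real ones):

import base64
import requests

# Read a test image and base64-encode it, as the hosted API expects
with open("test.jpg", "rb") as f:
    img_b64 = base64.b64encode(f.read()).decode("utf-8")

# POST to the serverless inference endpoint with the API key as a query param
response = requests.post(
    "https://detect.roboflow.com/my-project/1",
    params={"api_key": "MY_API_KEY"},
    data=img_b64,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)

print(response.status_code)  # currently 401 instead of 200
print(response.json())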
Could you tell me which models you are trying to use and which API endpoints are failing for you?
Our status checks (https://status.roboflow.com/) all look fine and we are not aware of any issues with our hosted services – we also just did manual testing on several models without issues.
If you are working on private workspaces or models I can still try to debug internally on our systems if you give me the workspace id or model id. If you don’t want to post them here on the public forum, feel free to email me at thomas@roboflow.com
@Am_Pro, @mattkenefick, @Enas We’re looking into this more and think we have identified an issue with some public models being used on the hosted service. Investigating now and will update you here when we have a resolution. Thank you for reporting the issue.
We just rolled back a change deployed last night that may have caused authorization issues on some public models on the serverless API. Since the rollback, we are seeing the errors disappear in the logs.
I’m experiencing a problem deploying a model. It seems like a Roboflow outage problem, could somebody confirm this?
{
  "message": "The model upload service is temporarily disabled. It will be re-enabled soon, and we will notify you when it is back online.",
  "type": "ServiceUnavailableException",
  "hint": "Please try again later."
}
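In case it helps, this is roughly the upload call that returns the response above (using the roboflow Python package, if I remember the call correctly; the workspace, project, and paths are placeholders):

from roboflow import Roboflow

# Placeholder API key and workspace/project names
rf = Roboflow(api_key="MY_API_KEY")
project = rf.workspace("my-workspace").project("my-project")

# Upload locally trained weights to a dataset version; this is the call
# that currently comes back with ServiceUnavailableException
project.version(1).deploy(model_type="yolov8", model_path="runs/detect/train/")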
The model upload is a different issue. We have temporarily disabled the model upload service because a security vulnerability was just discovered in PyTorch. We are working on patching our services to prevent this vulnerability from being exploited, but until then we unfortunately need to leave the model upload service disabled.
Thanks for the prompt response. Does this mean there is currently no way at all to deploy a model to Roboflow? This seems like a major blocker for a lot of people. Is there a workaround?