Ollama Watch · live LLM inference benchmarks
https://ollama.linkworksinc.com/
Cloud LLM inference, measured against a real local rig — every hour, with the receipts.
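The site's exact methodology isn't documented here, but Ollama's own `/api/generate` endpoint returns timing fields (`eval_count`, `eval_duration` in nanoseconds) that make this kind of throughput measurement straightforward. A minimal sketch — the host, model name, and `bench` helper are illustrative assumptions, not the site's actual code:

```python
import json
import urllib.request

def tokens_per_second(resp: dict) -> float:
    # Ollama reports eval_duration in nanoseconds; convert to tokens/sec
    return resp["eval_count"] / resp["eval_duration"] * 1e9

def bench(host: str, model: str, prompt: str) -> float:
    """POST a non-streaming request to Ollama's /api/generate and
    return decode throughput in tokens per second (hypothetical helper)."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as r:
        return tokens_per_second(json.load(r))

# e.g. bench("http://localhost:11434", "llama3.2", "Say hi.") against a local rig,
# then the same call against a cloud endpoint, on an hourly schedule.
```

Running the same prompt against a local and a remote endpoint on a timer is presumably all that's needed to produce hourly comparison numbers like these.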
I couldn't find an official status page for Ollama.com's cloud services, but I did find this via a post on Reddit.