Hi,
I have the following Prometheus alert, which is meant to fire when errors make up more than 5% of the total requests for a service:
  sum(increase(errorMetric{service_name="someservice"}[5m]))
    /
  sum(increase(http_requests_count{service_name="someservice", path="/some/path"}[5m]))
    > 0.05
Is this the correct way to represent the intention behind this alert?
I have a rough idea of the traffic: it averages around 100 requests per hour over a 24-hour period. How valuable is a 5m range here? With roughly 100 requests per hour, a 5m window only covers about 8 requests on average, so a single error is already enough to push the ratio well above 5%. Should the range cover a longer period instead, e.g. 1h, as in the sketch below? Right now the alert fires without really pointing to an actual problem. What is your view?
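To make the alternative concrete, this is roughly how I imagine the rule would look in the rules file with a 1h window and a "for" duration to absorb short blips. The group name, alert name, "for" value, labels, and annotations below are placeholders I made up for illustration; only the expression mirrors the one above:

  groups:
    - name: someservice-alerts
      rules:
        - alert: SomeServiceHighErrorRate
          expr: |
            sum(increase(errorMetric{service_name="someservice"}[1h]))
              /
            sum(increase(http_requests_count{service_name="someservice", path="/some/path"}[1h]))
              > 0.05
          for: 10m
          labels:
            severity: warning
          annotations:
            summary: Error rate for someservice above 5% over the last hour

Is this the right direction for a low-traffic service, or is there a better pattern?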
Thank you