Alertmanager sending notifications sporadically

Hello, folks. I have a problem with Alertmanager not sending notifications. Sometimes it sends the first notification after receiving an alert from Prometheus, but subsequent notifications (resolved and repeated ones) never arrive. I have run out of ideas as to what the problem might be. The email service itself works as expected.
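To rule out the mail path, I exercised the SMTP relay directly with the same settings as in the email receiver further down (hosts and credentials here are the same redacted placeholders, not real values); a message sent this way arrives fine:

```python
# Minimal sketch to test the SMTP path independently of Alertmanager.
# Host, port, and credentials are the redacted placeholders from the config.
import smtplib
from email.message import EmailMessage


def build_test_message(sender: str, recipient: str) -> EmailMessage:
    """Construct a plain-text test message."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Alertmanager SMTP path test"
    msg.set_content("If this arrives, the SMTP relay itself is fine.")
    return msg


def send_test_message(host: str, port: int, username: str, password: str,
                      msg: EmailMessage) -> None:
    """Open a STARTTLS session, authenticate, and send one message."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        smtp.login(username, password)
        smtp.send_message(msg)


# Usage (against the real relay):
# msg = build_test_message("sender@example.com", "user@example.com")
# send_test_message("smtp.example.com", 587, "auth-user", "secret", msg)
```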

Alertmanager (0.31.1) and Prometheus (3.9.1) are managed by the Prometheus Operator (0.89) on K8s 1.33.

Alertmanager reports receiving the alert, but that is about it: it logs no notify activity at all. I am using a single Prometheus rule to generate this alert. Since I am only validating the configuration and delivery of notifications, the alerting rule fires permanently:

groups:
- name: vpzz.rules
  rules:
  - alert: UpAlert
    annotations:
      description: 'A problem occurred: service {{ $labels.service }}, container 
        {{ $labels.container }}, pod {{ $labels.pod }}.'
      summary: 'Value: {{ $value }}'
    expr: up{endpoint="http-web",namespace="prometheus-operator",service="kube-prometheus-stack-prometheus"} == 1
    for: 1m
    labels:
      severity: critical
      test: "true" 

The Alertmanager configuration looks like this; route.routes[0] is my custom route, configured via an AlertmanagerConfig CR:

global:
  resolve_timeout: 5m
route:
  receiver: "null"
  group_by:
  - namespace
  routes:
  - receiver: prometheus-operator/test-config/acs
    group_by:
    - alertname
    matchers:
    - severity=~"critical"
    - job=~"kube-prometheus-stack-prometheus"
    - namespace="prometheus-operator"
    continue: true
    group_wait: 30s
    group_interval: 2m
    repeat_interval: 1h
  - receiver: "null"
    matchers:
    - alertname = "Watchdog"
  group_wait: 30s
  group_interval: 5m
  repeat_interval: 12h
inhibit_rules:
- target_matchers:
  - severity =~ warning|info
  source_matchers:
  - severity = critical
  equal:
  - namespace
  - alertname
- target_matchers:
  - severity = info
  source_matchers:
  - severity = warning
  equal:
  - namespace
  - alertname
- target_matchers:
  - severity = info
  source_matchers:
  - alertname = InfoInhibitor
  equal:
  - namespace
- target_matchers:
  - alertname = InfoInhibitor
receivers:
- name: "null"
- name: prometheus-operator/test-config/acs
  email_configs:
  - send_resolved: true
    to: user@example.com
    from: sender@example.com
    hello: alertmanager.local
    smarthost: smtp.example.com:587
    auth_username: auth-user
    auth_password: secret
templates:
- /etc/alertmanager/config/*.tmpl
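For reference, this is my (possibly naive) mental model of the timers on the custom route. It is just my reading of the group_wait/group_interval/repeat_interval docs, not how Alertmanager actually schedules flushes, but it is why I expect repeats roughly hourly:

```python
# Rough sketch of when I'd expect notifications for the custom route,
# assuming a continuously firing group whose membership never changes.
# Offsets are measured from the first alert reaching the aggregation group.
from datetime import timedelta

GROUP_WAIT = timedelta(seconds=30)      # delay before the first notification
GROUP_INTERVAL = timedelta(minutes=2)   # wait before notifying about group changes
REPEAT_INTERVAL = timedelta(hours=1)    # re-send for a still-firing, unchanged group


def expected_notifications(horizon: timedelta) -> list[timedelta]:
    """Offsets at which an unchanged, still-firing group should notify."""
    times = [GROUP_WAIT]                           # first flush after group_wait
    while times[-1] + REPEAT_INTERVAL <= horizon:
        times.append(times[-1] + REPEAT_INTERVAL)  # then every repeat_interval
    return times


# Over a 3h window: flushes at ~0:00:30, 1:00:30, 2:00:30.
```

With these values I would expect at least one email every hour while the rule fires, plus a resolved email once it stops; I get neither.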

The labels in the firing alert match the matcher configuration. The firing rule at Prometheus looks like this:

...
            "alerts": [
              {
                "labels": {
                  "alertname": "UpAlert",
                  "container": "prometheus",
                  "endpoint": "http-web",
                  "instance": "10.244.5.2:9090",
                  "job": "kube-prometheus-stack-prometheus",
                  "namespace": "prometheus-operator",
                  "pod": "prometheus-kube-prometheus-stack-prometheus-0",
                  "service": "kube-prometheus-stack-prometheus",
                  "severity": "critical",
                  "test": "true"
                },
                "annotations": {
                  "description": "A problem occurred: service kube-prometheus-stack-prometheus, container prometheus, pod prometheus-kube-prometheus-stack-prometheus-0.",
                  "summary": "Value: 1"
                },
                "state": "firing",
                "activeAt": "2026-03-27T09:24:01.985001659Z",
                "value": "1e+00"
              }
            ],
            "health": "ok",
            "evaluationTime": 0.000421513,
            "lastEvaluation": "2026-03-27T09:31:31.985795924Z",
            "type": "alerting"
          }
        ],
        "interval": 30,
        "limit": 0,
        "evaluationTime": 0.000454714,
        "lastEvaluation": "2026-03-27T09:31:31.985767323Z"
      }
    ]

The corresponding alert at Alertmanager:

[
  {
    "annotations": {
      "description": "A problem occurred: service kube-prometheus-stack-prometheus, container prometheus, pod prometheus-kube-prometheus-stack-prometheus-0.",
      "summary": "Value: 1"
    },
    "endsAt": "2026-03-27T09:36:31.985Z",
    "fingerprint": "3d1dc1af561a1e7e",
    "receivers": [
      {
        "name": "prometheus-operator/test-config/acs"
      }
    ],
    "startsAt": "2026-03-27T09:25:01.985Z",
    "status": {
      "inhibitedBy": [],
      "mutedBy": [],
      "silencedBy": [],
      "state": "active"
    },
    "updatedAt": "2026-03-27T09:32:31.986Z",
    "generatorURL": "http://example.com/graph?g0.expr=up%7Bendpoint%3D%22http-web%22%2Cnamespace%3D%22prometheus-operator%22%2Cservice%3D%22kube-prometheus-stack-prometheus%22%7D+%3D%3D+1&g0.tab=1",
    "labels": {
      "alertname": "UpAlert",
      "container": "prometheus",
      "endpoint": "http-web",
      "instance": "10.244.5.2:9090",
      "job": "kube-prometheus-stack-prometheus",
      "namespace": "prometheus-operator",
      "pod": "prometheus-kube-prometheus-stack-prometheus-0",
      "prometheus": "prometheus-operator/kube-prometheus-stack-prometheus",
      "service": "kube-prometheus-stack-prometheus",
      "severity": "critical",
      "test": "true"
    }
  }
]
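(That JSON comes from the Alertmanager v2 API, queried through a port-forward to the pod. Only the URL building below is deterministic; the GET obviously needs the live service. The localhost base URL is the port-forward target, not anything from the cluster config.)

```python
# Sketch of how I pull the alert out of the Alertmanager v2 API.
import json
import urllib.parse
import urllib.request


def alerts_url(base: str, filters: list[str]) -> str:
    """Build a GET /api/v2/alerts URL with one filter= param per matcher."""
    query = urllib.parse.urlencode([("filter", f) for f in filters])
    return f"{base}/api/v2/alerts?{query}"


def fetch_alerts(base: str, filters: list[str]) -> list[dict]:
    """Fetch matching alerts from a reachable Alertmanager instance."""
    with urllib.request.urlopen(alerts_url(base, filters)) as resp:
        return json.load(resp)


# Usage (with `kubectl port-forward ... 9093` running):
# fetch_alerts("http://localhost:9093", ['alertname="UpAlert"'])
```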

While the rule is firing, Alertmanager logs the following:

alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:55.839Z level=INFO source=main.go:191 msg="Starting Alertmanager" version="(version=0.31.1, branch=HEAD, revision=7a639ba087d6e877038cabfdaa645230f79001b8)"
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:55.839Z level=INFO source=main.go:194 msg="Build context" build_context="(go=go1.25.7, platform=linux/amd64, user=root@d57bf8cf52ff, date=20260211-21:08:35, tags=netgo)"
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.147Z level=DEBUG source=main.go:405 msg="external url" externalUrl=http://kube-prometheus-stack-alertmanager.prometheus-operator:9093
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.147Z level=INFO source=coordinator.go:111 msg="Loading configuration file" component=configuration file=/etc/alertmanager/config_out/alertmanager.env.yaml
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.232Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity=~\"critical\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="job=~\"kube-prometheus-stack-prometheus\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="namespace=\"prometheus-operator\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="alertname = \"Watchdog\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity =~ warning|info" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = critical" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = info" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = warning" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = info" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="alertname = InfoInhibitor" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="alertname = InfoInhibitor" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.233Z level=INFO source=coordinator.go:124 msg="Completed loading of configuration file" component=configuration file=/etc/alertmanager/config_out/alertmanager.env.yaml
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.333Z level=DEBUG source=dispatch.go:161 msg="preparing to start" component=dispatcher startTime=2026-03-27T09:22:55.839Z
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.334Z level=DEBUG source=dispatch.go:164 msg="setting state" component=dispatcher state=waiting_to_start
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.334Z level=DEBUG source=dispatch.go:214 msg=started component=dispatcher state=running
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.334Z level=DEBUG source=dispatch.go:215 msg="Starting all existing aggregation groups" component=dispatcher
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.334Z level=DEBUG source=main.go:580 msg="route prefix" routePrefix=/
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.335Z level=INFO source=tls_config.go:354 msg="Listening on" address=[::]:9093
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.335Z level=INFO source=tls_config.go:400 msg="TLS is disabled." http2=false address=[::]:9093
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.734Z level=INFO source=coordinator.go:111 msg="Loading configuration file" component=configuration file=/etc/alertmanager/config_out/alertmanager.env.yaml
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.734Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity=~\"critical\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="job=~\"kube-prometheus-stack-prometheus\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="namespace=\"prometheus-operator\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="alertname = \"Watchdog\"" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity =~ warning|info" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = critical" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = info" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = warning" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="severity = info" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="alertname = InfoInhibitor" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=DEBUG source=parse.go:154 msg="Parsing with UTF-8 matchers parser, with fallback to classic matchers parser" input="alertname = InfoInhibitor" origin=config
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.735Z level=INFO source=coordinator.go:124 msg="Completed loading of configuration file" component=configuration file=/etc/alertmanager/config_out/alertmanager.env.yaml
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.736Z level=DEBUG source=dispatch.go:161 msg="preparing to start" component=dispatcher startTime=2026-03-27T09:22:55.839Z
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.737Z level=DEBUG source=dispatch.go:164 msg="setting state" component=dispatcher state=waiting_to_start
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.737Z level=DEBUG source=dispatch.go:214 msg=started component=dispatcher state=running
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:22:56.737Z level=DEBUG source=dispatch.go:215 msg="Starting all existing aggregation groups" component=dispatcher
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:24:31.988Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:25:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:25:01.989Z level=DEBUG source=dispatch.go:712 msg=flushing component=dispatcher aggrGroup="{}/{job=~\"kube-prometheus-stack-prometheus\",namespace=\"prometheus-operator\",severity=~\"critical\"}:{alertname=\"UpAlert\"}" alerts="[UpAlert[3d1dc1a][active] UpAlert[f500670][active]]"
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:26:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:26:31.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:27:31.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:28:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:28:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][resolved]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:29:31.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:29:31.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][resolved]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:31:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][resolved]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:31:01.987Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:32:31.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][resolved]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:32:31.987Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:34:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[f500670][resolved]
alertmanager-kube-prometheus-stack-alertmanager-0 alertmanager time=2026-03-27T09:34:01.986Z level=DEBUG source=dispatch.go:232 msg="Received alert" component=dispatcher alert=UpAlert[3d1dc1a][active]
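To summarize the excerpt, I tallied the msg= field of each line with a quick script (the regex just parses the log format shown above): plenty of "Received alert" entries, a single "flushing", and not one notify-related message.

```python
# Quick tally of msg= values in the Alertmanager log excerpt above,
# to show that alerts are received and flushed but never notified.
import re
from collections import Counter

# msg= values are either bare tokens (msg=flushing) or quoted strings
# (msg="Received alert").
MSG_RE = re.compile(r'msg=(?:"([^"]*)"|(\S+))')


def summarize(log_text: str) -> Counter:
    """Count occurrences of each msg= value in an Alertmanager log."""
    counts: Counter = Counter()
    for line in log_text.splitlines():
        m = MSG_RE.search(line)
        if m:
            counts[m.group(1) or m.group(2)] += 1
    return counts
```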

I would appreciate any hint as to what the problem might be. Cheers!