Alertmanager crashes when using 3rd party plugin #2651

Closed
gamer22026 opened this issue Aug 4, 2022 · 2 comments · Fixed by #3097

Labels
bug (Something isn't working), component/alertmanager

Comments

gamer22026 commented Aug 4, 2022

Describe the bug

We've been using https://grafana.com/grafana/plugins/camptocamp-prometheus-alertmanager-datasource/ for a few operational dashboards. It works fine with the regular Prometheus Alertmanager, but when it is pointed at the Mimir Alertmanager, setting up a dashboard that uses this datasource causes the Alertmanager pods to crash.

To Reproduce

Steps to reproduce the behavior:

  1. Add https://grafana.com/grafana/plugins/camptocamp-prometheus-alertmanager-datasource/ datasource to Grafana
  2. Point it to Mimir alertmanager endpoint
  3. Try to configure a dashboard using the datasource with some label matchers
  4. Observe that the alertmanager pods crash (a direct reproduction against the API, without Grafana, is sketched below)
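
The same kind of request can also be sent without Grafana. The Go sketch below is illustrative only: the host, port, and /alertmanager prefix are assumptions to adjust for your deployment; the tenant dc1 and the partial matcher severity= are taken from the logs in the Additional Context section.

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	// Hypothetical host, port, and HTTP prefix; adjust to your Mimir deployment.
	// "filter=severity%3D" is the URL-encoded partial matcher "severity=".
	url := "http://mimir-alertmanager:8080/alertmanager/api/v2/alerts?filter=severity%3D"

	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		panic(err)
	}
	// Tenant ID taken from the logs below (user=dc1).
	req.Header.Set("X-Scope-OrgID", "dc1")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		// On the affected version the Alertmanager pod panics while parsing
		// the partial matcher, so the request does not complete normally.
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```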

Expected behavior

It should work as it does with the Prometheus Alertmanager.

Environment

  • Kubernetes: 1.22.12
  • Helm
  • Mimir: 2.2.0
  • Grafana: 9.0.6

Additional Context

level=error ts=2022-08-03T23:02:00.270408794Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: s"
level=error ts=2022-08-03T23:02:00.518144885Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: se"
level=error ts=2022-08-03T23:02:00.836637511Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: sev"
level=error ts=2022-08-03T23:02:01.237292966Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: seve"
level=error ts=2022-08-03T23:02:01.346848828Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: sever"
level=error ts=2022-08-03T23:02:04.244323073Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: severi"
level=error ts=2022-08-03T23:02:04.533766987Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: severit"
level=error ts=2022-08-03T23:02:04.80687485Z caller=api.go:228 component=MultiTenantAlertmanager user=dc1 component=api version=v2 path=/api/v2/alerts method=GET msg="Failed to parse matchers" err="bad matcher format: severity"
panic: runtime error: index out of range [0] with length 0

goroutine 6738 [running]:
github.com/prometheus/alertmanager/pkg/labels.ParseMatcher({0xc0009ac100, 0x9})
	/__w/mimir/mimir/vendor/github.com/prometheus/alertmanager/pkg/labels/parse.go:130 +0x692
github.com/prometheus/alertmanager/api/v2.parseFilter({0xc00079a3e0, 0x1, 0xc0014a2ca8})
	/__w/mimir/mimir/vendor/github.com/prometheus/alertmanager/api/v2/api.go:666 +0xa7
github.com/prometheus/alertmanager/api/v2.(*API).getAlertsHandler(0xc00076f3f0, {0xc0016ea400, 0xc0009ac109, {0xc00079a3e0, 0x1, 0x1}, 0xc0009ac10a, 0x0, 0xc0009ac10b, 0xc0009ac0cf})
	/__w/mimir/mimir/vendor/github.com/prometheus/alertmanager/api/v2/api.go:226 +0x1ad
github.com/prometheus/alertmanager/api/v2/restapi/operations/alert.GetAlertsHandlerFunc.Handle(0x23dea48, {0xc0016ea400, 0xc0009ac109, {0xc00079a3e0, 0x1, 0x1}, 0xc0009ac10a, 0x0, 0xc0009ac10b, 0xc0009ac0cf})
	/__w/mimir/mimir/vendor/github.com/prometheus/alertmanager/api/v2/restapi/operations/alert/get_alerts.go:33 +0x59
github.com/prometheus/alertmanager/api/v2/restapi/operations/alert.(*GetAlerts).ServeHTTP(0xc0012e15a8, {0x23bf540, 0xc00171c080}, 0xc0016ea400)
	/__w/mimir/mimir/vendor/github.com/prometheus/alertmanager/api/v2/restapi/operations/alert/get_alerts.go:68 +0x25e
github.com/go-openapi/runtime/middleware.NewOperationExecutor.func1({0x23bf540, 0xc00171c080}, 0xc0016ea400)
	/__w/mimir/mimir/vendor/github.com/go-openapi/runtime/middleware/operation.go:28 +0x59
net/http.HandlerFunc.ServeHTTP(0x40d234, {0x23bf540, 0xc00171c080}, 0x0)
	/usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/go-openapi/runtime/middleware.NewRouter.func1({0x23bf540, 0xc00171c080}, 0xc0016ea200)
	/__w/mimir/mimir/vendor/github.com/go-openapi/runtime/middleware/router.go:78 +0x25e
net/http.HandlerFunc.ServeHTTP(0x0, {0x23bf540, 0xc00171c080}, 0x0)
	/usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/go-openapi/runtime/middleware.Spec.func1({0x23bf540, 0xc00171c080}, 0xc001985268)
	/__w/mimir/mimir/vendor/github.com/go-openapi/runtime/middleware/spec.go:46 +0x18c
net/http.HandlerFunc.ServeHTTP(0xc0000fa9c0, {0x23bf540, 0xc00171c080}, 0xc0016ea200)
	/usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/rs/cors.(*Cors).Handler.func1({0x23bf540, 0xc00171c080}, 0xc0016ea200)
	/__w/mimir/mimir/vendor/github.com/rs/cors/cors.go:231 +0x1c4
net/http.HandlerFunc.ServeHTTP(0xc0014a31c0, {0x23bf540, 0xc00171c080}, 0xc000eac430)
	/usr/local/go/src/net/http/server.go:2047 +0x2f
net/http.StripPrefix.func1({0x23bf540, 0xc00171c080}, 0xc0016ea100)
	/usr/local/go/src/net/http/server.go:2090 +0x330
net/http.HandlerFunc.ServeHTTP(0xc0013a9f00, {0x23bf540, 0xc00171c080}, 0x403eec)
	/usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/prometheus/alertmanager/api.(*API).limitHandler.func1({0x23bf540, 0xc00171c080}, 0xc0016ea100)
	/__w/mimir/mimir/vendor/github.com/prometheus/alertmanager/api/api.go:222 +0x1eb
net/http.HandlerFunc.ServeHTTP(0xc0014a32e8, {0x23bf540, 0xc00171c080}, 0x1a70800)
	/usr/local/go/src/net/http/server.go:2047 +0x2f
net/http.(*ServeMux).ServeHTTP(0x1b11ae0, {0x23bf540, 0xc00171c080}, 0xc0016ea100)
	/usr/local/go/src/net/http/server.go:2425 +0x149
github.com/grafana/mimir/pkg/alertmanager.(*MultitenantAlertmanager).serveRequest(0xc00028a500, {0x23bf540, 0xc00171c080}, 0xc0016ea100)
	/__w/mimir/mimir/pkg/alertmanager/multitenant.go:804 +0x705
github.com/grafana/mimir/pkg/alertmanager.(*handlerForGRPCServer).ServeHTTP(0xc0010d5f00, {0x23bf540, 0xc00171c080}, 0x8)
	/__w/mimir/mimir/pkg/alertmanager/multitenant.go:384 +0x26
github.com/weaveworks/common/httpgrpc/server.Server.Handle({{0x239bca0, 0xc000386588}}, {0x23dea48, 0xc0016ee0f0}, 0xc001bb39a0)
	/__w/mimir/mimir/vendor/github.com/weaveworks/common/httpgrpc/server/server.go:61 +0x41f
github.com/grafana/mimir/pkg/alertmanager.(*MultitenantAlertmanager).HandleRequest(0xc0014a3558, {0x23dea48, 0xc0016ee0f0}, 0x1d45ce0)
	/__w/mimir/mimir/pkg/alertmanager/multitenant.go:789 +0x3a
github.com/grafana/mimir/pkg/alertmanager/alertmanagerpb._Alertmanager_HandleRequest_Handler.func1({0x23dea48, 0xc0016ee0f0}, {0x1d8e9e0, 0xc001bb39a0})
	/__w/mimir/mimir/pkg/alertmanager/alertmanagerpb/alertmanager.pb.go:472 +0x78
github.com/grafana/mimir/pkg/mimir.ThanosTracerUnaryInterceptor({0x23dea48, 0xc0016ee0c0}, {0x1d8e9e0, 0xc001bb39a0}, 0xc00079a200, 0xc0009bf860)
	/__w/mimir/mimir/pkg/mimir/tracing.go:19 +0x7a
github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1({0x23dea48, 0xc0016ee0c0}, {0x1d8e9e0, 0xc001bb39a0})
	/__w/mimir/mimir/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:25 +0x3a
github.com/weaveworks/common/middleware.ServerUserHeaderInterceptor({0x23dea48, 0xc0016ee060}, {0x1d8e9e0, 0xc001bb39a0}, 0x40d234, 0xc0010ac000)
	/__w/mimir/mimir/vendor/github.com/weaveworks/common/middleware/grpc_auth.go:38 +0x65
github.com/grafana/mimir/pkg/util/noauth.SetupAuthMiddleware.func1({0x23dea48, 0xc0016ee060}, {0x1d8e9e0, 0xc001bb39a0}, 0xc000021fe0, 0xc0010ac000)
	/__w/mimir/mimir/pkg/util/noauth/no_auth.go:32 +0xa7
github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1({0x23dea48, 0xc0016ee060}, {0x1d8e9e0, 0xc001bb39a0})
	/__w/mimir/mimir/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:25 +0x3a
github.com/weaveworks/common/middleware.UnaryServerInstrumentInterceptor.func1({0x23dea48, 0xc0016ee060}, {0x1d8e9e0, 0xc001bb39a0}, 0xc000021fe0, 0xc0010ac200)
	/__w/mimir/mimir/vendor/github.com/weaveworks/common/middleware/grpc_instrumentation.go:33 +0xa2
github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1({0x23dea48, 0xc0016ee060}, {0x1d8e9e0, 0xc001bb39a0})
	/__w/mimir/mimir/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:25 +0x3a
github.com/opentracing-contrib/go-grpc.OpenTracingServerInterceptor.func1({0x23dea48, 0xc00191e570}, {0x1d8e9e0, 0xc001bb39a0}, 0xc000021fe0, 0xc0010ac240)
	/__w/mimir/mimir/vendor/github.com/opentracing-contrib/go-grpc/server.go:57 +0x40f
github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1({0x23dea48, 0xc00191e570}, {0x1d8e9e0, 0xc001bb39a0})
	/__w/mimir/mimir/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:25 +0x3a
github.com/weaveworks/common/middleware.GRPCServerLog.UnaryServerInterceptor({{0x24156d8, 0xc00079aef0}, 0x20}, {0x23dea48, 0xc00191e570}, {0x1d8e9e0, 0xc001bb39a0}, 0xc000021fe0, 0xc0010ac440)
	/__w/mimir/mimir/vendor/github.com/weaveworks/common/middleware/grpc_logging.go:29 +0xbe
github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1({0x23dea48, 0xc00191e570}, {0x1d8e9e0, 0xc001bb39a0})
	/__w/mimir/mimir/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:25 +0x3a
github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1({0x23dea48, 0xc00191e570}, {0x1d8e9e0, 0xc001bb39a0}, 0xc001592bd0, 0x1b2eec0)
	/__w/mimir/mimir/vendor/github.com/grpc-ecosystem/go-grpc-middleware/chain.go:34 +0xbf
github.com/grafana/mimir/pkg/alertmanager/alertmanagerpb._Alertmanager_HandleRequest_Handler({0x1dc9120, 0xc00028a500}, {0x23dea48, 0xc00191e570}, 0xc0012d6d80, 0xc00096d920)
	/__w/mimir/mimir/pkg/alertmanager/alertmanagerpb/alertmanager.pb.go:474 +0x138
google.golang.org/grpc.(*Server).processUnaryRPC(0xc0002c7180, {0x240bad0, 0xc00077d6c0}, 0xc000c178c0, 0xc000c68000, 0x333a440, 0x0)
	/__w/mimir/mimir/vendor/google.golang.org/grpc/server.go:1282 +0xccf
google.golang.org/grpc.(*Server).handleStream(0xc0002c7180, {0x240bad0, 0xc00077d6c0}, 0xc000c178c0, 0x0)
	/__w/mimir/mimir/vendor/google.golang.org/grpc/server.go:1619 +0xa2a
google.golang.org/grpc.(*Server).serveStreams.func1.2()
	/__w/mimir/mimir/vendor/google.golang.org/grpc/server.go:921 +0x98
created by google.golang.org/grpc.(*Server).serveStreams.func1
	/__w/mimir/mimir/vendor/google.golang.org/grpc/server.go:919 +0x294

LeviHarrison (Contributor) commented Aug 4, 2022

I can reproduce this with both the Mimir and Prometheus Alertmanagers (although the panic is recovered in the latter), so I believe this is an issue on the Prometheus side. What triggers it is an =, !=, =~, or !~ following any letter in the "Filters" field, but not followed by another letter.
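
For illustration, here is a minimal Go sketch of the failing code path, based on the stack trace above (the behaviour on a fixed version is an assumption): on the alertmanager code vendored by Mimir 2.2.0, parsing a matcher that ends right after the operator panics instead of returning an error.

```go
package main

import (
	"fmt"

	"github.com/prometheus/alertmanager/pkg/labels"
)

func main() {
	// "severity=" is what the datasource sends while a filter is being typed:
	// a label name and an operator, but no value yet. On the alertmanager
	// version vendored by Mimir 2.2.0 this call panics with
	// "index out of range [0] with length 0" (see the stack trace above);
	// on a version containing the fix it should return without panicking.
	m, err := labels.ParseMatcher("severity=")
	fmt.Println(m, err)
}
```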

LeviHarrison added the bug and component/alertmanager labels Aug 4, 2022

LeviHarrison (Contributor) commented:

A fix is already underway: prometheus/alertmanager#2968

gotjosh added a commit that referenced this issue Sep 30, 2022
Alertmanager: Update the prometheus/alertmanager to its latest version (#3097)

Fixes #2651
Fixes #2806
A revert of #2924