
Data race in otelgrpc.StatsHandler #4576

Closed
rosstimothy opened this issue Nov 13, 2023 · 3 comments
Labels
area: instrumentation (Related to an instrumentation package), bug (Something isn't working), instrumentation: otelgrpc

Comments


rosstimothy commented Nov 13, 2023

Description

WARNING: DATA RACE
Read at 0x00c007ab7e30 by goroutine 58927:
  go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.(*config).handleRPC()
      /go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.46.0/stats_handler.go:198 +0x176a
  go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.(*serverHandler).HandleRPC()
      /go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.46.0/stats_handler.go:88 +0x5d
  google.golang.org/grpc.(*Server).processStreamingRPC.func1()
      /go/pkg/mod/google.golang.org/grpc@v1.59.0/server.go:1546 +0x8f9
  runtime.deferreturn()
      /opt/go/src/runtime/panic.go:477 +0x30
  google.golang.org/grpc.(*Server).handleStream()
      /go/pkg/mod/google.golang.org/grpc@v1.59.0/server.go:1741 +0x1292
  google.golang.org/grpc.(*Server).serveStreams.func1.1()
      /go/pkg/mod/google.golang.org/grpc@v1.59.0/server.go:986 +0x11d

Previous write at 0x00c007ab7e30 by goroutine 58935:
  sync/atomic.AddInt64()
      /opt/go/src/runtime/race_amd64.s:289 +0xb
  sync/atomic.AddInt64()
      <autogenerated>:1 +0x15
  go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.(*serverHandler).HandleRPC()
      /go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.46.0/stats_handler.go:88 +0x5d
  google.golang.org/grpc.(*serverStream).RecvMsg()
      /go/pkg/mod/google.golang.org/grpc@v1.59.0/stream.go:1733 +0x5e2
  github.com/gravitational/teleport/api/utils/grpc/interceptors.(*grpcServerStreamWrapper).RecvMsg()
      /__w/teleport/teleport/api/utils/grpc/interceptors/errors.go:39 +0x55

The problem seems to stem from inconsistent access to the fields of gRPCContext: writes to the counters are performed with atomic.AddInt64, but reads use plain (non-atomic) loads instead of atomic.LoadInt64, so concurrent HandleRPC calls race on the same memory.

Environment

  • OS: all
  • Architecture: all
  • Go Version: 1.21.4
  • otelgrpc version: v0.46.0

Expected behavior

Converting from the now-deprecated interceptors to the new stats handler should produce the same traces, without any data races.

@rosstimothy added the area: instrumentation, bug, and instrumentation: otelgrpc labels on Nov 13, 2023

slonka commented Nov 14, 2023

We're seeing panics that might be caused by the same root problem.

@strideynet

closed by #4577 ?

@MrAlias (Contributor)

MrAlias commented Apr 9, 2024

closed by #4577 ?

Assuming yes, based on lack of response.

@MrAlias MrAlias closed this as completed Apr 9, 2024