Panic in otlptrace triggered by integration test #5356

Open
titpetric opened this issue May 15, 2024 · 3 comments
Labels
bug Something isn't working

Comments

titpetric commented May 15, 2024

I'm getting the following panic (in CI, but not on local):

tyk-1             | panic: runtime error: hash of unhashable type [2]string
tyk-1             | 
tyk-1             | goroutine 54 [running]:
tyk-1             | go.opentelemetry.io/otel/exporters/otlp/otlptrace/internal/tracetransform.Spans({0xc00020af08, 0x4, 0xc000e6a080?})
tyk-1             | 	go.opentelemetry.io/otel/exporters/otlp/otlptrace@v1.26.0/internal/tracetransform/span.go:41 +0x2d9
tyk-1             | go.opentelemetry.io/otel/exporters/otlp/otlptrace.(*Exporter).ExportSpans(0xc000304370, {0x404dc18, 0xc0002dc0e0}, {0xc00020af08?, 0xc00008eef2?, 0xc0002936c0?})
tyk-1             | 	go.opentelemetry.io/otel/exporters/otlp/otlptrace@v1.26.0/exporter.go:31 +0x34
tyk-1             | go.opentelemetry.io/otel/sdk/trace.(*batchSpanProcessor).exportSpans(0xc00031c140, {0x404dba8, 0xc00017c6e0})
tyk-1             | 	go.opentelemetry.io/otel/sdk@v1.26.0/trace/batch_span_processor.go:277 +0x238
tyk-1             | go.opentelemetry.io/otel/sdk/trace.(*batchSpanProcessor).processQueue(0xc00031c140)
tyk-1             | 	go.opentelemetry.io/otel/sdk@v1.26.0/trace/batch_span_processor.go:305 +0x36e
tyk-1             | go.opentelemetry.io/otel/sdk/trace.NewBatchSpanProcessor.func1()
tyk-1             | 	go.opentelemetry.io/otel/sdk@v1.26.0/trace/batch_span_processor.go:117 +0x54
tyk-1             | created by go.opentelemetry.io/otel/sdk/trace.NewBatchSpanProcessor in goroutine 1
tyk-1             | 	go.opentelemetry.io/otel/sdk@v1.26.0/trace/batch_span_processor.go:115 +0x2e5
tyk-1 exited with code 2

Looking at the code, this should not be possible. We are using Go 1.22.3 to build the project, and running the same test locally doesn't trigger the panic. I'm still investigating the issue, but if anyone has any ideas, I'm open to advice. This uses otel collector 0.100.0 and the 1.26.0 releases of otel and otel/trace (as seen in the panic output).
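For context on why this panic is surprising: per the Go spec, arrays of comparable element types are themselves comparable, so `[2]string` is a legal map key and hashing one should never panic on a healthy toolchain. A minimal check of the language rule (not taken from the otel code; the key values are placeholders):

```go
package main

import "fmt"

func main() {
	// [2]string is a comparable type, so it is a valid map key;
	// a correct toolchain never raises
	// "hash of unhashable type [2]string" here.
	m := map[[2]string]int{}
	m[[2]string{"scope", "version"}] = 1
	fmt.Println(len(m))
}
```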

titpetric added the bug label on May 15, 2024
@Cirilla-zmh

@titpetric Could you describe how one might reproduce the issue?

@titpetric (Author)

Currently we have two PRs that are trying to replicate this, but we are not able to reproduce it locally (still working on using the actual image in the CI test due to access control). It's failing in GitHub Actions right now, and it seems to be caused by the Go upgrade (the same test suite passes on 1.21.x). I wish I had more info, other than that the trace from the panic seems fully invalid: the code in question is using a struct with two fields, and I have no idea where [2]string may be coming from.

https://github.com/TykTechnologies/tyk/actions/runs/9098478917/job/25009198049?pr=6269

I'll post any updates.

@titpetric (Author)

Replicated the panic locally with the ECR image; continuing the investigation.

The second PR, which doesn't update otel, produces a similar panic, but with v1.18.0.

panic: runtime error: hash of unhashable type [2]string

goroutine 48 [running]:
go.opentelemetry.io/otel/exporters/otlp/otlptrace/internal/tracetransform.Spans({0xc0004ea508, 0x4, 0xc000f06540?})
	go.opentelemetry.io/otel/exporters/otlp/otlptrace@v1.18.0/internal/tracetransform/span.go:52 +0x2d9
go.opentelemetry.io/otel/exporters/otlp/otlptrace.(*Exporter).ExportSpans(0xc0002ea4b0, {0x4022b18, 0xc00030c2a0}, {0xc0004ea508?, 0xc000f15ef2?, 0xc00029e1c0?})
	go.opentelemetry.io/otel/exporters/otlp/otlptrace@v1.18.0/exporter.go:44 +0x34
go.opentelemetry.io/otel/sdk/trace.(*batchSpanProcessor).exportSpans(0xc0005be0a0, {0x4022aa8, 0xc0002eab40})
	go.opentelemetry.io/otel/sdk@v1.18.0/trace/batch_span_processor.go:288 +0x238
go.opentelemetry.io/otel/sdk/trace.(*batchSpanProcessor).processQueue(0xc0005be0a0)
	go.opentelemetry.io/otel/sdk@v1.18.0/trace/batch_span_processor.go:316 +0x38d
go.opentelemetry.io/otel/sdk/trace.NewBatchSpanProcessor.func1()
	go.opentelemetry.io/otel/sdk@v1.18.0/trace/batch_span_processor.go:128 +0x54
created by go.opentelemetry.io/otel/sdk/trace.NewBatchSpanProcessor in goroutine 1
	go.opentelemetry.io/otel/sdk@v1.18.0/trace/batch_span_processor.go:126 +0x2e5

Things are pointing to some particular build issue with Go 1.22.3 in the CI environment. Building from source doesn't exhibit the issue even with Go 1.22.3, so it's likely related to our CI cross-build environment, which is different (built with goreleaser, -X ldflags, build tags, -trimpath, ...).
