debug_traceBlockByHash returns method handler crashed for a certain block #2235

Closed
tibineacsu95 opened this issue Oct 30, 2023 · 7 comments
Labels: bug (Something isn't working)


tibineacsu95 commented Oct 30, 2023

Hello,

One of our customers is trying to perform the following query:

curl -s -X POST -H "Content-Type: application/json" --data '{"method":"debug_traceBlockByHash","params":["0xaf4beb8295889cbbe4760f4114730e46126e6b719f370e8387723c770eca0c85", {"tracer": "prestateTracer", "tracerConfig": {"onlyTopCall":false,"diffMode":true}}],"id":1,"jsonrpc":"2.0"}' localhost:9650/ext/bc/C/rpc

And the response is:

{"jsonrpc":"2.0","id":1,"error":{"code":-32603,"message":"method handler crashed"}}

We have tried tracing the next block in history by its hash:

  • ran eth_getBlockByHash on 0xaf4beb8295889cbbe4760f4114730e46126e6b719f370e8387723c770eca0c85 and got 0x877114 as the block number (hex) of the current block, so we took 0x877115 as the following block
  • ran eth_getBlockByNumber on 0x877115 and got the following block hash: 0xecbb1db71a133c2959ff56018fc611b0ffa3daa71ebf0e89c98172652a35d117
  • running debug_traceBlockByHash on that hash returns the expected result

Conclusion: it seems that some blocks fail when traced by hash. The above is just one example; another block number with the same behaviour is 8876292.
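For reference, the steps above can be scripted roughly as follows. This is only a sketch: it assumes jq is installed and that a node is listening on localhost:9650.

RPC=localhost:9650/ext/bc/C/rpc
HASH=0xaf4beb8295889cbbe4760f4114730e46126e6b719f370e8387723c770eca0c85

# 1. Resolve the block number behind the problematic hash.
NUM=$(curl -s -X POST -H "Content-Type: application/json" \
  --data "{\"method\":\"eth_getBlockByHash\",\"params\":[\"$HASH\",false],\"id\":1,\"jsonrpc\":\"2.0\"}" \
  $RPC | jq -r '.result.number')

# 2. Take the next height and fetch its hash as computed by the node.
NEXT=$(printf '0x%x' $((NUM + 1)))
NEXT_HASH=$(curl -s -X POST -H "Content-Type: application/json" \
  --data "{\"method\":\"eth_getBlockByNumber\",\"params\":[\"$NEXT\",false],\"id\":1,\"jsonrpc\":\"2.0\"}" \
  $RPC | jq -r '.result.hash')

# 3. Tracing the next block by its node-computed hash works, while the original hash crashes.
curl -s -X POST -H "Content-Type: application/json" \
  --data "{\"method\":\"debug_traceBlockByHash\",\"params\":[\"$NEXT_HASH\",{\"tracer\":\"prestateTracer\",\"tracerConfig\":{\"onlyTopCall\":false,\"diffMode\":true}}],\"id\":1,\"jsonrpc\":\"2.0\"}" \
  $RPC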

Avalanchego version:
avalanche/1.10.13 [database=v1.4.5, rpcchainvm=29, commit=2d8222da82114088a2a99d8d69f40846f1ea6208, go=1.20.10]
We've also tried updating to v1.10.14 but we faced the same behaviour.

Could anyone shed some light on the above, whenever you get the time, please?

Thanks a lot!

P.S.: Here is a similar issue we opened in the past, linked for traceability: #1689, which led to ava-labs/coreth#282 and was ultimately fixed.

tibineacsu95 added the bug label on Oct 30, 2023
@aaronbuchwald (Collaborator)

Thanks for reporting @tibineacsu95. I tried reproducing this and get "block not found". Checking against Snowtrace and AvaScan, I don't see any of those block hashes showing up, and I see different block hashes at those heights on both Fuji and Mainnet.

The block formats for Coreth and Subnet-EVM are slightly different from go-ethereum's, so if the library they are using re-calculates the block hash, it may do so incorrectly and end up querying for a block hash that doesn't exist.
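A quick way to check for that kind of mismatch (a sketch, assuming jq and a node on localhost:9650) is to ask the node for its canonical hash at the height in question and compare it with the hash the client computed:

curl -s -X POST -H "Content-Type: application/json" \
  --data '{"method":"eth_getBlockByNumber","params":["0x877114",false],"id":1,"jsonrpc":"2.0"}' \
  localhost:9650/ext/bc/C/rpc | jq -r '.result.hash'

If the hash returned here differs from the one the client library computed, the library is most likely re-hashing the Coreth header (which carries extra fields such as blockExtraData, extDataHash, and blockGasCost) using go-ethereum's rules.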

For the block hash that you got by using eth_getBlockByNumber (this will be a hash calculated by the node), I do see this block on Snowtrace as expected.

That being said, running the first query I get:

{"jsonrpc":"2.0","id":1,"error":{"code":-32000,"message":"block 0xaf4beb8295889cbbe4760f4114730e46126e6b719f370e8387723c770eca0c85 not found"}}

Could you share logs from the node you sent that request to? It should contain a stacktrace from the method handler crash.

@aaronbuchwald (Collaborator)

This could be caused by ethereum/go-ethereum#27691 which has not been migrated from upstream into coreth yet, but I'd need to see the stacktrace / be able to reproduce to confirm it's the same issue.

@tibineacsu95 (Author)

Hi @aaronbuchwald!

First of all, let me attach the call used to extract the block and its response:

curl localhost:9650/ext/bc/C/rpc \
  -X POST \
  -H "Content-Type: application/json" \
  --data '{ "method":"eth_getBlockByHash", "params":["0xaf4beb8295889cbbe4760f4114730e46126e6b719f370e8387723c770eca0c85",false], "id":1, "jsonrpc":"2.0" }'

{"jsonrpc":"2.0","id":1,"result":{"baseFeePerGas":"0x5d21dba00","blockExtraData":"0x","blockGasCost":"0x0","difficulty":"0x1","extDataGasUsed":"0x0","extDataHash":"0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421","extraData":"0x00000000000000000000000000037afd0000000000000000000000000000d79e000000000000000000000000000f6aa70000000000000000000000000005cd7e00000000000000000000000000000000","gasLimit":"0x7a1200","gasUsed":"0x2eb307","hash":"0xaf4beb8295889cbbe4760f4114730e46126e6b719f370e8387723c770eca0c85","logsBloom":"0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000002000800000000008800000000000000000000000000000000000000000000000000000000000000000000000000000000000000000008000000040000000000000000000000000000200000000000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000040000200800000000000000000000000000000000000000000000002000000000","miner":"0x0100000000000000000000000000000000000000","mixHash":"0x0000000000000000000000000000000000000000000000000000000000000000","nonce":"0x0000000000000000","number":"0x877114","parentHash":"0x78fb4ec98df2e4ed221cfc25a954c62c0839eac40b4398c4629e5fe614299882","receiptsRoot":"0x03d51af21e599214f153ed7bddc539a86eca3ac5a1713707332c19ee1e6d521e","sha3Uncles":"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347","size":"0x2e2f","stateRoot":"0xadb192b04d856dfb987a1d94823b0f1e01e0ab70d8b22caaeec2132e4f8c24be","timestamp":"0x62641303","totalDifficulty":"0x877114","transactions":["0x235c2e6293a7b79ec35bf36b1e02c35568c9358d2a97804c2b6c790231b46586","0x72ced8deb54e2df66ab12fbdca5d10ba7be919c560a3ea0623d15b631304957c","0x03f5db020871d0837293ae040b40feb9b42adf1d1aaebbc90d68eae508edf77f"],"transactionsRoot":"0xa96bcfef712c5709b6b0397a2b146f85d190054d9f5e43df592f7688f128cb00","uncles":[]}}

The resulting number corresponds to the block here: https://testnet.snowtrace.io/block/8876308
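(The hex-to-decimal conversion can be checked with a shell one-liner, for example:)

$ printf '%d\n' 0x877114
8876308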

Another behaviour we've observed is that the following call works (tracing by block number instead of block hash):

curl localhost:9650/ext/bc/C/rpc \
  -X POST \
  -H "Content-Type: application/json" \
  --data '{"method":"debug_traceBlockByNumber","params":["0x877114", {"tracer": "callTracer"}],"id":1,"jsonrpc":"2.0"}'

Will not attach the output as it is quite long.

On another note, I have tried removing one of our Testnet nodes from load balancing in order to isolate the issue, and extracted a stacktrace logged when sending queries using debug_traceBlockByHash. Please keep in mind that I'm not 100% sure these messages relate to the same block we're having issues with, since we also have some session stickiness in place.
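(For anyone looking for the same thing, a command along these lines can pull the crash out of journald; it assumes avalanchego runs as a systemd unit named avalanchego, so adjust the unit name to your setup.)

journalctl -u avalanchego | grep -A 60 "RPC method debug_traceBlockByHash crashed"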

Here are the lines:
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: ERROR[10-31|08:50:14.120] <C Chain> github.com/ava-labs/coreth/rpc/service.go:210: RPC method debug_traceBlockByHash crashed: runtime error: slice bounds out of range [:610] with capacity 480
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: goroutine 174793064 [running]:
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*callback).call.func1()
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/service.go:209 +0x89
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: panic({0x1b5f5c0, 0xc0205e67c8})
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /opt/hostedtoolcache/go/1.20.10/x64/src/runtime/panic.go:884 +0x213
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/core/vm.(*Memory).GetCopy(...)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/core/vm/memory.go:86
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/eth/tracers/native.(*prestateTracer).CaptureState(0xc026d8d5c0, 0x632c3f?, 0xf5, 0x1?, 0x6?, 0xc0276510f8, {0x1?, 0x41517a?, 0xd3c8?}, 0x1, ...)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/eth/tracers/native/prestate.go:178 +0x770
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/core/vm.(*EVMInterpreter).Run(0xc026d8d620, 0xc020d1c300, {0xc04463a0f0, 0x24, 0x24}, 0x0)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/core/vm/interpreter.go:254 +0x9ba
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/core/vm.(*EVM).Call(0xc028dfe240, {0x1fce140, 0xc0205e6750}, {0x3, 0xb0, 0xa2, 0x5f, 0x44, 0xa5, 0x70, ...}, ...)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/core/vm/evm.go:308 +0xcc8
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/core.(*StateTransition).TransitionDb(0xc00fb5ada0)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/core/state_transition.go:390 +0x8ea
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/core.ApplyMessage(0x1fdc5d0?, 0xc0234579f0?, 0x1de0958?)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/core/state_transition.go:187 +0x5d
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/eth/tracers.(*baseAPI).traceTx(0xc018226460, {0x1fdc5d0, 0xc0234579f0}, 0xc0266bd8c0, 0xc023457f40, {0x1de0948, 0x1de0940, 0x1de0958, 0x1de0950, 0xc012c7fd10, ...}, ...)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/eth/tracers/api.go:1012 +0x6c5
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/eth/tracers.(*baseAPI).traceBlock(0xc018226460, {0x1fdc5d0?, 0xc0234579f0?}, 0xc04273aaf0, 0xc015f20f80)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/eth/tracers/api.go:662 +0xe6d
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/eth/tracers.(*API).TraceBlockByHash(0xc018226460, {0x1fdc5d0, 0xc0234579f0}, {0x2, 0x95, 0xbe, 0x7f, 0xac, 0xbe, 0xc1, ...}, ...)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/eth/tracers/api.go:480 +0x85
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: reflect.Value.call({0xc0182256c0?, 0xc018133148?, 0x7fc89fe3cf38?}, {0x1c5cb98, 0x4}, {0xc026d8d140, 0x4, 0x0?})
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /opt/hostedtoolcache/go/1.20.10/x64/src/reflect/value.go:586 +0xb07
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: reflect.Value.Call({0xc0182256c0?, 0xc018133148?, 0x16?}, {0xc026d8d140?, 0x2?, 0x2?})
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /opt/hostedtoolcache/go/1.20.10/x64/src/reflect/value.go:370 +0xbc
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*callback).call(0xc0182793e0, {0x1fdc5d0?, 0xc0234579f0}, {0xc0205e6318, 0x16}, {0xc012c7f9b0, 0x2, 0x4e9d37?})
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/service.go:215 +0x3c5
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*handler).runMethod(0xc01601c0a0?, {0x1fdc5d0?, 0xc0234579f0?}, 0xc0402c6850, 0x2?, {0xc012c7f9b0?, 0x1adce80?, 0xc009a8fd28?})
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/handler.go:613 +0x45
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*handler).handleCall(0xc0251ca820, 0xc026d8d080, 0xc0402c6850)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/handler.go:557 +0x239
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*handler).handleCallMsg(0xc0251ca820, 0xc026d8d080, 0xc0402c6850)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/handler.go:518 +0x3ad
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*handler).handleMsg.func1(0xc026d8d080)
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/handler.go:285 +0x1a5
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: github.com/ava-labs/coreth/rpc.(*handler).startCallProc.func1()
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/handler.go:430 +0x1ba
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: created by github.com/ava-labs/coreth/rpc.(*handler).startCallProc
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: /home/runner/go/pkg/mod/github.com/ava-labs/coreth@v0.12.6-rc.2/rpc/handler.go:434 +0xa5
Oct 31 08:50:14 avalanche-fuji avalanchego[817966]: INFO [10-31|08:50:14.120] <C Chain> github.com/ava-labs/coreth/rpc/handler.go:526: Served debug_traceBlockByHash conn=162.19.232.108:4731 reqid=13520 execTime=1.416693ms procTime=1.424487ms totalTime=1.425081ms err="method handler crashed"

Please let me know if this is helpful in any way. Thank you!

@aaronbuchwald (Collaborator)

Thanks for posting the stacktrace. This is very helpful and looks like the same issue I mentioned, which has already been fixed upstream. Creating an issue on Coreth to track this: ava-labs/coreth#369.

@aaronbuchwald (Collaborator)

Btw, it looks like the reason that tracing by block number does not trigger the same error is that the API call you pasted there runs the callTracer instead of the prestateTracer. I was running on Mainnet instead of Fuji before; I'm now able to reproduce on Fuji and get the same result tracing by block hash and by number when using the prestateTracer.
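For completeness, the by-number call that reproduces the same crash is the same request keyed by number instead of hash, e.g.:

curl -s -X POST -H "Content-Type: application/json" --data '{"method":"debug_traceBlockByNumber","params":["0x877114", {"tracer": "prestateTracer", "tracerConfig": {"onlyTopCall":false,"diffMode":true}}],"id":1,"jsonrpc":"2.0"}' localhost:9650/ext/bc/C/rpc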

I'll put up a PR to migrate the geth change from upstream to coreth and this should be included in the next release.

@tibineacsu95 (Author)

Thanks a lot for your help and time spent in troubleshooting! Really appreciate it. We'll wait for the new release of avalanchego and update as soon as possible. Have a nice rest of the week!

@tibineacsu95 (Author)

tibineacsu95 commented Nov 2, 2023

Quick update: after updating to avalanche/1.10.15 everything is back to normal. We can trace the problematic block now. Thank you again for the help! I will also close the issue and reopen it in case we find any other discrepancies, but I think we're good for now.
