
Please enable better error handling or auto handling for the case of normal message callback followed by "RST_STREAM with code 0" error status callback #2004

Closed
baechul-k opened this issue Jan 3, 2022 · 18 comments

Comments

@baechul-k

baechul-k commented Jan 3, 2022

Is your feature request related to a problem? Please describe.

I wrote a simple unary RPC client for an internal RPC server that I don't control. It failed with "Error: 13 INTERNAL: Received RST_STREAM with code 0". I have identified the root cause. The server sent the expected message, which was received in the onReceiveMessage callback of the grpc-js client.ts. However, the grpc-js client then somehow received the unexpected error "RST_STREAM with code 0" instead of an OK status. It appears the server, or one of the layers on the server side, unexpectedly closed the stream after the correct response message was sent to the grpc-js client. The responseMessage variable still holds the earlier received message by the time the onReceiveStatus callback is called with the error status in the grpc-js layer.

From my client code, there appears to be no way to access the received message when the follow-up status callback has received a status other than status.OK. (See the current implementation, v1.4.5, of onReceiveStatus in makeUnaryRequest and makeClientStreamRequest in @grpc/grpc-js/src/client.ts.)

Describe the solution you'd like

When an error happens, could the callback be invoked with not just the error but also the response message?

FROM:

if (status.code === constants_1.Status.OK) {
    callProperties.callback(null, responseMessage);
}
else {
    callProperties.callback(call_1.callErrorFromStatus(status));
}

TO:

if (status.code === constants_1.Status.OK) {
    callProperties.callback(null, responseMessage);
}
else {
    callProperties.callback(call_1.callErrorFromStatus(status), responseMessage);
}

This would enable users to write better error handling:

// from my client promise callback
...
(err, resp) => {
    if (err) {
        if (err.code === 13 && err.message.includes('RST_STREAM with code 0') && resp) {
            resolve(resp)
        } else {
            reject(err)
        }
    } else {
        resolve(resp)
    }
}   
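Under the proposed change, a full promisified wrapper could look like the sketch below. This is illustrative only: it assumes the two-argument error callback proposed above, the names `isBenignRstStream` and `callUnary` are hypothetical, and the status code is inlined (grpc.status.INTERNAL === 13) so the snippet has no dependencies:

```javascript
const INTERNAL = 13; // grpc.status.INTERNAL, inlined to keep this self-contained

// True only for the case in this issue: an INTERNAL "Received RST_STREAM
// with code 0" status arriving after a response message was received.
function isBenignRstStream(err, response) {
  return Boolean(
    err &&
      err.code === INTERNAL &&
      String(err.details || err.message).includes('Received RST_STREAM with code 0') &&
      response !== undefined
  );
}

// Hypothetical promisified unary call: resolve with the response when the
// reset is benign, otherwise reject with the original error.
function callUnary(client, method, request) {
  return new Promise((resolve, reject) => {
    client[method](request, (err, response) => {
      if (!err || isBenignRstStream(err, response)) resolve(response);
      else reject(err);
    });
  });
}
```

The predicate is deliberately narrow so that any other INTERNAL error, or a reset without a buffered message, is still rejected.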

Describe alternatives you've considered

For the same server, I wrote a Scala client with sbt-akka-grpc. It worked out of the box, and I didn't have to deal with the case of a correct message followed by an RST_STREAM with code 0 error.

// Scala gRPC client
call onComplete {
    case Success(msg) =>    // <-- I just received the message on the success case for the same scenario
        myUnmarshalling(msg)
        exit(0)

    case Failure(e) =>
        e.printStackTrace(System.err)
        exit(1)
}

Hence, another option would be for grpc-js to treat this special case as a normal OK case and deliver the message to the caller instead of surfacing the error.


@murgatroid99
Member

Since it works with another client, you may be encountering a bug in grpc-js. Would you mind running your client with the environment variables GRPC_TRACE=all and GRPC_VERBOSITY=DEBUG, and sharing the output from a run when this error occurs?
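For reference, this can be done by prefixing the client's start command with the two variables; grpc-js writes its trace output to stderr. In the sketch below, the function name, the `node client.js` entry point, and the log file name are all placeholders:

```shell
# Run a command with gRPC debug tracing enabled, capturing the trace
# output (which grpc-js writes to stderr) in a file.
run_with_grpc_debug() {
    GRPC_TRACE=all GRPC_VERBOSITY=DEBUG "$@" 2> grpc-debug.log
}

# Usage (with your own client entry point):
# run_with_grpc_debug node client.js
```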

@moatra

moatra commented Jan 12, 2022

I seem to have a similar issue; the requested logs are below.

As far as I can tell, this seems to occur when the server (in my case Tonic) tries to short circuit and return the full response in the initial response headers frame, and does not put the grpc status details in a trailer frame.

D 2022-01-12T07:20:13.881Z | index | Loading @grpc/grpc-js version 1.4.1                                                                                                                                                                                                                                                                                              
D 2022-01-12T07:20:14.047Z | resolving_load_balancer | dns:localhost:50051 IDLE -> IDLE                                                                                                                                                                                                                                                                               
D 2022-01-12T07:20:14.048Z | connectivity_state | (1) dns:localhost:50051 IDLE -> IDLE                                                                                                                                                                                                                                                                                
D 2022-01-12T07:20:14.048Z | dns_resolver | Resolver constructed for target dns:localhost:50051                                                                                                                                                                                                                                                                       
D 2022-01-12T07:20:14.049Z | channel | (1) dns:localhost:50051 Channel constructed with options {}                                                                                                                                                                                                                                                                    
D 2022-01-12T07:20:24.883Z | channel | (1) dns:localhost:50051 createCall [0] method="/identity.UnauthedIdentity/LogIn", deadline=Infinity                                                                                                                                                                                                                            
D 2022-01-12T07:20:24.884Z | call_stream | [0] Sending metadata                                                                                                                                                                                                                                                                                                       
D 2022-01-12T07:20:24.884Z | dns_resolver | Resolution update requested for target dns:localhost:50051                                                                                                                                                                                                                                                                
D 2022-01-12T07:20:24.885Z | resolving_load_balancer | dns:localhost:50051 IDLE -> CONNECTING                                                                                                                                                                                                                                                                         
D 2022-01-12T07:20:24.885Z | connectivity_state | (1) dns:localhost:50051 IDLE -> CONNECTING                                                                                                                                                                                                                                                                          
D 2022-01-12T07:20:24.885Z | resolving_load_balancer | dns:localhost:50051 CONNECTING -> CONNECTING                                                                                                                                                                                                                                                                   
D 2022-01-12T07:20:24.885Z | connectivity_state | (1) dns:localhost:50051 CONNECTING -> CONNECTING                                                                                                                                                                                                                                                                    
D 2022-01-12T07:20:24.885Z | channel | (1) dns:localhost:50051 callRefTimer.ref | configSelectionQueue.length=1 pickQueue.length=0                                                                                                                                                                                                                                    
D 2022-01-12T07:20:24.887Z | call_stream | [0] write() called with message of length 131                                                                                                                                                                                                                                                                              
D 2022-01-12T07:20:24.887Z | call_stream | [0] end() called                                                                                                                                                                                                                                                                                                           
D 2022-01-12T07:20:24.887Z | call_stream | [0] deferring writing data chunk of length 136                                                                                                                                                                                                                                                                             
D 2022-01-12T07:20:24.888Z | dns_resolver | Resolved addresses for target dns:localhost:50051: [::1:50051,127.0.0.1:50051]                                                                                                                                                                                                                                            
D 2022-01-12T07:20:24.889Z | pick_first | Connect to address list ::1:50051,127.0.0.1:50051                                                                                                                                                                                                                                                                           
D 2022-01-12T07:20:24.890Z | subchannel | (2) ::1:50051 Subchannel constructed with options {}                                                                                                                                                                                                                                                                        
D 2022-01-12T07:20:24.890Z | subchannel_refcount | (2) ::1:50051 refcount 0 -> 1                                                                                                                                                                                                                                                                                      
D 2022-01-12T07:20:24.890Z | subchannel | (3) 127.0.0.1:50051 Subchannel constructed with options {}                                                                                                                                                                                                                                                                  
D 2022-01-12T07:20:24.890Z | subchannel_refcount | (3) 127.0.0.1:50051 refcount 0 -> 1                                                                                                                                                                                                                                                                                
D 2022-01-12T07:20:24.890Z | subchannel_refcount | (2) ::1:50051 refcount 1 -> 2                                                                                                                                                                                                                                                                                      
D 2022-01-12T07:20:24.890Z | subchannel_refcount | (3) 127.0.0.1:50051 refcount 1 -> 2                                                                                                                                                                                                                                                                                
D 2022-01-12T07:20:24.890Z | pick_first | Start connecting to subchannel with address ::1:50051                                                                                                                                                                                                                                                                       
D 2022-01-12T07:20:24.890Z | pick_first | IDLE -> CONNECTING                                                                                                                                                                                                                                                                                                          
D 2022-01-12T07:20:24.890Z | resolving_load_balancer | dns:localhost:50051 CONNECTING -> CONNECTING 
D 2022-01-12T07:20:24.890Z | channel | (1) dns:localhost:50051 callRefTimer.unref | configSelectionQueue.length=1 pickQueue.length=0
D 2022-01-12T07:20:24.890Z | connectivity_state | (1) dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.891Z | subchannel | (2) ::1:50051 IDLE -> CONNECTING
D 2022-01-12T07:20:24.891Z | pick_first | CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.891Z | resolving_load_balancer | dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.891Z | connectivity_state | (1) dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.892Z | channel | (1) dns:localhost:50051 Pick result: QUEUE subchannel: undefined status: undefined undefined
D 2022-01-12T07:20:24.892Z | channel | (1) dns:localhost:50051 callRefTimer.ref | configSelectionQueue.length=0 pickQueue.length=1
D 2022-01-12T07:20:24.892Z | subchannel | (3) 127.0.0.1:50051 IDLE -> CONNECTING
D 2022-01-12T07:20:24.892Z | pick_first | CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.892Z | resolving_load_balancer | dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.892Z | channel | (1) dns:localhost:50051 callRefTimer.unref | configSelectionQueue.length=0 pickQueue.length=0
D 2022-01-12T07:20:24.892Z | channel | (1) dns:localhost:50051 Pick result: QUEUE subchannel: undefined status: undefined undefined
D 2022-01-12T07:20:24.892Z | channel | (1) dns:localhost:50051 callRefTimer.ref | configSelectionQueue.length=0 pickQueue.length=1
D 2022-01-12T07:20:24.892Z | connectivity_state | (1) dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.892Z | subchannel | (2) ::1:50051 creating HTTP/2 session
D 2022-01-12T07:20:24.894Z | subchannel | (3) 127.0.0.1:50051 creating HTTP/2 session
D 2022-01-12T07:20:24.896Z | subchannel | (2) ::1:50051 connection closed with error connect ECONNREFUSED ::1:50051
D 2022-01-12T07:20:24.896Z | subchannel | (2) ::1:50051 connection closed
D 2022-01-12T07:20:24.896Z | subchannel | (2) ::1:50051 CONNECTING -> TRANSIENT_FAILURE
D 2022-01-12T07:20:24.897Z | pick_first | CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.897Z | resolving_load_balancer | dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.897Z | channel | (1) dns:localhost:50051 callRefTimer.unref | configSelectionQueue.length=0 pickQueue.length=0
D 2022-01-12T07:20:24.897Z | channel | (1) dns:localhost:50051 Pick result: QUEUE subchannel: undefined status: undefined undefined
D 2022-01-12T07:20:24.897Z | channel | (1) dns:localhost:50051 callRefTimer.ref | configSelectionQueue.length=0 pickQueue.length=1
D 2022-01-12T07:20:24.897Z | connectivity_state | (1) dns:localhost:50051 CONNECTING -> CONNECTING
D 2022-01-12T07:20:24.898Z | subchannel | (3) 127.0.0.1:50051 CONNECTING -> READY
D 2022-01-12T07:20:24.898Z | pick_first | Pick subchannel with address 127.0.0.1:50051
D 2022-01-12T07:20:24.898Z | pick_first | CONNECTING -> READY
D 2022-01-12T07:20:24.899Z | resolving_load_balancer | dns:localhost:50051 CONNECTING -> READY
D 2022-01-12T07:20:24.899Z | channel | (1) dns:localhost:50051 callRefTimer.unref | configSelectionQueue.length=0 pickQueue.length=0
D 2022-01-12T07:20:24.899Z | channel | (1) dns:localhost:50051 Pick result: COMPLETE subchannel: 127.0.0.1:50051 status: undefined undefined
D 2022-01-12T07:20:24.899Z | connectivity_state | (1) dns:localhost:50051 CONNECTING -> READY
D 2022-01-12T07:20:24.899Z | subchannel_refcount | (3) 127.0.0.1:50051 refcount 2 -> 3
D 2022-01-12T07:20:24.899Z | subchannel_refcount | (2) ::1:50051 refcount 2 -> 1
D 2022-01-12T07:20:24.899Z | subchannel_refcount | (3) 127.0.0.1:50051 refcount 3 -> 2
D 2022-01-12T07:20:24.901Z | call_stream | Starting stream on subchannel (3) 127.0.0.1:50051 with headers
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity
                :authority: localhost:50051
                user-agent: grpc-node-js/1.4.1
                content-type: application/grpc
                :method: POST
                :path: /identity.UnauthedIdentity/LogIn
                te: trailers

D 2022-01-12T07:20:24.901Z | call_stream | [0] attachHttp2Stream from subchannel 127.0.0.1:50051
D 2022-01-12T07:20:24.902Z | subchannel_refcount | (3) 127.0.0.1:50051 callRefcount 0 -> 1
D 2022-01-12T07:20:24.902Z | call_stream | [0] sending data chunk of length 136 (deferred)
D 2022-01-12T07:20:24.902Z | call_stream | [0] calling end() on HTTP/2 stream
D 2022-01-12T07:20:24.905Z | call_stream | [0] Received server headers:
                :status: 200
                content-type: application/grpc
                grpc-status: 3
                grpc-message: One%20or%20more%20of%20the%20request%20arguments%20were%20invalid%20(See%20details%20for%20ValidationErrors)
                grpc-status-details-bin: CkkKDS5tZXRob2QuZW1haWwSOEludmFsaWQgZW1haWwgYWRkcmVzcyAobXVzdCBjb250YWluICdAJyBhbmQgbGF0ZXIgYSAnLicp
                date: Wed, 12 Jan 2022 07:20:24 GMT

D 2022-01-12T07:20:24.906Z | call_stream | [0] HTTP/2 stream closed with code 0
D 2022-01-12T07:20:24.908Z | call_stream | [0] ended with status: code=13 details="Received RST_STREAM with code 0"
D 2022-01-12T07:20:24.909Z | subchannel_refcount | (3) 127.0.0.1:50051 callRefcount 1 -> 0
An unexpected error occurred while handling /login endpoint: Error: 13 INTERNAL: Received RST_STREAM with code 0
    at Object.callErrorFromStatus (/home/thomas/Projects/mros/web-ingress/node_modules/@grpc/grpc-js/build/src/call.js:31:26)
    at Object.onReceiveStatus (/home/thomas/Projects/mros/web-ingress/node_modules/@grpc/grpc-js/build/src/client.js:180:52)
    at Object.onReceiveStatus (/home/thomas/Projects/mros/web-ingress/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:336:141)
    at Object.onReceiveStatus (/home/thomas/Projects/mros/web-ingress/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:299:181)
    at /home/thomas/Projects/mros/web-ingress/node_modules/@grpc/grpc-js/build/src/call-stream.js:160:78
    at processTicksAndRejections (node:internal/process/task_queues:78:11) {
  code: 13,
  details: 'Received RST_STREAM with code 0',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} }
}
D 2022-01-12T07:20:25.893Z | subchannel | (2) ::1:50051 TRANSIENT_FAILURE -> CONNECTING
D 2022-01-12T07:20:25.893Z | subchannel | (2) ::1:50051 creating HTTP/2 session
D 2022-01-12T07:20:25.893Z | subchannel | (2) ::1:50051 connection closed with error connect ECONNREFUSED ::1:50051
D 2022-01-12T07:20:25.893Z | subchannel | (2) ::1:50051 connection closed
D 2022-01-12T07:20:25.893Z | subchannel | (2) ::1:50051 CONNECTING -> TRANSIENT_FAILURE
D 2022-01-12T07:20:25.893Z | subchannel | (2) ::1:50051 TRANSIENT_FAILURE -> IDLE
D 2022-01-12T07:20:34.891Z | subchannel_refcount | (2) ::1:50051 refcount 1 -> 0

@moatra

moatra commented Jan 12, 2022

Apologies for the possible red herring. After further investigation it appears that my situation was caused by a faulty middleware not properly propagating end of stream hints. After fixing, grpc-js seems to work fine in the trailers-only situation.

Logs from the server-side (with faulty middleware):

2022-01-12T16:51:04.833882Z  INFO request{request_id=01FS7K6EQ00HV9DXP2A0H60HJ1}: request_tracing: request stream completed method=POST uri=http://localhost:50051/identity.UnauthedIdentity/LogIn latency=289 stream_duration=0 http_status=200 OK grpc_status=Unauthenticated
2022-01-12T16:51:04.834000Z DEBUG Connection{peer=Server}: h2::codec::framed_write: send frame=Headers { stream_id: StreamId(1), flags: (0x4: END_HEADERS) }
2022-01-12T16:51:04.834120Z DEBUG Connection{peer=Server}: h2::codec::framed_write: send frame=Data { stream_id: StreamId(1), flags: (0x1: END_STREAM) }

With proper middleware:

2022-01-12T16:49:33.576577Z DEBUG trapi_server{request_id=01FS7K3NK40K2YDG1XYKXK9MQW}: tower_http::trace::on_response: finished processing request latency=292 ms status=16
2022-01-12T16:49:33.576605Z  INFO trapi_server{request_id=01FS7K3NK40K2YDG1XYKXK9MQW}: tower_http::trace::on_failure: response failed classification=Code: 16 latency=292 ms
2022-01-12T16:49:33.576793Z DEBUG Connection{peer=Server}: h2::codec::framed_write: send frame=Headers { stream_id: StreamId(1), flags: (0x5: END_HEADERS | END_STREAM) }
2022-01-12T16:49:33.579204Z DEBUG Connection{peer=Server}: h2::codec::framed_read: received frame=Reset { stream_id: StreamId(1), error_code: NO_ERROR }

@collinalexbell

collinalexbell commented Jan 10, 2023

I think my team has come across a similar issue, "Error: 13 INTERNAL: Received RST_STREAM with code 0", which is confounding us.

@collinalexbell

Could this possibly be related to #2316?

@murgatroid99
Member

That error occurs when a client receives an RST_STREAM that ends the stream before any trailers arrive from the server. That is a protocol violation, which is why an error is surfaced. I don't see any way that the change you linked would impact the handling of the events that result in this error.

As I asked the previous person, can you run your client with the environment variables GRPC_TRACE=all and GRPC_VERBOSITY=DEBUG, and share the output from a run when this error occurs?

@collinalexbell

collinalexbell commented Jan 10, 2023

Ok. The reason I thought they may be related is that we are also intermittently seeing dropped data (albeit, we don't manage the resource server, so we can't say with 100% certainty that it isn't the resource that has a problem).

Given what you said, the dropped data is probably unrelated to the RST_STREAM error which I think we can just retry on.

Anyway, I'll run with that env tomorrow once my team is back online.

@murgatroid99
Member

How long have you been seeing this intermittently dropped data? I only published that change 4 hours ago, and it's supposed to be a fix for that problem.

@collinalexbell

Ok, so we were using 1.8.0 and the dropped data was definitely fixed by bumping to 1.8.2 or rolling back to 1.7.3. We started experiencing the problem after an upgrade to 1.8.0 sometime last month.

@i7N3

i7N3 commented Mar 2, 2023

@murgatroid99
I have the same problem.
Sometimes it works as expected: I get an INTERNAL 13 error.
Sometimes not: I get RST_STREAM with code 0.

Tested on:
@grpc/grpc-js: 1.5.9, 1.6.12, 1.7.3, 1.8.11 (tested all versions)
node: 16.18.0 and 16.19.1 (nvm)
MacOS m1 Monterey 12.5

The server runs on node 16.18.0 and uses grpc-js 1.8.11.

Server error log (I am not seeing any gRPC error here):

[Cannot read properties of undefined (reading 'gender') TypeError: Cannot read properties of undefined (reading 'gender')]

Server code (nestjs):
main.ts

// Imports ...

async function bootstrap() {
    const port = process.env.PORT || '5000'
    const address = '0.0.0.0'

    const journalingProtoPath = join(
        process.cwd(),
        '/node_modules/path/to/protofile.proto'
    )

    const microserviceOptions: MicroserviceOptions = {
        transport: Transport.GRPC,
        options: {
            package: [JOURNALING_PACKAGE_NAME],
            protoPath: [journalingProtoPath],
            url: `${address}:${port}`,
        },
    }

    const app = await NestFactory.createMicroservice(AppModule, {
        ...microserviceOptions,
    })

    app.listen().then(() => {
        instance.info(
            `Started GRPC server on ${microserviceOptions.options.url}`
        )
    })
}
bootstrap()

exceptionsFilter.ts

import { throwError } from 'rxjs'
import { status } from '@grpc/grpc-js'
import { BaseRpcExceptionFilter, RpcException } from '@nestjs/microservices'
import {
    Catch,
    Inject,
    Logger,
    ArgumentsHost,
    LoggerService,
    RpcExceptionFilter,
} from '@nestjs/common'

@Catch()
export class AllExceptionsFilter
    extends BaseRpcExceptionFilter
    implements RpcExceptionFilter
{
    constructor(@Inject(Logger) private readonly logger: LoggerService) {
        super()
    }

    catch(exception: any, host: ArgumentsHost) {
        const [req] = host.getArgs()
        const code =
            exception.code !== undefined ? exception.code : status.INTERNAL
        const message = exception.details
            ? String(exception.details)
            : exception.message
        const stack = exception.stack
        const userId = req?.user?.id ?? 'NONE-JOURNALING'
        this.logger.error({
            userId,
            message,
            stack,
        })
        return throwError(
            new RpcException({
                code,
                stack,
                message,
            }).getError()
        )
    }
}

Client code:

// Imports ...

type CredentialsReturnType = {
    callCreds: grpc.CallCredentials
    channelCredentials: grpc.ChannelCredentials
}

export class GrpcClient {
    private static email_notification: EmailNotificationServiceClient
    private static image_generator: ImageGeneratorServiceClient
    private static analytics: AnalyticsServiceClient
    private static question: QuestionServiceClient
    private static goal: GoalServiceClient
    private static questionnaire: QuestionnaireServiceClient
    private static journaling: JournalingServiceClient
    private static blog: BlogServiceClient
    private static wellbeing_techniques: WellbeingTechniquesServiceClient
    private static chat: ChatServiceClient
    private static follow_on_question: FollowOnQuestionServiceClient

    async initialize() {
        const packageNames = Object.values(ProtoPackage)

        for await (const packageName of packageNames) {
            const serviceUrl = config.services.servicesUrl[packageName]
            const serviceName = serviceNameMapping[packageName]

            const protoPath = path.join(
                process.cwd(),
                config.services.protoPaths[packageName]
            )

            const PROTO_PATH = path.resolve(__dirname, protoPath)

            const { channelCredentials, callCreds } =
                await GrpcClient.getCredentials(serviceUrl)

            var packageDefinition = protoLoader.loadSync(PROTO_PATH, {
                keepCase: false,
                longs: Number,
                enums: Number,
                defaults: true,
                oneofs: true
            })

            const packageDef = grpc.loadPackageDefinition(packageDefinition),
                routeguide = packageDef[packageName],
                service = get(routeguide, serviceName),
                client = new service(
                    serviceUrl,
                    // grpc.credentials.createInsecure(), // Uncomment to work on local
                    grpc.credentials.combineChannelCredentials(
                        channelCredentials,
                        callCreds
                    ), // Comment to work on local
                    {
                        "grpc.service_config": JSON.stringify({
                            methodConfig: [
                                {
                                    name: [
                                        {
                                            service: `${packageName}.${serviceName}`
                                        }
                                    ],
                                    retryPolicy: {
                                        maxAttempts: 2,
                                        initialBackoff: "1s",
                                        maxBackoff: "4s",
                                        backoffMultiplier: 2,
                                        retryableStatusCodes: ["UNAVAILABLE"]
                                    }
                                }
                            ]
                        })
                    } as grpc.ChannelOptions
                )

            GrpcClient[packageName] = client

            logger.info(`Created GrpcClient for ${packageName}`)
        }
    }

    private static async getCredentials(
        serviceUrl: string
    ): Promise<CredentialsReturnType> {
        logger.info("get credentials for grpcClient")
        const target = `https://${serviceUrl}`
        const idTokenClient = await new GoogleAuth().getIdTokenClient(target)
        const channelCredentials = grpc.credentials.createSsl()

        const callCreds =
            grpc.credentials.createFromGoogleCredential(idTokenClient)
        return {
            channelCredentials,
            callCreds
        }
    }

    public static async request(
        method: string,
        payload: any,
        packageName: ProtoPackage
    ): Promise<any> {
        try {
            if (!GrpcClient[packageName]) {
                throw new Error(`GrpcClient for ${packageName} not initialized`)
            }
            return new Promise((resolve, reject) =>
                get(GrpcClient, packageName)[method](
                    { ...payload },
                    (err: any, response: any) => {
                        if (err) return reject(err)
                        resolve(response)
                    }
                )
            )
        } catch (err) {
            throw err
        }
    }
}

Client logs:

RST_STREAM with code 0

D 2023-03-02T13:32:47.673Z | channel | (10) dns:projectX-journaling-7llsghxwka-ew.a.run.app createCall [5] method="/journaling.JournalingService/submitGuidedJournalWithResult", deadline=Infinity
D 2023-03-02T13:32:47.680Z | call_stream | [5] Sending metadata
D 2023-03-02T13:32:47.680Z | channel | (10) dns:projectX-journaling-7llsghxwka-ew.a.run.app Pick result for call [5]: COMPLETE subchannel: (18) 216.239.38.53:443 status: undefined undefined
D 2023-03-02T13:32:47.682Z | call_stream | [5] write() called with message of length 152
D 2023-03-02T13:32:47.682Z | call_stream | [5] end() called
D 2023-03-02T13:32:47.687Z | call_stream | Starting stream [5] on subchannel (18) 216.239.38.53:443 with headers
                authorization: Bearer xxxxx
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity
                :authority: projectX-journaling-7llsghxwka-ew.a.run.app
                user-agent: grpc-node-js/1.6.12
                content-type: application/grpc
                :method: POST
                :path: /journaling.JournalingService/submitGuidedJournalWithResult
                te: trailers

D 2023-03-02T13:32:47.688Z | subchannel_flowctrl | (18) 216.239.38.53:443 local window size: 65535 remote window size: 1048053
D 2023-03-02T13:32:47.688Z | subchannel_internals | (18) 216.239.38.53:443 session.closed=false session.destroyed=false session.socket.destroyed=false
D 2023-03-02T13:32:47.688Z | call_stream | [5] attachHttp2Stream from subchannel 216.239.38.53:443
D 2023-03-02T13:32:47.688Z | subchannel_refcount | (18) 216.239.38.53:443 callRefcount 0 -> 1
D 2023-03-02T13:32:47.688Z | call_stream | [5] sending data chunk of length 157
D 2023-03-02T13:32:47.689Z | call_stream | [5] calling end() on HTTP/2 stream
D 2023-03-02T13:32:47.849Z | call_stream | [5] Received server headers:
                :status: 200
                grpc-status: 13
                grpc-message: Cannot%20read%20properties%20of%20undefined%20(reading%20'gender')
                content-type: application/grpc+proto
                x-cloud-trace-context: 412e5480c0a77dbbeb99dc2645b102a2
                date: Thu, 02 Mar 2023 13:32:47 GMT
                server: Google Frontend
                content-length: 0
                alt-svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"

D 2023-03-02T13:32:47.851Z | call_stream | [5] HTTP/2 stream closed with code 0
D 2023-03-02T13:32:47.851Z | call_stream | [5] ended with status: code=13 details="Received RST_STREAM with code 0"
D 2023-03-02T13:32:47.851Z | subchannel_refcount | (18) 216.239.38.53:443 callRefcount 1 -> 0
error: Received RST_STREAM with code 0
RpcError: Received RST_STREAM with code 0
    at /Users/user/Work/projectX-backend/build/webpack:/src/helpers/grpc/journaling.helpers.ts:117:15
    at Generator.throw (<anonymous>)
    at rejected (/Users/user/Work/projectX-backend/build/main.js:15468:65)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
D 2023-03-02T13:32:47.927Z | channel | (3) dns:logging.googleapis.com:443 createCall [7] method="/google.logging.v2.LoggingServiceV2/WriteLogEntries", deadline=Thu Mar 02 2023 17:33:47 GMT+0400 (Georgia Standard Time)
D 2023-03-02T13:32:47.928Z | call_stream | [7] Sending metadata
D 2023-03-02T13:32:47.928Z | channel | (3) dns:logging.googleapis.com:443 Pick result for call [7]: COMPLETE subchannel: (5) 216.239.36.174:443 status: undefined undefined
D 2023-03-02T13:32:47.932Z | call_stream | [7] write() called with message of length 966
D 2023-03-02T13:32:47.932Z | call_stream | [7] end() called
D 2023-03-02T13:32:47.933Z | call_stream | [7] deferring writing data chunk of length 971
D 2023-03-02T13:32:47.934Z | call_stream | Starting stream [7] on subchannel (5) 216.239.36.174:443 with headers
                x-goog-api-client: gax/3.5.2 gapic/10.3.2 gl-node/16.18.0 grpc/1.7.3 gccl/10.3.2
                authorization: Bearer xxxxx
                grpc-timeout: 59993m
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity
                :authority: logging.googleapis.com:443
                user-agent: grpc-node-js/1.7.3
                content-type: application/grpc
                :method: POST
                :path: /google.logging.v2.LoggingServiceV2/WriteLogEntries
                te: trailers

D 2023-03-02T13:32:47.934Z | subchannel_flowctrl | (5) 216.239.36.174:443 local window size: 65500 remote window size: 1041190
D 2023-03-02T13:32:47.934Z | subchannel_internals | (5) 216.239.36.174:443 session.closed=false session.destroyed=false session.socket.destroyed=false
D 2023-03-02T13:32:47.934Z | call_stream | [7] attachHttp2Stream from subchannel 216.239.36.174:443
D 2023-03-02T13:32:47.935Z | subchannel_refcount | (5) 216.239.36.174:443 callRefcount 0 -> 1
D 2023-03-02T13:32:47.935Z | call_stream | [7] sending data chunk of length 971 (deferred)
D 2023-03-02T13:32:47.935Z | call_stream | [7] calling end() on HTTP/2 stream
D 2023-03-02T13:32:48.206Z | call_stream | [7] Received server headers:
                :status: 200
                content-disposition: attachment
                content-type: application/grpc
                date: Thu, 02 Mar 2023 13:32:48 GMT
                alt-svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000

D 2023-03-02T13:32:48.208Z | call_stream | [7] receive HTTP/2 data frame of length 5
D 2023-03-02T13:32:48.208Z | call_stream | [7] parsed message of length 5
D 2023-03-02T13:32:48.208Z | call_stream | [7] filterReceivedMessage of length 5
D 2023-03-02T13:32:48.209Z | call_stream | [7] pushing to reader message of length 0
D 2023-03-02T13:32:48.210Z | call_stream | [7] Received server trailers:
                grpc-status: 0
                content-disposition: attachment

D 2023-03-02T13:32:48.210Z | call_stream | [7] received status code 0 from server
D 2023-03-02T13:32:48.211Z | call_stream | [7] close http2 stream with code 0
D 2023-03-02T13:32:48.211Z | call_stream | [7] ended with status: code=0 details=""
D 2023-03-02T13:32:48.211Z | subchannel_refcount | (5) 216.239.36.174:443 callRefcount 1 -> 0
D 2023-03-02T13:32:48.213Z | call_stream | [7] HTTP/2 stream closed with code 0
D 2023-03-02T13:33:26.992Z | call_stream | [0] receive HTTP/2 data frame of length 33
D 2023-03-02T13:33:26.992Z | call_stream | [0] parsed message of length 33
D 2023-03-02T13:33:26.992Z | call_stream | [0] filterReceivedMessage of length 33
D 2023-03-02T13:33:26.993Z | call_stream | [0] pushing to reader message of length 28
D 2023-03-02T13:34:12.004Z | call_stream | [0] receive HTTP/2 data frame of length 33
D 2023-03-02T13:34:12.004Z | call_stream | [0] parsed message of length 33
D 2023-03-02T13:34:12.004Z | call_stream | [0] filterReceivedMessage of length 33
D 2023-03-02T13:34:12.005Z | call_stream | [0] pushing to reader message of length 28

What I want to see every time (the server's status code and details are propagated correctly):

D 2023-03-02T13:32:39.377Z | channel | (10) dns:projectX-journaling-7llsghxwka-ew.a.run.app createCall [4] method="/journaling.JournalingService/submitGuidedJournalWithResult", deadline=Infinity
D 2023-03-02T13:32:39.385Z | call_stream | [4] Sending metadata
D 2023-03-02T13:32:39.386Z | channel | (10) dns:projectX-journaling-7llsghxwka-ew.a.run.app Pick result for call [4]: COMPLETE subchannel: (18) 216.239.38.53:443 status: undefined undefined
D 2023-03-02T13:32:39.387Z | call_stream | [4] write() called with message of length 152
D 2023-03-02T13:32:39.387Z | call_stream | [4] end() called
D 2023-03-02T13:32:39.393Z | call_stream | Starting stream [4] on subchannel (18) 216.239.38.53:443 with headers
                authorization: Bearer xxxxx
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity
                :authority: projectX-journaling-7llsghxwka-ew.a.run.app
                user-agent: grpc-node-js/1.6.12
                content-type: application/grpc
                :method: POST
                :path: /journaling.JournalingService/submitGuidedJournalWithResult
                te: trailers

D 2023-03-02T13:32:39.394Z | subchannel_flowctrl | (18) 216.239.38.53:443 local window size: 65535 remote window size: 1048210
D 2023-03-02T13:32:39.394Z | subchannel_internals | (18) 216.239.38.53:443 session.closed=false session.destroyed=false session.socket.destroyed=false
D 2023-03-02T13:32:39.394Z | call_stream | [4] attachHttp2Stream from subchannel 216.239.38.53:443
D 2023-03-02T13:32:39.394Z | subchannel_refcount | (18) 216.239.38.53:443 callRefcount 0 -> 1
D 2023-03-02T13:32:39.395Z | call_stream | [4] sending data chunk of length 157
D 2023-03-02T13:32:39.395Z | call_stream | [4] calling end() on HTTP/2 stream
D 2023-03-02T13:32:39.825Z | call_stream | [4] Received server headers:
                :status: 200
                grpc-status: 13
                grpc-message: Cannot%20read%20properties%20of%20undefined%20(reading%20'gender')
                content-type: application/grpc+proto
                x-cloud-trace-context: 4a8444d9fb5fd26f7841ff07d5129799;o=1
                date: Thu, 02 Mar 2023 13:32:39 GMT
                server: Google Frontend
                content-length: 0
                alt-svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"

D 2023-03-02T13:32:39.826Z | call_stream | [4] Received server trailers:
                :status: 200
                grpc-status: 13
                grpc-message: Cannot%20read%20properties%20of%20undefined%20(reading%20'gender')
                content-type: application/grpc+proto
                x-cloud-trace-context: 4a8444d9fb5fd26f7841ff07d5129799;o=1
                date: Thu, 02 Mar 2023 13:32:39 GMT
                server: Google Frontend
                content-length: 0
                alt-svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"

D 2023-03-02T13:32:39.827Z | call_stream | [4] received status code 13 from server
D 2023-03-02T13:32:39.827Z | call_stream | [4] received status details string "Cannot read properties of undefined (reading 'gender')" from server
D 2023-03-02T13:32:39.827Z | call_stream | [4] ended with status: code=13 details="Cannot read properties of undefined (reading 'gender')"
D 2023-03-02T13:32:39.827Z | subchannel_refcount | (18) 216.239.38.53:443 callRefcount 1 -> 0
D 2023-03-02T13:32:39.827Z | call_stream | [4] close http2 stream with code 8
error: Cannot read properties of undefined (reading 'gender')
RpcError: Cannot read properties of undefined (reading 'gender')
    at /Users/user/Work/projectX-backend/build/webpack:/src/helpers/grpc/journaling.helpers.ts:117:15
    at Generator.throw (<anonymous>)
    at rejected (/Users/user/Work/projectX-backend/build/main.js:15468:65)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
D 2023-03-02T13:32:39.855Z | call_stream | [4] HTTP/2 stream closed with code 8
D 2023-03-02T13:32:39.904Z | channel | (3) dns:logging.googleapis.com:443 createCall [6] method="/google.logging.v2.LoggingServiceV2/WriteLogEntries", deadline=Thu Mar 02 2023 17:33:39 GMT+0400 (Georgia Standard Time)
D 2023-03-02T13:32:39.905Z | call_stream | [6] Sending metadata
D 2023-03-02T13:32:39.906Z | channel | (3) dns:logging.googleapis.com:443 Pick result for call [6]: COMPLETE subchannel: (5) 216.239.36.174:443 status: undefined undefined
D 2023-03-02T13:32:39.910Z | call_stream | [6] write() called with message of length 1035
D 2023-03-02T13:32:39.911Z | call_stream | [6] end() called
D 2023-03-02T13:32:39.912Z | call_stream | [6] deferring writing data chunk of length 1040
D 2023-03-02T13:32:39.913Z | call_stream | Starting stream [6] on subchannel (5) 216.239.36.174:443 with headers
                x-goog-api-client: gax/3.5.2 gapic/10.3.2 gl-node/16.18.0 grpc/1.7.3 gccl/10.3.2
                authorization: Bearer xxxxx
                grpc-timeout: 59991m
                grpc-accept-encoding: identity,deflate,gzip
                accept-encoding: identity
                :authority: logging.googleapis.com:443
                user-agent: grpc-node-js/1.7.3
                content-type: application/grpc
                :method: POST
                :path: /google.logging.v2.LoggingServiceV2/WriteLogEntries
                te: trailers

D 2023-03-02T13:32:39.913Z | subchannel_flowctrl | (5) 216.239.36.174:443 local window size: 65505 remote window size: 1042230
D 2023-03-02T13:32:39.913Z | subchannel_internals | (5) 216.239.36.174:443 session.closed=false session.destroyed=false session.socket.destroyed=false
D 2023-03-02T13:32:39.913Z | call_stream | [6] attachHttp2Stream from subchannel 216.239.36.174:443
D 2023-03-02T13:32:39.913Z | subchannel_refcount | (5) 216.239.36.174:443 callRefcount 0 -> 1
D 2023-03-02T13:32:39.913Z | call_stream | [6] sending data chunk of length 1040 (deferred)
D 2023-03-02T13:32:39.913Z | call_stream | [6] calling end() on HTTP/2 stream
D 2023-03-02T13:32:40.104Z | call_stream | [6] Received server headers:
                :status: 200
                content-disposition: attachment
                content-type: application/grpc
                date: Thu, 02 Mar 2023 13:32:40 GMT
                alt-svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000

D 2023-03-02T13:32:40.105Z | call_stream | [6] receive HTTP/2 data frame of length 5
D 2023-03-02T13:32:40.106Z | call_stream | [6] parsed message of length 5
D 2023-03-02T13:32:40.106Z | call_stream | [6] filterReceivedMessage of length 5
D 2023-03-02T13:32:40.107Z | call_stream | [6] pushing to reader message of length 0
D 2023-03-02T13:32:40.107Z | call_stream | [6] Received server trailers:
                grpc-status: 0
                content-disposition: attachment

D 2023-03-02T13:32:40.108Z | call_stream | [6] received status code 0 from server
D 2023-03-02T13:32:40.108Z | call_stream | [6] close http2 stream with code 0
D 2023-03-02T13:32:40.108Z | call_stream | [6] ended with status: code=0 details=""
D 2023-03-02T13:32:40.109Z | subchannel_refcount | (5) 216.239.36.174:443 callRefcount 1 -> 0
D 2023-03-02T13:32:40.110Z | call_stream | [6] HTTP/2 stream closed with code 0
D 2023-03-02T13:32:41.985Z | call_stream | [0] receive HTTP/2 data frame of length 33
D 2023-03-02T13:32:41.998Z | call_stream | [0] parsed message of length 33
D 2023-03-02T13:32:41.999Z | call_stream | [0] filterReceivedMessage of length 33
D 2023-03-02T13:32:42.002Z | call_stream | [0] pushing to reader message of length 28

I noticed that when I get "Received RST_STREAM with code 0", there are no "Received server trailers" entries in the logs.
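For anyone trying to collect traces like the ones above: grpc-js debug logging is controlled by the standard gRPC environment variables. The filename `client.js` here is a placeholder for your own entry point.

```shell
# Enable grpc-js debug output for the tracers seen in the logs above.
# GRPC_TRACE takes a comma-separated list of tracer names (or "all").
export GRPC_VERBOSITY=DEBUG
export GRPC_TRACE=call_stream,subchannel,channel
node client.js
```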

@murgatroid99
Member

@i7N3 Have you checked whether this happens when running the client and server on the same machine? If it does, a complete reproduction would really help me track down the cause of the bug.

If it does not happen, then the most likely cause is some intervening proxy, so I would recommend opening a support ticket for whatever system you are running the server on, with a reference to your comment.

@i7N3

i7N3 commented Mar 7, 2023

@murgatroid99 This does not happen when I run the client and server on the same machine.

I will open a support ticket and write back about what will come of it later.
Thank you.

@i7N3

i7N3 commented Mar 7, 2023

BTW I am using GCP - Cloud Run. Maybe someone has experience with this error on the same stack.

@vamsideepak

Facing the same issue. Is there any update on this? I am using Node.js version 16.18.0 and Electron version 24.3.0, and I am running both the gRPC server and client locally on the same machine.

@i7N3

i7N3 commented May 17, 2023

> Facing the same issue. Is there any update on this? I am using Node.js version 16.18.0 and Electron version 24.3.0, and I am running both the gRPC server and client locally on the same machine.

From me, unfortunately not.

refs:

Can you provide a minimal reproduction repo?

@foreverest

Experiencing the same issue with @grpc/grpc-js@1.6.12. Same symptoms as @i7N3: could not repro when running both client and server on the same machine. Repros in ~50% of cases when running the server on GCP Cloud Run.

@foreverest

Enabling HTTP/2 end-to-end in my deployment settings on Cloud Run fixed the issue for me.

@i7N3

i7N3 commented Aug 28, 2023

I can confirm that enabling HTTP/2 end-to-end on Cloud Run (Network) fixed the issue for me too.
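For anyone else hitting this on Cloud Run: the HTTP/2 end-to-end setting mentioned above can also be enabled from the gcloud CLI via the `--use-http2` flag. The service name and region below are placeholders.

```shell
# Enable end-to-end HTTP/2 (h2c to the container) for an existing Cloud Run service.
# "my-service" and the region are placeholders for your own deployment.
gcloud run services update my-service --region=europe-west1 --use-http2
```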


7 participants