jsreportonline connect ECONNREFUSED / socket hang up



  • One of my reports has a lot of data, so it may take 30 seconds or more to process in jsreportonline. Because of that, I've moved the report rendering to a worker process in AWS Lambda that, when complete, saves the report to my S3 bucket. Here is my code:

        try {
    
            // Code to get report data goes here .....
            // let reportData = ....
            
            // Call JSReport Engine, which returns a promise
            const res = await jsReportClient.render(
                    {
                        template: { "shortid": jsReportShortId},
                        data: reportData,
                        options: { reports: { save: false } }
                    }, 
                    { 
                        timeout: 180000 
                    });
            
            // Read the rendered PDF from the response
            const renderedPDF = await res.body();
            
            // Code to save to S3 goes here .....
    
        }
        catch (err) {
            
            // Show error
            console.log("Error getting report from JSReport Online API endpoint");
            console.log(err);
    
            // Close out the Lambda callback
            callback(null, 'Error Callback');
        }
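
    For reference, jsReportClient above is created elsewhere with the jsreport-client package, roughly along these lines (the URL and credentials below are placeholders, not the real values):

        // Client construction sketch; the service URL and credentials are placeholders
        const jsReportClient = require('jsreport-client')(
            'https://myaccount.jsreportonline.net',   // jsreportonline account URL (placeholder)
            process.env.JSREPORT_USERNAME,            // account username (placeholder)
            process.env.JSREPORT_PASSWORD             // account password (placeholder)
        );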
    

    For the report in question, sometimes this works without any issue. However, other times I get the following error:

    2020-01-20 06:01:41.033 Error: connect ECONNREFUSED 127.0.0.1:16784
        at /var/task/processJSReport/node_modules/jsreport-client/lib/client.js:62:29
        at /var/task/processJSReport/node_modules/jsreport-client/lib/client.js:115:40
        at ConcatStream.<anonymous> (/var/task/processJSReport/node_modules/concat-stream/index.js:36:43)
        at ConcatStream.emit (events.js:215:7)
        at ConcatStream.emit (domain.js:476:20)
        at finishMaybe (/var/task/processJSReport/node_modules/readable-stream/lib/_stream_writable.js:630:14)
        at afterWrite (/var/task/processJSReport/node_modules/readable-stream/lib/_stream_writable.js:492:3)
        at processTicksAndRejections (internal/process/task_queues.js:82:21) {
      remoteStack: 'Error: connect ECONNREFUSED 127.0.0.1:16784\n' +
        '    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1191:14)'
    }
    

    I've also received the following error:

    2020-01-20 06:26:21.052 Error: socket hang up
        at /var/task/processJSReport/node_modules/jsreport-client/lib/client.js:62:29
        at /var/task/processJSReport/node_modules/jsreport-client/lib/client.js:115:40
        at ConcatStream.<anonymous> (/var/task/processJSReport/node_modules/concat-stream/index.js:36:43)
        at ConcatStream.emit (events.js:215:7)
        at ConcatStream.emit (domain.js:476:20)
        at finishMaybe (/var/task/processJSReport/node_modules/readable-stream/lib/_stream_writable.js:630:14)
        at afterWrite (/var/task/processJSReport/node_modules/readable-stream/lib/_stream_writable.js:492:3)
        at processTicksAndRejections (internal/process/task_queues.js:82:21) {
      remoteStack: 'Error: socket hang up\n' +
        '    at createHangUpError (_http_client.js:342:15)\n' +
        '    at Socket.socketOnEnd (_http_client.js:437:23)\n' +
        '    at emitNone (events.js:111:20)\n' +
        '    at Socket.emit (events.js:208:7)\n' +
        '    at endReadableNT (_stream_readable.js:1064:12)\n' +
        '    at _combinedTickCallback (internal/process/next_tick.js:139:11)\n' +
        '    at process._tickCallback (internal/process/next_tick.js:181:9)'
    }
    

    Is there something that I must do to ensure that the report runs successfully each time?



  • One of my reports has a lot of data, so it may take 30 seconds or more to process in jsreportonline. Because of that, I've moved the report rendering to a worker process in AWS Lambda that, when complete, saves the report to my S3 bucket.

    I am not sure I understand correctly. You moved the rendering to Lambda, but the errors you mention are from jsreportonline?
    Do you want to solve the jsreportonline issue without using Lambda? Or are you still using Lambda in combination with jsreportonline?
    Or are you using just Lambda and the error comes from it? Please elaborate on the details.



  • Sorry, let me clarify. Let's forget about Lambda for now. Basically, when I make the request to jsreportonline for the report in question, because of the amount of data, jsreportonline fails with one of the above errors. The failure happens at the following line:

    const res = await jsReportClient.render(
    

    The threshold seems to be somewhere around 30 seconds. For the given report, if the data included in the .render() call causes jsreportonline to process for around 30 seconds or more, the request fails and .render() throws one of the errors above. If the included data keeps the processing under 30 seconds, the SAME report renders without any issues.

    Bottom line: long-running requests into .render() (30 seconds or more) seem to fail with one of the two errors above. I assumed it was a timeout issue with .render(), and that adding { timeout: 180000 } to the call would therefore fix it. However, that doesn't seem to have done anything.
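
    As a side note, the ~30 second threshold can be confirmed by timing the call with a small wrapper like the sketch below (timedRender is a made-up diagnostic helper, not part of the jsreport-client API):

        // Diagnostic sketch only: measures how long client.render() runs before it
        // resolves or throws; timedRender is a made-up name, not a jsreport-client API.
        async function timedRender (client, renderRequest, requestOptions) {
            const startedAt = Date.now();
            try {
                const res = await client.render(renderRequest, requestOptions);
                console.log(`render resolved after ${(Date.now() - startedAt) / 1000}s`);
                return res;
            } catch (err) {
                console.log(`render failed after ${(Date.now() - startedAt) / 1000}s`);
                throw err;
            }
        }

        // Usage, matching the call above:
        // const res = await timedRender(jsReportClient,
        //     { template: { shortid: jsReportShortId }, data: reportData },
        //     { timeout: 180000 });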

    I've been using Lambda to make requests to jsreportonline via client.render() without any issues for some time now, and I will continue to do so. This issue only arose recently, when one of my clients had a lot of data for this particular report.

    Note that I've also tested this by running the same client.render() request, with the same large data set and report, through an AWS Cloud9 Node.js app and received the same error, so it doesn't appear to be related to Lambda.
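
    That standalone test was roughly along the lines of the sketch below (the URL, credentials, shortid, and file names are placeholders, not the real values):

        // Standalone reproduction sketch; placeholder URL, credentials, shortid, and file names
        const fs = require('fs');
        const jsReportClient = require('jsreport-client')(
            'https://myaccount.jsreportonline.net',
            process.env.JSREPORT_USERNAME,
            process.env.JSREPORT_PASSWORD
        );

        (async () => {
            // Load the same large data set that triggers the error in Lambda
            const reportData = JSON.parse(fs.readFileSync('./large-data-set.json', 'utf8'));

            const res = await jsReportClient.render(
                { template: { shortid: 'my-template-shortid' }, data: reportData },
                { timeout: 180000 }
            );

            // Write the rendered PDF to disk instead of S3
            fs.writeFileSync('./report.pdf', await res.body());
            console.log('report saved');
        })().catch((err) => {
            console.log(err);
        });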



  • Thank you for the details. Could you email me your jsreportonline subdomain/tenant name?
    I will search our logs and check it out.
    jan.blaha@jsreport.net
    Thank you

