Not able to generate reports with 2 million rows and file size > 80MB.

  • Hi,

    We are trying to create a report with the text recipe and handlebars engine with around 2 million rows. We make an RPC call in the beforeRender function to fetch the data, and the request returns the data in around one minute.
    But then the report is never generated and our afterRender function is never called.
    The report sometimes gets generated successfully in around 3 minutes, but most of the time execution stalls between the two scripts.

    Below is our code for the beforeRender script:

      function beforeRender(req, res) {
        dmLog.info("starting before render script.");
        // database query to fetch data rows for report.
        return getData()
          .then((result) => {
            dmLog.info("got data");
            req.data["values"] = result;
            dmLog.info("data processed");
          })
          .catch((e) => {
            dmLog.error("Error in before render script for Wifi Read Report.");
            req.data["error"] = true;
          });
      }

    And this is for the afterRender script:

      function afterRender(req, res, done) {
        dmLog.info("reached after render");
        // saving generated report into filesystem.
        saveReport(req, res, done)
          .then((x) => {
            dmLog.info("afterRender for save completed.");
          })
          .catch((x) => {
            dmLog.error("afterRender for save failed.");
          });
      }

    We have set both scripts.timeout and templatingEngines.timeout in our config to 10 minutes.

    We are able to generate a report with around 1.7 million rows successfully, but we face issues at 2 million.

    Can you suggest what we can do in order to prevent this stalling?

  • The node process likely reaches the default ~1.5GB memory limit and starts to run the GC.
    You should increase the memory limit for the node.js process using the --max-old-space-size flag:

    node --max-old-space-size=6096 server.js

    or, if you use the jsreport cli, set the environment variable

    NODE_OPTIONS="--max-old-space-size=6096"

    and run

    jsreport start
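
    To verify that the larger heap limit actually took effect, a quick sanity check (a sketch using Node's built-in v8 module, run with the same flags) can print the configured ceiling:

    ```javascript
    // Print the V8 heap ceiling so you can confirm --max-old-space-size was applied.
    const v8 = require("v8");

    const limitMB = v8.getHeapStatistics().heap_size_limit / (1024 * 1024);
    console.log(`heap_size_limit: ${Math.round(limitMB)} MB`);
    ```

    If the printed value still sits near 1500 MB, the flag did not reach the process you expected.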

    This gets propagated by default only to the main process. If you use the dedicated-process or http-server strategy for templating engines/scripts, you also need to set this config:

    "templatingEngines": {
        "timeout": 1000000,
        "strategy": "dedicated-process",
        "forkOptions": {
          "execArgv": ["--max-old-space-size=6096"]

    However, the in-process strategy will give you the best performance, because it avoids serializing and deserializing such a giant JSON payload between processes.
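
    As a sketch, assuming the same config file, switching to in-process would look like:

    ```json
    "templatingEngines": {
        "timeout": 1000000,
        "strategy": "in-process"
    }
    ```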

    With the increased memory limits, I am able to produce a 200MB output with 15 million records.
