PDF of 2.6k pages takes more than one hour to generate
-
Thank you for the profile. I see you are essentially rendering one very long HTML table.
Unfortunately, browsers have had poor performance on long tables since I was a kid.
Perhaps you could try avoiding tables and replacing the cells with divs, to see if that improves performance?
Otherwise, I'm afraid I can't help. Maybe a more CPU-oriented VM, but I don't think it would help much.
-
Is there a way we can create sub-report chunks of components and merge them?
I believe you are asking about some parallelization? You can split the data and use pdf-utils in a script to concatenate the results.
Here is an example and the docs:
https://playground.jsreport.net/w/admin/UpVVJcAk
https://jsreport.net/learn/pdf-utils
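The data-splitting part of that approach can be sketched in plain JavaScript like this (the chunk size is illustrative, and the actual render/append calls via the jsreport-proxy pdf-utils API are left as comments — see the linked playground example for the exact calls):

```javascript
// Sketch: split report rows into fixed-size chunks so each chunk can be
// rendered as its own sub-report and the resulting PDFs concatenated
// with pdf-utils. The chunk size (5000) is an assumption; tune it.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// Example: 30k rows -> 6 sub-reports of 5k rows each.
const parts = chunk(Array.from({ length: 30000 }, (_, i) => i), 5000);
console.log(parts.length); // 6

// Each entry in `parts` would then become the data of one sub-report
// render, and the output PDFs would be appended together with pdf-utils.
```

Whether this pays off depends on the data; as noted below, the bookkeeping around splitting and appending can eat the gains.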
However, this partially breaks the TOC because the links won't be clickable.
https://github.com/jsreport/jsreport/issues/771
However, we experimented with such data splitting and parallel chrome rendering, and it turned out it wasn't improving performance. Chrome should already use multiple processors, and the cost of splitting, appending, and managing chrome instances ate up the gains from the parallelization.
-
I have resolved the long-table CSS issue by removing the border-collapse. Also, when we tested with 30k or 50k records, we are getting a "js core page request closed" error. Could you please help?
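For anyone hitting the same issue, the change amounts to something like the following sketch (the selector and the original rule are assumed from the description above):

```css
/* Assumed original rule: border-collapse: collapse, which forces the
   browser to resolve shared borders across the whole table at once and
   is slow on very long tables. */
table {
  border-collapse: separate; /* the default; avoids collapsed-border layout */
  border-spacing: 0;         /* keeps the result visually close to collapsed borders */
}
```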
-
Nice! Thank you for sharing!
js core page request closed error.
Could you share the whole error? I'm not sure where it comes from.
-
Error: Protocol error: Connection closed. Most likely the page has been closed.
at assert (/app/node_modules/puppeteer/lib/cjs/puppeteer/common/assert.js:26:15)
at Page.close (/app/node_modules/puppeteer/lib/cjs/puppeteer/common/Page.js:2115:32)
-
Could you run the template with the HTML recipe and email me the output?
I would try to replicate the problem on my machine.
-
-
I have sent an email from shashank16aug19@gmail.com with the data JSON as well.
-
How long does it take to get that error?
For me, it seems to just run endlessly...
As I noted, chrome has problems with long tables and there is nothing we can do about it. There is a bug tracked for it; however, I tried a later chrome and it doesn't seem to improve. The 80MB HTML is likely an edge case.
You can try to replace the table with divs, if that's possible.
-
Can we do lazy loading instead of one long iteration?
-
Not sure what you mean, but we have no influence on how chrome converts the HTML to PDF.