
Problems generating PDF reports from huge data

I have a problem when generating PDF reports. My database holds a huge amount of data. When I render the report as HTML, I can paginate the results so that less data has to be processed and shown to the user at a time. For a PDF report, however, I don't think it is nice to make clients fetch the PDF page by page; it's better to deliver the whole data set in a single PDF. But if I do that, the generation consumes a lot of time and memory, leading to server timeouts and massive memory consumption. What is the best way to implement this feature?

Should I generate the PDFs offline (say once a day or once a month) and have users access the pre-generated files? I'm not sure, because users may want to generate them themselves in real time.

What is the best approach?


You could approach it this way:

  • Fetch records from your database in batches of, say, 100 records (tune this based on how many records fit on a single PDF page)
  • For each batch, generate a separate single-page PDF and write it to a temporary location on the file system
  • Once all batches have been processed, combine the per-batch PDFs into a single resulting PDF using one of the many PDF merging tools/libraries available (a rough sketch of the whole flow follows below)
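Here is a minimal sketch of that batch-and-merge flow in Python. It assumes a SQLite database with a hypothetical `orders` table (columns `id`, `customer`, `total`), ReportLab for rendering and pypdf for merging; none of these are given in the question, so swap in whatever database, query and PDF libraries your stack actually uses.

```python
import os
import shutil
import sqlite3
import tempfile

from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas
from pypdf import PdfWriter

BATCH_SIZE = 100  # tune to roughly one PDF page worth of records


def fetch_batch(conn, offset, limit):
    """Fetch one batch of rows with LIMIT/OFFSET (keyset pagination scales better)."""
    cur = conn.execute(
        "SELECT id, customer, total FROM orders ORDER BY id LIMIT ? OFFSET ?",
        (limit, offset),
    )
    return cur.fetchall()


def render_batch(rows, path):
    """Render one batch of rows onto a single A4 page and save it as its own PDF."""
    pdf = canvas.Canvas(path, pagesize=A4)
    y = 800
    for row in rows:
        pdf.drawString(40, y, " | ".join(str(col) for col in row))
        y -= 8  # fixed line height; 100 rows fit on one A4 page
    pdf.showPage()
    pdf.save()


def build_report(db_path, out_path):
    conn = sqlite3.connect(db_path)
    tmp_dir = tempfile.mkdtemp()
    part_paths = []
    offset = 0

    # Render each batch to its own small PDF so only one batch is in memory at a time.
    while True:
        rows = fetch_batch(conn, offset, BATCH_SIZE)
        if not rows:
            break
        part_path = os.path.join(tmp_dir, f"part_{len(part_paths)}.pdf")
        render_batch(rows, part_path)
        part_paths.append(part_path)
        offset += BATCH_SIZE
    conn.close()

    # Merge the per-batch PDFs into the final report.
    merger = PdfWriter()
    for path in part_paths:
        merger.append(path)
    with open(out_path, "wb") as fh:
        merger.write(fh)
    merger.close()
    shutil.rmtree(tmp_dir)  # clean up the temporary per-batch files


if __name__ == "__main__":
    build_report("reports.db", "full_report.pdf")
```

Since generation is still slow for very large data sets, you would typically run `build_report` in a background job and notify the user (or let them download the file) when it finishes, rather than generating it inside the request/response cycle.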