Jaspersoft Java benchmarks

Hello everyone, I'm using JasperReports to generate PDF reports, and the time to fill some big reports is far too long.

I'm running the Java service on 8 CPUs and 10 GB of memory on Linux. I'm using the Jaxen XPath executer and a JRSwapFileVirtualizer. All reports are first dumped to an XML file, and I then feed this file to the JasperReports engine. The data extraction and the dump to XML are fast enough.
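For reference, a setup like the one described above might look roughly like this. This is a minimal sketch using the JasperReports API; the paths, XPath expression, and virtualizer tuning values (page count, block size) are illustrative assumptions, not the poster's actual configuration:

```java
import java.util.HashMap;
import java.util.Map;

import net.sf.jasperreports.engine.JRParameter;
import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.data.JRXmlDataSource;
import net.sf.jasperreports.engine.fill.JRSwapFileVirtualizer;
import net.sf.jasperreports.engine.util.JRSwapFile;

public class FillLargeXmlReport {
    public static void main(String[] args) throws Exception {
        // Keep at most 300 report pages in memory; swap the rest to /tmp
        // in 4 KB blocks, growing the swap file 100 blocks at a time.
        JRSwapFile swapFile = new JRSwapFile("/tmp", 4096, 100);
        JRSwapFileVirtualizer virtualizer =
                new JRSwapFileVirtualizer(300, swapFile, true);

        Map<String, Object> params = new HashMap<>();
        params.put(JRParameter.REPORT_VIRTUALIZER, virtualizer);

        // Second argument: XPath expression selecting the record nodes
        // (hypothetical structure for this example).
        JRXmlDataSource dataSource =
                new JRXmlDataSource("/data/report.xml", "/report/row");

        try {
            JasperPrint print =
                    JasperFillManager.fillReport("report.jasper", params, dataSource);
            virtualizer.setReadOnly(true); // pages are final; no more swapping needed
            JasperExportManager.exportReportToPdfFile(print, "/data/report.pdf");
        } finally {
            virtualizer.cleanup(); // delete the swap files
        }
    }
}
```

The virtualizer's page limit and swap-block size both affect fill speed for large inputs, so they are worth experimenting with alongside the heap settings.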

Now, I'm processing very large XML files, ranging from 1 MB up to 800-900 MB. For reports under 20 MB the fill time is measured in seconds, but a 200 MB XML file, for example, takes 30 minutes to fill (fill only; I'm excluding the time to compile and the rest).

At this point, I'm not sure whether I'm doing something wrong or this is the expected performance. My question is, with the little information I've given (I can provide more if needed): is it normal for a 200 MB file to take 30 minutes to generate a PDF?


Thank you





I ran some more tests. With a 100 MB XML file, the fill time is 1:33 minutes. With a 200 MB XML file, it goes up to nearly 30 minutes.

— luca.chr.dotti

Thank you for posting to the Jaspersoft Community. Our team of experts has read your question and we are working to get you an answer as quickly as we can. If you have a Jaspersoft Professional Subscription plan, please visit https://support.tibco.com/s/ for direct access to our technical support teams offering guaranteed response times.

arai_4 - 3 months 3 weeks ago

1 Answer:

Hi Luca, 

Whether it's normal or not depends on many factors (resources such as RAM and CPUs, JVM parameters, errors, memory leaks, tuning, and so on). So far we have no known performance issues with large XML files, so it seems to me you can fine-tune your environment to lower the response time.

The best way to start is to check your logs: are there any errors? If you can, try increasing the RAM. As you stated, the bigger the file, the longer the response time, so I would suspect the time the garbage collector spends freeing up memory is impacting your report generation time.
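Heap size and collector choice are set when the service is launched. The values below are purely illustrative (assuming the 10 GB box mentioned in the question, leaving headroom for the OS) and need tuning for the actual workload; the service name is hypothetical:

```shell
# Hypothetical launch line: fixed 8 GB heap, G1 collector to keep
# pauses short, and GC logging (JDK 9+) to confirm whether collection
# time actually dominates the 30-minute fills.
java -Xms8g -Xmx8g \
     -XX:+UseG1GC \
     -Xlog:gc*:file=/var/log/reports/gc.log \
     -jar report-service.jar
```

If the GC log shows long or back-to-back full collections during a 200 MB fill, that would support the memory-pressure theory; if not, profiling (below) is the next step.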

Also, please use the latest JasperReports Server version (generally better performing than previous ones) and the latest JDK, with the parameters recommended in our install guide.

You can take the investigation further by hooking JasperReports Server up to VisualVM or another profiling tool, and checking how much memory your report takes and which Java methods consume the most time. A poor implementation of report components can sometimes drive up memory consumption, especially for crosstabs.

Please see here for how to investigate performance issues and fine-tune your setup:


— bdraifi