Jasper + XML + Huge amount of data


Hello everyone, I'm using JasperReports to generate my reports. These reports are huge: the XML data alone is about 1 GB. At the moment I can't get the memory consumption of my application under control when generating such large reports, even with the JRSwapFileVirtualizer enabled.

This is how it works:

1. Extract the data from the db

2. Generate the XML directly into the file system. For that I'm using the following code:

private Path asXml(Report report, Map<String, Parameter> parameters) throws Exception {
    Marshaller m = context.createMarshaller();
    m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);

    Path path = fileService.generateOutputDataFile(parameters);
    // Stream the marshalled XML straight to disk; try-with-resources ensures the stream
    // is closed even if marshalling fails
    try (OutputStream os = new BufferedOutputStream(new FileOutputStream(path.toFile()))) {
        m.marshal(report, os);
    }
    return path;
}
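
The returned path is the sourceXml that step 4 later streams back in; roughly like this (an illustrative call site, with the variable name chosen to match the snippet in step 4):

// Marshal the whole data set to disk first; afterwards only the file path is kept around
Path sourceXml = asXml(report, parameters);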

3. Compile the report from the template file and write the compiled report directly to the file system:

JasperCompileManager
  .getInstance(jasperReportsContext)
  .compileToFile(templatePath, outputPath);

4. Fill the report. A BufferedInputStream over the XML file is passed in as a parameter and used as the data source for the fill:

// Stream the XML from disk and hand it to the XPath query executer as the report's data source
InputStream is = new BufferedInputStream(new FileInputStream(sourceXml.toFile()));
params.put(JRXPathQueryExecuterFactory.XML_INPUT_STREAM, is);

// Swap file and virtualizer so that filled report pages can be paged out to disk
JRSwapFile sf = new JRSwapFile(jasperSwapFileDirectory, 512, 256);
JRSwapFileVirtualizer swapFileVirtualizer = new JRSwapFileVirtualizer(500, sf, true);

JasperFillManager
  .getInstance(jasperReportsContext)
  .fillToFile(compiledReportPath, params);

swapFileVirtualizer.cleanup();
is.close();
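
For completeness, the virtualizer is handed to the fill as a report parameter before fillToFile runs; a minimal sketch of that wiring, reusing the variables from the snippet above (JRParameter.REPORT_VIRTUALIZER is the standard JasperReports constant for this):

// Register the virtualizer so the fill can swap completed pages out to the JRSwapFile
params.put(JRParameter.REPORT_VIRTUALIZER, swapFileVirtualizer);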

Everything works fine until step 4. I've tried the virtualizer with several different settings, but I still run out of memory (GC overhead limit exceeded). Could it be that Jasper is reading the whole XML document into memory?

Thank you

0 Answers:

No answers yet