A system stores large (>50 MB) XML files as LOBs in DB2 V8 (yes, V8 went out of support in April 2009; long live V8!). After a file is stored, a message is pushed to a queue we listen on. We read the XML identified by the message and update our database.

The issue is that when the transaction rate is high, we end up reading many large XML documents into memory, and the parsing and processing that follows is memory intensive. How can we keep working with little memory? One solution is to read the LOB columns as streams and use a streaming parser (SAX or StAX).

Statement stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery(query);
while (rs.next()) {
  // getCharacterStream returns a java.io.Reader for the CLOB column
  Reader reader = rs.getCharacterStream("xml_clob_col_name");
  // read from the stream; use a StAX parser for further processing
}
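The "further processing" step can be sketched with StAX, which pulls events off the Reader without ever holding the whole document in memory. This is a minimal sketch; the element-counting logic and the sample XML are illustrative, and in the real code the Reader would come from rs.getCharacterStream(...):

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.Reader;
import java.io.StringReader;

public class StaxSketch {

    // Walk the document event by event; only the current event is in memory.
    static int countElements(Reader reader) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader xml = factory.createXMLStreamReader(reader);
        int count = 0;
        while (xml.hasNext()) {
            if (xml.next() == XMLStreamConstants.START_ELEMENT) {
                count++;
            }
        }
        xml.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the CLOB stream from the ResultSet
        Reader r = new StringReader("<orders><order/><order/></orders>");
        System.out.println(countElements(r)); // prints 3
    }
}
```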

Everything looks fine so far, but memory usage doesn't seem to drop at all.

It turns out that DB2 V8 fully materializes LOB data by default. To read a LOB as a stream, the connection must be configured for it: when the connection is created, set the driver property fullyMaterializeLobData=false. There are various ways to do this, depending on how the connection is made.

1: Connection URL
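With the IBM DB2 JDBC driver, properties can be appended to the URL after the database name using its `:property=value;` syntax (host, port, and database name here are illustrative):

```
jdbc:db2://localhost:50000/testDb:fullyMaterializeLobData=false;
```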


2: Connection Properties

Properties props = new Properties();
props.put("user", userName);
props.put("password", password);
props.put("fullyMaterializeLobData", "false");
Connection conn = DriverManager.getConnection("jdbc:db2://localhost:50000/testDb", props);

3: Using Hibernate

hibernate.connection.fullyMaterializeLobData = false
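Hibernate passes any hibernate.connection.* property through to the JDBC driver, so if the configuration lives in hibernate.cfg.xml instead of a properties file, the same setting can be expressed as a property element (a sketch of the equivalent XML form):

```xml
<property name="hibernate.connection.fullyMaterializeLobData">false</property>
```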