[jdom-interest] Memory usage when processing a large file

Jon Baer jonbaer at digitalanywhere.com
Sun Oct 7 21:17:26 PDT 2001

I asked the exact same question a long time back and never really got a response. What I was told eventually was that if I was parsing and building a DOM structure of any kind from something like a 7MB file, I'd be better off using a database and skipping the whole XML process. That got me going on a separate project that basically uses a database (HypersonicSQL) as almost a scratch disk for building large documents that need later direct manipulation (via XPath); something, more or less, that would take XPath and turn it into a temporary SQL solution. I have not gotten far with it, but for the chatbot application I'm writing it's eventually going to need to be done.

I'd be interested in hearing any other solutions out there.
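For what it's worth, one such solution is to skip building a tree at all and stream with plain SAX, processing each record as it arrives so memory stays roughly constant regardless of feed size. Below is a minimal sketch using only the JDK's built-in parser; the `<record>` element name and the inline feed are made up for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class StreamingCount {

    // Count <record> elements without ever holding the whole document in memory.
    static int countRecords(InputStream in) throws Exception {
        final int[] count = {0};
        SAXParserFactory.newInstance().newSAXParser().parse(in, new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName, Attributes atts) {
                if ("record".equals(qName)) {
                    count[0]++;  // process/load this record here, then let it be collected
                }
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the 7MB feed.
        String feed = "<feed><record id=\"1\"/><record id=\"2\"/><record id=\"3\"/></feed>";
        int n = countRecords(new ByteArrayInputStream(feed.getBytes("UTF-8")));
        System.out.println("records: " + n);  // prints "records: 3"
    }
}
```

The trade-off is that you lose random access (and hence easy XPath), which is exactly what pushes people toward the database-as-scratch-disk approach above when later queries are needed.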

- Jon

Benjamin Kopic wrote:

> Hi
> We have an application that processes a data feed and loads it into a
> database. It builds a JDom Document using SAXBuilder and Xerces, and then
> uses Jaxen XPath to retrieve the data it needs.
> The problem is that when we parse a 7MB feed, Java's memory usage jumps
> to 110MB. Has anyone else used JDom to process relatively large data
> feeds?
> Best regards
> Benjamin Kopic
> E: ben at kopic.org
> W: www.kopic.org
> T: +44 (0)20 7794 3090
> M: +44 (0)78 0154 7643
> _______________________________________________
> To control your jdom-interest membership:
> http://lists.denveronline.net/mailman/options/jdom-interest/youraddr@yourhost.com
