FW: [jdom-interest] parse large xml file using jdom

New, Cecil (GEAE) cecil.new at ae.ge.com
Tue Jun 10 04:44:35 PDT 2003


If your process can tolerate detaching, perhaps you should consider using SAX.
It takes more work, but it gives you the fastest and most memory-efficient approach possible.
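
Something along these lines is all it takes (a rough sketch only -- the file
name and the <record> element are placeholders, not anything specific to your
data):

import java.io.File;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class BigFileSax {
    public static void main(String[] args) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new File("big.xml"), new DefaultHandler() {
            private final StringBuilder text = new StringBuilder();

            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                text.setLength(0);            // start collecting a fresh element
            }

            public void characters(char[] ch, int start, int length) {
                text.append(ch, start, length);
            }

            public void endElement(String uri, String local, String qName) {
                if ("record".equals(qName)) {  // handle one unit, then forget it
                    System.out.println("record: " + text);
                }
            }
        });
    }
}

Only the text of the element currently being parsed is ever held in memory,
which is why this scales to files of essentially any size.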

-----Original Message-----
From: Eric Chi-hsuan Lai [mailto:lai at physics.utexas.edu]
Sent: Monday, June 09, 2003 5:43 PM
To: jdom-interest at jdom.org
Subject: [jdom-interest] parse large xml file using jdom


Hi,
	I am working on a project that has to process large XML files, i.e. 
50M-100M in size. The last time I tried to use JDOM to parse one of them, it 
ran out of memory. I am wondering if the new beta 9 has any new facilities or 
features that can handle such big XML files. I saw in the dom4j FAQ that it 
has a feature that lets you "detach" a node/element after you are done with 
it. I hope JDOM has something similar.
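
The dom4j pattern I mean looks roughly like this (just a sketch -- the
element path and file name are made up, not from my actual data):

import java.io.File;
import org.dom4j.Element;
import org.dom4j.ElementHandler;
import org.dom4j.ElementPath;
import org.dom4j.io.SAXReader;

public class Dom4jPrune {
    public static void main(String[] args) throws Exception {
        SAXReader reader = new SAXReader();
        reader.addHandler("/records/record", new ElementHandler() {
            public void onStart(ElementPath path) {
                // nothing needed when the element opens
            }
            public void onEnd(ElementPath path) {
                Element record = path.getCurrent();
                // ... process the fully built <record> element here ...
                record.detach();              // prune it so the tree stays small
            }
        });
        reader.read(new File("big.xml"));     // the returned Document stays mostly empty
    }
}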

Eric
