large xml database - Mailing list pgsql-sql

From Viktor Bojović
Subject large xml database
Date
Msg-id AANLkTi=ZEO1W=FOxdH+S3twj_A1VCpyyzes7HFghJ7T-@mail.gmail.com
Responses Re: large xml database  (Andreas Joseph Krogh <andreak@officenet.no>)
Re: large xml database  (James Cloos <cloos@jhcloos.com>)
Re: large xml database  (Lutz Steinborn <l.steinborn@4c-ag.de>)
List pgsql-sql
Hi,

I have a very big XML document, larger than 50 GB, and I want to import it into the database and transform it to a relational schema. When splitting this document into smaller independent XML documents, I get ~11.1 million XML documents.

I have spent a lot of time trying to find the fastest way to transform all this data, but every time I give up because it takes too much time; it would sometimes take more than a month if not stopped. I have tried inserting each line as varchar into the database and parsing it with plperl regexes. I have also tried storing every document as XML and parsing it, but that is too slow as well. I have tried storing every document as varchar, but extracting the data with regexes is also slow.

Many attempts have failed because 8 GB of RAM and 10 GB of swap were not enough. Also, I sometimes get an error that more than 2^32 operations were performed, and the functions stop working.

I just wanted to ask if someone knows how to speed this up.

Thanks in advance

--
---------------------------------------
Viktor Bojović
---------------------------------------
Wherever I go, Murphy goes with me
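The post describes loading a 50 GB XML file whole (or line by line) and parsing with regexes, which is what exhausts RAM and swap. A common alternative is to stream-parse the file so memory stays flat, emitting flat rows that can be bulk-loaded (e.g. via PostgreSQL's COPY). A minimal sketch in Python; the element tag "entry" and the field names "id" and "name" are assumptions for illustration, not from the original schema:

```python
# Sketch: stream a huge XML file with ElementTree.iterparse so only one
# record's subtree is ever held in memory, yielding flat rows suitable
# for a tab-separated COPY feed. Tag and field names are hypothetical.
import sys
import xml.etree.ElementTree as ET

def stream_records(path, tag="entry"):
    """Yield (id, name) tuples per <entry> element without loading the whole file."""
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            yield elem.findtext("id"), elem.findtext("name")
            elem.clear()  # release the finished subtree so memory does not grow

if __name__ == "__main__":
    # Print TSV on stdout, e.g.: python stream.py big.xml | psql -c "COPY t FROM STDIN"
    for rec in stream_records(sys.argv[1]):
        print("\t".join(v or "" for v in rec))
```

Because each subtree is cleared after its `end` event, memory use is bounded by the size of a single record rather than the whole document, which sidesteps the 8 GB RAM limit described above.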
