how to handle very large data object efficiently - Mailing list pgsql-jdbc

From: Xiaoyu
Subject: how to handle very large data object efficiently
Date:
Msg-id: 1185465272.247919.148280@k79g2000hse.googlegroups.com
Responses: Re: how to handle very large data object efficiently (Mark Lewis <mark.lewis@mir3.com>)
List: pgsql-jdbc
Hi, folks:
I am new to JDBC programming, so please forgive me if this question is naive.
Question description:
1. Store the data from a text file into the database. The format of the text
file is similar to:
......
218596813 235940555 4387359 3 386658 4000 4 4
218597197 235940333 4366832 17 388842 5000 5 5
218597485 235940805 4374620 8 386226 4000 4 4
......
Each line of the file corresponds to one record in the database, and each
number on a line corresponds to one column of that record.

2. The files are huge: there are normally about 9,000,000 lines in each
file. My current program reads the file line by line, parses each line, and
inserts the values into the database. However, because each file is so
large, it can take days to load a single file, and there are 50 similar
files that need to be processed.
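
A minimal sketch of that line-by-line approach (the table name, column
layout, connection details, and file name below are placeholders, not my
real code):

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class LineByLineLoader {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO records VALUES (?, ?, ?, ?, ?, ?, ?, ?)");

        BufferedReader in = new BufferedReader(new FileReader("data.txt"));
        String line;
        while ((line = in.readLine()) != null) {
            // Each line holds whitespace-separated numbers, one per column.
            String[] fields = line.trim().split("\\s+");
            for (int i = 0; i < fields.length; i++) {
                ps.setLong(i + 1, Long.parseLong(fields[i]));
            }
            ps.executeUpdate();  // one INSERT and one round trip per line
        }
        in.close();
        ps.close();
        conn.close();
    }
}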

3. Can anyone suggest a better approach than reading and inserting line by
line? Is there a way in JDBC to handle a block of data (several lines at a
time) in a single operation? Many thanks.
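
In case it helps make the question concrete, here is a rough sketch of the
kind of block-level handling I am imagining, using the standard JDBC
batching calls PreparedStatement.addBatch() and executeBatch(); the table
name, column count, and batch size of 10,000 are just illustrative
assumptions:

import java.io.BufferedReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class BatchedLoader {
    // Queue rows locally and send them to the server in groups,
    // instead of issuing one INSERT per line.
    static void load(Connection conn, BufferedReader in) throws Exception {
        conn.setAutoCommit(false);  // commit per batch, not per row
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO records VALUES (?, ?, ?, ?, ?, ?, ?, ?)");
        int count = 0;
        String line;
        while ((line = in.readLine()) != null) {
            String[] fields = line.trim().split("\\s+");
            for (int i = 0; i < fields.length; i++) {
                ps.setLong(i + 1, Long.parseLong(fields[i]));
            }
            ps.addBatch();              // add the row to the current batch
            if (++count % 10000 == 0) {
                ps.executeBatch();      // send 10,000 rows in one call
                conn.commit();
            }
        }
        ps.executeBatch();              // flush any remaining rows
        conn.commit();
        ps.close();
    }
}

I do not know whether this is actually the right approach with the
PostgreSQL driver, which is exactly what I am asking about.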

Xiaoyu

