Re: Inserting 'large' amounts of data - Mailing list pgsql-jdbc

From dmp
Subject Re: Inserting 'large' amounts of data
Msg-id 4A9570CA.1050505@ttc-cmc.net
In response to Inserting 'large' amounts of data  (Mario Splivalo <mario.splivalo@megafon.hr>)
List pgsql-jdbc
>
>
>I have a web application which allows users to upload a lot of phone
>numbers. I need to store those numbers in a database. Usually, one would
>upload around 70k-100k records, totaling around 2 MB in size.
>
>I'm using tomcat as an application server, and JDBC to connect to pg8.3
>database.
>
>I will have around 20-50 concurrent users in peak hours, and even that is
>probably an overestimate.
>
>I could create a temporary file on the filesystem where the database
>cluster is located and then execute COPY mytable FROM
>'/tmp/upload-data/uuidofsomesort.csv' WITH CSV, but the 'problem' is
>that the database server and tomcat reside on different physical machines.
>
>What would one recommend as the best way to insert this data?
>
>    Mario
>
Hello Mario,

If the users already have the data in CSV format, why not let them import
it through the app server? The connection can be made across machines if
set up properly.

http://dandymadeproductions.com/projects/MyJSQLView/docs/javadocs/index.html
CSVDataImportThread.java

This class could be used as a basis, with some work. It has a bug with
data that contains semicolons and could be made more robust, but it would
be a start.
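For the plain-JDBC route through the app server, something along these
lines could serve as a starting point. This is only a minimal sketch: the
connection URL, the phone_numbers(number) table, and the assumption of one
number per line of the uploaded file are all placeholders to adjust for
your schema.

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PhoneNumberBatchImport {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust for the real environment.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://dbhost/mydb", "user", "password");
        conn.setAutoCommit(false);

        // Assumes a table phone_numbers(number text) and one number per line.
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO phone_numbers (number) VALUES (?)");

        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        String line;
        int count = 0;
        while ((line = in.readLine()) != null) {
            ps.setString(1, line.trim());
            ps.addBatch();
            // Flush every few thousand rows so the batch stays small.
            if (++count % 5000 == 0) {
                ps.executeBatch();
            }
        }
        ps.executeBatch();
        conn.commit();

        in.close();
        ps.close();
        conn.close();
    }
}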

John wrote:

> I believe you can use org.postgresql.copy.CopyIn() ...  there are
> variants that use a writeToCopy() call to send the data, or a
> java.io.InputStream, or a java.io.Reader ...

This sounds a lot cleaner.
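A minimal sketch of that approach, assuming a driver recent enough to
expose the copy API (connection details and the phone_numbers table are
again placeholders):

import java.io.FileReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class PhoneNumberCopyImport {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust for the real environment.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://dbhost/mydb", "user", "password");

        // The copy API lives on the driver's PGConnection; with a pooled
        // connection you may need to unwrap it first.
        CopyManager copy = ((PGConnection) conn).getCopyAPI();

        // Stream the uploaded CSV straight to the server; the file never
        // has to exist on the database machine's filesystem.
        Reader reader = new FileReader(args[0]);
        long rows = copy.copyIn(
                "COPY phone_numbers (number) FROM STDIN WITH CSV", reader);
        System.out.println("Copied " + rows + " rows");

        reader.close();
        conn.close();
    }
}

Unlike the batched INSERTs, this avoids per-row statement overhead, so it
should be noticeably faster for 70k-100k rows.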

danap.
