Thread: Importing large data files into PostgreSQL

Importing large data files into PostgreSQL

From
"Elizabeth O'Neill"
Date:
Hi, I hope someone can help; I'm new to PostgreSQL. I have been trying to
load data into a table using the "\copy" command. All the records
(approx. 15,000) go into the table, but it doesn't seem to
recognise the delimiters, because the data is randomly split across the
columns in the table.

I have tried changing the size of the columns, making sure the
delimiters are correct, changing the delimiter from "<TAB>" to ",",
inserting a small section of the data, and changing the
data types of the columns from 'varchar' to 'char', but none of this
makes a difference.

Does anyone know what may be wrong, or a better way to import large
chunks of data?

Please help before I go bald from pulling my hair out!

Thank you in advance

Liz



Re: Importing large data files into PostgreSQL

From
"Nick Fankhauser"
Date:
Liz-

Without looking at a chunk of the data file it is hard to tell what may be
wrong, but here are a couple of ideas:

If you haven't already looked here, look at this page in the docs:
http://www.postgresql.org/idocs/index.php?sql-copy.html



Try manually inserting a few rows into your table, and then do a dump:
pg_dump database_name >test.out

Then take a look at test.out. After the object creation sections, it will
have a section where it uses the COPY command to insert the rows, and you
can use this to see what sort of format COPY expects.
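Another quick way to narrow this kind of problem down is to count the delimiter-separated fields on each line of the input file; any line whose count differs from the table's column count is where the import gets out of step. Here is a rough sketch in Python (the helper name and arguments are made up for illustration):

```python
# Sketch: report lines in a delimited file whose field count doesn't
# match the expected number of table columns. A mismatch usually means
# stray delimiter characters (e.g. literal tabs) embedded in the data.

def check_field_counts(path, expected, delimiter="\t"):
    """Return (line_number, field_count) pairs for lines that don't
    split into the expected number of fields."""
    bad = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            count = len(line.rstrip("\n").split(delimiter))
            if count != expected:
                bad.append((lineno, count))
    return bad
```

For example, check_field_counts("data.txt", 5) for a five-column table; a line reported with too many fields usually has a stray delimiter embedded in the data itself.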

-Nick



> -----Original Message-----
> From: pgsql-admin-owner@postgresql.org
> [mailto:pgsql-admin-owner@postgresql.org]On Behalf Of Elizabeth O'Neill
> Sent: Wednesday, November 14, 2001 1:38 PM
> To: pgsql-admin@postgresql.org
> Subject: [ADMIN] Importing large data files into PostgreSQL
>
>
> Hi, I hope someone can help; I'm new to PostgreSQL. I have been trying to
> load data into a table using the "\copy" command. All the records
> (approx. 15,000) go into the table, but it doesn't seem to
> recognise the delimiters, because the data is randomly split across the
> columns in the table.
>
> I have tried changing the size of the columns, making sure the
> delimiters are correct, changing the delimiter from "<TAB>" to ",",
> inserting a small section of the data, and changing the
> data types of the columns from 'varchar' to 'char', but none of this
> makes a difference.
>
> Does anyone know what may be wrong, or a better way to import large
> chunks of data?
>
> Please help before I go bald from pulling my hair out!
>
> Thank you in advance
>
> Liz
>
>
>
> ---------------------------(end of broadcast)---------------------------
> TIP 1: subscribe and unsubscribe commands go to majordomo@postgresql.org
>


Re: Importing large data files into PostgreSQL

From
Luis Amigo
Date:
We're copying data without a problem using this scheme:
"col1\tcol2\tcol3\tcol4\ncol1\t and so on..." from a file created by a C
function.
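The same scheme can be sketched in Python rather than C; the escaping is the part that matters, since an unescaped tab or newline inside a field is exactly what makes COPY split data across the wrong columns. This assumes COPY's default text format (tab delimiter, \N for NULL, backslash escapes); encode_field and write_copy_file are made-up helper names:

```python
# Sketch: write rows as a tab-delimited file suitable for psql's \copy,
# escaping the characters that would otherwise break the COPY text format.

def encode_field(value):
    """Encode one field for COPY's default text format."""
    if value is None:
        return r"\N"                 # default NULL marker
    s = str(value)
    s = s.replace("\\", "\\\\")      # escape backslash first
    s = s.replace("\t", "\\t")       # a literal tab would split the column
    s = s.replace("\n", "\\n")       # a literal newline would end the row
    s = s.replace("\r", "\\r")
    return s

def write_copy_file(path, rows):
    """Write an iterable of tuples as tab-delimited COPY input."""
    with open(path, "w") as f:
        for row in rows:
            f.write("\t".join(encode_field(v) for v in row) + "\n")
```

A file written this way can then be loaded with \copy from inside psql.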