RE: Converting Access tables to PostGre - Mailing list pgsql-interfaces

From Dave Page
Subject RE: Converting Access tables to PostGre
Date
Msg-id 8568FC767B4AD311AC33006097BCD3D61A27B1@woody.vale-housing.co.uk
Whole thread Raw
In response to Converting Access tables to PostGre  ("Richard Holland" <rholland@equitechinfo.com>)
List pgsql-interfaces

> -----Original Message-----
> From: Richard Holland [mailto:rholland@equitechinfo.com]
> Sent: 05 December 2000 14:41
> To: pgsql-interfaces@postgresql.org
> Subject: [INTERFACES] Converting Access tables to PostGre
> 
> 
> I've come across tons of information on using Access with 
> PostGre, and am thankful for this.  Be this as it may, the 
> upsizing tool I
> found to convert our access databases to postgre seems to 
> puke on our 800+ meg databases (which is why we're doing this, our
> software was created by idiots with no scalability in mind).  
> I downloaded pgadmin, the latest version as of a week ago.  This works
> great, congrats to its maintainers.  

Thanks.

> My problem is the same 
> above mentioned idiots used double and single field types to handle
> money.  Being that our software handles property tax for 
> government counties, we can't have any rounding errors.  So I 
> figured on
> using the numeric(x,y) type, so I fire up the migration tool 
> built into pgadmin, edit the typemaps to throw singles and doubles to
> numeric, and it fails because it doesn't seem to give the 
> numerics a length.  It feeds postgre: fieldname
> numeric(0,lengthofdecimalsspecifiedinaccess)

pgAdmin attempts to figure out the scale/precision of the numeric from the
source database using the NumericScale and Precision properties of the
ADOX.Field object representing the source field (if I'm using the wrong
properties here, corrections would be appreciated...). As the source database
has floats/longs in it (and therefore no (x,y) values), you will get odd
results (e.g. zeros).
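As a hedged illustration of the workaround a migration tool might use, the helper below substitutes an explicit default precision/scale whenever the source reports zeros. This is a hypothetical sketch in Python, not pgAdmin's actual code; the function name and the defaults are illustrative:

```python
# Hypothetical sketch (not pgAdmin's actual code): map a source field's
# reported precision/scale to a PostgreSQL numeric(p,s) type string,
# falling back to a safe default when the driver reports zeros for
# float/double source columns.
def numeric_type(precision: int, scale: int,
                 default_precision: int = 18, default_scale: int = 4) -> str:
    # Float/double source columns carry no (x,y) values, so the driver
    # reports Precision/NumericScale as 0; substitute an explicit default
    # instead of emitting an unusable numeric(0,n).
    if precision <= 0:
        precision, scale = default_precision, default_scale
    return f"numeric({precision},{scale})"

print(numeric_type(10, 2))  # source reported a real precision/scale
print(numeric_type(0, 5))   # float/double source: fall back to the default
```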

> This fails...  Anybody have a better solution for migrating 
> our data?  I can make all 300 tables by hand if need be, but 
> it's getting
> the millions of records in that I'm worried about.

I suspect that pgAdmin would fail with a migration this large as well, as it
creates a recordset from the source table and uses that to fire INSERT
queries at PostgreSQL; in my experience ADO recordsets become unusable around
the 100,000 record mark.

I would suggest a different tactic along the lines of:

1) Migrate the table/index definitions 'as-is' using pgAdmin's migration
wizard.
2) Dump the PostgreSQL database using pg_dump and use vi or something to
change the floats to numerics.
3) Reload the database.
4) Write some Access/VB code to dump the source databases to ASCII text
files, using a quote and a delimiter, e.g.:

'fld1'|'fld2'|'fld3'|

Hint: if you have ' in your data, replace it with '' (or don't use ' as the
quote character).

5) Use the pgAdmin Import Wizard to load the ASCII files.
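Step 4 above can be sketched as follows. This is a minimal Python illustration, not the Access/VB code the steps call for: the sample rows and file name are placeholders, and real code would pull rows from the source database via DAO/ADO.

```python
# Minimal sketch of step 4: write rows out as quoted, pipe-delimited ASCII,
# doubling any embedded quote character per the hint above. The row data
# here is illustrative; in practice it would come from Access via DAO/ADO.
QUOTE = "'"
DELIM = "|"

def format_row(fields) -> str:
    out = []
    for value in fields:
        text = "" if value is None else str(value)
        # Escape the quote character by doubling it.
        text = text.replace(QUOTE, QUOTE * 2)
        out.append(QUOTE + text + QUOTE + DELIM)
    return "".join(out)

rows = [("fld1", "fld2", "fld3"),
        ("O'Brien", 1234.50, None)]

with open("table_dump.txt", "w") as f:  # placeholder file name
    for row in rows:
        f.write(format_row(row) + "\n")
```

The first sample row produces exactly the line shown in the example above: 'fld1'|'fld2'|'fld3'|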

I'd be interested to know how pgAdmin copes with such large text files if
you try this method...

HTH, Regards,
Dave.

