Hey guys,
I was given a database backup of a non-PostgreSQL database. That database contains records where binary files (they look
like email attachments) were split into chunks of X characters and then stored across multiple
records. A messy way of storing BLOB data. The database encoding is LATIN1 (ISO-8859-1).
These chunks are actually 50 fields of 60 bytes each per row. If the original file is larger than that, more than one
row is used.
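To give a clearer picture, the table I'm loading into looks roughly like this (column names are my own placeholders, and I'm only showing a few of the 50 chunk columns):

  CREATE TABLE attachments (
      att_id    integer,   -- which attachment the row belongs to
      seq_no    integer,   -- order of the rows for one attachment
      file_name text,
      chunk_01  text,      -- 60-byte slice of the original file
      chunk_02  text,
      chunk_03  text       -- ... and so on, up to chunk_50
  );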
I can export the data out of that database into flat files just fine, but when I try to import the data into Postgres,
I'm getting errors like this:
ERROR: invalid byte sequence for encoding "SQL_ASCII": 0x00
CONTEXT: COPY attachments, line 14: "58025 1 cl\Cert.r 10
M04P'15A415).($-H87)4:6UE+$-(05)!0U1%4BQ)3E!55"!I5&EM92!)3E1% M1T52'$585$523B!7..."
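For reference, the COPY I'm running is roughly this (the path is made up; the real file is the flat-file export mentioned above, loaded with the default tab delimiter):

  COPY attachments FROM '/tmp/attachments_export.dat';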
I tried LATIN1, SQL_ASCII, and UTF-8; nothing works. I even tried making the data type 'bytea', no luck. I'd love to have
a "NO-CONVERSION" option on the COPY command that just takes whatever bytes come along and doesn't try to interpret
them.
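The bytea attempt was roughly this (again just a sketch with made-up names; the real table has all 50 chunk columns):

  CREATE TABLE attachments_bytea (
      att_id    integer,
      seq_no    integer,
      file_name text,
      chunk_01  bytea      -- same idea for the remaining chunk columns
  );
  COPY attachments_bytea FROM '/tmp/attachments_export.dat';
  -- still no luck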
Any ideas of what I can do to import this stuff?
best regards,
chris
--
chris ruprecht
database grunt and bit pusher extraordinaíre