>>>> Or go via MS-Access/Perl and ODBC/DBI perhaps?
>>>
>>> Yes, I think it would work. The problem is that the DB is too big for
>>> this kind of export. Using DTS from MSSQL to export directly to
>>> PostgreSQL using the psqlODBC Unicode driver, I exported ~1000 rows per
>>> second in a 2-column table with ~20M rows. That means several days just
>>> for this table, and I have bigger ones!
>>
>> Well, it's about 0.25 days (20M rows at ~1000 rows/second is ~20,000
>> seconds, roughly 5.5 hours), but if it's too long, it's too long.
>
> Sure, sorry for the confusion; the problem is with the other tables (the
> same number of rows but a lot more columns, some very large).
>
Well, if it's too slow, then you will have to dump the DB to a text file (DTS
does this for you) and then convert the text file to UTF-8 manually before
importing it into pgsql. iconv for Win32 will help you there. I found, though,
that it removes some wanted special characters, so watch out.
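
If iconv mangles things, a small script can do the same re-encoding. A rough
Python sketch, streaming so a multi-GB dump fits in memory; the file names and
the cp1252 source encoding are just assumptions, adjust them to whatever DTS
actually produced:

# rough sketch: re-encode a DTS text dump to UTF-8, line by line
# (assumes the source is cp1252; adjust SRC_ENCODING and the file names)
SRC_ENCODING = "cp1252"

with open("dump.txt", "r", encoding=SRC_ENCODING, newline="") as src, \
     open("dump_utf8.txt", "w", encoding="utf-8", newline="") as dst:
    for line in src:          # streaming keeps memory use flat on huge dumps
        dst.write(line)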
a less "scientific" approach would be using an unicode-aware texteditor to
convert it (ultraedit does this pretty nicely, for example). have had good
results with it.
Loading several million rows will always take some time, though.
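
If you do go the text-dump route, loading the converted file with COPY rather
than row-by-row inserts over ODBC is what keeps the import time reasonable.
A rough sketch using Python and psycopg2; the table name, file name, and
connection parameters are made up, and the COPY options assume the default
tab-delimited text format:

# rough sketch, adjust names and COPY options to your dump
import psycopg2

conn = psycopg2.connect(dbname="mydb", user="postgres", host="localhost")
conn.set_client_encoding("UTF8")        # the dump is UTF-8 after conversion
cur = conn.cursor()

with open("dump_utf8.txt", "r", encoding="utf-8") as f:
    # default COPY text format expects tab-delimited columns and \N for NULLs
    cur.copy_expert("COPY big_table FROM STDIN", f)

conn.commit()
conn.close()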
- thomas