On 2022-11-07 14:40:40 -0600, Ron wrote:
> On 11/7/22 10:57, Вадим Самохин wrote:
> I have an application that must copy a local file in csv format to a
> postgres table on a remote host. The closest solution is this one (https://
> stackoverflow.com/a/9327519/618020). It boils down to specifying a \copy
> meta-command in a psql command:
>
> psql -U %s -p %s -d %s -f - <<EOT\n here goes a \copy meta-command \nEOT\n
>
>
> and executing it. But it's quite an unnatural way to write database code.
> Has anything changed in the last ten years? Or is there a better way to
> copy file contents into a remote database?
>
>
> I'd write a small Python script, using the csv module to read the data and
> psycopg2 to load it.
If you use INSERT statements, it will be significantly slower (which may
not matter for small files or one-off actions). If you use copy_from(),
you don't have to parse the CSV in Python (but then why use Python at all?)
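To make the comparison concrete, here is a minimal sketch of both approaches with psycopg2. The DSN, file path, and table name are placeholders; copy_expert() with FORMAT csv lets the server parse the CSV, while the executemany() variant parses it client-side with the csv module:

```python
import csv

def copy_sql(table):
    # Build the server-side COPY statement; FORMAT csv makes Postgres do
    # the CSV parsing, so Python never has to.
    return f"COPY {table} FROM STDIN WITH (FORMAT csv, HEADER true)"

def load_csv(dsn, path, table):
    """Fast path: stream the whole file to the remote host via COPY."""
    import psycopg2  # deferred import: pip install psycopg2-binary
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        with open(path, newline="") as f:
            # copy_expert sends the file as COPY ... FROM STDIN data:
            # one bulk transfer instead of one INSERT per row.
            cur.copy_expert(copy_sql(table), f)

def insert_csv(dsn, path, table):
    """Slow path: parse the CSV in Python and INSERT row by row."""
    import psycopg2
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        with open(path, newline="") as f:
            reader = csv.reader(f)
            header = next(reader)  # skip the header line
            params = ", ".join(["%s"] * len(header))
            cur.executemany(
                f"INSERT INTO {table} VALUES ({params})", list(reader)
            )
```

Both functions rely on psycopg2's connection context manager to commit the transaction on exit. For anything beyond a small one-off load, the COPY variant is the one worth keeping.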
hp
--
_ | Peter J. Holzer | Story must make more sense than reality.
|_|_) | |
| | | hjp@hjp.at | -- Charles Stross, "Creative writing
__/ | http://www.hjp.at/ | challenge!"