Re: create batch script to import into postgres tables - Mailing list pgsql-admin

From: Pepe TD Vo
Subject: Re: create batch script to import into postgres tables
Msg-id: 1761277119.1163490.1592322138731@mail.yahoo.com
In response to: Re: create batch script to import into postgres tables (Christopher Browne <cbbrowne@gmail.com>)
Responses: Re: create batch script to import into postgres tables
           Re: create batch script to import into postgres tables
List: pgsql-admin
Yes, I do have PuTTY installed, but I can't connect to the AWS Postgres instance with it; it only works for the Oracle instance. I can only connect to the Postgres instance using pgAdmin.

I follow the URL, get the login prompt for the username, and it just hangs there.
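
For reference, a direct psql connection from a shell to an RDS endpoint generally looks like the line below; the endpoint, database, and user names are placeholders, not values from this thread:

psql "host=mydb.xxxxxxxx.us-east-1.rds.amazonaws.com port=5432 dbname=mydb user=myuser sslmode=require"

Also note that the instance's security group must allow inbound traffic on port 5432 from the client; a blocked port is a common reason a connection attempt just hangs.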

thank you.

Bach-Nga

No one in this world is pure and perfect. If you avoid people for their mistakes, you will be alone. So judge less, love, and forgive more.
To call him a dog hardly seems to do him justice though in as much as he had four legs, a tail, and barked, I admit he was, to all outward appearances. But to those who knew him well, he was a perfect gentleman (Hermione Gingold)

**Live simply **Love generously **Care deeply **Speak kindly.
*** Genuinely rich *** Faithful talent *** Sharing success




On Tuesday, June 16, 2020, 11:17:21 AM EDT, Christopher Browne <cbbrowne@gmail.com> wrote:



On Tue, 16 Jun 2020 at 10:59, Pepe TD Vo <pepevo@yahoo.com> wrote:
I can run \copy in Linux fine with an individual csv file into a table, and I can run the import into the AWS instance using pgAdmin. Now I am trying to run \copy to import all the csv files, each into its own table, both in Linux and in the AWS instance. Importing all the csv files into one table would be simple, but here each csv goes to its own table. Should I create one batch job for each imported table? If each batch file imports its csv into its table via \copy table_name(col1, col2, ... coln) from '/path/tablename.csv' delimiter ',' csv header; that would be fine, right?

There is no single straightforward answer to that.

Supposing I want a batch either to be processed in full or not at all, I might write an SQL file like:

begin;
\copy table_1 (c1, c2, c3) from '/path/tabledata1.csv' csv header;
\copy table_2 (c1, c2, c3) from '/path/tabledata2.csv' csv header;
\copy table_3 (c1, c2, c3) from '/path/tabledata3.csv' csv header;
commit;
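
If that file were saved as, say, import_all.sql (a name chosen here for illustration), it can be run non-interactively; with ON_ERROR_STOP set, a failing \copy aborts the script before the commit is reached, so the transaction rolls back and no partial load is left behind. The host, user, and database names are placeholders:

psql -h myhost -U myuser -d mydb -v ON_ERROR_STOP=1 -f import_all.sql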

But you may be fine with having a separate SQL script for each table.

There will be conditions where one or the other is more appropriate, and that will be based on the requirements of the process.
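
For the separate-script-per-table case, one common pattern is a small shell loop, assuming (and this is an assumption, not something stated in this thread) that each csv file is named after its target table:

for f in /path/*.csv; do
    t=$(basename "$f" .csv)                      # table name taken from the file name
    psql -h myhost -U myuser -d mydb -v ON_ERROR_STOP=1 \
         -c "\copy ${t} from '${f}' csv header"
done

Here each table loads in its own transaction, so one bad file does not roll back the others.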


Also, the problem is I can't run psql from the Windows client to reach the AWS instance, and I don't know how to create the batch script for this run. I tried a simple \copy pulling from c:\tes.csv, and psql is unknown.


You cannot run psql without having it installed; there is a Windows installer for PostgreSQL, so you could use that to get it installed.

Hopefully there is an installer that will install just the PostgreSQL client software (psql, pg_dump, and notably *not* the database server software); I don't use Windows, so I am not too familiar with that.
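
Once psql is installed and on the Windows PATH, the same kind of batch file can be run from cmd.exe; the endpoint, database, user, and file path below are placeholders:

psql -h mydb.xxxxxxxx.us-east-1.rds.amazonaws.com -U myuser -d mydb -v ON_ERROR_STOP=1 -f C:\batch\import_all.sql

Note that \copy reads the file on the client side, so a Windows path like c:\tes.csv works once psql itself runs on that machine.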
 
--
When confronted by a difficult problem, solve it by reducing it to the
question, "How would the Lone Ranger handle this?"
