Re: Processing data from table using awk. - Mailing list pgsql-general

From John McKown
Subject Re: Processing data from table using awk.
Date
Msg-id CAAJSdjg5eyVnf7zabse0a=rHB=ccySY6nW-n7b9ZAxrXk9bKeg@mail.gmail.com
In response to Re: Processing data from table using awk.  ("Basques, Bob (CI-StPaul)" <bob.basques@ci.stpaul.mn.us>)
List pgsql-general
On Tue, Oct 6, 2015 at 10:38 AM, Basques, Bob (CI-StPaul) <bob.basques@ci.stpaul.mn.us> wrote:
Just to throw in an idea.

I almost exclusively use PERL for this type of thing.  There are a bunch of examples out on the web using DBI, and the main aspects are portable across many databases, not just POSTGRES.

Me too. I'm in a "learn awk" mode. I already have a couple of PERL programs which use DBI to load tabular information into a series of PostgreSQL tables. The information is actually meant to be loaded into IBM's DB/2. I have a PERL program which can read the DB/2 load utility's control file and create a PERL program which can read the data file described by that control file. The created PERL program loads the information into the appropriate PostgreSQL tables instead. IMO, a rather nifty way to have a PERL program write another PERL program for me.
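For anyone curious, here is a minimal sketch of what the DBI side of such a generated loader might boil down to. The connection string, credentials, table name, column list, file name, and pipe delimiter below are all made up for illustration; in the real setup those details would come from whatever the DB/2 load control file specifies.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details and table layout, for illustration only.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb;host=localhost",
                       "me", "secret",
                       { AutoCommit => 0, RaiseError => 1 });

my $sth = $dbh->prepare(
    "INSERT INTO my_table (col_a, col_b, col_c) VALUES (?, ?, ?)");

# Pretend the control file said the data file is pipe-delimited.
open my $fh, '<', 'data.txt' or die "cannot open data.txt: $!";
while (my $line = <$fh>) {
    chomp $line;
    my @fields = split /\|/, $line, -1;   # -1 keeps trailing empty fields
    $sth->execute(@fields);
}
close $fh;

$dbh->commit;
$dbh->disconnect;

The generator program's job is essentially just to emit an INSERT statement and a split pattern that match a given control file.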
 

Just my two cents.

AWK would work too. I’ve used it myself and gotten quite complicated with it as well, but you’ll eventually end up looking for more capabilities and start pulling in other commands like SED, etc.  Perl just keeps on working.

bobb


--

Schrodinger's backup: The condition of any backup is unknown until a restore is attempted.

Yoda of Borg, we are. Futile, resistance is, yes. Assimilated, you will be.

He's about as useful as a wax frying pan.

10 to the 12th power microphones = 1 Megaphone

Maranatha! <><
John McKown
