Thread: getting list of tables from command line

getting list of tables from command line

From
Craig White
Date:
I wrote a little script to individually back up table schemas, table
data and then vacuum the table and it works nicely but I wanted a way to
query a database and get a text file with just the table names and
cannot figure out a way to do that.

my script looks like this...
(all I want is to get a list of the tables into a text file pg_tables)

#!/bin/sh
#
DB_NAME=whatever
#
for i in `cat pg_tables`
do
  pg_dump  --username=postgres \
    --schema=db \
    --table=$i \
    --schema-only \
    $DB_NAME > schemas/$i.sql
  pg_dump  --username=postgres \
    --schema=db \
    --table=$i \
    --data-only \
    $DB_NAME > data/$i.sql
  vacuumdb --username=postgres \
   --dbname=$DB_NAME \
   --table=db.$i \
   --verbose \
   --full
done

Is there a way to do that?

Craig

PS there's a lack of cohesion between various commands such as vacuumdb
and pg_dump for things like '--schema'


Re: getting list of tables from command line

From
Reece Hart
Date:
On Wed, 2007-10-31 at 08:01 -0700, Craig White wrote:
> I wanted a way to
> query a database and get a text file with just the table names and
> cannot figure out a way to do that.

This should do the trick:

$ psql -Atc "select table_schema||'.'||table_name from
information_schema.tables"
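
To get that into the pg_tables file the script reads, add the database name
and redirect the output (the database name below is just a placeholder):

$ psql -Atc "select table_schema || '.' || table_name from information_schema.tables" your_db > pg_tables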

-Reece

--
Reece Hart, http://harts.net/reece/, GPG:0x25EC91A0


Re: getting list of tables from command line

From
"T.J. Adami"
Date:
On 31 Oct, 12:01, craigwh...@azapple.com (Craig White) wrote:
> I wrote a little script to individually back up table schemas, table
> data and then vacuum the table and it works nicely but I wanted a way to
> query a database and get a text file with just the table names and
> cannot figure out a way to do that.

First you can create a file with the SQL statement:

select tablename from pg_tables where schemaname not in
('information_schema','pg_catalog' ) order by tablename;

Then run it with psql:

psql -U [postgres_user] -d [database_name] -f [file_created_with_SQL] > [output_file]

This will write the names of all tables outside the system schemas into the
output file, so you can open it and read the table names.
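
Putting both steps together, a minimal sketch (the user and database names
are the same placeholders used elsewhere in this thread, and -At is added so
psql prints only the bare table names, without headers or row counts):

cat > list_tables.sql <<'EOF'
select tablename from pg_tables
where schemaname not in ('information_schema','pg_catalog')
order by tablename;
EOF
psql -U postgres -d whatever -At -f list_tables.sql > pg_tables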


Re: getting list of tables from command line

From
David Fetter
Date:
On Wed, Oct 31, 2007 at 08:01:41AM -0700, Craig White wrote:
> I wrote a little script to individually back up table schemas, table
> data and then vacuum the table and it works nicely but I wanted a
> way to query a database and get a text file with just the table
> names and cannot figure out a way to do that.

This should do it. :)

psql -At your_db <<EOT > pg_tables
SELECT quote_ident(table_schema) || '.' || quote_ident(table_name)
FROM information_schema.tables
WHERE
    table_schema NOT IN (
        'information_schema',
        'pg_catalog'
    )
AND
    table_type = 'BASE TABLE';
EOT
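
Once pg_tables exists, the loop from the original script can consume it
directly; a sketch, assuming PostgreSQL 8.2 or later, where pg_dump's
--table option accepts a schema-qualified name like the ones the query
above produces ("your_db" is a placeholder):

for i in `cat pg_tables`
do
  pg_dump --username=postgres --schema-only --table=$i your_db > schemas/$i.sql
done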

Cheers,
David.
--
David Fetter <david@fetter.org> http://fetter.org/
Phone: +1 415 235 3778  AIM: dfetter666  Yahoo!: dfetter
Skype: davidfetter      XMPP: david.fetter@gmail.com

Remember to vote!
Consider donating to Postgres: http://www.postgresql.org/about/donate

Re: getting list of tables from command line

From
"Filip Rembiałkowski"
Date:
2007/10/31, Craig White <craigwhite@azapple.com>:
> I wrote a little script to individually back up table schemas, table
> data and then vacuum the table and it works nicely but I wanted a way to
> query a database and get a text file with just the table names and
> cannot figure out a way to do that.


for table in `psql -qAtc "select table_name from information_schema.tables where table_schema='public'" $DB_NAME`
do
  echo "$table"   # or run your pg_dump / vacuumdb commands here; $DB_NAME as in your script
done

--
Filip Rembiałkowski

Re: getting list of tables from command line

From
Ron St-Pierre
Date:
Craig White wrote:
> I wrote a little script to individually back up table schemas, table
> data and then vacuum the table and it works nicely but I wanted a way to
> query a database and get a text file with just the table names and
> cannot figure out a way to do that.

From the command line you can run:
   psql mydbname -c "\d"
to get a list of all public table names.

Or just select the table names from pg_tables and send the output to a file:
   psql myDBname -c "SELECT tablename FROM pg_tables WHERE schemaname = 'someschema';" > my_tables.txt

This works on my 8.1 database on RHEL. You could also use something
similar inside your script to generate the table names and send them
to pg_dump. As far as I know, if you specify a table name to pg_dump but
don't specify a schema name, it will dump all tables with that name,
regardless of which schema they belong to. I'm sure someone will let me
know if that's not correct...
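
If you want to be certain which schema's table you get, you can name the
schema explicitly, e.g. (the table name here is just a made-up example):

pg_dump --username=postgres --schema=someschema --table=mytable --schema-only myDBname > mytable.sql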

hth

Ron


Re: getting list of tables from command line

From
hubert depesz lubaczewski
Date:
On Wed, Oct 31, 2007 at 08:01:41AM -0700, Craig White wrote:
> my script looks like this...
> (all I want is to get a list of the tables into a text file pg_tables)

everybody else showed some ways, but i'll ask a question:
i hope you're not treating this as a backup? because when you do it this
way (on a table-by-table basis) the "backup" is practically useless, as it
doesn't have internal consistency.

if you want to make a backup, perhaps you can tell us exactly what you
want to do, and why standard pg_dump is not enough.

if you're fully aware of the issue i pointed out - sorry, i'm just trying to
make sure you won't shoot yourself in the foot.
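
for example, a single custom-format dump keeps a consistent snapshot of the
whole database and still lets you pull out one table later with pg_restore
(a sketch; the user, database and table names are placeholders):

pg_dump -Fc --username=postgres whatever > whatever.dump
pg_restore --username=postgres --dbname=whatever --data-only --table=mytable whatever.dump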

depesz

--
quicksil1er: "postgres is excellent, but like any DB it requires a
highly paid DBA.  here's my CV!" :)
http://www.depesz.com/ - a blog for you (and my CV)

Re: getting list of tables from command line

From
Craig White
Date:
On Thu, 2007-11-01 at 20:25 +0100, hubert depesz lubaczewski wrote:
> On Wed, Oct 31, 2007 at 08:01:41AM -0700, Craig White wrote:
> > my script looks like this...
> > (all I want is to get a list of the tables into a text file pg_tables)
>
> everybody else showed some ways, but i'll ask a question:
> i hope you're not treating this as a backup? because when you do it this
> way (on a table-by-table basis) the "backup" is practically useless, as it
> doesn't have internal consistency.
>
> if you want to make a backup, perhaps you can tell us exactly what you
> want to do, and why standard pg_dump is not enough.
>
> if you're fully aware of the issue i pointed out - sorry, i'm just trying to
> make sure you won't shoot yourself in the foot.
----
No - thanks...you raised a very good point.

First I would like to thank all of those who responded (Reece, T.J.,
David, Filip, Ron) with great ideas...I learned a lot.

I also run pg_dumpall weekly and pg_dump each database nightly. I learned
that lesson a long time ago, back in early development, when I migrated my
Fedora installation, which updated Postgres and left my database unusable.

I wanted the per-table separation nightly as insurance against things like
bad migrations (Ruby on Rails alters the tables via scripting), for the
ability to reload the data of a single table after some poorly conceived
coding (not that I would ever do such a thing), and to make it easier to
move data from my production database to my development database.

I am fortunate that even though I am now past 30 tables in my production
project and we've been adding data for a year and a half, the total data
backup is only 7 megabytes (/var/lib/pgsql/data is only 132 megabytes), so
there's no harm in making this many separate backups nightly via cron
scripts at this point. I also turned on autovacuum in the configuration,
but this method seems much more thorough.

For anyone interested, this is what I settled on for my final script
(heck, I don't even bother to tar/zip the files yet)...

*** begin pg_table_dump.scr ***
#!/bin/sh
#
# Script to identify tables, backup schema and data separately and
# then finally, vacuum each table
#
DB_NAME=MY_DB
BACKUP_PATH=/home/backup/postgres/production
MY_SCHEMA=public
PG_USER=craig
#
psql -U $PG_USER \
  $DB_NAME \
  -c "SELECT tablename FROM pg_tables WHERE \
  schemaname = "\'$MY_SCHEMA\'";" | \
  grep -v 'tablename' | \
  grep -v [\--*] | \
  grep -v rows\) > $BACKUP_PATH/pg_tables
#
for i in `cat $BACKUP_PATH/pg_tables`
do
  pg_dump \
    --username=$PG_USER \
    --schema=$MY_SCHEMA \
    --table=$i \
    --schema-only \
    $DB_NAME > $BACKUP_PATH/schemas/$i.sql
  pg_dump \
    --username=$PG_USER \
    --schema=$MY_SCHEMA \
    --table=$i \
    --data-only \
    $DB_NAME > $BACKUP_PATH/data/$i.sql
  vacuumdb \
   --username=$PG_USER \
   --dbname=$DB_NAME \
   --table=$MY_SCHEMA.$i \
   --verbose \
   --full
done
*** end pg_table_dump.scr ***

Craig


Re: getting list of tables from command line

From
hubert depesz lubaczewski
Date:
On Thu, Nov 01, 2007 at 08:03:08PM -0700, Craig White wrote:
> *** begin pg_table_dump.scr ***
> #!/bin/sh
> #
> # Script to identify tables, backup schema and data separately and
> # then finally, vacuum each table
> #
> DB_NAME=MY_DB
> BACKUP_PATH=/home/backup/postgres/production
> MY_SCHEMA=public
> PG_USER=craig
> #
> psql -U $PG_USER \
>   $DB_NAME \
>   -c "SELECT tablename FROM pg_tables WHERE \
>   schemaname = "\'$MY_SCHEMA\'";" | \
>   grep -v 'tablename' | \
>   grep -v [\--*] | \
>   grep -v rows\) > $BACKUP_PATH/pg_tables

instead of these 3 grep -v, i would suggest to use -qAt switches to
psql:
psql -U $PG_USER -d $DB_NAME -qAt -c "..."
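
applied to the script above, that step becomes a single clean command
(same variables as in the script):

psql -U $PG_USER -d $DB_NAME -qAt \
  -c "SELECT tablename FROM pg_tables WHERE schemaname = '$MY_SCHEMA';" \
  > $BACKUP_PATH/pg_tables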

depesz

--
quicksil1er: "postgres is excellent, but like any DB it requires a
highly paid DBA.  here's my CV!" :)
http://www.depesz.com/ - a blog for you (and my CV)