Thread: pg_dump limit 1000 rows

pg_dump limit 1000 rows

From: drum.lucas@gmail.com

Hi all,

I want to make a minimal dump file of the DB. (I'm using PostgreSQL 9.2)
So I was thinking about getting the last 1000 rows of data from some tables....
I could do something like:

pg_dump dbname --schema-only -f db_schema.sql

CREATE OR REPLACE FUNCTION _save_top_1000_row_tables(file_path text)
 RETURNS character varying AS
$BODY$
begin
  -- server-side COPY: write the first 1000 rows of each table to a file under file_path
  execute 'copy (SELECT * FROM table1 LIMIT 1000) TO ''' || file_path || 'table1.txt''';
  execute 'copy (SELECT * FROM table2 LIMIT 1000) TO ''' || file_path || 'table2.txt''';
  execute 'copy (SELECT * FROM table3 LIMIT 1000) TO ''' || file_path || 'table3.txt''';
  return 'OK';
end;
$BODY$
 LANGUAGE plpgsql VOLATILE
 COST 100;
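
(A client-side variant of the same idea, just as a rough sketch: running the COPY through psql's \copy writes the files on my machine instead of on the server, so it doesn't need superuser rights. Database, table and file names are the same placeholders as above.)

psql dbname -c "\copy (SELECT * FROM table1 LIMIT 1000) TO 'table1.txt'"
psql dbname -c "\copy (SELECT * FROM table2 LIMIT 1000) TO 'table2.txt'"
psql dbname -c "\copy (SELECT * FROM table3 LIMIT 1000) TO 'table3.txt'"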

 But using COPY like that I wouldn't keep my foreign keys consistent, right? Is there any other option/idea?

Thanks

Re: pg_dump limit 1000 rows

From: bricklen


On Wed, Mar 2, 2016 at 4:56 PM, drum.lucas@gmail.com <drum.lucas@gmail.com> wrote:
I want to make a minimal dump file of the DB. (I'm using PostgreSQL 9.2)
So I was thinking about getting the last 1000 rows of data from some tables....

Have a look at pg_sample; it should do exactly what you're looking for.
https://github.com/mla/pg_sample
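
Typical usage is something along these lines (a rough sketch from memory, so check the README for the exact options; the database, table and file names are just placeholders):

./pg_sample yourdb --limit="some_table = 1000" --file=sample.sql

and then load sample.sql into a database that already has the schema (e.g. from a pg_dump --schema-only dump). It follows foreign-key references when building the sample, so the referenced rows come along and the result stays consistent.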

Re: pg_dump limit 1000 rows

From: drum.lucas@gmail.com


On Wed, Mar 2, 2016 at 4:56 PM, drum.lucas@gmail.com <drum.lucas@gmail.com> wrote:
I want to make a minimal dump file of the DB. (I'm using PostgreSQL 9.2)
So I was thinking about getting the last 1000 rows of data from some tables....

Have a look at pg_sample; it should do exactly what you're looking for.
https://github.com/mla/pg_sample


Thanks Bricklen...

I've run pg_sample on a test server and it works!

But in my main DB I'm getting this error:
Creating table "_pg_sample"."pg_catalog_pg_statistic" DBD::Pg::db do failed: ERROR:  column "stavalues1" has pseudo-type "anyarray" at ./pg_sample line 296
main::__ANON__('DBD::Pg::db do failed: ERROR:  column "stavalues1" has pseudo...', 'DBI::db=HASH(0x1f25c38)', undef) called at ./pg_sample line 538
Dropping sample schema _pg_sample
Done.

Command:
./pg_sample dbname --data-only --file=test.sql --limit="ja_admins = 1000" --verbose 
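
(The plan afterwards, roughly: restore the schema-only dump first and then the sampled data into a fresh database; "testdb" is just a placeholder name here.)

createdb testdb
psql testdb -f db_schema.sql
psql testdb -f test.sql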


Would you know why I get that error?

Cheers

Re: pg_dump limit 1000 rows

From: bricklen

On Wed, Mar 2, 2016 at 6:44 PM, drum.lucas@gmail.com <drum.lucas@gmail.com> wrote:
But in my main DB I'm getting this error:
Creating table "_pg_sample"."pg_catalog_pg_statistic" DBD::Pg::db do failed: ERROR:  column "stavalues1" has pseudo-type "anyarray" at ./pg_sample line 296
main::__ANON__('DBD::Pg::db do failed: ERROR:  column "stavalues1" has pseudo...', 'DBI::db=HASH(0x1f25c38)', undef) called at ./pg_sample line 538
Dropping sample schema _pg_sample
Done.

Would you know why I get that error?

Sorry, no idea. I know the author is quite responsive though. I had a few issues when I used it a year or two ago and he was really helpful.

Re: pg_dump limit 1000 rows

From: drum.lucas@gmail.com

bricklen wrote:
But in my main DB I'm getting this error:
Creating table "_pg_sample"."pg_catalog_pg_statistic" DBD::Pg::db do failed: ERROR:  column "stavalues1" has pseudo-type "anyarray" at ./pg_sample line 296
main::__ANON__('DBD::Pg::db do failed: ERROR:  column "stavalues1" has pseudo...', 'DBI::db=HASH(0x1f25c38)', undef) called at ./pg_sample line 538
Dropping sample schema _pg_sample
Done.

Would you know why I get that error?

Sorry, no idea. I know the author is quite responsive though. I had a few issues when I used it a year or two ago and he was really helpful.

Cool... I'm already talking to him.


If someone else needs help with the same issue, have a look at the link above.