Schema Export/Import - Mailing list pgsql-general

From Bryan Murphy
Subject Schema Export/Import
Msg-id bd8531800802290929ka758a87j4225c965e5628f02@mail.gmail.com
List pgsql-general
Hey guys, we're changing the way we version our database from an old,
unsupported (and crappy) Ruby-migrations-like methodology to straight
SQL scripts.  We currently run CruiseControl.NET on a Windows machine
pointed at a test database server hosted on Linux for our builds.  At
each build we attempt to drop the old database, create a new one,
load the schema, then run the unit tests.

The old way worked fine, but the new way is a little trickier.  We
have 169 migrations now, and I want to blow out the old migrations and
make 169 our new starting point.  I dumped our current schema (pg_dump
-s) to an SQL file, and when I manually pipe it through psql,
everything works fine.

However, our build environment being a Windows machine complicates
this process.  It's not so easy to just shell out and call psql.  I
will do it if I have to, but I'm wondering if there's a better way.
Unfortunately, I have not found a way to run this schema via Npgsql or
pgAdmin.
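
For concreteness, the shell-out fallback I'm talking about would look
roughly like this (just a sketch in Python rather than our actual build
code; host, user, and database names are placeholders, and it assumes the
PostgreSQL client tools are on the PATH with authentication handled by
.pgpass or PGPASSWORD):

  import subprocess

  # Placeholder connection details; the real build would pull these from
  # its own configuration.
  HOST, USER, DB = "test-db-host", "builduser", "app_test"

  # Drop the previous build's database, ignoring the error if it isn't
  # there, then create a fresh one.
  subprocess.call(["dropdb", "-h", HOST, "-U", USER, DB])
  subprocess.check_call(["createdb", "-h", HOST, "-U", USER, DB])

  # Load the dumped schema.  psql's default is to keep going past errors,
  # which is the behavior I describe below.
  subprocess.check_call(["psql", "-h", HOST, "-U", USER, "-d", DB,
                         "-f", "schema.sql"])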

What I want to know is whether there is a way I can programmatically get
the same behavior as piping the script through the psql command line.
Both pgAdmin and Npgsql seem to error out on some commands
(e.g. CREATE TYPE gtsvector) where psql continues gracefully.  There
also seems to be a hard limit on the size of the command that can be
passed to pgAdmin and Npgsql, which we don't hit using the command
line.  Our schema definition is well beyond that limit, and I'm not
looking forward to manually breaking up 20,000 lines of SQL into
separate files. :(
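
To make it clearer what I mean by "programmatically," here is roughly the
shape of it, sketched with Python and psycopg2 purely for illustration
(our build code is .NET/Npgsql; connection details and file names are
placeholders):

  import psycopg2

  # Placeholder connection details.
  conn = psycopg2.connect(host="test-db-host", user="builduser",
                          dbname="app_test")
  cur = conn.cursor()

  # Read the whole pg_dump -s output and send it to the server as one batch.
  with open("schema.sql") as f:
      schema_sql = f.read()

  # Unlike psql, the first failing statement aborts the rest of the batch,
  # and the whole script has to go over as a single command string -- which
  # is exactly the kind of difference I keep running into.
  cur.execute(schema_sql)
  conn.commit()

  cur.close()
  conn.close()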

Just fishing for ideas.

Thanks,
Bryan
