Thread: some unrelated questions

From: Erik Price
Date: Wed, 11 Jun 2003
Hi,

I am writing a Java app and am currently using Postgres with it.  I have
a unit test suite that runs frequently and inserts new data to test the
code.  In order to be able to re-run this suite indefinitely, I have
written some SQL commands into my build script to DROP the database and
then CREATE the database again, repopulating the tables from known data
exported to a file, so that the tests always run with the database in a
known state.
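
Concretely, the commands in the build script amount to something like
this (the database and file names here are made up for illustration):

```sql
-- run while connected to a maintenance database such as template1
DROP DATABASE testdb;
CREATE DATABASE testdb;
-- then reconnect to testdb and replay the exported file of
-- CREATE TABLE and INSERT statements to restore the known state
```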


I have two questions:

1. Are there any hazards in dropping a database and then creating it
again, over and over?  This seemed like the cleanest way to remove all
of the data, including the sequences, but I am wondering if there is a
danger that the 'current OID' might be getting artificially pushed
toward its upper limit or something.


2. The [Ant] build script has no problem performing the necessary SQL
inserts when I use a handwritten file that contains nothing but CREATE
TABLE and INSERT statements.  Fine.  However, if I
need to make some change to the database, this means I also have to
update the handwritten file (for instance, if I add new test data to the
database or change the structure of a table).  This seems foolish, when
I could simply export the database schema and data to a file and then
use this file.
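
For reference, the relevant piece of my build file is roughly the
following (the driver class is the real PostgreSQL one, but the URL,
credentials, and file names are placeholders):

```xml
<sql driver="org.postgresql.Driver"
     url="jdbc:postgresql://localhost/testdb"
     userid="tester"
     password="secret"
     classpath="lib/pg-jdbc.jar"
     src="schema-and-data.sql"/>
```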

However, the format used by pg_dump consists of COPY statements, not
INSERT statements.  The format also assumes that the commands are being
run from the shell, as there are certain escapes and references to
stdin.  My build script does -not- run in a subshell; rather, it is
executed via JDBC through the Ant <sql> task, so the SQL parser chokes
when it encounters some shell-specific escapes in the file.  Is there a
way to run pg_dump so that the exported data resembles MySQL's mysqldump
data (entirely CREATE TABLE and INSERT statements) instead of the COPY
statements?
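
(Re-reading the pg_dump man page, the --inserts and --column-inserts
options look like they might be exactly this; if so, the invocation
would be something like the following, though I have not tried it yet:)

```shell
# hypothetical database name; --inserts emits INSERT statements
# instead of COPY, and --column-inserts adds explicit column lists
pg_dump --inserts --column-inserts testdb > testdb.sql
```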



Thank you so much if you can advise me.


Erik


Re: some unrelated questions

From: "Rod Cope"
Erik,

I would highly recommend checking out DbUnit (www.dbunit.org), an
open-source extension to JUnit that does exactly what you want.  It has
ready-to-use Ant tasks that will snapshot your database, or just certain
tables, or even the results of selects into flat or hierarchical XML files.
You do this one time after you have your test database ready to go.

Then in your unit tests, the database will be automatically restored to
the state given in the XML files before each test is run, guaranteeing
that the results of one test do not interfere with the next.  You can
specify whether you want tables to be dropped and recreated or just
emptied and repopulated, etc.
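
From memory, a minimal test case looks roughly like this (the JDBC URL,
credentials, and file name are placeholders, so treat it as a sketch
rather than tested code):

```java
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;

import org.dbunit.DatabaseTestCase;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;

public class EmployeeDaoTest extends DatabaseTestCase {

    // Tell DbUnit how to reach the test database.
    protected IDatabaseConnection getConnection() throws Exception {
        Class.forName("org.postgresql.Driver");
        Connection jdbc = DriverManager.getConnection(
                "jdbc:postgresql://localhost/testdb", "tester", "secret");
        return new DatabaseConnection(jdbc);
    }

    // The XML snapshot DbUnit restores before each test method runs.
    protected IDataSet getDataSet() throws Exception {
        return new FlatXmlDataSet(new FileInputStream("seed-data.xml"));
    }

    public void testSomething() throws Exception {
        // an ordinary JUnit test body; the tables above have already
        // been reset to the snapshot state at this point
    }
}
```

Each test method then runs against a freshly restored set of tables.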

It also gives you a framework to easily (in one line) compare live result
sets against saved results to see if you broke something.  It supports
result set ordering to make sure you're comparing apples to apples.

If you use multiple databases, it's even better because you can snapshot the
data from one and use it to re-populate another.  We use PostgreSQL, MySQL,
and others without any problem.

We use DbUnit in all our database-enabled sample applications in
Out-of-the-Box.  It's a must-have for database development work in Java.

Hope that helps,
Rod

Rod Cope
EJB Solutions, Inc.
rod.cope@ejbsolutions.com


-----Original Message-----
From: pgsql-general-owner@postgresql.org
[mailto:pgsql-general-owner@postgresql.org] On Behalf Of Erik Price
Sent: Wednesday, June 11, 2003 11:54 AM
To: pgsql-general@postgresql.org
Subject: [GENERAL] some unrelated questions


Re: some unrelated questions

From: Erik Price
Rod,

Awesome.  This sounds a lot better than my <sql> task with the exported
files.  I hope it is quick to learn; I am checking it out right now.


Erik

