Re: question - Mailing list pgsql-general

From Adrian Klaver
Subject Re: question
Msg-id 561FC068.4020700@aklaver.com
In response to question  (anj patnaik <patna73@gmail.com>)
List pgsql-general
On 10/14/2015 06:39 PM, anj patnaik wrote:
> Hello,
>
> I recently downloaded postgres 9.4 and I have a client application that
> runs in Tcl that inserts to the db and fetches records.
>
> For the majority of the time, the app will connect to the server to do
> insert/fetch.
>
> For occasional use, we want to remove the requirement to have a server
> db and just have the application retrieve data from a local file.
>
> I know I can use pg_dump to export the tables. The questions are:
>
> 1) is there an in-memory db instance or file based I can create that is
> loaded with the dump file? This way the app code doesn't have to change.

No.

>
> 2) does pg support embedded db?

No.

> 3) Or is my best option to convert the dump to sqlite and the import the
> sqlite and have the app read that embedded db.

SQLite tends to follow Postgres conventions, so you might be able to use
the pg_dump output directly if you use --inserts or --column-inserts:

http://www.postgresql.org/docs/9.4/interactive/app-pgdump.html
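As a rough illustration: with --inserts, pg_dump emits plain INSERT
statements rather than COPY data, and those usually replay into SQLite
with little or no editing. The fragment below imitates that output (the
table and column names are made up for the example) and loads it with
Python's stdlib sqlite3 module:

```python
import sqlite3

# A fragment resembling pg_dump --inserts output. Table and column names
# here are hypothetical; a real dump may also contain Postgres-specific
# DDL (SET statements, types) that needs trimming before SQLite accepts it.
dump_sql = """
CREATE TABLE mytab (id INTEGER, label TEXT, reading REAL);
INSERT INTO mytab VALUES (1, 'alpha', 3.14);
INSERT INTO mytab VALUES (2, 'beta', 2.72);
"""

# executescript() runs the dump as-is; use a file path instead of
# ":memory:" to get the local file the app would read.
conn = sqlite3.connect(":memory:")
conn.executescript(dump_sql)
rows = conn.execute("SELECT label FROM mytab ORDER BY id").fetchall()
print(rows)  # [('alpha',), ('beta',)]
```

COPY-format dumps (the default) will not load this way; that is why
--inserts or --column-inserts matters here.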

>
> Finally, I am noticing pg_dump takes a lot of time to create a dump of
> my table. Right now, the table has 77K rows. Are there any ways to
> create automated batch files to create dumps overnight and do so quickly?

Define "a long time".

What is the pg_dump command you are using?

Sure, use a cron job.
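Something like this crontab entry (the database name, table name, and
backup path are placeholders) would dump the table nightly at 2 a.m.:

```
# m h dom mon dow  command   (note: % must be escaped as \% inside crontab)
0 2 * * * /usr/bin/pg_dump --inserts -t mytab mydb > /backups/mytab_$(date +\%Y\%m\%d).sql
```

Run `crontab -e` as the user that can connect to the database, and make
sure authentication works non-interactively (e.g. via a ~/.pgpass file).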

>
> Thanks for your inputs!


--
Adrian Klaver
adrian.klaver@aklaver.com

