Re: best way to write large data-streams quickly? - Mailing list pgsql-general

From Jerry Sievers
Subject Re: best way to write large data-streams quickly?
Date
Msg-id 8737039bm7.fsf@jsievers.enova.com
In response to Re: best way to write large data-streams quickly?  (Mark Moellering <markmoellering@psyberation.com>)
List pgsql-general
Mark Moellering <markmoellering@psyberation.com> writes:

<snip>

>
> How long can you run COPY?  I have been looking at it more closely.
> In some ways, it would be simple just to take data from stdin and
> send it to Postgres, but can I do that literally 24/7?  I am
> monitoring data feeds that will never stop, and I don't know if that
> is how COPY is meant to be used, or if I have to let it finish and
> start another one at some point.

Launching a single COPY and piping data into it for an extended period
and/or in bulk is fine, but nothing will be visible until the statement
finishes and, if it was run inside a transaction block, the block
commits.
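
If you need rows to become visible while the feed is still running, one
option is to cut the stream into batches and commit each one.  Below is
a minimal sketch, assuming Python with psycopg2; the table feed_data
(ts, payload), the DSN argument, and the batch size are made up here
purely for illustration:

import io
import psycopg2

BATCH_ROWS = 10000   # commit after this many rows so they become visible

def stream_to_postgres(feed, dsn):
    # 'feed' is any iterable of tab-separated lines; it never has to end.
    conn = psycopg2.connect(dsn)
    buf = io.StringIO()
    count = 0
    try:
        for line in feed:
            buf.write(line)
            count += 1
            if count >= BATCH_ROWS:
                buf.seek(0)
                with conn.cursor() as cur:
                    cur.copy_expert(
                        "COPY feed_data (ts, payload) FROM STDIN", buf)
                conn.commit()   # this batch is now visible to readers
                buf = io.StringIO()
                count = 0
    finally:
        conn.close()

Each COPY is then a short statement of its own, so readers lag the feed
by at most one batch; tune BATCH_ROWS (or add a time-based flush) to
trade visibility latency against per-COPY overhead.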

HTH

>
> Thanks for everyone's help and input!
>
> Mark Moellering

--
Jerry Sievers
Postgres DBA/Development Consulting
e: postgres.consulting@comcast.net
p: 312.241.7800

