Re: Replicating hundreds of thousands of rows - Mailing list pgsql-general

From Simon Riggs
Subject Re: Replicating hundreds of thousands of rows
Msg-id CANP8+jJo121812Yq3H4DVAnVEbE9+zpZJBM6rwPnn8dx9p8EHw@mail.gmail.com
In response to Replicating hundreds of thousands of rows  (Job <Job@colliniconsulting.it>)
List pgsql-general
On 25 November 2016 at 06:23, Job <Job@colliniconsulting.it> wrote:
> Hello,
>
> we need to replicate hundreds of thousands of rows (for reporting) between PostgreSQL database nodes that are in different locations.
>
> Currently, we use Rubyrep with PostgreSQL 8.4.22.

8.4 is now end-of-life. You should move to the latest version.

> It works fine, but it is very slow with a massive number of rows.
>
> With Postgresql 9.x, are there some ways to replicate (in background, not in real time!), these quantities of data?
> We need a periodical synchronization.

You have a choice of:

* Physical streaming replication, built in from 9.0+
* Logical streaming replication, partially built in from 9.4+ using pglogical
* Logical streaming replication, built in from 10.0+ (not yet released)
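As a rough sketch of the pglogical option (9.4+): one node is registered as a provider, its tables are added to a replication set, and the other node subscribes to it. The node names, host names, and database name below are placeholders, not details from this thread:

```sql
-- On the provider (source) node:
SELECT pglogical.create_node(
    node_name := 'provider1',
    dsn := 'host=provider-host port=5432 dbname=reports'
);
-- Replicate every table in the public schema via the default set:
SELECT pglogical.replication_set_add_all_tables('default', ARRAY['public']);

-- On the subscriber (reporting) node:
SELECT pglogical.create_node(
    node_name := 'subscriber1',
    dsn := 'host=subscriber-host port=5432 dbname=reports'
);
SELECT pglogical.create_subscription(
    subscription_name := 'sub1',
    provider_dsn := 'host=provider-host port=5432 dbname=reports'
);
```

Replication then runs continuously in the background rather than as a periodic batch job, which covers the "not in real time" requirement without a separate sync step.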

Performance is much better than Rubyrep.

--
Simon Riggs                http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

