On Wed, May 23, 2012 at 5:11 PM, Herouth Maoz <herouth@unicell.co.il> wrote:
> A replication solution is not very good, either, because of course I can't define indexes differently, I don't want
> *all* transactions in all tables to be sent, and also because I may want to cross-reference data from different
> systems. So ideally, I want to have a reporting database, where specific tables (or maybe even just specific columns)
> from various databases are collected, and have a reporting tool connect to this database. But I want to push the data
> into that database as close to real time as possible.
Take a look at PgQ from Skytools. You can queue your OLTP data changes
and replay only the specific columns into your OLAP database.
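To make the idea concrete, here is a minimal stand-alone sketch of the column-filtering step such a consumer would perform. A real Skytools consumer subclasses pgq.Consumer and receives events from PgQ batches; in this sketch the events are plain Python dicts so it runs without a database, and all table and column names are made up for illustration.

```python
# Sketch of the column-filtering step a PgQ consumer would perform.
# Real consumers subclass pgq.Consumer; here events are plain dicts so
# the idea is runnable stand-alone.  Names below are hypothetical.

# Per-table whitelist: only these columns reach the reporting database.
REPORTED_COLUMNS = {
    "transactions": ("id", "amount", "created_at"),
    "customers": ("id", "country"),
}

def filter_event(table, row):
    """Keep only the whitelisted columns of one changed row."""
    wanted = REPORTED_COLUMNS.get(table)
    if wanted is None:
        return None  # table is not replicated at all
    return {col: row[col] for col in wanted if col in row}

def process_batch(events):
    """Turn a batch of change events into rows for the OLAP side."""
    out = []
    for ev in events:
        filtered = filter_event(ev["table"], ev["data"])
        if filtered is not None:
            out.append((ev["table"], filtered))
    return out
```

In a real setup the events would come from a queue created with pgq.create_queue() and populated by a trigger such as pgq.logutriga on the OLTP tables, and process_batch() would INSERT the filtered rows into the reporting database.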
>
> The most important data I am currently considering are two tables which have an average of 7,600 transactions per
> hour (standard deviation 10,000; the maximum in May is 62,000 transactions per hour). There may be similar pairs of tables
> collected from more than one database.
>
> I assume this is not an uncommon scenario. What solutions would you recommend?
>
>
> Herouth
> --
> Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgsql-general
--
Sergey Konoplev
a database and software architect
http://www.linkedin.com/in/grayhemp
Jabber: gray.ru@gmail.com Skype: gray-hemp Phone: +79160686204