Re: Using Postgres to store high volume streams of sensor readings - Mailing list pgsql-general

From Ciprian Dorin Craciun
Subject Re: Using Postgres to store high volume streams of sensor readings
Date
Msg-id 8e04b5820811210852k6ce7a7b6ub43b4368de33fc8c@mail.gmail.com
In response to Re: Using Postgres to store high volume streams of sensor readings  (Tom Lane <tgl@sss.pgh.pa.us>)
Responses Re: Using Postgres to store high volume streams of sensor readings  (Tom Lane <tgl@sss.pgh.pa.us>)
List pgsql-general
On Fri, Nov 21, 2008 at 6:06 PM, Tom Lane <tgl@sss.pgh.pa.us> wrote:
> "Ciprian Dorin Craciun" <ciprian.craciun@gmail.com> writes:
>>     In short the data is inserted by using COPY sds_benchmark_data
>> from STDIN, in batches of 500 thousand data points.
>
> Not sure if it applies to your real use-case, but if you can try doing
> the COPY from a local file instead of across the network link, it
> might go faster.  Also, as already noted, drop the redundant index.
>
>                        regards, tom lane


    Hi!

    It wouldn't be difficult to switch to a local file (the client is
already running on the same machine), but will it really make a
difference? (I mean, have you actually seen such issues in practice?)
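
    For reference, a minimal sketch of the two load paths being
compared; the table name is the one from the benchmark, but the file
path is only an illustrative placeholder:

    -- Client-side load: rows stream over the client connection
    -- (what the benchmark does now, in batches of 500k data points).
    COPY sds_benchmark_data FROM STDIN;

    -- Server-side load: the backend reads the file directly from
    -- local disk, bypassing the client protocol. The path is
    -- hypothetical and must be readable by the postgres server process.
    COPY sds_benchmark_data FROM '/path/to/batch.dat';

    Even on the same machine, the server-side form can be somewhat
faster because the data no longer has to pass through the client
protocol, though I don't know yet whether that overhead is significant
at these volumes.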

    Thanks,
    Ciprian Craciun.
