Thread: Performance of a single (big) select and Multiprocessor Machines

Performance of a single (big) select and Multiprocessor Machines

From
Konstantinos Agouros
Date:
Hi,

I am currently setting up a logfile analysis using Postgres. The logfiles
are big (some 1 or 2 million entries a day), which I reduce somewhat. Here
is my question: is there a way in Postgres to make use of the extra CPU(s) the
machine has for the single tasks of importing the data and doing the
somewhat intensive selects that result from the sheer amount of data?
Today I talked to the Oracle guys at the company; they told me one can make
Oracle do that. But who wants to use (and set up) Oracle?

Konstantin

P.S.: Are there other optimization tips besides setting up indexes and
learning to write better SQL?
--
Dipl-Inf. Konstantin Agouros aka Elwood Blues. Internet: elwood@agouros.de
Otkerstr. 28, 81547 Muenchen, Germany. Tel +49 89 69370185
----------------------------------------------------------------------------
"Captain, this ship will not sustain the forming of the cosmos." B'Elana Torres

Re: Performance of a single (big) select and Multiprocessor Machines

From
Tom Lane
Date:
Konstantinos Agouros <elwood@agouros.de> writes:
> Is there a way in Postgres to make use of the extra CPU(s) the
> machine has for the single tasks of importing the data and doing the
> somewhat intensive selects that result from the sheer amount of data?

Maybe I'm missing something, but it seems like all you need to do is
run the data import and the selects in different processes (multiple
backends).

There isn't any way to apply multiple CPUs in a single SELECT, if that's
what you were hoping for.  Perhaps you could break down the data
reduction task into independent subqueries, but that will take some
thought :-(
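
As a self-contained illustration of the "independent subqueries" idea, here is
a hedged sketch, not real PostgreSQL code: the language (Python), the chunking
scheme, and all function names are my additions, not from the thread. It runs
one worker process per chunk and combines the partial results, which is the
same shape as issuing one date-restricted SELECT per backend and summing the
answers in the client:

```python
# Sketch only: since a single SELECT runs on one CPU, split the
# data-reduction job into independent chunks and run each chunk in its
# own OS process. In the real setup, each process would open its own
# backend connection and issue one SELECT restricted to its own date
# range; here the database access is replaced by a plain in-memory
# reduction so the sketch is runnable as-is.
from multiprocessing import Pool

LOG = list(range(100_000))  # stand-in for one day's log entries

def run_chunk(bounds):
    """Reduce one independent slice; stands in for one per-range SELECT."""
    lo, hi = bounds
    return sum(1 for entry in LOG[lo:hi] if entry % 2 == 0)

def parallel_count(n_workers=4):
    step = len(LOG) // n_workers
    chunks = [(i * step, (i + 1) * step) for i in range(n_workers)]
    chunks[-1] = (chunks[-1][0], len(LOG))  # last chunk takes any remainder
    with Pool(n_workers) as pool:           # one worker process per chunk
        partials = pool.map(run_chunk, chunks)
    return sum(partials)                    # combine the partial results

if __name__ == "__main__":
    print(parallel_count())
```

The hard part Tom alludes to is choosing chunk boundaries so the subqueries
really are independent (e.g. non-overlapping date ranges) and so that the
partial results can be combined cheaply afterwards.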

            regards, tom lane

Re: Performance of a single (big) select and Multiprocessor Machines

From
Konstantinos Agouros
Date:
On Thu, Jan 18, 2001 at 11:38:38PM -0500, Tom Lane wrote:
> Konstantinos Agouros <elwood@agouros.de> writes:
> > Is there a way in Postgres to make use of the extra CPU(s) the
> > machine has for the single tasks of importing the data and doing the
> > somewhat intensive selects that result from the sheer amount of data?
>
> Maybe I'm missing something, but it seems like all you need to do is
> run the data import and the selects in different processes (multiple
> backends).
>
> There isn't any way to apply multiple CPUs in a single SELECT, if that's
> what you were hoping for.  Perhaps you could break down the data
> reduction task into independent subqueries, but that will take some
> thought :-(
This was what I was hoping for... thanks for the quick response.

Konstantin
>
>             regards, tom lane
