On Thu, Jan 18, 2001 at 11:38:38PM -0500, Tom Lane wrote:
> Konstantinos Agouros <elwood@agouros.de> writes:
> > Is there a way in postgres to make use of the extra cpu(s) the
> > machine has for the single tasks of importing the data and doing the
> > somewhat intensive selects that result from the sheer amount of data.
>
> Maybe I'm missing something, but it seems like all you need to do is
> run the data import and the selects in different processes (multiple
> backends).
>
> There isn't any way to apply multiple CPUs in a single SELECT, if that's
> what you were hoping for. Perhaps you could break down the data
> reduction task into independent subqueries, but that will take some
> thought :-(
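For the archives, here is a minimal sketch of that manual-partitioning idea: split the key range into non-overlapping chunks, run one worker per chunk, and combine the partial results in the client. Each worker here just sums its id range in pure Python as a stand-in for a real per-chunk query (e.g. `SELECT sum(x) FROM t WHERE id >= lo AND id < hi` on its own backend connection); the function names and chunking scheme are illustrative, not anything PostgreSQL provides.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    # Stand-in for one backend evaluating its range-restricted subquery.
    # Here a "row" is just its id, so the partial aggregate is arithmetic.
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n_rows, n_workers=4):
    # Split the key range into independent, non-overlapping chunks,
    # one per worker process -- in the real setup, one per connection.
    step = (n_rows + n_workers - 1) // n_workers
    chunks = [(i, min(i + step, n_rows)) for i in range(0, n_rows, step)]
    with Pool(n_workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # Combine the partial aggregates on the client side.
    return sum(partials)

if __name__ == "__main__":
    print(parallel_sum(1000))  # same answer as the single-pass sum
```

The combining step only works this simply for aggregates that decompose over disjoint partitions (sum, count, min, max); something like a median would need a different strategy.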
This was what I was hoping for... thanks for the quick response.
Konstantin
>
> regards, tom lane
--
Dipl-Inf. Konstantin Agouros aka Elwood Blues. Internet: elwood@agouros.de
Otkerstr. 28, 81547 Muenchen, Germany. Tel +49 89 69370185
----------------------------------------------------------------------------
"Captain, this ship will not sustain the forming of the cosmos." B'Elana Torres