Thread: Big array speed issues

Big array speed issues

From
"Merkel Marcel (CR/AEM4)"
Date:

Hi,

I have some speed issues with a big array in a table. I hope you can help me to tune my query.

My table looks like this:

Id          | timestamp | map
primary key | timestamp | array of real [34][28]

With an index on timestamp

My query is the following:

SELECT map[1][1], map[1][2], ..., map[34][28]
FROM table
WHERE timestamp > x AND timestamp < y
ORDER BY timestamp;

The expected result is about 5000 rows, and I have to run this query multiple times with different x and y values.

The table is already large (about 60000 entries) and will grow much bigger.

The query takes ages on a 3 GHz Xeon with 2 GB RAM. I'm using PostgreSQL 7.4.

Any hints on how I can speed this up? (use Postgres 8.1, change the table setup, query one row or column of the array, ...)

I use libpqxx to access the database. This might be another bottleneck, but I assume my query and table setup are the bigger bottleneck. Would it make sense to fetch the whole array? (SELECT map FROM table WHERE ... and parse the array manually)
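For reference, PostgreSQL's array slice syntax can fetch one "row" of each map, or the whole array for client-side parsing, without listing every element. This is only a sketch; the table name and the timestamp bounds are placeholders:

```sql
-- Fetch only the first row of each 34x28 map, as an array slice:
SELECT map[1:1][1:28]
FROM my_table                 -- hypothetical table name
WHERE timestamp > x AND timestamp < y
ORDER BY timestamp;

-- Or fetch the whole array and parse it on the client:
SELECT map
FROM my_table
WHERE timestamp > x AND timestamp < y
ORDER BY timestamp;
```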

Thanks for your help.

Marcel

Re: Big array speed issues

From
"Merlin Moncure"
Date:
On 6/20/06, Merkel Marcel (CR/AEM4) <Marcel.Merkel@de.bosch.com> wrote:

> I use libpqxx to access the database. This might be another bottleneck, but
> I assume my query and table setup is the bigger bottleneck. Would it make
> sense to fetch the whole array ? (Select map from table where …  and parse
> the array manually)

have you tried a similar approach without using arrays?

merlin

Re: Big array speed issues

From
"Merkel Marcel (CR/AEM4)"
Date:

From: Merlin Moncure [mailto:mmoncure@gmail.com]
To: Merkel Marcel (CR/AEM4)
Cc: pgsql-performance@postgresql.org
Subject: Re: [PERFORM] Big array speed issues

On 6/20/06, Merkel Marcel (CR/AEM4) <Marcel.Merkel@de.bosch.com> wrote:

> I use libpqxx to access the database. This might be another bottleneck, but
> I assume my query and table setup is the bigger bottleneck. Would it make
> sense to fetch the whole array ? (Select map from table where ... and parse
> the array manually)

have you tried a similar approach without using arrays?

Merlin


Not yet. I would first like to know what the time-consuming part is and
what a workaround would be. If you are sure that individual columns for
every entry of the array solve the issue, I will joyfully implement it.
The downside of this approach is that the array dimensions are not always
the same in my scenario, but I have a workaround in mind for that.

Cheers

Marcel




Re: Big array speed issues

From
"Jim C. Nasby"
Date:
On Wed, Jun 21, 2006 at 09:29:03AM +0200, Merkel Marcel (CR/AEM4) wrote:
>
>
> From: Merlin Moncure [mailto:mmoncure@gmail.com]
> To: Merkel Marcel (CR/AEM4)
> Cc: pgsql-performance@postgresql.org
> Subject: Re: [PERFORM] Big array speed issues
>
> On 6/20/06, Merkel Marcel (CR/AEM4) <Marcel.Merkel@de.bosch.com> wrote:
>
> > I use libpqxx to access the database. This might be another bottleneck, but
> > I assume my query and table setup is the bigger bottleneck. Would it make
> > sense to fetch the whole array ? (Select map from table where ... and parse
> > the array manually)
>
> have you tried a similar approach without using arrays?
>
> Merlin
>
>
> Not yet. I would first like to know what the time-consuming part is and
> what a workaround would be. If you are sure that individual columns for
> every entry of the array solve the issue, I will joyfully implement it.
> The downside of this approach is that the array dimensions are not always
> the same in my scenario, but I have a workaround in mind for that.

Before mucking about with the code, I'd absolutely try 8.1. I've
generally seen it double the performance of 7.4.

Also, output from EXPLAIN ANALYZE would make it a lot easier to figure
out what the issue is, and it would be good to try this without
selecting any of the arrays.
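A sketch of what those two timing runs might look like (table name and timestamp bounds are placeholders, not from the original post):

```sql
-- Timing with the array elements selected:
EXPLAIN ANALYZE
SELECT map[1][1], map[34][28]
FROM my_table                 -- hypothetical table name
WHERE timestamp > '2006-06-01' AND timestamp < '2006-06-02'
ORDER BY timestamp;

-- The same range scan without touching the array,
-- to isolate the cost of decoding/transferring the arrays:
EXPLAIN ANALYZE
SELECT id
FROM my_table
WHERE timestamp > '2006-06-01' AND timestamp < '2006-06-02'
ORDER BY timestamp;
```

Comparing the two plans and runtimes shows whether the time goes into the index scan itself or into handling the array values.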
--
Jim C. Nasby, Sr. Engineering Consultant      jnasby@pervasive.com
Pervasive Software      http://pervasive.com    work: 512-231-6117
vcard: http://jim.nasby.net/pervasive.vcf       cell: 512-569-9461

Re: Big array speed issues

From
"Merlin Moncure"
Date:
> Not yet. I would first like to know what the time-consuming part is and
> what a workaround would be. If you are sure that individual columns for
> every entry of the array solve the issue, I will joyfully implement it.
> The downside of this approach is that the array dimensions are not always
> the same in my scenario, but I have a workaround in mind for that.

The first thing I would try would be to completely normalize the table, e.g.

create table data
(
  id int,
  t timestamp,
  map_x int,
  map_y int,
  value float
);

and go with the denormalized approach only if this doesn't work for some reason.
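Against such a normalized table, the original range query could be rewritten along these lines (a sketch; the index name and the ordering columns are assumptions, and one cell of the old array becomes one row here):

```sql
-- Index to support the timestamp range scan:
CREATE INDEX data_t_idx ON data (t);

-- All map cells for the time window, one (x, y, value) row per cell:
SELECT t, map_x, map_y, value
FROM data
WHERE t > x AND t < y
ORDER BY t, map_x, map_y;

-- A single column of the map is now just a filter:
SELECT t, map_x, value
FROM data
WHERE t > x AND t < y AND map_y = 1
ORDER BY t, map_x;
```

The trade-off is row count: 5000 timestamps times 34x28 cells is about 4.8 million rows per window, so this pays off mainly when queries touch a subset of the cells.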

merlin