Thread: index compatible date_trunc in postgres?


From: Bryce Nesbitt
I've got a legacy app that does 8.3 incompatible date searches like so:
explain select count(*) from contexts where publication_date like '2006%';
explain select count(*) from contexts where publication_date like
'2006-09%';

I've got my choice of refactoring, but all these share the same
sequential scan limitation:
explain select count(*) from contexts where publication_date::text LIKE
'2006%';
explain select count(*) from contexts where
date_trunc('year',publication_date) = '2006-01-01';
explain select count(*) from contexts where extract('year' from
publication_date) = '2006';

Are there any other index compatible methods, other than turning it into
a range search?
explain select count(*) from contexts where publication_date >=
'2006-01-01' and publication_date < '2007-01-01';
explain select count(*) from contexts where publication_date >=
'2006-09-01' and publication_date < '2006-10-01';



Re: index compatible date_trunc in postgres?

From: "Scott Marlowe"
On Thu, Dec 18, 2008 at 10:46 PM, Bryce Nesbitt <bryce2@obviously.com> wrote:
> I've got a legacy app that does 8.3 incompatible date searches like so:
> explain select count(*) from contexts where publication_date like '2006%';
> explain select count(*) from contexts where publication_date like
> '2006-09%';
>
> I've got my choice of refactoring, but all these share the same
> sequential scan limitation:
> explain select count(*) from contexts where publication_date::text LIKE
> '2006%';
> explain select count(*) from contexts where
> date_trunc('year',publication_date) = '2006-01-01';
> explain select count(*) from contexts where extract('year' from
> publication_date) = '2006';
>
> Are there any other index compatible methods, other than turning it into
> a range search?
> explain select count(*) from contexts where publication_date >=
> '2006-01-01' and publication_date < '2007-01-01';
> explain select count(*) from contexts where publication_date >=
> '2006-09-01' and publication_date < '2006-10-01';

You can create an index on date_trunc (on timestamp without timezone,
but not on timestamp with timezone since it's not immutable)

create index mytable_datetrunc_month on mytable (date_trunc('month',
timestampfield));
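A query can use that index only if its WHERE clause repeats the indexed expression exactly. A sketch against the hypothetical table above:

```sql
-- Expression index on the truncated timestamp
-- (timestamp without time zone only, since date_trunc on
-- timestamptz is not immutable):
create index mytable_datetrunc_month
    on mytable (date_trunc('month', timestampfield));

-- The predicate must match the indexed expression for the
-- planner to consider an index scan:
select count(*) from mytable
where date_trunc('month', timestampfield) = timestamp '2006-09-01';
```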


Re: index compatible date_trunc in postgres?

From: Alvaro Herrera
Scott Marlowe escribió:
> On Thu, Dec 18, 2008 at 10:46 PM, Bryce Nesbitt <bryce2@obviously.com> wrote:
> > I've got a legacy app that does 8.3 incompatible date searches like so:
> > explain select count(*) from contexts where publication_date like '2006%';
> > explain select count(*) from contexts where publication_date like
> > '2006-09%';

> You can create an index on date_trunc (on timestamp without timezone,
> but not on timestamp with timezone since it's not immutable)

But you can create an index on the timestamptz AT TIME ZONE 'GMT', I
think.
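A sketch of that approach (index name hypothetical): converting the timestamptz to a plain timestamp with AT TIME ZONE makes the expression immutable, so it can be indexed:

```sql
-- AT TIME ZONE 'GMT' yields timestamp without time zone,
-- which date_trunc can handle immutably:
create index contexts_pubdate_month_gmt
    on contexts (date_trunc('month', publication_date at time zone 'GMT'));

-- Queries must repeat the same expression to use the index:
select count(*) from contexts
where date_trunc('month', publication_date at time zone 'GMT')
    = timestamp '2006-09-01';
```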

-- 
Alvaro Herrera                                http://www.CommandPrompt.com/
The PostgreSQL Company - Command Prompt, Inc.


Re: index compatible date_trunc in postgres?

From: Tom Lane
Bryce Nesbitt <bryce2@obviously.com> writes:
> I've got a legacy app that does 8.3 incompatible date searches like so:
> explain select count(*) from contexts where publication_date like '2006%';
> explain select count(*) from contexts where publication_date like
> '2006-09%';

> I've got my choice of refactoring, but all these share the same
> sequential scan limitation:

Why are you worrying?  The old method surely didn't get indexed either.

(At least, it didn't unless you had an index on publication_date::text,
in which case just throwing in the cast will produce the same results
in 8.3.)
        regards, tom lane


Downgrade database and problem with sequences

From: "Sebastian Rychter"
I am currently using PostgreSQL 8.3, and whenever I want to get the last
inserted ID I use:
Select currval(pg_get_serial_sequence('table', 'field'))

Then, I had to downgrade a database for a certain customer to PostgreSQL
8.1, and when I execute that same function I found that
pg_get_serial_sequence returns nothing.

Is there any way to dump my database and then restore it on a previous
version and keep using that pg_get_serial_sequence() function?

Thanks,

Sebastian



Re: Downgrade database and problem with sequences

From: Tom Lane
"Sebastian Rychter" <srychter@anvet.com.ar> writes:
> I am currently using postgresql 8.3 and whenever I want to get the last
> inserted ID use :
> Select currval(pg_get_serial_sequence(table, field))

> Then, I had to downgrade a database for certain customer to use postgresql
> 8.1, and when I execute that same function I found that
> pg_get_serial_sequence returns nothing.

Probably because the column wasn't actually declared SERIAL to 8.1.
I imagine you ignored the failure of the ALTER SEQUENCE OWNED BY
command in the 8.3 dump, but that is exactly what the missing piece
is here.  8.1 did not have an explicit OWNED BY command, and the only
way to make the right dependency magic happen was to use the SERIAL
keyword while creating the table.
        regards, tom lane
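A sketch of what that looks like on 8.1 (table and column names hypothetical): the SERIAL keyword creates the sequence with the ownership dependency that pg_get_serial_sequence relies on:

```sql
-- Declaring the column SERIAL implicitly creates
-- orders_order_id_seq and marks it as owned by the column:
create table orders (
    order_id serial primary key,
    note text
);

insert into orders (note) values ('first');

-- Works on 8.1 only because the column was declared SERIAL,
-- since 8.1 has no ALTER SEQUENCE ... OWNED BY:
select currval(pg_get_serial_sequence('orders', 'order_id'));
```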


Re: index compatible date_trunc in postgres?

From: Bryce Nesbitt
Tom Lane wrote:
> Why are you worrying?  The old method surely didn't get indexed either.

Continuous improvement?

Since there already IS an index available, I figure I might as well use 
it, especially since this DB had real performance issues.  The table 
itself is medium sized in our world, at 647596 rows.


Scott Marlowe wrote:
> You can create an index on date_trunc (on timestamp without timezone,
> but not on timestamp with timezone since it's not immutable)
That could work...
create index brycetest5 on contexts (date_trunc('month',publication_date));
(13 MB index, same size as the index on the non-truncated date).

Unfortunately, this one complains about immutability, since to_char is not an immutable function:
create index brycetest5 on contexts (to_char(publication_date,'YYYY-MM'));

But in this case I think I'll rewrite the query into a range, and stick 
with the present index. Thanks for your help.
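The range rewrite described above, using the table and column from the thread, would presumably look like this (note the upper bound is the first instant of October, avoiding an invalid September 31st):

```sql
-- Sargable range predicates: a plain btree index on
-- publication_date can satisfy both bounds directly.
select count(*) from contexts
where publication_date >= '2006-09-01'
  and publication_date <  '2006-10-01';
```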