On 6 Aug 2007, at 21:47, Heikki Linnakangas wrote:
> Henrik Zagerholm wrote:
>> I know the query retrieves way more than is really necessary to show
>> to the user, so I would gladly come up with a way to limit the query
>> so the GUI doesn't hang for several minutes if a user does a bad search.
>> The problem is that I don't know a good way of limiting the search
>> efficiently, as putting a limit of 100 on tbl_file alone could make the
>> query return only 10 rows if the user doesn't have access to 900 of
>> the files (this is what the join with tbl_acl does). Using cursors
>> doesn't help because I don't really retrieve that much data.
>
> Could you just add a LIMIT 100 to the end of the query, if 100 rows is
> enough? That would cut the runtime of the query, if there's a quicker
> plan to retrieve just those 100 rows.
As you can see in the query, I already have a LIMIT 20 and it doesn't
make any difference, as the query still does the big joins between
tbl_file, tbl_file_structure and tbl_acl.
This is why I think I have to come up with a way of using sub-selects
with internal limits, maybe with a cursor-like procedure on top of them
so I always get the correct number of rows back.
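Roughly what I have in mind is something like this (the column names and
the search condition below are made up for the example, since the real
query is much bigger, so treat it as a sketch only):

    SELECT f.pk_file_id, f.filename
    FROM (
        -- inner limit on tbl_file alone; it has to be generous because
        -- the ACL join below can throw many of these rows away again
        SELECT pk_file_id, filename
        FROM tbl_file
        WHERE filename ILIKE '%foo%'        -- made-up search condition
        ORDER BY filename
        LIMIT 500
    ) f
    JOIN tbl_file_structure fs ON fs.fk_file_id = f.pk_file_id
    JOIN tbl_acl a ON a.fk_structure_id = fs.pk_structure_id
                  AND a.fk_user_id = 42     -- made-up ACL check
    LIMIT 20;

If the ACL join filters out too much, the outer LIMIT 20 still comes back
short, which is why I'm thinking of a procedure that retries with a bigger
inner limit until it has enough rows.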
>
> Another alternative is to use statement_timeout. If a query takes
> longer than the specified timeout, it's automatically aborted and an
> error is given.
Interesting! I'll take a look at that.
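If I understand it right, I could set it just around the search and reset
it afterwards, something like this (the value is in milliseconds):

    SET statement_timeout = 60000;   -- abort anything running longer than 60 s
    -- run the search query here; it errors out instead of hanging the GUI
    SET statement_timeout = 0;       -- 0 = no timeout for the rest of the session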
Thanks,
henrik
>
> --
> Heikki Linnakangas
> EnterpriseDB http://www.enterprisedb.com