Yes, a limit should speed up your query from the sounds of things.
It is a table with 10 million rows, with a primary key, but the conditions used in the SELECT are not part of the PK.
My table is like this:
Column 1 – PK
Column 2 – Indexed
Column 3 – Indexed
Column 2 and Column 3 are indexed together in the same composite index, but they are not marked as unique or as the PK
The Query is like this:
SELECT column1 FROM myTable WHERE Column2 BETWEEN X AND Y
I am expecting just one record
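One way to check what the database actually does with a query like this is to look at its plan. Here is a minimal sketch using Python's built-in sqlite3 (not Postgres, and with made-up table and index names mirroring the description above), just to illustrate the technique of confirming that the composite index is used; on Postgres the equivalent is `EXPLAIN` / `EXPLAIN ANALYZE`:

```python
import sqlite3

# Hypothetical stand-in for the table described above:
# column1 is the PK, (column2, column3) share one non-unique index.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE myTable (column1 INTEGER PRIMARY KEY, column2 INTEGER, column3 INTEGER)"
)
cur.execute("CREATE INDEX idx_c2_c3 ON myTable (column2, column3)")
cur.executemany(
    "INSERT INTO myTable (column2, column3) VALUES (?, ?)",
    [(i, i * 2) for i in range(1000)],
)

# Ask the planner how it will run the BETWEEN query with LIMIT 1
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT column1 FROM myTable WHERE column2 BETWEEN 10 AND 20 LIMIT 1"
).fetchall()
print(plan)  # the plan text should mention idx_c2_c3 (a range search on the index)
```

If the plan shows a search on the index rather than a full table scan, the query is already cheap, and LIMIT 1 only saves the work of continuing past the first match.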
This really depends on the type of query you’re talking about. If there’s only one row in the table you’re querying, then no, I don’t think it’ll change anything. If you’re querying a single row using a primary key, it shouldn’t change anything either. If you’re doing an aggregate query, say a SUM over a bunch of rows, it also won’t improve performance, because the LIMIT applies after the aggregate has been computed over all the rows.
If you’re doing a query on a table with multiple rows and not filtering by a primary key or other unique index then yes, it will improve the query.
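The two cases above can be seen side by side in a small sketch (using Python's sqlite3 with an invented table, purely for illustration): an aggregate ignores LIMIT 1, while a non-unique filter with LIMIT 1 can stop at the first match.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v INTEGER)")
cur.executemany("INSERT INTO t (v) VALUES (?)", [(i,) for i in range(100)])

# Aggregate case: LIMIT 1 changes nothing -- the SUM still reads every row,
# and the LIMIT is applied to the (single-row) aggregate result afterwards.
total = cur.execute("SELECT SUM(v) FROM t LIMIT 1").fetchone()[0]
print(total)  # 4950, the sum over all 100 rows

# Non-unique filter case: many rows satisfy v >= 50, but LIMIT 1 lets the
# engine return as soon as it finds the first one instead of collecting all.
row = cur.execute("SELECT id FROM t WHERE v >= 50 LIMIT 1").fetchone()
print(row)
```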
Hi
I like to write efficient SQL queries, so my question is: if I am expecting no more than one row from a SELECT, could using LIMIT 1 improve performance?
My reasoning is that the LIMIT 1 clause tells Postgres to stop searching once it has found one record, but maybe it is unnecessary.
Thanks
Anibal