Hi, this is a general RDBMS question, not specific to pg. It occurred to
me while I was trying to design an interface between an application and
SQL.
Suppose that the user fills in a complex query form, and you are coding
the application that converts the user's input to a where clause. It can
prove beneficial to construct a tree-like structure like this:
AND
|
+-OR
| |
| + Condition A
| |
| + Condition B
|
+-OR
  |
  + Condition C
  |
  +-AND
    |
    + Condition D
    |
    + Condition E
    |
    + Condition F
This would become
WHERE (A OR B) AND (C OR (D AND E AND F))
It seems complex at first, but the code will be cleaner, will scale
better, and will be easier to port if you add nodes and leaves to a tree
while scanning the user's input than if you try to construct a where
clause directly. Once the tree is finished, it is straightforward to
convert it to a where clause, after which you send the SQL to the RDBMS.
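For what it's worth, here is a minimal sketch of the kind of thing I
mean, in Python. The And/Or/Condition classes and the to_sql method are
just names I made up for illustration, and each condition is left as an
opaque, already-rendered string:

class Condition:
    def __init__(self, sql):
        self.sql = sql            # e.g. "price > 100", already rendered

    def to_sql(self):
        return self.sql


class Node:
    op = None                     # "AND" or "OR", set by the subclasses

    def __init__(self, *children):
        self.children = list(children)

    def add(self, child):
        # Grow the tree while scanning the user's input.
        self.children.append(child)
        return child

    def to_sql(self):
        # Parenthesize every composite node so nesting stays unambiguous.
        joiner = " %s " % self.op
        return "(" + joiner.join(c.to_sql() for c in self.children) + ")"


class And(Node):
    op = "AND"


class Or(Node):
    op = "OR"


# The tree from the figure above:
tree = And(
    Or(Condition("A"), Condition("B")),
    Or(Condition("C"),
       And(Condition("D"), Condition("E"), Condition("F"))))

print("WHERE " + tree.to_sql())
# -> WHERE ((A OR B) AND (C OR (D AND E AND F)))

The outermost pair of parentheses is redundant but harmless; a real
implementation would also need to quote or bind the values inside each
condition rather than splicing strings together.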
What will the RDBMS do next? It will parse your SQL statement and
presumably convert it to a tree of conditions. Well, I had that ready in
the first place!
Whether my idea about the tree is good or not, it is true that the
application initially has its data in some data structures suitable for
computers rather than humans; it converts them to SQL, which is suitable
for humans, only so that the SQL will be converted back to structures
suitable for computers. The most obvious example is that integers are
converted to decimal by the application only to be converted back to
binary by the RDBMS.
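To make the integer example concrete (the table and column names here
are made up):

n = 42                                          # binary in the app's memory

sql = "SELECT * FROM items WHERE qty = %d" % n  # rendered as decimal text

# The RDBMS must then parse "42" back into a binary integer before the
# executor can compare it against the qty column.

As far as I know, drivers that bind parameters over a binary wire
protocol avoid this particular conversion, but the statement text itself
still makes the same round trip.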
I understand that SQL is the interface between applications and RDBMSs
because of history, not because it is correct design. Could you point me
to a link, book, or paper that deals with this paradox? Thanks!