Re: is this a good approach? - Mailing list pgsql-sql

From Michael Moore
Subject Re: is this a good approach?
Date
Msg-id CACpWLjNUH6=KsD8agoacgn6kFHtNvK7pS=5gmdU4w7NYjFG2qQ@mail.gmail.com
In response to Re: is this a good approach?  ("David G. Johnston" <david.g.johnston@gmail.com>)
List pgsql-sql


On Fri, May 13, 2016 at 6:50 PM, David G. Johnston <david.g.johnston@gmail.com> wrote:
On Fri, May 13, 2016 at 5:52 PM, Michael Moore <michaeljmoore@gmail.com> wrote:
I have a function that accepts about 20 input parameters. Based on the values of these parameters, a SELECT statement will be constructed. The SELECT will be from two main tables, but depending on the parameters it might join up to 3 additional tables. The WHERE clause has many possibilities too, referencing various input parameters. In other words, it's a hot mess and only god knows what the SELECT statement will look like by the time all the pieces are lashed together. The only thing that won't vary is the column list. That is fixed.

All of the above is pretty much not up for redesign, as I am duplicating some functionality that exists in Oracle. One of the goals of my design is to minimize the number of distinct statements that must be parsed and planned (in Oracle terms, distinct cursors). To that end I will be using the USING clause. For example:

sql_select :=
  'SELECT vendor_key::text rslt from tx_vendor where vendor_key = any ($1)';
return query execute sql_select using v_blurb_type_codes;

The problem is, I will have a variable number of USING arguments, with differing datatypes, depending on how the SELECT statement gets built.

So, my brilliant idea is to load the needed values into a JSONB where the key to each of the jsonb values is a sequential number. It would look something like:
{"1":"good","2":"1234,4321,7787","3":"overtime"}
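For what it's worth, an object like that could be assembled with jsonb_build_object (a sketch; the keys and values are just the ones from the example above):

```sql
-- Sketch: pack the positional parameters into a numbered jsonb object.
-- jsonb_build_object takes alternating keys and values.
SELECT jsonb_build_object(
  '1', 'good',
  '2', '1234,4321,7787',
  '3', 'overtime'
);
-- {"1": "good", "2": "1234,4321,7787", "3": "overtime"}
```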

Then, when I go to execute the query it will look like
case countOfParmsUsedInSQL
  when 1 then
    return query execute sql_select using json_vars::jsonb->>'1';
  when 2 then
    return query execute sql_select using json_vars::jsonb->>'1', json_vars::jsonb->>'2';
  when 3 then
    return query execute sql_select using json_vars::jsonb->>'1', json_vars::jsonb->>'2', json_vars::jsonb->>'3';
end case;

So, my questions are:
1) Will this work? I am concerned that I will have trouble with the datatypes in jsonb. For example, sometimes parameter 2 might be a number and other times it might be a bigint[] (bigint[] would be used in cases like $1 in my first example above).

I know you said ignore this, but it should be at least technically possible (though I've never tried it myself) - though maybe hstore would be a better serialization mechanism. You'd have to ensure that the values are all valid text/literal inputs for the specified parameters, which themselves would have casts on them. So key 2 would have to be '{1234,4321,7787}', which is then cast with ::bigint[].

The problem here is that JSON does not retain the datatype of the variable you store, hence (as you mentioned) you have to cast it as you retrieve the value. Sometimes you might need to cast, say, the VALUE of the 3rd KEY to bigint and other times it might have to be cast to text[]. So at this point I said to myself, "Why not also store the DATATYPE's name along with the VALUE?" Then the problem becomes: how do I cast a VALUE to a DATATYPE whose name is in a variable? [insert time travel paradox headache here]

So my latest thought is to retain the JSON idea but also write a series of functions, each with the simple job of casting to a specific datatype, e.g. funCastToTextArray() returns text[]; funCastToBigInt() returns bigint; funCastToTimestamp() returns timestamp.
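One of those helpers might look like this (a sketch; only funCastToBigInt is shown, and the body and signature are assumed):

```sql
-- Sketch of one cast helper: takes the text value pulled out of the jsonb
-- and returns it as the concrete type the query expects.
CREATE OR REPLACE FUNCTION funCastToBigInt(p_val text)
RETURNS bigint
LANGUAGE sql
IMMUTABLE
AS $$ SELECT p_val::bigint $$;

-- funCastToTextArray and funCastToTimestamp would follow the same pattern,
-- returning text[] and timestamp respectively.
```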

Then, as I am building my SQL string, do something like:

'where myvar = any ( funCastToTextArray($1))  and bla bla bla'

So now everything I store in my JSON values column will be a string, aka text.
Now the aforementioned CASE statement should work fine. But what will that do to the execution plan? I guess I'll have to find out.

2) Is there a better way to avoid string concatenation of parameters, which will result in thousands of hard parses? Oracle has dynamic binding variables to help solve this problem.
I only mention this in case I have not explained adequately the nature of the problem I am trying to solve.


Custom GUC session globals?

set_config('prefix.name', 'value', false)  -- third argument: true = transaction-local, false = session-wide
current_setting('prefix.name') --> value

This becomes nicer in 9.6 with the ability to supply a default for a missing variable on the current_setting call.
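Put together, a session-global round trip might look like this (a sketch; myapp.vendor_keys is a made-up setting name):

```sql
-- Store a value for the rest of the session
-- (third argument false = session scope rather than transaction-local).
SELECT set_config('myapp.vendor_keys', '{1234,4321,7787}', false);

-- Read it back, casting to the type the query needs.
SELECT current_setting('myapp.vendor_keys')::bigint[];

-- 9.6+: pass true as the second argument to get NULL
-- instead of an error when the setting has never been set.
SELECT current_setting('myapp.unset_name', true);
```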

A quick look at "context" in the Oracle link suggests that is our closest equivalent.

In more recent versions you will get an error if the variable prefix.name does not exist, so you either need to dynamically ensure it is not requested when not needed, or structure the code so that default values are set up first and then overridden by user-defined values.

​David J.

Cool, I'll experiment with GUC session globals a bit.
Thanks David,
Mike
