I'm trying to reliably determine whether Use Declare/Fetch (abbreviation: B6) is in use for a given DSN, based on the connection string my program sees. Unfortunately the B6 value doesn't appear in the string, but it looks like I could determine whether Use Declare/Fetch is active if I could properly decode the CX= piece.
Here is an example string returned by my app: DSN=psql-mydsn;DATABASE=mydb;SERVER=localhost;PORT=5432;UID=postgres;PWD=mypassword;CA=d;A6=;A7=100;A8=4096;B0=255;B1=8190;BI=0;C2=dd_;;CX=1b543b8;A1=7.4-1
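In case it helps frame the question, here is roughly how I'm pulling the CX value out of that string (a minimal sketch of my own; the split logic assumes simple key=value pairs separated by ';' and is not taken from the driver source):

```python
# Extract the CX= value from a psqlODBC-style connection string.
# Assumes plain KEY=VALUE pairs separated by ';' (no quoting/escaping).
def get_cx(conn_str):
    for part in conn_str.split(";"):
        if "=" in part:
            key, _, value = part.partition("=")
            if key.strip().upper() == "CX":
                return value.strip()
    return None

s = ("DSN=psql-mydsn;DATABASE=mydb;SERVER=localhost;PORT=5432;"
     "UID=postgres;PWD=mypassword;CA=d;A6=;A7=100;A8=4096;B0=255;"
     "B1=8190;BI=0;C2=dd_;;CX=1b543b8;A1=7.4-1")
print(get_cx(s))  # -> 1b543b8
```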
For the DSN above, I left everything at the defaults and then enabled only Use Declare/Fetch in the system DSN.
Other examples of CX= values:

- CX=1b503ba (all defaults; Use Declare/Fetch is disabled by default)
- CX=1b547ba (all defaults, but CommLog set to record)
- CX=1b40b0 (all check boxes unchecked, except Use Declare/Fetch is checked)

Note that the last example has one less hex character. I'm thinking maybe a leading 0 is dropped, but I'm not sure from which pair (1b, 40, or b0).
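If CX were a single flat bitmask, toggling one checkbox should flip exactly one bit, so XOR-ing each sample against the all-defaults value seemed like a quick way to check. This is just my own experiment on the values above, not anything derived from the driver source:

```python
# XOR each sample against the all-defaults value; the result shows
# exactly which bits differ from the defaults.
defaults = 0x1b503ba   # all defaults, Use Declare/Fetch off
udf_on   = 0x1b543b8   # defaults + Use Declare/Fetch enabled
commlog  = 0x1b547ba   # defaults + CommLog set to record

print(f"{defaults ^ udf_on:x}")   # -> 4002 (two bits differ, not one)
print(f"{defaults ^ commlog:x}")  # -> 4400 (also two bits)
```

Since both toggles flip more than one bit, I suspect CX may not be a single flat bitmask; perhaps it is two hex fields printed back to back, which would also explain why one sample has one fewer digit.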
I looked at the driver source, but it would take me a while to decipher this, so I was hoping somebody might have a hint.