Re: WIP json generation enhancements: fk-tree-to-json() - Mailing list pgsql-hackers
| From | Hannu Krosing |
|---|---|
| Subject | Re: WIP json generation enhancements: fk-tree-to-json() |
| Date | |
| Msg-id | 50B3CC6D.2090100@krosing.net |
| In response to | Re: WIP json generation enhancements (Dimitri Fontaine <dimitri@2ndQuadrant.fr>) |
| List | pgsql-hackers |
On 11/22/2012 11:54 AM, Dimitri Fontaine wrote:
> Andrew Dunstan <andrew@dunslane.net> writes:
>> Here is a WIP patch for enhancements to json generation.
>>
>> First, there is the much_requested json_agg, which will aggregate rows
>> directly to json. So the following will now work:
>>
>> select json_agg(my_table) from mytable;
>> select json_agg(q) from (<myquery here>) q;
> Awesome, thanks!
>
> How do you handle the nesting of the source elements? I would expect a
> variant of the aggregate that takes two input parameters, the datum and
> the current nesting level.
>
> Consider a tree table using parent_id and a recursive query to display
> the tree. You typically handle the nesting with an accumulator and a
> call to repeat() to prepend some spaces before the value columns. What
> about passing that nesting level (integer) to the json_agg()?
That still would not produce nesting, just a nicer indented format.
If you want real nesting, you may want a version of my pl/python function
row-with-all-dependents-by-foreign-key-to-json(),
which outputs a table row and then, recursively, all rows from other tables (or the same table)
that have a foreign key relationship to this row.
I use it to back up removed objects.
I would love to have something similar as a built-in function, though
the current version has some limitations and lacks some checks, such as a check for FK loops.
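[A minimal, standalone sketch of what the missing FK-loop check could look like. This is not part of the posted function: it replaces the pg_constraint lookups with a hypothetical in-memory `children` mapping, and carries a set of visited (table, id) pairs so that a cycle stops the recursion instead of running forever:]

```python
def recursive_rowdict(table_name, row, children, seen=None):
    """Build a nested dict, refusing to revisit a (table, id) pair.

    `children` maps (table, id) -> list of (child_table, child_row) pairs;
    it stands in for the real function's foreign-key lookups.
    """
    seen = set() if seen is None else seen
    key = (table_name, row['id'])
    if key in seen:
        # FK loop detected: emit a marker instead of recursing forever
        return {'__loop__': '%s:%s' % key}
    seen.add(key)
    rd = {'__row_class__': table_name}
    rd.update((k, v) for k, v in row.items() if v is not None)  # skip NULLs
    refs = [recursive_rowdict(t, r, children, seen)
            for t, r in children.get(key, [])]
    if refs:
        rd['__refs__'] = refs
    return rd

# Two rows referencing each other -- a foreign-key loop:
children = {
    ('t', 1): [('t', {'id': 2, 'data': 'b'})],
    ('t', 2): [('t', {'id': 1, 'data': 'a'})],
}
tree = recursive_rowdict('t', {'id': 1, 'data': 'a'}, children)
```
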
Function follows:
-------------------------------------------------------
CREATE OR REPLACE FUNCTION record_to_json_with_detail(table_name text,
pk_value int) RETURNS text AS $$
import json, re

def fk_info(table_name):
    fkplan = plpy.prepare("""
        SELECT conrelid::regclass as reftable,
               pg_get_constraintdef(c.oid) as condef
          FROM pg_constraint c
         WHERE c.confrelid::regclass = $1::regclass
           AND c.contype = 'f'
    """, ("text",))
    cdefrx = re.compile('FOREIGN KEY [(](.*)[)] REFERENCES .*[(](.*)[)].*')
    fkres = plpy.execute(fkplan, (table_name,))
    for row in fkres:
        reffields, thisfields = cdefrx.match(row['condef']).groups()
        yield thisfields, row['reftable'], reffields

def select_from_table_by_col(table_name, col_name, col_value):
    qplan = plpy.prepare('select * from %s where %s = $1' % (table_name, col_name), ('int',))
    return plpy.execute(qplan, (col_value,))

def recursive_rowdict(table_name, row_dict):
    rd = dict([(a, b) for (a, b) in row_dict.items() if b is not None])  # skip NULLs
    rd['__row_class__'] = table_name
    for id_col, ref_tab, ref_col in fk_info(table_name):
        q2res = select_from_table_by_col(ref_tab, ref_col, row_dict[id_col])
        if q2res:
            try:
                rd['__refs__::' + id_col] += [recursive_rowdict(ref_tab, row) for row in q2res]
            except KeyError:
                rd['__refs__::' + id_col] = [recursive_rowdict(ref_tab, row) for row in q2res]
    return rd

q1res = select_from_table_by_col(table_name, 'id', pk_value)
return json.dumps(recursive_rowdict(table_name, q1res[0]), indent=3)
$$ LANGUAGE plpythonu;
create table test1(id serial primary key, selfkey int references test1,
data text);
create table test2(id serial primary key, test1key int references test1,
data text);
insert into test1 values(1,null,'top');
insert into test1 values(2,1,'lvl1');
insert into test2 values(1,1,'lvl1-2');
insert into test2 values(2,2,'lvl2-2');
select record_to_json_with_detail('test1',1);

             record_to_json_with_detail
-------------------------------------------
 {
    "__row_class__": "test1",
    "data": "top",
    "id": 1,
    "__refs__::id": [
       {
          "__row_class__": "test1",
          "selfkey": 1,
          "data": "lvl1",
          "id": 2,
          "__refs__::id": [
             {
                "__row_class__": "test2",
                "test1key": 2,
                "data": "lvl2-2",
                "id": 2
             }
          ]
       },
       {
          "__row_class__": "test2",
          "test1key": 1,
          "data": "lvl1-2",
          "id": 1
       }
    ]
 }
(1 row)

Time: 6.576 ms
---------------------------------------
Hannu Krosing