Storing the raw XML text has problems - you have to parse it every time
you want something out of it, and that has to cause a huge performance hit.
I use XML a lot for all sorts of purposes, but it is appropriate for
data transfer rather than data storage, IMNSHO.
cheers
andrew
Christopher Kings-Lynne wrote:
> Do this:
>
> 1. Create a new type called 'xml', based on text.
>
> 2. The xmlin function for that type will validate that what you are
> entering is XML
>
> 3. Create new functions to implement XPath, SAX, etc. on the xml type.
>
> 4. Extend the contrib/ltree gist-based tree indexing scheme to work on
> xml and hence the operations in no.3 above are really fast...
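The validation in step 2 could be sketched roughly like this - a Python
stand-in for what would really be a C input function in the backend, so
the name xmlin and its behaviour here are just illustrative:

```python
# Hypothetical stand-in for an xmlin input function: accept the value
# only if it parses as well-formed XML, otherwise reject it.
import xml.etree.ElementTree as ET

def xmlin(value: str) -> str:
    try:
        ET.fromstring(value)   # raises ParseError on malformed input
    except ET.ParseError as err:
        raise ValueError(f"invalid XML for type xml: {err}")
    return value               # stored as plain text, like a text column

xmlin("<doc><a>1</a></doc>")   # accepted; xmlin("<doc>") would raise
```

The real thing would presumably use a proper validating parser rather
than just a well-formedness check.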
>
> Chris
>
>
> Andrew Dunstan wrote:
>
>> Christopher Browne wrote:
>>
>>> But I think back to the XML generator I wrote for GnuCash; it has the
>>> notion of building up a hierarchy of entities and attributes, each of
>>> which is visible as an identifiable object of some sort. Mapping that
>>> onto a set of PostgreSQL relations wouldn't work terribly well.
>>>
>>>
>>
>> *nod* I have tried this several times - it just doesn't work well,
>> because the two data models are too different.
>>
>> You could do something like this:
>> . a table for each element type, fields being the attributes, plus
>> the node id.
>> . a table to tie everything together (parent_id, child_id,
>> child_order, child_type).
>> In theory you could even generate the DB schema from an XML schema
>> and evaluate it with XPath-like expressions.
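That two-table scheme can be sketched like so - sqlite3 standing in for
PostgreSQL, and all table and column names invented to match the
description above, not taken from any real schema:

```python
# Sketch of the element-table-plus-link-table mapping described above:
# a table per element type holds its attributes keyed by node id, and a
# single link table records parent/child edges with child order.
import sqlite3
import xml.etree.ElementTree as ET
from itertools import count

conn = sqlite3.connect(":memory:")
# For brevity every element lands in one 'item' table here; the idea is
# one such table per element type, with a column per attribute.
conn.execute("CREATE TABLE item (node_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE link
                (parent_id INTEGER, child_id INTEGER,
                 child_order INTEGER, child_type TEXT)""")

ids = count(1)

def store(elem, parent_id=None, order=0):
    node_id = next(ids)
    conn.execute("INSERT INTO item VALUES (?, ?)",
                 (node_id, elem.get("name")))
    if parent_id is not None:
        conn.execute("INSERT INTO link VALUES (?, ?, ?, ?)",
                     (parent_id, node_id, order, elem.tag))
    for i, child in enumerate(elem):
        store(child, node_id, i)
    return node_id

root = ET.fromstring('<acct name="root"><acct name="cash"/>'
                     '<acct name="bank"/></acct>')
store(root)
rows = conn.execute(
    "SELECT child_order, child_type FROM link "
    "ORDER BY child_order").fetchall()
# rows -> [(0, 'acct'), (1, 'acct')]: the two children of the root,
# in document order
```

Reassembling a document means walking the link table recursively, which
is exactly the kind of query relational databases are slow and awkward
at - hence the "why put yourself to such bother" below.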
>>
>> But why put yourself to such bother? I have never found a good reason
>> to do this sort of thing.
>>
>> cheers
>>
>> andrew
>>
>>
>> ---------------------------(end of broadcast)---------------------------
>> TIP 7: don't forget to increase your free space map settings
>