Re: array support patch phase 1 patch - Mailing list pgsql-patches
From | Joe Conway |
---|---|
Subject | Re: array support patch phase 1 patch |
Date | |
Msg-id | 3E80EC54.2040203@joeconway.com |
In response to | Re: array support patch phase 1 patch (Tom Lane <tgl@sss.pgh.pa.us>) |
Responses | Re: array support patch phase 1 patch |
List | pgsql-patches |
Tom Lane wrote:
> I looked over this patch a little bit, mostly at the anyarray/anyelement
> changes; I didn't study the ArrayExpr stuff (except to notice that yeah,
> the grammar change is very ugly; can't it be made independent of the
> number of levels?).

I struggled with it for quite a while, but then again, I can't claim yacc
grammar as a particularly strong point. I kept running into reduce/reduce
errors with anything completely recursive, or when it did work, I found
myself restricted to one level of nesting. I'll try again -- any
suggestions for something similar to study?

> parameter types could be made visible to the function. I see your concern
> here, but I think the only adequate solution is to make sure the function
> can learn the types of all its arguments, as well as the assigned return
> type. (Of course it could deduce the latter from the former, but we may
> as well save it the effort, especially since it would have to repeat
> parse-time catalog lookups in many interesting cases.)

I came to the same conclusion with ArrayExpr/ArrayExprState -- you can
always deduce the array type from the element type and vice-versa, but it
seemed costly to do in the executor when it can be done just once in the
parser.

> A notion that I've toyed with more than once is to allow a function
> to get at the parse tree representation for its call (ie, the FuncExpr
> or OpExpr node). If we did that, a function could learn all these
> types by examining the parse tree, and it could learn other things
> besides. A disadvantage is that functions would have to be ready to
> cope with all the wooliness of parse trees. It'd probably be bright
> to make some convenience functions "get my return type", "get type
> of my n'th argument" to localize this logic as much as possible.

OK -- I'll take a look at that option.

> In short then, what about passing an Expr* link in FmgrInfo as a
> substitute for actualRetType? (An additional advantage of this way
> is that it's obvious whether or not the information has been provided,
> which it would not be for, say, functions invoked via DirectFunctionCallN.
> The patch as it stands fails silently if a polymorphic function is called
> from a call site that doesn't provide the needed info.)

Sounds good to me -- thanks for the quick feedback!

A question -- I was thinking that we will need to allow one array type to
be coerced into another in this brave new world (e.g.
ARRAY[1,2,3]::float8[] results in "ERROR: Cannot cast type integer[] to
double precision[]"). My plan was to check the source and target datatypes
in can_coerce_type() and coerce_type() to see if both are array types. If
so, use the element types to determine whether we can coerce, and loop
over the array to coerce, etc.

There are a few issues that I can think of with this (and undoubtedly you
will have some others ;-)):

- this would add at least one cache lookup to every call to can_coerce_type
- if we pass by can_coerce_type with an array, we'd need to do the same
  cache lookup in coerce_type unless we modify the api of the related
  functions to pass along the "isarray" status of the datatype
- currently the only reliable(?) method to determine if a datatype is a
  varlena array is to check for '_' as the first character of the type
  name -- it seems wrong to depend on that forever. Any suggestions? I was
  toying with the idea that an isarray attribute should be added to
  pg_type -- thoughts on that?

Joe
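[Editor's sketch] Regarding the grammar question above, one shape that might avoid the reduce/reduce conflicts is to let nested array levels reduce through their own list production instead of the general expression list, so the nesting depth is unbounded but never competes with the scalar case. This is only an illustrative yacc fragment with hypothetical production and action names, not the real gram.y rules:

```
/*
 * Illustrative sketch only -- production names and actions are
 * hypothetical.  The innermost level reduces through the ordinary
 * expr_list; any outer level reduces through array_expr_list, so the
 * recursion handles arbitrary nesting without a reduce/reduce conflict
 * against the scalar case.
 */
array_value:
		  ARRAY array_expr					{ $$ = $2; }
		;

array_expr:
		  '[' expr_list ']'					{ $$ = makeArrayNode($2); }
		| '[' array_expr_list ']'			{ $$ = makeArrayNode($2); }
		;

array_expr_list:
		  array_expr						{ $$ = makeList1($1); }
		| array_expr_list ',' array_expr	{ $$ = lappend($1, $3); }
		;
```

The reason this can work is that an ordinary a_expr cannot begin with '[', so a single token of lookahead is enough to tell the nested-array form apart from the scalar element list.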
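[Editor's sketch] On the FmgrInfo suggestion, the convenience functions mentioned above ("get my return type", "get type of my n'th argument") could be thin wrappers over the stored expression node. A minimal C sketch, assuming the proposed Expr* link lives in a FmgrInfo field here called fn_expr; the field and function names are assumptions for illustration, not existing fmgr API:

```c
#include "postgres.h"
#include "fmgr.h"
#include "nodes/nodeFuncs.h"	/* exprType() */
#include "nodes/pg_list.h"
#include "nodes/primnodes.h"

/*
 * Sketch: declared result type of the call, or InvalidOid if the call
 * site did not supply the parse-tree link (e.g. DirectFunctionCallN).
 */
static Oid
get_call_result_type_sketch(FmgrInfo *flinfo)
{
	if (flinfo == NULL || flinfo->fn_expr == NULL)
		return InvalidOid;
	return exprType((Node *) flinfo->fn_expr);
}

/*
 * Sketch: type of the n'th (zero-based) actual argument, or InvalidOid
 * if the information is unavailable.
 */
static Oid
get_call_arg_type_sketch(FmgrInfo *flinfo, int argnum)
{
	Node	   *expr;
	List	   *args;

	if (flinfo == NULL || flinfo->fn_expr == NULL)
		return InvalidOid;
	expr = (Node *) flinfo->fn_expr;

	if (IsA(expr, FuncExpr))
		args = ((FuncExpr *) expr)->args;
	else if (IsA(expr, OpExpr))
		args = ((OpExpr *) expr)->args;
	else
		return InvalidOid;		/* not a call form we know how to inspect */

	if (argnum < 0 || argnum >= list_length(args))
		return InvalidOid;
	return exprType((Node *) list_nth(args, argnum));
}
```

With helpers of this shape, a polymorphic function could resolve its anyarray/anyelement types at call time instead of depending on a separately supplied actualRetType, and a missing link shows up explicitly as InvalidOid rather than failing silently.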
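[Editor's sketch] On the coercion question, the per-call lookup could be confined to one place and the '_' naming convention avoided by asking pg_type whether a type is a varlena array (typlen == -1 with a nonzero typelem). A rough C sketch of the check that might be shared by can_coerce_type() and coerce_type(); the helper names are illustrative, not existing functions:

```c
#include "postgres.h"
#include "access/htup_details.h"
#include "catalog/pg_type.h"
#include "parser/parse_coerce.h"
#include "utils/syscache.h"

/*
 * Sketch: return the element type of a varlena array type, or InvalidOid
 * if the type is not an array.  This relies on pg_type.typelem together
 * with typlen == -1, rather than the '_' prefix of the type name.
 */
static Oid
get_array_element_type_sketch(Oid typid)
{
	HeapTuple	tup;
	Form_pg_type typform;
	Oid			result = InvalidOid;

	tup = SearchSysCache1(TYPEOID, ObjectIdGetDatum(typid));
	if (!HeapTupleIsValid(tup))
		return InvalidOid;
	typform = (Form_pg_type) GETSTRUCT(tup);
	if (typform->typlen == -1 && OidIsValid(typform->typelem))
		result = typform->typelem;
	ReleaseSysCache(tup);
	return result;
}

/*
 * Sketch of the array branch that might be added to the coercibility
 * check: if both source and target are arrays, decide based on whether
 * the element types are coercible.  Element-by-element conversion of the
 * array contents would then happen at coercion time, not here.
 */
static bool
array_types_coercible_sketch(Oid srctype, Oid targettype,
							 CoercionContext ccontext)
{
	Oid			srcelem = get_array_element_type_sketch(srctype);
	Oid			tgtelem = get_array_element_type_sketch(targettype);

	if (!OidIsValid(srcelem) || !OidIsValid(tgtelem))
		return false;			/* not both varlena arrays */

	/* defer to the existing scalar machinery for the element types */
	return can_coerce_type(1, &srcelem, &tgtelem, ccontext);
}
```

Whether a syscache lookup of this kind is cheap enough on every call, or whether an explicit isarray attribute in pg_type is the better long-term answer, is exactly the open question raised in the message above.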