On Fri, Dec 14, 2012 at 9:52 AM, joshua <jzuellig@arbormetrix.com> wrote:
> Tom-
> My apologies, I'm still somewhat new to this. Specifically, I'm dealing with
> COPY FROM CSV. I had assumed that, since a CSV is essentially a pile of text
> and COPY FROM is smart enough to interpret all sorts of CSV entries into
> PostgreSQL data types, I'd have to define some sort of cast if I wanted to
> allow a nonstandard conversion, say having COPY FROM interpret
> ...,green,... as {green}.
>
> Merlin-
> I could set this up to use a staging table, but honestly, given our systems,
> it'd be easier for me to change all of our source CSVs to simply read
> ...,{abc},... instead of ...,abc,... than to change our code base to use a
> series of staging tables (we will be using brackets in the future; this is
> more of a backwards-compatibility issue). This is especially true since our
> code currently doesn't have to inspect the target data types of the columns
> we load; it simply lets the COPY FROM command do all of the interpreting,
> which brings me back to my original point. :)
If the input CSV doesn't match your destination structure, then staging
it into a temporary work table and doing the transformation with a
query is really the way to go. Hacking casts is about as ugly as it
gets.
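
Something along these lines would do it (table and column names are
made up for illustration; I'm assuming the destination column is
text[]):

-- destination table with an array column
create table colors (id int, tags text[]);

-- staging table that mirrors the csv layout (plain text column)
create temporary table colors_staging (id int, tag text);

-- load the raw csv as-is (client-side file via psql)
\copy colors_staging from 'colors.csv' with (format csv)

-- transform on the way into the real table: wrap the scalar value
-- in a one-element array
insert into colors (id, tags)
select id, array[tag]
from colors_staging;
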
merlin