On Tue, Nov 22, 2011 at 10:21 AM, Scott Marlowe <scott.marlowe@gmail.com> wrote:
> On Tue, Nov 22, 2011 at 7:50 AM, Reid Thompson <reid.thompson@ateb.com> wrote:
>> Note that I manually added the 4 to the desired size of 35; again, for
>> some legacy reasons inside PG. Done. That's it. Shall we check?
>>
>> \d TABLE1
>>
>>       Table "public.TABLE1"
>>  Column |         Type          | Modifiers
>> --------+-----------------------+-----------
>>  COL1   | character varying(35) |
>>
>> Such a simple yet effective trick. Of course it'd be nicer if this
>> were somehow supported in a more proper way by the database, but this
>> does the job.
>
> Note that this method works around all the safeguards etc. that make
> sure your data is safe and coherent. It works, as long as you're
> careful about what you're doing.
>
And by careful, be aware that there are considerations for indexes
and/or partition mismatches that might be involved here. At a minimum
I'd suggest upgrading to 8.3.$latest (ideally the next release, which
will likely be out in a couple of weeks), as there are some bugs in
this area in older releases (and 8.3.7 certainly qualifies).
> the real solution, to me, is to stop using varchar limits unless
> there's a real reason for them. I.e. arbitrary limits on things like
> name lengths make no sense in the db.
>
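[One way to keep a sanity limit without an arbitrary varchar(n), in the
spirit of the advice above, is a plain text column with a CHECK
constraint; the table and constraint names here are hypothetical. The
constraint can be swapped later without rewriting the stored data:]

```sql
-- Hypothetical example: text column with a droppable sanity limit.
CREATE TABLE customer (
    name text CHECK (length(name) < 1000)
);

-- Raising the limit later is just a constraint swap (PG auto-names the
-- original constraint customer_name_check); existing rows are scanned
-- for validity, but the table is not rewritten:
ALTER TABLE customer DROP CONSTRAINT customer_name_check;
ALTER TABLE customer ADD CONSTRAINT customer_name_check
    CHECK (length(name) < 10000);
```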
Yeah, I have often subscribed to this idea in the past, though there
is a valid argument that even when you don't have a specific limit you
care about, there are lengths long enough that they probably indicate
garbage data or something gone wrong. In a world where Postgres
actually handles this problem gracefully (and I think 9.1 does), I
don't think this rule is as clear-cut as it used to be.
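[For reference, the graceful handling alluded to above: from 9.1 on,
increasing a varchar limit via the supported DDL avoids the full-table
rewrite, so the catalog hack is no longer needed. Names reuse the
example from the quoted \d output, case-folded:]

```sql
-- Supported route on 9.1+: no table rewrite when the limit only grows.
ALTER TABLE table1 ALTER COLUMN col1 TYPE varchar(35);

-- Or drop the arbitrary limit entirely, per the suggestion above:
ALTER TABLE table1 ALTER COLUMN col1 TYPE text;
```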
Robert Treat
conjecture: xzilla.net
consulting: omniti.com