Re: COPY to table with array columns (Longish) - Mailing list pgsql-sql
From:           Phillip Smith
Subject:        Re: COPY to table with array columns (Longish)
Date:
Msg-id:         008801c68e98$0cb1a620$9b0014ac@ITPhil
In response to: Re: COPY to table with array columns (Longish) ("Aaron Bono" <postgresql@aranya.com>)
Responses:      Re: COPY to table with array columns (Longish)
List:           pgsql-sql
The export from the D3 system is written in PICK BASIC – similar to ‘normal’ BASIC.
It currently exports Attributes (fields) 1 to 9 of each SYS record whose Primary Key starts with “Z” (‘]’ is a wildcard in the SSELECT statement):
EXECUTE 'SSELECT SYS WITH A0 "Z]"' CAPTURING JUNK
LOOP
   READNEXT SYS.ID THEN
      READ SYS.REC FROM SYS, SYS.ID ELSE
         SYS.ID = 'XXXXXX'
      END
      *
      * Build the output line: the record key followed by attributes 1 to 9,
      * delimited by attribute marks (AM)
      OUTLINE = SYS.ID
      FOR SYS.SUB = 1 TO 9
         OUTLINE = OUTLINE:AM:SYS.REC<SYS.SUB,1>
      NEXT SYS.SUB
      *
      * Strip sub-value marks (252), backslashes (92) and double quotes (34),
      * upper-case the line, then write it to the output stream
      CONVERT CHAR(252) TO "" IN OUTLINE
      CONVERT CHAR(92) TO "" IN OUTLINE
      CONVERT CHAR(34) TO "" IN OUTLINE
      OUTLINE = OCONV(OUTLINE, "MCU")
      N = %FPUTS(OUTLINE:NL, (CHAR*)STREAM)
   END ELSE
      SYS.ID = 'XXXXXX'
   END
490 NULL
UNTIL SYS.ID = 'XXXXXX' DO REPEAT
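The net result is that each exported line is the record key followed by the nine attribute values, still delimited by attribute marks – something like this, with ^ standing in for CHAR(254) and the values made up:

   ZPRECMPL^VALUE1^VALUE2^ ... ^VALUE9

which is why it needs massaging into something COPY can digest before it goes anywhere near Postgres.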
So you’re suggesting creating a child table for each SYS record? I.e., a table called “ZPRECMPL” etc.?
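Or, going by Tom's two-table suggestion, I imagine something more like the following – the table and column names here are just my guesses, nothing that exists yet:

   CREATE TABLE sys (
       sys_id  text PRIMARY KEY          -- one row per original SYS record, e.g. 'ZPRECMPL'
   );

   CREATE TABLE sys_detail (
       sys_id  text    NOT NULL REFERENCES sys (sys_id),
       seq     integer NOT NULL,         -- position of the value within the record
       value   text    NOT NULL,         -- e.g. an invoice number
       PRIMARY KEY (sys_id, seq)
   );

and once the export is massaged into one value per line (tab separated: key, position, value), presumably the load itself is just a straight COPY (the file path here is only a placeholder, of course):

   COPY sys_detail (sys_id, seq, value) FROM '/path/to/massaged/export.txt';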
Thanks for your input guys,
Cheers,
-p
-----Original Message-----
From: aaron.bono@gmail.com [mailto:aaron.bono@gmail.com] On Behalf Of Aaron Bono
Sent: Tuesday, 13 June 2006 12:58 PM
To: Tom Lane
Cc: phillips@weatherbeeta.com.au; pgsql-sql@postgresql.org
Subject: Re: [SQL] COPY to table with array columns (Longish)
I agree with Tom. Personally I cannot think of a time I would use an array column over a child table. Maybe someone can enlighten me on when an array column would be a good choice.
What language are you using to do the export if I may ask?
-Aaron
On 6/12/06, Tom Lane <tgl@sss.pgh.pa.us> wrote:
"Phillip Smith" <phillips@weatherbeeta.com.au> writes:
> The whole sys file is variable length records like this - they range
> from 1 to over 17,000 fields per record.
17000? I think you really need to rethink your schema. While you could
theoretically drop 17000 elements into a PG array column, you wouldn't
like the performance --- it'd be almost unsearchable for instance.
I'd think about two tables, one with a single row for each SYS record
from the original, and one with one row for each detail item (the
invoice numbers in this case). With suitable indexes and a foreign key
constraint, this will perform a lot better than an array-based
translation.
And no, in neither case will you be able to import that file without
massaging it first.
regards, tom lane