Thread: Backup database with entries > 8192 KBytes

From: "G. Anthony Reina"
I know that I can't insert a tuple into Postgres > 8192 KBytes long. We
need to store data in a variable-length float array which can take up a
total length greater than this amount. To get around the limit, we
simply insert a zeroed array (which takes up less character space) and
then update the array in chunks, specifying where in the array to put
the data.


e.g.

   INSERT INTO tablename VALUES ('{0,0,0,0,0,0, .... }');

to pad the array with zeros (this, of course, has to be less than 8192
KBytes)

then

   UPDATE tablename SET array1[1:100] = '{123.9, 12345.987, 123454555.87, .... }';

etc.

This works fine.
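
In outline, the whole sequence looks something like this from our C
program (a minimal sketch; the table/column names, the chunk size, and
the buffer sizes are just for illustration, not our exact code):

   #include <stdio.h>
   #include <string.h>
   #include <libpq-fe.h>

   #define CHUNK 100   /* elements per UPDATE; an arbitrary choice */

   /* Store a float array whose text form is too long for one INSERT:
    * first insert a zero-padded array (short as text), then overwrite
    * it CHUNK elements at a time with slice assignment.  Error checking
    * is mostly omitted.  Note the zero-padded INSERT text itself must
    * stay under 8192 bytes, which caps len around 4000 elements here. */
   void store_array(PGconn *conn, const float *array, int len)
   {
       char      query[8192];
       char      temp[64];
       int       i, start;
       PGresult *res;

       strcpy(query, "INSERT INTO tablename VALUES ('{0");
       for (i = 1; i < len; i++)
           strcat(query, ",0");
       strcat(query, "}')");
       res = PQexec(conn, query);
       PQclear(res);

       for (start = 0; start < len; start += CHUNK) {
           int end = (start + CHUNK < len) ? start + CHUNK : len;

           /* SQL array subscripts are 1-based, hence start + 1 */
           sprintf(query, "UPDATE tablename SET array1[%d:%d] = '{",
                   start + 1, end);
           for (i = start; i < end; i++) {
               sprintf(temp, "%s%f", (i == start) ? "" : ",", array[i]);
               strcat(query, temp);
           }
           strcat(query, "}'");
           res = PQexec(conn, query);
           PQclear(res);
       }
   }

Each UPDATE's text stays around 1.5 KBytes with CHUNK = 100, comfortably
under the limit.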

Okay, long intro for a short question. When we do a pg_dump and then
restore the database, should the COPY contained within the pg_dumped file
be able to handle these long arrays?

-Tony Reina





Re: [HACKERS] Backup database with entries > 8192 KBytes

From: Tom Lane
"G. Anthony Reina" <reina@nsi.edu> writes:
> I know that I can't insert a tuple into Postgres > 8192 KBytes long.

Er, make that 8K.  If the limit were 8Meg, a lot fewer people would be
complaining about it...

> need to store data in a variable length float array which can take up a
> total length of greater than this amount. To get around the limit, we
> simply insert a zeroed array (which takes up less character space)

Huh?  I assume you're talking about an array of float4 (or float8).
That's going to be 4 (or 8) bytes per value internally, zero or not.

Maybe you are thinking of the external textual representation of the
array.
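
To put rough numbers on that (a back-of-envelope sketch; the element
count and the ~13-characters-per-value figure are just assumptions):

   #include <stdio.h>

   int main(void)
   {
       int n = 1500;                    /* float4 elements */
       printf("internal size:      ~%d bytes\n", 4 * n);   /* fixed */
       printf("text, all zeros:    ~%d chars\n", 2 * n);   /* "0," */
       printf("text, full %%f data: ~%d chars\n", 13 * n); /* "12345.987000," */
       return 0;
   }

The stored tuple is ~6000 bytes either way, but the query text drops
from ~19500 characters to ~3000 when the values are zeros, which is why
the zero-padding trick shortens the statement text but not the stored
size.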

> Okay, long intro for a short question. When we do a pg_dump and then
> restore the database should the COPY contained within the pg_dumped file
> be able to handle these long arrays?

Offhand I'm not sure.  I don't see any obvious restriction in copy.c,
but there could be lurking problems elsewhere.  Have you tried it?

        regards, tom lane


Re: [HACKERS] Backup database with entries > 8192 KBytes

From: "G. Anthony Reina"
Tom Lane wrote:

> "G. Anthony Reina" <reina@nsi.edu> writes:
> > I know that I can't insert a tuple into Postgres > 8192 KBytes long.
>
> Er, make that 8K.  If the limit were 8Meg, a lot fewer people would be
> complaining about it...
>

Right. My mistake: 8192 bytes.

>
> > need to store data in a variable length float array which can take up a
> > total length of greater than this amount. To get around the limit, we
> > simply insert a zeroed array (which takes up less character space)
>
> Huh?  I assume you're talking about an array of float4 (or float8).
> That's going to be 4 (or 8) bytes per value internally, zero or not.
>

float4 variable array

>
> Maybe you are thinking of the external textual representation of the
> array.
>

Right. So here's the textual representation when I pass it into Postgres
through a C program:

e.g.
   /* Build the INSERT text, then refuse to send it if it exceeds the
    * 8192-byte query limit. */
   sprintf(data_string, "INSERT INTO tablename VALUES ('{%f", array[0]);
   for (i = 1; i < length_of_array; i++) {
       sprintf(temp_string, ", %f", array[i]);
       strcat(data_string, temp_string);
   }
   strcat(data_string, "}')");

   if (strlen(data_string) > 8192) {
       printf("String too long\n");
       exit_nicely(conn);
   }
   res = PQexec(conn, data_string);

>
> > Okay, long intro for a short question. When we do a pg_dump and then
> > restore the database should the COPY contained within the pg_dumped file
> > be able to handle these long arrays?
>
> Offhand I'm not sure.  I don't see any obvious restriction in copy.c,
> but there could be lurking problems elsewhere.  Have you tried it?

Yes. But some tables didn't go in when I restored the backup. It was
probably due to an error somewhere in the COPY command, as far as I've
been able to surmise. I'm wondering if this could have been that problem
(i.e. the array was > 8K and so the COPY command crapped out when I tried
to restore from a pg_dump). When I get some time I'll try to make a small
test database with long arrays, pg_dump them, and then restore them.
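
The test I have in mind would look something like this (a sketch; the
database and table names are made up). Build the table with the chunked
INSERT/UPDATE described above, then from the shell:

   pg_dump testdb > testdb.out
   createdb testdb_restored
   psql testdb_restored < testdb.out

and finally check that the arrays survived the COPY round trip, e.g. by
comparing array_dims() on both sides:

   #include <stdio.h>
   #include <libpq-fe.h>

   /* Print the dimensions of the array column in one database, so the
    * original and restored copies can be compared by eye. */
   static void print_dims(const char *conninfo)
   {
       PGconn   *conn = PQconnectdb(conninfo);
       PGresult *res;

       if (PQstatus(conn) == CONNECTION_BAD) {
           fprintf(stderr, "connect failed: %s\n", PQerrorMessage(conn));
           return;
       }
       res = PQexec(conn, "SELECT array_dims(array1) FROM tablename");
       if (PQresultStatus(res) == PGRES_TUPLES_OK && PQntuples(res) > 0)
           printf("%s: dims = %s\n", conninfo, PQgetvalue(res, 0, 0));
       else
           printf("%s: no rows (or query failed)\n", conninfo);
       PQclear(res);
       PQfinish(conn);
   }

   int main(void)
   {
       print_dims("dbname=testdb");
       print_dims("dbname=testdb_restored");
       return 0;
   }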

Sorry to be so confusing. Trying to do 10 things at once. Must be the
coffee ... ;>)
Thanks for the help.
-Tony