Thread: Re: I am facing a difficult problem..

Re: I am facing a difficult problem..

From: Constantin Teodorescu

junho91@kmail.com wrote:
>
> Hello...
> I am using PostgreSQL....
> I am facing a difficult problem..
>
> the error below occurred:
> ==========================
> Warning: PostgresSQL query failed: PQexec() -- query is too long. Maximum length is 8191 in
> /home/httpd/html/lline/add.php3 on line 151
> ==========================

Did you write a query statement in add.php3 that's bigger than 8191 bytes?
I can hardly imagine that ...

I think there's another error.
Send me the query statement that gives the error and I will look it
over.
It's strange ...

> I don't know what to do about the error???
> Please tell me how I can write more than 8191...

As far as I know, it's not possible to write a query bigger than 8191 bytes.
But in the real world, that should happen extremely rarely.

> I don't have anyone to get help...

There are a lot of other people at postgresql-interfaces@postgresql.org who
can also help you.

--
Constantin Teodorescu
FLEX Consulting Braila, ROMANIA

Re: [INTERFACES] Re: I am facing a difficult problem..

From: Adam Haberlach

On Mon, Mar 29, 1999 at 06:25:13AM +0000, Constantin Teodorescu wrote:
> junho91@kmail.com wrote:
> >
> > Hello...
> > I am using PostgreSQL....
> > I am facing a difficult problem..
> >
> > the error below occurred:
> > ==========================
> > Warning: PostgresSQL query failed: PQexec() -- query is too long.
> > Maximum length is 8191 in /home/httpd/html/lline/add.php3 on line 151
> > ==========================

> As far as I know, it's not possible to write a query bigger than 8191 bytes.
> But in the real world, that should happen extremely rarely.

    It would depend upon your database.  I'm handling a bug database
with some rather large text fields.  If someone pastes 8k of text into
one of them, the INSERT statement will fail for that reason.
    I would like to know if there is a simple solution to this as well,
or if it will be addressed in the future.  I do have other options, but
they entail shuffling a lot of legacy data around.

Updating PostgreSQL automatically from SQL Server 6.5

From: Chairudin Sentosa

Hi friends,

I have a Linux + PostgreSQL machine and an SQL Server 6.5 machine.
There is one table (usage-detail) in SQL Server that keeps getting new data every second.

Now, I want the data from that table (usage-details) in SQL Server to show up in PostgreSQL in real time (if possible).

What I am doing now is:
1. Spool the table (usage-details) from SQL Server 6.5 every 30 minutes.
2. FTP the file to the Linux machine.
3. Use Perl to run INSERT commands to load the data into the table (usage-details) in PostgreSQL (roughly sketched below).
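
Just to show what I mean, step 3 boils down to something like the following.
My real script is in Perl; this is only a rough Java/JDBC sketch, and the
table layout, connection URL, and tab-separated dump format are just examples:

    // Illustrative only: load a spooled, tab-separated dump into PostgreSQL.
    // Assumes the org.postgresql JDBC driver and an example table
    //   usage_details(call_time timestamp, subscriber text, seconds int).
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class LoadSpool {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/usagedb", "chai", "secret");
            PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO usage_details (call_time, subscriber, seconds) "
                  + "VALUES (?::timestamp, ?, ?)");
            try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split("\t");   // one spooled row per line
                    insert.setString(1, f[0]);       // e.g. 1999-03-29 06:25:13
                    insert.setString(2, f[1]);
                    insert.setInt(3, Integer.parseInt(f[2]));
                    insert.executeUpdate();
                }
            }
            insert.close();
            conn.close();
        }
    }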

I would like to do this automatically, without any spooling or FTPing.
How can I do that?
I want that one table (usage-details) to stay synchronized between PostgreSQL and SQL Server 6.5.

Can anyone help me, please?

Thank you for your help.

Regards,
Chai

Re: [INTERFACES] Re: I am facing a difficult problem..

From: Ari Halberstadt

I asked recently about a similar problem. PostgreSQL limits rows to 8191
bytes. A solution is to split the data into segments. My application is a
bulletin board and messages can be pretty long. The data are split into
separate rows and are broken on word boundaries to keep the split clean
(and facilitate full-text searching if it were hosted with, say, Oracle
8i). Notice that you are also limited by query length. The code (in Java)
for doing the split turned out to be a bit involved.
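
A stripped-down version of the idea looks roughly like this (a simplified
sketch only; my real code also handles reassembly and search indexing, and
the maxLen value would be chosen comfortably under the row and query limits):

    import java.util.ArrayList;
    import java.util.List;

    public class Segmenter {
        // Split text into pieces no longer than maxLen characters, breaking on
        // word boundaries where possible so no word is cut in half.  maxLen is
        // in characters here; for an exact byte limit you would measure
        // getBytes() instead.
        public static List<String> split(String text, int maxLen) {
            List<String> segments = new ArrayList<>();
            int pos = 0;
            while (pos < text.length()) {
                int end = Math.min(pos + maxLen, text.length());
                if (end < text.length()) {
                    int space = text.lastIndexOf(' ', end);
                    if (space > pos) {
                        end = space;        // break at the last space in the window
                    }
                }
                segments.add(text.substring(pos, end));
                pos = end;
                while (pos < text.length() && text.charAt(pos) == ' ') {
                    pos++;                  // skip the boundary space itself
                }
            }
            return segments;
        }
    }

Each segment then goes into its own row with the message id and a sequence
number, so the pieces can be put back together in order.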

>On Mon, Mar 29, 1999 at 06:25:13AM +0000, Constantin Teodorescu wrote:
>    It would depend upon your database.  I'm handling a bug database
>with some rather large text fields.  If someone pastes 8k of text into
>one of them, the INSERT statement will fail for that reason.
>    I would like to know if there is a simple solution to this as well,
>or if it will be addressed in the future.  I do have other options, but
>they entail shuffling a lot of legacy data around.


-- Ari Halberstadt mailto:ari@shore.net <http://www.magiccookie.com/>
PGP public key available at <http://www.magiccookie.com/pgpkey.txt>

Re: [INTERFACES] Re: I am facing a difficult problem..

From: Tom Lane

Adam Haberlach <haberlaa@ricochet.net> writes:
>     I would like to know if there is a simple solution to this as well,
> or if it will be addressed in the future.

It'll probably get fixed someday, but the solution is not trivial;
don't hold your breath.  I'd guess that it might happen two or three
releases from now, say six months to a year away.  (This has been
discussed many times before, so check the pgsql-hackers list archives
if you want details about the technical issues.)

In the meantime, your options for large text fields are (1) split them
into multiple database records, or (2) store them in "large objects".
Both are pretty ugly :-(
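
For what it's worth, option (1) usually ends up looking something like the
following on the read side (just a sketch to show the shape of it; the
message_seg table and the JDBC usage are invented for illustration):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class ReadSegmented {
        // Reassemble a long text value that was stored as ordered segments, e.g.
        //   CREATE TABLE message_seg (msg_id int, seg_no int, body text,
        //                             PRIMARY KEY (msg_id, seg_no));
        public static String readMessage(Connection conn, int msgId) throws SQLException {
            PreparedStatement ps = conn.prepareStatement(
                    "SELECT body FROM message_seg WHERE msg_id = ? ORDER BY seg_no");
            ps.setInt(1, msgId);
            StringBuilder text = new StringBuilder();
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    text.append(rs.getString(1));   // segments concatenate back together
                }
            }
            ps.close();
            return text.toString();
        }
    }

Writing is just the reverse: chop the text up and INSERT one row per segment.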

            regards, tom lane

Re: [INTERFACES] Re: I am facing a difficult problem..

From: James Olin Oden

Tom Lane wrote:

> Adam Haberlach <haberlaa@ricochet.net> writes:
> >       I would like to know if there is a simple solution to this as well,
> > or if it will be addressed in the future.
>
> It'll probably get fixed someday, but the solution is not trivial;
> don't hold your breath.  I'd guess that it might happen two or three
> releases from now, say six months to a year away.  (This has been
> discussed many times before, so check the pgsql-hackers list archives
> if you want details about the technical issues.)
>
> In the meantime, your options for large text fields are (1) split them
> into multiple database records, or (2) store them in "large objects".
> Both are pretty ugly :-(
>
>                         regards, tom lane

   Of course you can store them in your file system as separate files and store
their filenames in a column in a table.  This approach eats up inodes pretty
quickly, though, so it's not the best solution.  I recently thought of another
option; instead of storing them as separate files, do the following:

   1) Write out a file with a unique name (one that is unique to the table in
which you store the name).
   2) Now run "ar" and insert the file into an archive named after the column
in the database:

          ar r archive filename

   3) Now delete the file from the file system.

When you want to retrieve this file from the archive, type:

   ar p archive filename

When you want to update, just repeat the same process as above.

Normally, "ar" is used to create libraries of object files; that is its
"normal" use, but it will create an _updatable_ archive of any type of
file (text files being the ones in question here).
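
Putting the steps together in code looks roughly like this (a sketch in Java
just to make the sequence concrete; it simply shells out to "ar", the archive
and member names are examples, and it needs Java 9 or later for readAllBytes):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class ArStore {
        // Steps 1-3: write the text to a uniquely named file, fold it into the
        // archive with "ar r", then remove the loose file.  The member name is
        // what you would store in the table column.
        public static void store(String archive, String member, String text)
                throws IOException, InterruptedException {
            Path file = Paths.get(member);
            Files.write(file, text.getBytes());          // step 1: write the file
            run("ar", "r", archive, member);             // step 2: add it to the archive
            Files.delete(file);                          // step 3: remove the loose file
        }

        // "ar p" prints the member to stdout, which we capture to get it back.
        public static String fetch(String archive, String member)
                throws IOException, InterruptedException {
            Process p = new ProcessBuilder("ar", "p", archive, member).start();
            String out = new String(p.getInputStream().readAllBytes());
            p.waitFor();
            return out;
        }

        private static void run(String... cmd) throws IOException, InterruptedException {
            new ProcessBuilder(cmd).inheritIO().start().waitFor();
        }
    }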

Cheers...james