Anatoly K. Lasareff wrote:
> >>>>> "BB" == Brian Baquiran <brianb@evoserve.com> writes:
>
> >> I want to insert a long text (up to 10,000 words) into a table
> >> (for example, table1), into a field (field1) which is a 'text'
> >> field. I have tried the following: INSERT INTO table1 VALUES ('
> >> long text'...) and UPDATE table1 SET field1='long text', and neither
> >> is working. I'm using servlets and the Apache server, on Linux.
>
> BB> What data type are you using for the text field? As far as I
> BB> know, the 'text' datatype can only take 8K.
>
> BB> I don't know what the maximum size for varchar is.
>
> 8K is the maximum length of the whole record. In your case you must use
> 'large objects' as the datatype for big text.
How does one do this, preferably with Perl and DBI?
What is the best way to handle big (>= 8k) text fields? It would be nice if
postgres had an easy interface for larger text sizes, or a set of example
functions for dealing with such chunks.
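From reading the DBD::Pg documentation I imagine it would look roughly like
the sketch below, but this is only a guess on my part: the table1/field1
names come from the original posting, the connect string is made up, and
field1 would presumably have to be of type 'oid' instead of 'text' so it can
hold the large object's id.

  #!/usr/bin/perl -w
  # Guesswork sketch: store >8k text as a Postgres large object via DBI.
  # The lo_* calls go through $dbh->func(); I am not sure the names and
  # arguments are exactly right, so treat all of this as an assumption.
  use strict;
  use DBI;

  my $dbh = DBI->connect('dbi:Pg:dbname=test', '', '',
                         { AutoCommit => 0, RaiseError => 1 });

  my $longtext = 'blah ' x 3000;     # pretend this came from the browser

  # large object calls have to run inside a transaction (AutoCommit off)
  my $oid = $dbh->func($dbh->{pg_INV_WRITE}, 'lo_creat');
  my $fd  = $dbh->func($oid, $dbh->{pg_INV_WRITE}, 'lo_open');
  $dbh->func($fd, $longtext, length($longtext), 'lo_write');
  $dbh->func($fd, 'lo_close');

  # remember the oid so the text can be read back later with lo_read
  $dbh->do('INSERT INTO table1 (field1) VALUES (?)', undef, $oid);
  $dbh->commit;
  $dbh->disconnect;

Is that the intended way to do it, or is there something simpler?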
A related question of mine:
I have played with the data types text and varchar to store some text from a
web browser with a perl-DBI CGI script.
The table has one INT field called "id" and one text field where I tested
the types text, varchar, varchar(2000) and varchar(7000).
Inserting text only works if the text is a lot smaller than 8k - only about
20 lines of text get inserted without being cropped.
How can I make sure that the whole text gets inserted, and how can I get a
message/error if the text did not fit or has been shortened? Does anybody
have an example script or perl code snippet with text fields roughly
2000-4000 bytes long?
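To make the question more concrete, here is roughly what my script does,
plus the check I would like to have. The table and column names (testtab,
id, txt) and the connect string are just placeholders; the idea is simply
to read the stored length back after the INSERT and compare it.

  #!/usr/bin/perl -w
  # Roughly what my CGI script does, plus a crop check after the INSERT.
  # Table/column names and the connect string are placeholders.
  use strict;
  use DBI;

  my $dbh = DBI->connect('dbi:Pg:dbname=test', '', '',
                         { AutoCommit => 1, RaiseError => 1 });

  my $id   = 42;
  my $text = 'x' x 3000;             # text taken from the web form

  my $sth = $dbh->prepare('INSERT INTO testtab (id, txt) VALUES (?, ?)');
  $sth->execute($id, $text);         # RaiseError should die on failure

  # read the stored length back and complain if the text was cropped
  my ($stored) = $dbh->selectrow_array(
      'SELECT length(txt) FROM testtab WHERE id = ?', undef, $id);
  warn 'text was cropped: sent ', length($text), " bytes, stored $stored\n"
      if $stored != length($text);

  $dbh->disconnect;

Is checking the length afterwards really necessary, or should the INSERT
itself fail when the text does not fit?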
This problem has been annoying me for some weeks now, so I think I am an
idiot and Postgres proves it...
Any help would be great and might let me sleep better ;)
--
__ __
Frank Barknecht ____ ______ ____ __ trip\ \ / /wire ______
/ __// __ /__/ __// // __ \ \/ / __ \\ ___\
/ / / ____/ / / / // ____// /\ \\ ___\\____ \
/_/ /_____/ /_/ /_//_____// / \ \\_____\\_____\
/_/ \_\