Thread: RE: [SQL] Insert a long text

RE: [SQL] Insert a long text

From: "Jackson, DeJuan"
Date:

> -----Original Message-----
> Anatoly K. Lasareff wrote:
>
> > >>>>> "BB" == Brian Baquiran <brianb@evoserve.com> writes:
> >
> >  >> I want to insert a long text (up to 10,000 words) into a table
> >  >> (for example, table1) into a field (field1) which is a 'text'
> >  >> field. I have tried the following: INSERT INTO table1 VALUES ('
> >  >> long text'...) and UPDATE table1 SET field1='long text', and it
> >  >> is not working. I'm using servlets and an Apache server, on Linux.
> >
> >  BB> What data type are you using for the text field? As far as I
> >  BB> know, the 'text' datatype can only take 8K.
> >
> >  BB> I don't know what the maximum size for varchar is.
> >
> > 8K is the maximum length of the whole record. In your case you must
> > use 'large objects' as the datatype for big text.
>
> How does one do this, preferably with perl and DBI?
>
> What is the best way to handle big (>= 8k) text fields? It would be
> nice if postgres had an easy interface for larger text sizes, or a
> set of example functions to deal with such chunks.
>
> A related question of mine is:
> I have played with the data types text and varchar to store some text
> from a web browser with a perl-DBI CGI script.
> The table has one INT field called "id" and one text field where I
> tested the types: text, varchar, varchar(2000) and varchar(7000).
> Inserting text only works if the text is a lot smaller than 8k - only
> about 20 lines of text get inserted without being cropped.
> How can I make sure that the whole text gets inserted, and how can I
> get a message/error if the text did not fit or has been shortened?
> Does anybody have an example script or perl code snippet with text
> fields roughly 2000-4000 bytes long?
> This problem has been annoying me for some weeks now, so I think I am
> an idiot and Postgres proves it...
>
> Any help would be great and maybe let me sleep better ;)
If you need to search the text, possible solutions are:
  1) Store the text in chunks in the database.
     In other words, instead of just using
       CREATE TABLE my_text(message_id INT, data TEXT);
     try
       CREATE TABLE my_text(message_id INT, chunk_id INT, data TEXT);
     then figure out what your insert limit is and break each TEXT
     block into chunks of that size (a perl sketch follows below).
  2) Use a full-text-index algorithm (great for searches, but a space
     hog).
     This is not mutually exclusive with option 1 above, but there are
     implementations that could make option 1 unnecessary if you know
     that you'll never have a text block greater than a certain size.
     Look in the contrib directory of the distribution for a working
     sample/example.
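
A minimal sketch of option 1 with perl and DBI, assuming the chunked
table above; the connection details and the 4000-byte chunk size are
placeholders (measure your real insert limit first):

#!/usr/bin/perl -w
# Sketch: break a long string into chunks and insert one row per chunk.
use strict;
use DBI;

my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "me", "")
             || die "Can't connect: $DBI::errstr";

my $chunk_size = 4000;    # assumed safe size, below the 8K row limit
my $message_id = 80;
my $text       = "the long text ...";

my $sth = $dbh->prepare(
    "INSERT INTO my_text (message_id, chunk_id, data) VALUES (?, ?, ?)")
             || die "Can't prepare statement: $DBI::errstr";

# walk over the string in $chunk_size steps, one row per chunk
my $chunk_id = 0;
for (my $offset = 0; $offset < length($text); $offset += $chunk_size) {
    my $chunk = substr($text, $offset, $chunk_size);
    $sth->execute($message_id, $chunk_id, $chunk)
        || die "Can't insert chunk $chunk_id: $DBI::errstr";
    $chunk_id++;
}

$dbh->disconnect;

Reading the text back is then a matter of
  SELECT data FROM my_text WHERE message_id = ? ORDER BY chunk_id
and joining the rows on the client side.
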
If you don't care about searching:
  3) Use large objects.
     There are also examples of this in the contrib directory (a DBD::Pg
     sketch follows below).
     This option could be combined with option 2 to implement searching,
     but I wouldn't want to have to change the implementations to
     support it.
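
A minimal sketch of option 3 with DBD::Pg's large-object calls (current
releases offer them as pg_lo_* methods; older ones expose the same
functions through $dbh->func).  The my_text_lo table that remembers the
object's OID is an assumption made up for the example, and large-object
calls have to run inside a transaction:

#!/usr/bin/perl -w
# Sketch: store a long text as a PostgreSQL large object and keep its
# OID in an ordinary table so it can be found again later.
use strict;
use DBI;

my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "me", "",
                       { AutoCommit => 0 })   # LO calls need a transaction
             || die "Can't connect: $DBI::errstr";

my $text = "the long text ...";

# create a new large object and get its OID
my $lo_oid = $dbh->pg_lo_creat($dbh->{pg_INV_WRITE})
    or die "lo_creat failed: $DBI::errstr";

# open it, write the text, close it again
my $fd = $dbh->pg_lo_open($lo_oid, $dbh->{pg_INV_WRITE});
defined $dbh->pg_lo_write($fd, $text, length($text))
    or die "lo_write failed: $DBI::errstr";
$dbh->pg_lo_close($fd);

# remember the OID so the text can be looked up by message id
$dbh->do("INSERT INTO my_text_lo (message_id, text_oid) VALUES (?, ?)",
         undef, 80, $lo_oid);

$dbh->commit;
$dbh->disconnect;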

Hope this helps,
    -DEJ

Re: [SQL] Insert a long text

From: Frank Barknecht
Date:
Jackson, DeJuan wrote:

>   1) store the text in chunks in the database.
>      in other words instead of just using
>        CREATE TABLE my_text(message_id INT, data TEXT)
>      try
>        CREATE TABLE my_text(message_id INT, chunk_id INT, data TEXT);
>      then figure out what your insert limit is and break each TEXT
>      block into chunks of that size.

To store longer text I am now trying to do it this way. But first I have
to find out what the insert limit is. For this I wrote a little test
script with perl and DBI, which now gives me some strange results.

The script is basically this:

#!/usr/bin/perl -w
use diagnostics;
use DBI;
use strict;

# Configuration:
my $dbname="einblick";
my $user="freak";
my $passwd="";
my $tabelle = "my_text";

# For testing inserts I made a rather small long text: ~ 420 bytes
my $text = '
0 0
0 1
0 2
0 3
0 4
0 5
0 6
0 7
0 8
0 9
1 0
# ... and so on until
9 9
1 0 0
1 0 1
1 0 2
1 0 3
1 0 4
1 0 5
1 0 6
1 0 7
1 0 8
1 0 9
';
my $id = 80;

# PG connect
my $dbh = DBI->connect("dbi:Pg:dbname=$dbname", $user, $passwd)
             || die "Can't connect to $dbname: $DBI::errstr";

# SQL statement with placeholders
my $sql = "INSERT INTO $tabelle
( message_id, chunk_id, data) VALUES ( ?, ?, ?)";
my $sth = $dbh->prepare($sql) || die "Can't prepare statement: $DBI::errstr";
$sth->trace(2, "/tmp/trace_pg");
my $rc = $sth->execute( $id, 0, $text )
             || die "Can't execute statement: $DBI::errstr";
$sth->finish;
$dbh->disconnect;
# END testscript

This runs without problems at first sight, but it does NOT insert the whole
text from $text. Instead it cuts it off at around the line "9 9".
This no longer seems to be a problem with the data type, because I can
change the entry with pgaccess and make it longer. The whole text does
indeed fit into the column, but I don't manage to insert it with my script.

I traced the statement execution and the trace file shows something like:

/tmp/trace_pg:
dbd_st_execute: statement = >INSERT INTO my_text
( message_id, chunk_id, data) values ( 80, 0, '
0 0
0 1
   [... I shortened the output here!]
9 6
9 7
9 ...')<
    <- execute= 1 at /home/freak/bin/langtext.pl line 137.
    -> finish for DBD::Pg::st (DBI::st=HASH(0x8209244)~0x81ca35c)
    <- finish= 1 at /home/freak/bin/langtext.pl line 139.
    -> DESTROY for DBD::Pg::st (DBI::st=HASH(0x81ca35c)~INNER)
dbd_st_destroy
    <- DESTROY= undef

Obviously I am still doing something wrong here, but I have no idea what
it is. It would be great if someone could help me.
Many thanks in advance,
--
                                                     __    __
 Frank Barknecht           ____ ______   ____ __ trip\ \  / /wire ______
                          / __// __  /__/ __// // __  \ \/ /  __ \\  ___\
                         / /  / ____/  / /  / // ____// /\ \\  ___\\____ \
                        /_/  /_____/  /_/  /_//_____// /  \ \\_____\\_____\
                                                    /_/    \_\