The tuple limit is the reason we created functions for writing large
text files to the database. Here are the functions we use:
function inject($id, $text) {
    // Split the article, $text, into 4k chunks and store them.
    global $db;
    $rs = db_exec("SELECT * FROM article WHERE textid LIKE '$id-%'"); // check to see if the text exists
    if ($rs) {
        db_exec("DELETE FROM article WHERE textid LIKE '$id-%'"); // delete old text if it does
    }
    $row = 1;
    $string = "";
    while ($text != "") {
        if (strlen($text) <= 4096) {
            $string = $text;
            $text = "";
        } else {
            // break on the last newline inside the 4k window, falling back to a space
            $x = 4096;
            while (substr($text, $x, 1) != "\n" && $x > 0) { $x = $x - 1; }
            if ($x == 0) {
                $x = 4096;
                while (substr($text, $x, 1) != " " && $x > 0) { $x = $x - 1; }
            }
            $string = substr($text, 0, $x);
            $text = substr($text, $x + 1);
        }
        $string = trim($string);
        if ($string != "") {
            // build a block id with a two-digit sequence number, e.g. "KB123-01"
            $blockid = $id . "-" . substr("00" . $row, -2);
            $len = strlen($id);
            $kbid = str_replace("KB", "", $id);
            if (strpos($kbid, ":")) { $kbid = substr($kbid, 0, strpos($kbid, ":")); }
            if (!$kbid) { $string = addslashes($string); } // escape quotes when there is no numeric kbid
            if ($len > strlen($kbid)) {
                db_exec("INSERT INTO article (kbid,textid,article) VALUES ('$kbid','$blockid','$string')");
            } else {
                db_exec("INSERT INTO article (textid,article) VALUES ('$blockid','$string')");
            }
        }
        $row++;
    }
}
function retrieve($id) {
    // Retrieve the stored chunks and concatenate them back into one string.
    $rs = db_exec("SELECT * FROM article WHERE textid LIKE '$id-%' ORDER BY textid"); // check to see if the text exists
    if ($rs) {
        $row = 0;
        $string = "";
        while ($row < pg_numrows($rs)) {
            $rec = pg_fetch_object($rs, $row);
            $string = $string . $rec->article;
            $row++;
        }
    } else {
        $string = "";
    }
    return $string;
}
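
To give a sense of how they are called, here is a small example; the id
"KB1001" and the filename are made up for illustration:

// Example only: store an article under an arbitrary id, then read it back.
$text = file_get_contents("howto.txt");
inject("KB1001", $text);
echo retrieve("KB1001");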
It's likely that, with a little bit of tweaking, these functions could also
be used to store image files.
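
One way that might work, sketched below, is to base64-encode the image so it
survives being stored as text. Base64 output contains no newlines or spaces
of its own, so chunk_split() is used to add line breaks that give inject()
somewhere to split. The inject_image()/retrieve_image() names are just
illustrative, not part of the functions above.

// Hypothetical sketch, building on inject()/retrieve() above.
function inject_image($id, $filename) {
    $data = file_get_contents($filename);                     // raw binary image data
    inject($id, chunk_split(base64_encode($data), 76, "\n"));  // newlines give inject() break points
}

function retrieve_image($id) {
    // base64_decode() skips whitespace, so the re-joined chunks decode cleanly.
    return base64_decode(retrieve($id));
}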
Justin