Well, I am writing an email client where data will be stored on both the
client and the server. I have one table that stores all the message headers,
and another table that stores the entire source for every message (including
the encoding for attachments, etc.).
Every time a user clicks on a mail folder, the client pulls that user's
message headers from the headers table. Every time a user clicks on a
message, it pulls the message body and the rest of the source from the
message-source table.
Now as you can imagine, on the server side, if you have 100 users and all of
their message source sitting in one big table, read operations can slow down
because of all the disk I/O.
Previously I was using MySQL, and splitting each user's data into separate
tables gave me a huge performance increase.
I'm not sure whether PostgreSQL will handle this better, but my main concern
remains the disk I/O on one big table.
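For reference, the single-table layout under discussion might look roughly
like the sketch below. All table, column, and index names here are my own
illustration, not taken from an actual schema:

```sql
-- Hypothetical shared layout: one headers table and one source table
-- for all users, keyed by user rather than split per user.
CREATE TABLE message_headers (
    message_id  bigserial PRIMARY KEY,
    user_id     integer NOT NULL,
    folder      text    NOT NULL,
    subject     text,
    sender      text,
    sent_at     timestamptz
);

CREATE TABLE message_source (
    message_id  bigint PRIMARY KEY REFERENCES message_headers,
    raw_source  text NOT NULL   -- full message source, attachments included
);

-- With this index, "click on a folder" becomes an index scan over only
-- that user's header rows rather than a scan of the whole table.
CREATE INDEX message_headers_user_folder_idx
    ON message_headers (user_id, folder);
```

Under this layout, fetching one user's inbox is just
`SELECT ... FROM message_headers WHERE user_id = $1 AND folder = 'INBOX'`,
and the volume of other users' data mostly affects cache pressure rather
than the cost of each indexed read.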
-----Original Message-----
From: Andrej Ricnik-Bay [mailto:andrej.groups@gmail.com]
Sent: Sunday, November 26, 2006 1:01 AM
To: pgsql-novice@postgresql.org
Cc: Greg Quinn; Stephan Szabo
Subject: SPAM-LOW: Re: [NOVICE] Inserting values into a variable table
On 11/26/06, Stephan Szabo <sszabo@megazone.bigpanda.com> wrote:
> You may consider seeing whether or not there's another way to lay out
> the data as well that might be easier to work with.
I'm with Stephan on this one - I'd say keep all mail in one table and use
the "random string" that denominates the inbox to select the messages on a
per-user basis. That makes the code that handles things much cleaner, and I
can't see a security (or other) benefit in the separate tables.
Cheers
--