> Hi,
>
> I currently use mSQL and have had performance problems with a large
> database. As an example, I have a 30 MB+ simple database in which each
> entry is date-stamped. When I request a year's worth of data it takes
> forever to return, and most of the time the browser times out. But
> when I request the data through a CGI program using text files it is
> very quick. My question is: does anyone use Postgres on databases the
> same size as or larger than mine, and could someone give me an idea of
> the performance?
I don't personally use PostgreSQL databases that big, but there have
been discussions on this list about the 2 GB file-size limit and about
Postgres handling 500 heavily loaded simultaneous connections on
someone's box. So I don't think you have anything to worry about.
I'd say export your table to a tab-delimited text file and use Postgres'
COPY command to import it into a table. Then put an index on the
date_stamp column (I would use a datetime field instead of an integer).
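Something like this -- just a rough sketch, assuming a table named
"log", a dump file at /tmp/log.txt, and one stand-in column (all
made-up names; substitute your real 13 fields):

    -- date_stamp as a real datetime instead of an integer
    create table log (
        date_stamp  datetime,
        value       text        -- stand-in for your other twelve fields
    );

    -- copy reads tab-delimited input by default
    copy log from '/tmp/log.txt';

    -- index the timestamp so range queries don't scan the whole table
    create index log_date_stamp_idx on log (date_stamp);

Note that COPY with a filename is executed by the backend, so the file
has to be readable on the server machine.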
> In mSQL the query looks something like this:
And your query would change to this:
> select * from table where date_stamp > 19970101 and date_stamp < 19980101
select * from table where date_stamp between '1/1/1997' and '1/1/1998';
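(Note that between is inclusive at both ends, so this also catches rows
stamped exactly '1/1/1998'; use >= and < if you need your original
semantics.) Once the index exists you can ask the optimizer whether it
is actually being used -- using my made-up "log" table again; you want
to see an index scan rather than a sequential scan in the output:

    explain select * from log
    where date_stamp between '1/1/1997' and '1/1/1998';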
> -----------------
> table has 13 fields.
You might also want to check out PHP (http://www.php.net) for the web
output.
> Thanks,
>
> -G.
>