Hi Juan,
First of all, congratulations on your project :)
We, at MADEIRA GPS, use PostgreSQL and PostGIS as the cornerstone of our
fleet management solution and have tens of *millions* of records in a single
vehicle history table without any visible performance problem (we do, however,
clean it every year).
A thought, however, regarding your plans for GPS data acquisition/storage:
every second... isn't that a bit too much?
For most of our customers we offer minute-by-minute tracking and, this is
important, *optimize* the vehicles' history table when writing data into it
by comparing against the last stored record - i.e. if the info is the
same, *don't* write it! This will surely save you space ;-)
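Roughly, that write-time check looks like this (a minimal sketch in Python; the field names, the in-memory `last_record` cache, and the `store` callback are illustrative assumptions, not our actual implementation):

```python
# Sketch of the "skip identical records" write optimization.
# Field names and the in-memory cache are illustrative assumptions.

last_record = {}  # vehicle_id -> last stored (lat, lon, speed) tuple

def maybe_store(vehicle_id, lat, lon, speed, store):
    """Write a position only if it differs from the vehicle's last record.

    `store` is any callable that persists the row (e.g. issues an INSERT).
    Returns True if a row was written, False if it was skipped.
    Note that the timestamp is deliberately *not* compared, since it
    changes on every fix even when the vehicle hasn't moved.
    """
    current = (lat, lon, speed)
    if last_record.get(vehicle_id) == current:
        return False  # identical to the previous fix: skip the write
    last_record[vehicle_id] = current
    store(vehicle_id, lat, lon, speed)
    return True
```

A vehicle parked overnight then costs you one row instead of one row per second.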
About simultaneous queries:
Last we checked we had ~200 of them with PGSQL still pumping at full
speed... ;-)
As a final note, IMHO, PGSQL/PostGIS is better than MySQL for a number of
reasons:
- proven robustness
- tight integration with PostGIS
- large user base (an always friendly bunch willing to help each
other out ;-) )
- ...
Regards,
Pedro Doria Meunier
GSM: +351961720188
Skype: pdoriam
On Tuesday 17 March 2009 11:25:08 am Juan Pereira wrote:
> Hello,
>
> I'm currently developing a program for centralizing the vehicle fleet GPS
> information -http://openggd.sourceforge.net-, written in C++.
>
> The database should have these requirements:
>
> - The schema for this kind of data consists of several fields -latitude,
> longitude, time, speed, etc.-; none of them is a text field.
> - The database also should create a table for every truck -around 100
> trucks-.
> - There won't be more than 86400 * 365 rows per table -one GPS position
> every second along one year-.
> - There won't be more than 10 simultaneously read-only queries.
>
> The question is: Which DBMS do you think is the best for this kind of
> application? PostgreSQL or MySQL?
>
>
> Thanks in advance
>
> Juan Karlos.