On Wed, May 12, 2010 at 11:33 AM, Richard Broersma
<richard.broersma@gmail.com> wrote:
On Wed, May 12, 2010 at 9:18 AM, Justin Graf <
justin@magwerks.com> wrote:
> I would use a plain text file, something like XML. Given this is for
> industrial use, 10 years is a good number for warranty and support, but
> this stuff will hang around for years after that; think 20 to 30 years.
> How many people today understand flat ISAM tables from the 1980s, let
> alone have the tools to read/modify the records?
>
> I suggest storing the records in a manner that is human readable.
These are all good points. There is one concern that I do have: this
information will be used to audit the billing system. Is there any risk
of losing data if a file rewrite is interrupted by a power failure?
Postgres provides some protections to reduce this kind of data loss.
However, I do agree that tabular/XML data would stand the test of time.
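To make that failure mode concrete, here is a minimal C sketch (mine, for
illustration only; the path and record format are made up) of the risky
pattern being described, rewriting the data file in place:

#include <stdio.h>

/* Rewrites the whole file from scratch.  fopen(path, "w") truncates the
   existing file immediately, so if power is lost before every record has
   been written and flushed, the old data is already gone and the new data
   is incomplete. */
int rewrite_records(const char *path, const char **records, int n)
{
    FILE *f = fopen(path, "w");
    if (f == NULL)
        return -1;

    for (int i = 0; i < n; i++)
        fprintf(f, "%s\n", records[i]);

    /* fclose() hands the data to the OS; it may still sit in volatile
       caches for a while before it reaches the disk. */
    return fclose(f);
}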
I would suggest "sqlite" (specifically, version 3). It is well tested to
survive system crashes. It is an embedded database engine: it runs in
the same address space as your process, and you use it just like any
other C library. If your program is not running, then neither is sqlite,
and the database file is not held open on disk. It also works great with
Perl and can even be used from a CMD script by spawning its shell,
sqlite3.exe. I use sqlite in a variety of projects where PostgreSQL
would be "overkill", and I have been very happy with it.
http://www.sqlite.org/sqlite.html
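For illustration only (the database file name, table, and columns below
are made up, not anything from this thread), here is a minimal C sketch
of appending an audit record through the SQLite 3 API. The BEGIN/COMMIT
pair makes the insert atomic, so an interrupted write leaves either the
old contents or the new row, never a torn record:

#include <stdio.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    char *err = NULL;

    /* Opens the database file, creating it if it does not exist yet. */
    if (sqlite3_open("billing_audit.db", &db) != SQLITE_OK) {
        fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    /* Everything below is plain SQL run through the embedded engine. */
    if (sqlite3_exec(db,
            "CREATE TABLE IF NOT EXISTS billing_audit ("
            "  ts     TEXT NOT NULL,"
            "  detail TEXT NOT NULL)", 0, 0, &err) != SQLITE_OK ||
        sqlite3_exec(db, "BEGIN", 0, 0, &err) != SQLITE_OK ||
        sqlite3_exec(db,
            "INSERT INTO billing_audit (ts, detail) "
            "VALUES (datetime('now'), 'meter reading 1234')",
            0, 0, &err) != SQLITE_OK ||
        sqlite3_exec(db, "COMMIT", 0, 0, &err) != SQLITE_OK) {
        fprintf(stderr, "sql error: %s\n", err);
        sqlite3_free(err);
        sqlite3_close(db);
        return 1;
    }

    sqlite3_close(db);
    return 0;
}

Build it against the library with something like: gcc audit.c -lsqlite3
(the source file name is just an example).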