Now I've got to convince my project's software
architect that a bigint would be better than a
decimal.
Does anyone know where I can find documentation
on how the bigint and decimal types are implemented,
so I can show him that integers are the better choice?
Can anyone suggest good points to make?
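
In case it helps, here is the kind of quick test I was planning to
run myself. The table names, row count, and queries below are just
made up for illustration, and it assumes a PostgreSQL build that has
generate_series() (any other way of filling the tables would do):

  -- Two identical tables, differing only in the key type.
  CREATE TABLE test_bigint  (id bigint        PRIMARY KEY);
  CREATE TABLE test_numeric (id numeric(19,0) PRIMARY KEY);

  INSERT INTO test_bigint  SELECT g FROM generate_series(1, 100000) g;
  INSERT INTO test_numeric SELECT g FROM generate_series(1, 100000) g;

  -- In psql, toggle timing and run the same self-join against each
  -- table.  bigint is a fixed 8-byte integer compared natively, while
  -- numeric is a variable-length type whose comparisons are done in
  -- software, so the numeric version should come out noticeably slower.
  \timing
  SELECT count(*) FROM test_bigint  a JOIN test_bigint  b USING (id);
  SELECT count(*) FROM test_numeric a JOIN test_numeric b USING (id);

  -- A rough look at the on-disk size of the tables and pkey indexes.
  ANALYZE test_bigint;
  ANALYZE test_numeric;
  SELECT relname, relpages FROM pg_class WHERE relname LIKE 'test_%';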
Thanks in advance.
--- Tom Lane <tgl@sss.pgh.pa.us> wrote:
> "Yusuf W." <unicef2k@yahoo.com> writes:
> > For the application that I'm working on, we want to use data types
> > that are database independent. (Most databases have decimal, but not
> > bigint.)
>
> Most databases have bigint, I think.
>
> > Anyhow, we are planning on using decimal(19,0) for our primary keys
> > instead of a bigint; would there be a performance difference in using
> > a bigint over using decimals?
>
> You'll be taking a very large performance hit, for very little benefit
> that I can see.  How hard could it be to change the column declarations
> if you ever move to a database without bigint?  There's not normally
> much need for apps to be explicitly aware of the column type names.
>
> 			regards, tom lane
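
P.S. Tom's point above about just changing the column declarations also
seems easy to show my architect: the only difference between the two
designs is one line of DDL (the table name here is only an example):

  -- on PostgreSQL, or any database that has bigint
  CREATE TABLE orders (id bigint PRIMARY KEY);

  -- the same table on a hypothetical database without bigint
  CREATE TABLE orders (id decimal(19,0) PRIMARY KEY);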