Bruce Momjian wrote:
> Jan Wieck wrote:
>> Bruce Momjian wrote:
>> > Peter Eisentraut wrote:
>> >> Tom Lane writes:
>> >>
>> >> > What Peter was advocating in that thread was that we enable -g by
>> >> > default *when building with gcc*. I have no problem with that, since
>> >> > there is (allegedly) no performance penalty for -g with gcc. However,
>> >> > the actual present behavior of our configure script is to default to -g
>> >> > for every compiler, and I think that that is a big mistake. On most
>> >> > non-gcc compilers, -g disables optimizations, which is way too high a
>> >> > price to pay for production use.
>> >>
>> >> You do realize that as of now, -g is the default for gcc? Was that the
>> >> intent?
>> >
>> > I was going to ask that myself. It seems strange to include -g by default ---
>> > we have --enable-debug, and that should control -g on all platforms.
>>
>> Could it be that there ought to be a difference between the defaults of
>> a devel CVS tree, a BETA tarball and a final "production" release?
>
> I am afraid that adds too much confusion to the debug situation. We
> have a flag to do -g; let people use it if they want it.
>
Well, -g eats up some disk space, but with gcc it doesn't cost CPU
cycles or anything else. I doubt many people who already pay the
horrible storage capacity overhead for PostgreSQL are that concerned
about some extra symbols stored with their binaries, but let's not
argue about that one.
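
Just to illustrate that point with gcc (foo.c here is any source file,
picked only for the example):

    $ gcc -O2 -c foo.c -o foo.o
    $ gcc -O2 -g -c foo.c -o foo_dbg.o
    $ ls -l foo.o foo_dbg.o     # foo_dbg.o is larger (debug sections)
    $ size foo.o foo_dbg.o      # but text/data/bss are identical
    $ strip foo_dbg.o           # and the extra symbols can be stripped off anytime
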
The other compiler flags like -O are much more important, because the
out-of-the-box configuration is the one we're always blamed for. If
it's too hard to teach autoconf the difference between gcc and non-gcc
(something like the sketch below is what I mean), then rip it out.
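
Roughly this, in configure.in after AC_PROG_CC has set $GCC; just a
sketch, not tested against our actual script, and the exact
optimization levels are placeholders:

    if test "$GCC" = yes ; then
        # gcc: debug symbols don't interfere with optimization
        CFLAGS="$CFLAGS -O2 -g"
    else
        # other compilers often drop optimization when -g is given,
        # so only optimize here and leave -g to --enable-debug
        CFLAGS="$CFLAGS -O"
    fi
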
Jan
--
#======================================================================#
# It's easier to get forgiveness for being wrong than for being right. #
# Let's break this rule - forgive me. #
#================================================== JanWieck@Yahoo.com #