Thread: deprecating contrib for PGXN
I have missed it if this was discussed before but ... Would now be a good time to start deprecating the contrib/ directory as a way to distribute Pg add-ons, with favor given to PGXN and the like instead? It would make sense to leave contrib/ alone for 9.1, but I believe that it should start slimming down as we move towards 9.2, with any content that can easily be migrated to PGXN/etc being taken out of contrib/. Alternatively, the policy would be to stop adding new things to contrib/ except in the odd case where that is surely the best place to put it, so only the legacy things remain, and those are removed case-by-case as workable distributions for them appear on PGXN/etc. An analogy for policy here would be Perl 5 and the modules it bundles: the Perl modules that have the most business being bundled with Perl are those minimal ones whose function is to go out to CPAN and install other modules. Another analogy would be Parrot and the languages implemented over it: originally, various language compilers were bundled with Parrot, and they gradually migrated to their own distributions, Rakudo for example. If this general policy of deprecating contrib/ is agreed on, then at the very least the documentation shipped with 9.1 should mention the deprecation and talk about migration strategies. Or 9.1 could include a CPAN-like program that makes it easier to install PGXN extensions, if that is applicable, so there is an overlap period where people could get the legacy add-ons either way. -- Darren Duncan
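A CPAN-like client of the sort proposed here did begin to take shape as a `pgxn` command-line tool; a hypothetical session (command names modeled on the early pgxnclient, and assuming a compiler and pg_config are available on the box) might look like:

```
$ pgxn search pair            -- query the PGXN index for matching distributions
$ pgxn install pair           -- fetch, build via PGXS, and install into the server
$ pgxn load -d mydb pair      -- load the extension into a database
```

The commands shown are illustrative of the workflow being discussed, not a guarantee of the eventual tool's interface.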
On 05/17/2011 01:31 PM, Darren Duncan wrote: > I have missed it if this was discussed before but ... > > Would now be a good time to start deprecating the contrib/ directory as > a way to distribute Pg add-ons, with favor given to PGXN and the like > instead? If PGXN moves into .Org infrastructure (which I believe is currently the plan) then yes, contrib should go away. JD -- Command Prompt, Inc. - http://www.commandprompt.com/ PostgreSQL Support, Training, Professional Services and Development The PostgreSQL Conference - http://www.postgresqlconference.org/ @cmdpromptinc - @postgresconf - 509-416-6579
On Tue, May 17, 2011 at 9:45 PM, Joshua D. Drake <jd@commandprompt.com> wrote: > On 05/17/2011 01:31 PM, Darren Duncan wrote: >> >> I have missed it if this was discussed before but ... >> >> Would now be a good time to start deprecating the contrib/ directory as >> a way to distribute Pg add-ons, with favor given to PGXN and the like >> instead? > > If PGXN moves into .Org infrastructure (which I believe is currently the > plan) then yes, contrib should go away. It'll need to be made to work properly on Windows first, which means solving issues around the lack of a compiler on 99.9% of Windows boxes, and consequently, how a binary distribution would work with PostgreSQL builds that may differ from machine to machine in important ways (think integer datetimes for example). -- Dave Page Blog: http://pgsnake.blogspot.com Twitter: @pgsnake EnterpriseDB UK: http://www.enterprisedb.com The Enterprise PostgreSQL Company
On Tue, May 17, 2011 at 4:45 PM, Joshua D. Drake <jd@commandprompt.com> wrote: > On 05/17/2011 01:31 PM, Darren Duncan wrote: >> >> I have missed it if this was discussed before but ... >> >> Would now be a good time to start deprecating the contrib/ directory as >> a way to distribute Pg add-ons, with favor given to PGXN and the like >> instead? > > If PGXN moves into .Org infrastructure (which I believe is currently the > plan) then yes, contrib should go away. What is the benefit of getting rid of it? -- Robert Haas EnterpriseDB: http://www.enterprisedb.com The Enterprise PostgreSQL Company
Robert Haas wrote: > On Tue, May 17, 2011 at 4:45 PM, Joshua D. Drake <jd@commandprompt.com> wrote: >> On 05/17/2011 01:31 PM, Darren Duncan wrote: >>> I have missed it if this was discussed before but ... >>> >>> Would now be a good time to start deprecating the contrib/ directory as >>> a way to distribute Pg add-ons, with favor given to PGXN and the like >>> instead? >> If PGXN moves into .Org infrastructure (which I believe is currently the >> plan) then yes, contrib should go away. > > What is the benefit of getting rid of it? Maybe something could be clarified for me first. Are the individual projects in contrib/ also distributed separately from Pg, on their own release schedules, so users can choose to upgrade them independently of upgrading Pg itself, or so their developers can have a lot of flexibility to make major changes without having to follow the same stability or deprecation timetables of Pg itself? If the only way to get a contrib/ project is bundled with Pg, then the project developers and users don't get the flexibility that they otherwise would have. That's the main answer, I think. -- Darren Duncan
On Tue, 2011-05-17 at 13:45 -0700, Joshua D. Drake wrote: > If PGXN moves into .Org infrastructure (which I believe is currently > the plan) then yes, contrib should go away. Well, that is not reason enough to kick contrib out. I am not aware that PGXN is a community-driven project, nor that it has the same standards that contrib/ has. PGXN cannot replace contrib. It can only be an add-on to contrib. -- Devrim GÜNDÜZ Principal Systems Engineer @ EnterpriseDB: http://www.enterprisedb.com PostgreSQL Danışmanı/Consultant, Red Hat Certified Engineer Community: devrim~PostgreSQL.org, devrim.gunduz~linux.org.tr http://www.gunduz.org Twitter: http://twitter.com/devrimgunduz
On Tue, 2011-05-17 at 20:37 -0700, Darren Duncan wrote: > > Are the individual projects in contrib/ also distributed separately > from Pg, on their own release schedules, No. > If the only way to get a contrib/ project is bundled with Pg, then the > project developers and users don't get the flexibility that they > otherwise would have. That sort of thing goes under pgFoundry, for now. -- Devrim GÜNDÜZ Principal Systems Engineer @ EnterpriseDB: http://www.enterprisedb.com PostgreSQL Danışmanı/Consultant, Red Hat Certified Engineer Community: devrim~PostgreSQL.org, devrim.gunduz~linux.org.tr http://www.gunduz.org Twitter: http://twitter.com/devrimgunduz
Darren Duncan <darren@darrenduncan.net> writes: > Would now be a good time to start deprecating the contrib/ directory as a > way to distribute Pg add-ons, with favor given to PGXN and the like instead? The first important fact is that contrib/ code is maintained by the PostgreSQL-core product team, and I guess they prefer to have it all in a single git repository. Some contribs are only there so that we know when we break extensibility features, so it would be bad to move them away. The other problem is that the facility we need to provide the most is binary distributions (think apt-get). Lots of sites won't ever compile stuff on their production servers. So while PGXN is a good tool, it's not a universal answer. Regards, -- Dimitri Fontaine http://2ndQuadrant.fr PostgreSQL : Expertise, Formation et Support
On 05/18/2011 10:30 AM, Dimitri Fontaine wrote: > Darren Duncan <darren@darrenduncan.net> writes: >> Would now be a good time to start deprecating the contrib/ directory as a >> way to distribute Pg add-ons, with favor given to PGXN and the like instead? > > The first important fact is that contrib/ code is maintained by the > PostgreSQL-core product team, and I guess they prefer to have it all in > a single git repository. Some contribs are only there so that we know > when we break extensibility features, so it would be bad to move them > away. > > The other problem is that the facility we need to provide the most is > binary distributions (think apt-get). Lots of sites won't ever compile > stuff on their production servers. So while PGXN is a good tool, it's > not a universal answer. Yeah, moving contrib/ to pgxn will significantly reduce its usefulness - a lot of places have a policy that allows stuff that is packaged by the OS supplier (and that will include the contrib package), but there is no way at all that they will allow something like pgxn or cpan or any other "from source" installation method. So by moving contrib somewhere else we will lose a lot of functionality in those scenarios. Stefan
On May 18, 2011, at 10:30 AM, Dimitri Fontaine wrote: > The other problem is that the facility we need to provide the most is > binary distributions (think apt-get). Lots of sites won't ever compile > stuff on their production servers. So while PGXN is a good tool, it's > not a universal answer. Yeah, I would think that, *if* we were to seriously look at deprecating contrib (and I'm not suggesting that at all), one would *first* need to solve the binary distribution problems. I think building tools so that PGXN distributions are automatically harvested and turned into StackBuilder/RPM/.deb binaries would be the place to start on that. Best, David
On Wed, May 18, 2011 at 12:15 PM, David E. Wheeler <david@kineticode.com> wrote: > On May 18, 2011, at 10:30 AM, Dimitri Fontaine wrote: > >> The other problem is that the facility we need to provide the most is >> binary distributions (think apt-get). Lots of sites won't ever compile >> stuff on their production servers. So while PGXN is a good tool, it's >> not a universal answer. > > Yeah, I would think that, *if* we were to seriously look at deprecating contrib (and I'm not suggesting that at all), one would *first* need to solve the binary distribution problems. > > I think building tools so that PGXN distributions are automatically harvested and turned into StackBuilder/RPM/.deb binaries would be the place to start on that. Yep, that seems pretty apropos. And for sure, we'd want to have the "contrib" material easily included via PGXN-derived packages *before* deprecating it from the 'core'. It ought to be reasonably easy to cope with "contrib" material switching between 'core' and 'some other well-identifiable place'; that's merely a matter of having a pointer point into contrib or into some other place. (I have observed this, and I'm sure Dimitri can concur, with the way the el-get package manager for Emacs can point to packages in a diverse set of kinds of places, including dpkg, Git repos, bzr repos, or pulling them via wget from wikis and such.) It'll be time to drop the contrib material from the "core" when that shift leads to a 1-line configuration change somewhere that leads to packages for Debian/Fedora/Ports drawing their code from the new spot. I'd fully expect that to wait until a year or more from now. -- When confronted by a difficult problem, solve it by reducing it to the question, "How would the Lone Ranger handle this?"
On May 18, 2011, at 12:24 PM, Christopher Browne wrote: > It'll be time to drop the contrib material from the "core" when that > shift leads to a 1 line configuration change somewhere that leads to > packages for Debian/Fedora/Ports drawing their code from the new spot. > > I'd fully expect that to wait until a year or more from now. Right, and assuming someone has the tuits to create the necessary automation to feed into other systems. Best, David
"David E. Wheeler" <david@kineticode.com> writes: > On May 18, 2011, at 10:30 AM, Dimitri Fontaine wrote: >> The other problem is that the facility we need to provide the most is >> binary distributions (think apt-get). Lots of site won't ever compile >> stuff on their production servers. So while PGXN is a good tool, it's >> not a universal answer. > Yeah, I would think that, *if* we were to seriously look at deprecating contrib (and I'm not suggesting that at all), onewould *first* need to solve the binary distribution problems. > I think building tools so that PGXN distributions are automatically harvested and turned into StackBuilder/RPM/.deb binarieswould be the place to start on that. Hmmm ... I think the real point of those policies about "no source builds" is to ensure that their systems are only running code that's been vetted to some degree by a responsible person (ie, an authorized packager for whatever distro they run). So any sort of automated collection of packages would go directly against what the policies are trying to accomplish, and would likely lead to the policies being amended to specifically ban use of your repo :-( regards, tom lane
On May 18, 2011, at 1:23 PM, Tom Lane wrote: >> I think building tools so that PGXN distributions are automatically harvested and turned into StackBuilder/RPM/.deb binaries would be the place to start on that. > > Hmmm ... I think the real point of those policies about "no source > builds" is to ensure that their systems are only running code that's > been vetted to some degree by a responsible person (ie, an authorized > packager for whatever distro they run). So any sort of automated > collection of packages would go directly against what the policies are > trying to accomplish, and would likely lead to the policies being > amended to specifically ban use of your repo :-( Well, it's up to the maintainers of those repos whether they want to use such a feed. Magnus and Devrim seemed interested in it on Twitter last week; Magnus, in fact, originally suggested it. https://twitter.com/magnushagander/status/65431239770898434 Best, David
"David E. Wheeler" <david@kineticode.com> writes: > I think building tools so that PGXN distributions are automatically > harvested and turned into StackBuilder/RPM/.deb binaries would be the place > to start on that. Well, I'm not sure I buy into that idea, I need to think about it some more. The thing with debian for example is that the package building needs to be all automatic, and determistic — you're not granted to have the next version build a different set of binary packages. We're working about that very point with postgresql-X.Y-extension packages so that you can have a new binary package produced when you add support for a new major version, but we're not there yet. Having the set of binary packages change manually is ok, but debian also have the concept of binNMU which is an infrastructure forced rebuild if you wish (picture libc upgrades). So, given how the debian packaging actually works, having something automated that works from “distributions” which in PGXN can contain several extensions — I'm not seeing it. It looks a little like how things work in the Java world with jar and war packaging… FYI, I'm still working on apt.postgresql.org so that we have debian packaging for all major versions here, and all extensions for all those major versions too. It's not the first item on my TODO list, but we will get there eventually — this year I would figure, we even have a team forming. Regards, -- Dimitri Fontaine http://2ndQuadrant.fr PostgreSQL : Expertise, Formation et Support
On May 18, 2011, at 1:47 PM, Dimitri Fontaine wrote: > Well, I'm not sure I buy into that idea, I need to think about it some > more. The thing with debian for example is that the package building > needs to be all automatic and deterministic — you're not free to have > the next version build a different set of binary packages. > > We're working on that very point with postgresql-X.Y-extension > packages so that you can have a new binary package produced when you add > support for a new major version, but we're not there yet. Having the > set of binary packages change manually is OK, but debian also has the > concept of a binNMU, which is an infrastructure-forced rebuild if you wish > (picture libc upgrades). > > So, given how the debian packaging actually works, having something > automated that works from “distributions” which in PGXN can contain > several extensions — I'm not seeing it. It looks a little like how > things work in the Java world with jar and war packaging… I think it must be my ignorance of Debian (and Java) packaging at work here, because I don't understand any of the above (except the part where you need to think about it some more, which is smart). > FYI, I'm still working on apt.postgresql.org so that we have debian > packaging for all major versions here, and all extensions for all those > major versions too. It's not the first item on my TODO list, but we > will get there eventually — this year I would figure, we even have a > team forming. That sounds awesome. Best, David
On Wed, May 18, 2011 at 13:47, Dimitri Fontaine <dimitri@2ndquadrant.fr> wrote: > "David E. Wheeler" <david@kineticode.com> writes: >> I think building tools so that PGXN distributions are automatically >> harvested and turned into StackBuilder/RPM/.deb binaries would be the place >> to start on that. > > Well, I'm not sure I buy into that idea, I need to think about it some > more. The thing with debian for example is that the package building > needs to be all automatic and deterministic — you're not free to have > the next version build a different set of binary packages. > > We're working on that very point with postgresql-X.Y-extension > packages so that you can have a new binary package produced when you add > support for a new major version, but we're not there yet. Having the > set of binary packages change manually is OK, but debian also has the > concept of a binNMU, which is an infrastructure-forced rebuild if you wish > (picture libc upgrades). > > So, given how the debian packaging actually works, having something > automated that works from “distributions” which in PGXN can contain > several extensions — I'm not seeing it. It looks a little like how > things work in the Java world with jar and war packaging… I don't see why it couldn't, at least for a fair number of extensions. It does require the ability to differentiate between patch releases and feature releases, though, which I believe is currently missing in pgxn (correct me if I'm wrong), but that's a solvable problem, no? Also, if it has several extensions, it should generate several DEBs - assuming they're independent extensions, right? If so, where's the problem? -- Magnus Hagander Me: http://www.hagander.net/ Work: http://www.redpill-linpro.com/
On May 18, 2011, at 2:47 PM, Magnus Hagander wrote: > I don't see why it couldn't, at least for a fair number of > extensions. It does require the ability to differentiate between > patch releases and feature releases, though, which I believe is > currently missing in pgxn (correct me if I'm wrong), but that's a > solvable problem, no? PGXN requires semantic versions. If authors use them correctly, then you can rely on the z in x.y.z to be a patch/bug-fix release, and the x and y to indicate new features. > Also, if it has several extensions, it should generate several DEBs - > assuming they're independent extensions, right? If so, where's the > problem? Maybe they're not independent. But why is that a problem? There are a *lot* of DEBs with multiple Perl modules in them. Best, David
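As a sketch of the classification an automated DEB/RPM harvester could derive from those semantic versions (the function name and rebuild policy here are illustrative, not part of any PGXN tool):

```python
# Classify the jump between two semantic versions "x.y.z", the way a
# packaging harvester might decide between an automatic patch rebuild
# and a new feature release that needs human review.
def classify_bump(old: str, new: str) -> str:
    ox, oy, oz = (int(p) for p in old.split("."))
    nx, ny, nz = (int(p) for p in new.split("."))
    if (nx, ny, nz) <= (ox, oy, oz):
        return "not an upgrade"
    if nx > ox:
        return "major"    # incompatible change: likely a new binary package
    if ny > oy:
        return "feature"  # new functionality, backwards-compatible
    return "patch"        # bug fix only: safe candidate for automatic rebuild
```

For example, `classify_bump("1.3.1", "1.4.0")` returns `"feature"`, while `classify_bump("2.0.1", "2.0.2")` returns `"patch"`.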
On Wed, May 18, 2011 at 14:49, David E. Wheeler <david@kineticode.com> wrote: > On May 18, 2011, at 2:47 PM, Magnus Hagander wrote: > >> I don't see why it couldn't, at least for a fair number of >> extensions. It does require the ability to differentiate between >> patch releases and feature releases, though, which I believe is >> currently missing in pgxn (correct me if I'm wrong), but that's a >> solvable problem, no? > > PGXN requires semantic versions. If authors use them correctly, then you can rely on the z in x.y.z to be a patch/bug-fix release, and the x and y to indicate new features. Does it support having both v1.3.1 and v1.4.0 and v2.0.2 at the same time? I somehow got the idea that old versions were removed when I uploaded a new one, but I'm happy to be wrong :-) >> Also, if it has several extensions, it should generate several DEBs - >> assuming they're independent extensions, right? If so, where's the >> problem? > > Maybe they're not independent. But why is that a problem? There are a *lot* of DEBs with multiple Perl modules in them. Yeah, I don't see the problem if they *are* dependent. -- Magnus Hagander Me: http://www.hagander.net/ Work: http://www.redpill-linpro.com/
On May 18, 2011, at 2:58 PM, Magnus Hagander wrote: > Does it support having both v1.3.1 and v1.4.0 and v2.0.2 at the same > time? I somehow got the idea that old versions were removed when I > uploaded a new one, but I'm happy to be wrong :-) The distribution has only one version, of course, but for extensions in 9.1, you can include multiple versions of an extension in one distribution. Best, David
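For reference, the 9.1 extension mechanism being described lets one distribution ship full install scripts for several versions plus upgrade scripts between them; a minimal sketch of the file layout, using a hypothetical extension named `pair` and the version numbers from this exchange:

```
pair.control            -- default_version = '1.4.0'
pair--1.3.1.sql         -- full install script for version 1.3.1
pair--1.4.0.sql         -- full install script for version 1.4.0
pair--1.3.1--1.4.0.sql  -- upgrade script from 1.3.1 to 1.4.0
```

A user could then run CREATE EXTENSION pair VERSION '1.3.1' and later ALTER EXTENSION pair UPDATE TO '1.4.0'.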
On Wed, May 18, 2011 at 15:05, David E. Wheeler <david@kineticode.com> wrote: > On May 18, 2011, at 2:58 PM, Magnus Hagander wrote: > >> Does it support having both v1.3.1 and v1.4.0 and v2.0.2 at the same >> time? I somehow got the idea that old versions were removed when I >> uploaded a new one, but I'm happy to be wrong :-) > > The distribution has only one version, of course, but for extensions in 9.1, you can include multiple versions of an extension in one distribution. Won't that break if different (major) versions have different dependencies? -- Magnus Hagander Me: http://www.hagander.net/ Work: http://www.redpill-linpro.com/
On May 18, 2011, at 3:08 PM, Magnus Hagander wrote: >> The distribution has only one version, of course, but for extensions in 9.1, you can include multiple versions of an extension in one distribution. > > Won't that break if different (major) versions have different dependencies? I don't understand the question… David
On Wed, May 18, 2011 at 15:17, David E. Wheeler <david@kineticode.com> wrote: > On May 18, 2011, at 3:08 PM, Magnus Hagander wrote: > >>> The distribution has only one version, of course, but for extensions in 9.1, you can include multiple versions of an extension in one distribution. >> >> Won't that break if different (major) versions have different dependencies? > > I don't understand the question… If I include both version 1 and version 2 of an extension in one package, and version 2 has more dependencies than version 1 (or the other way around), then those dependencies will be required for version 1 as well... -- Magnus Hagander Me: http://www.hagander.net/ Work: http://www.redpill-linpro.com/
On May 18, 2011, at 3:22 PM, Magnus Hagander wrote: > If I include both version 1 and version 2 of an extension in one. And > version 2 has more dependencies than version 1 (or the other way > around). Then those dependencies will be required for version 1 as > well... Yes. But if they're that decoupled, then they ought to be in separate distributions. Best, David
"David E. Wheeler" <david@kineticode.com> writes: > Yes. But if they're that decoupled, then they ought to be in separate > distributions. I somehow fail to picture how you map distributions with debian packages. The simple way is to have a distribution be a single source package that will produce as many binary packages as it contains extensions. Now, if a single extension appears in more than one distribution, as far as debian packaging is concerned, you're hosed. So I still think we need to manually package for debian… Regards, -- Dimitri Fontaine http://2ndQuadrant.fr PostgreSQL : Expertise, Formation et Support
On May 18, 2011, at 3:27 PM, Dimitri Fontaine wrote: > "David E. Wheeler" <david@kineticode.com> writes: >> Yes. But if they're that decoupled, then they ought to be in separate >> distributions. > > I somehow fail to picture how you map distributions with debian > packages. The simple way is to have a distribution be a single source > package that will produce as many binary packages as it contains > extensions. How do CPAN modules get packaged? Example: http://packages.debian.org/squeeze/all/libsvn-notify-perl/filelist > Now, if a single extension appears in more than one distribution, as far > as debian packaging is concerned, you're hosed. Yeah. That might happen, but should be uncommon. > So I still think we need to manually package for debian… Well, maybe packages could be auto-generated but vetted by a human? Just a thought. Best, David