Re: Heroku early upgrade is raising serious questions - Mailing list pgsql-advocacy
From | damien clochard
Subject | Re: Heroku early upgrade is raising serious questions
Date |
Msg-id | 516D3648.3090903@dalibo.info
In response to | Re: Heroku early upgrade is raising serious questions (David Johnston <polobo@yahoo.com>)
List | pgsql-advocacy
> > I agree it's better to forego open source doctrine in the interest of preventing a larger evil. If we had not given Heroku early access and they got hacked, then the discussion would revolve around how said hack could have been prevented. That is decidedly a worse discussion than making them a special class of user.

I see this argument coming in various forms, but I still don't understand why Heroku is so different from other large-scale users.

Sure, Heroku was quite exposed to this vulnerability because:

a- they allow port access to their database servers from untrusted networks (see [1])
b- they don't have tools to deploy a security fix quickly (see [2])

But these are technical choices. I'm not judging them, and I'm not saying these two problems are easy to solve for a large-scale company. But Heroku could have done things differently: they chose not to filter access to their servers, and they didn't build the machinery for fast security-fix deployment. Moreover, they're talking about it in public, so I assume these are choices they've made knowing the consequences behind them...

--

Let's take an example and imagine a hosting company called pgCloud, with thousands of PostgreSQL servers in their public cloud. pgCloud is in competition with Heroku. pgCloud implemented a sophisticated IP filtering system to protect their network, and they built serious security scripts to update all their servers in less than an hour. pgCloud did this because they want to offer the best possible level of security to their customers, even if a nasty zero-day exploit is released someday (God forbid!).

Building these tools and procedures cost a lot of time and money. But that's fine, because when a PostgreSQL vulnerability is discovered, they can keep calm and wait for the security release.

But now pgCloud has a problem: Heroku was allowed to deploy the security fix before it went public, while they had to wait for the security release like everyone else...
So their question is: what should we do? Should we spend time and money to build and maintain a secure Postgres cloud, or should we "gain trust" from the PostgreSQL community in order to be allowed to deploy earlier? And what would customers choose between the two? pgCloud has a secured network, but Heroku has early access to the fix...

--

This example is simplistic, of course, but I think it shows clearly why we should stay as "neutral" as possible (I didn't say "fair") and avoid any special treatment that would disrupt a market. Otherwise we're just opening a big shiny Pandora's box.

And yes, I am aware that a company can have both: a secure network AND trust from the community. But if I were sarcastic, I'd say that right now the best security strategy for a large-scale cloud provider is to allow unrestricted access to all their servers, not bother with update scripts, and ask the community for early access because they're so exposed.

[1] https://news.ycombinator.com/item?id=5493353
[2] https://blog.heroku.com/archives/2013/4/4/heroku_postgres_databases_patched