Thread: Hardware requirements for a PostGIS server
Dear PostgreSQL users, I am posting here a question that I initially asked on the PostGIS list [1], where I was advised to try here too (I will keep both lists updated about developments on this issue). I am currently planning to set up a PostgreSQL + PostGIS instance for my lab. It turns out I believe this would be useful for the whole center, so I'm now considering setting up the server for everyone—if interest is shared, of course. At the moment, however, I am struggling with what would be required in terms of hardware, and of course the cost will depend on that—at the end of the day, it's really a matter of money well spent. I therefore have a series of questions/remarks, and I would welcome any feedback from people with existing experience setting up a multi-user PostGIS server. I'm insisting on the PostGIS aspect, since the heaviest requests will be GIS requests (intersections, spatial queries, etc.). However, people with similar PostgreSQL setups may have very relevant comments about their own configuration.

* My own experience with servers is rather limited: I have used PostGIS quite a bit, but only on a desktop, with only 2 users. The desktop was quite good (quad-core Xeon, 12 GB RAM, 500 GB HDD), running Debian, and we never had any performance issues (some queries were rather long, but still acceptable).
* The use case I'm envisioning would be (at least in the foreseeable future):
- About 10 faculty users (which means potentially a few more students using it); I would have a hard time imagining more than 4 concurrent users;
- Data would primarily involve a lot (hundreds/thousands) of high-resolution (spatial and temporal) raster and vector maps, possibly over large areas (Florida / USA / continental), as well as potentially millions of GPS records (animals individually monitored);
- Queries will primarily involve retrieving points/maps over given areas/times, as well as intersecting points with environmental layers [from what I understand, a lot of I/O, with many intermediary tables involved]; other use cases will involve working with steps, i.e. the straight-line segment connecting two successive locations, and intersecting them with environmental layers;

* I couldn't find comprehensive or detailed guidelines online about hardware, but from what I could see, it seems that memory wouldn't be the main issue, but the number of cores would be (one core per database connection, if I'm not mistaken). At the same time, we want to make sure that the experience is smooth for everyone... I was advised on the PostGIS list to take a look at pgpool (however, UNIX only).

* Is there a difference in terms of possibilities, performance and usability between a Linux-based and an MS-based server (from the user perspective)? My center is unfortunately MS-centered, and existing equipment runs MS systems... It would thus be easier for them to set up an MS-based server. Does it change anything for the user? (I insist on the user perspective, since I and others will not admin the system, but only use it.)

* Has anyone worked with a server running the DB engine while the DB itself was stored on another box/server? That would likely be the case here, since we already have a dedicated box for file storage.
Along these lines, does the system of the file storage box matter (Linux vs. MS)?

* We may also use the server as a workstation to streamline PostGIS processing with further R analyses/modeling (or even use R from within the database using PL/R). Again, does anyone have experience doing this? Is a single workstation the recommended way to work with such a workflow? Or would it be better (but more costly) to have one server dedicated to PostGIS and another one, with different specs, dedicated to analyses (R)?

I realize my questions and comments may be confusing, likely because of my lack of experience with these issues. I really welcome any feedback from people working with PostgreSQL servers (+ PostGIS ideally!) in a small unit, or any similar setting that could be informative!

In advance, thank you very much!

Sincerely,
Mathieu Basille.

[1] Start of the thread here: http://lists.osgeo.org/pipermail/postgis-users/2015-February/040120.html

--
~$ whoami Mathieu Basille http://ase-research.org/basille ~$ locate --details University of Florida \\ Fort Lauderdale Research and Education Center (+1) 954-577-6314 ~$ fortune « Le tout est de tout dire, et je manque de mots Et je manque de temps, et je manque d'audace. » -- Paul Éluard

_______________________________________________ postgis-users mailing list postgis-users@lists.osgeo.org http://lists.osgeo.org/cgi-bin/mailman/listinfo/postgis-users
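[Editor's note: the point-over-layer and "steps" queries described above can be sketched in PostGIS SQL. The schema here is hypothetical (table and column names are made up for illustration): `gps_fixes(animal_id, fix_time, geom)` for GPS records and `land_cover(class, geom)` for an environmental layer.]

```sql
-- Intersect GPS points with an environmental layer:
SELECT f.animal_id, f.fix_time, c.class
FROM gps_fixes f
JOIN land_cover c ON ST_Intersects(f.geom, c.geom);

-- Build "steps" (segments between successive fixes of the same animal)
-- with a window function, then intersect them with the layer:
WITH steps AS (
  SELECT animal_id,
         ST_MakeLine(geom,
                     LEAD(geom) OVER (PARTITION BY animal_id
                                      ORDER BY fix_time)) AS step
  FROM gps_fixes
)
SELECT s.animal_id, c.class
FROM steps s
JOIN land_cover c ON ST_Intersects(s.step, c.geom)
WHERE s.step IS NOT NULL;  -- the last fix of each animal has no successor
```

A GiST index on each `geom` column is what makes `ST_Intersects` joins like these scale.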
Responses in-line: On Tue, 10 Feb 2015 19:52:41 -0500 Mathieu Basille <basille.web@ase-research.org> wrote:
[...]
> [...]
> * I couldn't find comprehensive or detailed guidelines online about hardware, but from what I could see, it seems that memory wouldn't be the main issue, but the number of cores would be (one core per database connection if I'm not mistaken). At the same time, we want to make sure that the experience is smooth for everyone... I was advised on the PostGIS list to take a look at pgpool (however, UNIX only).

The number of cores helps with parallel processing, but 4 simultaneous users doesn't necessarily mean 4 simultaneous queries. How much time do your users spend running queries vs. idling? If you don't expect more than 4 concurrent users, I would think you'll be fine with a single quad-core CPU. I would get the fastest CPU available, though, as it will make number crunching go faster.

I can't see any reason why you'd want/need pgpool. pgpool is generally useful when you have a LOT of simultaneous connections, and you're only estimating 4. Additionally, pgpool is fairly easy to add on later if you need it... so my recommendation would be not to worry about it just yet.
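[Editor's note: the users-vs-queries distinction above can be checked empirically. A minimal sketch, assuming PostgreSQL 9.2 or later, where the `pg_stat_activity` view has a `state` column:]

```sql
-- Count sessions actually executing a query right now,
-- versus those merely connected and sitting idle.
SELECT state, count(*)
FROM pg_stat_activity
GROUP BY state;
```

Watching this during a typical workday shows how many cores are really in use at once, which is a better sizing input than the number of accounts.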
> * Has anyone worked with a server running the DB engine, while the DB itself was stored on another box/server? That would likely be the case here since we already have a dedicated box for file storage. Along these lines, does the system of the file storage box matter (Linux vs. MS)?

Yes. If you have a lot of data that will need to be crunched, I would consider getting SSDs directly attached to the computer running Postgres. Anything you put between RAM and your disks that slows down transfers is going to hurt performance. However, since you haven't made an estimate of the physical size of the data, I can't comment on whether sufficient SSD storage is cost-effective or not.

If you can't get DAS storage, you can make up for some of the performance hit by getting lots of RAM. Part of the effectiveness of the RAM is dependent on the OS and its storage drivers, though, and I have no experience with how well Windows does that... and since you didn't mention which file storage technology you're using, I can't comment on that either. SAN and NAS storage vary wildly from brand to brand in their performance characteristics, so it's difficult to say unless you can find someone who has tried the exact hardware you're liable to be using. If performance is important, I highly recommend DAS, and furthermore SSDs if you can afford them.

> * We may also use the server as a workstation to streamline PostGIS processing with further R analyses/modeling (or even use R from within the database using PL/R). Again, does anyone have experience doing it? Is a single workstation the recommended way to work with such workflow? Or would it be better (but more costly) to have one server dedicated to PostGIS and another one, with different specs, dedicated to analyses (R)?

I know nothing about R. But the question isn't really dependent on R.
Whether it works will depend on how memory and CPU intensive the code you're running in R is, and whether that's enough CPU/memory usage to interfere with what Postgres needs to do its portion of the work. Usually, you'll get better performance by running your non-Postgres processes on another machine, thus increasing the total # of cores and amount of RAM available to the process, but sometimes, when the transfer of data from the database to the other code is the bottleneck, the opposite is true. Sorry that I'm saying "it depends" so many times, but hopefully the details on how it depends will help you make decisions, or at least tell you what to investigate to decide. -- Bill Moran
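[Editor's note: as context for the PL/R option raised in the quoted question (not part of Bill's reply), a minimal sketch of pushing a small R computation into the database. It assumes the PL/R extension is installed on the server; the table name in the usage comment is hypothetical.]

```sql
-- PL/R must be installed on the server before this works:
CREATE EXTENSION plr;

-- A small analysis step implemented in R inside the database.
-- In PL/R, unnamed function arguments are exposed to R as arg1, arg2, ...
CREATE OR REPLACE FUNCTION r_median(float8[]) RETURNS float8 AS $$
  median(arg1)
$$ LANGUAGE plr;

-- Hypothetical usage, grouping a column per animal:
-- SELECT animal_id, r_median(array_agg(speed)) FROM moves GROUP BY animal_id;
```

The trade-off Bill describes applies here too: PL/R code competes with Postgres for the same CPU and RAM, so it suits small functions rather than heavy modeling.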
On 11/02/15 13:52, Mathieu Basille wrote:
[...]
> * Is there a difference in terms of possibilities, performance and usability between a Linux-based and a MS-based server (from the user perspective)? My center is unfortunately MS-centered, and existing equipment runs with MS systems... It would thus be easier for them to set up a MS-based server. Does it change anything for the user?
[...]

When I looked at tuning options for PostgreSQL, I found that there were limitations mentioned for Microsoft OSes. I get the general impression from my reading of multiple sources over the years that, if you are serious about performance on a server, you should prefer Linux to Microsoft. Note that most servers run Linux, and that over 95% of the top 500 supercomputers run Linux, which is rather telling about the perception of Microsoft's performance. Cheers, Gavin
> I am currently planning to set up a PostgreSQL + PostGIS instance for my > lab. Turns out I believe this would be useful for the whole center, so > that I'm now considering setting up the server for everyone—if interest > is shared of course. At the moment, I am however struggling with what > would be required in terms of hardware Just for perspective, here are the specs required to run a Nominatim server, which uses PostGIS to do geocoding on OpenStreetMap data: http://wiki.openstreetmap.org/wiki/Nominatim/Installation Of course maybe your users have more detailed data, but at least that link will give you something to think about. Good luck! Paul
Thanks to everyone who contributed to this thread, either on the PostGIS [1] or the PostgreSQL [2] mailing lists. I will try to summarize everything in this message, which I will post on both lists to give an update to everyone. I hope it can be useful for other people interested. Please feel free to add more advice and other experiences; this is always useful!

Performance
===========

* CPU
A good CPU is required for faster processing. The number of cores helps with parallel processing, but number of users != number of queries (example: with no more than 4 concurrent users, a single quad-core CPU should be fine).

* Memory
Examples go from 8 to >32 GB RAM.

* Disks
Lots of I/O with geoprocessing requires fast disks: best with SSDs, otherwise 10k/15k RPM drives. An alternative is to store indexes on faster disks and data on slower disks (this requires tuning PostgreSQL). It is better to have direct-attached storage (DAS), i.e. disks on the server itself (direct transfer between RAM and disks); external storage requires a good network (additional RAM also helps performance).

* Massive multi-user environments (lots of simultaneous connections): pgpool [3] (Linux/UNIX only). pgpool can be added later on, so there is no need to worry about it at the start.

Platform
========

Linux is the platform of choice:
* Easier administration (install/configuration/upgrade), which is also true for addons/dependencies (starting with PostGIS, but also GEOS, GDAL, PL/R);
* Better performance [4];
* More tuning options (limited on MS systems).

There is still the possibility of running Linux in a virtual machine on an MS server.

Other considerations
====================

* Backup: integrate a script that runs pg_dump daily to export the DB and upload it to the storage box (which is part of the backup system).
* Integration with R: a dedicated R server brings more flexibility / extensions (e.g. Shiny) / performance (more cores and memory available for PostGIS), except if data transfer is the bottleneck.
Use PL/R for small functions (also if it fits naturally into the PostgreSQL workflow); otherwise use R with a PostgreSQL connector.

Example setups
==============

* Dell Precision, 2×6 cores, 20 GB RAM, SSD for indexes, 7,200 RPM HDD for big tables [Rémi Cura]: various usages, from visualization (few users) to complex queries with a lot of reading/writing (several users).

* Bare-metal machine with ESXi; PostgreSQL machine with 8 GB RAM; two quad-core processors; PostgreSQL tuned for fast reads, with a large cache; pgpool; disks: two 7,200 RPM disks in RAID 1 [George Silva]: 12 concurrent QGIS users, editing around 50,000 km² of land-use coverage at 1:5,000 scale with a lot of detail (in two separate DBs). More editing than processing; some heavy queries (e.g. the "complete feature" tool from QGIS) can take some time.

* Nominatim (OpenStreetMap data) [5]: >1 GB RAM necessary, >32 GB recommended; 700 GB HDD; SSD recommended; example machine: 12-core with 32 GB RAM and standard SATA disks, with I/O as the limiting factor.

Thanks again for the good feedback! This gives me very useful information to get started (I think this is still going to be a long process).

Mathieu Basille.

[1] http://lists.osgeo.org/pipermail/postgis-users/2015-February/040120.html
[2] http://www.postgresql.org/message-id/54DAA7D9.8020908@ase-research.org
[3] http://www.pgpool.net/
[4] https://stackoverflow.com/questions/8368924/postgresql-performance-on-windows-using-latest-versions
[5] http://wiki.openstreetmap.org/wiki/Nominatim/Installation

On 10/02/2015 19:52, Mathieu Basille wrote:
[...]
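[Editor's note: the daily pg_dump backup mentioned in the summary can be as simple as a cron entry. A sketch only: the database name, user, and paths below are placeholders to adapt.]

```
# /etc/cron.d/pg_backup -- hypothetical paths and names.
# Dump the 'gisdb' database at 02:00 every night in compressed custom
# format, into a directory that the storage box backs up.
# Note: '%' must be escaped as '\%' inside a crontab line.
0 2 * * *  postgres  pg_dump -Fc -f /backup/gisdb_$(date +\%F).dump gisdb
```

The custom (`-Fc`) format is compressed and lets `pg_restore` restore individual tables later, which suits the many-intermediary-tables workflow described earlier.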
On 12/02/15 12:38, Mathieu Basille wrote: > Thanks to everyone who contributed to this thread, either on the > PostGIS [1] or the PostgreSQL [2] mailing lists. [...] > * Memory > Examples go from 8 to >32 GB RAM. > Because RAM is relatively cheap, money spent on lots of RAM is sometimes better spent than on fast disks, especially if the vast majority of the SQL executed is read-only and most of the database likely to be accessed can reside in RAM along with the relevant indexes and working memory. Good SSDs are reliably fine for database use, unless you have an enormous amount of UPDATE/CREATE/DELETE activity going on; even then, you might find that appropriate SSDs will still be okay. Note that SSD offerings are constantly changing, and tend to be improving in many areas such as reliability, performance, and cost per GB (obviously, be wary of marketing speak!). You should still have regular backups, though. > > > Platform > ======== > > Linux is the platform of choice: > * Easier administration (install/configuration/upgrade), which is also > true for addons/dependencies (starting with PostGIS, but also GEOS, > GDAL, PL/R); > * Better performance [4]; > * More tuning options (limited with MS systems); > > There is still the possibility of a virtualbox on a MS server. > Performance of a database is usually (always?) better on an OS running on bare metal. [...] > * Integration with R: a dedicated R server brings more flexibility / > extensions (e.g. Shiny) / performance (more cores and memory available > for PostGIS) except if data transfer is the bottleneck. Use Pl/R for > small functions (also if it fits naturally into PostgreSQL workflow) / > otherwise in R with PostgreSQL connector.
You might want to look at SageMath (think Mathematica & MATLAB), as it incorporates R and provides much more functionality in some areas: http://sagemath.org http://www.sagemath.org/doc/reference/interfaces/sage/interfaces/r.html

[...]

All the best! You have certainly been very thorough in your homework, and I'm sure there are many people here who would love to hear how things turn out. Cheers, Gavin
On 12/02/15 12:38, Mathieu Basille wrote: [...] >> [1] Start of the thread here: >> http://lists.osgeo.org/pipermail/postgis-users/2015-February/040120.html [...] http://lists.osgeo.org/pipermail/postgis-users/2015-February/040134.html [...] * About usage being mostly read: this will be true for most "pure GIS" tasks (mostly intersecting), but I find that (from experience) we usually end up with a lot of intermediary tables for our analyses (new tables for the most part, not new columns). [...] For greater performance of intermediary tables: if these tables can be easily recreated, then you might want to make use of PostgreSQL's unlogged tables: http://www.postgresql.org/docs/9.4/static/sql-createtable.html [...] CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } | UNLOGGED ] TABLE [ IF NOT EXISTS ] table_name ( [ [...] UNLOGGED If specified, the table is created as an unlogged table. Data written to unlogged tables is not written to the write-ahead log (see Chapter 29), which makes them considerably faster than ordinary tables. However, they are not crash-safe: an unlogged table is automatically truncated after a crash or unclean shutdown. The contents of an unlogged table are also not replicated to standby servers. Any indexes created on an unlogged table are automatically unlogged as well. [...] Cheers, Gavin
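[Editor's note: a concrete instance of Gavin's unlogged-table suggestion. The table and query are hypothetical, reusing illustrative names from earlier in the thread (`gps_fixes`, `land_cover`).]

```sql
-- Materialize an intermediary result without write-ahead-log overhead.
-- Faster to write, but truncated after a crash or unclean shutdown,
-- so only use this for results you can recreate from the base tables.
CREATE UNLOGGED TABLE fixes_by_cover AS
SELECT f.animal_id, c.class, count(*) AS n_fixes
FROM gps_fixes f
JOIN land_cover c ON ST_Intersects(f.geom, c.geom)
GROUP BY f.animal_id, c.class;
```

This matches the "lots of recreatable intermediary tables" usage pattern described above, where crash-safety for the intermediate results matters less than write speed.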
On 12 February 2015 at 00:38, Mathieu Basille <basille.web@ase-research.org> wrote: > Platform > ======== > > Linux is the platform of choice: > * Easier administration (install/configuration/upgrade), which is also true > for addons/dependencies (starting with PostGIS, but also GEOS, GDAL, PL/R); > * Better performance [4]; > * More tuning options (limited with MS systems);

It has to be said that Linux isn't the only choice there. For example, FreeBSD (or NetBSD/OpenBSD) are popular choices for Postgres database servers as well; they perform great and have splendid documentation (an area where I often find Linux a little lacking). They might even be a bit more stable. There are also still several commercial UNIX flavours. I can't say how any of these alternatives (in combination with PostGIS) compare to Linux, though, nor whether PostGIS is even available on all of them, but I suspect they're at least on par for performance and stability.

Of all of these, Windows is probably the least suitable OS for the job. Which is the most suitable depends on quite a few things, not least how likely it is that you'll be able to get experienced support for them. If you're new to the OS and you have to support the system yourself for any length of time, I think good documentation is a factor to take into account.

Am I biased? Definitely.

--
If you can't see the forest for the trees,
Cut the trees and you'll see there is no forest.
Thanks to Gavin and Alban for the additional considerations, all very useful. As for Linux, I have to admit that I am biased too! I use it heavily, which is why I would lean toward it. But after all, since I'm not going to administer the server, the best choice will probably be IT's choice! As long as it makes sense from a user perspective (performance, feature set, usability, etc.)... Thanks again! As usual, feel free to add more to the discussion, but there is already plenty of information here that will be very useful to start the process with IT at my center! Best, Mathieu.

On 12/02/2015 08:34, Alban Hertroys wrote:
[...]