I have a single-disk virtual Linux system and a read-only dataset
which is exposed to the internet and completely replaced from time to
time.
I have compiled the following steps to secure and speed up such a
PostgreSQL/PostGIS instance:
1. Re-configure the PostgreSQL server as follows:
a. Disable the autovacuum daemon.
b. Set these postgresql.conf parameters:
fsync=off
synchronous_commit=off
full_page_writes=off
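For reference, the relevant postgresql.conf fragment could look like
this (a sketch; these settings deliberately trade crash safety for
speed, which is acceptable only because the dataset can be reloaded
from scratch at any time):

  # durability disabled -- the data is disposable and reloadable
  autovacuum = off            # step 1a: no background (auto)vacuuming
  fsync = off                 # don't force WAL writes to disk
  synchronous_commit = off    # don't wait for WAL flush at COMMIT
  full_page_writes = off      # skip torn-page protection in the WAL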
2. Restart the server, log in as the database admin, create the
database, and create an application user.
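A minimal sketch of step 2 (role, password and database names are
hypothetical; the CREATE EXTENSION line assumes PostgreSQL 9.1+ with
PostGIS 2.x, older versions load the PostGIS SQL scripts instead):

  -- as the database admin (e.g. postgres), in psql:
  CREATE ROLE app_user LOGIN PASSWORD '...';   -- hypothetical app user
  CREATE DATABASE gisdb OWNER app_user;        -- hypothetical db name
  \c gisdb
  CREATE EXTENSION postgis;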
3. Load the dataset (sketched below):
a. with the application user as owner, in schema PUBLIC;
b. create indexes;
c. issue a VACUUM ANALYZE command on the user tables.
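For illustration, with hypothetical file, table and column names (a
GiST index is the usual choice for PostGIS geometry columns):

  -- as app_user, connected to gisdb:
  \i load_dataset.sql        -- or pg_restore / COPY, as appropriate
  CREATE INDEX roads_geom_idx ON roads USING gist (geom);
  VACUUM ANALYZE;            -- refresh planner statistics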
4. Create a 'read-only' user (login role) with only read access to the
user-defined tables (see the sketch below):
GRANT SELECT ... TO read_only_user
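A sketch of step 4, assuming PostgreSQL 9.0 or later (earlier versions
need one GRANT per table):

  CREATE ROLE read_only_user LOGIN PASSWORD '...';
  GRANT USAGE ON SCHEMA public TO read_only_user;
  GRANT SELECT ON ALL TABLES IN SCHEMA public TO read_only_user;
  -- also cover tables created by future reloads:
  ALTER DEFAULT PRIVILEGES FOR ROLE app_user IN SCHEMA public
    GRANT SELECT ON TABLES TO read_only_user;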
5. Optimize and secure the session with the following parameter:
SET default_transaction_read_only TO TRUE;
(Note: SET TRANSACTION READ ONLY affects only the current transaction,
and setting transaction_read_only to FALSE would do the opposite of
what is intended here.)
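To enforce this for every session of the read-only role instead of
relying on each client to issue the SET, one option (my assumption,
not part of the original steps) is:

  ALTER ROLE read_only_user SET default_transaction_read_only = on;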
6. Go back to step 3 whenever new data arrives or after a hard disk
crash.
Questions:
=> Any comments on securing such a PostgreSQL instance further?
Is it OK that all user tables reside in schema PUBLIC?
=> Any comments on making this PostgreSQL instance 'robust'?
E.g. which situations (other than hard disk crashes) can leave a
read-only dataset in an inconsistent state from which the PostgreSQL
server can't restart? An immediate shutdown?
=> Any comments on speeding up/optimizing such a read-only dataset?
What about wal_level and archive_mode?
Yours, Stefan