Thread: Pushing result set to table on different machine?
Hi,

I recently added a linux/windows laptop to our home network. I have started Postgresql (8.0.3) on the laptop and have "replicated" my main database on the laptop (pg_dump, ftp, and pg-"undump").

I would like to keep them reasonably in sync. Slony is overkill, and I think the "mirror" contrib is possibly overkill also.

I have installed the dblink package and can easily "pull" data from my main server back to the laptop. Is there an elegant way to "push" the data from the main server to the laptop?

Jerry
Hi,

I have a PostGIS-enabled Postgres database. For institutional (un)reasons I'm the database owner in all but reality.

I'm writing a plperl function to get the lat & lon coords from a geometry and execute a system call to get the depth (or altitude) at a location from a global data model, so I and other users can do something like:

  update site set depth = depth(todeg(site_geom));

(where site_geom is a point geometry. The todeg function is plpgsql to convert the point to lat/lon coords from any other projection, to match the coordinate system of the global grid. It works fine.)

My problem is that the language needs to be plperlu (the unsecured implementation of plperl) to be allowed to execute the system call to get the depth at the specified location. To work, the plperlu function must be created by the superuser, who I assume is postgres.

That is not me. Sigh. Is there any way I can develop (with the various "create or replace function" iterations this will probably require) this plperlu function as a non-superuser?

Thanks,

Brent Wood
On Tue, 13 Sep 2005, Jerry LeVan wrote:

> Hi,
> I recently added a linux/windows laptop to our home network.
>
> I have started Postgresql (8.0.3) on the laptop and have
> "replicated" my main database on the laptop. (pg_dump,
> ftp and pg-"undump").
>
> I would like to keep them reasonably in sync. Slony is
> overkill and I think the "mirror" contrib is possibly
> overkill also.
>
> I have installed the dblink package and can easily "pull"
> data from my main server back to the laptop.
>
> Is there an elegant way to "push" the data from the main
> server to the laptop?

I have not tried this with Postgres, but have done similar things with other databases and related packages.

In the Postgres case, a script on the server could remotely run a command on the laptop (ssh/rexec/rsh, as you prefer), piping a COPY table TO STDOUT (on the server) into a COPY table FROM STDIN (remotely, on the laptop). Something to empty the tables first might help. Any command can be set up to run on the laptop but be invoked from the server, and the output of a server-run command can, as above, be piped as input to a command run by the laptop (but started from/by the server).

I don't know that I'd recommend it, but you may also be able to rsync the database directory.

Set up the script & run it on the server whenever you want.

Brent Wood
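[A sketch of the push Brent describes, run on the server. The host name, database name, and table list are illustrative assumptions, and it presumes passwordless ssh to the laptop and matching schemas on both ends:]

```shell
#!/bin/sh
# Hypothetical push script (names are illustrative, not from the thread).
DB=mydb
REMOTE=laptop            # ssh target for the laptop

for tbl in site observations; do
    # Empty the laptop's copy first, then stream the server's rows across.
    ssh "$REMOTE" "psql -d $DB -c 'TRUNCATE $tbl'"
    psql -d "$DB" -c "COPY $tbl TO STDOUT" \
        | ssh "$REMOTE" "psql -d $DB -c 'COPY $tbl FROM STDIN'"
done
```

TRUNCATE followed by a full COPY is crude but keeps the two copies byte-identical; it obviously won't scale to large tables the way a real replication tool would.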
Brent Wood <b.wood@niwa.co.nz> writes:
> That is not me. Sigh. Is there any way I can develop (with the various
> "create or replace function" iterations this will probably require) this
> plperlu function as a non superuser?

If you could, it would be a security hole, which we would fix with utmost alacrity. Untrusted-language functions may only be created by superusers.

Can you compartmentalize the depth-accessing function as a small plperlu function, and do all the interesting stuff in plain plperl atop that?

			regards, tom lane
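[Tom's split might look something like the sketch below. The function name, the GMT grdtrack invocation, and the grid path are all assumptions for illustration, not the poster's actual code; only the thin wrapper needs the superuser:]

```sql
-- Created once by the superuser: the one unavoidable plperlu piece.
-- The external command (GMT's grdtrack) and grid path are illustrative.
CREATE OR REPLACE FUNCTION raw_depth(lat float8, lon float8)
RETURNS float8 AS $$
    my ($lat, $lon) = @_;
    # The single system call: sample the global grid at this point.
    my $out = `echo "$lon $lat" | grdtrack -G/data/topo/global_grid.grd`;
    my (undef, undef, $z) = split ' ', $out;
    return $z;
$$ LANGUAGE plperlu;
```

Everything else (the todeg reprojection, the bounding-box preprocessing) can then stay in trusted plpgsql/plperl owned and iterated on by the ordinary user, calling raw_depth() at the end.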
On Wed, 14 Sep 2005, Tom Lane wrote:

> Brent Wood <b.wood@niwa.co.nz> writes:
> > That is not me. Sigh. Is there any way I can develop (with the various
> > "create or replace function" iterations this will probably require) this
> > plperlu function as a non superuser?
>
> If you could, it would be a security hole, which we would fix with
> utmost alacrity. Untrusted-language functions may only be created
> by superusers.

Pretty much what I expected, but I figured I'd ask :-)

> Can you compartmentalize the depth-accessing function as a small
> plperlu function, and do all the interesting stuff in plain plperl
> atop that?

Much of the preprocessing required is in plpgsql using PostGIS functions. The whole Perl thing is only about 10 lines long, so it's not worth splitting if avoidable. The depth-accessing command is a one-line system call with coords as parameters. The rest just builds a bounding box for the point location so that the call to GMT restricts its access to the terrain model to a few square degrees' worth of data and not the entire few Gb :-) A bit faster that way!

One option is another system with Postgres/PostGIS where I am superuser & can develop, then get the superuser of the working database to run the SQL for me to create the functions. Or see if the situation can justify me getting superuser status. Working with code can be easier than wrestling with bureaucracy :-)

Thanks,

Brent
On Wed, Sep 14, 2005 at 04:11:25PM +1200, Brent Wood wrote:
> Hi,
>
> My problem is that the language needs to be plperlu (the unsecured
> implementation of plperl) to be allowed to execute the system call
> to get the depth at the specified location. To work, the plperlu
> function must be created by the superuser, who I assume is postgres.

The reason that only the superuser can do this is that a procedure in any untrusted PL can do things like 'rm -rf $PGDATA' and have it obeyed.

> That is not me. Sigh. Is there any way I can develop (with the
> various "create or replace function" iterations this will probably
> require) this plperlu function as a non superuser?

Nope. Assuming you can't get access as db superuser, you might want to look into the LISTEN/NOTIFY system:

http://blackhawk.supernews.net/listen.pl.txt
http://www.postgresql.org/docs/current/static/libpq-notify.html
http://www.postgresql.org/docs/current/static/sql-listen.html

Cheers,
D
--
David Fetter david@fetter.org http://fetter.org/
phone: +1 510 893 6100  mobile: +1 415 235 3778

Remember to vote!
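[The shape of David's LISTEN/NOTIFY suggestion: trusted SQL inside the database queues the work, and a daemon connected from outside, running under an OS account that may call external programs, picks it up. A minimal sketch with assumed table and channel names:]

```sql
-- Inside the database: all trusted SQL, no superuser needed.
-- Table and channel names here are illustrative.
CREATE TABLE depth_requests (
    id    serial PRIMARY KEY,
    lat   float8,
    lon   float8,
    depth float8              -- filled in later by the external daemon
);

-- A normal user queues a request and wakes the daemon:
INSERT INTO depth_requests (lat, lon) VALUES (-41.3, 174.8);
NOTIFY depth_request;

-- A separate client process (e.g. a Perl script like the listen.pl
-- example above) has previously issued:
--   LISTEN depth_request;
-- On each notification it reads rows WHERE depth IS NULL, runs the
-- GMT system call itself, and writes the result back with an UPDATE.
```

The system call thus never runs inside the backend at all, which is why no untrusted PL (and no superuser) is required.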