Re: raw output from copy - Mailing list pgsql-hackers
From | Kohei KaiGai
---|---
Subject | Re: raw output from copy
Date |
Msg-id | CADyhKSVeBHcTgcoSy7A0Y=uqESvUgr6=71H5oOBWYHYtUn8e7A@mail.gmail.com
In response to | Re: raw output from copy (Pavel Stehule <pavel.stehule@gmail.com>)
List | pgsql-hackers
2016-12-06 16:59 GMT+09:00 Pavel Stehule <pavel.stehule@gmail.com>:
>
> 2016-12-06 1:50 GMT+01:00 Kohei KaiGai <kaigai@kaigai.gr.jp>:
>>
>> 2016-12-05 22:45 GMT+09:00 Pavel Stehule <pavel.stehule@gmail.com>:
>> >
>> > There are more goals:
>> >
>> > 1. User-friendly import of text or binary data - importing text data
>> > from a file (with psql) is possible, but you have to load the content
>> > into a psql variable. For binary data you have to use a workaround
>> > based on LO and a transformation from LO to bytea.
>> >
>> > 2. User-friendly export of text or binary data - currently, binary
>> > data can be exported only via a transformation to LO. XML has very
>> > interesting features when it is passed from/to the client in binary
>> > form. This is impossible in psql now.
>> >
>>  :
>> <snip>
>>  :
>> >> It seems to me that extending the COPY statement for this
>> >> optimization is a bit of an overkill solution. Can we find an
>> >> alternative solution that builds on the existing infrastructure?
>> >
>> > The advantage and sense of COPY RAW was reusing an existing interface.
>> > The question was: how can I export/import binary data simply from the
>> > psql console?
>> >
>> OK, I got your point.
>>
>> Likely we can implement the feature without enhancing the COPY
>> statement, by adding a special-purpose function and a \xxx command in
>> psql.
>>
>> Let's assume the two commands below in psql:
>>
>>   \blob_import <table_name> <column_name> (STDIN|<filename>)
>>   \blob_export <query> (STDOUT|<filename>)
>>
>> On \blob_import, the psql command reads the binary contents from either
>> stdin or a file, then calls a special-purpose function that takes three
>> arguments: table name, column name, and a binary data chunk.
>> PQexecParams() of libpq allows delivering the data chunk while keeping
>> the binary data format, so the special-purpose function can look up the
>> destination table/column and construct a tuple that contains the
>> supplied data chunk. (I think the xxxx_recv handler should be used for
>> data validation, but that is not an element of this feature.)
>>
>> On \blob_export, the psql command sets up a simple query as follows:
>>   SELECT blob_export((<user's supplied query>))
>> For example,
>>   \blob_export SELECT binary_data FROM my_table WHERE id = 10 /tmp/aaa
>> shall be transformed to
>>   SELECT blob_export((SELECT binary_data FROM my_table WHERE id = 10))
>
> This is the reason why I prefer a COPY statement - because it does all
> the necessary things naturally. But if there is disagreement against
> COPY RAW, it can be implemented as psql commands.
>
Yes, both approaches can implement what you want to do.
I agree it is valuable if psql can import/export a particular item from a
simple shell script; however, there is no consensus yet on how to
implement it.
If psql supports the special \xxx command, it is equally convenient from
the standpoint of users, with no enhancement of the statement.

I hope committers will comment on which approach we should take.

Thanks,

> export should be similar to the \g, \gset feature
>
> so
>
>   SELECT xmldoc FROM xxxx
>   \gbinary_store xxxx.xxx
>
> import is maybe better solved by the proposed file references in queries
>
> Regards
>
> Pavel
>
>>
>> This function is declared as:
>>   blob_export(anyelement) RETURNS bytea
>> As long as the user-supplied query returns exactly one column and one
>> row, it can transform the argument into a binary stream, which the psql
>> command then receives and dumps somewhere: stdout or a file.
>>
>> What do you think?
>>
>> Thanks,
>> --
>> KaiGai Kohei <kaigai@kaigai.gr.jp>
--
KaiGai Kohei <kaigai@kaigai.gr.jp>
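As a rough illustration of the \blob_import idea discussed above (this is not code from the thread or from any patch), the sketch below shows how a client could hand a file's raw bytes to the server in binary format through PQexecParams(). The server-side helper pg_blob_import(text, text, bytea) is a hypothetical name used only for this example.

/*
 * Sketch only: push a file's bytes to a hypothetical server-side
 * function pg_blob_import(text, text, bytea), sending the third
 * parameter in binary format, which is what the proposal relies on.
 */
#include <stdio.h>
#include <stdlib.h>
#include <libpq-fe.h>

static int
sketch_blob_import(PGconn *conn, const char *table,
                   const char *column, const char *path)
{
    FILE       *fp = fopen(path, "rb");
    char       *buf;
    long        len;
    PGresult   *res;
    int         ok;

    if (fp == NULL)
        return -1;
    fseek(fp, 0, SEEK_END);
    len = ftell(fp);
    fseek(fp, 0, SEEK_SET);
    buf = malloc(len);
    if (fread(buf, 1, len, fp) != (size_t) len)
    {
        fclose(fp);
        free(buf);
        return -1;
    }
    fclose(fp);

    {
        const char *values[3] = {table, column, buf};
        int         lengths[3] = {0, 0, (int) len};
        int         formats[3] = {0, 0, 1};    /* third argument is binary */

        res = PQexecParams(conn,
                           "SELECT pg_blob_import($1::text, $2::text, $3::bytea)",
                           3, NULL, values, lengths, formats, 0);
    }
    free(buf);

    ok = (PQresultStatus(res) == PGRES_TUPLES_OK);
    PQclear(res);
    return ok ? 0 : -1;
}

Setting formats[2] = 1 together with an explicit length is what keeps the chunk in binary form on the wire, which is the property of PQexecParams() the proposal depends on.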
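In the same spirit, a minimal sketch of the export side, assuming the proposed blob_export(anyelement) RETURNS bytea existed on the server: the client runs the rewritten query with a binary result format and dumps the single result cell to a file.

/*
 * Sketch only: run SELECT blob_export((<user's query>)) with a binary
 * result format and write the returned bytea to a file, mirroring what
 * the proposed \blob_export command would do.
 */
#include <stdio.h>
#include <libpq-fe.h>

static int
sketch_blob_export(PGconn *conn, const char *user_query, const char *path)
{
    char        sql[8192];
    PGresult   *res;
    FILE       *fp;

    snprintf(sql, sizeof(sql), "SELECT blob_export((%s))", user_query);

    /* last argument = 1 asks libpq for the result in binary format */
    res = PQexecParams(conn, sql, 0, NULL, NULL, NULL, NULL, 1);
    if (PQresultStatus(res) != PGRES_TUPLES_OK ||
        PQntuples(res) != 1 || PQnfields(res) != 1 ||
        PQgetisnull(res, 0, 0))
    {
        PQclear(res);
        return -1;
    }

    fp = fopen(path, "wb");
    if (fp == NULL)
    {
        PQclear(res);
        return -1;
    }
    fwrite(PQgetvalue(res, 0, 0), 1, PQgetlength(res, 0, 0), fp);
    fclose(fp);
    PQclear(res);
    return 0;
}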