I need to import some log files of an application [...] The import would be easy if the files had a constant name, but the app creates CSV files with names like "ExportYYYYMMDD".
So how would I get the filenames into the SQL script?
Do a man on find and look for -exec.
I could find the files and exec a shell script, but how can I have a SQL script take the found filenames as a parameter?
The SQL script needs to create a temp table, COPY the file whose name it got as a parameter into that temp table, and then insert from there into the log table.
How would I get the filenames into the SQL script?
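As the comment suggests, find with -exec can collect the date-stamped names. The sketch below only simulates the situation: the scratch directory and filenames are invented, and echo stands in for the real import command, which would be a wrapper that calls psql with the filename.

```shell
# Simulate the export directory with date-stamped CSVs (names are made up).
dir=$(mktemp -d)
touch "$dir/Export20240115.csv" "$dir/Export20240116.csv" "$dir/notes.txt"

# find picks out only the export files; -exec hands each match to a command.
# In the real script that command would be a wrapper invoking psql with the
# filename instead of basename.
result=$(find "$dir" -name 'Export*.csv' -exec basename {} \; | sort)
echo "$result"

rm -rf "$dir"
```

Each matched path replaces the `{}` placeholder, so the wrapper script receives the filename as an ordinary positional argument.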
Assuming your main script - the one that mounts the directory and finds the file name - is in bash, you can easily put a small SQL script into a heredoc block with variable substitution:
```shell
# some script stuff that mounts the remote directory
# and sets the variable logfilename
...
psql -your -connection -parameters <<EOS
some preliminary setup statements
\copy .... from $logfilename ...
some processing statements
EOS
```
The disadvantage of this approach is that it is difficult to impossible to detect and handle statement-level errors. But for short scripts like simple imports this may not be an issue, or it may be easily solved by wrapping everything in a BEGIN; ... COMMIT; block.
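To make that concrete, here is a hedged sketch of the heredoc approach: the shell substitutes $logfilename into the SQL before psql ever sees it, and the statements are wrapped in BEGIN/COMMIT. The path, table, and column names (t_import, logtable) are invented for illustration; the psql invocation is shown only as a comment.

```shell
# Filename as set by the mounting/finding part of the script (invented path).
logfilename=/mnt/export/Export20240115.csv

# Build the SQL in an unquoted heredoc; the shell expands $logfilename,
# while \copy passes through literally. t_import and logtable are
# hypothetical names.
sql=$(cat <<EOS
BEGIN;
CREATE TEMP TABLE t_import (ts timestamptz, msg text);
\copy t_import FROM '$logfilename' WITH (FORMAT csv)
INSERT INTO logtable SELECT ts, msg FROM t_import;
COMMIT;
EOS
)
echo "$sql"

# In the real script this would be piped to psql; -v ON_ERROR_STOP=1 makes
# psql stop at the first failed statement instead of ploughing on:
#   printf '%s\n' "$sql" | psql -your -connection -parameters -v ON_ERROR_STOP=1
```

Combined with the transaction wrapper, ON_ERROR_STOP=1 means a failing COPY or INSERT aborts before COMMIT, so a bad file leaves the log table untouched.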