Hello group:
I am working with huge datasets (each dataset has
over 50K lines, and I have 100 such datasets).
Previously I wrote plain SQL statements (INSERTs,
including INSERTs nested within INSERT statements).
However, the process ran for days because of the
number of lines and the large number of datasets.
I am about to try the COPY command. Here is a
snippet of my data.
Table Gene:
gene_no |  gene_name
--------------------
   1    |     abc
   2    |     def
   3    |     kir
   .    |     ...
   .    |     ...
   N    |     xxx
--------------------
Here is how my data looks in tab-delimited format:
gene_name     expr_value    exp_value2
 def             100.9        300.3
 kir             200.4        334.4
 sir              39.9          4.9
 ..               ....        .....
(30K lines like these)
Now I want to use the COPY command to load this whole
tab-delimited file into the following table:
Table: exp
exp_no  |  gene_no (FK)  |  exp_one  |  exp_two
-----------------------------------------------
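
Here is roughly what I am planning to try (exp_stage
is just a name I made up for a temporary staging table
whose columns mirror the tab-delimited file, and the
file path is a placeholder):

  -- staging table that mirrors the file layout
  CREATE TABLE exp_stage (
      gene_name  text,
      exp_one    numeric,
      exp_two    numeric
  );

  -- bulk-load the file; tab is COPY's default
  -- delimiter in text format
  COPY exp_stage (gene_name, exp_one, exp_two)
      FROM '/path/to/my_data.txt';

(I understand COPY ... FROM 'file' reads the file on
the server side; from psql I could use \copy instead
if the file sits on my client machine.)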
I have two questions:
1. In table 'exp' I have gene_no, not gene_name, while
the tab-delimited file contains gene_name. What is the
ideal (and fast) way to link these up after COPYing the
tab-delimited file? Should I also include gene_name in
the 'exp' table, or have Postgres link to the 'Gene'
table through the 'gene_no' foreign key (see the sketch
after question 2)?
2. Can this be done with a simple SQL statement, or do
I have to write a PL/pgSQL function?
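
To make question 1 concrete, here is the kind of single
statement I imagine would link everything up after the
COPY, assuming exp_no is a serial column that fills
itself and using the staging table sketched above:

  -- move the staged rows into exp, translating
  -- gene_name to gene_no via the Gene table
  INSERT INTO exp (gene_no, exp_one, exp_two)
  SELECT g.gene_no, s.exp_one, s.exp_two
  FROM   exp_stage s
  JOIN   gene g ON g.gene_name = s.gene_name;

If that is the right idea, a plain SQL statement would
be enough and no PL/pgSQL would be needed - but I am
not sure whether this is the fast way, or whether I
should index gene(gene_name) first. Is this the right
approach?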
Would anyone please help me figure out how to get
around this problem?
Thank you.
Srini