Thread: COPY FROM
Dear,
I need to do a bulk upload (2,600,000 records) of data into a PostgreSQL (v8.0.3) table. I'm trying to achieve this from Visual Basic with ADO and psqlODBC (v8.1.2) but I can't get it working. Currently my code looks like this.
Dim conn As New ADODB.Connection
Dim query As String

'DSN-less connection
query = "DRIVER={PostgreSQL Unicode};SERVER=10.100.1.24;PORT=2345;DATABASE=bigdb;BoolsAsChar=0;TrueIsMinus1=1;Debug=0;CommLog=0"
conn.CursorLocation = adUseClient
conn.Open query, "bad", "xxxxxxxx"

query = "COPY dunn_main (duns, company, company_short, zip, phone, employee_number, legal_id, sic_id, source_id) " & _
        "FROM STDIN WITH NULL AS 'NULL' DELIMITER AS ','"
conn.Execute query, , adCmdText + adExecuteNoRecords + adAsyncExecute
In the driver logging I can see that it's waiting for the data now, but I can't really figure out how to deliver it. Since the source data (as a text file with fixed length fields) is only available on the client side and needs some processing before being ready to import, I'm using something like this to prepare the data:
Private Type Dunn_Record
    CO_NAME As String * 90
    PCODE As String * 8
    DUNS As String * 9
    EMPS_COMP As String * 9
    LE As String * 2
    L As String * 1
    TEL_NBR As String * 14
    US72 As String * 4
    crlf As String * 2
End Type

Dim record As Dunn_Record
Dim filehandle As Integer
Dim filename As String
Dim numLines As Long
Dim line As Long

filehandle = FreeFile
filename = "E:\source.txt"
Open filename For Random Access Read Lock Read Write As #filehandle Len = Len(record)
numLines = LOF(filehandle) / Len(record)

For line = 2 To numLines
    Get #filehandle, line, record
    With record
        query = query & CLng(.DUNS) & ","
        query = query & "'" & Replace(Trim(.CO_NAME), "'", "''") & "',"
        query = query & "'" & ascii_easy(.CO_NAME) & "',"
        query = query & "'" & Trim(.PCODE) & "',"
        query = query & phone(.TEL_NBR) & ","
        If Len(Trim(.EMPS_COMP)) Then query = query & CLng(.EMPS_COMP) Else query = query & "NULL"
        query = query & ","
        If Len(Trim(.LE)) Then query = query & CLng(.LE) Else query = query & "NULL"
        query = query & ","
        query = query & CLng(.US72) & ","
        query = query & rs!source_id
    End With
    'DELIVER THE DATA IN query TO THE DRIVER
Next line

I have tried several methods to deliver the prepared data to the driver, but without any success.
- Writing to STDOUT
Private Declare Function GetStdHandle Lib "Kernel32" (ByVal nStdHandle As Long) As Long
Private Declare Function WriteFile Lib "Kernel32" (ByVal hFile As Long, ByVal lpBuffer As String, ByVal nNumberOfBytesToWrite As Long, lpNumberOfBytesWritten As Long, lpOverlapped As Any) As Long
Private Const STD_OUTPUT_HANDLE = -11&

Dim stdhandle As Long
Dim llResult As Long
stdhandle = GetStdHandle(STD_OUTPUT_HANDLE)
WriteFile stdhandle, query, Len(query), llResult, ByVal 0&
- Writing to a socket
Dim socket As New Winsock
With socket
    .Protocol = sckUDPProtocol
    .RemoteHost = "10.100.1.24"
    .RemotePort = 2345
    .Connect
End With
socket.SendData query
- Executing it
conn.Execute query
- Writing to some stream
Dim str As New Stream
With str
    .Mode = adModeWrite
    .Open
End With
str.WriteText query
So basically my question is: how do I deliver the prepared data to the driver? Any help (tips, working code, example, ...) would be appreciated.
Best regards
>>> "Miguel Juan" <mjuan@cibal.es> 2006-02-08 14:57 >>>
----- Original Message -----From: Bart DegryseCc: Bart DegryseSent: Wednesday, February 08, 2006 11:03 AMSubject: [ODBC] COPY FROMDear,
I need to do a bulk upload (2,600,000 records) of data into a PostgreSQL (v8.0.3) table. I'm trying to achieve this from Visual Basic with ADO and psqlODBC (v8.1.2) but I can't get it working. Currently my code looks like this.Dim conn As New ADODB.Connection Dim query As String 'DSN less connection query = "DRIVER={PostgreSQL Unicode};SERVER=10.100.1.24;PORT=2345;DATABASE=bigdb;BoolsAsChar=0;TrueIsMinus1=1;Debug=0;CommLog=0" conn.CursorLocation = adUseClient conn.Open query, "bad", "xxxxxxxx" query = "COPY dunn_main (duns, company, company_short, zip, phone, employee_number, legal_id, sic_id, source_id) " & _ "FROM STDIN WITH NULL AS 'NULL' DELIMITER AS ','" conn.Execute query, , adCmdText + adExecuteNoRecords + adAsyncExecuteIn the driver logging I can see that it's waiting for the data now, but I can't really figure out how to deliver it. Since the source data (as a text file with fixed length fields) is only available on client side and needs some processing before being ready to import I'm using something like this to prepare the data:
Private Type Dunn_Record CO_NAME As String * 90 PCODE As String * 8 DUNS As String * 9 EMPS_COMP As String * 9 LE As String * 2 L As String * 1 TEL_NBR As String * 14 US72 As String * 4 crlf As String * 2 End Type Dim record As Dunn_Record Dim filehandle As Integer Dim filename As String Dim numLines as long Dim line As Long filehandle = FreeFile filename = "E:\source.txt" Open filename For Random Access Read Lock Read Write As #filehandle Len = Len(record) numLines = LOF(1) / Len(record) For line = 2 to numLines Get #filehandle, line, record With record query = query & CLng(.DUNS) & "," query = query & "'" & Replace(Trim(.CO_NAME), "'", "''") & "'," query = query & "'" & ascii_easy(.CO_NAME) & "'," query = query & "'" & Trim(.PCODE) & "'," query = query & phone(.TEL_NBR) & "," If Len(Trim(.EMPS_COMP)) Then query = query & CLng(.EMPS_COMP) Else query = query & "NULL" query = query & "," If Len(Trim(.LE)) Then query = query & CLng(.LE) Else query = query & "NULL" query = query & "," query = query & CLng(.US72) & "," query = query & rs!source_id End With 'DELIVER THE DATA IN query TO THE DRIVER Next lineI have tried several methods to deliver the prepared data to the driver but without any succes.
- Writing to STDOUT
Private Declare Function GetStdHandle Lib "Kernel32" (ByVal nStdHandle As Long) As Long Private Declare Function WriteFile Lib "Kernel32" (ByVal hFile As Long, ByVal lpBuffer As String, ByVal nNumberOfBytesToWrite As Long, lpNumberOfBytesWritten As Long, lpOverlapped As Any) As Long Private Const STD_OUTPUT_HANDLE = -11& Dim stdhandle As Long Dim llResult As Long stdhandle = GetStdHandle(STD_OUTPUT_HANDLE) WriteFile stdhandle, query, Len(query), llResult, ByVal 0&- Writing to a socket
Dim socket As New Winsock With socket .Protocol = sckUDPProtocol .RemoteHost = "10.100.1.24" .RemotePort = 2345 .Connect End With socket.SendData query- Executing it
conn.Execute query- Writing to some stream
Dim str As New Stream With str .Mode = adModeWrite .Open End With str.WriteText querySo basically my question is : how do I deliver the prepared data to the driver? Any help (tips, working code, example, ...) would be appreciated.
Best regards
Hi Bart,

Just create an ODBC entry on your local computer, add a reference to ActiveX Data Objects 2.8 or 2.7 to your VB project, open an ADODB.Connection to the server and send SQL INSERT statements.

A little example:

Private Function Insert()
    Dim DBS As New ADODB.Connection
    Dim SQLString As String
    DBS.Open "Provider=MSDASQL.1;Persist Security Info=False;Extended Properties=DSN=YourODBCDatabaseName;"
    SQLString = "Insert into SomeTable (Field1,Field2,Field3) Values ('111','aaa','bbb')"
    DBS.Execute SQLString
End Function

Hope that helps... Your mail is a little bit "unreadable" ;-)

regards,
Thomas.
--
Hela Gewürzwerk Hermann Laue GmbH & Co.KG - EDV
Thomas Holschen
Beimoorweg 11, 22926 Ahrensburg
Tel.: +49 4102/496-381
http://www.hela-food.de
> In the driver logging I can see that it's waiting for the data now,
> but I can't really figure out how to deliver it. Since the source
> data (as a text file with fixed length fields) is only available
> on the client side and needs some processing before being ready to import

I think ODBC doesn't support something like standard input... or maybe it does? Did you try printing the prepared data to standard output?

> So basically my question is: how do I deliver the prepared data
> to the driver? Any help (tips, working code, example, ...)
> would be appreciated.

If you can't find a way, you can try to use INSERT statements. You may collect multiple INSERT statements into one transaction to speed up the data loading.

Regards,

Luf
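A rough, untested sketch of that batching suggestion, reusing the conn, filehandle, record and numLines names from the original post and showing only three of the dunn_main columns for brevity; ADO's BeginTrans/CommitTrans wrap all the INSERTs in a single transaction:

Dim i As Long
Dim rowSQL As String

conn.BeginTrans                                  ' one transaction instead of one per INSERT
For i = 2 To numLines
    Get #filehandle, i, record
    rowSQL = "INSERT INTO dunn_main (duns, company, zip) VALUES (" & _
             CLng(record.DUNS) & ",'" & _
             Replace(Trim(record.CO_NAME), "'", "''") & "','" & _
             Trim(record.PCODE) & "')"
    conn.Execute rowSQL, , adCmdText + adExecuteNoRecords
Next i
conn.CommitTrans                                 ' single commit for all rows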
Ludek Finstrle wrote:
>> In the driver logging I can see that it's waiting for the data now,
>> but I can't really figure out how to deliver it. Since the source
>> data (as a text file with fixed length fields) is only available
>> on the client side and needs some processing before being ready to import
>
> I think ODBC doesn't support something like standard input...
> or maybe it does? Did you try printing the prepared data to
> standard output?

I don't know about ODBC at the low level, but I've certainly never found a way to do this with DAO or ADO. I don't *think* it's possible.

>> So basically my question is: how do I deliver the prepared data
>> to the driver? Any help (tips, working code, example, ...)
>> would be appreciated.
>
> If you can't find a way, you can try to use INSERT statements.
> You may collect multiple INSERT statements into one transaction
> to speed up the data loading.

The fastest approach I've found that *is* VB-friendly is to ship the data over as a bound parameter to a stored procedure that unpacks it into rows on the server side (writing a suitable function returning SETOF RECORD to split apart the string or blob). I don't actually know if there's an upper limit on the size of such arguments, but I've pushed 30k rows or so (~1MiB of total data) without difficulty using psqlODBC and ADO Command objects. Since that's past 64kiB, and computer programmers are not terribly creative folk when it comes to max sizes, I suspect this means it'll be safe up to at least 2GiB if you have the memory to prepare and receive such a behemoth.

I have a trivial split-on-delimiter-char routine for simple stuff, and a from_csv written in PL/Perl using Text::CSV for multi-column data or anything that needs to potentially quote the delimiter appearing in the contained data. I'm not actually sure the 'simple' version is even faster in any meaningful sense, but I wrote it first so it's still around. If there's interest I should be able to post them, but it's a work project so I don't have it handy tonight.

For the CSV version, I implemented everything except multi-line values, which I had no need for - the concept would certainly allow it with a little more work in the parser function. It streams through the input, splitting and consuming lines and using return_next to build the result set, so if the query is able to consume rows as they come out (no sort or similar) it shouldn't ever need to hold both the argument and the full recordset in memory. But with my strings only in the megabyte range I haven't ever really tested carefully to prove that it doesn't.

Usage looks roughly like the following:

INSERT INTO tbl_foo (foo, bar, baz)
SELECT * FROM from_csv(?) AS t(foo int4, bar text, baz date)

with the argument bound to a string like

'foo,bar,baz
foo1,bar1,baz1
foo2,bar2,baz2'

IIRC, this took the time to load the data from ~45 seconds (using a series of INSERTs within a single transaction) to about 500ms (seemingly pretty much network constrained), so the difference is pretty dang dramatic. You have to be careful to stringify non-text values in a way that Postgres is happy with when you pack the argument; the type statements in the AS clause will then coerce things so the rest of the query can ignore the fact that they were delivered packed in text. YMMV.

One could also use something other than CSV as the serialization, and the type safety and serialization/parsing efficiency might be better if you did, but this was good enough for me and is nicely generic.
Someday I might also do one for QDataStream's packing, shipped over in a blob, since that would let my other C++/Qt apps have a more type-strict container, but that hasn't happened yet.

This technique also allows you to materialize tabular 'immediate' data for a query in its FROM clause, which can be quite handy. I've actually replaced quite a few instances where I used to prepare a temp table or repeat a prepared SELECT in a loop, varying the arguments. It avoids the repeated network latency (important in my case, though obviously not if the DB is local) and lets me feed in all the search data in one shot, so the db can often switch techniques from a series of hash or index lookups to something that shows better locality - depending on the structure of the query it can be an enormous win (one important operation went from 90s to ~300ms).

> Regards,
>
> Luf
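For the client side of that bound-parameter approach, a rough, untested sketch in the same VB/ADO style as the original post might look like the following. It assumes a server-side from_csv parser like the one described above (not shown), assumes that parser splits rows on line feeds, and reuses the conn object and three of the dunn_main columns with made-up sample values:

Dim cmd As New ADODB.Command
Dim csvData As String

' pack all rows into one newline-separated CSV string
' (in practice built in a loop, as in the original post)
csvData = "123456789,ACME INC,8500" & vbLf & _
          "987654321,FOO LTD,1200"

With cmd
    .ActiveConnection = conn
    .CommandType = adCmdText
    ' from_csv is the server-side SETOF RECORD function described above
    .CommandText = "INSERT INTO dunn_main (duns, company, zip) " & _
                   "SELECT * FROM from_csv(?) AS t(duns int4, company text, zip text)"
    .Parameters.Append .CreateParameter("csv", adLongVarChar, adParamInput, Len(csvData), csvData)
    .Execute , , adExecuteNoRecords
End With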