Thread: How to add xml data to table

How to add xml data to table

From
"Andrus Moor"
Date:
The SOAP response below contains a table of products, approx. 5000 rows.
The products table (below) has nearly the same structure as the XML data.

The products table needs to be updated from the XML data every hour.

How can I add this XML data to the products table?

Should I use the xpath() function, or is there a better approach?
I am using Npgsql and C# in ASP.NET / Mono.

Andrus.
 
 
CREATE TABLE products (
  SupplierCode char(20) primary key,
  SegmentId char(8),
  GroupId char(8),
  ClassId char(8),
  SeriesId char(8),
  VendorId char(2),
  PartNumber char(27),
  Name Text,
  Warranty Numeric(6,2),
  Price Numeric(10,4),
  Quantity Numeric(8,2)
)
 
The data that needs to be added looks like this:
 
<?xml version="1.0" encoding="utf-8"?>
<soap12:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:soap12="http://www.w3.org/2003/05/soapenvelope">
<soap12:Body>
<GetProductListResponse xmlns="http://xxx.yy.zz/">
<GetProductListResult>
<ProductList>
<Product>
<SupplierCode>001982</SupplierCode>
<SegmentId>65000000</SegmentId>
<GroupId>65010000</GroupId>
<ClassId>65010200</ClassId>
<SeriesId>10001125</SeriesId>
<VendorId>AM</VendorId>
<PartNumber>ADA3000BIBOX</PartNumber>
<Name>AMD Athlon64 3000+ (1800MHz/L2 Cache 512KB) Socket 939, BOX</Name>
<Warranty>36</Warranty>
<Price>196.00000</Price>
<Quantity>0</Quantity>
<DateExpected>1999-01-01T00:00:00</DateExpected>
<IsNewProduct>true</IsNewProduct>
</Product>
<Product>
<SupplierCode>001512</SupplierCode>
<SegmentId>65000000</SegmentId>
<GroupId>65010000</GroupId>
<ClassId>65010200</ClassId>
<SeriesId>10001125</SeriesId>
<VendorId>AM</VendorId>
<PartNumber>ADA3000AXBOX</PartNumber>
<Name>AMD Athlon64 3000+ (2000MHz/1600MHz/L2 Cache 512KB) Socket 754, BOX</Name>
<Warranty>36</Warranty>
<Price>296.00000</Price>
<Quantity>0</Quantity>
<GrossWeight>3.6000</GrossWeight>
<DateExpected>1999-01-01T00:00:00</DateExpected>
<IsNewProduct>false</IsNewProduct>
</Product>
</ProductList>
</GetProductListResult>
</GetProductListResponse>
</soap12:Body>
</soap12:Envelope>

Re: How to add xml data to table

From
"Francisco Figueiredo Jr."
Date:
I think your best bet would be to go with XmlReader, as it provides
fast, read-only parsing of the document.

From the MS documentation about LINQ to XML:
http://msdn.microsoft.com/en-us/library/bb387048.aspx

"XmlReader is a fast, forward-only, non-caching parser.

LINQ to XML is implemented on top of XmlReader, and they are tightly
integrated. However, you can also use XmlReader by itself.

For example, suppose you are building a Web service that will parse
hundreds of XML documents per second, and the documents have the same
structure, meaning that you only have to write one implementation of
the code to parse the XML. In this case, you would probably want to
use XmlReader by itself.

In contrast, if you are building a system that parses many smaller XML
documents, and each one is different, you would want to take advantage
of the productivity improvements that LINQ to XML provides."


I think your case fits the first example.

This way, you could use XmlReader to extract the values, fill in the
NpgsqlParameter values, and execute the insert command.
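
For illustration, here is a minimal, untested sketch of that flow. It is only a
sketch: the connection string, the file name and the plain INSERT are assumptions
to adapt (an hourly refresh would also have to clear or update the existing rows).

using System.Collections.Generic;
using System.Globalization;
using System.Xml;
using Npgsql;

class ProductImport
{
    static void Main()
    {
        using (var conn = new NpgsqlConnection(
            "Server=127.0.0.1;Database=mydb;User Id=me;Password=secret"))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            using (var cmd = new NpgsqlCommand(
                "INSERT INTO products (SupplierCode, PartNumber, Name, Warranty, Price, Quantity) " +
                "VALUES (@code, @part, @name, @warranty, @price, @qty)", conn, tx))
            // The SOAP response is assumed to have been saved to a local file first.
            using (var reader = XmlReader.Create("GetProductListResponse.xml"))
            {
                while (reader.ReadToFollowing("Product"))
                {
                    // Collect the child elements of this <Product> as name/value pairs.
                    var f = new Dictionary<string, string>();
                    using (XmlReader p = reader.ReadSubtree())
                    {
                        p.Read();  // position on <Product>
                        p.Read();  // move to its first child
                        while (p.NodeType != XmlNodeType.None)
                        {
                            if (p.NodeType == XmlNodeType.Element)
                                f[p.LocalName] = p.ReadElementContentAsString();  // advances the reader
                            else
                                p.Read();  // skip whitespace and the </Product> end tag
                        }
                    }

                    cmd.Parameters.Clear();
                    cmd.Parameters.AddWithValue("code", f["SupplierCode"]);
                    cmd.Parameters.AddWithValue("part", f["PartNumber"]);
                    cmd.Parameters.AddWithValue("name", f["Name"]);
                    cmd.Parameters.AddWithValue("warranty", decimal.Parse(f["Warranty"], CultureInfo.InvariantCulture));
                    cmd.Parameters.AddWithValue("price", decimal.Parse(f["Price"], CultureInfo.InvariantCulture));
                    cmd.Parameters.AddWithValue("qty", decimal.Parse(f["Quantity"], CultureInfo.InvariantCulture));
                    cmd.ExecuteNonQuery();
                }
                tx.Commit();
            }
        }
    }
}

With ~5000 rows, the single transaction keeps the load reasonably fast, and the
command and its parameters are reused for every row.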

I hope it helps.



2011/10/7 Andrus Moor <eetasoft@online.ee>:



--
Regards,

Francisco Figueiredo Jr.
Npgsql Lead Developer
http://www.npgsql.org
http://gplus.to/franciscojunior
http://fxjr.blogspot.com
http://twitter.com/franciscojunior

Re: How to add xml data to table

From
"Francisco Figueiredo Jr."
Date:
I think this approach is much better, as you can solve everything on the
server itself.

About your question on the HTTP request, I don't know.
Sorry about that. :(
Maybe there is a module for PostgreSQL that enables you to make HTTP calls?



On Sat, Oct 8, 2011 at 17:15, Andrus <kobruleht2@hot.ee> wrote:



--
Regards,

Francisco Figueiredo Jr.
Npgsql Lead Developer
http://www.npgsql.org
http://gplus.to/franciscojunior
http://fxjr.blogspot.com
http://twitter.com/franciscojunior

Re: How to add xml data to table

From
"Andrus"
Date:
There is the dblink contrib module, which reads data from another PostgreSQL
database over a TCP/IP connection.
Maybe there is something similar for HTTP/ASMX data retrieval?

About the periodic call of a stored procedure: is there a cron contrib module for
PostgreSQL? Or can we force some code to run on autovacuum, or after every
server request, like a poor man's crontab in Drupal/PHP? That code could check
the time and call the refresh every hour.

Can we add trigger code to some frequently used table or view to
implement such a poor man's crontab?

Can some server-side language be used for this?
Is server-side C#/.NET/Mono already implemented?

Andrus.



-----Original Message-----
From: Francisco Figueiredo Jr.
Sent: Saturday, October 08, 2011 11:26 PM
To: Andrus
Cc: pgsql-general@postgresql.org
Subject: Re: [GENERAL] How to add xml data to table



Re: How to add xml data to table

From
"Andrus"
Date:
Thank you.
I also got the following code:

First, import the XML into a staging table:

CREATE TABLE xml_import
(
   xml_data  xml
)

with product_list as (
  select unnest(xpath(
           '/soap12:Envelope/soap12:Body/pl:GetProductListResponse/pl:GetProductListResult/pl:ProductList/pl:Product',
           xml_data,
           ARRAY[ array['xsd', 'http://www.w3.org/2001/XMLSchema-instance'],
                  array['soap12', 'http://www.w3.org/2003/05/soapenvelope'],
                  array['pl', 'http://xxx.yy.zz/']])) as product
  from xml_import
)
select (xpath('/Product/SupplierCode/text()', product)::varchar[])[1] as suppliercode,
       (xpath('/Product/SegmentId/text()', product)::varchar[])[1] as segmentid,
       (xpath('/Product/PartNumber/text()', product)::varchar[])[1] as partnumber,
       to_number((xpath('/Product/Price/text()', product)::varchar[])[1], '99999.99999') as price,
       to_number((xpath('/Product/GrossWeight/text()', product)::varchar[])[1], '9999.9999') as weight
from product_list

This looks simpler than using XmlReader, doesn't it?
How can I invoke the ASMX web service call (an HTTP POST request) from the
PostgreSQL server so that it reads the HTTP response into the xml_import table?
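
Until a server-side HTTP option is found, a rough client-side sketch like the one
below could load the raw response into xml_import. It is only a sketch: the service
URL, the request envelope and the connection string are placeholders, not the real
ones from the spec.

using System.Net;
using Npgsql;

class LoadXmlImport
{
    static void Main()
    {
        // Placeholder SOAP 1.2 request envelope; the real operation name and
        // parameters come from the service's WSDL.
        string requestEnvelope =
            "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
            "<soap12:Envelope xmlns:soap12=\"http://www.w3.org/2003/05/soapenvelope\">" +
            "<soap12:Body><GetProductList xmlns=\"http://xxx.yy.zz/\" /></soap12:Body>" +
            "</soap12:Envelope>";

        string response;
        using (var web = new WebClient())
        {
            // SOAP 1.2 calls are POSTed as application/soap+xml.
            web.Headers[HttpRequestHeader.ContentType] = "application/soap+xml; charset=utf-8";
            response = web.UploadString("http://xxx.yy.zz/ProductService.asmx", requestEnvelope);
        }

        using (var conn = new NpgsqlConnection(
            "Server=127.0.0.1;Database=mydb;User Id=me;Password=secret"))
        {
            conn.Open();
            // Cast the text parameter to xml on the server side.
            using (var cmd = new NpgsqlCommand(
                "INSERT INTO xml_import (xml_data) VALUES (CAST(@x AS xml))", conn))
            {
                cmd.Parameters.AddWithValue("x", response);
                cmd.ExecuteNonQuery();
            }
        }
    }
}

A TRUNCATE of xml_import before the INSERT would keep only the latest response.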

How can I call a stored procedure periodically, every hour, on the server?

In that case we could create a stored procedure, and client-side code would not be
necessary at all.

Andrus.

-----Original Message-----
From: Francisco Figueiredo Jr.
Sent: Saturday, October 08, 2011 9:38 PM
To: Andrus Moor
Cc: pgsql-general@postgresql.org
Subject: Re: [GENERAL] How to add xml data to table



How to fix lost synchronization with server

From
"Andrus"
Date:
After upgrading the server to Postgres 9.3 on Debian, the customer cannot create
backups anymore. pg_dump returns the
error "lost synchronization with server":

"C:\myapp\..\pg_dump\pg_dump.exe" -ib -Z3 -f "C:\mybackup.backup" -Fc -h
1.2.3.4 -U user -p 5432 mydb

pg_dump: Dumping the contents of table "attachme" failed: PQgetCopyData()
failed.
pg_dump: Error message from server: lost synchronization with server: got
message type "d", length 5858454
pg_dump: The command was: COPY firma1.attachme (id, idmailbox, attachname,
attachbody, attachtype) TO stdout;


The attachme table contains 4487 records.
Its total size is 1016 MB. Most of the data is contained in one bytea column.

I changed

ssl_renegotiation_limit = 512GB

in postgresql.conf, but the problem persists.

The postgres log file does not contain any information about this.

How can I fix or diagnose the issue?

Should I:

1. Add the --inserts command-line option to pg_dump? According to (1), it fixes the
issue.
2. Turn SSL off?
3. Change something in VMware? According to (1), the issue occurs only under VMware.


Server:

PostgreSQL 9.3.3 on x86_64-unknown-linux-gnu, compiled by gcc (Debian
4.7.2-5) 4.7.2, 64-bit
Debian Linux x64 is running under VMware, 2 cores

Apache and Mono 3.2.8 with a mod_mono MVC4 application are also running on this
server


Client:

A Windows computer running the 9.3 pg_dump.exe over the LAN, but the external IP
address (1.2.3.4) is used


It worked when the server was a Windows 2003 server running an earlier Postgres 9
without SSL.


Andrus.

(1) https://groups.google.com/forum/#!topic/pgsql.bugs/-bS1Lba3txA



Re: How to fix lost synchronization with server

From
Tom Lane
Date:
"Andrus" <kobruleht2@hot.ee> writes:
> pg_dump: Dumping the contents of table "attachme" failed: PQgetCopyData()
> failed.
> pg_dump: Error message from server: lost synchronization with server: got
> message type "d", length 5858454
> pg_dump: The command was: COPY firma1.attachme (id, idmailbox, attachname,
> attachbody, attachtype) TO stdout;

> The attachme table contains 4487 records.
> Its total size is 1016 MB. Most of the data is contained in one bytea column.

I think this is probably an out-of-memory situation inside pg_dump, ie
libpq failing to make its input buffer large enough for the incoming row.
It's hard to believe that there's not 6MB available on any modern machine,
so I'm thinking this is an OS-level restriction on how much memory we can
get.  On a Unix machine I'd recommend looking at the ulimit settings
pg_dump is being run under.  Dunno the equivalent for Windows.

            regards, tom lane


Re: How to fix lost synchronization with server

From
"Andrus"
Date:
Hi!

>> pg_dump: Error message from server: lost synchronization with server: got
>> message type "d", length 5858454
>I think this is probably an out-of-memory situation inside pg_dump, ie
>libpq failing to make its input buffer large enough for the incoming row.
>It's hard to believe that there's not 6MB available on any modern machine,
>so I'm thinking this is an OS-level restriction on how much memory we can
>get.  On a Unix machine I'd recommend looking at the ulimit settings
>pg_dump is being run under.  Dunno the equivalent for Windows.

The backup computer has a modern Windows client OS.
It has GBs of memory and can use a swap file.

To my knowledge, there is no memory setting in Windows that can
restrict a 6 MB allocation.
On memory shortage, Windows shows a message like "Increasing swap file size";
the customer didn't report such a message.

The dump worked for years without issues when the server was a 32-bit Windows 2003
server and Postgres and pg_dump were an earlier version 9 (but new rows have been
added to the attachme table since the upgrade).

How can I create backup copies or diagnose the issue?
I can change the pg_dump execution parameters.
I can install VC++ Express and compile something to add diagnostics if that
would help.
Maybe this message could be improved to include more details about the reason.

Andrus.



Re: How to fix lost synchronization with server

From
David G Johnston
Date:
Andrus Moor wrote
> The dump worked for years without issues when the server was a 32-bit Windows 2003
> server and Postgres and pg_dump were an earlier version 9 (but new rows have been
> added to the attachme table since the upgrade).
>
> How can I create backup copies or diagnose the issue?
> I can change the pg_dump execution parameters.
> I can install VC++ Express and compile something to add diagnostics if that
> would help.
> Maybe this message could be improved to include more details about the reason.

Given this is an upgrade, though the specifics were omitted, is there some
possibility of a pg_dump/server version mismatch being the cause?  I could
see where a 32-bit client connecting to a 64-bit server could possibly
exhibit apparent memory-related issues.

David J.






Re: How to fix lost synchronization with server

From
"Andrus"
Date:
Hi!

>Given this is an upgrade, though the specifics were omitted, is there some
>possibility of a pg_dump/server version mismatch being the cause?  I could
>see where a 32-bit client connecting to a 64-bit server could possibly
>exhibit apparent memory-related issues.

pg_dump is the 32-bit version; pg_dump -V returns

pg_dump (PostgreSQL) 9.3.0


The server is x64:

PostgreSQL 9.3.3 on x86_64-unknown-linux-gnu, compiled by gcc (Debian
4.7.2-5) 4.7.2, 64-bit

Can this cause the issue?

Andrus.




Re: How to fix lost synchronization with server

From
Tom Lane
Date:
"Andrus" <kobruleht2@hot.ee> writes:
>> Given this is an upgrade, though the specifics were omitted, is there some
>> possibility of a pg_dump/server version mismatch being the cause?  I could
>> see where a 32-bit client connecting to a 64-bit server could possibly
>> exhibit apparent memory-related issues.

> pg_dump is the 32-bit version; pg_dump -V returns
> pg_dump (PostgreSQL) 9.3.0

> Can this cause the issue?

Hm.  It wouldn't *cause* the issue, but certainly a 32-bit pg_dump would
have lots less headroom if there were a memory bloat problem.

I looked back at the previous thread you mentioned (bug #7914) and was
reminded that we never did understand what was going on in that report.
I'm not sure if you are seeing the same thing though.  That user reported
that he was able to see pg_dump's memory consumption bloating well beyond
what it ought to be (I suppose he was watching the process in whatever
Windows' equivalent of ps or top is).  Do you see that?

If there is a memory leakage type issue involved then it's less surprising
that a request for a "mere" 6MB would fail --- once we've leaked enough
memory, it's certainly gonna fail at some point.  This doesn't get us much
closer to understanding the problem though.

            regards, tom lane


Re: How to fix lost synchronization with server

From
"Andrus"
Date:
>I looked back at the previous thread you mentioned (bug #7914) and was
>reminded that we never did understand what was going on in that report.
>I'm not sure if you are seeing the same thing though.  That user reported
>that he was able to see pg_dump's memory consumption bloating well beyond
>what it ought to be (I suppose he was watching the process in whatever
>Windows' equivalent of ps or top is).  Do you see that?

This occurs on the customer's computer, which I didn't observe.

I added the --inserts parameter to pg_dump and ssl=false in the postgresql.conf
file.

After that, the backup works OK.
Is it OK to use the --inserts parameter?

Andrus.



Re: How to fix lost synchronization with server

From
Tom Lane
Date:
"Andrus" <kobruleht2@hot.ee> writes:
>> I looked back at the previous thread you mentioned (bug #7914) and was
>> reminded that we never did understand what was going on in that report.
>> I'm not sure if you are seeing the same thing though.  That user reported
>> that he was able to see pg_dump's memory consumption bloating well beyond
>> what it ought to be (I suppose he was watching the process in whatever
>> Windows' equivalent of ps or top is).  Do you see that?

> This occurs on the customer's computer, which I didn't observe.

> I added the --inserts parameter to pg_dump and ssl=false in the postgresql.conf
> file.

> After that, the backup works OK.

Now that we've identified the actual problem [1], I'm guessing the reason
why --inserts appears to make it go away is that then pg_dump uses a
SELECT which requires more server-side effort than COPY.  So that slows
down the server just a bit, and if the phase of the moon is favorable the
timing no longer results in this buffer-bloat behavior.  But I wouldn't
call that a reliable fix.  You'd be better off applying the patch.

            regards, tom lane

[1] http://git.postgresql.org/gitweb/?p=postgresql.git;a=commitdiff;h=2f557167b19af79ffecb8faedf8b7bce4d48f3e1


Re: How to fix lost synchronization with server

From
"Andrus"
Date:
>But I wouldn't
>call that a reliable fix.  You'd be better off applying the patch.

Where can I get this fix in binary form for 32-bit Windows?

pg_dump.exe uses the files below [1]. Is it sufficient to replace the libpq.dll
file?
Where can I get a compiled version, or how can I compile it on Windows?

[1]
libeay32.dll
libiconv.dll
libintl.dll
libpq.dll
msvcr100.dll
pg_dump.exe
ssleay32.dll
zlib1.dll



Re: How to fix lost synchronization with server

From
Michael Paquier
Date:
On Fri, May 9, 2014 at 2:37 AM, Andrus <kobruleht2@hot.ee> wrote:
> Where can I get this fix in binary form for 32-bit Windows?
Here, but you will need to wait until 9.3.5 is out:
http://www.postgresql.org/download/windows/.
If you are in a hurry, you can still compile pg_dump manually and
deploy it where you need it...

> Where can I get a compiled version, or how can I compile it on Windows?
There are many ways to do that; the most popular methods involve
MinGW or MSVC. More information is available in the docs as well:
http://www.postgresql.org/docs/9.3/static/install-windows.html

And here are some more resources.
https://wiki.postgresql.org/wiki/Running_%26_Installing_PostgreSQL_On_Native_Windows
https://wiki.postgresql.org/wiki/Building_With_MinGW

Regards,
--
Michael


Re: How to fix lost synchronization with server

From
"Andrus"
Date:
>> Where can I get this fix in binary form for 32-bit Windows?
>Here, but you will need to wait until 9.3.5 is out:
>http://www.postgresql.org/download/windows/.

When will 9.3.5, or a 32-bit Windows nightly or test build, be out?

Andrus.