Thread: import a .out file
Forgive my complete lack of knowledge here, but I have a bit of a problem!

I'm in the middle of moving a website from one server to another, and it
happens that PostgreSQL is not available on the new server (MySQL and
MSSQL are the only ones supported). I have been given a .out file which
appears to contain Postgres-specific SQL code. I tried importing it
directly into MySQL without success -- I knew it wasn't going to be that
easy, but I had to try :)

I wasn't able to find anything specific about .out files using the search
feature on postgresql.org -- is this a standard format for data dumps?
What's the best way to import this data and eventually get it into MSSQL
(preferred) or MySQL (if that's easier)?

Thanks in advance!

Jonathon
On 9/14/07, Jonathon Stierman <jonathon@vimm.com> wrote:
> I have been given a .out file which appears to contain Postgres-specific
> SQL code. [...] I wasn't able to find anything specific about .out files
> using the search feature on postgresql.org -- is this a standard format
> for data dumps?

Not sure, but maybe you can use the "strings" utility to extract all the
printable strings from the binary .out file. That may produce a large
amount of output, so redirect it to a file:

    strings a.out > strings.log

and then view the contents of strings.log to find the SQL queries. See
"man strings" for complete information on the strings utility.

regards,
-- Sibte Abbas
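To make that concrete -- a minimal sketch, with a fabricated stand-in
file (the real .out isn't available here, so "mydb.out" and its contents
below are made up) showing how to tell a binary custom-format dump from a
plain-SQL one before reaching for strings:

```shell
# Fabricate a stand-in dump so the commands below have input -- substitute
# the real .out file you were given.
printf 'CREATE TABLE users (id integer);\nCOPY users FROM stdin;\n1\n\\.\n' > mydb.out

# A binary custom-format pg_dump archive starts with the magic "PGDMP";
# a plain-SQL dump opens with readable SET/CREATE statements.
head -c 5 mydb.out; echo
head -n 5 mydb.out

# The strings trick, for the binary case (fall back to a plain copy if
# binutils' strings isn't installed):
strings mydb.out > strings.log 2>/dev/null || cp mydb.out strings.log
grep -c 'CREATE' strings.log
```

If the file turns out to be plain SQL, no extraction is needed at all: it
can be fed straight to psql.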
--- Jonathon Stierman <jonathon@vimm.com> wrote:
> I have been given a .out file which appears to contain Postgres-specific
> SQL code. [...] is this a standard format for data dumps?

I am not familiar with anything known as a ".out" file, but I have a
guess as to what it is. The pg_dump shell command will dump a database to
a text file, and the DBA who runs it is free to specify any name and
extension he or she desires. Your DBA must have chosen ".out" as his/her
preference. I like to use ".dump" myself; I've seen others use ".sql".

Your second problem is that PostgreSQL contains a lot of features and
datatypes (some part of the SQL standard, others PostgreSQL extensions to
the standard) that are not supported by MySQL or MSSQL Server. (Pretty
much every RDBMS has its own extensions not supported by the
competition.) You will have to search the ".out" file to see whether the
original DB designer employed any of the advanced features that
PostgreSQL offers. If the designer only used the "plain vanilla"
features, the database will be easier to port.

You probably will not find any how-to pages on the PostgreSQL website
showing how to port PostgreSQL to MySQL or MSSQL Server. However, if you
search MySQL's homepage, I would imagine they have some porting
instructions that would be helpful to you.

HTH,

Regards,
Richard Broersma Jr.
Jonathon Stierman wrote:
> I have been given a .out file which appears to contain Postgres-specific
> SQL code. [...] What's the best way to import this data, and eventually
> get it into MSSQL (preferred) or MySQL (if it's easier)?

First, you will have to determine how the ".out" file was produced -- ask
the person who created it. It sounds like it might be a database dump. If
that is the case, you will indeed have some work ahead of you as you
translate all the PostgreSQL-specific stuff into MySQL or MSSQL. Some of
that can be handled by database tools, but some might need to be done by
hand.

Sean
Yep, it definitely is a SQL dump -- it looks much the same as the .sql
files that I'm used to dealing with on MySQL databases.

Do you have any favorite translator tools that I could look into? The
file is massive (58 megs worth) and going through it by hand would be
quite tedious.

I did find this on the MySQL homepage:
http://solutions.mysql.com/solutions/partner.php?partner=1881&other=1
suggesting some commercial options are available. Have any of you used
this tool, or other tools, with success?

Thanks Sean, Sibte, & Richard for your help thus far -- it is most
appreciated!

Jonathon

-----Original Message-----
From: pgsql-novice-owner@postgresql.org
[mailto:pgsql-novice-owner@postgresql.org] On Behalf Of Sean Davis
Sent: Friday, September 14, 2007 1:11 PM
To: Jonathon Stierman
Cc: pgsql-novice@postgresql.org
Subject: Re: [NOVICE] import a .out file

First, you will have to determine how the ".out" file was produced -- ask
the person who created it. It sounds like it might be a database dump. If
that is the case, you will indeed have some work ahead of you as you
translate all the PostgreSQL-specific stuff into MySQL or MSSQL. Some of
that can be handled by database tools, but some might need to be done by
hand.
Sean

---------------------------(end of broadcast)---------------------------
TIP 7: You can help support the PostgreSQL project by donating at
       http://www.postgresql.org/about/donate
Jonathon Stierman wrote:
> Do you have any favorite translator tools that I could look into? The
> file is massive (58 megs worth) and going through it by hand would be
> quite tedious.
> [...]

How many tables? If it is just a few, I would ask for tab-delimited text
files from postgres and then design the database in mysql or mssql to
match the postgres database. SQL::Translator for perl is one option, but
it usually requires some hand-editing, also.

Sean
--- Jonathon Stierman <jonathon@vimm.com> wrote:
> Do you have any favorite translator tools that I could look into? The
> file is massive (58 megs worth) and going through it by hand would be
> quite tedious.

I bet most of the 58MB is just data. If you can, see if you can get
another dump of only the db SCHEMA. This is what you should take a look
at first. I would look at the datatypes and any constraints and compare
them with what the other DBs support.

If you only care about the tables and data, you could also "suck" these
tables into MS-Access using ODBC. Next you could rebuild the
relationships that existed in PostgreSQL. From this point, I see that MS
SQL Server has help docs on how to port MS-Access to MS-SQL Server.

Regards,
Richard Broersma Jr.
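If re-running pg_dump with --schema-only isn't an option, the schema can
at least be skimmed out of the plain-SQL dump already in hand. A rough
sketch (an assumption-laden filter, not a SQL parser: it only knows that
each data section runs from a "COPY ... FROM stdin;" line down to a
terminating "\." line):

```python
# Keep the DDL from a plain-SQL pg_dump, drop the bulk COPY data blocks.

def schema_only(dump_text):
    kept = []
    in_copy = False
    for line in dump_text.splitlines():
        if in_copy:
            if line == '\\.':            # "\." terminates a COPY data section
                in_copy = False
        elif line.startswith('COPY ') and line.rstrip().endswith('stdin;'):
            in_copy = True               # data section begins; skip it
        else:
            kept.append(line)
    return '\n'.join(kept)

# Toy dump text (made up) to show the effect:
sample = """CREATE TABLE users (id integer, name text);
COPY users (id, name) FROM stdin;
1\tAlice
2\tBob
\\.
CREATE INDEX users_name_idx ON users (name);"""

print(schema_only(sample))
```

On a 58 MB dump this leaves only the CREATE statements, which is what you
actually need to compare against MySQL/MSSQL datatype support.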
It's only about 20 tables -- so creating them by hand in M(S/Y)SQL
wouldn't take too long. I still need to get the data into PostgreSQL
before I can run my own export, though.

I'm using the pgAdmin III tool to try to import the script, but I am
(sadly) getting strange results when running the import. Maybe I'm going
about this the wrong way:

- Right-click my database >> choose CREATE script
- Click the "Open" folder button and select my .out file
- Click the Execute button

It chugs for a good 2 or 3 minutes (fine by me, it's a 50 meg file it's
got to parse and execute) -- but at the end, it doesn't give any
confirmation of failure or success. I open up my database, and it's still
completely empty -- no table structure or data. I'm wondering if it's a
problem with the .out dump, or my process of importing... any ideas?

Jonathon

-----Original Message-----
From: Sean Davis [mailto:sdavis2@mail.nih.gov]
Sent: Friday, September 14, 2007 1:54 PM
To: Jonathon Stierman
Cc: pgsql-novice@postgresql.org
Subject: Re: [NOVICE] import a .out file

How many tables? If it is just a few, I would ask for tab-delimited text
files from postgres and then design the database in mysql or mssql to
match the postgres database. SQL::Translator for perl is one option, but
it usually requires some hand-editing, also.

Sean
Most of that 58 megs is data. One table alone has 200k records, which is
most of the file...

So I think this is going to be my easiest route -- cutting out the record
inserts and then writing/running an MSSQL-friendly script to get the data
into its new home. Since there are relatively few tables, I can afford to
write a few translator scripts in my favorite programming language.

Thanks all for your help with this! I really do need to expand my
PostgreSQL horizons -- I'm surprised I've lasted these 3 years in
development without running across it before.

Thanks again, guys!

Jonathon

-----Original Message-----
From: Richard Broersma Jr [mailto:rabroersma@yahoo.com]
Sent: Friday, September 14, 2007 1:59 PM
To: Jonathon Stierman; pgsql-novice@postgresql.org
Subject: Re: [NOVICE] import a .out file

I bet most of the 58MB is just data. If you can, see if you can get
another dump of only the db SCHEMA. This is what you should take a look
at first.
[...]

Regards,
Richard Broersma Jr.
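For what it's worth, those translator scripts don't have to be elaborate.
Here is a sketch in Python of the core move -- turning a pg_dump COPY
block into INSERT statements -- under some loud assumptions: tab-delimited
COPY data, \N for NULL, every value quoted as a string (MySQL/MSSQL will
coerce numerics), and no embedded tabs, newlines, or backslash escapes in
the data. Real data would need more care than this.

```python
# Convert the tab-delimited COPY blocks of a plain-SQL pg_dump into
# portable INSERT statements. Assumes "COPY table (cols) FROM stdin;"
# sections terminated by "\.", with \N marking NULL.

def copy_to_inserts(dump_text):
    inserts = []
    table = cols = None
    in_copy = False
    for line in dump_text.splitlines():
        if line.startswith('COPY ') and ' FROM' in line:
            head = line[len('COPY '):line.index(' FROM')]
            table, _, colpart = head.partition(' (')
            cols = colpart.rstrip(')')
            in_copy = True
        elif in_copy and line == '\\.':      # end of a data section
            in_copy = False
        elif in_copy:
            vals = []
            for field in line.split('\t'):
                if field == '\\N':           # pg_dump's NULL marker
                    vals.append('NULL')
                else:                        # quote, doubling embedded quotes
                    vals.append("'" + field.replace("'", "''") + "'")
            inserts.append("INSERT INTO %s (%s) VALUES (%s);"
                           % (table, cols, ', '.join(vals)))
    return inserts

# Made-up two-row sample in pg_dump's COPY format:
sample = """COPY users (id, name) FROM stdin;
1\tAlice
2\t\\N
\\.
"""
for stmt in copy_to_inserts(sample):
    print(stmt)
```

The output statements are plain enough to feed to either MySQL or MSSQL,
modulo identifier quoting and datatype differences in the DDL itself.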
Going just on the few details provided, here is what I think would
probably work best in this case:

- Get a copy of postgresql up and running with the data
- Make a schema dump
- Make a data dump
- Load the schema into mysql, or failing that, recreate it manually
- Load the data -- this shouldn't take much tweaking

The data dump you were given may not have been set to use INSERT
commands, which may cause problems; I'm not sure if mysql supports COPY.

I've been working with postgresql every day for a couple of years now; if
you'd like to pay me to produce a mysql dump from your postgresql dump,
email me directly. That said, I'll still of course offer any help I can
on list.

- Adrien

Jonathon Stierman wrote:
> Most of that 58 megs is data. One table alone has 200k records, which
> is most of the file...
> [...]
---------------------------(end of broadcast)---------------------------
TIP 9: In versions below 8.0, the planner will ignore your desire to
       choose an index scan if your joining column's datatypes do not
       match
Just a follow-up: scratch that. I mistook this message for one on a
different list, local to my area and non-pgsql related. Oops.

Apologies all,

- Adrien

A Gilmore wrote:
> I've been working with postgresql every day for a couple of years now;
> if you'd like to pay me to produce a mysql dump from your postgresql
> dump, email me directly.
> [...]