Thread: Looking for a large database for testing
Hello,

I would like to test the performance of my Java/PostgreSQL applications, especially when making full-text searches. For this I am looking for a database of 50 to 300 MB with text fields, e.g. a table of books with fields holding a comment, table of contents, example chapters, or whatever else.

Does anybody have an idea where I can find a database like this, or does anybody perhaps have something like this?

--
Best Regards / Viele Grüße

Sebastian Hennebrueder

----
http://www.laliluna.de
Tutorials for JSP, JavaServer Faces, Struts, Hibernate and EJB
Get support, education and consulting for these technologies - uncomplicated and cheap.
Sebastian Hennebrueder wrote:
> I would like to test the performance of my Java/PostgreSQL applications
> especially when making full text searches.
> For this I am looking for a database with 50 to 300 MB having text fields.

You can download the Wikipedia content. Just browse the Wikimedia site. It's some work to transform the data so it can be imported into Postgres, but at least you get a lot of real-world data, in many languages.
Tino Wildenhain wrote:
> You can download the Wikipedia content. Just browse the Wikimedia site.
> It's some work to transform the data so it can be imported into Postgres,
> but at least you get a lot of real-world data, in many languages.

I have just found it. Here is the link:
http://download.wikimedia.org/
They have content in multiple languages and dumps of up to 20 GB.

--
Sebastian Hennebrueder
http://www.laliluna.de
On Tue, Aug 16, 2005 at 09:29:32AM +0200, Sebastian Hennebrueder wrote:
> I would like to test the performance of my Java/PostgreSQL applications
> especially when making full text searches.
> For this I am looking for a database with 50 to 300 MB having text fields.

You could try the OMIM database, which is currently about 100 MB. It contains both journal references and large sections of 'plain' text. It also contains a large number of technical terms, which will really test any kind of soundex matching if you are using that.

http://www.ncbi.nlm.nih.gov/Omim/omimfaq.html#download

Unfortunately it only comes as a flat text file, but it is very easy to parse. And if you start reading it, you'll probably learn quite a lot of things you really didn't want to know! :-D

-Mark
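Since the OMIM download is a flat text file, splitting it into records is straightforward. As an illustrative sketch only (the `*RECORD*` separator line is an assumption based on the classic omim.txt layout; check the actual download before relying on it), a record splitter could look like this:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: split an OMIM-style flat file into records.
// Assumption: records are delimited by lines reading "*RECORD*".
public class OmimSplitSketch {

    public static List<String> splitRecords(String flatText) {
        List<String> records = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (String line : flatText.split("\n")) {
            if (line.trim().equals("*RECORD*")) {
                // separator found: flush the record collected so far
                if (current.toString().trim().length() > 0) {
                    records.add(current.toString().trim());
                }
                current.setLength(0);
            } else {
                current.append(line).append('\n');
            }
        }
        if (current.toString().trim().length() > 0) {
            records.add(current.toString().trim());
        }
        return records;
    }

    public static void main(String[] args) {
        String sample = "*RECORD*\n*FIELD* TI\nENTRY ONE\n"
                      + "*RECORD*\n*FIELD* TI\nENTRY TWO\n";
        System.out.println(splitRecords(sample).size()); // prints 2
    }
}
```

Each returned record could then be inserted into a text column for full-text search testing.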
Sebastian,

you can try a document generator. I used
http://www.cs.rmit.edu.au/~jz/resources/finnegan.zip
You can play with the frequency of words and the document length distribution. Also, I have SentenceGenerator.java, which could be used for generation of synthetic texts.

Oleg

On Tue, 16 Aug 2005, Sebastian Hennebrueder wrote:
> I would like to test the performance of my Java/PostgreSQL applications
> especially when making full text searches.
> For this I am looking for a database with 50 to 300 MB having text fields.

_____________________________________________________________
Oleg Bartunov, sci.researcher, hostmaster of AstroNet,
Sternberg Astronomical Institute, Moscow University (Russia)
Internet: oleg@sai.msu.su, http://www.sai.msu.su/~megera/
phone: +007(095)939-16-83, +007(095)939-23-83
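Oleg's SentenceGenerator.java is not shown in this thread; as a stand-in, a minimal synthetic-text generator with a skewed (Zipf-like) word frequency distribution might look like the sketch below. The word list and the 1/(k+1) weighting are purely illustrative.

```java
import java.util.Random;

// Minimal sketch of a synthetic text generator for full-text search testing.
// Not Oleg's SentenceGenerator.java; vocabulary and distribution are made up.
public class TextGenSketch {

    private static final String[] VOCAB = {
        "database", "index", "query", "table", "vacuum", "search", "text", "tuple"
    };

    // Zipf-like pick: word k is chosen with probability proportional to 1/(k+1),
    // so common words dominate, as in natural language.
    static String pickWord(Random rng) {
        double total = 0;
        for (int k = 0; k < VOCAB.length; k++) {
            total += 1.0 / (k + 1);
        }
        double r = rng.nextDouble() * total;
        for (int k = 0; k < VOCAB.length; k++) {
            r -= 1.0 / (k + 1);
            if (r <= 0) return VOCAB[k];
        }
        return VOCAB[VOCAB.length - 1];
    }

    // Generate a deterministic "document" of the given word count.
    public static String generate(int words, long seed) {
        Random rng = new Random(seed);
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < words; i++) {
            if (i > 0) sb.append(' ');
            sb.append(pickWord(rng));
        }
        return sb.append('.').toString();
    }

    public static void main(String[] args) {
        System.out.println(generate(12, 42L));
    }
}
```

Generating a few hundred thousand such documents of varying lengths would give a corpus in the 50-300 MB range the original post asks for.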
Sebastian Hennebrueder wrote:
> I have just found it. Here is the link:
> http://download.wikimedia.org/
> They have content in multiple languages and dumps of up to 20 GB.

In case anybody else wants to import the Wikipedia data: I had considerable problems getting the encoding right.

I downloaded the German content from Wikipedia, which is a dump of a Unicode-encoded (utf8) MySQL database. I used MySQL 4.1 on Windows 2000 to read the dump and then copied the data to PostgreSQL with a small application.

In my.ini you should configure the setting

  max_allowed_packet = 10M

I set it to 10 MB, which worked out. Otherwise you cannot import the dump into MySQL; the error message was something like "lost connection ...". The default encoding of MySQL was latin1, which worked.

Then I imported the dump:

  mysql -uYourUserName -pPassword --default-character-set=utf8 database < downloadedAndUnzippedFile

The --default-character-set option is very important.

Create the table in Postgres (not with all the columns):

CREATE TABLE content
(
  cur_id int4 NOT NULL DEFAULT nextval('public.cur_cur_id_seq'::text),
  cur_namespace int2 NOT NULL DEFAULT (0)::smallint,
  cur_title varchar(255) NOT NULL DEFAULT ''::character varying,
  cur_text text NOT NULL,
  cur_comment text,
  cur_user int4 NOT NULL DEFAULT 0,
  cur_user_text varchar(255) NOT NULL DEFAULT ''::character varying,
  cur_timestamp varchar(14) NOT NULL DEFAULT ''::character varying
);

After this I copied the data from MySQL to Postgres with a small Java application. The code is not beautiful.
private void copyEntries() throws Exception {
    Class.forName("org.postgresql.Driver");
    Class.forName("com.mysql.jdbc.Driver");

    Connection conMySQL = DriverManager.getConnection(
            "jdbc:mysql://localhost/wikidb", "root", "mysql");
    Connection conPostgreSQL = DriverManager.getConnection(
            "jdbc:postgresql://localhost/wiki", "postgres", "p");

    Statement selectStatement = conMySQL.createStatement();

    StringBuffer sqlQuery = new StringBuffer();
    sqlQuery.append("insert into content (");
    sqlQuery.append("cur_id, cur_namespace, cur_title, cur_text, cur_comment, cur_user, ");
    sqlQuery.append("cur_user_text, cur_timestamp) ");
    sqlQuery.append("values (?,?,?,?,?,?,?,?)");
    PreparedStatement insertStatement = conPostgreSQL
            .prepareStatement(sqlQuery.toString());

    // get total number of rows
    java.sql.ResultSet resultSet = selectStatement
            .executeQuery("select count(*) from cur");
    resultSet.next();
    int iMax = resultSet.getInt(1);
    resultSet.close();

    // copy in chunks of 2000 rows; i doubles as the current offset,
    // since the inner loop advances it once per row copied
    int i = 0;
    while (i < iMax) {
        resultSet = selectStatement
                .executeQuery("select * from cur limit " + i + ", 2000");
        while (resultSet.next()) {
            i++;
            if (i % 100 == 0)
                System.out.println(i + " of " + iMax);
            insertStatement.setInt(1, resultSet.getInt(1));
            insertStatement.setInt(2, resultSet.getInt(2));
            insertStatement.setString(3, resultSet.getString(3));
            insertStatement.setString(4, resultSet.getString(4));
            // this blob field is utf-8 encoded
            byte[] comment = resultSet.getBytes(5);
            insertStatement.setString(5, new String(comment, "UTF-8"));
            insertStatement.setInt(6, resultSet.getInt(6));
            insertStatement.setString(7, resultSet.getString(7));
            insertStatement.setString(8, resultSet.getString(8));
            insertStatement.execute();
        }
        resultSet.close();
    }

    insertStatement.close();
    selectStatement.close();
    conMySQL.close();
    conPostgreSQL.close();
}

--
Best Regards / Viele Grüße
Sebastian Hennebrueder
http://www.laliluna.de
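The crucial step in the copy code above is decoding the MySQL blob column as UTF-8 explicitly, rather than relying on the driver's default charset. A standalone illustration of that conversion (sample string chosen for this sketch):

```java
import java.io.UnsupportedEncodingException;

// Sketch of the byte[]-to-String conversion used when copying the
// utf-8 encoded blob column from MySQL to PostgreSQL.
public class Utf8DecodeSketch {

    public static String decode(byte[] raw) throws UnsupportedEncodingException {
        // decode explicitly as UTF-8, never with the platform default charset
        return new String(raw, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        // "Gr\u00fc\u00dfe" is 5 characters but 7 bytes in UTF-8,
        // since the umlaut and sharp s each take two bytes
        byte[] raw = "Gr\u00fc\u00dfe".getBytes("UTF-8");
        System.out.println(raw.length);          // prints 7
        System.out.println(decode(raw).length()); // prints 5
    }
}
```

If the bytes were instead passed through `new String(raw)` on a latin1 platform, each multi-byte sequence would turn into two wrong characters, which is exactly the kind of corruption the explicit charset avoids.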
On Tue, Aug 16, 2005 at 09:29:32AM +0200, Sebastian Hennebrueder wrote:
> I would like to test the performance of my Java/PostgreSQL applications
> especially when making full text searches.
> For this I am looking for a database with 50 to 300 MB having text fields.

Most benchmarks (such as dbt* and pgbench) have data generators you could use.

--
Jim C. Nasby, Sr. Engineering Consultant      jnasby@pervasive.com
Pervasive Software        http://pervasive.com        512-569-9461