Thread: SELECT * FROM xy WHERE name LIKE '%german special char'
Hello, in my Postgres DB I have several tables with German special characters like ÄÖÜßäöü. I have no problem displaying them with any terminal, PHP, JDBC and so on, but a LIKE pattern doesn't find any of these special characters. The statement

SELECT * FROM XYZ WHERE name LIKE '%Müller%'

doesn't return any matching row. Is there a solution for this problem? LIKE '%M_ller%' doesn't work either! I don't want to change the data in the tables, because displaying the data is fine, but I have to be able to search in it.

Thanks
Berger
"Albrecht Berger" <berger1517@gmx.ch> writes:
> The statement
> SELECT * FROM XYZ WHERE name LIKE '%Müller%'
> doesn't return any matching row.

It does work for me:

regression=# select 'Müller' LIKE '%Müller%';
 ?column?
----------
 t
(1 row)

Which Postgres version, database encoding, and platform?

Regards,
Manuel.
"Albrecht Berger" <berger1517@gmx.ch> writes:
> The statement
> SELECT * FROM XYZ WHERE name LIKE '%Müller%'
> doesn't return any matching row.
> Is there a solution for this problem? LIKE '%M_ller%' doesn't work either!

The only way I could see for the underscore not to match is if the character is actually stored as two or more bytes, but Postgres doesn't know it should treat that sequence as a single logical character. Are you using a multibyte character set representation (eg, Unicode)? If so, did you build Postgres with MULTIBYTE support enabled, and did you specify the correct character set when you created the database?

If you're not sure about this theory, try looking to see whether length() of one of the problem strings agrees with the number of characters you think there are.

regards, tom lane
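[Not part of the original thread, but Tom's byte-vs-character theory is easy to illustrate outside the database: in a multibyte encoding such as UTF-8, 'ü' occupies two bytes, so a byte-oriented LIKE sees one extra "character" in 'Müller' and the single-character wildcard '_' cannot cover both bytes. A minimal Python sketch of the length check he suggests:]

```python
# 'ü' is one logical character, but more than one byte in UTF-8.
s = "Müller"

chars = len(s)                           # logical characters
utf8_bytes = len(s.encode("utf-8"))      # bytes under UTF-8 (multibyte)
latin1_bytes = len(s.encode("latin-1"))  # bytes under a single-byte encoding

print(chars)         # 6
print(utf8_bytes)    # 7 -- 'ü' takes two bytes, so byte-wise '_' fails
print(latin1_bytes)  # 6 -- one byte per character, '_' would match
```

[If length('Müller') in the database reports 7 rather than 6, the server is counting bytes, which matches Tom's theory.]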
Hi all,

1. Can I do something like this in PostgreSQL:

SELECT array_data[1..20] FROM mytable WHERE condition

I want to get a certain range of my array data. If I fetch all the data, it takes a long time (a very long time, just for the fetch from the database). I'm trying it with my maximum array size (for me it's very big):

INSERT INTO mytable(mytable_id, array_data) VALUES(1, '{1,2,3,.......so on until 1000000}')

It's fast on insert or update :-), but very slow on fetch/select :-(  Is there a trick to do it?

Second question.

2. Which is better (in terms of speed and size)? One table with an array, like this:

TABLE mytable
 mytable_id | array_data
-------------------------------------
 1          | {1,2,...,100000}
 2          | {1,2,...,100000}
 3          | {1,2,...,100000}
 ....

or like this, many tables but no array:

TABLE mytable_1
 was_array_data
----------------
 1
 2
 ...
 100000

TABLE mytable_2
 was_array_data
----------------
 1
 2
 ...
 100000

TABLE mytable_3
...

Thanks in advance,
Best Regards,
Johny Jugianto
Hi,

> second question.
> 2. Which is better (in terms of speed and size)? One table with an array, like this:
>
> TABLE mytable
>  mytable_id | array_data
> -------------------------------------
>  1          | {1,2,...,100000}
>  2          | {1,2,...,100000}
>  3          | {1,2,...,100000}
>  ....
>
> or like this, many tables but no array:
>
> TABLE mytable_1
>  was_array_data
> ----------------
>  1
>  2
>  ...
>  100000

If you have only these numbers in the array, it is better to use more tables with an int column, because an int column can be indexed and selected quickly.

CoL
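[An illustration, not from the thread: CoL's suggestion amounts to normalizing the array into rows so the values can be indexed. The sketch below uses SQLite via Python's sqlite3 module purely for illustration; the table and index names are invented, and in Postgres the DDL would express the same idea.]

```python
import sqlite3

# Store the values as rows, not as one big array, so an ordinary
# index can serve point and range lookups cheaply.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE mytable_1 (was_array_data INTEGER)")
cur.executemany(
    "INSERT INTO mytable_1 VALUES (?)",
    [(i,) for i in range(1, 100001)],
)
# With this index, the equivalent of array_data[1..20] is a range scan.
cur.execute("CREATE INDEX mytable_1_idx ON mytable_1 (was_array_data)")

cur.execute(
    "SELECT was_array_data FROM mytable_1 "
    "WHERE was_array_data BETWEEN 1 AND 20 ORDER BY was_array_data"
)
rows = [r[0] for r in cur.fetchall()]
print(rows[:5])   # [1, 2, 3, 4, 5]
print(len(rows))  # 20
```

[The trade-off: the normalized form costs more rows and slower bulk inserts, but avoids fetching an entire million-element array just to read a small slice.]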