I found a much easier way.
create.sql:
> drop database if exists segtest;
> create database segtest;
> \c segtest
> create table tcv_scene_datas(cv_scene_id bigint primary key) partition by range (cv_scene_id);
> do $$
> declare
>   i int;
>   range_start bigint;
>   range_end bigint;
>   partition_name text;
> begin
>   for i in 0..100 loop
>     range_start := 1 + (i * 10000);
>     range_end := range_start + 10000;
>     partition_name := 'tcv_scene_datas_' || LPAD(i::TEXT, 3, '0');
>     execute format(
>       'create table %I partition of tcv_scene_datas for values from (%s) to (%s)',
>       partition_name,
>       range_start,
>       range_end
>     );
>   end loop;
> end $$;
> insert into tcv_scene_datas(cv_scene_id) select id from generate_series(1,1_000_000) id;
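As a quick sanity check (this query is not part of create.sql, just a suggestion), pg_inherits can confirm that all partitions were created; it should return 101:
> select count(*) as partitions
>   from pg_inherits
>  where inhparent = 'tcv_scene_datas'::regclass;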
crash.sql:
> \c segtest
> with ids as (select (random()*1_000_000)::int id from generate_series(1,1000))
> update tcv_scene_datas set cv_scene_id=cv_scene_id where cv_scene_id in(select id from ids);
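For illustration only (an assumed variant of the same statement, not part of crash.sql), one run can be made to report how many rows and partitions it touches, which shows that each iteration takes row locks scattered across many partitions:
> with ids as (select (random()*1_000_000)::int id from generate_series(1,1000)),
>      upd as (
>        update tcv_scene_datas set cv_scene_id = cv_scene_id
>         where cv_scene_id in (select id from ids)
>        returning tableoid::regclass as part
>      )
> select count(*) as rows_updated, count(distinct part) as partitions_touched
>   from upd;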
Launch crash.sql in 16 parallel processes, each running it in an infinite loop:
> seq 16 | xargs -P 16 -I {} sh -c 'while true; do psql -f crash.sql; done'
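While the loops are running, the lock contention can be watched from another session with a query along these lines (a monitoring aid, not part of the repro); it lists backends waiting on a lock and the pids blocking them:
> select pid, pg_blocking_pids(pid) as blocked_by, wait_event_type, wait_event, query
>   from pg_stat_activity
>  where wait_event_type = 'Lock';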
Within 1-2 minutes, 5 processes had died with a segfault.
I also expected deadlocks from such a query, but strangely the database did not report any.
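Whether the server detected any deadlocks at all can be double-checked via pg_stat_database, which keeps a per-database deadlock counter (again just a suggested check, not part of the scripts above):
> select datname, deadlocks
>   from pg_stat_database
>  where datname = 'segtest';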
Let me know if you need more data.
__
Best wishes, Yuri