On Feb 13, 2008, at 20:45, Tore Halset wrote:
> The box has 16GB of RAM, but my original test file was only 25GB.
> Sorry. Going to 33GB lowered the numbers for writing. Here are
> some samples.
>
> % sh -c "dd if=/dev/zero of=bigfile bs=8k count=4000000 && sync"
> 32768000000 bytes (33 GB) copied, 103.722 seconds, 316 MB/s
> 32768000000 bytes (33 GB) copied, 99.669 seconds, 329 MB/s
>
> % time dd if=bigfile of=/dev/null bs=8k
> 32768000000 bytes (33 GB) copied, 85.4235 seconds, 384 MB/s
> 32768000000 bytes (33 GB) copied, 85.4628 seconds, 383 MB/s
>
> Regards,
> - Tore.
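An aside on the quoted dd figures (my own sanity check, not part of the original test): dd reports throughput as bytes divided by elapsed seconds, using decimal megabytes (10^6 bytes), so the MB/s numbers can be reproduced directly.

```python
# Reproduce dd's reported throughput: bytes / seconds, in decimal MB (10^6 bytes).
TOTAL_BYTES = 32_768_000_000  # bs=8k * count=4000000

def mb_per_sec(nbytes: int, seconds: float) -> float:
    """Throughput in decimal megabytes per second, as dd reports it."""
    return nbytes / seconds / 1e6

print(round(mb_per_sec(TOTAL_BYTES, 103.722)))  # write run -> 316
print(round(mb_per_sec(TOTAL_BYTES, 85.4235)))  # read run  -> 384
```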
And here are the bonnie++ numbers. I am a bonnie++ newbie, so I ran it
with no options.
Version 1.03c       ------Sequential Output------ --Sequential Input- --Random-
                    -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
harteigen    32136M 83983  97 221757 40 106483 19 89603  97 268787 22 886.1   1
                    ------Sequential Create------ --------Random Create--------
                    -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
              files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                 16 +++++ +++ +++++ +++ +++++ +++ +++++ +++ +++++ +++ +++++ +++
harteigen,32136M,83983,97,221757,40,106483,19,89603,97,268787,22,886.1,1,16,+++++,+++,+++++,+++,+++++,+++,+++++,+++,+++++,+++,+++++,+++
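For what it's worth, the "+++++" entries mean those file-creation tests finished too quickly for bonnie++ to time them accurately; a rerun with explicit options would pin them down. A sketch of such an invocation (the directory, size, file count, and user here are my guesses, not what was actually run):

```shell
# Hypothetical bonnie++ rerun with explicit options (not the command used above):
#   -d  directory on the array under test (assumed path)
#   -s  file size in MB; ~2x RAM (16GB here) to defeat the page cache
#   -n  number of files for the create tests, in multiples of 1024;
#       a larger value makes the +++++ columns measurable
#   -u  user to run as when invoked as root
bonnie++ -d /mnt/array -s 32768 -n 128 -u nobody
```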
Regards,
- Tore.