
Raid 0 on the SB750 Chipset Review


Junglebizz
I recently purchased a new system and this time around, I decided to finally play around with RAID 0. After assembling the system and figuring out the built-in RAID utility, I set up my system as one array encompassing both hard drives entirely, then went about installing my operating system and overclocking it.

The system (yes it's in my sig, but it may change eventually):
AMD Phenom II X3 720 Black Edition - OC'd to 3588 MHz (17.5x205 @ 1.475v)
Gigabyte GA-MA790X-UD4P with F5a BIOS
G.Skill Pi 2 x 2GB PC2-8500 DDR2
WD WD6401AALS 640GB x2



After running a couple of benchmarks, I was curious whether I could get better performance out of my array. Last night I decided to try setting up my hard drives in various configurations to see if there was a noticeable difference in performance.

I had been doing a lot of reading about "short-stroking" RAID 0 setups to increase performance, but I wanted to see some numbers that would relate to my specific application. So four Windows installs later, I had a bunch of benchmark data that I thought I would share!
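
For anyone new to striping, the idea is that the array interleaves fixed-size chunks ("stripes") across the drives so that large reads and writes hit both disks at once. Here is a toy Python sketch of the mapping (my own illustration of the concept, not the SB750's actual firmware logic; the stripe size and disk count are just the values from my setup):

    STRIPE_SIZE = 128 * 1024  # 128k stripe, like my default array
    NUM_DISKS = 2             # the two WD6401AALS drives

    def locate(logical_offset):
        """Map a logical byte offset on the array to (disk, physical offset)."""
        chunk = logical_offset // STRIPE_SIZE  # which stripe-sized chunk
        disk = chunk % NUM_DISKS               # chunks alternate: disk 0, 1, 0, 1...
        row = chunk // NUM_DISKS               # how far down each disk we are
        return disk, row * STRIPE_SIZE + logical_offset % STRIPE_SIZE

    # A 256k sequential read spans both disks, which is where the
    # roughly doubled throughput comes from:
    for off in (0, 128 * 1024, 256 * 1024):
        print(off, "->", locate(off))

"Short-stroking" just means confining the array (or partition) to the fast outer edge of the platters so the heads never have to travel far.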


Test 1

Here is a diagram showing the configuration for this test:
Test1-Single1280GBArray-128kStripeD.png

NOTE: Unused Space in this diagram is the remaining space, assigned to another partition that is left unformatted.

This was the configuration I set up by default when I built the system. This being my first RAID array, and not knowing any better, I made one big array out of 100% of each hard drive and proceeded to make a 40GB partition for Windows XP.

Here are some performance results from the benchmark tools HD Tune Pro 3.50 and HD Tach 3.0.4.0:

NOTE: Some tests were run multiple times for consistency. Where applicable, I have included links to the results of the subsequent tests.



HD Tune Benchmark
HDTune_Benchmark_AMD_____20_Stri-1.png

Test run 2


HD Tune Random Access
HDTune_Random_Access_AMD_____20_-1.png

Test run 2


HD Tune File Benchmark
HDTune_File_Benchmark_AMD_____20-1.png

Test run 2


HD Tach Benchmark - Quick
HDTach_Benchmark_AMD_____20_Stripe_.png

Test run 2 - Long
Test run 3 - Long


Now, compared to what I had been using for Windows in my old computer (a 40GB partition on a Seagate 320GB SATA I HDD), the numbers shown in the HD Tune benchmarks were about triple the throughput of what I had before, so regardless, this was a victory in my books.

Since some people have concerns about a failing hard drive destroying their RAID array, in Test 2 I decided to see what performance would have been like without using RAID at all.
 
Test 2

Here is a diagram showing the configuration for this test:
Test2-SingleHDD-StandardInstall.png

NOTE: Unused Space in this diagram is the remaining space outside the XP partition, assigned to another partition.

This was the configuration I had always used in the past when installing Windows: no RAID, just a single hard drive with a small OS partition and the rest used for storage in various other partitions.

Here are some performance results from the benchmark tools HD Tune Pro 3.50 and HD Tach 3.0.4.0:

NOTE: Some tests were run multiple times for consistency. Where applicable, I have included links to the results of the subsequent tests.



HD Tune Benchmark
HDTune_Benchmark_WDC_WD6401AALS--1.png

Test run 2


HD Tune Random Access
HDTune_Random_Access_WDC_WD6401AALS.png



HD Tune File Benchmark
HDTune_File_Benchmark_WDC_WD6401-1.png

Test run 2


HD Tach Benchmark - Quick
HDTach_Benchmark_WDC_WD6401AALS-00L.png

Test run 2 - Long
Test run 3 - Long


Compared to the results of Test 1, you can see that the transfer rates are significantly lower when using only one hard drive. Another interesting thing to notice is that the random access time is only 0.5ms worse, and the burst rate is almost the same, compared to the single RAID 0 array.

In test 3, I wanted to see if there would be any noticeable improvement by creating two RAID arrays instead of one.
 
Test 3

Here is a diagram showing the configuration for this test:
Test3-DualRAID0Array-64kStripeDiagr.png

NOTE: Blank Space in this diagram is the remaining space outside the XP partition, assigned to another partition. Also, the other 50GB in the 90GB array is where I would install Windows 7 on its own partition.

Not thinking about the fact that I had only used a 128k stripe prior to this, I went and set Array 1 to use a 64k stripe while Array 2 was set to a 128k stripe. Array 2 was not tested, as I assumed it would only be used for storage.

Here are some performance results from the benchmark tools HD Tune Pro 3.50 and HD Tach 3.0.4.0:

NOTE: Some tests were run multiple times for consistency. Where applicable, I have included links to the results of the subsequent tests.



HD Tune Benchmark
HDTune_Benchmark_AMD_____20_Stripe_.png

Test run 2
Test run 3



HD Tune Random Access
HDTune_Random_Access_AMD_____20_Str.png

Test run 2


HD Tune File Benchmark
HDTune_File_Benchmark_AMD_____20_St.png

Test run 2


HD Tach Benchmark
HDTach_Benchmark_AMD_____20_Stripe_.png

Test run 2
Test run 3


Wow! Random access times are down to 7.3ms in some of the tests! That's down 4ms from the single array in test 1! Not bad, but then again, I used 64k stripes this time instead of 128k. Could that be what makes up the performance difference?

In test 4, I recreate test 3 using 128k stripes instead of 64k stripes.
 
Test 4

Here is a diagram showing the configuration for this test:
Test4-DualRAID0Array-128kStripeDiag.png

NOTE: Blank Space in this diagram is the remaining space outside the XP partition, assigned to another partition.

Since I made two changes before running Test 3 (from one array to two, and from a 128k stripe to a 64k stripe), I decided to run Test 3 again using a 128k stripe on Array 1 to see whether that made a significant difference or not.

Here are some performance results from the benchmark tools HD Tune Pro 3.50 and HD Tach 3.0.4.0:

NOTE: Some tests were run multiple times for consistency. Where applicable, I have included links to the results of the subsequent tests.



HD Tune Benchmark
HDTune_Benchmark_AMD_____20_Stripe_.png

Test run 2


HD Tune Random Access
HDTune_Random_Access_AMD_____20_Str.png

Test run 2


HD Tune File Benchmark
HDTune_File_Benchmark_AMD_____20_St.png

Test run 2


HD Tach Benchmark
HDTune_Benchmark_AMD_____20_Stri-2.png

Test run 2

Well, the random access times were consistent with the 64k dual array, but the average transfer rate was lower. Still, this type of setup is clearly an improvement over a single array with the same 128k stripe.


The following table compares some of the results shown in most of the tests. The highlighted cells mark the highest or lowest value, as appropriate for each column:

charts.png



Final Thoughts

Why did I do all this? Curiosity, mostly, and the fact that I had not set my system up completely yet, meaning it was the opportune time for some benchmarking!

Looking at the numbers, it is clear that there is an advantage to using multiple arrays across two hard drives with the SB750. A 4ms improvement in random access should be noticeable when loading anything. Though maximum transfer rates were relatively similar between the single and dual arrays, the minimum transfer rate was almost 40% higher with the dual array. This leads to higher average transfer rates overall, which, of course, is good!

Though the chart shows CPU utilization to be a bit higher, this measurement fluctuated significantly throughout testing. The dual array with a 64k stripe varied between 3.6% and 4.4% in HD Tune, while HD Tach reported it to be between 2% and 3% (though it also shows a margin of error of +/- 2%).

After doing these tests, I know that I am going to set up my system with dual arrays, with my OS on a 64k stripe and the rest as storage on a 128k stripe, which seems to give excellent performance.


One More Thing...
Now, here's something that I just can't seem to wrap my head around. Why would there be such a significant difference between test 1 and test 4? I mean, they are both on a 40GB partition at the outer edge of the hard drive. Both of them are using a 128k stripe. The only difference is that one is on a smaller array than the other, but why should that make a difference if it is only ever working within the boundaries of that 40GB partition? Anyone know why this would make a difference?
 
That's impressive. I couldn't seem to get anything lower than 10ms with 2 x 320GB 7200.11 Seagate on my Intel ICH9R controller :(
 
In Search of the Optimal Array Size - Test 5

After conducting many tests suggesting that array size makes a difference, I decided to try to find the optimal array size for my Windows installation. Considering that the smaller the array, the better the performance I was seeing, I decided to try the smallest RAID 0 array I could install Windows to, which was 2GB. This array also used the 64k stripe.


HD Tune Benchmark
HDTune_Benchmark_AMD_____20_Stripe_.png

Test run 2
Test run 3



HD Tune Random Access
HDTune_Random_Access_AMD_____20_Str.png

Test run 2


HD Tune File Benchmark
HDTune_File_Benchmark_AMD_____20_St.png

Test run 2
Test run 3



HD Tach Benchmark
HDTach_Benchmark_AMD_____20_Stri-1.png

Test run 2


As you can see from these results, the random access time has once again improved, dropping another 1.8ms to 5.5ms with the 2GB array compared to the 90GB array. The max transfer rate has gone up a bit as well, but the minimum has gone down a lot. The average has also dropped substantially from the 90GB array, while the burst has remained fairly consistent.

So it appears that the smaller you short-stroke your RAID array, the faster the access times get, but the fact that the average transfer rate takes such a big hit is a bit of a concern, not to mention that a 2GB partition leaves virtually no room for installing programs to the same drive as your OS.

Myself, I typically install Windows XP to a 40GB partition, so for the next test, I decided to create an array that would be just the right size for one 40GB partition.
 
Test 6 - 40GB Array and 40GB Partition

After trying the smallest, the largest, and an array size meant for a dual-boot setup, I decided to try an array size that would encompass only one operating system with enough room for any other programs I could want to install. Note that in the RAID tool, specifying 40GB will only give you around 37.5GB in Windows. The RAID tool seems to count capacity the way the hard drive label does rather than the way Windows does, so in order to get 40GB in Windows you will need to enter at least 43GB in the RAID tool. As you will see in these tests, I chose 42GB, which only gave me 39.1GB in Windows. Oops!
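
My guess is that this is the usual decimal-versus-binary gigabyte mismatch: the RAID tool (like drive labels) counts 1GB as 1,000,000,000 bytes, while Windows counts 1GB as 2^30 bytes. A quick sanity check in Python (my own arithmetic, nothing from the RAID tool itself):

    # Decimal "GB" as entered in the RAID tool vs. what Windows reports.
    def windows_gb(tool_gb):
        return tool_gb * 10**9 / 2**30

    print(round(windows_gb(40), 1))  # 37.3 -> about the ~37.5GB I saw
    print(round(windows_gb(42), 1))  # 39.1 -> the 39.1GB I ended up with
    print(round(windows_gb(43), 1))  # 40.0 -> just clears the 40GB target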

HD Tune Benchmark
HDTune_Benchmark_AMD_____20_Stri-2.png

Test run 2
Test run 3



HD Tune Random Access
HDTune_Random_Access_AMD_____20_Str.png

Test run 2
Test run 3



HD Tune File Benchmark
HDTune_File_Benchmark_AMD_____20_St.png

Test run 2
Test run 3



HD Tach Benchmark
HDTach_Benchmark_AMD_____20_Stri-2.png

Test run 2
Test run 3
Test run 4


Updated Chart:
charts2.png


Keeping with the trend, performance was better with the 40GB array than with the 90GB array, and of course behind the 2GB array on access time. The minimum and average speeds, however, were consistently the highest of all the configurations, putting the 2GB array to shame. The random access time also came down another 0.5ms compared to the 90GB array, and though the 2GB array was faster still by another 1.3ms, having less than 2GB in Windows is just not that practical.

Considering I will probably still use XP for the majority of what I do, I may just stick with this type of setup and install Windows 7 RC on a small partition on the other array, which uses the rest of the hard drive space.

The only other option I may want to try before finally setting up my new system for good is a combined RAID 0 and RAID 1 setup, as it appears that the second array can be set to RAID 1 after all. Maybe having both Windows installations on the RAID 0 array and using the RAID 1 array for storage would be ideal? I'll find out eventually!
 
That's a great run-through... well done.

If you're interested... I'm running 4 x 1TB Blacks in Matrix RAID atm, and access is just over 6ms

http://www.ocforums.com/showpost.php?p=6112052&postcount=1847

I'm going to be moving to an Areca card soon, so Matrix RAID will disappear, hence my interest in what you've laid out. Not having done non-Matrix RAID for years... I assume one can only have a single RAID type across the disks? i.e. with my 4 disks, I will have to create a single RAID 0 array and then break it up into partitions as normal? Tks
 
I envy your burst speeds :drool:

I find it interesting that the access time is roughly the same as the 640GB Caviar Black's, despite having twice as many drives.
 
Cheers.

Yeah... I think there must be a 'wall' that jumps out at physical platters as they approach 6ms lol.

Still... nice and snappy.

I'm just waiting for the dust to settle after this latest round of SSDs all hit the shelves, and then I will grab a couple of them and use the Blacks simply as storage on my main rig... probably completely RAID 10 (but I haven't thought too deeply about that yet).
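
For what it's worth, I suspect that wall is mostly rotational latency, which short-stroking can't touch. Back-of-the-envelope numbers for a 7200 RPM drive (my own arithmetic, not measured):

    # On average the heads wait half a revolution for the right sector.
    rpm = 7200
    ms_per_rev = 60_000 / rpm            # ~8.33 ms per revolution
    avg_rotational_ms = ms_per_rev / 2   # ~4.17 ms before any seeking at all
    print(avg_rotational_ms)

Add even a 1-2ms short-stroked seek on top of that and you're already in the 5-6ms range, which is about where these arrays seem to bottom out.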
 
One More Thing...
Now, here's something that I just can't seem to wrap my head around. Why would there be such a significant difference between test 1 and test 4? I mean, they are both on a 40GB partition at the outer edge of the hard drive. Both of them are using a 128k stripe. The only difference is that one is on a smaller array than the other, but why should that make a difference if it is only ever working within the boundaries of that 40GB partition? Anyone know why this would make a difference?
Because Test 1 benchmarked the whole 1.2TB array, and Test 4 benchmarked a 90GB array. You can't benchmark partitions with HD Tach and HD Tune, so you weren't comparing the same size space. If you did some type of real-world testing, timing application performance, you'd get the same results. Partitions short-stroke a drive just as effectively as using smaller arrays; it's just not benchmarkable with the popular programs that give numbers for people to compare and/or brag about.
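
A quick way to see the effect is to treat seek distance as roughly proportional to how far apart two random spots in the benchmarked span are. This is a toy simulation of mine (real seek time isn't linear in distance, but the trend holds):

    import random

    def avg_seek_fraction(span, trials=100_000):
        """Average distance between two random positions within the tested
        span, expressed as a fraction of the full disk."""
        total = 0.0
        for _ in range(trials):
            total += abs(random.uniform(0, span) - random.uniform(0, span))
        return total / trials

    print(avg_seek_fraction(1.0))   # whole 1.2TB array: ~0.33 of the platter
    print(avg_seek_fraction(0.07))  # a 90GB slice of it: ~0.023 of the platter

Benchmark the whole 1.2TB array and the heads sweep a third of the platter on an average random access; benchmark a 90GB array and they barely move, hence the much better access times.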
 
Just be careful about how you use that second array. If you put a lot of partitions on there (especially with unused space) that you are going to be accessing regularly, then you could have heads flying back and forth all over the place.

However, since you can't even benchmark a dual array (at the same time, that I know of), you probably won't even notice any performance hit anyway.

Your setup in Test 6 is similar to a setup I'm running now. It's just that if you're actually using data on all parts of the drive, you really won't see any improvement in performance over a single array (except in the synthetic benchmarks, which only test one array at a time).

Depending on your goals, if you are going to be accessing data on all parts of the drive, someone even suggested to me to try using them as two separate drives instead. For example:

1st Drive - Short-stroked to 50-60GB or so with one partition for both OS/Apps.
2nd Drive - First partition is Games, second is Data/Storage.

This way you have one disk taking care of the OS and your second disk loading up your game or dealing with your data. If you're playing games, you most likely won't be accessing anything from the Data partition, so this would be optimal for a two-drive non-RAID setup.

I never got around to testing it out, though, because you can't really benchmark it - well, not easily in any way that I could think of - and those synthetic benchmarks definitely wouldn't help.

I just ended up buying more drives so that I could keep my storage/data entirely separate from my RAID 0.
 
I'm curious, did you ever run a test to compare different RAID levels? I currently have a mirrored stripe, RAID 1+0 or something, and am thinking about switching to RAID 5, a stripe with a parity drive.

Was just wondering what the performance difference would be...

Also, is the SB750 "fake RAID", or does it have its own hardware controller like a PCI card would?
 
To test RAID 5 and RAID 1+0 I would need more than two hard drives, and I have not purchased any more, but I can say that my RAID array is still going strong with no issues so far.
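
For what it's worth, the "parity" in RAID 5 is just an XOR across the data chunks, which is why the array survives losing any single drive. Here is a minimal sketch of the concept (not what the SB750 actually runs, and it needs at least three drives, which is exactly why I can't test it):

    # RAID 5 stores a byte-wise XOR of the data chunks as parity.
    # XOR-ing the parity with the surviving chunks rebuilds a lost one.
    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    d0, d1, d2 = b"AAAA", b"BBBB", b"CCCC"     # data chunks on three disks
    parity = xor_bytes(xor_bytes(d0, d1), d2)  # stored on a fourth disk

    rebuilt = xor_bytes(xor_bytes(d0, d2), parity)  # say the disk holding d1 died
    assert rebuilt == d1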
 
Kudos to Jungle!! Got me thinking about my array.

I also use the SB750, but with four disks all in RAID 0. I tested a few possibilities:
In short, RAID 5 with 4 disks is barely faster than a single disk.
RAID 10 is about the same as 2-disk RAID 0.

HD Tune trial software
4x WD RE3 250GB hard drives
 
Can you add CrystalDiskMark benchmarks?

Also, would you care to have this on the front page as a full-blown review? I like it. Nice work. PM me please if you're interested.

Joe
 
Very nice! I hope to RAID some 64GB/32GB SSDs eventually and was hoping it wouldn't be too difficult on the AMD side of things. I could just consult you! ;) Nice results.
 
Can you add CrystalDiskMark benchmarks?

Also, would you care to have this on the front page as a full-blown review? I like it. Nice work. PM me please if you're interested.

Joe

I will look into CrystalDiskMark benchmarks (I haven't heard of it before, but I will look it up!)

As for putting it on the front page, that would be awesome! The review has been around for a year now, and I can say that since setting up my RAID array, I have had zero problems with it!
 