
Why!?!?!


Nevandal

When you look at files on your computer, they are in megaBYTES. When you download a file, it says it is a certain number of megaBYTES. But when you download, it says how fast it is going in kiloBITS..... what's up with that? Why don't they just stay with bytes? It is so much simpler to figure out how fast you are actually going. It sucks dividing by 8 every time you wanna see your actual speed :(



WHY!?!?!?
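For reference, the divide-by-8 really is the whole conversion. A quick Python sketch (the 512 kbit/s figure is just a made-up example speed):

def kbits_to_kbytes(kilobits_per_sec):
    # 8 bits per byte, so a line rate in kilobits/s is 8x the payload rate in kilobytes/s
    return kilobits_per_sec / 8

print(kbits_to_kbytes(512))  # a "512 kbps" connection tops out around 64.0 KB/s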
 
Listing network transfer rates in bits instead of bytes makes them appear less pathetically slow.
 
It's not actually two different measurements; rather, one uses binary and the other uses decimal prefixes.

A decimal megabyte = 1,000,000 bytes (10 to the 6th power)
A binary megabyte = 1,048,576 bytes (2 to the 20th power)

So- Windows uses DECIMAL (appears smaller), and Hard Drive Manufacturers use BINARY (appears more). Binary is actually more precise. Decimal is kind of like rounding.

As far as it being Windows' fault? No. Win3.1 used binary. CMOS setups were the ones to make the change.

Someone wrote this online as a complaint:
Decimal Deception - This week I got a 200 GB hard-drive for my primary PC. When Windows 2000 booted up, it found a new 186 GB hard-drive. Where did the missing 14 GB go? The truth is it never existed. Western Digital is using marketing lies to fool the consumer. Below is some information that isn't found anywhere on the outside of the box, but it is mentioned inside the installation guide.

...the capacity of the hard drive can be reported in either decimal gigabytes (where 1 GB = 1,000,000,000 bytes) or in binary gigabytes (where 1 GB = 1,073,741,824 bytes).

Decimal Gigabyte? That's marketing lingo for let's screw the consumer, because most people forget that 1 K is 1024 bytes, not 1000 bytes. The WD rep on the phone claimed that other hard-drive manufacturers do the same trick. Great! When hard-drives were smaller this discrepancy probably went unnoticed or was too small to complain about. As hard-drives get larger, the difference between the decimal size and the TRUE binary size will widen, and so will customer dissatisfaction.

First off- binary is more accurate than decimal (so HDD manufacturers are more correct). Second, he is in error for not knowing which method they used to measure their hard drive space. Marketing ploys have nothing to do with it. When I see a 40GB HDD, I know that Windows will report the space in decimal format (38.8GB, estimating). I know that DOS and some CMOS utilities will report that same HDD in binary (40GB, actual). Who is correct? Both.


AS FAR AS INTERNET:

I think the whole thing is ridiculous and time consuming. It's a waste. Internet speeds are increasing, so let's use the standard MB instead of Mbits... I wish ISPs and programmers would start the change.
 
diggingforgold said:
It's not actually two different measurements; rather, one uses binary and the other uses decimal prefixes.

A decimal megabyte = 1,000,000 bytes (10 to the 6th power)
A binary megabyte = 1,048,576 bytes (2 to the 20th power)


So- Windows uses DECIMAL (appears smaller), and Hard Drive Manufacturers use BINARY (appears more). Binary is actually more precise. Decimal is kind of like rounding.


No. They are both just as accurate. (And Windows uses the binary format, hard drive manufacturers use decimal; decimal looks like more.)

It has nothing to do with MS or accuracy. In normal everyday use a kilo means one thousand. In computer/programming use, a kilo has always meant 1024.
This is because 1024 is the closest you can get to 1000 with only one bit turned on. 2^10 = 1024.

10000000000 in binary = 1024.
01111101000 in binary = 1000.

So a kilobyte is as close as you can get to 1000 bytes without getting messy. That's where it came from.

Of course then a megabyte is 1024 kilobytes and so on.
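A quick illustration of the two conventions in Python (nothing beyond the definitions above):

# Decimal (SI) prefixes vs. binary prefixes, counted in bytes
KB, MB, GB = 10**3, 10**6, 10**9         # kilo, mega, giga: powers of 1000
KiB, MiB, GiB = 2**10, 2**20, 2**30      # the binary versions: powers of 1024

print(KiB, MiB, GiB)    # 1024 1048576 1073741824
print(MiB - MB)         # 48576 bytes of disagreement already at the "mega" level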
 
NookieN said:
Listing network transfer rates in bits instead of bytes makes them appear less pathetically slow.

This is the correct answer.

For almost the same reason, HDD manufacturers measure hard drives in decimal gigabytes rather than binary gigabytes.

The conversion factor between the two isn't constant, either:

1 binary unit = 2^(10n) bytes, while 1 decimal unit = 10^(3n) bytes

where n is a positive integer (n = 1 for kilo, 2 for mega, 3 for giga, ...), so the ratio between them is 1.024^n and it grows with each prefix.
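To see how fast that gap widens, a small Python loop (purely illustrative):

# Ratio between the binary and decimal unit at prefix level n: 2**(10*n) / 10**(3*n) == 1.024**n
for n, name in enumerate(["kilo", "mega", "giga", "tera"], start=1):
    ratio = 2**(10 * n) / 10**(3 * n)
    print(f"{name}: binary unit is {(ratio - 1) * 100:.1f}% larger")
# kilo: 2.4%, mega: 4.9%, giga: 7.4%, tera: 10.0%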
 
hah

just found this in maxpc... what a coincidence



not an actual quote, summarising mostly

Reader: Recently I thought I had purchased a 120 GB HDD, but when I got it plugged in it only had 111 GB. What gives?

Maxpc: This isn't a new issue. It comes down to how you define 1 GB. Windows uses one definition while HDD manufacturers use another. A hard drive vendor said: "Hard drives use gigabytes, or 10^9 bytes. MS uses GIBIBYTES and labels them as gigabytes. MS's gigabyte is 1,073,741,824 bytes, thus it subtracts roughly seven percent: 120×10^9 divided by 1,073,741,824 = 111.7 gibibytes, NOT gigabytes as MS would have you believe. According to NIST, 1 GB is 1,000,000,000 bytes." Further quoting NIST: "Once upon a time, computer professionals noticed that 2^10 was nearly equal to 1,000 and started using the SI prefix kilo to mean 1,024. That worked out well enough for a decade or two because everybody who talked kilobytes knew that the term implied 1,024 bytes. But almost overnight a much broader 'everybody' owned computers, and the trade computer professionals needed to talk to physicists and engineers and even to ordinary people, most of whom know that a kilometer is 1,000 meters and a kilogram is 1,000 grams.

"then data storage for gigabytes and even TB became practical and the storage devices were not constructed on binary trees which meant that fofr many practical pourposes binary arithmatec was less convienient than decimal arith. The result is taht today everybody does not know what the MB is. When discussing comp memory most manuf. use mb to mean 2^20=1048576 bytes but the manf of some comp storage devices usually use the termp to mean 1,000,000 bytes. Similarly some designers of LAN use megabits per second to mean 1048576 bit/s but all of tellecom. engineers use it to mean 10^6 bit/s or 1000000 bit/s. And if two definitions of the MB are not enough , a third MB of 1024000 bytes is teh MB used to format the floppy disks. the confusion is real as is the potential for incompatibility for standards and in implimented systems.

Isn't this the same problem that sent the Mars probe hurtling into the planet's surface? Damn you, metric system!
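The arithmetic from that quote, spelled out in Python (nothing here beyond the numbers already given):

advertised = 120 * 10**9           # "120 GB" as the drive maker counts it
in_windows = advertised / 2**30    # the same bytes counted in binary gigabytes
print(round(in_windows, 1))        # 111.8 -- the "missing" ~8 GB never existed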


BLAH BLAH

more info on this confusing topic is available at

http://physics.nist.gov/ccu/units/binary.html

excuse spelling errors and using BLAH BLAH to summarise their long boring paragraphs
 
WELL.... bits and bytes were why I started this thread.

For some reason it launched into a binary / decimal conversion of hard drive amounts.

I already know my 120 gig HD is gonna be around 112 :) I did the calculations myself....mwahhaa

Why don't the HDD manufacturers just advertise in binary? God knows...
 
DarkJediSleikas said:


This is the correct answer.

For almost the same reason, HDD manufacturers measure hard drives in decimal gigabytes rather than binary gigabytes.

The conversion factor between the two isn't constant, either:

1 binary unit = 2^(10n) bytes, while 1 decimal unit = 10^(3n) bytes

where n is a positive integer (n = 1 for kilo, 2 for mega, 3 for giga, ...), so the ratio between them is 1.024^n and it grows with each prefix.

IT IS ALL MARKETING BS!

I think people would accept it if you explained it to them... and then advertised them in binary gigs..... and your download speed showed as kiloBYTES..... they would understand. AND it would make more sense (I'm not confused though... I'm not stupid, etc.) just to frickin' label it the correct way :)
 
The only reason that I can really see (other than the obvious 'to make it seem faster' explanation) is that sometimes the definition of a byte itself varies. Nowadays, a byte is 99% of the time 8 bits. However, back in the day, there used to be 6 and 7 bit bytes.

Also, if you measured your speed in KB/s, would you measure KB of data received / second, or KB of file received / second? Parity and other things can 'slow down' file transfer because you only get 7 bits of file, then the 8th bit would be parity.

Dunno how much parity is used anymore, but after stating communication speeds for over 10 years in Kb/s, people just tend to keep things the same. :)

JigPu
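To put rough numbers on JigPu's parity/framing point: with classic async serial framing (1 start bit, 8 data bits, optional parity, 1 stop bit), a fair chunk of the bits on the wire aren't file data at all. A sketch, assuming that framing and a made-up 56 kbit/s line:

def effective_bytes_per_sec(line_bits_per_sec, data_bits=8, parity_bits=0, start_bits=1, stop_bits=1):
    # each payload byte costs start + data + parity + stop bits on the wire
    bits_per_byte_on_wire = start_bits + data_bits + parity_bits + stop_bits
    return line_bits_per_sec / bits_per_byte_on_wire

print(effective_bytes_per_sec(56000))                 # 8N1: 5600.0 bytes/s
print(effective_bytes_per_sec(56000, parity_bits=1))  # 8E1/8O1: ~5090.9 bytes/s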
 
Nevandal said:
I think people would accept if you explained it to them...and then advertised them in binary gigs.....and your download speed showed as kiloBYTES.....they would understand.

I dunno, it never ceases to amaze me the things people don't understand. But the rated speed is still arbitrary. You could rate connection speeds in nibbles if you wanted to. Fear my 375 Knibble/s connection!
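Just to underline how arbitrary the unit choice is, here's the same hypothetical 1.5 Mbit/s line expressed three ways (Python, made-up numbers):

line_bits_per_sec = 1500000              # hypothetical 1.5 Mbit/s connection
print(line_bits_per_sec / 1000)          # 1500.0 kbit/s
print(line_bits_per_sec / 8 / 1000)      # 187.5 KB/s
print(line_bits_per_sec / 4 / 1000)      # 375.0 Knibble/s (a nibble is 4 bits)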


Dunno how much parity is used anymore, but after stating communication speeds for over 10 years in Kb/s, people just tend to keep things the same.

I'm pretty sure most transmission protocols use a CRC or a better error detection/correction mechanism (e.g. Reed-Solomon). Most of the time that's done at the hardware level though. So if your program is reporting 100 KB/s, that's how much data the program receives, even though the hardware might be handling 101 KB/s.
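A back-of-the-envelope sketch of that last point, counting only protocol headers (typical Ethernet/IP/TCP sizes, no options, full-size frames; real links and link-level error correction will vary):

# Ethernet header + IP header + TCP header + payload + frame check sequence, in bytes
frame = 14 + 20 + 20 + 1460 + 4
payload = 1460
efficiency = payload / frame
print(f"{efficiency:.1%}")                   # ~96.2% of the wire bytes are payload

app_rate_kb_s = 100                          # what the download program reports
print(round(app_rate_kb_s / efficiency, 1))  # ~104.0 KB/s actually moving on the wire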
 