Oh, really? Do people really understand that when talking about bandwidth or processor speed the prefixes have decimal meanings?
Anyone technical enough to know about bytes, bits and hertz would be aware of the peculiarity of measuring quantities of bytes.
I'm not saying it's correct; it's just the way it is. Standards are often illogical. For example, how many standard definitions of a 'mile' are there? Imperial, US Survey, Nautical, Roman, etc. Which one of those is correct?
This could make a good poll for the other Niners:
How many of you are using or plan to use correct SI units?
| one kibibit | 1 Kibit = 2^10 bit = 1024 bit |
| one kilobit | 1 kbit = 10^3 bit = 1000 bit |
| one mebibyte | 1 MiB = 2^20 B = 1 048 576 B |
| one megabyte | 1 MB = 10^6 B = 1 000 000 B |
| one gibibyte | 1 GiB = 2^30 B = 1 073 741 824 B |
| one gigabyte | 1 GB = 10^9 B = 1 000 000 000 B |
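The gap between the two conventions is exactly why drive capacities look "smaller" once plugged in: the vendor counts in decimal gigabytes, while many OSes divide by 2^30. A quick sketch (function names are mine, just for illustration):

```python
def decimal_gb(n_bytes):
    # SI convention: 1 GB = 10^9 bytes
    return n_bytes / 10**9

def binary_gib(n_bytes):
    # IEC convention: 1 GiB = 2^30 bytes
    return n_bytes / 2**30

# A drive sold as "256 GB" holds 256 * 10^9 bytes...
drive = 256 * 10**9
print(decimal_gb(drive))  # 256.0 -- what the box says
print(binary_gib(drive))  # ~238.4 -- what an OS counting in 2^30 reports
```

Same byte count, two honest answers, depending on which prefix definition you pick.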