Dec = 10, Cent = 100, Mil = 1000
Using historical, global linear language sounds good to me
Post funny things about programming here! (Or just rant about your favourite programming language.)
Metric uses those for numbers less than 1, a situation that doesn't arise in computing. There is nothing less than a bit, whether it's set to 1 or 0.
We should be using KB, MB, GB, and TB. Also, we should adopt the entire International System of Units and stop with the shit we use. The army uses metric; why can't the rest of the population?
KiB, MiB, GiB, TiB
On the contrary.
KB = 1,000 bytes and MB = 1,000,000 bytes is imperial.
KiB = 1,024 bytes or 2^10^ and MiB = 1,048,576 bytes or 2^20^ is metric.
Remember, imperial is the miserable system the rest of the world abandoned because it made math and science difficult. KB makes storage miserable, since it's never clear whether you have the exact space your box claims it does. Please continue to Free^TM^ yourself from British "nonsense", while the rest of the world evolves.
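If anyone wants to see the gap in actual numbers, here's a quick Python sketch comparing decimal (SI) and binary (IEC) prefixes. The "1 TB" drive at the end is just an illustrative figure, not anyone's actual box:

```python
# Decimal (SI) vs. binary (IEC) size prefixes, side by side.

KB, MB, GB, TB = 10**3, 10**6, 10**9, 10**12        # decimal: powers of 1000
KiB, MiB, GiB, TiB = 2**10, 2**20, 2**30, 2**40     # binary: powers of 1024

print(f"1 KB = {KB:>15,} bytes    1 KiB = {KiB:>15,} bytes")
print(f"1 MB = {MB:>15,} bytes    1 MiB = {MiB:>15,} bytes")
print(f"1 GB = {GB:>15,} bytes    1 GiB = {GiB:>15,} bytes")
print(f"1 TB = {TB:>15,} bytes    1 TiB = {TiB:>15,} bytes")

# Why a drive sold as "1 TB" looks smaller in your OS:
# the marketing figure is decimal, the OS often reports binary units.
drive_bytes = 1 * TB  # hypothetical example drive
print(f'A "1 TB" drive is {drive_bytes / TiB:.3f} TiB '
      f"({drive_bytes / GiB:.1f} GiB) in binary units.")
```

Run it and the last line comes out to roughly 0.909 TiB (about 931 GiB), which is exactly the "missing" space people complain about.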