The problem is that each time you go up another unit, the binary and decimal units diverge further.
It rarely mattered much when you were only talking about the difference between kibibytes and kilobytes. In the 1980s, with the amounts of memory and storage available, the difference was minor, so the decimal unit was a good approximation for most purposes. But as we deal with larger amounts of data, the error becomes more significant.
This is exactly right. Divergence was small when sizes were small. Good point.
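To make the divergence concrete, here is a minimal sketch (Python, with names of my own choosing) that prints how far the binary unit overshoots the decimal unit at each successive prefix:

```python
# Compare decimal units (powers of 1000) with binary units (powers of 1024)
# and show the percentage gap at each prefix.
prefixes = ["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi", "peta/pebi"]

for power, name in enumerate(prefixes, start=1):
    decimal = 1000 ** power
    binary = 1024 ** power
    gap = (binary - decimal) / decimal * 100
    print(f"{name:10s}  decimal={decimal:>19,d}  binary={binary:>19,d}  gap={gap:5.1f}%")
```

Running this shows the gap growing from about 2.4% at the kilo/kibi level to roughly 12.6% at the peta/pebi level, which is why the approximation that was harmless in the 1980s becomes a real error at today's sizes.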