Ultra nerdy question

1. ## Ultra nerdy question

So I get the base 10 number system. We have 10 fingers or whatever, so it makes sense.

I get the base 12 number system everyone hates on. There are many factors that make it kind of better, but it's totally stupid to measure in base twelve while calculating in base 10.

What I have never questioned until now and am curious about is why the base 8 hybrid 10 for programming? I am 'good enough' to figure out and do what I consider basic programming stuff, but I am curious if anyone has a deeper understanding of what's going on here. It just looks as irrational as the 60-second, 60-minute, twice-12-hour system we see in clocks..

(Also.. Anyone have any clue wtf the French were thinking with the clock thing?)

2. ## Re: Ultra nerdy question

I have no idea. All I can say is that the French themselves don't know why clocks are like they are.

3. ## Re: Ultra nerdy question

I thought programmers were into their binary? Binary is arguably the simplest and purest positional system, and computationally the simplest to describe addition and multiplication. 8 is a power of 2. When working with a particular base, its powers become important because they behave very simply in the given base. i.e. 10, 100, 1000 in base 10.
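A quick Python sketch (my addition, not from the post) of the point about powers of the base:

```python
# In any base, powers of the base are written as a 1 followed by zeros,
# just like 10, 100, 1000 in base 10.
for n in [2, 4, 8, 16]:
    print(n, "=", bin(n))  # 0b10, 0b100, 0b1000, 0b10000

# Multiplying by the base just appends a zero -- in binary, a left shift:
x = 0b1011            # 11 in decimal
print(bin(x << 1))    # 0b10110, i.e. 22 = 11 * 2
```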

4. ## Re: Ultra nerdy question

Originally Posted by Helz
So I get the base 10 number system. We have 10 fingers or whatever, so it makes sense.

I get the base 12 number system everyone hates on. There are many factors that make it kind of better, but it's totally stupid to measure in base twelve while calculating in base 10.

What I have never questioned until now and am curious about is why the base 8 hybrid 10 for programming? I am 'good enough' to figure out and do what I consider basic programming stuff, but I am curious if anyone has a deeper understanding of what's going on here. It just looks as irrational as the 60-second, 60-minute, twice-12-hour system we see in clocks..

(Also.. Anyone have any clue wtf the French were thinking with the clock thing?)
I'm not sure what you mean by "base 8 hybrid 10".

The reason computers often use base 8 (or base 16) is historical. In the early days of computing, ASCII was invented as a standard for storing any English character in 7 bits (7 being the minimum that could satisfactorily cover every character), plus one bit for parity checking, making 8. Since that standard became so widespread, 8 bits to a byte became the norm.
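The 7-bits-plus-parity scheme can be sketched like this (my own illustration; real serial links varied in how and where they carried the parity bit):

```python
# Hypothetical sketch: 7-bit ASCII with an even-parity bit in the 8th position.
def with_even_parity(ch: str) -> int:
    code = ord(ch)
    assert code < 128                    # plain ASCII fits in 7 bits
    parity = bin(code).count("1") % 2    # 1 if the 7 data bits have odd weight
    return (parity << 7) | code          # parity bit goes on top, making 8 bits

# 'A' is 65 = 1000001 (two 1-bits, already even), so the parity bit stays 0:
print(format(with_even_parity("A"), "08b"))  # 01000001
```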

5. ## Re: Ultra nerdy question

I'm not sure what you mean by "base 8 hybrid 10"
I thought it was called something like "base 8 hybrid 10". Given your reaction I am probably very wrong there. Something about how your still counting in a base 10 system (1-10 before starting over) but then the computer is using chunks of 8.

6. ## Re: Ultra nerdy question

In programming you generally use binary (base 2), hexadecimal (base 16), and decimal (base 10), depending on the task.

I haven't seen anybody using octal (base 8) at all. Maybe what you are referring to is simply that a byte is 8 bits.
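For what it's worth, here is how the three notations look in Python (my example, not the poster's):

```python
n = 255
print(bin(n), oct(n), hex(n))  # 0b11111111 0o377 0xff

# The same value written as a literal in each base:
assert 0b11111111 == 0o377 == 0xFF == 255

# One place octal does survive: Unix file permission bits, e.g. 0o755
# for rwxr-xr-x, since each octal digit maps to one 3-bit permission group.
```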

A bit is a single 0 or 1, which you can think of in a primitive machine as literally telling the machine that a certain switch should be flipped on or off.

As oops said, a byte is the smallest unit of memory large enough to store an individual character. They could have made bytes 10 bits instead of 8 to make calculations easier for us humans, but that would be a huge waste of storage space.

Maybe the 8 vs 10 thing you're talking about is about the difference between conventions for prefixes regarding memory size. In most scientific fields Kilo means 1000, Mega 1 million, Giga 1 billion, etc. But with computing KB can mean both 1000 (10^3) bytes and 1024 (2^10) bytes. Windows uses 1024 while the Hard Drive companies use 1000. That's why your 1 TB (10^12 bytes) hard drive ends up being about 931 GB.

1,000,000,000,000 / 1024 / 1024 / 1024 = 931.3

There was a convention introduced to fix this problem where KB (kilobyte) would mean 1000 bytes and KiB (kibibyte) would mean 1024 bytes. Unfortunately, this convention is not gonna be universally adopted anytime soon, because it would basically be suicidal for a hard drive company to start selling a 1 TiB drive (which is bigger and hence should be more expensive than a 1 TB drive) and try to convince the average uneducated consumer to buy their more expensive drive over the competitor's.
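The 1000-vs-1024 arithmetic above is easy to check in a few lines (a quick sketch of my own):

```python
TB = 10**12      # decimal terabyte, what drive vendors sell
GiB = 1024**3    # binary gibibyte, what Windows reports as "GB"

print(round(TB / GiB, 1))   # 931.3 -- why a "1 TB" drive shows about 931 GB

TiB = 1024**4
print(TiB - TB)  # a true 1 TiB drive would hold roughly 99.5 GB more
```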
