Ultra nerdy question
  1. ISO #1

    Ultra nerdy question

    So I get the base-10 number system. We have 10 fingers or whatever, so it makes sense.

    I get the base-12 number system everyone hates on. There are many factors that make it kind of better, but it's totally stupid to measure in base twelve while calculating in base 10.

    What I have never questioned until now, and am curious about, is why the "base 8 hybrid 10" for programming? I am 'good enough' to figure out and do what I consider basic programming stuff, but I am curious if anyone has a deeper understanding of what's going on here. It just looks as irrational as the 60-second, 60-minute, 12-hour-twice system we see in clocks.

    (Also... anyone have any clue wtf the French were thinking with the clock thing?)

  2. ISO #2

    Re: Ultra nerdy question

    I have no idea. All I can say is that the French themselves don't know why clocks are like they are.

  3. ISO #3

    Re: Ultra nerdy question

    I thought programmers were into their binary? Binary is arguably the simplest and purest positional system, and computationally the simplest in which to describe addition and multiplication. 8 is a power of 2. When working with a particular base, its powers become important because they behave very simply in that base, e.g. 10, 100, 1000 in base 10.
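
    A minimal Python sketch of that point (the specific numbers are just illustrative): powers of 2 written in binary look like 10, 100, 1000 do in base 10, and multiplying by a power of 2 is just a bit shift.
    Code:
    # Powers of 2 are a 1 followed by zeros in binary,
    # exactly like powers of 10 in decimal.
    for p in (1, 2, 4, 8, 16):
        print(f"{p:2d} = 0b{p:b}")

    x = 13                 # arbitrary example value, 0b1101
    print(bin(x * 8))      # 0b1101000
    print(bin(x << 3))     # 0b1101000 -- multiplying by 8 is a shift left by 3 bits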

  4. ISO #4

    Re: Ultra nerdy question

    Quote Originally Posted by Helz View Post
    What I have never questioned until now, and am curious about, is why the "base 8 hybrid 10" for programming? It just looks as irrational as the 60-second, 60-minute, 12-hour-twice system we see in clocks.
    I'm not sure what you mean by "base 8 hybrid 10".

    Computers often use base 8 (or base 16) largely for historical reasons. In the early days of computing, ASCII was invented as a standard for storing any English character in 7 bits (7 being the minimum that could satisfactorily cover every character), plus one bit for error checking (a parity bit), which makes 8. Since that standard became so widespread, 8 bits to a byte became the norm.
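
    A small illustrative sketch in Python (the characters chosen are arbitrary): every ASCII code point fits in 7 bits, and the byte that stores it has one bit left over.
    Code:
    for ch in "Az~":                         # arbitrary sample characters
        code = ord(ch)                       # ASCII code point, e.g. 'A' -> 65
        print(ch, code, format(code, '07b'))
    # A 65 1000001
    # z 122 1111010
    # ~ 126 1111110   <- the highest printable ASCII character still fits in 7 bits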

  5. ISO #5

    Re: Ultra nerdy question

    Thanks for the heads up
    Quote Originally Posted by oops_ur_dead View Post
    I'm not sure what you mean by "base 8 hybrid 10"
    I thought it was called something like "base 8 hybrid 10". Given your reaction I am probably very wrong there. Something about how you're still counting in a base-10 system (0 through 9 before starting over) but then the computer is using chunks of 8.

  6. ISO #6

    Re: Ultra nerdy question

    In programming you generally use binary (base 2), hexadecimal (base 16), and decimal (base 10), depending on the task.

    I haven't seen anybody using octal (base 8) at all. Maybe what you are referring to is simply that a byte is 8 bits.

    A bit is a single 0 or 1, which you can think of in a primitive machine as literally telling the machine that a certain switch should be flipped on or off.

    As oops said, a byte is the smallest unit of memory large enough to store an individual character. They could make bytes 10 bits instead of 8 in order for calculations to be easier for us as humans, but that would be a huge waste of storage space.
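
    To make the base relationship concrete, here is a rough Python illustration (the byte value is arbitrary): one hex digit covers exactly 4 bits, so a byte is always two hex digits, while an octal digit covers 3 bits, which doesn't divide 8 evenly.
    Code:
    b = 0b10110101                 # one arbitrary byte, written out in binary
    print(b)                       # 181  (decimal)
    print(oct(b), hex(b))          # 0o265 0xb5
    print(f"{b:08b} -> {b:02x}")   # 10110101 -> b5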



    Maybe the 8 vs 10 thing you're talking about is about the difference between conventions for prefixes regarding memory size. In most scientific fields Kilo means 1000, Mega 1 million, Giga 1 billion, etc. But with computing KB can mean both 1000 (10^3) bytes and 1024 (2^10) bytes. Windows uses 1024 while the Hard Drive companies use 1000. That's why your 1 TB (10^12 bytes) hard drive ends up being about 931 GB.

    1,000,000,000,000 / 1024 / 1024 / 1024 = 931.3

    There was a convention introduced to fix this problem, where KB (kilobyte) would mean 1000 bytes and KiB (kibibyte) would mean 1024 bytes. Unfortunately this convention is not going to be universally adopted anytime soon, because it would be basically suicidal for a hard drive company to start selling a 1 TiB drive (which is bigger, and hence should be more expensive, than a 1 TB drive) and try to convince the average uneducated consumer to buy their more expensive drive over the competitor's.
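
    The arithmetic above as a quick Python sketch (the drive size is just the 1 TB example from this post):
    Code:
    drive_bytes = 10**12               # marketed as "1 TB" (decimal prefixes)
    gib = drive_bytes / 1024**3        # what an OS using binary units reports
    print(round(gib, 1))               # 931.3 -- shown as "931 GB"

    KB, KiB = 10**3, 2**10
    print(KB, KiB)                     # 1000 1024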

  7. ISO #7

    Re: Ultra nerdy question

    It's because the original ASCII code was 7-bit, and a byte is usually the smallest unit of information large enough to hold a single character. But processors prefer sizes (in bits) that are powers of two, and thus bytes are now 8 bits on most machines.
