So I'm kinda new to the whole computer thing, but the little bit I do know makes me wonder: why is it that computer code is only ones and zeros? Couldn't someone create their own code based on whatever they want it to be, not just ones and zeros? Silly question, I know, but I just had to ask.
Answer by EnvoyOfTheEnd · Feb 02, 2012 at 12:54 PM
The greater the number of states or values, the greater the room for error. For simplicity, a high and a low (a 1 and a 0) are the easiest to implement, with minimal risk of one value being mistaken for the other if the signals end up a bit closer together than intended.
It is entirely possible that some interfaces do use a greater number of states; after a quick look, though, I cannot find any specific mentions.
Answer by AlanStryder · Feb 02, 2012 at 01:26 PM
Not really. The 1 and 0 of binary are really just another way to say "on" or "off", which is exactly what is happening when the computer reads them: circuits turning on or off at the machine level. The only way to increase the number of digits would be to allow a circuit to be "half off" or "mostly on", which doesn't make sense for a simple switch.
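As a side note, the fact that those on/off states are enough to represent any number can be sketched in a few lines of Python. The `to_base` helper here is purely illustrative (not from any of the answers above); it just repeatedly divides by the base and collects the remainders:

```python
def to_base(n, base):
    """Represent a non-negative integer as a list of digits in the
    given base, most significant digit first."""
    if n == 0:
        return [0]
    digits = []
    while n:
        digits.append(n % base)  # remainder is the next digit
        n //= base
    return digits[::-1]         # digits were collected least-significant first

# 13 in binary: each digit corresponds to one circuit being on (1) or off (0)
print(to_base(13, 2))  # [1, 1, 0, 1]
```

Any whole number can be broken down this way, so two states per circuit is all the hardware ever needs.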
Answer by sfrancis928 · Feb 02, 2012 at 04:28 PM
I know ternary logic (base 3 instead of base 2) has been tried in computers before, using that kind of "on, off, and half on" circuitry, but it couldn't be made to work accurately.
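The appeal of ternary is that each digit carries more information, so the same number needs fewer digits. A quick illustration in Python (the `to_base` conversion helper here is just a sketch for comparison, not part of any real ternary hardware):

```python
def to_base(n, base):
    """Represent a non-negative integer as digits in the given base,
    most significant digit first."""
    if n == 0:
        return [0]
    digits = []
    while n:
        digits.append(n % base)
        n //= base
    return digits[::-1]

# The same value needs fewer ternary digits than binary digits:
print(to_base(100, 2))  # [1, 1, 0, 0, 1, 0, 0]  -> 7 binary digits
print(to_base(100, 3))  # [1, 0, 2, 0, 1]        -> 5 ternary digits
```

The trade-off is exactly what the answers above describe: each circuit would have to distinguish three voltage levels reliably instead of two, which is much harder to do without errors.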