• Question: I think you use 0 and 1 in coding. Why not any other number?

    Asked by anon-257014 on 15 Jun 2020.
    • Photo: Sreejita Ghosh

      Sreejita Ghosh answered on 15 Jun 2020: last edited 15 Jun 2020 12:02 pm


      In coding we use commands which look similar to English words. However, the computer translates these commands into machine language, which is a binary code (0s and 1s): 0 means the off state and 1 means the on state. Computers use many transistors and semiconductor devices (microprocessors) in their ‘brain’, the central processing unit (CPU). Each different combination of output pins being on or off is translated into a different task being done by the computer.
      To know more you can check here
      https://www.bbc.co.uk/bitesize/guides/zwsbwmn/revision/1
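      A tiny sketch of the idea above, in Python: every character a programmer types has a numeric code, and that number is what ends up stored as 0s and 1s. (This is just an illustration of binary encoding, not how a real compiler works.)

```python
# Each character has a numeric code; format(..., '08b') writes that
# number as eight binary digits (bits), which is what the machine stores.
def to_binary(text):
    return ' '.join(format(ord(ch), '08b') for ch in text)

print(to_binary("Hi"))  # 'H' is 72 and 'i' is 105 in ASCII
```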

    • Photo: Louise Davies

      Louise Davies answered on 15 Jun 2020:


      Using 0s and 1s is called binary, and is part of how the parts that make up computers work. 0 means that something is turned off, and 1 means that something is turned on. Like a simple circuit where you can have a switch that turns a light on and off, you can make lots of circuits and link them together to make a computer – but underneath it all it’s still turning “things” (transistors) on and off to determine what’s going on.

      In actual coding that most people do day to day, we use things other than the numbers 0 and 1 – we use what are called “programming languages”, which basically allow us humans and the computers to communicate with each other. So I can write something like “x = 13”, and the computer will convert that “sentence” into commands that turn transistors (those things in the computer that can be turned on and off) on and off to represent “13” in the computer.
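      You can peek at that binary representation yourself. A small illustration (not what the compiler literally does, just the same number in both forms):

```python
# The decimal number 13 we write in "x = 13" is stored as binary 1101.
x = 13
print(format(x, 'b'))   # decimal -> binary digits
print(int('1101', 2))   # binary digits -> back to decimal
```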

      To be honest – we don’t even need to use 0 and 1! Like I said, these mean “on” and “off”, and it’s just become convention that 0 means “off” and 1 means “on”.

    • Photo: Steve Williams

      Steve Williams answered on 15 Jun 2020:


      The most basic reason is that at the centre of a computer is a unit called a Central Processing Unit (CPU) where all the calculations are performed. The CPU is constructed almost entirely of transistors. The most certain state of a transistor is that it is either on or off, just like a switch, and those ons and offs can easily be translated into 1s and 0s. It is a very simple principle and one which makes computers amazingly reliable when handling numbers. We as humans mostly use a decimal system, where numbers are built from powers of ten. Because computers only use 0s and 1s, their system is called a binary system.

    • Photo: Martin Coath

      Martin Coath answered on 15 Jun 2020: last edited 15 Jun 2020 12:16 pm


      Electronics that relies on a switch being on or off – but never something in between – is just more reliable. No room for uncertainty. It is easier to build huge complicated systems when the ‘state’ of every part of the system is known precisely. Computers that use other systems *can* be built, but they are ‘finicky’ (which is a weird word, you might have to look that one up! :D).

    • Photo: anon

      anon answered on 15 Jun 2020:


      Remember there are 10 kinds of people in the world… those who understand binary, and those who don’t…
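      (The punchline, checked in Python: the numeral “10” read in base 2 is the decimal number two, so “10 kinds of people” really means “2 kinds of people”.)

```python
# Interpret the string "10" as a base-2 number: it equals decimal 2.
print(int('10', 2))
```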

    • Photo: Andy Smith

      Andy Smith answered on 15 Jun 2020:


      We do use lots of other (normal!) numbers too. But, as others have already said, computers only understand 0 and 1 because the electric switches inside them (transistors) can only be on or off. So we have to convert our normal numbers (known as decimal numbers) into 0s and 1s (binary). For example, the decimal number 42 is represented as 101010 in binary. Thankfully the programming languages we use take care of all the messy conversion for us.
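      Here is a hand-rolled sketch of the “messy conversion” that languages normally hide from us: repeatedly divide by 2 and collect the remainders, which are the binary digits.

```python
# Convert a non-negative decimal number to its binary digits by hand.
def to_binary(n):
    digits = ''
    while n > 0:
        digits = str(n % 2) + digits  # remainder is the next binary digit
        n //= 2
    return digits or '0'

print(to_binary(42))  # matches the example in the answer: 101010
```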

    • Photo: Anar Yusifov

      Anar Yusifov answered on 16 Jun 2020:


      Numbers themselves are not important – it is simply a convenience that comes from how the first computers were designed (as others mentioned, in terms of transistors being “on” and “off”). But as Martin Coath mentioned, there have been attempts to build computers using different bases.

      Look up ternary computers for a good example. A few advantages: 1) three states are closer to the optimal base for storing information (this comes from information theory); 2) computing with and managing positive and negative numbers, as well as infinity, would actually be more power-efficient and easier to construct; 3) arguably, but possibly true, your code could be more portable between systems like conventional, quantum and optical computing.
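      A sketch of balanced ternary, the number system used by the Soviet Setun ternary computer: each digit (a “trit”) is −1, 0 or +1, written below as ‘-’, ‘0’ and ‘+’, which makes negative numbers as natural as positive ones – one of the advantages listed above.

```python
# Convert an integer to balanced ternary, where each trit is -1, 0 or +1.
def to_balanced_ternary(n):
    if n == 0:
        return '0'
    digits = ''
    while n != 0:
        r = n % 3
        if r == 0:
            digits = '0' + digits
        elif r == 1:
            digits = '+' + digits
            n -= 1
        else:  # r == 2 behaves as -1 with a carry into the next trit
            digits = '-' + digits
            n += 1
        n //= 3
    return digits

print(to_balanced_ternary(5))  # 5 = 9 - 3 - 1, written '+--'
```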

      To be honest, if people had time to stop the world and rethink computer architecture, that would be one of the first things to fix.