If you work in IT, I'm sure you get asked the question 'How do computers work?'. Well, here's a simple post you can reference…
If you want to sound clever, you can start talking in acronyms.
Here are a few acronyms you can use:
- RAM - Random Access Memory
- CPU - Central Processing Unit
- GUI - Graphical User Interface
- CLI - Command Line Interface
By the way, many more are available on the Tech Terms website.
People will smile and nod and then probably make an excuse to leave. This is not particularly helpful. Instead, I try to use real-world examples to demonstrate the principles of computing.
Bits, Bytes and Lights
In essence, computers are simply a lot of switches. Computers use bits, where a bit is either 1 or 0 (on or off). A group of 8 bits is commonly called a byte.
To expand on this example a bit more, consider a house with eight rooms, in each room is a light switch. One room on its own is similar to a bit, where the lights are either on or off.
If you take all eight rooms together, then there can be multiple combinations of the lights being on or off in the house. The light on only in the hall could mean the letter a, whereas the light on only in the living room could be the letter b. The light on in the hall and the living room could be c, and so on. This represents a byte.
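To see the eight-switch idea in actual numbers: a real character encoding like ASCII does exactly this, mapping each pattern of eight on/off switches to a number, and each number to a letter. (The hall/living-room mapping above is just an analogy; ASCII happens to use 97 for 'a'.) A quick sketch in Python:

```python
# Eight light switches, written as 0s and 1s:
byte = "01100001"            # off, on, on, off, off, off, off, on

number = int(byte, 2)        # read the switch pattern as a binary number
letter = chr(number)         # look that number up in the character table

print(byte, "->", number, "->", letter)   # 01100001 -> 97 -> a

# With 8 switches there are 2**8 = 256 possible patterns in one house
print(2 ** 8)                # 256
```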
Then, imagine that your house is in a housing estate with 1,000 identical houses (or 1,024, depending on your school of thought); this estate represents a kilobyte, or KB. Imagine the possible combinations of lights being on or off across the estate.
Let's take this one step further and imagine that there are 1,000 estates in a city. This represents a megabyte, or MB. Across the country, there may be 1,000 of these cities, which represents a gigabyte, or GB.
A simple concept that is multiplied becomes amazingly powerful. This demonstrates the concept of storage in a computer.
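The house-to-country scaling is just repeated multiplication, which is easy to check for yourself (using the round 1,000 school of thought, with the power-of-two version shown for comparison):

```python
# Each step up the analogy multiplies by 1,000:
house = 1                  # 1 byte
estate = 1_000 * house     # kilobyte (KB)
city = 1_000 * estate      # megabyte (MB)
country = 1_000 * city     # gigabyte (GB)

print(country)             # 1000000000 bytes in a gigabyte

# The other school of thought multiplies by 1,024 (powers of two):
print(1024 ** 3)           # 1073741824 bytes
```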
If we then talk about the speed of a computer, we can use the same concept of the houses with lights, but consider how fast the lights can be turned on and off in each house as the deciding factor of computer speed, or processor power.
The faster the lights can be turned on and off, the faster the processor is. The more switches in the processor, the more it can handle at once.
Another way of making processors faster is to add more processor cores that work together. Taking the above example again, if there are a lot of lights being switched on and off in one city, the computer may dedicate one of its cores entirely to that city, and spread the load of three quieter cities over the other three cores.
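As a rough sketch of that load spreading, here is one busy "city" and three quieter ones being handed to a pool of worker processes. The city data and the `count_lights` helper are made up for the example; `ProcessPoolExecutor` is Python's standard way of using several cores at once.

```python
from concurrent.futures import ProcessPoolExecutor

def count_lights(city):
    # Stand-in for real work: count the switched-on lights in a city
    return sum(city)

cities = [
    [1, 1, 1, 1],   # busy city, lots of switching going on
    [0, 0, 1, 0],   # three quieter cities...
    [1, 0, 0, 0],
    [0, 1, 0, 0],
]

if __name__ == "__main__":
    # One worker per core; the pool spreads the cities across them
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(count_lights, cities))
    print(results)   # [4, 1, 1, 1]
```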
The other deciding factor is how many of the houses can be viewed at any one time. An old computer may only be able to look at a single estate's houses at one time, before moving on to the next estate. A new computer can look at a city, or even a whole country at a time (i.e. multiple cities).
This concept represents Memory, or Random Access Memory. The more memory the computer has, the more that can be loaded into it at one time to be processed.
Increasing Computing Power
I think people understand that computer performance is improving all the time, but I don't think they appreciate the rate of this improvement.
There is an observation called ‘Moore’s Law’ that states:
The number of transistors in a dense integrated circuit approximately doubles every two years.
To explain this with our example, the transistor is our light switch and the dense integrated circuit is the processor.
You may be asking what this means for you, but it generally means that if you buy a computer now, a new computer in two years' time would have double the performance. If nothing else, this is a great excuse to buy yourself a new computer every two years…
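Doubling every two years adds up surprisingly fast, which is easy to show with a little arithmetic (the starting figure of 1,000 transistors is just a made-up round number):

```python
def moores_law(transistors_now, years):
    # One doubling per two-year cycle
    cycles = years // 2
    return transistors_now * 2 ** cycles

print(moores_law(1_000, 2))    # 2000  (one doubling)
print(moores_law(1_000, 10))   # 32000 (five doublings in a decade)
```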
I hope this makes sense and helps you understand how computers work.
Tagged with random, computers