Computers are super neat. See all this stuff on the screen in front of you? See all those other tabs with recipes for apple turnovers and Wikipedia articles about The A-Team? That's computers. Sure, it was possible for people to share information about recipes and popular culture before computers, half a generation ago, but it was way harder. In the same way, it was possible to do a lot of fiddly math like cracking Nazi codes and computing gunnery firing tables before the early computers got running. Computers made all of this way easier, and enabled a bunch of stuff that wasn't otherwise possible at all, like the movie WarGames starring Matthew Broderick and OpenSSL vulnerabilities. But let's focus on the part where computers take stuff that we would otherwise be doing anyway and make it way faster and easier.
It didn't start out that way.
Way back when, if you wanted to use a computer just to add two smallish numbers, you had to do all kinds of crazy stuff like learn binary, understand the difference between a half-adder and a full adder, the difference between one's complement and two's complement arithmetic, and then read a bunch of manuals written by some guys with skinny ties at IBM just so you could turn all of that into some inscrutable machine language. Then, you had to punch holes in these things that were like index cards the size of dollar bills from the 1890s. Finally, you would feed them into a machine that made a lot of noise and you would get your answer.
This is all way, way harder than just learning how to add and subtract, but computing had to start somewhere.
It is with this in mind that I have been trying to get a deeper understanding of low-level computer operations, and that's what prompted me to create Bedrock, a Ruby script that simulates something like a simple computer. It's a lot easier to program than a real machine at this level, but it gets the basic ideas across.
Bedrock simulates a computer whose memory is an array of 100 locations (0-99), each of which stores a three-digit decimal number (000-999). Instructions are stored as three-digit numbers just like the other data, so this is an example of a von Neumann architecture, the kind that most computers use today. There are also two registers, Fred and Barney, and an index register whose purpose I will explain later.
When treated as instructions, the first digit of each "word" is taken to be the opcode (though not in the case of the special instructions I'll explain later). The last two digits are taken to represent an address in memory (one of the 100 cells numbered 0-99). Each opcode does something different with what it finds at that address, and that something is what makes it useful to the poor programmer trying to get anything done. Here are the opcodes that work in this way, taking "AA" to stand in for the address portion of the instruction:
1AA - Load the contents of the memory address into Fred
2AA - The same, but with Barney instead of Fred
3AA - Store the contents of Fred into memory at the given location
4AA - The same, but with Barney instead
5AA - Move program execution to the given address on the next cycle
6AA - The same, but only if Fred is zero
7AA - Add the contents of the given address to Fred
8AA - The same, but for Barney
9AA - Don't do anything. Reserved for future additions
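To make the dispatch concrete, here is a minimal sketch of how these opcodes could be handled in Ruby. The structure and names (`run`, `fred`, `barney`, `pc`) are my own guesses at what such a loop looks like, not the actual Bedrock source, and the index register is left out to keep the example short:

```ruby
# A sketch of the fetch/decode/execute loop for the addressed opcodes.
# Memory is 100 cells of 000-999; Fred and Barney are the registers.
def run(memory)
  fred = barney = 0
  pc = 0                    # program counter: which cell to execute next
  loop do
    word = memory[pc]
    opcode  = word / 100    # first digit of the three-digit word
    address = word % 100    # last two digits, the "AA" part
    pc += 1
    case opcode
    when 1 then fred = memory[address]                      # load into Fred
    when 2 then barney = memory[address]                    # load into Barney
    when 3 then memory[address] = fred                      # store Fred
    when 4 then memory[address] = barney                    # store Barney
    when 5 then pc = address                                # jump
    when 6 then pc = address if fred.zero?                  # jump only if Fred is zero
    when 7 then fred = (fred + memory[address]) % 1000      # add to Fred
    when 8 then barney = (barney + memory[address]) % 1000  # add to Barney
    when 9 then nil                                         # reserved, does nothing
    else break  # 0xx instructions (halting and friends) are covered later
    end
  end
  [fred, barney]
end

# Add the numbers in cells 10 and 11 and leave the sum in cell 12:
program = Array.new(100, 0)
program[0], program[1], program[2] = 110, 711, 312
program[10], program[11] = 5, 7
fred, = run(program)
# fred is now 12, and so is cell 12
```

Whether sums wrap around at 999 (the `% 1000` above) is my assumption; a real three-digit machine has to do something when an addition overflows, and wrapping is the simplest choice.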
I mentioned an index register earlier, and the addresses used by the above instructions are actually formed not by the "AA" part alone, but by "AA" plus the contents of the index register. This is called a "direct indexed" addressing mode by skinny tie IBM types, and is the only addressing mode that Bedrock has.
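In Ruby terms, the effective address might be computed like this. It's a sketch: whether Bedrock wraps around past cell 99 (as below) or treats that as an error is my assumption, not something the script necessarily does:

```ruby
# Effective address under "direct indexed" addressing: the AA digits
# plus the index register, wrapped to stay within the 100 cells.
# (The wrapping past cell 99 is an assumption on my part.)
def effective_address(aa, index)
  (aa + index) % 100
end
```

So an instruction like 142 with the index register at 3 would load from cell 45, not cell 42.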
The point of such a mode is to make it easier to step through portions of memory, since incrementing and decrementing the index register are operations with their own instructions. You don't see them above because they belong to a special class of instructions. These all begin with "0" and don't concern themselves with memory addresses at all. They are as follows:
000 - Halt. Stop doing anything at all.
010 - Zero out the index register.
011 - Decrement (subtract one from) the index register.
012 - Increment (add one to) the index register.
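Dispatching these is even simpler than the addressed instructions, since the whole word identifies the operation. A sketch of one way to do it (my own structure; treating 000 and any not-yet-implemented 0xx word as a halt, signalled by `nil`, is a choice I made for the example):

```ruby
# Handle a "0"-prefixed instruction: the full word selects the operation.
# Returns the new index register value, or nil to signal a halt.
def special(word, index)
  case word
  when 10 then 0          # 010 - zero the index register
  when 11 then index - 1  # 011 - decrement
  when 12 then index + 1  # 012 - increment
  else nil                # 000 (halt), or anything not yet implemented
  end
end
```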
You can see that there are only four of these instructions at the moment, but there is room for as many as 100 (000 through 099), so there is considerable room for expansion if I feel it is needed.
Now, if this were a proper computer, somebody would have written something called an assembler (I still might, but I haven't yet). An assembler is a program that produces these strings of numbers after reading something that approximates source code, but is mostly cryptic three-letter codes. The advantage of the three-letter codes is that they're easier to remember than the even more cryptic numbers the computer understands. I don't have an assembler, but I do have the three-letter codes. Here are the ones that correspond to the instructions above:
LDF - 1AA - Load the contents of the memory address into Fred
LDB - 2AA - The same, but with Barney instead of Fred
STF - 3AA - Store the contents of Fred into memory at the given location
STB - 4AA - The same, but with Barney instead
JMP - 5AA - Move program execution to the given address on the next cycle
BRZ - 6AA - Branch if zero: the same, but only if Fred is zero
ADF - 7AA - Add the contents of the given address to Fred
ADB - 8AA - The same, but for Barney
??? - 9AA - Don't do anything. Reserved for future additions
HLT - 000 - Halt. Don't do anything else.
ZEI - 010 - Zero out the index register.
DEI - 011 - Decrement (subtract one from) the index register
INI - 012 - Increment (add one to) the index register
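Even without a real assembler, a lookup table gets most of the way there. Here's a sketch of what the translation could look like in Ruby; the table names and the one-mnemonic-per-line input format are my own invention, not anything Bedrock actually has:

```ruby
# A micro-assembler sketch: a mnemonic (plus an optional two-digit
# address) becomes one three-digit word.
OPCODES  = { "LDF" => 1, "LDB" => 2, "STF" => 3, "STB" => 4,
             "JMP" => 5, "BRZ" => 6, "ADF" => 7, "ADB" => 8 }
SPECIALS = { "HLT" => 0, "ZEI" => 10, "DEI" => 11, "INI" => 12 }

def assemble(line)
  mnemonic, address = line.split
  return SPECIALS[mnemonic] if SPECIALS.key?(mnemonic)
  OPCODES[mnemonic] * 100 + address.to_i
end

words = ["LDF 10", "ADF 11", "STF 12", "HLT"].map { |l| assemble(l) }
# words is now [110, 711, 312, 0]
```

That's the whole advantage in one line: "LDF 10" is much easier to remember and read back than 110.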
Programming the thing is way more fun than actually doing arithmetic, which I suspect is the main reason that computers caught on with engineers in the first place, even if it wasn't strictly easier to get anything useful done, per se.