  CHURCHFIELD  SERVICES ... Solving computer issues for over 30 years.
  WAZZup with computers
The very basic computer primer.
WAZZup with Internet Modems, Routers and Switches Those IPv4 numbers.
WAZZup with PORTS and PROTOCOLS The hacker's key to the back-door.
WAZZup with Passwords and Usernames The simple way to security.

The basic computer primer:
For those who have not a stinkin' clue how a computer works.

In the beginning there was "thought process". As far back as the Neanderthals, humans needed a way to communicate with one another. The first steps were grunts, then more grunts, then grunts with pointing...then loud grunts with excessive gestures and finally a good "bitch slap" upside the chops...as in "wake up asshole"!!
Well, it is time to wake up. The very first computer was the human brain. We actually used thought process to switch a "one bit" operation known as "yes or no" on and off. We kept those processes like a program in our brain. That led us to crazy things like using fire, bathing in "clean" water, wiping our ass with leaves and eventually inventing neat things like the wheel. We learned how to communicate and procreate as soon as we figured out where to put that thing between our legs. From a few primates, we became the most intelligent species walking on the planet. Then someone became a politician and human progress was stymied for centuries. Still is...
The computer is nothing more than an "on and off" switch: a "yes or no" device that uses positive and negative voltage, switched rapidly in programmed cycles that are controlled by an operating system designed for a "central processing unit" (CPU).
Every "processing" unit has a DATA PATH, that path carries along the "zeros and the ones" (BIT) in a perfect order, one BIT at a time. After the BIT count meets the "operating systems criteria" (BIT + BIT + BIT and so on) the BITS are added up to make the BYTE. The BYTE is the actual DATA. In other words a 4 BIT CPU, renders data from a BYTE that is 4 BITS long. It can do this because the DATA PATH is 4 BITS. In simple terms there is a D0-D3 termination point wired into the CPU.
The computer instruction is made up of a series of BYTES. Since 4 BITS cannot represent much understandable, human-readable information, it must be converted. Very early computers were programmed (tediously) using "machine codes" that could be interpreted by the CPU's I/O management system. These codes looked very similar to this: "JMP" or "MUL". There are hundreds if not thousands of these codes. https://wikibooks.org/computing (reference)
As the computer graduated from basic to complex, a more efficient way of entering data had to be offered. This came when we broke the 4 BIT barrier and entered the 8 BIT world. Now we could create a language set with English interpretation. Yes, we still spoke to the computer using "instruction sets" that could be understood by the CPU, but now we added an interpreter. The ASCII codes were created around the early 1960's and there have been some revisions since. The first "computer interpretation" came along in the very early 1970's. It was an 8 BIT code generated by the "keyboard processing unit". The user typed an "A" and the generator sent the 8 BIT code "0100 0001" (decimal 65). Each "1" represents an ON BIT, and the 7 BIT ASCII character set covers the decimal values "0" through "127".
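A quick sketch of that keyboard step (plain Python here; the ord() lookup is standing in for the "keyboard processing unit") shows the same "A" becoming decimal 65, binary 0100 0001:

    key = "A"
    code = ord(key)               # ASCII lookup: "A" becomes decimal 65

    print(code)                   # 65
    print(format(code, "08b"))    # 01000001 -- the 8 BIT pattern sent down the line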
If we look at the BYTE (8 BITS) as a decimal number, we see that (from right to left) the "0"s and "1"s can represent a character code. All zeros = "NUL"; anything else is DATA. In the above code the BYTE (8 BITS) is split into two 4 BIT segments, the LSBs and MSBs (least and most significant BITS). The LSBs represent (from RIGHT to LEFT) "1, 2, 4, and 8". The MSBs represent "16, 32, 64, 128". Added together, the representation is the decimal numbers 0-255. That is 256 possible values for an 8 BIT path; a 16 BIT path gives 65,536 and 32 BITS gives 4,294,967,296 (and on and on). Now if we turn on the "64" BIT we generate a decimal code of 64. If we add the "1" BIT we now generate a "65" decimal code, which represents "A", in binary "01000001". The first letter in the alphabet "A", or the name Andy, or the word Altitude, or Asshole. You get the point. It requires six more of these codes to spell out "ASSHOLE". https://wikipedia.org/ascii (reference).
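Here is the same arithmetic as a short sketch (plain Python), splitting the BYTE into its two 4 BIT halves, adding up the place values, and counting how many codes it takes to spell the word:

    byte = 0b01000001                  # the "A" from above, decimal 65

    low_nibble  = byte & 0x0F          # least significant 4 BITS: place values 1, 2, 4, 8
    high_nibble = (byte >> 4) & 0x0F   # most significant 4 BITS: place values 16, 32, 64, 128

    print(low_nibble, high_nibble)     # 1 and 4, so 1 + (4 * 16) = 65
    print(2 ** 8, 2 ** 16, 2 ** 32)    # 256, 65536, 4294967296 values per path width

    word = "ASSHOLE"
    print([ord(c) for c in word])      # [65, 83, 83, 72, 79, 76, 69] -- seven codes in all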
The TERMINAL unit had a standard "qwerty" keyboard, similar to a typewriter. The TERMINAL UNIT was designed this way because most of the DATA entry would be done by secretaries who already knew the typewriter. The keyboard was also used in "radio teletype" communications, so why not adapt the keyboard and the "ASCII" character set and design an interpreter around that criteria. If the CPU "instruction" requires "LDA", now the machine code could be entered in plain English as opposed to decimal codes on a keypad or punched holes in a card. Now the TERMINAL sends the codes to the interpreter, which delivers the proper instruction to the CPU.
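As a rough sketch of that interpreter step (plain Python; the opcode numbers below are purely illustrative, not any particular CPU's real instruction table), three typed characters become one instruction the CPU can use:

    # Keystrokes arrive from the TERMINAL as ASCII characters.
    typed = ["L", "D", "A"]
    mnemonic = "".join(typed)          # "LDA" -- plain English instead of raw numbers

    # Illustrative lookup table only; a real CPU defines its own opcodes.
    opcode_table = {"LDA": 0xA9, "JMP": 0x4C, "NOP": 0xEA}
    print(hex(opcode_table[mnemonic])) # the numeric instruction handed to the CPU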
The CPU also has a "memory management unit" (MMU). Now DATA, whether code generated from the keyboard or read from a floppy (input), could be stored for use by the CPU until the program ended or the unit was shut down. The CPU is the center of the computer, and in the beginning it was a very primitive four bit device.
Programs that operate the CPU are simply an instruction set which starts and ends with "CPU instructions" designed to (A) "instruct" the CPU to control external devices, (B) "wait" for input from external devices and (C) evaluate the DATA and complete the process. Then it does this all over again for each "process cycle", which today runs at a clock frequency in the 3 GHz range. The CPU clock cycle determines the "base speed" of the CPU.
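As a very loose sketch (plain Python, with invented instruction names and a toy "device"), the loop described above looks something like this:

    # Toy process cycle: control a device, wait for input, evaluate, repeat.
    program = ["READ_INPUT", "ADD_ONE", "WRITE_OUTPUT"]
    accumulator = 0

    for instruction in program:             # one pass per "cycle" of this toy CPU
        if instruction == "READ_INPUT":     # (B) wait for input from an external device
            accumulator = int(input("value? "))
        elif instruction == "ADD_ONE":      # (C) evaluate the DATA
            accumulator += 1
        elif instruction == "WRITE_OUTPUT": # (A) instruct an external device
            print(accumulator)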
The NASA Apollo Guidance Computer that flew Apollo 11 was not a wide machine either; it worked with a short 16 BIT word (15 DATA BITS plus a parity BIT) and used a unique operating system which allowed the device to swap subroutines (programs inside programs) in and out, giving the computer vast ability along a very modest DATA PATH.
Program routines were loaded into non-volatile memory banks. These banks were accessed by switching the DATA PATH in and out (I/O) of them. It was these banks of memory, and the operating system controlling the CPU, that completed the Apollo 11 landing in 1969.
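Bank switching is easier to picture with a small sketch (plain Python; the bank contents are invented for the example). Only one bank at a time sits on the DATA PATH, and the operating system decides which one:

    # Invented example: two fixed memory banks, only one "switched in" at a time.
    banks = {
        0: ["descent routine", "throttle routine"],
        1: ["landing radar routine", "display routine"],
    }

    def run_from_bank(selected):
        # The DATA PATH only "sees" whichever bank is currently selected.
        for routine in banks[selected]:
            print("running:", routine)

    run_from_bank(0)    # switch bank 0 onto the path
    run_from_bank(1)    # then swap it out for bank 1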
Less than a decade after the moon landing, hobbyists were playing with 8 BIT processors: the Zilog Z80 and the MOS Technology 6502. These processor units were marketed in Radio Shack, Sinclair and Commodore computers, and there were several others as well, such as the Heathkit H8, released in 1977. The H8 was controlled via an H-9 terminal unit and stored its programs on an H-17 dual floppy drive. http://www.oldcomputers.net/heathkit-h8.html (reference)
The H8 operating system of choice was Digital Research CP/M V1.0. The OS was loaded each time you started the computer by typing the "boot routine" into the terminal unit. The OS was loaded from floppy, and then, if you had a few extra dollars, you had some kind of programming language and compiler. Programs were also loaded in using a "punch card reader".
Inside a few years the entire OS was on a ROM inside the computer. This ROM booted the computer and brought up the familiar "C:" prompt on the screen. From there everything went crazy. The IBM PC craze was on. Manufacturers came and went faster than the retail outlets that sold them. Inside five years we went from "green text screens" to 4096 colors, with separate keyboards and mouse-controlled graphic interfaces. The war of operating systems was on. Besides Windows there was the Mac and Amiga DOS, and Digital Research gave CP/M a 16 BIT graphic environment known as GEM.
We were on our way...but the basics remained. The "personal computer" was still stuck doing ONE PROCESS AT A TIME. We were still using on/off technology, however the data path was now 16 BITS WIDE. That was a good thing, because we could do one process fast enough that a second process seemed to flow along, even if we suspended one for the other.
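That "seemed to flow along" trick is just rapid turn-taking. A tiny sketch (plain Python, with made-up job names) shows the idea of suspending one job to run the other, over and over:

    # One CPU, two jobs, and fast alternation that looks like "both at once".
    def print_page(step):    print("printing page", step)
    def update_screen(step): print("updating screen", step)

    for step in range(3):    # suspend one job, run the other, keep alternating
        print_page(step)
        update_screen(step)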
Finally, some 13 years later, we had a true multi-tasking OS for our IBM-based PC: Windows 98. This was an exciting break-through. We were now capable of processing data that was coming IN and going OUT...at the same time. The first of these processors used 16 bit architecture, and the I/O data path was not one but two 8 BIT PATHS. The "operating system" kept everything in order. These new "operating systems" were massive and required a lot of memory, something that, early on, the hardware could not support. The numbers were not there, and the memory manager (MMU) had to swap programs in and out of a 512K BLOCK of memory for the CPU. When the 32 BIT processors came along, we finally broke the 1 MEG memory barrier that was holding everything back. From that point on the PC became "big time". It replaced thousands of main-frame computers, and where there may once have been a small local network inside a business, we found the Internet: the "world connected" network. This technology, computers, is still in its infancy.
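The memory barriers mentioned above fall straight out of how many address lines you have. A quick back-of-the-envelope (plain Python; the widths are just illustrative) shows why 1 MEG was the wall:

    # Each address line doubles the reachable memory: 2 ** width addresses in all.
    for width in (16, 20, 32):
        print(width, "address lines ->", 2 ** width, "bytes")
    # 16 -> 65,536 (64K), 20 -> 1,048,576 (the 1 MEG barrier), 32 -> 4,294,967,296 (4 GB)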
So, understanding a little about where we came from...we must understand where we are headed. The speed of a CPU is reaching 5 GHz. That speed will require some very interesting support from the OS, the MEMORY and the CHIPSET that supports I/O. The DATA PATH can no longer be traced along a circuit board. The material connecting I/O, memory and CPU can no longer be copper, silver or gold. The speed will be limited by the length of these paths, and therefore these devices must become smaller or more integrated. We must find a material that can physically pass DATA with none, absolutely none, of the critical losses in wire or printed circuit. We cannot afford to solder or mechanically attach wire leads to our support devices. In better terms, everything must be inside or alongside the processor unit. This will be a nightmare for engineering: if we are ever to see 800-1000 GHz processors, we must support UXHF communication between I/O devices and the real world. Even at the speeds we see today, many of our older devices, such as printers and hard drives, are simply too slow!!
We have abandoned serial communication for USB. We have abandoned FLOPPY devices and IDE interfaces for hard drives; SATA (Serial ATA) is the normal interface today. We no longer use "parallel" ports, and INFRA-RED has been replaced with BLUETOOTH. The faster we make our devices, the less time we give our interfaces to handle outside information. But...that is becoming the best of a good thing!!
It will be nice to see what the OS really looks like for these mega-fast computers, but one thing is for sure: entering information via a keyboard will be slow and cumbersome. Multiple voice recognition (MVR) or even thought-process communication is not far off in the future.

The Internet Connection:
We will try to explain the numbers and why we need MODEMS, ROUTERS and SWITCHES!!
Follow along here...the first Internet connections were slow and limited. Not so long ago, back in 1999, we were still playing with AOL, CompuServe and Netscape. Back in 2003 AOL and CompuServe settled for millions of dollars in a lawsuit over "service rebates" that these ISPs never paid to their subscribers. That was only 13 years ago!
We may be a little further along, but in most instances the Internet remains a "poor excuse" of an accomplishment when it comes to serving the public.

Ports and Protocols:
Port 80 is not open...

Passwords and Usernames:
I forgot my password!!