Last Updated on March 22, 2021 by Filip Poutintsev
People across the globe have become reliant on the almost ubiquitous presence of computers in some way, shape, or form, yet computers were virtually unknown some 50 years ago. In their most recognizable forms, minicomputers and mainframes are found in government offices, many businesses, and learning environments.
Microcomputers are readily available in a large proportion of homes and businesses alike, as desktops and laptops used for information storage, word processing, entertainment, and the growing area of electronic shopping. Interestingly, even more computers are in our midst in the form of vehicle dashboards, telecommunication devices, and many home appliances, which shows how important computers are today. You do not need a computing technology degree to use one.
What is computer technology?
To answer this question, we must first ask, "What is a computer?" A computer can be defined as a device that stores, retrieves, and processes data. Computer technology is a growing area that encompasses computer hardware and software, programming, networking, computer interfacing, robotics, and digital and analog electronics.
We look at a few aspects of this broad area to gain an understanding of computer technology.
Computer Hardware
Computers use a binary system in which 1 represents "on" and 0 represents "off"; each such value is a single 'bit' of information. Translating information into this format of 0s and 1s enabled early computers to carry out mathematical operations. The first computing machines did not have the programs we have today; instead, their circuits were hard-wired to perform one single assigned task.
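As a quick illustration of this idea (a modern convenience those early machines did not have), a few lines of Python show ordinary numbers as the bit patterns the hardware actually works with:

```python
# A small sketch of the binary idea above: every number the computer
# handles is ultimately a pattern of on (1) and off (0) bits.
n = 22
print(bin(n))            # '0b10110' - 22 written as five bits
print(int("10110", 2))   # 22 - reading the bit pattern back

# Arithmetic is carried out on those bits; 22 + 5 is just bit manipulation.
print(bin(22 + 5))       # '0b11011' - 27 in binary
```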
In essence, the function performed by such a machine depended entirely on the arrangement of its circuits. To change its function, the system had to be rewired to suit the new task. We have come a long way since then, but this circuitry was the original "hardware", and it was the whole computer.
In the late 1940s, machines were developed that stored and executed programs, or encoded instructions, which dramatically changed the practicality and usefulness of computers. This development enabled users to perform new computational tasks without altering the "hardware" structure as before.
More practically, one could now instruct the computer to carry out functions stored in its memory. Today's microcomputers all share the same basic hardware components: a CPU, internal memory storage, one or more devices for physical data or program transfer, and a network system linking the computer to other computers beyond it.
The CPU
CPU is a commonly used term, but what does it stand for in computer technology? The heartbeat of the programmable computer is the Central Processing Unit, or CPU, which is made up of two components: the Arithmetic Logic Unit (ALU) and the Control Unit (CU). The ALU performs the basic, mundane tasks of adding and multiplying, whilst the Control Unit directs the flow of electrical signals within the computer.
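A toy sketch makes this split concrete; the instruction format and operation names here are invented for illustration and are not taken from any real CPU:

```python
# A toy model of the ALU/Control Unit division described above.
def alu(op: str, a: int, b: int) -> int:
    """Arithmetic Logic Unit: performs the basic math."""
    if op == "ADD":
        return a + b
    if op == "MUL":
        return a * b
    raise ValueError(f"unknown ALU operation: {op}")

def control_unit(program):
    """Control Unit: steps through instructions and routes each to the ALU."""
    accumulator = 0
    for op, operand in program:
        accumulator = alu(op, accumulator, operand)
    return accumulator

# (0 + 3) * 4 + 2 = 14
print(control_unit([("ADD", 3), ("MUL", 4), ("ADD", 2)]))
```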
The Internal Memory
In computer technology, one of the components that sets apart modern computer operations from the primitive forms is the internal memory, which can be split into three types: Random Access Memory (RAM), Read-Only Memory (ROM), and data storage space (disk space). This answers the question of where the CPU stores its computations.
RAM – this is the form of memory the user mostly interacts with; it allows data to be input, altered, and erased. It provides temporary, working storage for the programs and data in real-time use on the computer, while permanent storage of programs is left to the disk space.
ROM – this form of memory holds information that is permanently fixed and accessible to the computer only for reading. It mainly serves to direct the basic tasks of the computer's operations.
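The contrast between the two can be sketched in a few lines of Python; the stored values are placeholders, not real firmware or program data:

```python
# A sketch of the RAM/ROM contrast described above.
from types import MappingProxyType

ram = {}                              # read/write working memory
ram["open_document"] = "report.txt"
ram["open_document"] = "notes.txt"    # contents can be altered...
del ram["open_document"]              # ...or erased entirely

rom = MappingProxyType({"startup": "run self-test, load the OS"})
print(rom["startup"])                 # the computer may read it...
try:
    rom["startup"] = "something else"
except TypeError:
    print("ROM is read-only; it cannot be changed")   # ...but never write it
```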
Input and Output (I/O) devices
These components are responsible for transferring information into and out of the computer; they are what you interface with when working on one. Early computers required altering the circuits as a means of 'inputting' data, or, in more sophisticated forms, using punch cards. Punch cards required a user to punch holes in stiff note cards as the means of writing the data that would be fed into the computer. This is a far cry from the modern means of input in advanced computer technology: keyboards, mice, and voice recognition through microphones.
Computer Software
Computer programs are sets of instructions given to a computer to carry out its operations; they turn it from a single-purpose machine into a multi-purpose one. For example, a computer running video game software becomes a video game machine, and when it runs Microsoft Word it becomes a word processing machine. There are two basic classes of computer programs: operating systems and application programs.
Operating system – this is responsible for the internal functions of the computer. It organizes the movement and comprehension of data between components of the computer such as internal memory, the CPU, and external input devices; enables the use of application software; and performs the rudimentary housekeeping tasks in the system.
Application program – this allows users to carry out specific tasks with the computer, ranging from statistical analysis to word processing to gaming, and the wide range of application programs allows for almost limitless possibilities. However, an application program is usually limited by the operating system it was developed to run on, though developers increasingly write programs able to function across more than one operating system.
Ultimately, the operating system is in charge of the interactions between software and computer and between user and software, so there is an explicit need for an application program to be compatible with it. Designers of operating systems allow application developers to access their systems by publishing application programming interface (API) specifications, either freely or under license.
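As a small example of an application leaning on an operating system's published interface, Python's standard os module wraps the underlying OS calls, so the same program runs unchanged on Windows, macOS, or Linux:

```python
# An application asking the operating system for services through a
# published interface, rather than touching the hardware itself.
import os

print(os.name)          # which OS family the program is running on
print(os.getcwd())      # ask the OS for the current working directory
print(os.listdir("."))  # ask the OS's file system for a directory listing
```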
Computer Networks
In computer technology, first-generation computers were self-contained machines and silos of information. Data input into a computer and processed by it could not easily leave that computer, and inputting the data was laborious to begin with. The output of one computer had to be physically transferred to another. The introduction of magnetic storage media made this easier, yet physical movement of the output was still necessary.
Networking was developed in the late 1960s, but not until the 1990s did the Internet allow widespread use of and access to large computer networks; today, broadband access to that network is widely considered a basic right. Networking allows computers to communicate over large physical distances at far greater speeds than before. A good example is the use of email, or electronic mail, by computer users.
This networking requires supporting technologies, which are themselves part of computer technology and can be physical, such as cables or telephone lines, or wireless.
Local Area Networks
As the name suggests, a Local Area Network (LAN) is confined to an immediate geographical area. Organizations use this format of networking to allow their members' computers to share information and resources. Local Area Networks allow the following (a minimal sketch follows the list):
- Employees can communicate via email
- They can send files to one another
- They can share computing resources
- Computers can be specialized, with information stored on one central computer (server) accessible to the other computers (clients)
- Because of this central location, a single document can be accessed over the network and easily kept up to date
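Here is a minimal client/server sketch of that central-document idea, using only Python's standard socket module; the document text is a made-up placeholder:

```python
# One machine acts as the central server for a shared document;
# a client elsewhere on the LAN fetches the same central copy.
import socket
import threading

DOCUMENT = b"Quarterly report v3 (central copy)"

# Central "server" socket: bind first, so the client can always connect.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
srv.listen(1)
host, port = srv.getsockname()

def serve_once():
    """Hand the single shared copy of the document to one client."""
    conn, _addr = srv.accept()
    with conn:
        conn.sendall(DOCUMENT)    # every client sees the same central copy

server = threading.Thread(target=serve_once)
server.start()

# A "client" on the network fetches the central document.
with socket.create_connection((host, port)) as cli:
    print(cli.recv(1024).decode())   # Quarterly report v3 (central copy)

server.join()
srv.close()
```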
Large-Scale Public Networks
In the mid-1960s, the Department of Defense's Advanced Research Projects Agency began a project to connect its multiple computers scattered across the United States, with researchers working to enhance memory storage and sharing capabilities. It was also important that this network remain operational regardless of any cataclysmic event, such as a nuclear attack or a breakdown in the telecommunication systems.
Parallel to this was the development of "store-and-forward packet switching", which removed the need for a dedicated, independent connection between every pair of nodes on a network. In 1970, ARPANET implemented packet switching in a four-node network, which grew to over 200 sites between 1971 and 1981.
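A toy simulation captures the idea; the node names echo early ARPANET sites, but the routing logic here is invented purely for illustration:

```python
# Store-and-forward packet switching, sketched: a message is split into
# packets, and each node stores a whole packet before forwarding it on.
from collections import deque

PACKET_SIZE = 8  # bytes per packet; an arbitrary value for the sketch

def packetize(message: bytes):
    """Split a message into small numbered packets."""
    return [(seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))]

def relay(route, packets):
    """Each hop stores every packet, then forwards it to the next node,
    so no dedicated end-to-end connection is ever needed."""
    for sender, receiver in zip(route, route[1:]):
        stored = deque(packets)                               # "store"
        packets = [stored.popleft() for _ in range(len(stored))]  # "forward"
        print(f"{sender} -> {receiver}: {len(packets)} packets relayed")
    return packets

arrived = relay(["UCLA", "SRI", "UTAH"], packetize(b"HELLO ARPANET"))
# The receiver reassembles the message by sequence number.
print(b"".join(data for _, data in sorted(arrived)).decode())
```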
This was a precursor of the Internet, a global collection of computer networks interconnected through the Transmission Control Protocol/Internet Protocol (TCP/IP). Users access the Internet through Internet Service Providers (ISPs) such as Microsoft Network (MSN) and America Online (AOL), among others, as well as through corporate and nonprofit portals, local connections, and government nodes.
The Internet was first overseen by a consortium called InterNIC, the National Science Foundation's Internet Network Information Center, which comprised the National Science Foundation, Network Solutions, Inc. (NSI), General Atomics, and AT&T. Though the Internet is largely decentralized, NSI had the mandate of registering Internet names and addresses, termed "domain names".
These names have a hierarchical format, server.organization.type, for example www.whitehouse.gov. Registration was then opened up to another body, the Internet Corporation for Assigned Names and Numbers (ICANN), in 1999. Before that, a researcher at CERN, Tim Berners-Lee, developed the World Wide Web (the Web, or WWW), which allowed users of the Internet to share linked documents. Web pages are identified by uniform resource locators (URLs) and are served as hypertext documents by Hypertext Transfer Protocol (HTTP) servers.
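A few lines of Python sketch this stack in action: a URL is parsed into its hierarchical parts, then the hypertext document is requested from an HTTP server (this uses only the standard library and needs a network connection to run):

```python
# Parsing a URL and fetching the hypertext document it points at.
from urllib.request import urlopen
from urllib.parse import urlparse

url = "http://www.whitehouse.gov"     # the example domain from the text
parts = urlparse(url)
print(parts.scheme, parts.netloc)     # 'http', 'www.whitehouse.gov'

with urlopen(url) as response:        # ask the HTTP server for the page
    html = response.read(200)         # first 200 bytes of the hypertext
    print(response.status, html[:60])
```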
What is the future of computer technology?
The emergence of the Internet and the introduction of information technology have transformed and catapulted computer technology forward. More and more computerized devices are being integrated into the network, meaning future computers, such as smartwatches, will be linked to a host of other functions.
Common questions asked
Q: What is the difference between Computer Technology and Information Technology?
A: Computer technology is a growing area that encompasses computer hardware and software, programming, networking, computer interfacing, robotics, and digital and analog electronics, whilst information technology is the study, design, and implementation of computer-based information systems.
Q: What is the difference between Information technology and Information systems?
A: Information technology is the study, design, and implementation of computer-based information systems, whilst information systems refer to the broader combination of information, people, and processes, and are not limited to the technology itself.