
Information Technology

Background

Developed in Asia and widely used during the Middle Ages, the abacus can be considered the origin of modern computing devices. An abacus, composed of strings and beads representing numerical values, can be used for arithmetic.

French philosopher and mathematician Blaise Pascal invented the world's first digital calculator in the 17th century. His machine was based on a system of rotating drums controlled with a ratchet linkage. In honor of his early contributions to computer technology, the programming language Pascal was named after him in the 1970s. The German philosopher and mathematician Gottfried Wilhelm von Leibniz later improved on Pascal's design, creating a calculating machine that could also multiply and divide. It never became commercially available, however.

The first significant automated data-processing techniques were applied to making fabric patterns, not to calculating numbers. French weaver Joseph-Marie Jacquard introduced a punch-card weaving system at an industrial exhibition in Paris in 1801. His system was straightforward: the punched cards controlled the pattern applied to the cloth as it was woven. The introduction of these looms, symbolizing the replacement of people by machines, caused riots.

After proposing in 1822 that mathematical tables might be computed by a steam-powered machine, Charles Babbage went on, in 1833, to begin designing the analytical engine, which contained the basic components of the modern computer. This earned him the title of father of the computer. He was aided greatly by Augusta Ada King, Countess of Lovelace, daughter of the poet Lord Byron, who is recognized as the world's first computer programmer. U.S. inventor and statistician Herman Hollerith put the punched-card system to use for the 1890 census. He discovered that perforated cards could be read electrically by machines. Each perforation could stand for a piece of information that the machine could sort and manipulate. Hollerith's Tabulating Machine Company merged with other firms in 1911 to form the Computing-Tabulating-Recording Company, which was renamed International Business Machines (IBM) in 1924. IBM is still an IT industry leader today, and it remains on the cutting edge of technology. Some of its newest projects focus on blockchain technology, data analytics, artificial intelligence (including generative AI), quantum computing, and other emerging fields.

In the mid-1940s, punched cards were also used on the Electronic Numerical Integrator and Computer (ENIAC) at the University of Pennsylvania, the world's first all-electronic, general-purpose computer, built for the U.S. Army. This enormous machine relied on more than 18,000 vacuum tubes. In 1949, ENIAC's inventors introduced the Binary Automatic Computer (BINAC), which used magnetic tape, and they later developed the Universal Automatic Computer (UNIVAC I) for the U.S. Census Bureau. UNIVAC I was the first digital computer to handle both numerical data and alphabetical information quickly and efficiently. In 1954, IBM introduced the 650 EDPM, the first mass-produced computer, which was programmed using symbolic notation.

By the late 1950s, the transistor, invented 10 years earlier, had made the second generation of computers possible. Transistors replaced the bulky vacuum tubes and were lighter, smaller, sturdier, and more efficient.

The integrated circuits of the late 1960s introduced the solid-state technology that allowed transistors, diodes, and resistors to be carried on tiny silicon chips. These advances further reduced operating costs and increased speed, capacity, and accuracy. Minicomputers, much smaller than mainframes (large-scale computers) but of comparable power, were developed shortly afterward.

The next important advances included large-scale integration and microprocessing chips. Microchips made even smaller computers possible and reduced costs while increasing capacity. The speed with which a computer processed, calculated, retrieved, and stored data improved significantly. Decreased costs allowed manufacturers to explore new markets.

In the mid-1970s, Steve Wozniak and Steve Jobs started Apple out of a garage. Their vision was to bring computers into every home in America and even the world. Toward that end, they developed a user-friendly computer offered at a reasonable price. User-friendliness was essential, since many people without computer skills would have to adapt to the computer system. Their eventual product, the Macintosh computer, was the first to give on-screen instructions in everyday language and to successfully use a graphical user interface. In addition, Apple popularized the mouse, which allows users to point and click on screen icons to enter commands instead of typing them in one by one.

IBM and the manufacturers that copied its designs were quick to enter the personal computer (PC) market once they recognized the device's tremendous sales potential. The result was a friendly debate among computer users over which is better: Macs or PCs. Regardless of personal preference, the two incompatible systems often led to problems when people tried to share information across formats. Software designers have since developed ways to make file conversions easier and software more interchangeable.

One major trend of recent decades has been the downsizing of computer systems, replacing large mainframe computers with client-server architecture, or networking. Networks allow users greater computing flexibility and increased access to an ever-increasing amount of data.

The second major recent trend has been the rapid growth of the Internet and World Wide Web. Initially developed for the U.S. Department of Defense, the Internet is composed of numerous networks connected to each other around the world. Not surprisingly, this massive network has revolutionized information sharing. It is used for real-time video conferencing, e-mail services, online research, social networking, e-commerce, online education, entertainment, and many other purposes. The World Wide Web usually refers to the body of information that is available for retrieval online, while the Internet generally refers to the underlying network system plus its various services. In recent years, Internet use on handheld and tablet devices and through wireless networks has transformed people's access to technology. There were more than 1.1 billion Web sites as of January 2023 (although only 18 percent were active), according to Siteefy. Statista reports that there were 5.3 billion Internet users worldwide (or 65.7 percent of the global population).
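
To make this distinction concrete, the short Python sketch below retrieves a single Web document over the Internet. The address shown, example.com, is a placeholder domain reserved for documentation, and the standard-library urllib module is used only for illustration.

    # Minimal sketch: the Internet carries the network connection,
    # while the document that comes back is part of the World Wide Web.
    from urllib.request import urlopen

    # example.com is a placeholder domain reserved for documentation
    with urlopen("https://example.com/") as response:  # the request travels over the Internet
        page = response.read().decode("utf-8")         # the HTML page itself is Web content

    print(page[:200])  # show the first 200 characters of the retrieved document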

Hardware companies are continually striving to make faster and better microprocessors and memory chips. Advances in hardware technology have led directly to advances in software applications. As the developer of Windows, Microsoft has been the leader in the software industry. Windows is a user-friendly, visual-based operating system. (An operating system is the interface between the user, the programs stored on the hardware, and the hardware itself.) The disk operating system (DOS) was one of the earliest operating systems; although occasionally still used, it requires more computer knowledge than newer systems. The Windows and Mac systems allow users to point and click on icons and menus with a mouse to tell the computer what to do, instead of having to type in specific commands by hand, as DOS requires.
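
As a rough illustration of the operating system's role as an interface, the minimal Python sketch below asks the operating system to list the contents of a directory, the same task a DOS user would perform by typing the dir command or a Windows or Mac user by opening a folder icon.

    # Minimal sketch: the program never touches the disk hardware directly;
    # it asks the operating system to read the directory on its behalf.
    import os

    for name in os.listdir("."):   # the OS performs the actual disk access
        print(name)                # comparable to typing "dir" at a DOS prompt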

Intel and Motorola have historically been among the leading innovators in microprocessor design, striving for faster and more efficient processors. Such innovations allow computer manufacturers to make smaller, lighter, and quicker desktop computers, laptops, and handheld devices. As processors get faster and memory capacity increases, computers can run more sophisticated and complicated software.

Two fast-growing fields are cloud computing and mobile computing. Cloud computing allows computer users to store applications and data in the “cloud,” or cyberspace, on the Internet, and to access them as needed from a compatible tablet, handheld, or notebook computer. The International Data Corporation (IDC), a market research, analysis, and advisory firm, reports that the worldwide public cloud services market reached $233.4 billion in 2019, up from $160 billion in 2018 and $45.7 billion in 2013. The market is projected to grow at a compound annual growth rate of 19.7 percent from 2022 to 2027, with worldwide revenues reaching $1.34 trillion in 2027. Mobile computing has led to a boom in smartphones, or handheld computers, supported by Wi-Fi technology that allows users to access the Internet and cloud content and programs from anywhere they receive a Wi-Fi signal. In 2020, 51.5 percent of the global online population accessed the Internet from their mobile phones, according to Statista, an Internet statistics firm. This percentage is expected to grow to 72.6 percent by 2025, according to a report by the World Advertising Research Center, using data from the mobile trade body GSMA. These trends are key factors driving the evolution of computing devices and the Internet today.
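
As a rough, hedged illustration of the cloud model described above, the Python sketch below stores a file in a cloud object store and later retrieves it, potentially from a different device. It assumes an Amazon Web Services account with credentials already configured and the boto3 library installed; the bucket and file names are hypothetical.

    # Minimal sketch, assuming AWS credentials are configured and boto3 is installed.
    import boto3

    s3 = boto3.client("s3")             # connect to the S3 object-storage service

    BUCKET = "example-personal-bucket"  # hypothetical bucket name
    # store a local file in the cloud...
    s3.upload_file("notes.txt", BUCKET, "backups/notes.txt")
    # ...and retrieve it on demand, possibly from another device
    s3.download_file(BUCKET, "backups/notes.txt", "notes.txt")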

Other major IT trends include the growing use of the following technologies:

  • Blockchain: a distributed ledger database (a type of shared database) that maintains a continuously growing list of records that cannot be altered except by agreement of all parties in the chain; a minimal sketch of the idea appears after this list.
  • Artificial Intelligence (AI): training machines to perform functions and tasks in a “smart” manner that mimics human decision-making processes. Expertise in AI has become a sought-after skill for tech professionals. In 2023, AI/machine learning was ranked among the top five in-demand skills by the cybersecurity association ISC2, after not even making the top 10 in 2022.
  • Machine Learning: a method of data analysis that uses artificial intelligence to help computers study data, identify patterns or meet other strategic goals, and make decisions with minimal or no human intervention.
  • Generative AI: a form of machine learning in which algorithms create new content (including text, simulations, videos, images, audio, and computer code) and can also analyze and organize vast amounts of data and other information. Examples of generative AI tools include ChatGPT, Bard, and DALL-E.
  • Metaverse: an emerging 3-D digital space that uses converging technologies (e.g., artificial intelligence, cloud computing, augmented and virtual reality, digital twins, blockchain technology, social platforms, e-commerce, the Internet of Things) to create a lifelike experience online. People can use it to have fun, engage in commerce, and meet business and other goals. The research and advisory firm Gartner estimates that 20 percent of people will spend at least one hour per day in the metaverse by 2026. The global metaverse market was valued at $47.48 billion in 2022, according to Strategic Market Research, and is expected to reach $678.8 billion by 2030.
  • Quantum Computing: a type of advanced computing in which quantum computers are used to solve challenges of massive size and complexity that cannot be handled by the computing power of traditional computers. “Quantum computers could spur the development of new breakthroughs in science, medications to save lives, machine learning methods to diagnose illnesses sooner, materials to make more efficient devices and structures, financial strategies to live well in retirement, and algorithms to quickly direct resources such as ambulances,” according to IBM. Companies such as Google, Intel, Microsoft, and IBM are making significant financial investments in quantum hardware and software. Fortune Business Insights predicts that the quantum computing market will grow from $928.8 million in 2023 to $6.5 billion by 2030, a compound annual growth rate of 32.1 percent. Demand is strong for experts in quantum computing: McKinsey & Company has found that there is only one qualified candidate for every three quantum job openings.
  • Biometrics: distinctive physical or behavioral characteristics (such as fingerprints, palms, eyes, and faces) that are used to identify individuals. A biometric system is a set of hardware and software that collects, processes, and assesses these characteristics and compares them against existing records to find a match. CompTIA says that biometrics “will play an important role in improving security by allowing people and devices to authenticate and move seamlessly through our high-tech world.”
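
The following minimal Python sketch illustrates the hash-chaining idea that makes a blockchain's records so difficult to alter: each record stores a hash of the previous one, so changing any earlier entry invalidates every record that follows. It is only an illustration of the linking concept, not of a real blockchain's distributed consensus process, and all names in it are hypothetical.

    # Minimal sketch of a hash-chained ledger using Python's standard library.
    import hashlib
    import json

    def block_hash(block):
        # hash a block's contents in a stable (sorted-key) JSON form
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_block(chain, data):
        previous = chain[-1]
        chain.append({
            "index": previous["index"] + 1,
            "data": data,
            "prev_hash": block_hash(previous),  # link to the record before it
        })

    # a "genesis" record starts the chain
    chain = [{"index": 0, "data": "genesis", "prev_hash": ""}]
    add_block(chain, "record A")
    add_block(chain, "record B")

    # verify the chain: every record must still match the hash stored by its successor
    intact = all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                 for i in range(1, len(chain)))
    print("chain intact:", intact)  # True; editing chain[1]["data"] would make this False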

Additionally, virtual reality (VR), augmented reality (AR), and mixed reality technologies are moving far beyond the video gaming industry for use in the health care, hospitality, training, architecture, and law enforcement fields. Virtual reality is technology (typically a headset that encompasses the field of vision) that allows users to immerse themselves visually, aurally, and through other sensations in imaginary worlds. Augmented reality is technology (a special headset or applications on a smartphone or tablet) that introduces virtual objects into the real world. Mixed reality combines virtual and augmented reality technology so that users can interact with virtual worlds by using real-world objects. IDC predicts that worldwide spending on AR/VR products and services will experience a five-year compound annual growth rate of 32.6 percent from 2023 to 2027.

The number of AR devices worldwide is expected to increase by 37.2 percent from 2022 to 2027, according to IDC. PC Magazine reports that AR revenue should be strongest in the following industries (listed in descending order of revenue) by 2025:

  • video games
  • health care
  • engineering
  • life events
  • video entertainment
  • real estate
  • retail
  • the military

Many experts believe that AR will eventually become more popular than VR because it has many more real-world uses in the aforementioned areas, as well as in industrial production, training and development, construction, tourism, vehicle navigation, and law enforcement.

Related Professions