Information Technology

Defining Events

Many important events in the history of the information technology industry have made it what it is today—the backbone of business, commerce, communication, entertainment, and many other aspects of our society.

Aiken and the Mark I

Howard Aiken was an inventor and professor of applied mathematics at Harvard University. With the help of colleagues at the Massachusetts Institute of Technology (MIT), Harvard, and International Business Machines (IBM), Aiken invented the Mark I during the early 1940s. It performed calculations using a combination of electrical and mechanical components, including relays (electrical switches that are opened and closed by another electrical circuit), and is considered the first large-scale automatic digital computer. The instruction sequence used to solve a problem—the program—was fed into the machine on a roll of punched paper tape, rather than being stored in the computer.

In 1945, the idea of storing the program within the computer was introduced, based on the concepts of mathematician John von Neumann. The instructions would be stored within a “memory,” freeing the computer from the speed limitations of the paper tape reader during execution and permitting problems to be solved without rewiring the computer.

Atanasoff and ABC

John Atanasoff, a professor at Iowa State College, first conceived of the idea of an electronic digital computer in 1930 (the Mark I, by contrast, relied on electromechanical components). The device consisted of a rotating drum on which 1,600 capacitors (pairs of conductors separated by a nonconducting substance) were placed in 32 rows. Each capacitor could be charged positively, indicating a 1, or negatively, indicating a 0. Today’s computers operate on the same binary principle.
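
The same binary principle is easy to see in modern software. The short Python sketch below is an illustration added here, not something drawn from the ABC itself: it prints an integer’s binary digits and then rebuilds the value from those digits.

```python
# Illustrative only: modern machines still store numbers as strings of binary
# digits, much as the ABC encoded each digit with a positively charged (1) or
# negatively charged (0) capacitor.
value = 42

# Python's built-in bin() shows the binary representation of an integer.
print(bin(value))  # prints 0b101010

# Rebuild the same value from its bits to show the positional weighting.
bits = [1, 0, 1, 0, 1, 0]  # most significant bit first
reconstructed = 0
for bit in bits:
    reconstructed = reconstructed * 2 + bit
print(reconstructed)  # prints 42
```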

The completed machine, the Atanasoff-Berry Computer (ABC), was finished in the early 1940s. According to Mike Hally, the author of Electronic Brains: Stories from the Dawn of the Computer Age, it was “as big as a fridge, weighed a third of a ton, and used more than 300 tubes...and it took 15 seconds to complete an arithmetic calculation.” The device was designed to solve systems of linear equations, but it was rudimentary, not programmable, and didn’t always work properly.

Mauchly, Eckert, and ENIAC

John Mauchly, a mathematician and physicist, and J. Presper Eckert, an electrical engineer, wanted to do something bigger and better than what Atanasoff had done. Their goal was to produce a computer that could perform calculations in 10 minutes or less. Thus began work on the Electronic Numerical Integrator and Computer, or ENIAC.

ENIAC is considered the predecessor of most computers in use today. According to Paul Ceruzzi, the author of A History of Modern Computing, its purpose was to calculate firing tables for the U.S. Army. (A firing table contains artillery settings, based on both test firings and computer simulations, given a certain set of conditions.) This task “involved the repetitive solution of complex mathematical expressions, [and ENIAC] occupied a room that was 50 feet by 30 feet, contained 18,000 tubes and 70,000 resistors,” and was much like the clichéd image that comes to mind when one thinks of the first computers.

Vacuum Tubes, Transistors, Integrated Circuits, and Microprocessors

ENIAC and other early computers were powered by vacuum tubes, which were big, bulky, and prone to burning out, so scientists began looking for an alternative technology. In 1947, experimental physicist Walter Brattain and theoretical physicist John Bardeen used semiconductors to develop the point-contact transistor. Their supervisor, William Shockley, developed the junction (sandwich) transistor, which was more commercially viable. The use of transistors led to smaller, faster, and more versatile components than were possible with vacuum-tube machines like the ENIAC. The first commercial computer to use transistors was developed in 1957 by Seymour R. Cray, a pioneer in the design of supercomputers. Because transistors use less power than vacuum tubes and have a longer life, this development alone was responsible for the improved machines called second-generation computers (the first generation being those that employed vacuum tubes). Components became smaller, as did the spacing between them, and systems became much less expensive to build.

The use of transistors revolutionized the computer industry, but the invention of the integrated circuit sent the industry into overdrive. In the late 1950s, two inventors (Jack Kilby, an engineer at Texas Instruments, and Robert Noyce, the cofounder of Fairchild Semiconductor Corporation), working separately and unaware of each other’s activities, developed the integrated circuit, or microchip, which combined transistors, resistors, and capacitors on a single chip. This invention reduced operating costs and increased capacity, speed, and accuracy. In 1959, both Texas Instruments and Fairchild Semiconductor Corporation applied for and received patents on this new technology. After several years of legal battles, the two companies decided to cross-license their technologies. The first microchips became commercially available in 1961, and soon after, virtually all computers were built with them. In 2021, the global microchip market was worth more than $527.8 billion, and it is predicted to grow to $1.3 trillion by 2029, according to Fortune Business Insights. Microchips are used in nearly every electronic product. TSMC, Intel, and Qualcomm are among the largest semiconductor chip makers.

In the early 1970s, the invention of the microprocessor, which allowed thousands of integrated circuits to be built on a single silicon chip, further revolutionized the IT industry. The Intel 4004 chip, developed in 1971, was the first microprocessor. This technology was integrated into the first desktop computers for home users in the 1980s. These small computers became increasingly powerful, and scientists began to see the possibilities of linking them together and sharing information. These were the early stirrings of what would become the World Wide Web.

The Birth of the Internet

The Internet as we know it today began life in the 1960s as a U.S. Department of Defense project called ARPANET (Advanced Research Projects Agency Network). The goal was to create a comprehensive, virtually indestructible computer network that could continue to communicate even under enemy attack. It used packet-switching technology (a message delivery technique in which data is broken into small packets that are routed independently and reassembled at their destination) and, beginning in 1983, the TCP/IP suite of communications protocols. ARPANET started out as a military-only resource, but it gradually expanded as computer professionals, scientists, companies, and others began to see the possibilities of this digital internetwork. By 1995, commercial users outnumbered all other users, such as military, educational, and scientific users.
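
Applications today still sit on top of the TCP/IP protocols that emerged from the ARPANET era. The Python sketch below is a minimal illustration added here (the host example.com and port 80 are placeholders, not anything from ARPANET): it opens a TCP connection and lets the protocol stack split the request into packets, route them, and reassemble the reply.

```python
# A minimal illustration of using the TCP/IP stack from an application.
# "example.com" and port 80 are placeholders for any reachable web server.
import socket

with socket.create_connection(("example.com", 80), timeout=5) as conn:
    # Send a bare-bones HTTP request; the TCP/IP layers beneath this call
    # break the bytes into packets, route them, and reassemble them.
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = conn.recv(1024)  # read the start of the server's response

print(reply.decode("latin-1", errors="replace"))
```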

The World Wide Web

Commercial Internet services and applications became popular in the 1980s, but users found it hard to access information because of poor computer interfaces. That changed in March 1989, when a physicist named Tim Berners-Lee wrote a proposal for a “web of nodes,” “a large hypertext database with typed links,” that could be viewed by “browsers” on a network. The proposal initially generated little interest, but Berners-Lee’s boss encouraged him to begin implementing the system on a newly acquired NeXT workstation (NeXT, which was founded by Steve Jobs, merged with Apple in 1997). Berners-Lee’s system was eventually dubbed the World Wide Web and was released to the public in 1991. The World Wide Web made it much easier to access and use the Internet, and the number of Web sites gradually increased, then skyrocketed. In December 1991, there were 10 Web sites on the Internet. This number increased to 204 in September 1993, to 19.8 million in August 2000, and to 101.4 million in November 2006, according to Hobbes’ Internet Timeline. There were more than 1.1 billion Web sites as of January 2023 (although only 18 percent were active), according to Siteefy.
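
Berners-Lee’s “web of nodes” can be pictured as documents connected by typed links. The small Python sketch below is purely illustrative (the page names and link types are invented for the example, not taken from his proposal): it models a handful of pages and walks every page reachable from a starting node, roughly what an early browser or crawler had to do.

```python
# Purely illustrative model of a "web of nodes" with typed links.
# Page names and link types here are invented for the example.
web = {
    "proposal.html": [("cites", "hypertext.html"), ("author", "people/tbl.html")],
    "hypertext.html": [("example", "proposal.html")],
    "people/tbl.html": [],
}

def reachable(start):
    """Return every page reachable from `start` by following links."""
    seen, stack = set(), [start]
    while stack:
        page = stack.pop()
        if page in seen:
            continue
        seen.add(page)
        stack.extend(target for _link_type, target in web.get(page, []))
    return seen

print(sorted(reachable("proposal.html")))
# ['hypertext.html', 'people/tbl.html', 'proposal.html']
```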

Easier Browsing and Better Desktop Computers

In late 1992, Marc Andreessen and Eric Bina, two programmers at the National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign, began working on Mosaic, the first Web browser to reach a mainstream audience. It combined database capabilities with a graphical user interface. The center released the browser in 1993, and versions were eventually produced for Windows and Macintosh computers. Andreessen went on to cofound Netscape, whose Navigator browser built on Mosaic’s approach.

While the Internet and the Web were evolving, so were desktop computers, which were being used for word processing, databases, and spreadsheets. Portable computers were also introduced; the Osborne 1, produced in 1981, is widely considered the first commercially successful portable computer. It weighed 24 pounds, cost $1,795, and came with a five-inch screen, a modem port, two 5 1/4-inch floppy drives, a large collection of bundled software programs, and a battery pack. The computer industry was growing, and people were seeing its huge potential. During this period, many tech companies—including Dell, Compaq, and Microsoft—emerged as major players.

Development of E-Commerce

The World Wide Web grew rapidly from the mid-1990s onward, and companies began to view it as a means to reach customers and make money. Amazon.com, today the world’s largest online retailer by revenue, went online in 1995, and many other companies (both “brick and mortar” and Internet-only) launched e-commerce sites. E-commerce plays a major role in the U.S. economy, and sales are only expected to increase as more companies use the Internet to sell products and services. Total U.S. e-commerce sales reached $1.03 trillion in 2022, passing $1 trillion for the first time, according to a Digital Commerce 360 analysis of U.S. Department of Commerce figures. This was a significant increase from $449.8 billion in 2017. Most noteworthy is the rapid growth of e-commerce transactions on mobile devices. In 2023, 60 percent of all retail e-commerce was generated via mobile devices, according to Statista.com; by 2027, this share is expected to grow to 62 percent. Worldwide, mobile e-commerce sales reached $2.2 trillion in 2023.

Dot-Com Crash

Interest in the Internet was white-hot in the late 1990s. In 1997, Six Degrees, the first social networking site, launched. Blogs became extremely popular. And in 1998, the Google search engine launched and the e-commerce site PayPal was founded. In 1999, Napster, one of the first peer-to-peer file-sharing services in the United States, began operation. Internet shopping became popular, and the term “e-commerce” was coined to describe this trend. Many Internet-based companies, known as dot-coms, were founded during this time.

And then the bottom dropped out. Tech stock speculation by the public, the availability of venture capital for tech start-ups, and other factors contributed to unrealistic stock price growth for many tech companies, which created a tech stock bubble that burst in 2000. Stock prices dropped dramatically. Nearly 500 tech companies went bankrupt—causing many people to lose their jobs. Other companies watched their stock prices plummet but managed to hang on until better times. By 2005, the tech industry began to bounce back, but many wonder if another dot-com crash is in our future.

The Rise of Social Media

After the dot-com bubble burst in 2000, the surviving Internet companies and those wishing to launch new dot-coms began to think about ways to improve the World Wide Web and better interact with users. Companies began asking customers for feedback about their products and services, and organizations began encouraging site visitors to interact with others who shared their interests. Technological developments such as improved browsers, the increasing use of broadband technology (which made the Internet faster), and the use of Flash application platforms (which improved the look and usability of Web sites) also changed the World Wide Web. These trends, according to Terry Flew, author of New Media, prompted a “move from personal Web sites to blogs and blog site aggregation, from publishing to participation, from Web content as the outcome of large up-front investment to an ongoing and interactive process, and from content management systems to links based on tagging.”

Although some forms of social media (such as virtual game worlds, massively multiplayer online role-playing games, blogs, and basic social networking sites) had been around for a while, the 2000s were the golden era for social media sites. Many sites launched during this time, including Wikipedia (2001), MySpace (2003), Flickr (2004), Facebook (2004), YouTube (2005), Twitter (2006; now known as X), Tumblr (2007), Pinterest (2010), and Instagram (2010). Today, Facebook is the largest social networking site; it had about 3 billion monthly active users as of October 2023.

The COVID-19 Pandemic and Its Aftermath

In late 2019, the coronavirus disease COVID-19 was detected in China and quickly spread to more than 210 countries, causing tens of millions of infections, hundreds of thousands of deaths, and massive business closures and job losses. In the short term, the pandemic negatively affected the health of individuals; employment opportunities at businesses, nonprofits, and government agencies; and daily life and the job search process. It also had a major effect on the IT industry. Information technology job postings were down 36 percent in July 2020 as compared to July 2019, according to industry sources, and there were far fewer job listings for data scientists and IT managers than in 2019, according to Indeed.com. On the other hand, the outlook for IT businesses that specialized in e-commerce or provided services to e-commerce firms was much better. There was also demand for social media professionals as more people stayed at home and used social media, and information technology companies that developed chat and collaboration software, such as Slack, experienced high demand for their products during the pandemic.

The job search process also changed during the pandemic. A survey of senior managers in the United States by the staffing firm Robert Half found that the top three hiring changes companies made due to COVID-19 were more interviews and employee onboarding conducted remotely, a shorter hiring process, and more fully remote jobs being advertised. Many tech employees continue to work remotely or in hybrid arrangements, although some companies are requiring their employees to return to in-person work. Some employers continue to use remote hiring and onboarding approaches to save money.
