Information Technology

Current Trends and Issues

The IT industry is constantly changing. Technological advances, changing consumer and business preferences, and other industry developments fuel the emergence of new products, new career paths, and new employment hot spots. For example, blockchain technology (a distributed ledger database that maintains a continuously growing list of records that cannot be altered) may be the biggest IT disruptor since the emergence of the Internet. The growth of mobile computing has also changed the way content is prepared and viewed. Here are some of the major trends in the industry:

Emerging Occupations

The one constant of technology and tech careers is change. Given the speed of technological advancements, some positions may completely transform or even disappear within a matter of just a few years. Do you remember when computer data entry operator was a hot job? That day is long gone. The key to thriving in the tech industry is to stay one step ahead of these changes. Here are some emerging occupations to keep your eyes on:

  • Artificial intelligence engineers are computer engineers who have specialized knowledge of artificial intelligence, machine learning, deep learning, computer vision, and related areas.
  • Blockchain developers design and create the software for distributed ledger databases.
  • Chief information security officers are top-level executives who are responsible for protecting their organization’s data and information technology (IT) systems from cyberattacks and other unauthorized use.
  • Chief Internet of Things officers serve as the bridge between research and development, design, and production departments to create everyday objects with sensors that allow them to connect to the Internet.
  • Chief marketing technologists are executive-level IT managers who have the same skills as marketing technologists (described below), but more experience.
  • Chief mobile officers supervise all things mobile, including apps, voice/data communications, and computing services.
  • Chief robotics officers oversee all robotics-related operations at their employer and also focus on the interactions between robots and their human coworkers.
  • Cloud computing architects design and implement cloud computing systems.
  • Cloud engineers design, install, administer, and maintain cloud computing hardware, software, and infrastructure for public, private, and hybrid clouds.
  • Cybersecurity architects design, create, maintain, and manage an organization’s information technology systems to protect them from cybercrime and other unauthorized access.
  • Data scientists help businesses and other organizations gather, analyze, and utilize the vast amounts of information collected via the Internet, wearable technologies, and other sources.
  • Enterprise architects create short- and long-term strategic, organizational, and technology-based plans to help companies and other entities gain a competitive edge, save time and money, improve organizational flexibility and information technology scalability, reduce risk, and meet other goals.
  • Enterprise resource planning (ERP) project managers oversee ERP software that allows an organization to manage and automate many back office functions related to technology, human resources, and services.
  • Internet of Things developers are responsible for all aspects of the IoT software, system, or other product lifecycle, such as design, setup, configuration, debugging, maintenance, and connectivity. They also focus on cloud services, data acquisition and analytics, and cybersecurity.
  • Marketing technologists possess expertise in both marketing and information technology.
  • Virtual/augmented/mixed reality hardware and software engineers design and develop these technologies for use in the gaming, health care, training, law enforcement, and other industries.

Internet and Data Security

Information security remains a hot topic as the number and sophistication of cyberattacks increase. In recent years, high-profile hacks of Sony Pictures, Target, Equifax, and Uber, and continuing cyberattacks on U.S. government agencies (including the National Security Agency) by hackers, terrorists, and state-sponsored actors, have prompted industry and government officials to get serious about the cyberattack threat.

Nearly 1.5 million cybersecurity workers were employed in the U.S. in 2023, according to ISC2. But the industry association reports that there was a shortage of more than 1.3 million workers in the United States, and shortages exist worldwide. ISC2 estimates that the global cybersecurity workforce gap was nearly 4 million workers in 2023. Industries reporting the highest levels of staffing shortages included (in descending order): education, government (nonmilitary), nonprofits, military/military contractor, aerospace, healthcare, automotive, and energy/power/utilities. Skills gap factors cited by ISC2 survey respondents included:

  • unable to hire qualified workers
  • high turnover (due to low wages/lack of promotion opportunities)
  • no budget to hire new staff
  • no emphasis by the employer on training non-security IT staff to become security staff
  • people with these skills recently quit or were laid off, and no new hires have been made

Large tech companies—such as Microsoft, Google, HPE, and IBM—are creating or improving training initiatives and educational pipelines that make it easier to enter the industry. Noteworthy programs include Cyber Million (https://www.immersivelabs.com/cybermillion) from Immersive Labs and Accenture, and the SANS Diversity Cyber Academy (https://www.sans.org/cyber-academy/diversity-academy).

Wearable Technology

Wearable technologies are electronic devices and accompanying software that are worn (watches, glasses, etc.) or embedded in clothing or other objects. They are often connected wirelessly to the Internet. Wearable technologies gather and analyze data to assist the user in his or her daily life. They are also used in the workplace (e.g., to track employee performance, serve as training aids, improve customer service and efficiency in retail settings, and expedite production through hands-free guidance tools).

The business advisory services firm PwC reports that its data “shows that people are remarkably unconcerned about the net impact wearable technology could have on their job security or autonomy,” although other studies show that consumers have major concerns regarding the privacy and security of the personal information gathered through this technology. Wearables will also be a major change agent in the Big Data, health care, advertising, and information industries. “The growing popularity of wearable computing devices…could drive IT job creation,” predicts Robert Half Technology. “Expertise will be needed for the development of new devices and related applications, as well as to support the adoption of wearable technology in the workplace.”

It looks like wearables are here to stay. According to the data analytics firm IDC, 520 million wearables were shipped in 2023, and the market is expected to grow to 625.4 million units by the end of 2027, representing a 4.7 percent compound annual growth rate. “While fitness tracking, such as steps taken and distances run, has been helpful in capturing the mainstream audience, many consumers are now clamoring for a more holistic approach to health tracking, paving the way for features such as sleep monitoring, recovery metrics, readiness scores, and stress level tracking,” said Jitesh Ubrani, research manager, Mobility and Consumer Device Trackers at IDC, on the company’s Web site.

Open Source vs. Microsoft

People entering IT discover that there are two general software paths to follow. In a nutshell, there are software applications, systems, and database programs that run on computers using Microsoft Windows, and then there is “open source” software. Open source programs are free software that any user can modify and redistribute. Open source software and Microsoft software look different, act differently, and are operated very differently from each other. IT people must generally learn at least a portion of both. When starting out, it tends to be easiest to choose one path from which to learn the basics; however, the popularity of both tends to fluctuate.

Microsoft products dominate the software market. In 2023, about 64 percent of all home and business computers used various versions of the Microsoft Windows operating system, according to Statista.com. An overwhelming majority of computers are sold with Microsoft products already installed. However, open source resources have been spreading in business because they are free; they have a reputation for speed, reliability, and improved overall performance; and they give users a great deal of power to tailor the products to their own needs. Google, AWS, OpenAI, Red Hat, Adobe, X, Facebook, IBM, Intel, LinkedIn, Microsoft, Samsung, and Netflix are just a few of the companies that use open source products—for the aforementioned reasons, but also as a strategy to attract and retain top development talent. Some of the most popular IT areas in which open source resources are used include cloud computing, content management, mobile, security, collaboration, network management, and social media. Look for open source technology to be increasingly embraced in the next several years. The data analytics firm MarketsandMarkets predicts that the global open-source software services market will grow at a compound annual growth rate of 16.2 percent from 2022 to 2027 and reach $54.1 billion in value.

Emerging Tech Hot Spots

When tech companies and startups are mentioned, one typically thinks of Silicon Valley in California and Seattle, Washington. That hasn’t changed, but recent surveys and studies have revealed some new up-and-coming cities for tech jobs. Many of these places aren’t as sexy and hip as Silicon Valley and Seattle (and other current tech hubs such as Boston and New York), but they offer good pay, a lower cost of living, and a good quality of life. Here is a recent list from Dice.com:

  1. Orlando, Florida
  2. Miami, Florida
  3. Detroit, Michigan
  4. Irvine, California
  5. Houston, Texas
  6. San Antonio, Texas
  7. Portland, Oregon
  8. Tampa, Florida
  9. Phoenix, Arizona
  10. Charlotte, North Carolina

One caveat: demand for specific careers varies by city, so you’ll have to dig deep to see what city is the best match for your professional background.

Federal Government Struggling to Attract Young IT Workers

Although there is a strong need for IT workers at federal agencies, the federal government is having trouble attracting young workers. The Partnership for Public Service reports that about 33 percent of federal employees will reach retirement age by 2025. In 2022, less than 10 percent of the federal workforce was under age 30, according to the Office of Personnel Management; by comparison, 23 percent of the private sector workforce was in that age group.

Young people often view federal agencies as stodgy, not innovative, and lacking career advancement opportunities and the employee perks that are available in the tech sector and at other private sector employers. Additionally, the federal hiring process is lengthy, which is sometimes unappealing to young people who want to move forward quickly with their careers post-graduation. Government agencies are attempting to counter this shortage by hosting tech-focused job fairs, educating young people about the diverse range of tech opportunities that are available with federal agencies, trying to speed up the screening and hiring process for new workers, and offering flexible work arrangements.

Government agencies, industry organizations, and other entities are creating programs and initiatives to address these shortages. In 2023, the Biden Administration unveiled the National Cyber Workforce and Education Strategy (NCWES), which seeks to address both immediate and long-term cyber workforce needs. The NCWES has four main pillars:

  1. Equip Every American with Foundational Cyber Skills: by making foundational cyber skill learning opportunities available to all and educating people about career options
  2. Transform Cyber Education: by building and improving cyber education at all levels, expanding competency-based cyber education, and making cyber education and training more affordable and accessible
  3. Expand and Enhance the National Cyber Workforce: by adopting a skills-based approach to recruitment and development and increasing access to cyber jobs for everyone, including underserved and underrepresented groups
  4. Strengthen the Federal Cyber Workforce: by communicating the benefits of cybersecurity careers in public service, reducing hiring and onboarding barriers, attracting a more diverse federal cyber workforce, and improving career pathways in the federal cyber workforce

While it’s too soon to gauge the effectiveness of the National Cyber Workforce and Education Strategy, industry experts believe that, if fully implemented, the initiative will greatly increase the number of people who pursue cyber careers. It’s also important to keep in mind that such initiatives are only as strong as their backing by the current presidential administration. Future administrations may reduce or eliminate funding of the initiative, which could cause employment gains to be reversed.

Emerging Technology: Blockchain

Computerworld reports that blockchain technology “has the potential to eliminate huge amounts of record-keeping, save money, and disrupt IT in ways not seen since the Internet arrived.” Blockchain is a distributed ledger database (a shared database replicated across many computers) that maintains a continuously growing list of records that cannot be altered. Each entry is time-stamped and linked to the previous entry. Digital transactions or records are grouped into blocks that form a chain of records, hence the blockchain moniker. Blockchain can be either an open system, where anyone can add information, or a controlled one, where only users with permission can access the system.
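
As a rough illustration of how entries are time-stamped and linked to the previous entry, the short Python sketch below builds a toy chain of hash-linked records. It is a simplified teaching example, not any production blockchain implementation, and the function names and sample transactions are invented for illustration.

  import hashlib
  import json
  import time

  def hash_block(block):
      # Compute a SHA-256 fingerprint of a block's contents.
      return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

  def add_block(chain, data):
      # Append a time-stamped record that points to the hash of the previous block.
      previous_hash = hash_block(chain[-1]) if chain else "0" * 64
      chain.append({"index": len(chain), "timestamp": time.time(),
                    "data": data, "previous_hash": previous_hash})

  def is_valid(chain):
      # The chain is valid only if every block still matches its predecessor's hash.
      return all(chain[i]["previous_hash"] == hash_block(chain[i - 1])
                 for i in range(1, len(chain)))

  ledger = []
  add_block(ledger, "Alice pays Bob 5 units")
  add_block(ledger, "Bob pays Carol 2 units")
  print(is_valid(ledger))          # True
  ledger[0]["data"] = "tampered"   # altering an earlier record...
  print(is_valid(ledger))          # ...breaks the chain: False

Because each block points to the hash of the block before it, changing any earlier record invalidates every later link, which is what makes the ledger effectively tamper-evident.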

A key recent development in blockchain technology is the introduction of smart contracts: computer code stored on a blockchain that automatically executes certain actions under specified circumstances, without human approval. “Smart-contract technology can speed up business processes, reduce operational error, and improve cost efficiency,” according to Blockchain Technology and Its Potential Impact on the Audit and Assurance Profession, a report from the American Institute of Certified Public Accountants and other organizations.
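
Real smart contracts run on blockchain platforms (on Ethereum, for example, they are typically written in a language called Solidity), but the basic idea of code that releases payment automatically once an agreed condition is met can be modeled in a few lines of Python. The escrow scenario, class name, and amounts below are hypothetical and are meant only to illustrate the concept.

  class EscrowContract:
      # Toy model of a smart contract: payment is released automatically
      # once the agreed condition (delivery) is recorded, with no manual approval.
      def __init__(self, buyer, seller, amount):
          self.buyer, self.seller, self.amount = buyer, seller, amount
          self.delivered = False
          self.paid = False

      def report_delivery(self):
          self.delivered = True
          self._execute()  # contract logic runs on every state change

      def _execute(self):
          if self.delivered and not self.paid:
              self.paid = True  # on a real blockchain, tokens would move here
              print(f"Released {self.amount} to {self.seller}")

  contract = EscrowContract(buyer="Alice", seller="Bob", amount=10.0)
  contract.report_delivery()  # payout triggers without human approval

A production smart contract would also transfer funds or tokens on the blockchain itself rather than simply printing a message.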

Overall global spending on blockchain technology will grow from $7.4 billion in 2022 to $94 billion by 2027, and have a five-year compound annual growth rate of 66.2 percent during this time span, according to MarketsandMarkets.

Interest in blockchain technology has grown as a result of the emergence of cryptocurrency, a digital cash system that is increasingly being used as a substitute for, or complement to, traditional currency. Cryptocurrency payments are not processed through a central banking system or trusted third party, but are sent directly from payer to payee. Bitcoin is the most popular cryptocurrency. Blockchain is the technology that is used to facilitate cryptocurrency transactions.

Growing interest in blockchain technology has created strong demand, and high salaries, for IT workers with expertise in the field. Blockchain developers earned average salaries of $102,187 in 2024, according to Glassdoor.com. Top annual salaries exceed $200,000.

Blockchain technology is rapidly being adopted by businesses. A 2020 survey of senior executives and practitioners in more than 10 countries and territories (Brazil, Canada, China, Germany, Ireland, Israel, Mexico, Singapore, South Africa, Switzerland, United Arab Emirates, United Kingdom, and United States) by Deloitte identified the following findings related to blockchain:

  • Thirty-nine percent of respondents had already incorporated blockchain into production—up 16 percentage points from 2019.
  • Fifty-five percent of responding organizations viewed blockchain as a “top strategic priority,” an increase of 12 percentage points from 2018.
  • Eighty-three percent of respondents said that their companies would lose competitive advantage if they did not adopt blockchain—up 6 percentage points from 2019.
  • Although blockchain has become increasingly integrated into business operations, business leaders still have concerns about barriers to full adoption. The top barriers cited by senior executives and practitioners were implementation issues, such as replacing or adapting existing legacy systems (cited by 36 percent of respondents); concerns over the sensitivity of proprietary information (33 percent); and potential security threats (33 percent).

Hot Technologies: Artificial Intelligence, Machine Learning, and Generative AI

Artificial intelligence and machine learning are two IT phrases that are often used interchangeably. Although they are linked, they are not the same. Artificial intelligence is the older and broader concept that machines can be programmed to perform functions and tasks in a “smart” manner that mimics human decision-making processes. Machine learning is a subset of artificial intelligence in which computers study data, identify patterns, and make decisions with minimal or no intervention from humans. According to Forbes, “machine learning applications can read text and work out whether the person who wrote it is making a complaint or offering congratulations. They can also listen to a piece of music, decide whether it is likely to make someone happy or sad, and find other pieces of music to match the mood. In some cases, they can even compose their own music expressing the same themes, or which they know is likely to be appreciated by the admirers of the original piece.” These are just a few examples of how artificial intelligence and machine learning are changing the way computers are used and how they interact with humans.
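
The complaint-or-congratulations example that Forbes describes can be sketched in a few lines of Python using the scikit-learn library. This is a minimal, illustrative sketch: the tiny training set, labels, and test sentence below are invented for demonstration, and a real system would be trained on far more data.

  # Toy text classifier: learn from labeled examples, then predict whether
  # new text reads like a complaint or a congratulation.
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.linear_model import LogisticRegression
  from sklearn.pipeline import make_pipeline

  texts = [
      "My order arrived broken and support never replied",
      "This is the third time the app has crashed today",
      "Congratulations on the launch, the new release is fantastic",
      "Well done, the upgrade went perfectly",
  ]
  labels = ["complaint", "complaint", "congratulation", "congratulation"]

  # Convert words into numeric features, then fit a simple classifier.
  model = make_pipeline(TfidfVectorizer(), LogisticRegression())
  model.fit(texts, labels)

  print(model.predict(["The app crashed again and support never replied"]))
  # expected output: ['complaint']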

Several AI sub-specialties—including computer vision, deep learning, and natural language processing—are in strong demand. In computer vision, huge neural networks with many layers of processing units are used to teach machines how to view and interpret the world around them by using data collected by cameras and other methods. In deep learning, massive neural networks are used to teach computers to recognize speech, identify images, and even make predictions. Natural language processing aims to teach computers to understand, interpret, and manipulate spoken and written human language.
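
To give a concrete picture of the “many layers of processing units” described above, the following sketch uses the PyTorch library to define a small convolutional neural network of the kind used in computer vision. The layer sizes, the 64x64 input images, and the 10-class output are arbitrary choices made for illustration, not a recommended architecture.

  # A minimal convolutional neural network for image classification (PyTorch).
  import torch
  from torch import nn

  model = nn.Sequential(
      nn.Conv2d(3, 16, kernel_size=3, padding=1),   # early layers learn low-level image features
      nn.ReLU(),
      nn.MaxPool2d(2),
      nn.Conv2d(16, 32, kernel_size=3, padding=1),  # deeper layers learn higher-level features
      nn.ReLU(),
      nn.AdaptiveAvgPool2d(1),
      nn.Flatten(),
      nn.Linear(32, 10),                            # scores for 10 possible image classes
  )

  image_batch = torch.randn(4, 3, 64, 64)           # four fake RGB images, 64x64 pixels
  scores = model(image_batch)
  print(scores.shape)                               # torch.Size([4, 10])

In practice, such networks are trained on large collections of labeled images so that the learned features become useful for recognition tasks.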

Generative AI (e.g., ChatGPT, Bard, DALL-E) is a form of machine learning in which models create new content, including text, simulations, videos, images, audio, and computer code. It can be used in a variety of ways, such as helping developers write code and identify errors more efficiently, analyzing and organizing vast amounts of data and other information, and more quickly identifying solutions to problems. The research and advisory firm Gartner says that generative AI “will affect the pharmaceutical, manufacturing, media, architecture, interior design, engineering, automotive, aerospace, defense, medical, electronics and energy industries by augmenting core processes with AI models. It will impact marketing, design, corporate communications, and training and software engineering by augmenting the supporting processes that span many organizations.”

A 2023 report by Deloitte found that 74 percent of companies were testing generative AI technologies and 65 percent were already using them internally. The report, State of Ethics and Trust in Technology, can be accessed at https://www2.deloitte.com/content/dam/Deloitte/us/Documents/us-tte-annual-report-2023-12-8.pdf.

The data analytics firm IDC reports that enterprises will invest nearly $16 billion worldwide in GenAI solutions in 2023, and spending is expected to reach $143 billion in 2027, with a compound annual growth rate of 73.3 percent over the 2023-2027 forecast period. “Generative AI is more than a fleeting trend or mere hype. It is a transformative technology with far-reaching implications and business impact,” says Ritu Jyoti, Group Vice President, Worldwide Artificial Intelligence (AI) and Automation Research, at IDC.

Although generative AI is already in use in many industries, it’s important to understand that using large language models (LLMs) creates several serious risks. According to the Harvard Business Review, “they can perpetuate harmful bias by deploying negative stereotypes or minimizing minority viewpoints, spread misinformation by repeating falsehoods or making up facts and citations, violate privacy by using data without people’s consent, cause security breaches if they are used to generate phishing e-mails or other cyberattacks, and harm the environment because of the significant computational resources required to train and run these tools.” These issues are currently being addressed by developers and tech companies.

As a result of advances in AI, machine learning, and generative AI, demand is growing quickly for IT professionals with this expertise. These careers are often featured on “hot job” lists. Information technology managers are also increasing salary offers to those with AI skills. Forty-six percent of managers surveyed for Robert Half’s 2024 Salary Guide said that they would increase salary offers for candidates with artificial intelligence expertise.

Changing Skill and Educational Requirements

The American workforce has typically been divided into white-collar jobs, those that require a college degree (typically a bachelor’s degree or higher), and blue-collar jobs, those that require hands-on training, the completion of an apprenticeship, or the completion of a technical degree. With the high number of tech jobs that are going unfilled, IT company leaders are re-imagining skill and educational requirements for many positions. “Given that businesses see skills gaps in the local labour market as the foremost barrier towards achieving industry transformation and investing in learning and training on the job as the most promising workforce strategy for achieving their business goals, formulating effective reskilling and upskilling strategies for the next five years is essential for maximizing business performance,” according to the Future of Jobs Report 2023 from the World Economic Forum (WEF). The WEF surveyed representatives from 803 companies—which collectively employed more than 11.3 million workers—to gather these findings. Companies identified the following areas as the most important for upskilling and reskilling from 2023 to 2027:

  1. Analytical thinking
  2. Creative thinking
  3. AI and Big Data
  4. Leadership and social influence
  5. Resilience, flexibility, and agility
  6. Curiosity and lifelong learning
  7. Technological literacy
  8. Design and user experience
  9. Motivation and self-awareness
  10. Empathy and active listening

Networks and cybersecurity ranked 17th and programming ranked 20th on the list.

Companies and other employers of IT professionals are launching programs that re-train existing employees and educate young people to prepare them for the tech workforce. For example, in 2023, IBM announced a commitment to train two million learners in AI by the end of 2026, with a focus on underrepresented communities. To meet this goal, the tech giant is working closely with colleges and universities, launching new generative AI coursework through IBM SkillsBuild (which IBM describes as a “free education program focused on underrepresented communities in tech”), and working with partners to provide AI training to adult learners. IBM offers an apprenticeship program that prepares those without a college degree for careers in software development, hardware design, system support, lab support, and other fields. Additionally, it offers a paid Tech Re-Entry Program for skilled technical professionals who took a break from the workforce but want to learn new skills without having to earn a degree.

As IT worker shortages continue, look for more companies to focus on reskilling/upskilling and seek other ways to train qualified workers outside the four-year and graduate-level college and university system.

Cybersecurity Worker Shortages Continue

As noted in the Internet and Data Security section, nearly 1.5 million cybersecurity workers were employed in the U.S. in 2023, but ISC2 reports a U.S. shortage of more than 1.3 million workers and a global cybersecurity workforce gap of nearly 4 million workers.

Employers are taking a variety of steps to address cybersecurity and overall IT worker shortages, including increasing compensation and benefits; reimagining job requirements and expanding employment pipelines to attract IT professionals without cybersecurity experience, as well as people with little or no IT experience; creating and improving internships and other experiential learning opportunities; and launching programs to educate underrepresented groups (e.g., women, minorities) about educational requirements, potential career paths, and the benefits of working in the field. P-TECH (https://www.ptech.org) is one such program. It provides high school students from underserved backgrounds with the academic, technical, and professional skills and credentials they need to work in STEM careers. Participants earn both their high school diploma and a two-year associate degree in a STEM field.

Related Professions