October 3, 2024 By Mike Elgan 4 min read

The surge in artificial intelligence (AI) usage over the past two and a half years has dramatically changed not only software but hardware as well. As AI usage continues to evolve, PC makers have seized on AI as an opportunity to improve end-user devices, building in AI-specific hardware and marketing the resulting machines as “AI PCs.”

Pre-AI hardware, adapted for AI

A few years ago, AI often depended on hardware that was not explicitly designed for AI. One example is graphics processors. Nvidia Graphics Processing Units (GPUs) are crucial in AI because they handle parallel processing efficiently, which is necessary for machine learning and deep learning. Their design enables simultaneous calculations, making them more effective than CPUs for AI model training and inference.
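To make that difference concrete, here is a minimal sketch (assuming PyTorch is installed; the model and data are purely illustrative) of the common pattern: the same code falls back to the CPU but moves the model and data to an Nvidia GPU via CUDA when one is available, so the matrix math can run in parallel.

```python
import torch
import torch.nn as nn

# Pick the GPU if one is present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny illustrative network and a batch of dummy inputs on the chosen device.
model = nn.Sequential(nn.Linear(1024, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
batch = torch.randn(64, 1024, device=device)

with torch.no_grad():
    logits = model(batch)  # the underlying matrix math runs in parallel on the GPU
print(device, logits.shape)
```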

Another primary hardware type is the Field-Programmable Gate Array (FPGA), made by Intel and other companies. An FPGA is an integrated circuit (IC) that can be reprogrammed after manufacturing, and that flexibility makes it well suited to accelerating deep learning and machine learning workloads. FPGAs offer hardware customization options that can mimic the behavior of GPUs or ASICs.

FPGAs can be integrated with popular AI frameworks like TensorFlow and PyTorch using tools like the Intel FPGA AI Suite and the OpenVINO toolkit.
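As an illustration, here is a minimal sketch (assuming OpenVINO’s Python runtime is installed and a model has already been converted to OpenVINO IR; “model.xml” and the input shape are hypothetical) of loading and running a model through the OpenVINO runtime. The device string passed to compile_model is where a supported accelerator plugin, such as one targeting an FPGA, would be selected.

```python
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")         # hypothetical model already converted to OpenVINO IR
compiled = core.compile_model(model, "CPU")  # swap "CPU" for a supported accelerator device

# Run one inference request with dummy input (shape is illustrative only).
input_tensor = np.random.rand(1, 3, 224, 224).astype(np.float32)
output_layer = compiled.output(0)
result = compiled([input_tensor])[output_layer]
print(result.shape)
```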

FPGAs are used across the automotive, healthcare and other industries. They are useful in edge computing scenarios where AI capabilities must be deployed close to the data source for faster decision-making and reduced latency.

A third type is the Application-Specific Integrated Circuit (ASIC). One example is Google’s Tensor Processing Unit (TPU), a custom ASIC developed by Google to accelerate machine learning workloads. TPUs are optimized for TensorFlow and used extensively in Google’s data centers.
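For a sense of what this looks like in practice, here is a minimal sketch (assuming TensorFlow and access to a Cloud TPU, for example in Colab or on a Cloud TPU VM; the model is illustrative) of placing a Keras model on TPU cores through tf.distribute.TPUStrategy.

```python
import tensorflow as tf

# Locate and initialize the TPU system available to this runtime.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Variables created inside the strategy scope are placed and replicated across TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```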

How the generative AI revolution changed hardware

OpenAI’s release of ChatGPT on November 30, 2022, changed the public’s and industry’s relationship with AI. ChatGPT quickly gained immense popularity, attracting more than one million users within five days of release. By January 2023, it had reached 100 million users, making it the fastest-growing consumer application ever.

Most importantly, the runaway success of ChatGPT in the general culture shifted venture funding toward AI startups. Tech giants like Microsoft, Google and Meta accelerated the development and public availability of their own offerings, and Silicon Valley quickly saw the emergence of companies like Anthropic and Perplexity offering AI tools.

Now, PC manufacturers are investing in AI-capable PCs that emphasize hybrid AI and on-device intelligence. The integration of AI into personal computers is facilitated by the emergence of specialized AI chipsets, such as neural processing units (NPUs), which enhance PCs’ ability to perform AI tasks locally.

This shift is expected to impact the PC market significantly: Canalys forecasts that 60% of PCs shipped in 2027 will be AI-capable.


How do AI PCs differ?

AI PCs are designed to efficiently execute AI workloads using a combination of CPUs, GPUs and NPUs, allowing them to run workloads such as generative AI models more effectively than previous PC generations. This optimization enables AI PCs to run AI applications with improved performance, power efficiency and privacy by processing data locally rather than relying on cloud-based solutions.
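The pattern below is a minimal sketch of that local-first approach (assuming the onnxruntime package and a hypothetical “model.onnx” file; the provider names are examples and depend on the hardware and how the runtime was built): the same model is dispatched to whichever execution provider the machine exposes, whether NPU, GPU or CPU, instead of a cloud endpoint.

```python
import numpy as np
import onnxruntime as ort

# Prefer an NPU or GPU execution provider when the machine exposes one,
# otherwise fall back to the CPU.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in ort.get_available_providers()] or ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)  # hypothetical local model file
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)          # illustrative input shape
outputs = session.run(None, {input_name: dummy})                   # inference happens entirely on-device
print(session.get_providers(), outputs[0].shape)
```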

Some criticize this category as a marketing gimmick and point out that many end users use generative AI through cloud-based chatbots.

Today, the public thinks of AI as large language models (LLMs) running in the cloud and used as chatbots. Over time, however, end users will increasingly encounter AI through integrated features and AI-enhanced applications.

According to Gartner Global Chief of Research Chris Howard, AI will also involve more small language models (SLMs) powering non-chatbot use cases running close to the edge rather than the cloud.

AI processing will increasingly happen closer to the user and closer to the edge, which means the trend toward AI-specific hardware will only grow.
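As a small illustration of that shift, the following sketch (assuming the Hugging Face transformers package; “distilgpt2” is simply a convenient small public model, not one named here) runs a compact language model entirely on the local machine rather than calling a cloud service.

```python
from transformers import pipeline

# Weights download once, then generation runs locally on the CPU (device=-1).
generator = pipeline("text-generation", model="distilgpt2", device=-1)
print(generator("On-device AI lets laptops", max_new_tokens=20)[0]["generated_text"])
```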

Microsoft AI

One standout is the introduction of Microsoft’s Copilot+ PCs, a new category of Windows PCs specifically designed for AI. These PCs feature new silicon capable of performing more than 40 trillion operations per second (40+ TOPS), providing all-day battery life and access to advanced AI models. The architecture of these devices integrates a high-performance NPU alongside the CPU and GPU, enhancing their AI capabilities. This configuration enables new experiences such as real-time AI image generation, live language translation and advanced search functionality like the “Recall” feature, which periodically captures snapshots of on-device activity so users can search it later using natural language.

Microsoft has also collaborated with major OEM partners, including Acer, ASUS, Dell, HP, Lenovo and Samsung, to bring these AI-enhanced devices to market.

Apple AI

Apple has made several hardware changes to accommodate and empower AI capabilities in its devices. A significant development is Apple silicon specifically designed to handle advanced AI processing, including the dedicated neural engines in devices like the iPhone 15 Pro, which are optimized for on-device machine learning tasks such as natural language processing. These neural engines enhance the efficiency and speed of AI operations, enabling features like real-time language translation and image recognition.
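By way of illustration, here is a minimal sketch (assuming PyTorch and the coremltools package; the model and file name are hypothetical) of converting a small model to Core ML so Apple devices can schedule it on the Neural Engine when available.

```python
import torch
import coremltools as ct

# A tiny illustrative PyTorch model, traced and converted to a Core ML program.
net = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 2),
).eval()
example = torch.randn(1, 128)
traced = torch.jit.trace(net, example)

mlmodel = ct.convert(
    traced,
    convert_to="mlprogram",
    inputs=[ct.TensorType(shape=example.shape)],
    compute_units=ct.ComputeUnit.ALL,  # lets Core ML use the Neural Engine when available
)
mlmodel.save("tiny_classifier.mlpackage")  # hypothetical output name
```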

Google AI

Google has likewise adapted its hardware for AI. One significant step is developing and integrating its own chips to support AI models like Gemini, a shift from relying on external suppliers toward using proprietary silicon to enhance AI capabilities.

Google has even reorganized its internal teams to better integrate AI across its products. This reorganization led to the creation of a new Platforms and Devices team, consolidating products like Pixel, Android, Chrome and ChromeOS under a single leadership. The move aims to accelerate AI integration and improve the synergy between hardware and software.

The AI hardware revolution

The generative AI revolution that took off in November 2022 has already driven big hardware changes to accommodate power-hungry AI use cases, and this is no doubt just the beginning. We can look forward to AI-specific hardware trickling down beyond PCs and phones into wearables, Internet of Things devices and more.
