
How Work Devices Can Evolve to Keep Pace in the AI Age

 

The advent of the Neural Processing Unit (NPU) may be touted as a game changer for personal computers (PCs), but striking the right balance in product design remains crucial to meeting user expectations.

Growing workplace demand for artificial intelligence (AI) tools, including generative AI (GenAI), has led industry players such as chipmakers and device makers to ensure their products can support this shift.

Also: What is an AI PC? (And should you buy one?)

PCs in particular will need to keep up as companies move AI workloads from the cloud to client devices. This shift can improve performance by eliminating the need to send AI workloads to the cloud and back to the user’s device. Companies could also improve data privacy and security by keeping data on-device rather than moving it in and out of the cloud.

Because AI tasks that run locally on a PC are typically handled by the CPU (central processing unit), the GPU (graphics processing unit), or both, the device’s performance and battery life can suffer, since neither component is optimized for processing AI tasks. This is where AI-specific chips, or NPUs, come into play.

User needs continue to focus on hybrid working

During the global pandemic, user needs revolved around the ability to work remotely and support a geographically dispersed workforce. This led to improvements in built-in camera and video conferencing capabilities, according to Tom Butler, executive director of commercial portfolio and product management at Lenovo and head of the hardware maker’s commercial notebook portfolio.

This led Lenovo to focus its efforts on improving camera and video quality, as well as developing microphones that support noise suppression. Users also wanted strong laptop performance, including long battery life, Butler told ZDNET.

Also: Nvidia advocates for the AI PC at CES 2024

With workforces around the world still largely remote or hybrid, user needs continue to center on performance, battery life, and video conferencing capabilities. The rise of AI PCs has prompted market players to consider moving some of these workloads, such as noise cancellation and background blurring, to NPUs.

Finding the optimal way to balance workloads is a key goal since PCs now have three core engines, Butler emphasized, adding that all of Lenovo’s new commercial laptops ship with NPUs. “We are effectively moving into the AI PC era,” he said.

 

In her keynote speech at SXSW this week, AMD CEO Lisa Su also described AI as an “evolution in PC gaming.”

“This is the beginning that will allow us to become much more productive,” Su said, adding that the AI PC era has already arrived and the products are already on the market today. She also said AMD is working to optimize its chips for AI, including developing “better software” for productivity.

Also: For the age of the AI PC, here comes a new speed test

“2024 is the year that customers will start to question whether their device is AI-enabled,” said Asus COO Joe Hsieh in an interview with ZDNET.

Like Lenovo, Asus is working to ensure its new products include NPUs as a core chip component for AI workloads.

Asus is also focused on developing the necessary software engine and tools to help users train their own AI models, Hsieh said, noting that most large language models are currently only trained on public data. As personal devices handle AI workloads, Asus believes users will want these applications to use their data instead of public data.

Optimizing hardware and software for AI

When asked which requirements are the most difficult to reconcile as demand for AI PCs grows, Butler pointed to the usual trade-offs among thinner and lighter designs, longer battery life, and better performance.

An increase in performance will inevitably affect battery life and vice versa, he said. “With NPUs [now available], however, it allows us to shift some of the workloads that traditionally tax either the GPU or the CPU,” he noted. For example, noise cancellation functions can be moved to the NPU. Butler noted that software vendors are paying attention to how they can optimize their code to take advantage of NPUs.

In the meantime, Asus wants to provide tools to help developers choose the right computing resources, according to Albert Chang, Asus vice president and co-head of the AIoT business group. Application developers should be able to determine whether the CPU, NPU, or integrated GPU should power their AI tool, Chang said.
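To illustrate what that kind of decision looks like in practice, here is a minimal sketch, assuming ONNX Runtime as the inference library. The provider preference order and the model file name (noise_suppression.onnx) are illustrative assumptions, not Asus’s actual tooling, and the NPU and GPU providers are only present in builds and on hardware that support them.

```python
# Minimal sketch: pick the best available compute engine for an on-device AI
# workload using ONNX Runtime execution providers. Provider availability
# depends on the hardware and the installed ONNX Runtime build.
import onnxruntime as ort

# Preference order: NPU (Qualcomm QNN), then GPU (DirectML on Windows), then CPU.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]

available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

# Load a local model (hypothetical noise-suppression network) on the chosen engine.
session = ort.InferenceSession("noise_suppression.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```

The same pattern lets an application fall back gracefully to the CPU on machines without an NPU, which is the kind of choice Chang says developers need help making.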

The coming wave of AI PCs

Forrester predicts that 60% of companies will offer GenAI-based applications to employees and customers by 2024.

Research companies also believe that AI will boost demand for PCs. IDC predicts AI PC shipments will increase from 50 million units in 2024 to over 167 million in 2027, at which point they will account for nearly 60% of total PC shipments. IDC defines AI PCs as personal computers with system-on-a-chip capabilities designed to perform GenAI tasks on the device.

Also: Nvidia’s new AI chatbot runs locally on your PC and is free

IDC also predicts that 265.4 million PCs will be shipped worldwide in 2024, up 2% from 2023 and driven by AI PC adoption. Shipments are expected to climb to 292.2 million units in 2028, a compound annual growth rate of 2.4% between 2024 and 2028. “The presence of AI capabilities on the device is unlikely to lead to an increase in the installed PC base, but it will certainly lead to an increase in average selling prices,” said Jitesh Ubrani, IDC research manager for global mobile and consumer device trackers.
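As a quick check of that growth rate, the figure follows directly from the two shipment estimates quoted above; the short calculation below assumes only those numbers.

```python
# Compound annual growth rate implied by IDC's 2024 and 2028 shipment estimates
# (265.4 million and 292.2 million units, as quoted above).
start_units, end_units, years = 265.4, 292.2, 4  # millions of units, 2024 -> 2028

cagr = (end_units / start_units) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 2.4%, matching the stated rate
```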

Gartner predicts that 54.5 million AI PCs will be shipped by the end of the year, while the number of GenAI smartphones will reach 240 million. These AI devices will account for 22% of all PCs and 22% of basic and premium smartphones in 2024. The research firm defines AI PCs as those with dedicated AI accelerators or cores, NPUs, accelerated processing units (APUs), or tensor processing units (TPUs) designed to optimize and accelerate AI tasks on the device.

Also: Microsoft confirms the next Windows, Surface and AI event. Here’s what to expect

IDC categorizes AI PCs into three groups. The first comprises PCs with NPUs that deliver up to 40 tera operations per second (TOPS), enabling certain AI functions in applications running locally on the device. AMD, Intel, Apple, and Qualcomm are now among the chipmakers that supply such NPUs.

The second category includes next-generation AI PCs equipped with NPUs that offer between 40 and 60 TOPS and have an “AI-first” operating system that supports “persistent and pervasive” AI capabilities, according to IDC. AMD, Intel and Qualcomm are expected to begin releasing these chips in 2024, while Microsoft is expected to unveil updates to Windows 11 to take advantage of these NPUs.

The final category of advanced AI PCs offers more than 60 TOPS of NPU performance and is not currently included in any vendor’s announced product pipeline.
