Should I buy a Copilot Plus PC in 2026?
No — Copilot Plus PCs center on NPUs and Windows‑specific features that lack Linux support and broad real‑world benefit right now, making them poor value for most users.
Video Summary
Don’t rush to buy a new PC in 2026; much of the AI hardware hype doesn’t translate to real-world gains.
NPUs (Neural Processing Units) are math co-processors with limited practical support outside Windows; they’re largely unused on Linux.
Current AI workloads rely more on GPU VRAM and unified memory than on NPUs, so older machines often suffice.
RAM and high-VRAM GPUs are driving prices up; buying into that market now can be expensive and unnecessary.
For most users, buying a well-maintained used laptop (e.g. a Lenovo X1 Carbon) or using cloud AI is smarter in 2026 than buying new high-end hardware for AI tasks.
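A quick way to see why VRAM and unified memory matter more than an NPU is to estimate how much memory a local model actually needs: roughly parameters times bytes per parameter, plus some overhead. A minimal sketch (the model sizes and the ~20% overhead factor are illustrative assumptions, not figures from the video):

```python
# Rough VRAM estimate for local model inference:
#   memory ≈ parameters × bytes_per_parameter × overhead (KV cache, buffers)

def vram_gb(params_billion: float, bits_per_param: int, overhead: float = 1.2) -> float:
    """Approximate GB of VRAM/unified memory needed to hold a model's weights."""
    bytes_total = params_billion * 1e9 * (bits_per_param / 8)
    return bytes_total * overhead / 1e9

# Illustrative parameter counts (assumptions for the sketch):
for name, params in [("7B model", 7), ("13B model", 13), ("70B model", 70)]:
    print(f"{name}: ~{vram_gb(params, 4):.1f} GB at 4-bit, "
          f"~{vram_gb(params, 16):.1f} GB at 16-bit")
```

Even under these rough assumptions, a 4-bit 7B model fits comfortably in an older 8 GB GPU, which is why used hardware often suffices for local AI.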
Practically no — NPUs are matrix‑math co‑processors with spotty support; most local AI tools on Linux still rely on GPU VRAM, not NPUs.
Only if you need large GPU VRAM for local model inference or specialized tasks (high‑end video editing/gaming). Otherwise use cloud AI or buy used hardware.
Corporate off‑lease laptops like the Lenovo X1 Carbon and older gaming laptops with decent Nvidia cards provide strong daily performance and compatibility at much lower cost.
Avoid sending sensitive data to unknown cloud AIs; use local models when feasible or privacy‑focused cloud offerings (and consider options like Llama.ai or encrypted services).
"The biggest waste of the last two years is considering the so-called Copilot Plus PC, which is essentially a Microsoft-approved hardware configuration that can run Windows 11."
The Copilot Plus PC is marketed as capable of handling local AI inference, but many features are overhyped and may not deliver on performance promises.
Microsoft sets a minimum requirement of 16 GB of RAM and a Neural Processing Unit (NPU), a combination that proves impractical for Linux-focused users, since the NPU has no effective use cases outside Microsoft's ecosystem.
Most users have been misled about the advantages of newer hardware, as a comparable model from three years ago often performs just as well for regular tasks.
"An NPU, or Neural Processing Unit, is essentially a math co-processor made for performing matrix multiplications, which are essential in AI inference."
Although NPUs are positioned as significant advancements for AI computation, practical software support for them remains sparse.
Current AI applications predominantly utilize VRAM and GPU resources, making the push for NPUs largely unnecessary and a potential waste of investment.
The lack of real-world applications leveraging NPU technology suggests that older computers without such units remain perfectly functional for most user needs.
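The matrix multiplication an NPU is built to accelerate is the same operation at the heart of every neural-network layer; a minimal NumPy sketch of a single layer makes the workload concrete (this runs on the CPU — nothing here touches an actual NPU, and the layer sizes are arbitrary):

```python
import numpy as np

# One neural-network layer is essentially a matrix multiplication plus a
# bias and a nonlinearity -- the exact workload NPUs are designed to speed up.
rng = np.random.default_rng(0)

x = rng.standard_normal((1, 512))    # input activations (batch of 1)
W = rng.standard_normal((512, 256))  # layer weights
b = np.zeros(256)                    # bias

y = np.maximum(x @ W + b, 0)         # matmul + ReLU: the core of inference
print(y.shape)                       # (1, 256)
```

The point of the video stands either way: a CPU or GPU performs this multiplication fine, so a machine without an NPU loses nothing for today's software.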
"Before you start deciding on what new computer to get, you really have to determine the applications that are important to you right now."
Users need to assess their actual computing needs instead of getting swept up in the latest hardware hype, especially those involved in gaming.
High-end graphics cards like the Nvidia 5090 are in high demand among AI users, driving up prices without necessarily meeting the needs of casual gamers.
A focus on used or older hardware can often yield smarter financial choices without sacrificing performance for most applications.
"Memory prices have skyrocketed, and the AI demand has significantly inflated costs."
The dramatic increase in hardware prices, especially RAM, has led to a situation where even previously affordable devices are now considerably more expensive.
Companies are strategically purchasing high-speed RAM types to meet AI service demands, further affecting market prices.
This economic shift is causing many manufacturers to predict a decline in new computer sales as potential customers hesitate to invest in overpriced gear.
"The reason is that Intel and AMD are copying some features of the M silicon chip from Apple and are gaining massive improvements in power draw."
Architectural changes are driving significant performance improvements, especially as Intel and AMD begin to incorporate features from Apple's M-series silicon. Waiting for these newer models could therefore yield better performance for less cost.
The extra speed that new computers offer can often be outweighed by the higher costs of components like memory, which have increased notably. As of the video's release, modern machines are priced much higher than older models with comparable performance.
For those considering a new computer purchase, it might be wiser to opt for an older model and delay any new acquisition until a better market opportunity arises.
"Get a used Lenovo X1 ThinkPad Carbon. You can get them from $300 to $400 on eBay."
For general computing tasks, Linux compatibility, or a backup machine, the Lenovo ThinkPad X1 Carbon remains a strong recommendation. These laptops are affordable and often available at significant discounts on platforms like eBay.
Popular corporate laptops like the Lenovo X1 are sold after three-year leases, making them a budget-friendly option without much compromise on performance. They frequently feature ample memory capacity suitable for everyday tasks.
Older gaming laptops are also an option, offering robust specifications at lower prices. They typically come with older Nvidia graphics cards, making them capable of handling demanding applications.
"These cloud options can be dangerous because you are potentially sending private data to a cloud AI."
Utilizing cloud-based AI services can be risky as they may involve sending sensitive personal data to external servers. Users must carefully consider the privacy implications of engaging with these technologies.
Some AI models, such as Claude from Anthropic, remain very popular in the coding community, while others like Grok from xAI offer integrated web searches, which could keep information more current compared to traditional models.
Llama.ai serves as a safer alternative in the realm of AI, as it allows for local model usage without compromising user privacy. Users have the option to utilize its models in the cloud with predictable subscription pricing, promoting safer AI use without sacrificing functionality.