Preston Gralla
Contributing Editor

Microsoft’s Copilot+ AI PCs: Still a privacy disaster waiting to happen

Opinion
Jun 12, 2024 | 7 mins
Data Privacy, Generative AI, Security

The company’s new Recall feature will record everything a PC user does, a potential treasure trove of information and data for hackers.


Imagine that your Windows PC took screenshots of everything you do on it, including personal data, credit card and other financial information, passwords, web sign-ins, emails, a list of websites you don't want anyone to know you've been visiting, business information and more. Imagine your PC creates a searchable database of it all, and imagine how valuable that information would be if accessed by someone other than you.

No security or privacy issues there — after all, what could go wrong? 

The answer is plenty. Windows has been successfully hacked for decades, and continues to be hacked. It has allowed information to be regularly stolen from top tech companies, including Microsoft itself, from high-ranking government officials, and from countless individual users. In that world, the screenshot scenario sounds like the ultimate security-and-privacy nightmare.

And yet many security pros said that’s exactly the Pandora’s Box Microsoft is about to open with the new line of AI-powered Copilot+ Windows PCs. Microsoft argues those new PCs, available beginning on June 18, will make it easy for you to find files and remember things you’ve done on your computer using the new Recall feature, which takes screenshots, stores them in a database, and uses AI to help you find and use whatever you want.

Microsoft claims there’s nothing to fear, that rock-solid security is baked directly into the new feature (though it did announce on Friday the feature would be opt-in and its data better secured — a nod to the backlash that emerged after Recall was unveiled). 

Who to believe? To find out, let’s take a look at how Recall works.

Recall: the AI-driven memory machine

Microsoft’s Copilot+ PCs, to be released by manufacturers including HP, Dell, Samsung, Asus, Acer, Lenovo and Microsoft itself, are “the most significant change to the Windows platform in decades” and “the fastest, most intelligent Windows PCs ever built,” claims Microsoft Executive Vice President, Consumer Chief Marketing Officer Yusuf Mehdi in a blog post.

The machines are powered by a system architecture that connects a PC’s CPU, GPU, and a new high-performance Neural Processing Unit (NPU) to AI large language models (LLMs) running on Microsoft’s Azure Cloud and AI small language models (SLMs) running on the PCs themselves.

Microsoft touts a variety of benefits offered by the new line of machines, including dramatically faster speeds, improved battery life, and better overall performance. But the core benefits are AI-related: turbo-driven AI processing; sped-up AI image creation and photo- and image-editing; accelerated AI for applications such as Adobe Photoshop, Lightroom and Express; and increased performance of Microsoft’s AI Copilot software.

The benefit Microsoft touts the most — the first one Mehdi points to in his blog post — is the Recall feature. Recall, he says, solves “one of the most frustrating problems we encounter daily — finding something we know we have seen before on our PC.”  He claims it will let you “access virtually what you have seen or done on your PC in a way that feels like having photographic memory.”

To do that, Recall takes screenshots of your PC every five seconds and stores them all in a searchable database. AI does the heavy lifting of analyzing those screenshots, extracting information from them, building the database and searching through it. Microsoft claims the processing is done on the machine itself rather than in the cloud, and that the screenshots and database are safe because they’re encrypted, so users don’t have to worry about privacy or security issues.
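To make that concrete, here is a rough sketch, in Python, of what a Recall-style capture loop looks like: grab the screen every five seconds, pull the text out of the image, and drop it into a searchable local database. To be clear, this is an illustration of the approach the article describes, not Microsoft’s code; the library choices (mss for screen capture, pytesseract for OCR, SQLite full-text search) and the file names are stand-ins.

```python
# Hypothetical sketch of a Recall-style capture loop; NOT Microsoft's implementation.
# Assumes the third-party packages mss and pytesseract (plus the Tesseract binary) are installed.
import sqlite3
import time

import mss
import pytesseract
from PIL import Image

DB_PATH = "recall_demo.db"          # illustrative path, not the real Recall store
CAPTURE_INTERVAL_SECONDS = 5        # the snapshot cadence reported for Recall

def init_db(path: str) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    # Full-text index so the captured text is searchable, mirroring the "searchable database" idea.
    conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snapshots USING fts5(captured_at, text)")
    return conn

def capture_and_index(conn: sqlite3.Connection) -> None:
    with mss.mss() as grabber:
        raw = grabber.grab(grabber.monitors[1])          # primary monitor
        img = Image.frombytes("RGB", raw.size, raw.rgb)  # convert to a PIL image for OCR
    text = pytesseract.image_to_string(img)              # extract on-screen text locally
    conn.execute(
        "INSERT INTO snapshots (captured_at, text) VALUES (?, ?)",
        (time.strftime("%Y-%m-%d %H:%M:%S"), text),
    )
    conn.commit()

if __name__ == "__main__":
    db = init_db(DB_PATH)
    while True:                      # capture continuously, like an always-on recorder
        capture_and_index(db)
        time.sleep(CAPTURE_INTERVAL_SECONDS)
```

Even this toy version makes the stakes obvious: everything that appears on screen ends up as searchable plain text in a single local file.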

The privacy problem?

Many security experts beg to differ. “I think a built-in keylogger and screen-shotter that perfectly captures everything you do on the machine within a certain time frame is a tremendous privacy nightmare for users,” Jeff Pollard, vice president and principal analyst at Forrester, told Computerworld shortly after Recall was announced.

Another potential issue: Even if the database and data are encrypted, a hacker with access to the machine might still do damage. “Initial access is all that is needed to potentially steal sensitive information such as passwords or company trade secrets,” said Douglas McKee, executive director of threat research at security firm SonicWall.

Security expert Kevin Beaumont, who worked for Microsoft for a short time in 2020, also weighed in, noting in a blog post that hackers gain access to devices “every day on home PCs, and corporate systems…. In essence, a keylogger is being baked into Windows as a feature.”

His research uncovered an even bigger problem. When Beaumont got his hands on the new Copilot+ software he found that Recall’s data is “written into an SQLite database in the user’s folder. This database file has a record of everything you’ve ever viewed on your PC in plain text.”

That means a hacker doesn’t even need to gain control over someone’s PC to get at their Recall data. The hacker only needs to get at the database file, something that is straightforward to do remotely. Beaumont even posted a video of Microsoft employees doing it.
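Beaumont’s point is easy to demonstrate. If the captured data really sits in an unprotected SQLite file in the user’s profile, then any code running as that user, including malware delivered by a routine phishing payload, can dump it in a few lines. The path and schema below are placeholders (matching the sketch above), not the actual Recall database layout.

```python
# Hypothetical illustration of why a plain-text SQLite store is dangerous:
# any process running as the logged-in user could read it. Path and schema are placeholders.
import sqlite3
from pathlib import Path

# Placeholder location; the real Recall database reportedly lives under the user's profile folder.
db_path = Path.home() / "AppData" / "Local" / "ExampleRecallStore" / "capture.db"

conn = sqlite3.connect(db_path)   # plain read access is all an attacker needs
for captured_at, text in conn.execute("SELECT captured_at, text FROM snapshots"):
    # In a real attack this would be exfiltrated rather than printed.
    print(captured_at, text[:80])
```

Note that nothing here requires administrator rights; running as the logged-in user is enough.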

The criticism was sharp in Europe, as well. Kris Shrishak, who advises European legislators on AI governance, echoed Pollard in warning that Recall is a potential “privacy nightmare.” And the UK’s Information Commissioner’s Office is concerned enough about the issue that it’s already gotten in touch with Microsoft about the privacy implications.

Faced with those very public concerns, Microsoft did at least shift gears last week to try to alleviate security fears. In addition to making Recall an opt-in feature for users, the company now requires Windows Hello biometric authentication to enable the feature, including a “proof of presence” to search in Recall or view its timeline. Going further, Microsoft will add “just in time” decryption protected by Windows Hello Enhanced Sign-in Security (ESS). That means Recall snapshots will “only be decrypted and accessible when the user authenticates,” Pavan Davuluri, corporate vice president for Windows and Devices, said in a blog post.
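The promised mitigation follows a familiar pattern: keep the snapshot store encrypted at rest and only decrypt it after the user proves presence. The sketch below shows that pattern in the most general terms, with a simple symmetric key; it is a conceptual illustration only, not how Windows Hello or ESS actually manages keys, which are protected in hardware rather than held in memory like this.

```python
# Conceptual sketch of "just in time" decryption gated on authentication.
# Uses the third-party cryptography package; key handling is deliberately simplified
# and does not reflect how Windows Hello / ESS protects keys.
from cryptography.fernet import Fernet

def user_authenticated() -> bool:
    """Stand-in for a biometric 'proof of presence' check."""
    return input("Authenticate (yes/no)? ").strip().lower() == "yes"

key = Fernet.generate_key()          # in the real design the key would never sit in plain memory
vault = Fernet(key)

encrypted_snapshot = vault.encrypt(b"screenshot text extracted at 10:42")

def read_snapshot(blob: bytes) -> bytes:
    if not user_authenticated():
        raise PermissionError("snapshots stay encrypted until the user authenticates")
    return vault.decrypt(blob)       # decrypt only after authentication succeeds
```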

Beaumont pointed to the Friday announcement in an online post, but remained skeptical: “Turns out speaking out works. Microsoft [is] making significant changes to Recall, including making it specifically opt in, requiring Windows Hello face scanning to activate and use it, and actually encrypting the database. There are obviously going to be devils in the details — potentially big ones. Microsoft needs to commit to not trying to sneak users to enable it in the future, and it needs turning off by default in Group Policy and Intune for enterprise orgs.”

He continued: “There are obviously serious governance and security failures at Microsoft around how this played out that need to be investigated, and suggests they are not serious about AI safety.”

What should you do about Recall?

Anyone who cares about privacy should think seriously about whether the benefits of the new feature are worth the dangers. Since Recall will now be turned off by default on the new Copilot+ PCs when they ship, you’ll have to think long and hard about whether you should turn it on. If you’re worried about any privacy implications, leave it off. 

As for enterprises, Recall is big trouble just waiting to happen. The feature has the potential to expose corporate data and secrets, not just files and data from individual users. Businesses should carefully consider how to protect themselves should they buy into the Copilot+ line, including making sure Recall is off on every device they buy — and that it stays turned off.
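Beaumont’s call for a default-off switch in Group Policy and Intune points at what enterprise admins will need to verify on every Copilot+ device. As a starting point, a fleet check might look for the Recall-related value under the WindowsAI policy key; the value name used below, DisableAIDataAnalysis, is the one reported for Recall at launch and should be confirmed against Microsoft’s current documentation before relying on it.

```python
# Windows-only sketch: check whether the reported Recall policy is enforced on this machine.
# The "DisableAIDataAnalysis" value name is based on public reporting about the WindowsAI
# policy key; treat it as an assumption and verify against Microsoft's documentation.
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"
POLICY_VALUE = "DisableAIDataAnalysis"

def recall_disabled_by_policy() -> bool:
    for hive in (winreg.HKEY_LOCAL_MACHINE, winreg.HKEY_CURRENT_USER):
        try:
            with winreg.OpenKey(hive, POLICY_KEY) as key:
                value, _ = winreg.QueryValueEx(key, POLICY_VALUE)
                if value == 1:        # 1 = saving snapshots is disabled by policy
                    return True
        except FileNotFoundError:
            continue                  # key or value not present in this hive
    return False

if __name__ == "__main__":
    print("Recall disabled by policy:", recall_disabled_by_policy())
```

Intune can push the equivalent policy setting to managed devices, which is the kind of default-off enforcement Beaumont is asking for.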