Windows 11’s Recall feature, set to be a cornerstone ability for Copilot+ PCs by allowing a supercharged AI-powered search of your system’s activity history, continues to stir controversy, and Microsoft’s recent defense of the functionality is unlikely to soothe anyone’s nerves.
Or at least that’s our feeling after The Register picked up on comments from Jaime Teevan, chief scientist and technical fellow at Microsoft Research, made during an interview at an AI conference with Erik Brynjolfsson, director of the Stanford Digital Economy Lab.
Brynjolfsson observed that after Recall was revealed by Microsoft, there was a “backlash against all the privacy challenges around that.”
To recap briefly, in case you missed the concerns flying around about this feature: it sits in Windows 11 and takes regular screenshots of everything you’re doing. Those grabs can later be searched by AI, so you can dig up all manner of information about your past activity, and there’s no doubting that’s useful.
The worry is that Recall represents a double-edged sword, with the convenience of a super-powerful search of all your PC usage history on one side, and the specter of all that data falling into the hands of a hacker on the other. To say there are lots of security and privacy concerns with how the AI feature is implemented would be a massive understatement.
Brynjolfsson then encouraged Teevan to address these issues, saying: “So, talk about both the pluses and minuses of using all that data and some of the risks that creates and also some of the opportunities.”
A balanced question, but Teevan elected to ignore the ‘minuses’ part of the equation entirely, talking only in vague and evasive terms. She said: “Yeah, and so it’s a great question, Erik. This has come up throughout the morning as well – the importance of data. And this AI revolution that we’re in right now is really changing the way we understand data.”
She moved on to talk about businesses using data, before addressing the consumer angle as follows: “And as individuals too, we have important data, the data that we interact with all the time, and there’s an opportunity to start thinking about how to do that and to start thinking about what it means to be able to capture and use that. But of course we are rethinking what data means and how we use it, how we value it, how it gets used.”
A waffly and not very helpful response, frankly.
Undeterred, Brynjolfsson took a more specific angle to try to draw Microsoft Research’s chief scientist into commenting meaningfully on the concerns raised around Recall.
He asked: “Is it stored locally? [meaning the Recall data] So suppose I activate Recall, and I don’t know if I can, but when you have something like that available, I would be worried about all my personal files going up into the cloud, Microsoft, or whatever. Do you have it kept locally?”
That’s an easy one to answer, though, as Microsoft has already made it clear that the data is stored only locally.
Teevan indicated that’s the case again: “Yeah, yeah, so this is a foundational thing that we as a company care a lot about is actually the protection of data. So Recall is a feature which captures information. It’s a local Windows functionality, nothing goes into the cloud, everything’s stored locally.”
Analysis: Never mind thinking, we need some action
Obviously, that’s good to hear, but as noted, we’ve heard it before. And sadly, that was your lot. Tougher questions weren’t raised, and Teevan certainly didn’t take any opportunity to proactively tackle some of the flak fired at Recall in recent times.
The message, vague as it was, amounted to ‘thinking’ about how we capture and use data, but with Recall about to launch on Copilot+ PCs later this month, it’s a bit late to still be thinking. Isn’t it? These devices are on the cusp of release, and Recall will be used in a very real way on some (or many) of these new machines, which makes the worries about how it could go wrong all too real as well.
As The Register points out, some folks are fretting about the legal implications: lawyers could leverage Recall data on a PC against someone, potentially accessing all kinds of past messaging, even messages of the auto-delete-after-reading variety. A privacy watchdog is already investigating the feature ahead of its release. And security experts are, frankly, throwing red flags left, right and center, with serious worries that Recall could make it a breeze for hackers to steal a ton of sensitive data, quickly and easily, should your PC be compromised.
Aside from all this, our main beef is with how Recall is handled during Copilot+ PC setup. The way the feature is run past a Windows 11 user during setup feels clunky, and almost crafted so that people leave it switched on by default, particularly if they don’t understand what’s being asked of them, which may well be the case for less tech-savvy users.
There are lots of questions here, and as far as we’ve seen, Microsoft is failing to address them. ‘It’s all kept local and everything will be fine’ glosses over myriad issues.
While at this point, so close to launch, it seems unlikely that the Recall data tanker can change course meaningfully, we can hope that at the very least Microsoft deals with the issues around setup. The process needs to be far more transparent, and Recall needs, in our opinion, to be off by default unless a Copilot+ PC user makes a fully informed decision to turn it on.
The way the feature’s setup is handled right now just isn’t good enough, and that’s a failing underpinning all the other concerns here: a root problem that needs to be tackled, and hopefully will be. Fingers firmly crossed. We've contacted Microsoft for comment.