
Microsoft Recall - Useful Feature or Predatory Design?


What is Recall?

I wanted to write about Recall, a new feature that Microsoft is introducing. Essentially, it is a tool that claims to help you retrace content you think you have seen before. It constantly captures screenshots in the background as you use your computer and presents them as a timeline. Then, you have the absolute privilege of searching those screenshots for something you remember seeing, and the system retrieves every screenshot related to your search. The feature also requires an incredible 50 GB of free storage on your computer.
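To make the mechanics concrete, here is a toy sketch of the capture-index-search loop such a feature implies. To be clear, this is not Microsoft's implementation (Recall reportedly runs on-device models); it is just a minimal illustration in Python, assuming the Pillow and pytesseract libraries for screen capture and OCR, and a made-up search term.

    # Toy sketch only: NOT Microsoft's implementation. It illustrates the
    # capture -> OCR -> index -> search loop that Recall describes.
    # Assumes the Pillow and pytesseract packages (and Tesseract itself).
    import time
    from datetime import datetime

    from PIL import ImageGrab      # screen capture
    import pytesseract             # OCR

    index = []                     # (timestamp, extracted text) entries

    def capture_once():
        """Screenshot the desktop, OCR it, and record the text."""
        shot = ImageGrab.grab()
        index.append((datetime.now(), pytesseract.image_to_string(shot)))

    def search(query):
        """Return timestamps of snapshots whose text mentions the query."""
        return [ts for ts, text in index if query.lower() in text.lower()]

    if __name__ == "__main__":
        for _ in range(3):         # capture a few snapshots
            capture_once()
            time.sleep(5)
        print(search("invoice"))   # when was an invoice on screen?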

So, it sounds interesting, and it got me thinking about how often I have needed a feature like this. I tried to empathize with the people who might actually want it. Personally, the only time I have wished for something similar was when trying to find an extremely specific meme I had seen months ago, wishing image search had text recognition, but I can see how certain professionals would find it genuinely useful. I can imagine a video editor or music producer wanting to go back to an old edit and see how the timeline looked at the time. Point being, I can see why people would find this useful at times. I also think, though, that there are other ways to satisfy that desire for version control without a privacy invasion and 50 GB of free space.
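To make that alternative concrete: an editor who wants to revisit an old cut could keep the project file under ordinary version control instead. A hypothetical example with git (the file name is invented, and this assumes the editor's project format is a file git can usefully track):

    git init
    git add project.xml
    git commit -m "rough cut, before trimming the intro"
    # later, to browse or restore that earlier state:
    git log --oneline
    git checkout <commit> -- project.xml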

Why is it problematic?

Sometimes the convenience a product brings can justify the enormous amount of data it needs. Unfortunately, I do not think this addition to Windows is worth the constant surveillance. Why is it problematic, though? The obvious reason is that we are giving a corporation the right to take constant screenshots of every computer session we have, which is in itself a breach of our privacy. Microsoft can counter this point by saying the data is stored locally, but that is something they can easily change in the future: they could just as easily decide that local storage is too expensive and move everything to the cloud. In practice, it records everything you look at, which is spyware presented as a useful feature.

Another sneaky thing Microsoft did was enable it by default on every Windows 11 computer. Only users who are aware of the feature and what it entails can go and switch it off themselves. This is predatory design: why am I not given the choice of whether to use this feature in the first place? And suppose I am aware of it and wish to disable it. Initially, there was no way to do so through the user interface; you had to open Windows PowerShell and type out a command. Not only is this sneaky design, it also makes opting out inaccessible to the general public. To opt out, you need to check a lot of boxes: be aware of the feature's existence and what that existence entails, notice that it is enabled by default, decide you wish to disable it, be computer-literate enough to use Windows PowerShell, and be good enough at research to find out online how to disable it.
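For the record, the workaround that circulated at the time was a single command run from an elevated PowerShell window, using DISM's standard feature-removal syntax (quoted as reported for early Windows 11 24H2 builds; the exact feature name may have changed in later builds):

    Dism /Online /Disable-Feature /FeatureName:Recall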

Just when you think you are done, you realize... there is no way to uninstall it. You can only disable it, so this spyware will be on your computer forever, and knowing Windows, they can easily re-enable it in the next software update. Why is this the case? Why can't people have liberty over their own devices? There are many answers, but it is a shame that these predatory features are forced on people without their consent. Obviously, Windows users rely on it for convenience; it is a nice intermediate step between the lack of control of Apple and the freedom of Linux systems. But to me, the lack of a UI toggle and the inability to uninstall it make Microsoft's predatory intentions transparent.

Training Data for AI

Not to mention, wherever AI is used, there is also the privacy concern of data being fed to it. On Microsoft's official website, you can see in Figures 1 and 2 that they claim they will use the data. Once again, this practice is enabled by default, leveraging dark design patterns to push users into giving "consent" for their data to be used. One could argue whether it is really consent if something is enabled by default and you have to go and disable it yourself. Personally, I do not think simply accepting Terms of Service should equate to genuine consent; it feels more like coercion, especially when the service is withheld unless you agree to a privacy policy that ultimately says: if you do not let us take your data, you cannot use our product. Luckily, tools like ToS;DR exist to simplify the process of understanding Terms of Service; they break down the key points and highlight concerning clauses. However, this still does not address the main issue: users must already be aware of the implications of data abuse and actively seek out ways to protect their privacy. Without this knowledge, they may never bother with these settings or care to read the privacy statements. I plan to explore this topic in greater depth in an upcoming blog post, but this summarizes my current perspective on the matter.

Figure 1: Screenshot from the Microsoft Copilot FAQs Page
Figure 2: Screenshot from the Microsoft Copilot FAQs Page

Sources!

The webpages are linked through the Wayback Machine because I believe they may change or delete their pages, and I wanted to make sure to include the exact versions from which I obtained this information.

  1. Microsoft Recall Official
  2. Microsoft Copilot for Individuals