It’s a nightmare scenario for Microsoft. The headlining feature of its new Copilot+ PC initiative, which is supposed to drive millions of PC sales over the next couple of years, is under fire as what many call a major breach of privacy and security on Windows. The feature in question is Windows Recall, a new AI tool designed to remember everything you do on Windows. It’s a feature nobody asked for and nobody wanted.

Microsoft has done a lot to degrade the Windows user experience over the last few years. Everything from obtrusive advertisements to full-screen popups, ignored app defaults, forced Microsoft Accounts, and more has eroded the trust relationship between Windows users and Microsoft.

It’s no surprise, then, that users already assume Microsoft will eventually collect that data and use it to shape advertisements. That would be a huge invasion of privacy, and people fully expect Microsoft to do it; it’s those bad Windows practices that have led them to this conclusion.

  • JasSmith@sh.itjust.works

    Yeah, the concept is pretty damn cool. It’s just horrifying to have a company own and control that data. I suspect this is like the Xbox One launch disaster in 2013, when Microsoft initially required all consoles to have an always-online connection. People rebelled, but today, and certainly on our current trajectory, it looks like Microsoft was just a little ahead of the curve. I think people will eventually become a lot more comfortable with companies owning their data because the benefits will be so enormous. I’m not happy about that future, but I think I understand it.

    • Hackworth@lemmy.world

      It seems to me that we’ve reached a crossroads. I’ve been very aware of the data mining, walled gardens, data trading, privacy violations, security issues, ownership issues, etc., for roughly 30 years. I regularly make the choice to be exploited for the benefits I extract, largely because I don’t highly value the data they’ve gotten from me thus far. But the necessity of developing strategies to keep the devil’s bargain beneficial has reached a fever pitch. I want to train my own AI and public AIs. I want to explore the vast higher-dimensional semantic spaces of generative models without API charges. APIs are vanishing as we speak anyway, as companies fear their data being extracted without compensation. Can’t really sit on the Open/Closed fence anymore.