
Microsoft wants to take a continuous stream of screenshots of your system. They have been keylogging for a long time.

Microsoft has been using Copilot to spy
( https://www.schneier.com/blog/archives/2024/02/microsoft-is-spying-on-users-of-its-ai-tools.html ).

Storage remains cheap; Microsoft's cluster is way up the list on
top500.org.

Eating as much data as you can and keeping it around forever, in case you figure out how to use it at some point in the future, is the strategy espoused by Microsoft, Google, Facebook, and the rest of the large tech companies. Data is considered an asset; active-user metrics represent not just ad inventory but also a continued stream of data.

AI prompts are part of that data, kept around forever, besides being used to train new versions of the AI.

AI image generation tools produce unique images.

Image search is old tech. You can build it from scratch at home: you normalize the image, adjust the colors to maximize contrast, and then boil it down to an 8x8, 1-bit-per-pixel square so that you have a 64-bit index. That's often enough for a match; if you have a lot of really similar images, you just treat the hash as a constraint on the search space and compare the candidates directly. It's not exactly CS-101, but it's something you can do maybe two or three years in.
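A minimal sketch of that idea, assuming a grayscale image handed in as a 2D list of 0-255 pixel values (a real pipeline would use a library like Pillow for decoding and resizing; the function names here are made up for illustration):

```python
def average_hash(pixels, hash_size=8):
    """Downscale to hash_size x hash_size by block averaging, then
    threshold each cell at the global mean to get a 64-bit integer."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(hash_size):
        for c in range(hash_size):
            # Block of source pixels that maps onto this hash cell.
            rows = range(r * h // hash_size, (r + 1) * h // hash_size)
            cols = range(c * w // hash_size, (c + 1) * w // hash_size)
            block = [pixels[i][j] for i in rows for j in cols]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    bits = 0
    for v in cells:
        # One bit per cell: brighter than the mean or not.
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    """Bits that differ between two hashes; a small distance
    means 'probably the same image, lightly edited'."""
    return bin(a ^ b).count("1")

# A horizontal-gradient test image, and a copy with one pixel nudged:
img = [[j * 16 for j in range(16)] for _ in range(16)]
tweaked = [row[:] for row in img]
tweaked[0][0] += 5
print(hamming(average_hash(img), average_hash(tweaked)))  # tiny distance
```

The 64-bit hash is exactly the index described above: exact-match lookups catch most duplicates, and near-duplicates fall within a small Hamming radius of each other.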

Microsoft makes a big chunk of its money from government contracts. PhotoDNA came out of Microsoft Research.

AI-generated images are tied to an account: if not a credit card, at least an identity with IP/email/time, maybe a phone number for 2FA.

AI-generated images are almost certainly kept around. It is not hard at OpenAI's (i.e., Microsoft's) scale to retain a searchable corpus of all previously generated images.

People are using AI tools to make avatars.

None of those people are actually anonymous.
> how the fuck are you people still postings these abominations?
I'm not a huge fan of them, but it fits really well in this case. Sometimes they work great. Here is a more appealing drawing, so you can forget about the soyjaks.
chompette.jpe