
As Grok investigations begin, VPNs are looking more appealing than ever


Back in the early days of the internet and social media, we were very naive about our data (or I was, at the very least). Sure, we’d see those posts warning “Look out! Facebook owns every photo you upload”, but we didn’t turn to VPNs. We just shrugged and thought, “So what? That’s just a technicality. Mark Zuckerberg doesn’t care about our selfies”, little knowing that everything we posted, said, and did was being mined for information about us so that algorithms could manipulate us at the whims of the highest bidder.

Now, as the Information Commissioner’s Office (ICO) begins its investigation of Elon Musk’s X platform, we realise the truly chilling extent to which data is absorbed by these mega-corporations. Essentially, what’s happened is that X users have been using Grok to generate AI images of real women and children naked. Coming from the ultimate incel, it’s no wonder that Elon Musk would create the one thing incels all dream of – x-ray vision that lets you see anyone you want naked. It doesn’t matter that they find you utterly repulsive; Grok gives you all the power you ever wanted.

Even though it’s absolutely depraved, I know some people argue that it’s not so bad because it is all artificially generated and therefore not real. Setting aside the fact that if you drew a picture of someone you know naked without their consent and shared it publicly, you could easily face a sexual harassment charge (and much worse if they were a child), these AI-generated images are actually a lot more ‘real’ than most people realise.

Different websites don’t accumulate data on us in a vacuum – they’re constantly buying and selling it between each other. That’s why you might get an advert on YouTube related to a conversation you had with someone on WhatsApp. Now, consider this scenario. A woman (and I say ‘woman’ because it is women who have been disproportionately targeted) shares an intimate photograph with somebody through a messaging app, believing it will only be seen by the trusted person it was sent to. That photo is then stored as data, shared between the different platforms (without a human ever seeing it at this stage) and makes its way into the data pool Grok draws from. This means Grok users can potentially generate AI naked pictures of people that have been informed by real photos – and likely ones never intended for public consumption.

This gets even worse when you think about the pictures that have been generated of children. It is obvious that Grok’s data pool draws from the most sordid and disgusting illegal content on the internet, so these images are being modelled on very real abuse, and couldn’t exist without it.

In the words of William Malcom, the Executive Director of Regulatory Risk & Innovation at the ICO, “The reports about Grok raise deeply troubling questions about how people’s personal data has been used to generate intimate or sexualised images without their knowledge or consent, and whether the necessary safeguards were put in place to prevent this. Losing control of personal data in this way can cause immediate and significant harm. This is particularly the case where children are involved.”

So, with all your private data being mined from every angle and used to feed generative AI tools and advertising algorithms designed to manipulate you, the privacy and encryption that the best VPN services offer (like NordVPN, Proton VPN, Surfshark, CyberGhost, or ExpressVPN) are more appealing than ever. Our top recommendation is NordVPN – and with its 30-day money-back guarantee, you’ve got plenty of time to try it out before being locked in.

