The European Union has been strict in enforcing its data protection rules and has been monitoring a wide range of platforms, including AI tools, for compliance. Now, Italy's Garante, the authority responsible for evaluating whether AI platforms comply with data privacy rules, has accused OpenAI's ChatGPT of breaching those rules, which could result in substantial penalties or even a ban on the AI tool. Here is what the Italian authorities said, in 5 points.
If the accusations are upheld, Microsoft-backed OpenAI could face serious consequences, including fines and a ban on the tool in the European region. Under the EU's General Data Protection Regulation (GDPR), a company found in breach can be fined up to 4 percent of its global annual turnover.
Significantly, things may get worse for AI companies and their tools, as EU authorities are still grappling with how to regulate this new technology, and Italian regulators have so far been among the most active and strict in enforcement.