AI holds massive potential for malicious use, but who will be held accountable?

While Generative AI is rapidly advancing, it also raises concerns about its potential to be used maliciously.

By: HT TECH
| Updated on: May 25 2023, 13:27 IST

While generative AI is rapidly advancing, it also raises concerns about its potential for malicious use. Generative artificial intelligence (AI) models may be susceptible to bias, as they learn patterns and generate output and predictions based on the data they are trained on; if the training data is biased or incomplete, the model's output can be incorrect or biased as well. And because AI language models can generate human-like text and can be trained to impersonate the writing style of particular humans, there are also serious concerns about their potential misuse for spreading fake news.

Another interesting question is whether Generative AI platforms, as intermediaries, can claim safe harbour for the content published on them. It is important to observe that, unlike search engines, which only provide links to webpages and content available on the internet, Generative AI processes available data and generates an independent output. Hence, it may be difficult for all Generative AI platforms to be categorised as intermediaries under the law. Moreover, since varied parties are involved in the ChatGPT / Generative AI (GAI) ecosystem (third-party data owners, GAI companies, platform providers, and users), there could be multiple IP claimants, and the ownership rights in the output generated from such systems are therefore highly contentious.

Moreover, there is limited guidance on, or obligation regarding, the accountability of a GAI system and the way its output has been arrived at, which could lead to issues of bias, accountability, and explainability. Additionally, the protection of user data and user rights is complex: it may not be possible to seek user consent when data is scraped from the internet. In such scenarios, implementing user rights to correction, erasure, and portability, among others, becomes challenging.

Lastly, with the advent and wider acceptance of GAI in our daily lives, human-generated content could become a scarce commodity, hence more valuable.

By Huzefa Tavawalla, Head, Disruptive Technologies Practice Group, Nishith Desai Associates

NOTE: The views expressed are those of the author and do not necessarily reflect the opinions of HT Tech.

First Published Date: 25 May, 13:26 IST