
AI: Will artificial intelligence no longer be able to have 'emotional' conversations? Find out why China has tightened its control.


Amid rapid advances in Artificial Intelligence (AI), China has taken a significant step. On Saturday, China's cyber regulator released draft regulations for AI services that mimic human behavior and form emotional connections with users. The stated objective is to ensure safety and ethics in the rapidly growing consumer AI market.

What do the new rules entail?
According to the proposed regulations, this law will apply to all AI products and services available to the general public in China, particularly those that imitate human personalities, pretend to think and converse like humans, and emotionally connect with users through text, audio, images, or videos.

Increased responsibility for companies
The draft sets out several strict obligations for AI service providers. Companies must ensure that users do not become addicted to AI and must warn them against "excessive use." Providers will be required to assess users' mental state and emotions; if a user appears overly emotional or is developing a dependence on AI, the company must intervene immediately. Companies will also be responsible for security across the entire product lifecycle, including algorithm review, data security, and the protection of personal information.

"Red lines" established.
China has also drawn strict "red lines" on content. Under the draft, AI services may not generate any content that threatens national security, spreads rumors, or promotes violence and obscenity. The regulations have been released for public comment, after which they are expected to be enacted into law.

Disclaimer: This content has been sourced and edited from Amar Ujala. While we have made modifications for clarity and presentation, the original content belongs to its respective authors and website. We do not claim ownership of the content.