Nandini Yadav, May 26, 2021 at 8:54:53 AM
On 25 February 2021, the Ministry of Electronics and Information Technology (MeitY) issued new guidelines for social media companies, over-the-top (OTT) platforms, and digital media publishers, along with a three-tier grievance framework. At the time of the announcement, the ministry gave social media platforms a three-month deadline to comply with the new guidelines. That deadline was Tuesday, 25 May 2021.
If social media platforms such as Facebook, Instagram, and Twitter fail to comply by the deadline, the government can initiate criminal proceedings against them.
"Our goal is to comply with the provisions of the IT rules, and we continue to discuss a few issues that need more engagement with the government. Pursuant to the IT rules, we are working to implement operational processes and improve efficiencies. Facebook remains committed to people's ability to express themselves freely and safely on our platform," a Facebook spokesperson told Tech2 in a statement.
We have also reached out to Instagram for more details.
Twitter declined to comment.
New digital code of ethics
With the new digital code of ethics, the government aims to establish a "progressive, soft-touch institutional mechanism with a level playing field." Union IT and Communications Minister Ravi Shankar Prasad said the guidelines for digital media intermediaries and the code of ethics are meant to curb misuse of social media platforms and streaming services, including requirements to disclose the first originator of mischievous information and to remove, within 24 hours, content that depicts nudity of women or morphed images.
(Also read: India enforces IT rules to regulate digital content, but the new rules may not withstand legal scrutiny.)
According to the guidelines:
1. Social media platforms must establish a grievance redressal mechanism and appoint a grievance officer. The grievance officer must acknowledge a complaint within 24 hours and resolve it within 15 days.
2. For complaints concerning the dignity of users, especially women (exposure of private parts, nudity, sexual acts, impersonation, etc.), the platform must remove the material within 24 hours of the complaint being filed.
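The timelines above can be sketched as a simple deadline calculator. This is an illustrative sketch only: the function name, the `dignity_complaint` flag, and the example timestamp are assumptions for demonstration, not part of the rules.

```python
from datetime import datetime, timedelta

# Timelines stated in the guidelines:
# - acknowledge every complaint within 24 hours
# - resolve it within 15 days
# - remove content affecting user dignity within 24 hours of the complaint
ACK_WINDOW = timedelta(hours=24)
RESOLVE_WINDOW = timedelta(days=15)
REMOVAL_WINDOW = timedelta(hours=24)

def complaint_deadlines(filed_at: datetime, dignity_complaint: bool = False) -> dict:
    """Return the latest permissible time for each required action."""
    deadlines = {
        "acknowledge_by": filed_at + ACK_WINDOW,
        "resolve_by": filed_at + RESOLVE_WINDOW,
    }
    if dignity_complaint:
        # Nudity, impersonation, etc. must also be taken down within 24 hours.
        deadlines["remove_content_by"] = filed_at + REMOVAL_WINDOW
    return deadlines

# Hypothetical complaint filed on the morning after the compliance deadline.
print(complaint_deadlines(datetime(2021, 5, 26, 9, 0), dignity_complaint=True))
```

For a complaint filed at 9 am on 26 May, the platform would have until 9 am on 27 May to acknowledge it (and to remove dignity-related content) and until 10 June to resolve it.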
Key guidelines for social media intermediaries
- Appoint a chief compliance officer (resident in India) responsible for ensuring compliance with the Act and the rules.
- Appoint a nodal contact person (resident in India) for 24/7 coordination with law enforcement agencies.
- Appoint a resident grievance officer to administer the grievance redressal mechanism. These intermediaries must also publish a monthly compliance report detailing the number of complaints received and whether/how they were resolved.
Social media platform guidelines
- At the request of a court or the government, social media platforms are required to disclose the first originator of mischievous tweets or messages.
- Social media platforms must provide a mechanism for voluntary verification of users.
OTT platform guidelines
- Digital news media and OTT platforms must disclose details of where and how they publish their content.
- A grievance redressal system must be in place for digital and OTT platforms.
- A self-regulatory body, headed by a retired Supreme Court (SC) or High Court (HC) judge, must be established.
In addition, the Centre has said it will create a grievance portal where anyone with a complaint about OTT or digital media content can file it. A complaint is first forwarded by the Centre to the publisher concerned. If the complainant is not satisfied with the publisher's response, they may escalate the matter to the self-regulatory body established for publishers, and from there appeal to the central government.
What does the new Digital Code of Ethics mean for users?
The digital code of ethics contains some provisions that may affect end users.
- Social media platforms that offer messaging services, such as WhatsApp and Messenger, must enable traceability of the first originator of a message under the guidelines. This could compromise end-to-end encryption. Notably, the rule does not require platforms to disclose the contents of messages, but even tracing information about the first originator may affect user privacy.
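One mechanism that has been discussed publicly for originator traceability is hash matching: the platform records a hash of each newly composed message alongside the sender's identity, and later matches the hash of a flagged message against that record. The sketch below is a conceptual illustration under that assumption, not a description of how WhatsApp or Messenger actually work; all names in it are hypothetical.

```python
import hashlib

# Hypothetical registry: message hash -> ID of the first sender.
# A genuinely end-to-end encrypted service keeps no such registry,
# which is exactly why traceability mandates are contentious.
originator_registry: dict = {}

def record_new_message(sender_id: str, plaintext: bytes) -> None:
    """Record the first originator of a message the first time it is sent."""
    digest = hashlib.sha256(plaintext).hexdigest()
    # setdefault: forwards of an already-seen message do not overwrite
    # the original sender.
    originator_registry.setdefault(digest, sender_id)

def trace_originator(plaintext: bytes) -> str:
    """Given a flagged message, look up who first sent it (if known)."""
    return originator_registry.get(hashlib.sha256(plaintext).hexdigest())

record_new_message("alice", b"viral forward")
record_new_message("bob", b"viral forward")  # a forward: originator stays alice
print(trace_originator(b"viral forward"))    # prints "alice"
```

Even though no message content is disclosed here, the registry itself links identities to messages, which illustrates why critics argue such traceability is hard to reconcile with strong end-to-end encryption.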
- The guidelines require social media intermediaries to deploy "automated tools" to identify and remove content, particularly content related to child sexual abuse and depictions of rape. This raises the risk of "function creep", where information collected for one stated purpose is later used for another.