Social Intermediaries: Why Are They Still Scot Free?

About the Article

We seldom pause to examine the root causes of the problems prevailing in society. With incidents of cyberbullying rising, and minors acting as both offenders and victims, we examine every sphere of society to identify the authorities that should be held liable. Against this backdrop, this paper sheds light on the responsibility of intermediaries, i.e., social media platforms, in promoting unsavoury content online. Our article focuses on determining the accountability of intermediaries for defamatory content online, while also questioning the individuals who create such content and their guardians.

Furthermore, we discuss the laxity in the provisions, as well as in the implementation, of the laws specifically concerning the offence of cyberbullying. The article then examines the safe harbour granted to intermediaries, which is not only misused but also contributes to the infamy of victims. The community guidelines framed by Facebook and Instagram to control the posting of defamatory content, and to prevent minors below the age of 13 from creating accounts, are noteworthy; this paper therefore brings their shortcomings to light as well. In conclusion, we lay down suggestions for the government, parents, and the legislature: the government should organize an investigative authority with which victims can lodge complaints, and the legislature should amend the cyber laws. Additionally, we recommend that intermediaries monitor the content posted by users and incorporate the ignore-report-block approach.

Determining the Accountability 

“Forwarding a message is equal to accepting the message and endorsing the message.”

 – Madras High Court

We live in an era where humanity is overshadowed by the misdemeanours of the public. It would be remiss of us to overlook the invisible hand of the social platforms, as intermediaries, in fuelling unsavoury activities online.

Innumerable questions arise after witnessing the mortifying “Bois Locker Room” incident, where images of teenage girls were morphed and objectified by teenage boys. The most important question is: who can be held solely liable for promoting such acts? Is the parents’ upbringing to blame, or have we simply become accustomed to a misogynist approach in which men objectify women? Or is it a generation that posts such content to draw the attention of many?

Unfortunately, all this while we have failed to expose and denounce the liability of the social platforms, and of the internet itself, in encouraging such gruesome acts. The internet, being a platform of anonymity, leaves us with many unanswered questions. The intermediaries have turned a blind eye to users who share content laden with preposterous innuendo. Through this article, therefore, we unravel the causes of cyberbullying and the lack of effective laws and guidelines concerning the intermediaries.

Juvenile Cyber Offenders: The Extent Of Their Liability

Before raising a voice to demand justice, the liability of every sphere of society should be determined. Foremost, the actions of juvenile cyber offenders are scrutinized under the Juvenile Justice (Care and Protection of Children) Act, 2015. The scope of the JJ Act, 2015 extends to offences committed by minors below 18 years of age. The Act does not penalise a minor as an “offender”; instead, it uses the term “child in conflict with law”, which refers to a “child who is alleged or found to have committed an offence and who has not completed eighteen years of age on the date of commission of such offence”. Therefore, all the penal provisions relating to cyber-crime apply to such children, but through the lens of the Juvenile Justice Board established under the JJ Act, 2015.

Under the Indian Penal Code, 1860, Sections 499 and 500 deal with the offence of defamation. Section 292 deals with acts such as the distribution, public exhibition, or circulation of an obscene book, pamphlet, or other material that generates lascivious thoughts and corrupts the minds of readers. Section 293 provides enhanced punishment for the sale, etc., of obscene material to any person under the age of twenty years. Under the Information Technology Act, 2000, Sections 67, 67A, and 67B deal with the electronic publication of sexually explicit and obscene content. Such offences are punishable with imprisonment of up to 7 years.

Apart from the IPC and the IT Act, 2000, other statutes such as the Indecent Representation of Women (Prohibition) Act and the Protection of Children from Sexual Offences (Amendment) Act, 2019 prohibit the indecent, offensive, obscene, and sexually explicit representation or publication of such content concerning women and children respectively. Beyond statutes, schemes like the ‘Cyber Crime Prevention against Women and Children (CCPWC)’ have also been approved by the Ministry of Home Affairs. Under this scheme, an online cyber-crime reporting portal has been launched to enable the public to report complaints pertaining to child pornography/child sexual abuse material, rape/gang-rape imagery, or sexually explicit content. The portal allows the public to lodge complaints anonymously or through its ‘report and track’ option.

Though the legislature has enacted satisfactory laws to deal with the publication of derogatory and obscene content, it still lacks specific provisions on cyber-porn and cyber-bullying concerning children. Notable attempts, however, can be spotted in the POCSO (Amendment) Act, 2019, which introduced the term cyber-pornography. The Government has also notified the POCSO Rules, 2020, which ensure more stringent rules and provide for a crackdown on the possession of pornographic material involving children. Incidents such as the “Bois Locker Room”, however, come as a disappointment to these rules.

Intermediaries: The Conduit Of The Mess 

Infamous incidents like the “Bois Locker Room” not only parade the offences of cyberbullying, defamation, sexual comments, and the sharing of derogatory content about minors by teenagers themselves, but also draw our attention to the liability of the intermediaries. To question that liability, it becomes important to understand their role in promoting such online content. According to the Information Technology Act, 2000, an intermediary ‘means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-marketplaces and cyber cafes’. The Act also describes their role: “the function of the intermediary is limited to providing access to a communication system over which information made available by third parties is transmitted or temporarily stored or hosted”.

According to the Department of Electronics and Information Technology, “Social Media in recent times has become synonymous with Social Networking sites such as Facebook or Microblogging sites such as Twitter. However, very broadly social media can be defined as any web or mobile-based platform that enables an individual or agency to communicate interactively and enables the exchange of user-generated content”. The intermediaries, which include the social media platforms, act as conduits for the transmission of user-generated content online.

Safe Harbour For Intermediaries

Every entity is governed by laws, and social media platforms, being entities of transmission, are governed by the IT Act, 2000. Section 79 of the Information Technology Act, 2000 exempts an intermediary from liability for any third-party information, data, or communication link made available or hosted by it. Under Section 79(2)(b), however, the exemption is available only when the intermediary does not initiate the transmission, select the receiver of the transmission, or select or modify the information contained in the transmission. The Act also provides for punishment of such platforms if the intermediary induces the transmission of an unlawful act, or if, even after gaining knowledge of the unlawful transmission, it fails to act upon it. Hence, if an intermediary fails to expeditiously remove or disable access to such material on its resource, without vitiating the evidence in any manner, it can be held liable under Section 507 of the IPC, which punishes criminal intimidation by anonymous communication, i.e., where the offender takes precautions to conceal the name or abode of the person from whom the threat comes, with imprisonment of up to two years.

In order to regulate the functioning of intermediaries in India, the government enacted the Intermediaries Guidelines Rules, 2011 under the “due diligence” requirement of Section 79(2)(c). Under these rules, intermediaries must publish the rules and regulations, privacy policy, and user agreement governing access to or usage of the intermediary’s computer resource by any person. Such rules must restrict any content that is harmful, obscene, harassing, blasphemous, defamatory, or pornographic. Further, upon obtaining knowledge of any unlawful content by itself, or upon being brought to actual knowledge by an affected person in writing or through an email signed with an electronic signature, the intermediary has to act within thirty-six hours and preserve the relevant information for 90 days. Intermediaries also have to report such cyber-crime incidents to CERT-In, the national agency responsible for performing functions in the area of cybersecurity.

Though there exist restrictions and regulations that intermediaries must mandatorily follow, their discretion to frame rules for themselves, combined with the minimal involvement of government agencies, adds to the problem.

Liability of Intermediaries 

The liability of the intermediaries can be explored through several concepts.

Firstly, the question of initiation or modification of transmission should be taken up. Initiation of transmission by platforms can be seen in the feature of automated responses, where the user selects from a set of pre-written replies. Further, a platform modifies the transmission when it applies Facebook’s community guidelines, which state that it “may remove language that incites or facilitates serious violence”. Their liability finds support in Doe v. MySpace, where, while discussing the liability of the MySpace platform, it was contended that by making users fill out a questionnaire the platform may initiate the transmission. It has also been held in various cases that even where intermediaries have minimal knowledge of illegal content, they can take action against the content owner without waiting for a court order; if they fail to take the required action, they can be held liable for it. The safe harbour of the intermediaries can thus be questioned on the basis of such self-generated responses.

Secondly, the question of ‘due diligence’ arose in K.N. Govindacharya v. Union of India. The court stated that under Rule 3(11) of the Intermediaries Guidelines Rules, 2011, intermediaries are required to publish the name of a grievance officer, along with his contact details and the mechanism by which a user, or any victim who suffers due to a violation of Rule 3, can notify a complaint or any other matter relating to the computer resource. The grievance officer has to redress the complaint within one month of its receipt. The court accordingly ordered the intermediaries, including the social media platforms, to comply with these provisions.


Thirdly, the minimum age at which minors are allowed to create an account on social media platforms should be questioned. Facebook, for instance, has devised a mechanism whereby the user, on creating an account, has to agree to its Statement of Rights and Responsibilities (“Statement,” “Terms,” or “SRR”), which is derived from the Facebook Principles and governs Facebook’s relationship with its users and others who interact with it. Minors thus succeed in creating profiles online by agreeing to the SRR or the terms of service of such a platform. In C.M.D. v. Facebook, Inc., it was held that Facebook’s SRR is an enforceable contract against minors who use Facebook unless those minors disaffirm the agreement. In India, however, contracts with minors are void ab initio. Under Section 10-A of the Information Technology Act, 2000, an electronic contract is valid and enforceable, the only essential requirement being compliance with the prerequisites of the Indian Contract Act, 1872. Hence, under Indian law, minors cannot validly agree to the terms and conditions of social media platforms, which contradicts the platforms’ minimum-age requirement.



Community Guidelines Of Facebook And Instagram

According to its Community Standards, Facebook states that its priorities are authenticity, safety, privacy, and dignity. The content restricted by it includes:

  • engaging in any sexual activity involving minors;
  • minors soliciting minors;
  • using its products and site functionality with the intention of sexualizing minors;
  • nudity or sexually suggestive content, hate speech, and credible threats;
  • content (including photos, videos, real-world art, digital content, and verbal depictions) that shows minors in a sexualised context.

Facebook also takes various initiatives to ensure compliance with these standards:

  • People can report potentially violating content, including Pages, groups, profiles, individual posts, and comments. Users are also given control over their own experience through the ability to block, unfollow, or hide people and posts.
  • Facebook may notify law enforcement when it believes there is a genuine risk of physical harm or a direct threat to public safety.
  • It primarily warns users against misuse of an account: a first violation attracts a warning, but continued violations of the policies may restrict the user’s ability to post on Facebook or lead to the profile being disabled.
  • Its main focus is enforcement against abusive accounts, both to prevent harm and to avoid mistakenly taking action on good accounts.

Despite such stringent guidelines and rules for users on these platforms, lax implementation remains an issue.


Facebook being the parent company of Instagram, the two platforms share their Community Guidelines.

Apart from following the above-mentioned guidelines, Instagram mentions that:

  • If an account is established with the intent of bullying or harassing another person, or if a photo or comment is intended to bully or harass someone, the affected user has the option of reporting that account.
  • Following a zero-tolerance policy, Instagram can remove any content that targets private individuals or harasses them with unwanted messages.
  • It also allows users to control the age of their content’s viewers by setting a minimum age limit for their account.

Instagram’s “sensitive content” filter was added to reduce the likelihood of users stumbling upon unwanted content. In pursuit of building a safe environment, the platform blurs sensitive images for its users. However, a user can still access the blurred content by requesting a security code via text from the company.

By enabling this feature, however, its provisions can be misused by account holders. The platform already fails to control the content posted by pages such as adult pages, and this loophole will only add to the misery of its users. Though Instagram has the freedom to remove any content or information that violates its Terms of Use (including the Instagram Community Guidelines), or where it is permitted or required to do so by law, misuse is bound to occur.


Conclusion

With the advent of the internet age, communicating thoughts has become an easy affair. With the growing success of social media platforms, however, the internet has become a critical hotspot for bullying. Stringent laws exist to arrest the growth of derogatory content online and to prohibit users from involving themselves in such acts, yet adherence to and effectuation of these laws remain rudimentary. Those who post such unsavoury content not only contribute to the ill repute or infamy of victims but also ignore the psychological trauma the victims go through. Victims have fallen prey to legislative neglect, shielded as it is by the vague and lenient laws on intermediary liability. The intermediaries should therefore realize the exigency of monitoring the content posted by their users, and there is a dire need to deter users from indulging in such destructive activities in cyberspace.



Suggestions

To The Legislature

  1. The legislature should come up with specific laws on cyberbullying, as cyberbullying through the indecent representation of minors is a serious offence even when committed by minors.
  2. The cyber laws should penalise lewd comments made by users of any age group and provide for banning the account of the offending user.


To The Government

  1. The government should form an investigative authority with which victims can lodge their complaints, with an assurance that the confidentiality of the complainant will be maintained.


To The Intermediaries

  1. Intermediaries such as Facebook and Instagram should not let obscene or objectionable messages spread through group chats or personal chats; for example, a policy of “You cannot send such content” could be implemented.
  2. An intermediary should be held liable for encouraging derogatory content, and its liability should be increased in such serious matters.
  3. Social media platforms should adopt the ignore-report-block approach, with more personalized settings to deal with sexual cyberbullying, so that minors receive apt and immediate support and the affected minors are kept properly updated.
  4. A separate authority should be deployed to scrutinize the platforms’ working and to handle all complaints made thereunder.
  5. Only users aged 16 years and above should be allowed to join such platforms. Along with this precaution, fake accounts should be removed by verifying the personal information of users.


To The Parents

  1. The parents of both the victim and the offender should be informed, and proper counselling should be carried out by experts to avoid recurrence; the idea is to prevent such incidents rather than merely criminalising them.
  2. Parents should encourage their children to talk about appropriate and inappropriate behaviour online. They can also take concrete measures, such as enabling parental controls on cell phones, to monitor teens’ online activities.
  3. Parents should impart knowledge of cyber-ethics to their children.


References

  1. Ministry of Women and Child Development, Digital Exploitation of Children, Press Information Bureau, Government of India.
  2. Ambika Pandit, Zero tolerance on child porn: Govt notifies new Pocso rules, Times of India.
  3. Department of Electronics and Information Technology, Framework & Guidelines for Use of Social Media for Government Organisations, Ministry of Communications & Information Technology, Government of India.
  4. The Gazette of India: Extraordinary, Notification: Intermediaries Guidelines Rules, 2011, Ministry of Electronics and Information Technology.
  5. CERT-In, Welcome to CERT-In.
  6. Michael S. Isselin, #StopImmunizing: Why Social Networking Platform Liability Is Necessary to Provide Adequate Redress for Victims of Cyberbullying, 61 N.Y.L. Sch. L. Rev. 369 (2016–2017).
  7. Facebook, Community Standards.
  8. Doe v. MySpace, 528 F.3d 413, 420 (5th Cir. 2008).
  9. MySpace Inc. v. Super Cassettes Industries Ltd, 236 DLT 478 (2017).
  10. K.N. Govindacharya v. Union of India, W.P.(C) 3672/2012 and CM Nos. 7709/2012, 12197/2012 and 6888/2013 (High Court of Delhi).
  11. Facebook, Statement of Rights and Responsibilities.
  12. C.M.D. v. Facebook, Inc., C 12-1216 RS (N.D. Cal. May 20, 2013).
  13. Facebook Help Centre, What types of things aren’t allowed on Facebook?
  14. Facebook, Community Standards: Safety.
  15. Facebook, Community Standards: Introduction.
  16. Alex Schultz, How Does Facebook Measure Fake Accounts?, Facebook.
  17. Instagram Help Centre, Privacy and Safety Centre: Community Guidelines.

BY- Aparna Gupta & Aditi Palit | AMITY LAW SCHOOL, DELHI (Affiliated to GGSIPU)
