Authored By: Mohammad Yamin Hoque
Bangladesh Army International University of Science and Technology
Abstract
The digital revolution has transformed how Bangladeshi children learn, communicate, and socialize, exposing them to cyber threats such as sexual exploitation, grooming, cyberbullying, and radicalization. The Cyber Security Ordinance, 2025, enacted on 21 May 2025, introduces provisions to address these threats, especially Section 25, which criminalizes digital child sexual abuse material (CSAM), revenge porn, and sextortion, with harsher penalties for offenses against minors. This article examines the legal framework and its limitations, and compares them with international best practices from jurisdictions such as the United Kingdom. While the Ordinance makes progress, such as tackling AI-generated exploitative content and introducing procedural safeguards, gaps remain in platform accountability, victim support, and prevention. The article argues that child protection requires a holistic approach that goes beyond criminalization to encompass digital literacy education, institutional mechanisms, technological solutions, and multi-stakeholder collaboration. It concludes with recommendations for legislative, policy, and implementation strategies to ensure a safer digital space for Bangladeshi children.
1. Introduction
In January 2024, a 14-year-old girl from Dhaka was blackmailed by an online predator who had obtained her private photographs through a fake social media profile.1 The perpetrator threatened to circulate the images unless she complied with further exploitative demands. This case, representative of hundreds reported annually to Bangladesh’s cybercrime units, illustrates the dark underbelly of digital connectivity that increasingly endangers Bangladeshi children. As of 2024, approximately 35 million Bangladeshis under 18 years have internet access, with smartphone penetration among urban adolescents exceeding 70%.2 While this digital inclusion offers immense educational and developmental opportunities, it simultaneously exposes minors to sophisticated cyber predators, explicit content, online harassment, and radicalization.
1.2 Historical Background:
The legislative response to these threats has evolved significantly over the past two decades. The Information and Communication Technology Act, 2006 provided initial provisions for cyber crimes but lacked child-specific protections. The Digital Security Act, 2018 introduced broader cyber crime definitions but faced severe criticism for enabling abuse through vague provisions that stifled freedom of expression. Its successor, the Cyber Security Act, 2023, existed briefly before being replaced by the current Cyber Security Ordinance, 2025, which was promulgated on 21 May 2025 with explicit objectives of ensuring cyber security while safeguarding fundamental rights including freedom of expression. The Cyber Security Ordinance, 2025 marks a paradigmatic shift in Bangladesh’s approach to child online safety. Section 25 introduces comprehensive definitions and enhanced penalties for sexual harassment, blackmailing, and obscene content involving children.3
1.3 Thesis Statement:
While the Cyber Security Ordinance, 2025 introduces robust provisions for child protection in cyberspace, particularly regarding sexual exploitation and AI-generated abuse material, effective implementation requires coordinated efforts across legal, technological, and educational domains, including platform accountability mechanisms, specialized institutional capacity, mandatory digital literacy education, and victim-centered support systems that extend beyond punitive measures to prevention and rehabilitation.
1 ‘Teen Girl Blackmailed After Online Predator Obtains Photos’ The Daily Star (Dhaka, 15 January 2024) https://www.thedailystar.net accessed 20 November 2025.
2 Bangladesh Telecommunication Regulatory Commission, Internet Subscribers in Bangladesh: January-December 2024 (BTRC 2024) 15.
3 Cyber Security Ordinance 2025, s 25.
2. Legal Framework for Child Protection in Cyberspace
The Constitution of the People’s Republic of Bangladesh provides foundational protections for children’s rights. Article 28(4) permits special provisions for women and children, enabling protective discrimination.4 The Children Act, 2013 constitutes Bangladesh’s primary legislative framework for child rights and protection. Section 2(17) defines a ‘child’ as any person under 18 years of age, aligning with the UN Convention on the Rights of the Child.5 While the Children Act primarily addresses offline harm, including trafficking, labour exploitation, and physical abuse, its principles extend to cyber exploitation. Section 71 criminalizes child pornography, though its provisions are less comprehensive than the digital-specific protections now found in the Cyber Security Ordinance.6
2.1 Child Protection Provisions under the Cyber Security Ordinance, 2025:
Section 25: Sexual Harassment, Blackmailing, and Obscene Content:
Section 25 constitutes the Ordinance’s primary child protection provision, addressing sexual harassment, blackmailing, and the distribution of obscene content through digital means. This section merits detailed analysis given its comprehensive scope and enhanced penalties for crimes against minors. Section 25(1) criminalizes the intentional or knowing use of websites or digital/electronic means to engage in blackmailing, sexual harassment, revenge porn, or sextortion, or to create, obtain, store, transmit, publish, or broadcast information, videos, images, audio-visual content, still images, graphics, or digitally captured data, including AI-generated or AI-edited content, that is harmful or intimidating.7
The penalty structure reflects graduated seriousness:
- General offense: imprisonment up to 2 years or fine up to 10 lakh taka, or both8
4 Constitution of the People’s Republic of Bangladesh 1972, art 28(4).
5 Children Act 2013, s 2(17).
6 Children Act 2013, s 71.
7 Cyber Security Ordinance 2025, s 25(1).
8 Cyber Security Ordinance 2025, s 25(2).
- Offense against women or children under 18: imprisonment up to 5 years or fine up to 20 lakh taka, or both9
This enhanced penalty for child victims represents a significant legislative acknowledgment of children’s particular vulnerability and the heightened harm they suffer from cyber exploitation.
Definition of Digital Child Sexual Abuse Material:
Section 2(1)(থ) provides an extensive definition of “Digital Child Sexual Abuse Material” (DCSAM) that aligns with international standards while addressing emerging technologies:
- Depicting Sexual Acts: Material that visually, aurally, textually, or otherwise depicts or describes real or simulated sexually explicit activity, sexual organs, sexual exploitation or abuse, sexual services, or sexual communication involving a child as defined under Sections 2(17) and 4 of the Children Act, 2013.10
- Soliciting Child Participation: Material that incites, excites, encourages, or directs a child to engage in or observe real or simulated sexual acts, display sexual organs, participate in sexual exploitation or abuse, provide sexual services, engage in sexual communication with the child or others, or assist in other sexual offenses defined under applicable law (including payment for sexual services), control a child for sexual exploitation, or groom a child for sexual purposes.11
- Inciting Third-Party Exploitation: Material that through incitement, excitement, encouragement, threats, or instructions causes any person through any means to cause a child to engage in or observe real or simulated sexual acts, display sexual organs, participate in sexual exploitation or abuse, provide sexual services, engage in sexual communication, assist in other sexual offenses (including payment for sexual services), control a child for sexual exploitation, or groom a child for sexual purposes.12
3. Critical Weaknesses and Legislative Gaps
- No Platform Accountability: The Ordinance imposes no duties on social media platforms, messaging services, or content hosts. International best practices increasingly recognize that intermediaries, not just end-users, must take responsibility for child safety. The UK’s Online Safety Act 2023 requires platforms to prevent children from encountering harmful content, implement age verification, and report CSAM to authorities, with multi-million-pound fines for non-compliance. The Ordinance lacks equivalent provisions, leaving platforms free to maintain minimal safety measures.
9 Cyber Security Ordinance 2025, s 25(3).
10 Cyber Security Ordinance 2025, s 2(1)(থ)(অ).
11 Cyber Security Ordinance 2025, s 2(1)(থ)(আ).
12 Cyber Security Ordinance 2025, s 2(1)(থ)(ই).
- Limited Victim Support Provisions: The Ordinance is almost entirely punitive, lacking provisions for victim rehabilitation, counseling, or compensation. Section 30 permits tribunals to order compensation from fines imposed on offenders, but this is discretionary and contingent on conviction. Research consistently shows that child victims of sexual exploitation require specialized psychological support, yet the Ordinance establishes no government-funded counseling programs or victim advocate systems.
- Absence of Dedicated Child Online Safety Helpline: The Ordinance establishes no 24/7 helpline for reporting child exploitation or seeking assistance. International models such as the UK’s Child Exploitation and Online Protection Centre (CEOP) or Australia’s eSafety Commissioner’s reporting portal provide accessible, child-friendly mechanisms for reporting abuse. Without such infrastructure, Bangladeshi children and parents lack clear pathways to report concerns or seek help.
- Inadequate Cross-Border Enforcement Mechanisms: Section 48 references the Mutual Legal Assistance Act, 2012 for international cooperation, but provides no specific mechanisms for rapid response to foreign-hosted CSAM or coordination with international databases such as the International Child Sexual Exploitation (ICSE) database maintained by Interpol. Given that most social media platforms and content hosts are foreign entities, this limits practical enforcement.
- No Possession Offense for CSAM: The Ordinance criminalizes creating, obtaining, storing, transmitting, publishing, and broadcasting CSAM but does not explicitly criminalize simple possession, a gap that may create prosecution challenges. International instruments including the Optional Protocol on the Sale of Children explicitly require criminalization of possession, recognizing that consumers of CSAM drive demand for production.
- Undefined “Harmful or Intimidating” Standard: Section 25(1) criminalizes content that is “harmful or intimidating” but provides no definition or objective standards. This vagueness could enable arbitrary application or, conversely, make prosecution difficult
when defendants argue that the material does not meet undefined thresholds. More explicit legislative guidance or judicial interpretation will be necessary.
- Resource and Capacity Constraints: Even well-drafted legislation fails without implementation capacity. The Ordinance establishes various institutions but does not guarantee adequate funding, personnel, or training. Section 7 states that the Agency shall have “necessary personnel as per the organizational structure approved by the government,” but this is aspirational rather than prescriptive. Without dedicated budgetary allocations, these institutions risk being under-resourced.
4. Comparative Analysis: International Best Practices
4.1 United Kingdom: The Online Safety Act, 2023
The UK represents a pioneering approach to platform regulation through its Online Safety Act, 2023, which fundamentally shifts responsibility for user safety from individuals to service providers.13 Rather than relying solely on criminal law, the Online Safety Act establishes a comprehensive regulatory framework overseen by Ofcom (Office of Communications).14 The Act imposes “duties of care” on platforms to protect users, particularly children, from harmful content.
4.2 Lessons for Bangladesh: Bangladesh could adopt several elements:
- imposing duties of care on platforms operating in Bangladesh,
- requiring transparency reporting,
- mandating age assurance for child-accessed services, and
- establishing meaningful financial penalties for non-compliance.
BTRC could be empowered to issue and enforce online safety codes similar to Ofcom’s regulatory model.
5. Recommendations and the Path Forward
- Amend the Ordinance to Establish Platform Duties of Care
- Criminalize Simple Possession of CSAM
13 Online Safety Act 2023 (UK).
14 Online Safety Act 2023 (UK), s 3 (establishing Ofcom as the regulator).
- Establish Mandatory Reporting Obligations
- Establish a Child Online Safety Division within the National Cyber Security Agency
- Expand Digital Forensic Lab Capacity
- Deploy Network-Level CSAM Blocking
6. Conclusion
The Cyber Security Ordinance, 2025 provides a foundation. Whether a comprehensive ecosystem for child online safety is built upon that foundation will determine whether Bangladesh’s digital future is one of opportunity or exploitation for its youngest citizens. The recommendations outlined in this article offer a roadmap. Implementation requires sustained commitment, adequate resources, and political will. But the stakes could not be higher: the safety, development, and dignity of millions of Bangladeshi children. Bangladesh has the opportunity to become a regional leader in child online safety. Seizing that opportunity begins now.
Bibliography
Statutes
- Constitution of the People’s Republic of Bangladesh 1972
- Children Act 2013 (Act No 24 of 2013)
- Cyber Security Ordinance 2025 (Ordinance No 25 of 2025)
International Instruments
- Convention on the Rights of the Child (adopted 20 November 1989, entered into force 2 September 1990) 1577 UNTS 3
- Optional Protocol to the Convention on the Rights of the Child on the sale of children, child prostitution and child pornography (adopted 25 May 2000, entered into force 18 January 2002) 2171 UNTS 227
Books
- Yaman Akdeniz, Internet Child Pornography and the Law: National and International Responses (Ashgate 2008)
- Julia Davidson and Petter Gottschalk, Internet Child Abuse: Current Research and Policy (Routledge 2011)
- Sonia Livingstone and others, Children’s Online Risks and Opportunities: Comparative Findings from EU Kids Online and Net Children Go Mobile (EU Kids Online 2014)
Journal Articles
- Ethel Quayle and Max Taylor, ‘Child Pornography and the Internet: Perpetuating a Cycle of Abuse’ (2002) 23 Deviant Behavior 331
- Richard Wortley and Stephen Smallbone, ‘Child Pornography on the Internet’ (2006) Problem-Oriented Guides for Police, Problem-Specific Guides Series No 41