MACRA calls for concerted efforts in child protection from harmful digital content

* A National Child Online Protection Strategy is being developed with technical assistance from ITU

* The strategy is of paramount importance as children are at the forefront of adopting and adapting to new connected technologies

By Andrew Magombo, MANA

The Malawi Communications Regulatory Authority (MACRA) has asked stakeholders to take collective responsibility for safeguarding children from exposure to harmful online content.


At a consultative meeting on the development of a National Child Online Protection Strategy in Lilongwe, MACRA Board member Malla Kawale said parents, teachers and society at large have a role to play in protecting children in the digital arena.

Kawale said the national strategy, which is being developed with technical assistance from the International Telecommunication Union (ITU), is of paramount importance as children are at the forefront of adopting and adapting to new connected technologies.

She expressed worry that, in this digital era, it has become commonplace for children to go online and join social media without any supervision from their parents.

MACRA Board member, Malla Kawale addressing the delegates

“While tapping into the positive impact of the internet, these children are also being exposed to harmful content and risks which have consequences both virtually and off-line,” Kawale said.

“We need views from the public on how we can combat the challenges because everyone is affected — hence it is our duty to ensure that their freedom of expression is also protected.”

She further highlighted that MACRA has taken several steps in dealing with online child abuse — including awareness campaigns and instituting legal frameworks to drive the already established national policy on Child Online Protection.

Two representatives of Kamuzu Academy

Some of the schools that participated in the workshop included Bwaila and Lilongwe Girls secondary schools and Kamuzu Academy from Kasungu.

Kamuzu Academy Head Girl Amanda Masi said the development of the National Child Online Protection Strategy is timely as technology is rapidly advancing.

“There are certain instances whereby our photos are shared online without consent, and there is also cyberbullying,” she said. “So, we hope the strategy will go a long way in protecting our future.”

Christopher Banda, head of the communication emergency response team at MACRA, said issues of online child abuse are global, as the advancement of the internet has drawn out inquisitive minds seeking to explore more.

“A child can learn a lot of things from a stranger that the parents are not aware of, and sometimes you only notice the manifestation of the behaviour change when the damage is already done,” he said.


The Electronic Transactions and Cybersecurity Act (2016) mandates MACRA to ensure that users of information and communication technology (ICT), including children, are protected from its undesirable impacts.

Last year, the UK’s communications watchdog Ofcom encouraged young people to report harmful online content after a survey indicated that two-thirds of the young people it reached had encountered potential harms on social media, but only one in six had reported them.

According to The Guardian, Ofcom found that 67% of people aged between 13 and 24 had seen potentially harmful content online, although only 17% reported it.

The report further said an online safety bill was being formulated which would require social media companies to protect children and adults from online harms.

“The most common potential harm encountered online was offensive or bad language (28%), according to respondents in Ofcom’s Online Nation 2022 report, followed by: misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%) and trolling (17%). A further 14% had experienced bullying, abusive behaviour and threats online.”

The Guardian also reported that Ofcom launched a campaign with TikTok influencer Lewis Leigh, supported by behavioural psychologist Jo Hemmings. TikTok removed more than 85 million pieces of content in the final three months of last year, with nearly 5% of that total coming from user referrals.

Instagram is reported to have removed more than 43 million pieces of content over the same period, of which more than 6% came from users reporting or flagging content.—Additional reporting by Duncan Mlanjira, Maravi Express