UNICEF Innovation Fund Seeks Startups Offering Tech Solutions to Children’s Online Safety
In an increasingly digital age that exposes children to numerous risks, UNICEF is looking to fund startups working to protect them.

The pandemic that has changed life as we know it has made ‘disconnecting’ nearly impossible amid remote work and virtual school. With screen time up across all age groups, children face heightened risks of cyberbullying, harmful content and data privacy violations, among other issues. That’s why the UNICEF Innovation Fund 2020 has chosen to focus on online child safety this year.
In conjunction with the Global Partnership to End Violence Against Children and Giga initiatives, the fund is looking to invest in startups using fourth industrial revolution technology to address the risks children face.
Startups using machine learning (ML), artificial intelligence (AI), blockchain, or extended reality (XR, spanning virtual and augmented reality) that offer, or plan to offer, their product open source should consider applying for the $100k equity-free investments made by UNICEF.
The categories are broken into four overarching themes of digital risks to children: Content, Contact, Conduct and Contract Risks.
Content Risks
Are you building tools and models to make online content, social media and gaming platforms, and other services safe for children? Or are you using frontier technologies to tackle inappropriate content?
- Using data science and AI to identify and analyse hateful content
- Leveraging blockchain to verify online content
- Platform-agnostic tools to identify and flag inappropriate online content on websites that cater to children
- ML applications to advise and caution children about age-inappropriate content
Contact Risks
Are you building platforms and tools to prevent online child abuse and exploitation? Or are you generating insights to assess and mitigate the threats and harms in digital environments?
- Tools to detect and stop the live-streaming of child sexual abuse (usually referred to as live-streaming of CSEA)
- Tools to block adults’ access to children for the purpose of sexual abuse on digital platforms (usually referred to as online sexual grooming or solicitation)
- Platforms that directly target online child sexual offenders and adults with a sexual interest in children (e.g. flagging those accounts)
- Using ML/AI to detect, remove and report images and videos with sexual content involving children and adolescents (often referred to as child sexual abuse material, or CSAM)
Conduct Risks
Are you leveraging existing and new technologies to educate children and young people about digital risks and about appropriate, safe behaviors in digital environments?
- Game-based educational tools and guidance for children to learn about the concepts of privacy, respect and sharing of content online
- Platforms to support and educate parents/guardians to keep children safe online
- Using chatbots to support victims of bullying and harassment and facilitate reporting of abuse
- Using ML/AI to monitor and model potential risks to children (mindful of ethical data collection, privacy laws and the age-appropriate developmental needs of children)
Contract Risks
Are you creating tools and platforms that leverage new technologies to protect children’s and other users’ data online? Or are you identifying and blocking inappropriate commercial platforms?
- XR solutions that teach children data literacy skills at scale and support employee training programmes on use of children’s data
- Tools that protect children’s data by giving children, or other trusted entities, ownership of that data by default and control over access to it
- Mechanisms to review and provide legitimacy to information shared online
- Creating trusted collections of information and content curated and voted on by verified sources against transparent criteria
To learn more about the application process and eligibility criteria, see the UNICEF Innovation Fund website.