Last week, Parliament passed a Bill making critical amendments to the Children and Young Persons Act, including greater protection for abused and neglected children and more support for families experiencing parent-child conflict.
Notably, the Act has been amended to extend identity protection for youth offenders from those aged under 16 to those under 18. Such protection restricts the publication and broadcast of any information relating to court proceedings that reveals their names, images, addresses, schools or other particulars.
The change helps to ensure that efforts to rehabilitate and reintegrate them are more effective, as I explained in a commentary for TODAY in 2015.
With two additional years of confidentiality for young offenders, we can derive considerable social and developmental returns. This is an especially enlightened move given that public identities assume an outsized presence in our digitally connected society.
However, an area that the Act does not explicitly provide for is children's digital rights. And this is an area in which I believe Singapore could do more.
A Google survey released earlier this year found that Singaporean children get their first Internet-connected device at the age of eight, below the global average of 10, making them among the youngest in the world to go online.
What this means is that we must pay closer attention to the digital rights of children, who are increasingly the target of commercial exploitation online. Principally, our society needs to ensure that children enjoy the right to privacy, protecting their data from being harvested and mined for profit.
The same Google survey found that while teachers here are worried about cyber bullying, parents are most concerned about their children's privacy and security online.
Children's privacy and security online warrant urgent, closer attention. Take the example of the Internet of Toys. In January 2018, Hong Kong-based electronic toy company VTech was fined US$650,000 by the United States Federal Trade Commission for failing to protect the privacy of children using its gadgets.
VTech sells electronic toys, and children use the company's Kid Connect app in conjunction with these toys. Through this app, VTech collected a vast trove of information, including children's names, contact information, photographs and audio files, without seeking consent from parents or even informing them.
A security breach revealed that the company had failed to secure such data, allowing hackers untrammelled access.
We need to hold companies to account for their data collection practices, especially those pertaining to children, by introducing requirements for greater transparency.
Companies must also be urged to present their terms and conditions in far more child-friendly and accessible language so that children know what they are signing up for.
Even for companies that do have adequate data protection measures, we must consider how such data is used to develop algorithms to create or serve up more content targeted at children.
Gaming sites, online shopping, content sharing and social media platforms profit significantly from the digital native market.
However, the companies behind them often fail to take into account the responsibilities they have towards these young users and simply find more ways to engage them, for longer periods, and with ever more alluring content that aligns with their commercial interests.
Young people find themselves irresistibly drawn to these platforms, often using their digital devices excessively.
The burden is then shifted to schools and parents who must develop strategies for young people to manage their use of digital media. In such a climate, we must exhort technology companies to do more to create a safe digital environment for young people.
In February this year, Instagram head Adam Mosseri met with the British Health Secretary Matt Hancock to discuss new measures the company would introduce to handle content promoting self-harm and suicide.
This meeting arose in part from the suicide of 14-year-old British teen Molly Russell. Her family believed she took her own life because her Instagram account was replete with material about depression, self-harm and suicide.
As her father put it: "Instagram helped kill my daughter". Instagram has since pledged to better identify content relating to self-harm and hide it behind "sensitivity screens".
Closer to home, a survey by international research agency YouGov found that a third of young adults here have self-harmed, with one in 10 doing so frequently. Young Singaporeans' exposure to such content (and its effects) should therefore be more closely investigated.
When our children go online, they are vulnerable to risks of commercial exploitation, exposure to inappropriate content, cyberbullying and online sexual abuse.
We must do our best to shield them from these dangers. Thankfully, Singapore has not experienced such high-profile incidents relating to the breach of children's digital privacy or online harm.
As avid users of technology however, young people in Singapore are similarly susceptible to such risks, even as we embrace the distinct benefits they can gain from the digital world.
Another key plank of young people's digital rights must therefore be ensuring access to digital literacy education that empowers them with the skills to minimise digital risks and maximise gains.
In this regard, Singapore has done well through efforts by our various ministries and agencies. On their part, technology companies can afford to do even more to initiate and boost public education efforts.
Our children are born into a heavily digitalised environment. Many have a digital footprint even before they have sprung from their mothers' wombs, with their ultrasound images being shared on social media. That is but the beginning of their digital journey.
In light of this prevailing reality, we must make a concerted effort to ensure that children's digital rights are respected, supported and championed. In 2014, the United Nations Committee on the Rights of the Child held a Day of General Discussion on digital media and children's rights.
It also recommended that member states adopt a national coordinating framework to oversee all efforts relating to children's digital rights and urged greater international cooperation on the same.
Given our Whole-of-Government approach to tackling our many national priorities, protecting and advancing children's digital rights with a national coordinating framework is a task Singapore can and should actively undertake.
While the Children and Young Persons Act does not explicitly provide for children's digital rights, the intense digitalisation we are experiencing suggests we may need to amend the law in future to do so.
Indeed, given our push to be a Smart Nation, Singapore can also be a key advocate of this movement on the world stage. This will certainly be in the interest of the healthy and positive development of our digital native children in Singapore and beyond.
Lim Sun Sun is Professor of Communication and Technology and Head of Humanities, Arts and Social Sciences at the Singapore University of Technology and Design. She is also a Nominated Member of Parliament.
This article was first published in TODAY on 11 September 2019. Information is correct at the time of publication.