
Commentary: Online spaces should be built with children, not for them

SINGAPORE: At this year’s National Day Rally, Prime Minister Lawrence Wong highlighted the growing concern of excessive screen time for children.

Mr Wong emphasised the need to strike a balance between protecting youth and empowering them to leverage technology, noting that Singapore is observing the experience of countries that have restricted children’s access to social media platforms.

Australia’s recent decision to block YouTube for users aged 16 years and younger has reignited debate over age-based social media bans.

Exempted originally because of its educational value, the platform will now be grouped with TikTok, Instagram and other platforms that, according to the Australian government, need tighter controls to keep children safe.

What prompted the reversal? A survey by Australia’s internet regulator, in which four in 10 children reported seeing harmful content on YouTube.

NOT JUST CONTENT, BUT SYSTEMS
However, harmful content is only one part of the problem. What shapes a child’s digital experience is the underlying infrastructure: autoplay, algorithmic recommendation and infinite scroll.

These systems are designed to keep users engaged, nudge behaviours and maximise screen time. They trap users into doing things they didn't mean to: spending longer on an app, consuming harmful content served by autoplay, or making purchases.

When “risky-by-design” features undermine online safety, age-based bans begin to look less convincing. A hard stop at age 16 might simplify enforcement, but it obscures the more complex reality.

Current design does not care whether a user is 15 or 16; the algorithms work the same. Tech companies must be held to higher standards of safety especially when their platforms shape how children learn, socialise and play.

LEARNING HOW TO SWIM SAFELY
In Singapore, researchers from the National Institute of Education recently weighed in on a related question: Should schools ban digital devices?

The researchers argued that denying students access to digital devices out of fear is akin to refusing to teach them to swim for fear of drowning. This echoes what Mr Wong said at the National Day Rally – that Singapore must help children build digital resilience, not shelter them indefinitely.

Instead of denying access, we should be equipping young people with skills to self-regulate their digital use and to swim safely.


Regulatory momentum is growing in the Asia-Pacific region. Governments are trialling technology that keeps underage children out of restricted sites, and requiring greater transparency in the form of safety audits. Platforms like YouTube are rolling out AI-driven tools to verify user age and reduce exposure to certain content.

These are steps in the right direction. But the pattern remains familiar: a reactive cycle of restrictions, appeals and patchwork solutions.

When one platform is regulated, another emerges. Rules are drawn up in response to high-profile incidents, not guided by long-term strategy.

There is no denying the need for safeguards. But if bans and age restrictions are all we have, we are playing a game of catch-up that cannot be won.

We need a different approach. We should not treat children as victims requiring protection, but as participants in policy design.

PLANNING WITH, NOT ONLY FOR, CHILDREN AND YOUTH
Children and youth are the largest group of online users. But when UNICEF engaged adolescents around the world on AI, one noted that “most of the technologies that exist are not made with children in mind”.

The message from the teens was clear: They want to be engaged in AI’s development and future.

Consulting with youth will help us better anticipate and shape a fast-changing digital environment.

For instance, UNICEF is currently engaging children from Spain to Sierra Leone on how governments, tech companies, and caregivers can better uphold their best interests online. Through a series of hands-on workshops, children are being invited to explore how digital systems influence their lives, and how they want those systems to function.

Closer to home, the National Youth Council is collaborating with youth through the #TechHacks youth panel to examine how online harm can be mitigated and digital well-being, enhanced.

Of course, co-designing with children and young people is not without challenges. Power asymmetries, varying cognitive development stages and implementation complexity must be managed with extra care.

But these are not reasons to avoid participation. With the right support and structure, young people can meaningfully participate in shaping systems that will shape them in return.

CHANGING THE FRAME
If we are to match the pace of digital change, we must update not just our policies, but our posture – from gatekeeping to partnering with youth.

We will need to rethink what digital maturity looks like. Instead of asking, “How do we keep children off these platforms?”, we could ask: “What would these platforms look like if children had a say in how they worked?”

Instead of measuring safety by access restrictions, we could consider: What skills, values and supports help young people flourish in a world shaped by intelligent systems?

Instead of building around children, we could try building with them.

Regulation is an important start, but it’s not enough. What is needed now is a longer view – one that begins with the recognition that children will not just live in the digital future. They will build it.

As Mr Wong reminded us, every generation faces anxiety over new technologies and some of those fears, like those levelled at comics and rock and roll, have faded with time. The goal is not to dismiss concerns, but to respond with insight rather than instinct.

Chew Han Ei is Senior Research Fellow and Head of Governance and Economy at the Institute of Policy Studies, National University of Singapore.
