SINGAPORE: There has been a rise in the amount of harmful content Singaporeans encounter on social media platforms, a government survey has found.
While cyberbullying and sexual content remained the most common, there was a significant climb in content that incites racial or religious tension, as well as in violent content, said the Ministry of Digital Development and Information (MDDI) in a press release on Thursday (Jul 25).
Social media also carried more harmful content than other platforms, such as messaging apps, search engines and gaming platforms, the survey found.
MDDI, previously known as the Ministry of Communications and Information, conducted the annual Online Safety Poll in April this year.
It surveyed 2,098 Singapore respondents aged 15 and above to understand their experiences with harmful online content and the actions they took to address it.
The poll covered the social media services designated by the Infocomm Media Development Authority (IMDA) under the Code of Practice for Online Safety, namely Facebook, HardwareZone, Instagram, TikTok, X and YouTube.
INCREASE IN HARMFUL CONTENT

About three-quarters (74 per cent) of those polled encountered harmful content online, an increase from 65 per cent last year.
Two-thirds of respondents (66 per cent) encountered the content on these designated social media platforms, up from 57 per cent last year.
In comparison, 28 per cent came across such content on other platforms, such as messaging websites and apps, search engines, email, news websites, gaming platforms and app stores, said MDDI. This was similar to last year’s level.
Cyberbullying and sexual content remained the most common types of harmful content on social media, with 45 per cent of respondents encountering them.
However, there was a “notable increase” from last year in encounters with content that incites racial or religious tension (a 13 per cent increase) and violent content (a 19 per cent increase), said MDDI.
Close to 60 per cent of respondents came across harmful content on Facebook, while 45 per cent encountered it on Instagram. Both platforms are owned by Meta.
“While the prevalence of harmful content on these platforms may be partially explained by their bigger user base compared to other platforms, it also serves as a reminder of the bigger responsibility these platforms bear,” said MDDI.
DEALING WITH ONLINE HARMS

When they took action against harmful social media content, only a quarter of respondents reported it to the platform. About one-third blocked the offending account or user.
Eight in 10 of those who made reports experienced issues with the reporting process, noted MDDI.
These included the platforms not removing the content in question or disabling the account responsible, not providing an update on the outcome, and allowing removed content to be posted again.
However, six in 10 respondents simply ignored the harmful content without taking further action.
Commonly cited reasons included respondents not seeing the need to do anything, being unconcerned about the issue, or believing that making a report would not make a difference.
“Given the complex, dynamic and multi-faceted nature of online harms, the government, industry, and people must work together to build a safer online environment,” MDDI said.
Amendments to the Broadcasting Act took effect in February last year, allowing the government to quickly disable access to egregious content on the designated social media platforms.
The Code of Practice for Online Safety also came into effect in July last year, requiring the platforms to take steps to minimise children’s exposure to inappropriate content.
The platforms are due to submit their first online safety compliance reports by the end of this month, said MDDI.
“It will provide greater transparency to help users understand the effectiveness of each platform in addressing online harms. The IMDA will evaluate their compliance and assess if any requirements need to be tightened,” said the ministry.
Minister for Digital Development and Information Josephine Teo speaking at the Reuters NEXT conference in Singapore on July 9, 2024. (Photo: REUTERS/Ore Huiying)
Earlier this month, Minister for Digital Development and Information Josephine Teo also announced a new code of practice for app stores, requiring them to implement age assurance measures. More details will be shared in due course, said MDDI.
“Beyond the Government’s legislative moves, the survey findings showed that there is room for all stakeholders, especially designated social media services, to do more to reduce harmful online content and to make the reporting process easier and more effective,” it said.
The ministry also urged users to do their part by proactively reporting harmful online content to the respective platforms.
Workshops, webinars, and family activities are also being organised as part of the IMDA’s Digital for Life movement, to provide users with knowledge and tools to keep themselves and their children safe online, said MDDI.