Survey Finds 1 In 2 Singaporeans Have Experienced Online Harms Via Social Media: Josephine Teo


Social media platforms in Singapore are now required to implement community standards and content moderation processes to minimise users’ risk of exposure to harmful online content, under newly proposed codes.

“Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people,” Minister Josephine Teo wrote in a Facebook post. “Over the years, social media services have put in place measures to ensure user safety on their platforms.

“Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society.

“In March this year, I said that we will introduce new Codes of Practice to deal with harmful online content accessible by users in Singapore, especially our children.”

WHAT ARE THE TWO PROPOSED CODES OF PRACTICE?

Minister Josephine Teo gave an update on the proposed Codes, saying that progress has been made and that they have been developed for social media services.

She also shared that a survey conducted by the Sunlight Alliance for Action (AfA) in January 2022 found that 1 in 2 Singaporeans had personally experienced online harms, with teenagers and young adults forming the majority of those affected.

“Through various engagement sessions with over 300 stakeholders, we have also noted calls to develop more support mechanisms and infrastructure for victims of online harms,” she added.

The two proposed Codes are:

Code of Practice for Online Safety: We are proposing for designated social media services to have system-wide processes to enhance online safety for all users, especially young users. Younger users’ exposure to harmful or inappropriate content, including content promoting dangerous self-harming acts, should be minimised through measures such as community standards. Users who encounter or suffer from online harms – such as the non-consensual distribution of their intimate videos – should have access to reporting mechanisms to flag such content to social media services for appropriate action.

Content Code for Social Media Services: We are proposing to protect users from egregious harms – such as sexual harm and self-harm, or, as COVID-19 has shown, content that can threaten public health.

The proposed Code will allow IMDA to direct social media services to take action against harmful online content so as to protect users. “We acknowledge the efforts of the tech companies to better support their users, and are now engaging them on the Codes,” she added.

We will also continue to build on the good work of passionate individuals, such as the Sunlight AfA, and work towards closing the online safety gap.

She also wrote that she looks forward to public views when the public consultation exercise commences in July.
