Who is responsible for Social Media Etiquette?

There’s an ongoing debate about who exactly coined the term “social media”. According to an article by Forbes, three candidates have claimed to have used the term first, but there is no reliable indication of who was the first ever to say those two words. With various social media sites sprouting lately, it no longer matters who coined the term; what matters more is whether you are up to date on the latest trends and means of digital socialising. As with all great inventions, it boils down to how we choose to explore its many merits. In a recent survey of 1,080 nationally representative Malaysians, Vase.ai looked into social media usage, perceptions of social media regulation, and reactions to the mass shooting at the mosques in Christchurch.

Perception of the Role of Social Media Companies in Social Media Regulation

Given the alleged affiliation of some social media accounts with terrorist activities, netizens have raised questions about the accountability of social platforms. Who is responsible? According to our recent survey, the majority (82%) believe that social media companies should be responsible for preventing racial, religious and gender prejudice, as well as the distribution of sexual or violent content. Almost all (92%) also believed that it was the social media platforms’ responsibility to keep content safe for consumption by all viewers.

Forbes recently shared insights on how monopolisation takes place on social media platforms, leaving the content netizens see saturated and narrow.

Giant corporations and media owners leverage social media algorithms to hold a monopoly over information and data. Netizens may only see a fraction of the news, or only what media companies choose to share at large. In this way, the opinions and perceptions of netizens are easily compromised and controlled. Additionally, sharing and disseminating data on social media is perceived as less consequential because of its degree of anonymity. Only a fraction (20%) of those who believe social media companies should be held responsible say that the companies are currently doing enough, while a majority (71%) believe there is some, but not enough, effort to prevent racial, religious and gender prejudice, as well as the distribution of sexual or violent content.


Are social media companies doing enough to prevent racial prejudice, religious prejudice, gender prejudice, sexual or violent content distribution?

The public’s concern for a safe social media environment is further evidenced by the majority (76%) of respondents who believe that social media companies should have the authority to control what is posted on social media.


Are social media companies doing enough to keep content safe for consumption/viewers?

Respondents were also in favour of social media companies taking down content that invokes hate, prejudice, violence or sex.


Should social media companies be allowed to take down your content that invokes hate, prejudice, violence or sex?

With this level of power and authority granted to social media companies, it could encourage a greater level of ethics and prompt these companies to consciously work towards a safer social space. This may not guarantee less prejudice and negative speech on social platforms, but it can help curb the spread of such content.

Perception of the Role of Government for Social Media Regulation

The Guardian recently reported Mark Zuckerberg calling for stronger regulation of the internet.

Recent calls for regulation have encouraged the need for ethical practices in media and have caused corporations and individuals to think twice about disseminating information recklessly. When we asked 1,080 nationally representative Malaysians about public education on social media etiquette, almost all (86%) of respondents felt that the government is responsible for such education. Additionally, more than half (61%) of respondents who felt this way do not think the government is doing enough to educate the public on social media etiquette.

In addition to education, the majority (80%) were also in favour of the government imposing a penalty for inappropriate content being shared, and most (79%) believe that those who create the content should be punished.


Who should the government punish if there is any inappropriate content being spread out?

Effective regulation of social media content could be further enforced with social media giants playing their part as informants for the government. A majority (70%) of respondents feel that social media giants should take responsibility for sharing information about suspicious activities to help the government prevent terror events from happening.

Understanding Social Media Behaviours

Netizens share and contribute their thoughts to a myriad of social media news and content. Almost one quarter (23%) admitted to having shared content on social media that others may perceive as racial prejudice, religious prejudice, gender prejudice, or sexual or violent content. These netizens believe that disseminating such content could be of positive value to others, unaware of its adverse effects. Almost half (47%) believed that sharing this type of content was creating awareness of social issues.


Why did you share content on social media that others may think contains racial prejudice, religious prejudice, gender prejudice, sexual or violent content?

While the content shared creates a negative impact in itself, the intent behind it reveals an evident lack of awareness and education about what is deemed harmful, and about how spreading such content can impact and influence an individual or, even more so, a society. On the positive side, of the almost all (83%) of respondents who reported coming across content they found offensive, a majority (71%) took action by reporting it.

Perception and behaviours surrounding the mass shooting video of mosques in Christchurch

Almost all (91%) of respondents were aware of the video of the shootings that took place at mosques in Christchurch, New Zealand. Of these, a minority (21%) said they watched the whole video, while almost half (44%) watched only part of it.


Did you watch the mass shooting video that took place at the mosques in Christchurch?

The content was deemed offensive by a majority (83%) of those who fully or partially watched it, and almost all (89%) believed that the video should have been removed by social media companies. While there was an attempt to remove the content, most (71%) who viewed it believed that social media companies did not do a good job of removing the mass shooting video and pictures. In the effort to advocate and maintain a safe environment online, almost all (84%) believed that social media companies should vet content shared by users before it goes out to the public. For example, a platform could delay all live streaming so that it has sufficient time to check and vet the content.

The extent to which content goes viral on social platforms is unpredictable. Media corporations, social media companies and the government are in the spotlight as the parties responsible for regulating the distribution of content that evokes any form of prejudice, violence or sexual content. The question then turns to the accountability of users. How can we, as avid users of these platforms, help ensure there is sufficient awareness of sensitive issues, while at the same time communicating them tactfully and effectively?

*This article first appeared on Vase.ai. All stats and findings are from the Vase.ai online survey. To view the full statistics of the findings, head on over to the Vase.ai dashboard here. To use the data, please first read the Vase.ai terms.
