Importance of social media governance in age of AI | Connie L. Braun and Juliana Saxberg

By Connie L. Braun and Juliana Saxberg

Law360 Canada (June 26, 2024, 10:56 AM EDT) --
We are in an era where tweets and other social media messaging very quickly shape public opinion. Hashtags spark movements, campaigns and protests.

How do government agencies keep a finger on the pulse of public sentiment? How can organizations ensure responsible use of social platforms while safeguarding privacy? Social media monitoring and governance are essential if organizations are to navigate the dynamic social media landscape effectively. For this reason, the importance of establishing, enforcing and supporting social media governance cannot be overstated.

Social media governance consists of the laws and regulations, systems, policies and processes that determine how individuals may use social media. Responsive social media governance enables governments and organizations to engage with key stakeholders including citizens, customers, legal experts and technology experts to address concerns and maintain transparency. Necessarily, quality governance includes:

  • social media monitoring, assessment and data collection;
  • decision-making models, processes and procedures;
  • designing and implementing compliance programs;
  • incident response and crisis management planning;
  • logical and reasonable penalties for non-compliance and/or infringement.

Agile governance strategies clarify the roles of decision-makers and streamline decision-making within larger organizations. With the constant evolution of social media platforms and the associated risks, we need to engage in continuous monitoring of emerging threats, co-ordinated disinformation campaigns and new forms of online harassment. Overall, governance helps to manage and reduce social media risk, thereby protecting both human and intangible assets.

On the human side, the widespread adoption of social media by Canadians has created specific, challenging problems for risk management. Statistics Canada monitors the impact of social media use on Canadians. Findings from the most recent study include the following:

  • of Canada's population of more than 41 million in 2024, more than 36 million are Internet users, representing 94 per cent of the population;
  • of those 36 million Internet users, 33 million are social media subscribers, representing 91 per cent of users;
  • most often, Canadians are using social media to stay in contact with friends and family;
  • on average, Canadians spend approximately two hours per day on social media;
  • the largest age group using social media in Canada falls in the 24- to 35-year-old category;
  • Facebook (owned by Meta) continues to be the most popular social media platform, with nearly 27 million subscribers in Canada.

While social media provides a means to strengthen connections and engagement between people, organizations and government, risks abound. To name a few: overuse and addiction, exposure to inappropriate content, cyberbullying, exposure to predators and other malicious actors, privacy and data security breaches, identity theft and mental health challenges. As we have written previously, AI-driven mis- and disinformation regularly published on social media profoundly impacts real-world governance, including by enabling interference in elections and other truly inappropriate uses.

In response to the urgent need for regulators to address social media harms, the federal government has introduced a number of compliance instruments to support effective social media governance. In 2022, the minister of Innovation, Science and Industry introduced the long-overdue Digital Charter Implementation Act to modernize individual privacy rights in private sector activities and to set out rules for responsible AI development and use. Most Canadian information and privacy commissioners recommend or require that organizations conduct privacy impact assessments to proactively identify risks and prepare mitigation plans for third-party software and social media platforms.

By promoting digital literacy initiatives, governments and organizations can educate individuals on using social media responsibly, identifying misinformation and protecting their online privacy and security. And, given the global nature of social media, international co-operation and harmonization of governance frameworks across jurisdictions could enhance their effectiveness and consistency. Sustained transparency in social media governance practices, coupled with accountability for enforcing policies consistently and fairly, provides a pathway to success.

The recently passed Online Streaming Act updates the Broadcasting Act to better regulate Internet video and digital media and to prioritize the “needs and interests” of Canada along with including greater cultural diversity in broadcast programming. This is supported by the much-discussed Online News Act that requires electronic publishers of news content to contribute financially to the sustainability and fairness of the digital news marketplace. Finally, the recently introduced Online Harms Act proposes to amend the Criminal Code and other instruments to enable individuals and the government to hold online platforms accountable for the harm caused by content that they host.

Taken together, these legislative activities constitute a good foundation for establishing social media governance. As we have previously recommended for AI risk management, well-governed Canadian organizations should earnestly and energetically consider designing and implementing appropriately sized social media governance strategies that will guard stakeholders and assets from the unique harms presented by social media in the age of AI. Effective social media governance requires a multi-faceted approach involving policies, technology, training and ongoing adaptation to the rapidly evolving social media landscape. Doing so will allow us to safely enjoy social media applications, secure in the knowledge that we are protected.

Connie L. Braun is a product adoption and learning consultant with LexisNexis Canada. Juliana Saxberg is a practice area consultant in corporate and public markets with LexisNexis Canada.
 
The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada or any of its or their respective affiliates. This article is for general information purposes and is neither intended to be nor should be taken as legal advice.

