The Internet Governance Forum (IGF) is a global platform that brings together people from various stakeholder groups to discuss public policy issues relating to the Internet. The IGF facilitates a common understanding of how to maximize Internet opportunities and addresses risks and challenges that arise from technological development.
The Internet Governance Forum 2018, held in Paris on 12-14 November 2018, was dedicated to the Internet of trust. The following are highlights of its main messages on media and content.
The overarching message concerns how the media impacts the digital world. The intersection of media with the Internet and technology has revolutionized the sharing of knowledge across cultures, economies, political systems, and many aspects of everyday life. Today it is more important than ever to consider and address the negative consequences of media and changing digital content.
Specific messages relate to the following areas:
Media culture and power
● The ways in which we distribute and consume media have led to ambiguous impacts, and have the potential for further ones; these should be evaluated with respect to the media's relationship to sources of power.
● These impacts include: content versus propaganda; ‘fake news’ versus truth; media freedoms versus limits on freedom of expression (to stop ‘fake news’); and the ability of social media platforms to distribute misinformation versus their role as a tool for human engagement.
● Regulation is a possible ‘slippery path’ towards control over the media and related digital systems. Two ‘extremes’ exist: a completely ‘hands-off’ approach by governments that leaves private providers accountable for the appropriate administration of media products, and a purely state-run system that oversees and investigates ‘fake news’. An appropriate balance should be struck between the two.
Information disorders
● ‘Fake news’ is a popular but inherently political and potentially misleading term. It is often used to discredit accurate information.
● Breaking down the various ‘types’ of non-truthful information is an essential starting point for considering how to address the issue. Different kinds of misleading information, or “information disorders” (e.g., misinformation, disinformation, malinformation) will require different tactical responses.
● Precise terminology can also properly capture the scope of what ‘fake’ information can be or cause. The range extends from rumor and propaganda to cyber hybrid threats, and from radicalization, extremism, and hate speech to other forms of intimidation.
● Some journalists and media organisations are taking steps against the spread of misinformation. Reporters Without Borders (RWB) advocates for standard-setting in sectors related to the sharing of journalistic information. It stresses that such standards should relate to the quality of the product rather than to its content, which is a subjective and potentially political matter. RWB calls for applying ethics and codes of conduct to processes related to new forms of media, just as they apply to traditional journalism. Algorithms that aggregate content should also operate under such standards.
● Trends in “information disorder” need constant monitoring, and responses should be re-calibrated accordingly.
Fake news
● ‘Fake news’ is a broad term. It is often equated with the concept of ‘post-truth politics’ and is most easily understood through the spread of false information in a political context.
● Digital literacy can help audiences become more discerning with respect to the information that they receive through social media and messaging applications.
● ‘Big data’ and the use of data need to be monitored, and potentially regulated, with respect to their ability to feed into and be used for the dissemination of ‘fake news’.
● Governments are exploring or implementing different methods to control ‘fake news’, including self-regulatory processes, legislative measures, possible criminalization, producing guidelines to avoid engaging in the spreading of false information, partnering with stakeholders to tackle particular issues, implementing digital literacy programs, and raising public awareness.
● Private sector representatives can pursue policies on hate speech and offensive content removal, and provide more information to users about how content is managed. They can use independent fact-checkers to quickly verify suspect content, withhold advertising revenue from websites that engage in ‘contentious’ behavior, and use algorithms that prioritize authoritativeness and authenticity in content over relevance.
● Civil society should be alert to the inappropriate implementation of ‘fake news’ legislation, as it could be used to stifle other forms of public debate or speech for political purposes.
● Strengthening media institutions is important for combating ‘fake news’. This should include effective and clear policies on freedom of information, freedom of expression, and data protection, as well as support for journalists’ work and personal safety.
Elections and political processes
● ‘Fake news’ and misinformation are just one concern with respect to elections and political processes. Others include the misuse of personal data, an inability to identify the sources of information, and a lack of regulation or oversight around the use of electronic voting systems.
● People in vulnerable or underserved communities often access the Internet only through mobile devices. This can result in disproportionate targeting of those groups through social media messaging designed to be more easily received on mobile devices, while they have less capacity to search for counter-narratives.
Local content and multilingualism
● The production of local content has the potential to build trust in the media. Its smaller scale and focus can result in more immediate and accurate judgments of the content and its validity.
● Multilingualism is an integral feature of local media content and a tool for protecting local discourses.
● Global and regional multilateral organisations are well-placed to engage in activities to strengthen local content production through fiscal, regulatory or treaty-based avenues.
The IGF is of the opinion that governments, media and social media enterprises, as well as individuals, need to learn how to live in what some call "the post-truth era". New smart initiatives, driven by multi-stakeholder collaborations, should be encouraged to combat risks and harms in and through the media. These may involve building networks and online communities, strengthening digital citizenship through media and information literacies, and fostering digital debate.
Compiled by M 21F(2019) from IGF 2018 “The Internet of trust”. Internet Governance Forum, Paris, 12-14 November 2018. Key messages. http://www.intgovforum.org/multilingual/index.php?q=filedepot_download/6037/1415