From the Aspen Institute:
The Aspen Institute’s Commission on Information Disorder is making 15 recommendations to help government, private industry, and civil society advance solutions to, and reduce the greatest harms of, America’s urgent mis- and disinformation crisis. Among many other critical challenges, the ambitious report covers legislative and executive action on transparency, disclosure, and platform immunity; the collapse of local journalism; community-led methods for resisting imbalances of power further propagated by bad actors; and accountability mechanisms for “superspreaders” of lies.
Published in the Commission’s Final Report, launched today, the recommendations together aim to increase transparency and understanding, build trust, and reduce harms. A summary of each is provided at the end of this press release, along with a list of the commissioners.
Read the Final Report detailing the recommendations on the Aspen Institute’s website. Those seeking to learn more about the Commission on Information Disorder are invited to visit AspenInfoCommission.org.
What follows is a high-level overview of the final recommendations of the Aspen Institute’s Commission on Information Disorder.
Recommendations to increase transparency
Public interest research
- Implement protections for researchers and journalists who violate platform terms of service by responsibly conducting research on public data of civic interest.
- Require platforms to disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.
High reach content disclosure
Create a legal requirement for all social media platforms to regularly publish the content, source accounts, and reach and impression data for posts that they organically deliver to large audiences.
Content moderation platform disclosure
Require social media platforms to disclose information about their content moderation policies and practices, and produce a time-limited archive of moderated content in a standardized format, available to authorized researchers.
Ad transparency
Require social media companies to regularly disclose, in a standardized format, key information about every digital ad and paid post that runs on their platforms.
Recommendations to build trust
Truth and transformation
Endorse efforts that focus on exposing how historical and current imbalances of power, access, and equity are manufactured and propagated further with mis- and disinformation — and on promoting community-led solutions to forging social bonds.
Healthy digital discourse
Develop and scale communication tools, networks, and platforms that are designed to bridge divides, build empathy, and strengthen trust among communities.
Workforce diversity
Increase investment and transparency to further diversity at social media platform companies and news media as a means to mitigate misinformation arising from uninformed and disconnected centers of power.
Local media investment
Promote substantial, long-term investment in local journalism that informs and empowers citizens, especially in underserved and marginalized communities.
Accountability norms
Promote new norms that create personal and professional consequences within communities and networks for individuals who willfully violate the public trust and use their privilege to harm the public.
Election information security
Improve U.S. election security and restore voter confidence with improved education, transparency, and resiliency.
Recommendations to reduce harms
Comprehensive federal approach
Establish a comprehensive strategic approach to countering disinformation and the spread of misinformation, including a centralized national response strategy, clearly defined roles and responsibilities across the Executive Branch, and identified gaps in authorities and capabilities.
Public Restoration Fund
Create an independent organization with a mandate to develop systemic misinformation countermeasures through education, research, and investment in local institutions.
Civic empowerment
Invest and innovate in online education and platform product features to increase users’ awareness of and resilience to online misinformation.
Superspreader accountability
Hold superspreaders of mis- and disinformation to account with clear, transparent, and consistently applied policies that enable quicker, more decisive actions and penalties, commensurate with their impacts — regardless of location, political views, or role in society.
Amendments to Section 230 of the Communications Decency Act of 1996
- Withdraw platform immunity for content that is promoted through paid advertising and post promotion.
- Remove immunity as it relates to the implementation of product features, recommendation engines, and design.