Election Officials Worry Court Case Puts Critical Partnerships At Risk
On July 4th, Louisiana-based U.S. District Judge Terry Doughty issued an order broadly limiting executive branch communications with social media companies. The lawsuit alleges that the administration “censored free speech by using threats of regulatory action or protection while pressuring companies to remove what it deemed misinformation. COVID-19 vaccines, legal issues involving President Joe Biden’s son Hunter and election fraud allegations were among the topics spotlighted in the lawsuit.” A federal appeals court temporarily paused the order as proceedings continue.
The order applies only to named federal officials and agencies, but as the 2024 presidential election campaign season gets underway, it is likely to have a chilling effect on election administrators’ ability to work with partners to monitor and elevate authoritative election information. The order risks fueling a culture in which any government contact with social media companies is seen as inherently risky in a predatory legal environment, and it risks curbing collaborative, cross-sector efforts to promote accurate election information.
While the potential implications of the case are real, this order does not spell the end of authoritative election information efforts. First, the order is a temporary injunction, and the case still has a long way to go as it works through the courts. Moreover, social media platforms will continue to monitor content in accordance with their own rules and policies. Platforms can still direct users to election information, and election officials can still use social media to reach voters.
Legal Risks Disincentivize Collaboration
If upheld, the order will limit the federal government’s ability to contact social media companies with the intent of “encouraging, pressuring, or inducing in any manner the removal, deletion, suppression, or reduction of content containing protected free speech.” Criminal activities are excluded from the order. Legitimate issues around free speech have been raised in the case, but the order has been criticized as having “broad scope and ambiguous terms,” leaving administration officials unsure which actions fall under its sweeping mandate.
Uncertainty, especially when paired with the risk of litigation, disincentivizes any collaboration with social media platforms just as U.S. officials prepare for a high-turnout presidential election next year. The 2024 election will also be the first presidential contest since the proliferation of widely accessible generative AI technologies, creating new risks for the spread of inaccurate election information. Mitigating those risks will require public-private partnerships between sources of authoritative election information (local election officials and federal security representatives) and the platforms responsible for moderating potentially erroneous content.
While this court order is limited to the federal government, many state and local election officials worry they could be next. Election officials increasingly face state laws penalizing minor administrative mistakes and are regularly drawn into extensive litigation, which pulls limited resources away from administering elections and instead ties officials up producing documents and defending themselves in court. Local election officials serving on our Task Force on Elections say they know best what information circulating in their communities might be inaccurate or misleading to voters. If every time they come across a misleading post (for instance, one claiming a voting location is closed when it is not) they have to worry that reporting it could land them in court, some local officials may stop reporting content entirely.
Curbing Collaboration
U.S. elections are highly decentralized, making it untenable for social media platforms to coordinate individually with every local and state election office. This creates the need for a central intermediary that can channel communication between platforms and election offices. In some cases, that role is filled by private organizations and associations of election officials, such as the National Association of State Election Directors.
At the federal level, however, this role is often filled by the Cybersecurity and Infrastructure Security Agency (CISA). CISA is charged with protecting the nation’s critical infrastructure; since 2017, it has been at the forefront of federal efforts to promote accurate election information and “provide election stakeholders with the information they need to manage risk to their systems and assets.”
CISA is well-positioned to act as a trusted source for social media companies on complex election security topics, such as malign foreign interference in local election systems, that tech platforms may not have insight into on their own. Yet this court order specifically names CISA and several of its election-focused employees in its list of official actors prohibited from certain types of contact with platforms. If the order holds, it will hamstring the federal agency best suited to act as an intermediary between election administrators and tech platforms.
The order also names several non-governmental organizations that worked collaboratively with government and social media companies to identify inaccurate election information on platforms during the 2020 election. Not every government official can be fluent in the complex legal distinctions between protected and unprotected speech, and non-governmental organizations provide crucial perspectives from leading experts on how new technologies affect the public good, expertise government bodies do not otherwise have. While we understand concerns about protected free speech, this ruling may have cast too wide a net by potentially inhibiting communication with the listed organizations “or any like project or group,” and it could isolate government decision-making from experts on both sides of the aisle.
Judicial oversight of government officials’ actions related to free speech is a critical check against potential overreach, but it should not halt collaboration. Helping citizens engage in the democratic process is best achieved through collaboration across the whole of society.