Democracy's immune system is in trouble. Here's what we can do

Updated 7:56 AM ET, Fri December 1, 2017


Craig Newmark is the founder of Craigslist and the Craig Newmark Philanthropies, which supports and connects nonprofit communities and drives civic engagement. The views expressed in this commentary are his own. This is the next installment in CNN Opinion's series on the challenges facing the media as it is under attack from critics, governments and changing technology.

(CNN)Throughout history, the media has been used to manipulate and influence political outcomes. Julius Caesar used war journaling, or a written firsthand account of war and its rationale, to justify the invasion and destruction of the Gallic tribes two millennia ago. His heir, Octavian, used a disinformation campaign to convince the Senate to declare war against Mark Antony.

In later years, new communications tools, such as the printing press and moveable type, made it easier for philosophers including John Locke and Thomas Paine to influence new forms of democracy, which, while imperfect, have been more successful than other forms of government.
What these historic figures didn't have access to is digital technology, which democratizes media but also eases the spread of disinformation. The internet allows all who have access to participate, but it also amplifies the effectiveness of media manipulation via automation. Botnets (networks of automated social media accounts, many masquerading as real people) act as a force multiplier, distributing disinformation in the form of "fake news" on a massive scale. We saw this in the 2016 US presidential election, as Russian propaganda flooded social media platforms to divide the nation.
However, a broad community of good actors, including foundations and nonprofits committed to journalistic integrity, is emerging to promote media literacy and good-faith reporting. As a news consumer and a philanthropist (not an expert), I'm invested in supporting these knowledgeable institutions and initiatives to ensure that we have access to fair and accurate reporting. I care about this issue because I believe a strong, vigorous and trustworthy press is the immune system of democracy.
Last year that immune system failed, which is why we need to focus on rebuilding and sustaining a trustworthy press now more than ever. It is a complex task, given the evolving uses of technology in spreading disinformation, but the remedy must begin with a more formalized journalistic code of ethics for the digital age.
There are several governing principles for such a code -- more than can be covered in this piece. In fact, the Society of Professional Journalists (SPJ) has a comprehensive code of ethics for news organizations to follow, and the Online News Association has a "Build Your Own" ethics code project, designed to let news organizations build codes of ethics for a digital age.
However, a few principles stand out as fundamental to trustworthy reporting on digital channels. First, all media organizations with a digital presence should commit to a standard of honest reporting -- one that places fact checking and the use of multiple sources above speculation and single sources. The SPJ already provides guidelines to this effect, advising journalists of any medium to always verify information, use original sources, avoid conflicts of interest and explain ethical processes to audiences.
A second principle -- already standard in traditional media -- should be that if a digital journalist makes a mistake, the outlet should admit it and correct it immediately. After all, if a mistake is made, the journalist is accountable not just to the news organization, but to the readers at large. A correction should be given prominence equal to or greater than that of the mistake that was published. For example, if the mistake is shared on a media outlet's website and across its social media platforms, the correction should be shared in the same manner. As a news consumer, I'd like to see this happen in an effective and timely manner, so that the false information has a short digital lifespan.
A third principle, building on an idea from The Trust Project's Director Sally Lehrman and Google News Vice President Richard Gingras, encourages hyperlinking statements of facts, statistics or quotes to their original sources. This allows audiences to click through and assess the veracity of an article based on the research that has gone into its preparation. "An effective system would also allow audiences to alert editors to perceived inaccuracies ... and follow corrections," Lehrman and Gingras note.
A final principle would make editors commit to drafting headlines that accurately represent the content of the article, rather than presenting the reader with a false understanding of what the piece is about. For example, opinion pieces should be labeled as such, not purport to be news articles. I've also observed a number of headlines that phrase an unverified claim as a question, which can leave the reader misinformed. While headlines should catch a reader's attention, they shouldn't be deceitful.
We need help from social media platforms to reinforce these emerging standards and to counter media manipulation. Google, Facebook, Twitter, Bing and others are working to address this. Their work includes a commitment to using "Trust Indicators," launched on November 16 by the Trust Project, a nonpartisan initiative from Santa Clara University's Markkula Center for Applied Ethics. The Trust Indicators are a new set of transparency standards that help news consumers easily assess the quality and reliability of journalism. According to the Trust Project, "each indicator is signaled in the article and site code, providing the first standardized technical language for platforms to learn more from news sites about the quality and expertise behind journalists' work."