Lesson module posted by Dr. Keren Wang, updated SP 2025
Educational use permitted with attribution; all other rights reserved.
For permission requests, please contact the author directly.
1. OVERVIEW
This lesson focuses on understanding and evaluating evidence and information sources, a crucial aspect of constructing persuasive arguments. It explains how evidence interacts with values and presents general tests for assessing the quality of evidence. You will also learn how to locate and evaluate various sources of evidence, with guidance on choosing reliable information from books, periodicals, websites, and more. The lesson emphasizes the importance of digital literacy and the critical evaluation of different types of sources.

2. UNDERSTANDING EVIDENCE
Evidence and Values
In public discourse, evidence is invariably filtered through the “terministic screens” of societal norms and cultural values, leading to divergent interpretations even when the same set of facts is presented. Consider, for instance, debates surrounding Artificial Intelligence (AI) and employment. For techno-optimists, as represented by some Silicon Valley entrepreneurs, rapid technological advancement is essential for societal evolution. They may interpret the emergence of an Artificial General Intelligence (AGI) workforce as an auspicious sign of accelerated economic growth, productivity, and innovation, contending that AI liberates human workers from repetitive labor and allows greater engagement in creative, strategic, or emotionally rewarding tasks. Conversely, many labor advocates and trade unionists may view the prospect of an AGI workforce far less positively. As critics of unchecked technological disruption, they might perceive this development as a harbinger of livelihood displacement, expressing concerns that automation could trigger widespread unemployment, diminish workers’ bargaining power, and deepen existing economic inequalities. Such rhetorical divergence highlights how interpretations of evidence surrounding AI’s impact are strategically framed to reinforce broader narratives of either progress or caution.
This example illustrates that the interpretation of evidence is not merely a neutral or objective process but is deeply intertwined with rhetorical constructions that reflect and reinforce specific value systems. Recognizing this interplay is crucial for understanding the dynamics of public debates and the ways in which information is presented and perceived.

3. DIGITAL LITERACY
Digital Literacy refers to the ability to effectively navigate, evaluate, and utilize online sources of information. Digital literacy is more than simply being able to use technology; it is about understanding how to critically evaluate the veracity and quality of digital content and its sources. Key aspects of digital literacy include:
- Critical Evaluation of Sources: Not all websites are created equal. Digital literacy involves determining whether an online source is credible, up-to-date, and relevant, as well as recognizing the purpose of the content—whether it aims to inform, persuade, entertain, or mislead.
- Understanding Bias and Intent: It is important to understand the motives behind the creation of digital content. Websites often have particular political, social, or commercial agendas, and digital literacy involves identifying these biases. For example, a blog promoting dietary supplements might not be objective if it’s sponsored by a company that sells such products.
- Verification of Facts: Digital literacy requires cross-referencing information found online with multiple reliable sources. This helps verify facts and avoid falling for misinformation or “fake news.” For instance, a claim about a health benefit found on social media should be verified through medical publications or government health websites.
- Awareness of Digital Manipulation: The internet includes not only text but also images, videos, and audio clips, many of which may be digitally altered. Digital literacy involves assessing whether visual or multimedia evidence has been manipulated to present a biased narrative.
- Navigating Information Overload: The sheer volume of information available online can be overwhelming. Being digitally literate means knowing how to sift through large amounts of data to find high-quality, relevant information. This involves using effective search terms, recognizing institutional domains (e.g., “.gov” or “.edu”), and understanding how search engine algorithms may prioritize certain content.
- Digital Security and Privacy: Digital literacy also includes understanding how to protect one’s privacy online and recognizing secure websites. For example, a digitally literate individual would know to look for “https://” at the beginning of a URL as an indicator of a secure website.
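The domain and protocol cues described above can be sketched programmatically. The following Python snippet is an illustrative heuristic only: the suffix list and the https check are rough signals to prompt further scrutiny, not proof of credibility.

```python
from urllib.parse import urlparse

# Heuristic screen based on the cues above: institutional domain
# suffixes and the https:// scheme. These are rough signals, not
# proof of credibility; the suffix list is an illustrative assumption.
INSTITUTIONAL_SUFFIXES = (".gov", ".edu")

def quick_screen(url: str) -> dict:
    parsed = urlparse(url)
    host = parsed.hostname or ""
    return {
        "secure": parsed.scheme == "https",
        "institutional": host.endswith(INSTITUTIONAL_SUFFIXES),
    }

print(quick_screen("https://www.cdc.gov/flu/index.html"))
# {'secure': True, 'institutional': True}
```

A positive result here only means the source deserves a closer look under the other tests in this lesson, not that it is trustworthy.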
Example of Digital Literacy in Practice: Suppose you are researching the benefits of electric vehicles (EVs). A digitally literate approach would involve consulting a mix of sources, including reputable news organizations (e.g., Associated Press, Reuters), trusted independent technical professional organizations or public agencies (e.g., IEEE, European Alternative Fuels Observatory), and peer-reviewed journals (e.g., Energies, Transport Reviews, Journal of Power Sources). It would also involve recognizing potential biases (such as an oil-company-funded blog questioning the sustainability of EVs).
4. TESTING DIGITAL EVIDENCE
In the field of argumentation, there are several general tests of evidence that can help evaluate whether evidence used in an argument is reliable, credible, and sufficient to support a conclusion. These tests provide a comprehensive approach to assessing the quality of evidence. Here’s a detailed breakdown:
Accessibility: Is the Evidence Available?
Evidence that is accessible and open to scrutiny is generally considered more reliable.
Example: A public health official cites the number of COVID-19 cases reported by the Centers for Disease Control and Prevention (CDC). This evidence is accessible because the CDC publishes its data on a website that anyone can visit and verify.
Counterexample: Someone claims that the government has "secret documents" showing proof of extraterrestrial contact. Since these alleged documents are not accessible for review, the claim fails the test of accessibility.
Internal Consistency: Does the Evidence Contradict Itself?
Evidence should not contradict itself. If evidence is self-contradictory, it weakens the argument and creates doubt regarding its reliability.
Example: A government report on unemployment must consistently present the same statistics throughout the report. If one section states an unemployment rate of 6% and another section states 8% without clarification, the evidence lacks internal consistency.
External Consistency: Does the Evidence Contradict Other Evidence?
Evidence that sharply contradicts most other reputable evidence is often seen as unreliable.
Example: A study on climate change that finds rising global temperatures should align with the majority of climate research from other scientific bodies such as NASA, the IPCC, and NOAA.
The “CRAAP” Test (Currency, Relevance, Authority, Accuracy, Purpose):
Currency:
Is the Evidence Up to Date? Check whether information reflects the most recent findings or developments. Citing a meta-analysis on the effectiveness of renewable energy technologies from the previous year is preferable to citing a similar study from more than a decade ago, as the newer study will have taken into account relevant technological advancements. Evidence that has been superseded by more recent findings may no longer be applicable. Avoid sources that do not indicate the date the content was published or last updated.
Example: During the COVID-19 pandemic, many outdated news articles, studies, and public health guidelines from early stages of the pandemic continued to circulate on social media well into 2022, causing confusion over mask recommendations, vaccine efficacy, and treatment guidelines.
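The currency check can be made concrete with a simple age calculation. This is an illustrative sketch; the five-year cutoff is an assumption and should be tuned to how quickly the field in question changes.

```python
from datetime import date

# Illustrative currency check: flag sources older than a chosen cutoff.
# The 5-year default is an assumption; pick a threshold suited to how
# fast the field moves (months for public health guidance, far longer
# for settled history).
def is_current(published: date, today: date, max_age_years: int = 5) -> bool:
    age_days = (today - published).days
    return age_days <= max_age_years * 365

print(is_current(date(2023, 6, 1), today=date(2025, 1, 15)))  # True
print(is_current(date(2011, 3, 9), today=date(2025, 1, 15)))  # False
```

A date alone cannot tell you whether findings have been superseded; it only flags sources old enough to warrant a check against more recent work.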
Relevance:
Does the Evidence Bear on the Conclusion? Confirm that the evidence meaningfully and directly addresses the topic at hand. Even seemingly high-quality, authoritative evidence that does not directly relate to the argument is unhelpful.
Example: For someone researching the impact of social media on teen mental health, citing a 2023 study published in a peer-reviewed public health journal examining social media use among adolescents would offer substantial relevance. Conversely, an op-ed from The Wall Street Journal that broadly discusses a social media platform's political stance would likely be irrelevant to the specific issue at hand.
Authority:
Verify the source’s credibility, relevant expertise, and institutional affiliations. This can depend on the reputation of the author or organization providing the evidence, as well as whether the source has the appropriate credentials or expertise.
Example: Consider debates surrounding the societal impact of artificial general intelligence (AGI). A peer-reviewed article authored by recognized AI experts or computer ethicists, published in reputable journals such as the Journal of Experimental & Theoretical Artificial Intelligence, holds substantial authoritative weight. Conversely, a popular podcast hosted by a tech enthusiast, while potentially relevant and informative, would not be considered authoritative scholarly evidence.
Accuracy:
Is the evidence accurate, and is it sufficient to support its claim? Cross-reference it with multiple reputable sources.
Example: In contentious and rapidly evolving events such as the 2022 Russian full-scale invasion of Ukraine, initial reporting often contains inaccuracies and contradictory claims. A good practice is to cross-check information with other reputable sources (e.g., Associated Press, Reuters) and use fact-checking websites such as Snopes or Media Bias/Fact Check. The Wayback Machine (Internet Archive, https://archive.org/web/) is a helpful tool for retrieving deleted pages or spotting revisions.
Purpose:
Assess whether the content aims to inform, persuade, mislead, or sell.
- Recognize the potential bias of the underlying source. Websites created to sell a product, promote a political agenda, or advocate for a specific cause may present information in a skewed manner.
- Check for an “About Us” page that details the site’s mission, authors, and background. Absence of this information can be a red flag.
- Lobbying organizations often present information strategically to advocate for specific interests, resulting in one-sided perspectives.
The OpenSecrets project (https://www.opensecrets.org/orgs/all-profiles) offers extensive data on federal lobbying activities in the U.S.
Language and Content Quality:
Credible websites typically use a moderate and professional tone, avoiding extreme or sensational language that appeals more to emotion than to facts. They support claims with references, links to original studies, or citations.
Example: Compare a “clickbait” headline like "5 Ways Coffee Will Instantly Cure All Health Problems!" with a more measured one such as "Research Shows Potential Health Benefits of Moderate Coffee Consumption." The latter is more likely to come from a reputable source.
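As a rough illustration, the sensational-language cues above can be approximated with a keyword screen. The word set and thresholds here are assumptions chosen for demonstration; they are no substitute for reading the piece and checking its citations.

```python
import re

# A crude, illustrative screen for sensational headline language.
# The keyword set and thresholds are demonstration assumptions;
# they approximate, but cannot replace, editorial judgment.
SENSATIONAL = {"instantly", "cure", "shocking", "miracle", "secret"}

def looks_sensational(headline: str) -> bool:
    words = set(re.findall(r"[a-z]+", headline.lower()))
    return "!" in headline or len(words & SENSATIONAL) >= 2

print(looks_sensational("5 Ways Coffee Will Instantly Cure All Health Problems!"))  # True
print(looks_sensational("Research Shows Potential Health Benefits of Moderate Coffee Consumption"))  # False
```

A headline that trips such a screen is a cue to look for the underlying study or citation before trusting the claim.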
5. OPEN ACCESS TOOLS
1. AI and Bot Detection Tools
- Botometer: https://botometer.osome.iu.edu (Analyzes Twitter accounts to assess bot likelihood. Python API: https://github.com/osome-iu/botometer-python)
- BotD: https://github.com/fingerprintjs/BotD (An open source library for detecting bots in web apps)
- DeepFake-O-Meter v2.0: https://github.com/chelsea234/hifi_ifdl (An open-source platform for detecting deepfake images, videos, and audio)
- Giant Language Model Test Room (GLTR): http://gltr.io/ (A visual forensic tool that helps detect machine-generated text)
- Hoaxy: https://hoaxy.osome.iu.edu (Visualizes the spread of misinformation)
2. Fact-Checking Tools
- FactCheck.org: https://www.factcheck.org (Annenberg Public Policy Center)
- Snopes: https://www.snopes.com
- PolitiFact: https://www.politifact.com
- Full Fact: https://fullfact.org
- Wikidata: https://www.wikidata.org (A structured, openly editable knowledge base for fact-checking claims)
3. Source Verification and Bias Check
- Media Bias/Fact Check (MBFC): https://mediabiasfactcheck.com (Media bias and reliability ratings)
- OpenSecrets Project: https://www.opensecrets.org/orgs/all-profiles (Extensive data on federal lobbying activities in the U.S.)
- TinEye: https://tineye.com (Free reverse image search engine)
- Wayback Machine / Internet Archive: https://archive.org/web (For checking historical versions of web pages)
4. Open Access Resources for Teaching & Research
- Center for News Literacy at Stony Brook: https://www.centerfornewsliteracy.org (Open-access curricula and teaching materials)
- First Draft News: https://firstdraftnews.org (Free resources on misinformation and verification)
- Media Education Lab: https://mediaeducationlab.com (Resources and lesson plans for media literacy)
