
Contemporary Fact Checking of Your Writings

  • Writer: Barbra A. Rodriguez
  • Aug 1



It's not unusual for writers to bring up cautionary tales like that of Stephen Glass, who published dozens of fabricated articles in The New Republic over three years in the late '90s. That level of public outing is a rarity, but it's easy for a writer to let a factual error or two slip into a feature article, a novel you've been laboring over for six months, or a talk based on your newly published self-help book. Because such errors erode authorial trust, and because fact checkers are few at magazines and nonexistent at book publishers, it's important to know how to double-check your own work for errors.


As someone who has fact checked my own features for more than a decade, along with a fitness-related book for a hybrid press and a science podcast, I can offer some ways to apply an eagle eye and set yourself up for success while truth testing your own content.


Carefully cataloging source materials can reduce factual holes and errors (credit, freepik)

Reducing the Chance of Factual Mistakes

A key step to getting facts right is to prioritize the task early on. That means keeping track of sources of information as you develop a piece. For something like a 1,200-word feature, you could simply use comment balloons in your writing software to note the hyperlink to the article where you first found a detail, the book title and page number, or another source. For book-length projects, you could store all your sources in one folder in a program like Scrivener, or use the similar spaces built into book-development software such as Novelize (theirs is called Notebooks). A simple Excel spreadsheet also works for recording sources.
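If you go the spreadsheet route, the log can be as simple as one row per fact. Here's a minimal sketch in Python's standard csv module; the column names, file name, and sample row are illustrative, not a prescribed format:

```python
import csv

# Illustrative columns for a source log; adapt them to your project.
FIELDS = ["fact", "source", "page_or_url", "date_found", "status"]

rows = [
    {
        "fact": "Character's name is Agnes, not Alice",
        "source": "Draft, chapter 3",
        "page_or_url": "https://example.com/article",  # placeholder URL
        "date_found": "2025-08-01",
        "status": "unchecked",  # later: True, Sort of True, False, etc.
    },
]

# Write the log as a plain CSV file that spreadsheets can open.
with open("source_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

A plain CSV like this opens in Excel or Google Sheets, so you can sort by status when it's time to verify each fact.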


Noting the date you located a detail can be important, especially with online resources, where links can break or change over time. For this reason, the Chicago Manual of Style, used for citing sources in many books, often asks authors to list the date of retrieval for online sources.
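If you record retrieval dates, it also becomes easy to flag sources worth rechecking before publication. A small sketch of that idea, where the one-year threshold is an arbitrary assumption:

```python
from datetime import date

def stale_sources(sources, as_of, max_age_days=365):
    """Return the sources whose retrieval date is older than max_age_days."""
    return [s for s in sources if (as_of - s["retrieved"]).days > max_age_days]

# Hypothetical source records with the date each link was retrieved.
sources = [
    {"url": "https://example.com/old-study", "retrieved": date(2023, 1, 15)},
    {"url": "https://example.com/new-data", "retrieved": date(2025, 6, 1)},
]

# Only the 2023 retrieval is flagged for rechecking.
for s in stale_sources(sources, as_of=date(2025, 8, 1)):
    print(s["url"])
```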


Typical things to double-check include the spelling and consistency of proper nouns (people, movie titles, organization names), along with the physical locations and spellings of monuments, cities and the like. Careful work on those kinds of facts would have helped Robert Heinlein's novel Stranger in a Strange Land, for instance, which was published with a character whose name flips between Agnes and Alice several times. But anything from the shape of an object to the typical summer weather in 1970s Johannesburg might be worth checking if there's a consensus about it. Errors also creep into the dates of events, the weights of objects, and other numbers. In memoir, and in other interview-based works where quotes are used, those statements add another layer to verify.


The best way to ensure quote accuracy is to record interviews and then revisit the transcripts. For other material, such as historical quotes, going directly to the original documents that contain the facts is usually best. For instance, the text of a Shakespearean play is the ideal place to confirm a line from it.


For some facts, confirming with secondary sources, ideally more than one, may be the better option. Example secondary sources include newspaper and magazine articles, encyclopedias, biographies, and review articles on a topic. The difficulty of trusting any single source is why, when there's doubt about an important fact, more than one source becomes ideal. For instance, with the fitness-related book I fact checked over four or so months, I was asked to list more than one source for as many of the health-related details as possible (marking each fact as True, Sort of True, False, Sort of False, or Non-verifiable).


Applying a cautious eye when looking at the source of facts is an important fact-checking step (credit, Stockking on freepik)

Quality Testing Sources

Regardless of whether a source is primary or secondary, it's important to consider how reliable the source itself is. Questions to ask include:


  • Was the content written by a verifiable expert?


  • Did they list detailed sources for their information, suggesting they took their own fact checking seriously, and allowing you to judge what's true yourself?


  • Does this person have any inherent biases? For instance, using a book like Memoirs of a Geisha as a resource on geisha life wouldn't be ideal: the book was written by an American who didn't grow up in Japan, and his publisher had to settle out of court after his primary interviewee said he had misrepresented her and the geisha culture he portrayed.


  • How old is the source? Sometimes age doesn't matter, but in other cases a university press release or similar source might list outdated information. I've even had a professor give me an old version of their own professional title, presumably because they hadn't yet memorized the new one.


  • With online content, is it crowd-sourced? That's challenging, because crowd-sourced material is only as reliable as the expertise of the individuals who provided input, or who had the most input on the final summary. When I've fact checked science and health care content, crowd-sourced sites like Wikipedia have been verboten, with the preference being for something like Encyclopedia Britannica. That stems in part from a 2005 study in the well-regarded journal Nature, which found that Wikipedia entries averaged about one more error (four instead of three) than Britannica's did. However, a 2011 study funded by the Wikimedia Foundation and involving Oxford University suggests the factual gap has since closed. Regardless, crowd-sourced sites are helpful starting points because of their source links, which may lead you to a variety of useful primary content.


  • Is it recent content from U.S. government websites? A new caveat online is that you can no longer trust all of them for data and contextual information; since January 2025, content has been scrubbed from the website of the Centers for Disease Control and Prevention and elsewhere. However, you can often find the original data with a bit of searching, as described in this article from CNET, for example.


  • As with other sources, online ones might be out of date or biased. A nonprofit that receives money from donors tied to an illness, for instance, might not be the best single source for big-picture numbers on how many people have that illness. That's the kind of circumstance in which double-checking matters, including checking who fact checked the article, whether a physician, statistician, or epidemiologist, as appropriate. Other tips on when to trust online health information are available, such as this MedlinePlus tutorial that provides questions to ask while reviewing content quality.


  • Was the information created by generative AI? As with crowd-sourced information, the potential for "garbage in, garbage out" applies to content provided by an LLM (large language model) application. How well the terabytes of training content are cleaned up before use is one underlying issue with the reliability of LLM-produced content (sometimes called training bias); as one reviewer put it, LLMs can't filter between what's true and what's false. In addition, they can produce so-called hallucinations: answers that appear plausible but are false (sometimes because the question wasn't framed as well as it could have been). Judging by conversations on Reddit, there's also concern about these applications' ability to accurately cite the source documents used to generate their output. One step you can take is checking when the LLM was last trained (i.e., how old its source content is). Some recent Reddit threads also suggest that the free version of ChatGPT appears to use less computing power per answer, making for weaker answers overall, so a paid version may be safer.


Truth testing facts is like checking that the soil is best for planting crops (credit, freepik)

Digging In to the Fact Hunt

Whenever possible, it helps to set your work aside for a while before going through it. That distance lets you step into an analytical frame of mind and think beyond your own assumptions about the content's correctness. Looking at seemingly solid sources with a skeptical eye also matters. As one example from that fitness-related book I fact checked, the authors stated that exercisers were 90 percent less likely to develop Alzheimer's-related mental decline than the participants least likely to exercise, apparently based on a misreading of how odds ratios were presented in a popular news magazine article. The 2011 study they were likely referring to actually suggested roughly a 9 percent reduction in cognitive decline, a tenth of the claimed figure.
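The arithmetic behind that misreading is worth making concrete: a risk or odds ratio near 0.9 corresponds to about a 9 percent reduction, while a 90 percent reduction would require a ratio near 0.1. A quick sketch (the 0.91 figure here is illustrative, not the study's exact number):

```python
def percent_reduction(ratio):
    """Convert a risk/odds ratio into an approximate percent reduction."""
    return (1 - ratio) * 100

# A ratio of about 0.91 is roughly a 9 percent reduction...
print(round(percent_reduction(0.91)))  # 9
# ...while a 90 percent reduction would require a ratio of about 0.10.
print(round(percent_reduction(0.10)))  # 90
```

Confusing the ratio itself with the size of the reduction is exactly the order-of-magnitude slip the book's authors appear to have made.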


In addition, it's key to know when you're out of your element in fact checking certain information. If you're writing a historical novel set in the early 1800s and haven't spent years studying that period, for instance, you might ask a history professor to review key passages on societal customs and period speech. Hiring a sensitivity/authenticity reader to review cultural, historical or other information can also be wise. Perhaps the young readers' book Anastasia: The Last Grand Duchess, which has been criticized for glossing over the darker side of the Russian Revolution, wasn't run past such experts.


Specialty sources provide an added way to firm up which details are true and which aren't. Examples include snopes.com for investigating online and urban U.S. myths, and thesmokinggun.com for crime-related details; the latter is the site that revealed multiple falsehoods about James Frey's criminal past as shared in his "memoir," A Million Little Pieces. More traditional examples include sites that catalog the preferred scientific and common names of North American native plants (https://www.wildflower.org/plants/), that cover where certain agricultural crops developed (https://blog.ciat.cgiar.org/origin-of-crops/), or that trace the meals those crops have become part of in different cultures (https://beforefarmtotable.folger.edu/).


Unfortunately, there's no guarantee you'll have easy access these days to public librarians who can guide you to key specialty sources, given the challenges that have cropped up amid federal defunding efforts. You could hire a fact checker to help you out, at a fee of anywhere from $40 to $100 per hour. Otherwise, digging deep as a writer these days may mean finding more of the best sources of facts on your own.


By Barbra A. Rodriguez

 

To receive a free guide about your book idea's potential and my Scoops4Scribes shares on writing and publishing, click here.


Learn more about finding reliable health-related information sources from my guest post on this topic.


Look into learning more from journalist Brooke Borel's fact-checking book, which includes practice exercises.

 

Check out these checklists recapping how journalists fact check information:

 

 

Also, this fact-checking explainer for nonfiction authors includes an example spreadsheet approach for cataloging sources.



 


©2025 by Vital Wordplay
