Susan M. Ward, Ph.D., is a professor of communication studies at Delaware County Community College, where she also serves as the faculty fellow for Quality Matters. Her disciplinary background is in rhetoric and persuasion, including experience in competitive debate. She has been involved in course design for both face-to-face and online courses for more than 20 years and is extensively involved in the community college sections of both the National Communication Association and the Eastern Communication Association. A version of this article was originally published in The Journal of Innovation, Teaching & Digital Learning Excellence (Spring 2021).
In the summer of 2019, a friend shared a story on his social media feed about how horns have started growing on the backs of our heads because of smartphone use. It seemed a bit far-fetched, and yet it was a Washington Post article—not necessarily a source I’d readily dismiss as being problematic. As it turned out, there was reason to be cautious about the claim. Within a week, the fact-checking site Snopes posted an “unproven” rating about the claim, and PBS published a lengthy explanation about why the claim is unproven. Why had my friend been so quick to share a story that really did seem too weird to be true? In short, there’s a certain satisfaction that comes with the instant gratification of sharing an intriguing story with the online world. When I told my students about the story, they were more interested in the possibility that the story might be true than they were in the point that it was, in fact, not true.
As an educator, I see my students struggle with determining the credibility of sources as all sorts of misinformation and disinformation pour across their screens. It’s much easier for them (and for all of us) to click share because we find a story or a meme emotionally intriguing than it is to stop and consider whether the information is correct. However, as an academic, I’ve been trained to test the veracity of claims and likewise to teach my students to do the same. I must say that I found it easier to do so when I first started teaching more than 20 years ago than I do today. Most of my students move through their day with instant access to a wide world of information, which seems to complicate figuring out whether what they’re seeing on their screens is true. My experience has been that they’d readily believe that horns are growing on the back of their skulls rather than try to verify the claim. It became increasingly clear that I needed to level up my approach to teaching them how to test the credibility of online sources. When a colleague sent me an article about the SIFT method, I knew I had found a way to do so.i
Before I address the SIFT method, I’d like to address the shortcomings of critical thinking as a primary information literacy tool. Despite the best efforts of educators to help students with identifying credible sources, many students do not demonstrate critical thinking skills when selecting sources. There are many reasons why this is so, but one of particular note is the amount of time it takes to engage in critical thinking. Many of the information literacy techniques students are taught take more time than the average social media user is likely to spend in determining the veracity of a claim. For instance, take the popular information literacy tool known as the CRAAP checklist. This asks students to test the credibility of a source by asking questions about its currency, relevance, authority, accuracy, and purpose. While these questions are valuable, they take time to process and are not easily remembered in the moment when looking at a source. The horns story illustrates the problem: thinking critically about an enticing headline takes time, and academics must reckon with that shortcoming when deciding how to equip students. In essence, the CRAAP test and others like it require a level of deep attention that most will bypass in favor of convenience and/or the attractiveness of ideas—whether credible or not—among other reasons. Wisdom calls upon educators to determine when it’s best to direct students to use a checklist method and when another option might be better, such as the SIFT method.
The SIFT Method
Based on the information literacy research of Wineburg and McGrew, Mike Caulfield developed the SIFT moves by asking, “What is the smallest set of skills that we can give people that prepares them to engage as active citizens on the web?” The SIFT moves are in part a way to determine if a source is credible—particularly online media—without engaging in deep attention to make that determination. I’ll share the SIFT moves with you and then discuss how they might be used in the classroom.
First SIFT Move: Stop
The first move of SIFT, Stop, is perhaps the one behavior that, when skipped, allows misinformation and disinformation to go viral. Arguably, this might be the most difficult move because it requires the user to resist the urge to respond out of emotion. After all, humans are emotional beings who are more likely to be guided by the affective pull of clicking share than to stop.
Second SIFT Move: Investigate the Source
The second move of SIFT is Investigate the Source. The focus of this move is for a student to know what they’re reading before they read it. This is based on the premise that knowing the expertise of the source before it is read will help determine if attention needs to be given to the source at all. Caulfield offers two ways to quickly investigate the source: hover in Twitter and use Wikipedia. Hovering involves pausing the cursor over a user’s name to see a description of them. A blue check mark next to a name used to mean that the account had been verified by Twitter, but that does not automatically make the source credible. The second way to investigate a source is to use Wikipedia. Some academics might balk at the idea of using Wikipedia to investigate a source further, but the goal of SIFT is not to do a deep dive into research. Instead, the goal is to determine if a source is worthy of one’s attention, and that does not typically require concerning oneself with exploring primary sources, for example.
Third SIFT Move: Find Better Coverage
The third move of SIFT is Find Better Coverage. In this move, the person considers whether there is other available coverage of the topic. A simple search on key terms will likely return a list of headlines that provides a quick overview of how the topic is treated elsewhere. One important part of the move is to use credible fact-checking sites such as Snopes and PolitiFact.
Fourth SIFT Move: Trace Claims, Quotes, and Media to the Original Source
The fourth move of SIFT is Trace Claims, Quotes, and Media to the Original Source. This move is perhaps the most time-consuming of the moves because it requires taking a closer look at the source. This move is necessary when the user needs to determine the veracity of the claim after deciding that it is worthy of attention given the result in investigating the source. The goal is to ensure that the information presented in the source can be traced back to the original source (i.e., the primary source as opposed to the secondary source). This means, among other tasks, checking the credibility of people who are quoted in the source. A quick way to get the tracing started is to look for hyperlinks in the source that may direct the reader to the original source. If the source lacks hyperlinks, performing a search on key terms in the source can assist with locating original sources.
Using SIFT Moves in the Classroom
One of the best ways to teach the SIFT moves to students is to be in a computer lab or an online classroom where students have ready access to a computer and the internet. However, they can also be taught in a large group format in a traditional classroom. I have used the latter format and given students the opportunity to use their own laptops or mobile devices to follow along with me. No matter which format you use, let’s imagine that you would like students to determine whether a particular source is credible. As an example, we’ll use a Time article about how the attention spans of humans are shorter than those of goldfish. You can either send students a link to the article or have them Google “Time attention spans goldfish.” Since the first move of SIFT is to Stop, you’ll remind students that they shouldn’t just assume that the article is correct or continue to read it before deciding whether it’s worthy of attention. Stopping creates the pause needed to read laterally. Thus, you direct them to do the second SIFT move: Investigate the source.
Investigating the source of the Time article requires the student to remove all the information after the domain name of the web address for the article and type Wikipedia after what is left. In this case, the search term would be “https://www.time.com Wikipedia.” Have the students discuss what they found on the Wikipedia page and whether they believe that Time is a credible source. Returning to the article, the student’s next task would be to determine whether the author is a credible voice about attention spans. You could ask students what search terms would be good to use in order to learn more about the author. A possible answer could include clicking on the author’s name. In this case, clicking on Kevin McSpadden leads to a page with a list of articles he authored for Time. It does not provide any additional information about his qualifications. One option to find out more about McSpadden is to have students search for “Kevin McSpadden LinkedIn” (to save time, direct students to the entry for Kevin McSpadden in Bozeman, Montana; once on the LinkedIn page, you can exit out of the dialog box that pops up in order to see his abbreviated profile). Using his LinkedIn profile, lead students in a discussion about whether they believe McSpadden is a credible journalist worth paying attention to in this instance. After determining that the online magazine and the author are worth paying attention to, students can move on to the third SIFT move: Find better coverage.
One way of finding other available coverage of the topic is to do a simple Google search. In this case, the phrase “goldfish and human attention spans” is a good one to use, as it draws upon the key terms used in the Time article. A variety of sources are returned—some of which support the claim and others of which do not. At this point, ask students whether it would be a good idea to accept McSpadden’s claim or if further investigation is needed. Spoiler alert: further investigation is needed.
In performing the fourth SIFT move, Trace Claims, Quotes, and Media to the Original Source, students should look for links in the original article that may lead them to primary sources. In the case of the Time article, a reference is made to a Microsoft study, but the link is for a general Microsoft advertising page. The Time article refers to the study having been conducted by researchers in Canada. Using these context clues, have students locate the primary source of the Microsoft study by using the search terms “Microsoft study Canada attention span.” The first result is a link to a PDF file of the study. Once the file is opened, use the find option to search for “goldfish” to locate where the study addresses attention spans. In this case, page 6 of the study displays a graphic showing that the average attention span of humans was 12 seconds in 2000 and 8 seconds in 2013. It also notes that the average attention span of goldfish is 9 seconds. Here is where the claim that the attention span of a human is that of a goldfish originated. A source notation at the bottom of the graphic indicates that the statistics are from Statistic Brain. In other words, the claim being made by dozens of other articles based on the Microsoft study is not actually data from the Microsoft study itself. Ask students what they believe the next step should be in tracking down the primary source. Hopefully, they respond by performing a search for Statistic Brain. A search of Statistic Brain’s website for information about attention spans returns a result that indicates that the attention span of humans is measured in minutes and not seconds as reported in the Microsoft study. While the full article is behind a paywall, there is enough information at this point to question the veracity of the claim in question. An additional option for exploring Statistic Brain further is to search for “statistic brain goldfish attention spans.”
One of the results is for an article by First Coast News that fact checks the claim. It’s possible that you might have an enterprising student who uses Google Scholar to find the primary source and comes across a compilation of articles about the claim and how it is unsubstantiated. At this point, students should have concluded that there is enough doubt about the study and the claim about attention spans that McSpadden’s article should not be viewed as credible.
The four moves in the SIFT approach provide a way for students to learn how to test the credibility of an online source in an efficient and timely manner. While SIFT is not intended for deep research such as in-depth research papers, it can be used as a first step to make faster decisions about whether to consider a source worthy of attention for deeper research. For example, quickly checking the Washington Post story about horns would save a student valuable time because they can determine that the claim is unsubstantiated and thus avoid pursuing an unfounded claim as a research topic. SIFT is not going to solve the problem of online misinformation and disinformation, but it certainly is a viable tool for, as Warzel claims, “add[ing] a bit of friction into the system.” Hopefully, such friction encourages students to think twice before believing that horns are growing on the back of their skulls.ii