Algorithmic Bias

Submitted by Shelby Hallman on March 26th, 2024
Short Description: 

Algorithms are everywhere, and they have increasing power over what we consume (Amazon, Netflix, TikTok), who we date (“the apps”), and how we understand the world (Google, ChatGPT). So, what are algorithms, and how did they become so powerful? Who are the humans that create them, and why does it matter?

In this workshop, we will explore how algorithms can perpetuate bias and discrimination, and discuss some preventive strategies. It is open to learners of all backgrounds and experience levels.

Workshop Instructors: Shelby Hallman, Physical Sciences and Engineering Librarian; Ashley Peterson, Research & Instruction Librarian, Media and Data Literacy; Alexandra Solodkaya, Rothman Family Food Studies Librarian

Credits: This workshop was derived from LMU's "Rise Against the Machines: Understanding Algorithmic Bias" workshop. 

Attachments: 
  • UCLA_SRW_Fall23 Algorithmic Bias Workshop Slide Deck.pdf (3.75 MB)
  • Algo_Bias_UCLA_Fall_23_Lesson Plan.pdf (83.65 KB)
Learning Outcomes: 
  • Students will be introduced to algorithmic bias concepts, focusing on machine learning and AI.
  • Students will understand the causes and implications of bias within algorithm development and use. 
  • Students will discuss strategies to cope with or critically engage with algorithms.

Course Context (e.g. how it was implemented or integrated): 

This workshop was held virtually, via Zoom. 

Assessment or Criteria for Success
Assessment Short Description: 
Formative assessment was conducted via the in-session activities and participation. Summative assessment was conducted via an end-of-session survey form.
Suggested Citation: 
Hallman, Shelby. "Breaking the Code: Understanding Algorithmic Bias." CORA (Community of Online Research Assignments), 2024. https://projectcora.org/assignment/breaking-code-understanding-algorithmic-bias.
Submitted by Megan Pitz on February 15th, 2024
Short Description: 

This librarian-led learning session is designed for first-year community college students in an academic library setting. It builds on the research writing skills students acquired in previous education, as well as their use of popular video-sharing platforms, such as TikTok, to find information. Informative videos made by everyday people are a growing point of connection between general audiences and scholarly sources: they are relatable, and they give visibility to marginalized issues that larger news organizations do not address. When the messenger is familiar yet dynamic in their presentation and provides information the public is not well informed about, viewers are more inclined to listen than they would be to a research paper or an official representative of a research community. However, many of these videos lack citations and basic research principles, largely because most video-sharing platforms are designed to maximize engagement rather than to responsibly inform or educate their users. As researchers, it is crucial to distinguish engagement-driven, incendiary content from the informative, well-researched content our neighbors are making, even when their intentions are good.

Attachments: 
  • TikTok Lesson Plan.docx (17.53 KB)
  • lesson plan project slides.pdf (304.28 KB)
Learning Outcomes: 

By the end of this session, students will be able to:

  • Identify research as an ongoing conversation that builds on and responds to multiple voices, both within the research community and in popular communities.
  • Recognize the importance of crediting other voices, inside and outside of the research and popular communities you are entering.
  • Respect your own contributions to scholarship by following citation guidelines in your own information creation.
Discipline: 
Education

Assessment or Criteria for Success (e.g. rubric, guidelines, exemplary sample paper, etc.): 
  • lesson plan worksheet.pdf (108.36 KB)
Assessment Short Description: 
Groups (the number depends on class size, with a maximum of five members per group) will each be assigned one TikTok on a relevant, polarizing topic (e.g., the Israel-Hamas conflict, self-diagnosing psychological disorders, anti-feminist podcasts, dating do's and don'ts, AI/ChatGPT, school shootings). Each group will watch the video, answer questions together, and ultimately decide whether the creator is engaging in scholarly conversation or popular conversation (topics relevant to the zeitgeist at that moment in time). Groups will then come together and review their findings with the class, with one designated representative speaking for each group. The instructor will moderate the discussion, keeping an overall time frame of one hour in mind.
Potential Pitfalls and Teaching Tips: 
  • This session includes both passive and active activities. The librarian begins the session by priming students in standard lecture format with what scholarly conversation is, what it looks like, and how to participate in it responsibly and respectfully. The students then engage in verbal and written group analysis of a TikTok and determine whether it is a scholarly or popular information source. The students record their learning on the worksheet, which the librarian collects at the end of the session to assess learning. 
  • The librarian builds on prior knowledge of students’ engagement with TikTok (as viewers and creators) or other video sharing platforms of the same format, as well as student learning of proper citation use from previous education, no matter how long ago. 
  • Popular conversation should not be taught as "lesser" than scholarly conversation, but as diversified intellectual support for scholarly conversation when used properly. The crucial factor in distinguishing the two is whether the information provided is factually correct, well researched, and, most importantly, addresses other voices in the ongoing conversation the creator is entering rather than operating in a vacuum.
Suggested Citation: 
Pitz, Megan. "“According to the CDC…” vs. “Someone just said…”: Identifying Scholarly and Popular Conversations on TikTok." CORA (Community of Online Research Assignments), 2024. https://projectcora.org/assignment/%E2%80%9Caccording-cdc%E2%80%A6%E2%80%9D-vs-%E2%80%9Csomeone-just-said%E2%80%A6%E2%80%9D-identifying-scholarly-and-popular-conversations.
Submitted by Shelby Hallman on June 9th, 2022
Short Description: 

Algorithms are not neutral, but that does not mean they are not useful tools for research. In this workshop on algorithmic bias, students learn how algorithms can perpetuate bias and discrimination and how to critically evaluate their search results.

Learning Outcomes: 

  • Students will be introduced to the machine bias inherent in algorithmic decision making, with a focus on information systems.
  • Students will discuss the effects of algorithm bias in order to articulate how some individuals or groups of individuals may be misrepresented or systematically marginalized in search engine results.
  • Students will develop an attitude of informed skepticism in order to critically evaluate search results.

Course Context (e.g. how it was implemented or integrated): 

Stand-alone, co-curricular workshop. 

Assessment or Criteria for Success
Assessment Short Description: 
Formative assessment was conducted via the in-session activities. Summative assessment was conducted via an end-of-session survey form.
Suggested Citation: 
Hallman, Shelby. "Rise Against the Machines: Understanding Algorithmic Bias." CORA (Community of Online Research Assignments), 2022. https://projectcora.org/assignment/rise-against-machines-understanding-algorithmic-bias.
Submitted by Alexandria Chisholm on October 14th, 2021
Short Description: 

This algorithmic literacy workshop puts a new spin on media literacy by moving beyond fake news to examine the algorithms that shape our online experiences and how we encounter information in our everyday lives.

Attachments: 
  • #ForYouWorkshopLessonPlan_Chisholm.pdf (163.64 KB)
  • AttentionAutonomyPlan_#ForYouWorkshop.pdf (83.03 KB)
Learning Outcomes: 

By the end of the #ForYou: Algorithms & the Attention Economy workshop, students will be able to:

  1. Describe recommender system algorithms in order to examine how they shape individuals' online experiences through personalization.
  2. Analyze their online behaviors and the resulting ad profiles in order to reflect on how these influence the way individuals encounter, perceive, and evaluate information, leading to echo chambers and political polarization.
  3. Assess how their data is used to personalize their online experience in order to build algorithmic awareness and make informed, intentional choices about their information consumption.
Discipline: 
Multidisciplinary

Suggested Citation: 
Chisholm, Alexandria. "#ForYou: Algorithms & the Attention Economy." CORA (Community of Online Research Assignments), 2021. https://projectcora.org/assignment/foryou-algorithms-attention-economy.
Submitted by Carolyn Schubert on May 6th, 2021
Short Description: 

We use Google every day, but do we really understand why we get certain results? This event will explain what an algorithm is, how search engines use them, and how bias exists in our search results. Attendees will have a chance to reflect on the ways biased results can echo larger societal biases in representation. Access this site at your convenience at: https://jmu.libwizard.com/f/algorithms-bias

Co-creators: Malia Willey and Alyssa Young.

Attachments: 
  • Tutorial Outline.docx (36.47 KB)
Learning Outcomes: 

  • Defining a broader context for algorithms
  • Analyzing Google results for algorithmic bias
  • Identifying actions for countering algorithmic bias

Suggested Citation: 
Schubert, Carolyn. "What’s Behind a Web Search? Bias and Algorithms." CORA (Community of Online Research Assignments), 2021. https://projectcora.org/assignment/what%E2%80%99s-behind-web-search-bias-and-algorithms.
Submitted by Carolyn Caffrey on December 10th, 2018
Short Description: 

This assignment was created for a credit-bearing course for first-year students. It's designed to help students take what they've learned about algorithmic bias from the course lectures and readings and apply it to their own search practices. They also critically analyze search results for advertisements and compare DuckDuckGo to Google. [You could also look at this assignment as an adaptation of Jacob Berg's wonderful "Googling Google" assignment at https://www.projectcora.org/assignment/googling-google-search-engines-ma... ]

Attachments: 
  • analyzingsearchengines_assignment.docx (534.31 KB)
Learning Outcomes: 

Students will be able to:
  • identify advertisements within a list of search results
  • discuss the role advertising plays in how search results are ordered
  • describe how search results are impacted by human biases in their ranking algorithms

Discipline: 
Multidisciplinary

Course Context (e.g. how it was implemented or integrated): 

This assignment occurred early in the semester as we discussed algorithms, bias, and filter bubbles. Students were asked to draw on class discussions and lectures on PageRank, the history of search engines, and filter bubbles. Other assigned material going into this assignment included the IRL podcast episode "Social Bubble Bath" and Eli Pariser's TED talk on filter bubbles. Students commented that they enjoyed this assignment, weren't aware that Google was an advertising company, and were unfamiliar with DuckDuckGo. The course itself was designed and taught by me (a librarian) as part of our first-year seminar program.
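For instructors who want a concrete artifact to accompany the PageRank lecture, a minimal sketch of the idea follows. It is not part of the original assignment; the toy link graph and the 0.85 damping factor are illustrative assumptions only.

# Illustrative PageRank sketch (hypothetical example, not from the original assignment).
# The toy link graph and damping factor below are assumptions for demonstration.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}            # start with equal rank
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:                             # dangling page: share rank evenly
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
            else:                                        # pass rank along each outgoing link
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Page C receives the most links, so it ends up with the highest score.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(toy_web))

The discussion point is that even this tiny ranking rule encodes choices (who links to whom, how rank is shared) that determine which pages are seen first.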

Assessment or Criteria for Success
Assessment Short Description: 
Assignments were evaluated using the rubric from the attached assignment sheet. In general, students had difficulty identifying all of the advertisements. While students had no difficulty analyzing gender bias or racism in the image results, they did struggle with the search phrase "god" and with identifying how its results may privilege particular narratives and identities over others.
Potential Pitfalls and Teaching Tips: 

Be careful with the choose-your-own image search: several students picked topics such as our institution name or vague concepts like "music," which didn't illustrate the course concepts as clearly. In the future I would remove the choose-your-own option for the image component. This assignment was designed with first-year students in mind.

Suggested Citation: 
Caffrey, Carolyn. "Analyzing search engines: What narrative is told through the algorithm." CORA (Community of Online Research Assignments), 2018. https://projectcora.org/assignment/analyzing-search-engines-what-narrative-told-through-algorithm.
Submitted by Alexandria Chisholm on December 6th, 2018
Short Description: 

This workshop delivers an action-oriented introduction to personal data privacy designed for new college students. The session is designed to reveal the systems in place to collect and analyze online behavioral data, and to unveil the real-world consequences of online profiling in contexts like sentiment shaping, consumer preferences, employment, healthcare, personal finance, and law enforcement. In lieu of a prescriptive approach, students analyze case studies to observe how online behaviors impact real-world opportunities, and they reflect on the benefits and risks of technology use to develop purposeful online behaviors and habits that align with their individual values. The workshop develops knowledge practices regarding privacy and the commodification of personal information and embodies the core library values of privacy and intellectual freedom; it promotes a proactive rather than reactive approach and presents a spectrum of privacy preferences across a range of contexts in order to respect students' autonomy and agency in personal technology use.

Attachments: 
  • PersonalDataPlan_PennStateBerks.pdf (622.24 KB)
  • PrivacyWorkshopLessonPlan_Chisholm_Hartman-Caverly.pdf (189.3 KB)
Learning Outcomes: 

Students will be able to:
  1. recognize how their personal data and metadata are collected, along with the potential implications of such data collection
  2. assess how their data is shared and make informed, intentional choices to safeguard their privacy
  3. identify privacy issues facing our society
  4. describe the positive case for privacy as a human right fundamental to individual well-being

Discipline: 
Multidisciplinary

Suggested Citation: 
Chisholm, Alexandria. "Privacy Workshop." CORA (Community of Online Research Assignments), 2018. https://projectcora.org/assignment/privacy-workshop.
Submitted by Elisa Acosta on October 28th, 2018
Short Description: 

This 30-minute activity was a quick introduction to algorithmic bias and the importance of critically evaluating search engine results. Algorithms increasingly shape modern life and can perpetuate bias and discrimination. In pairs, students analyzed the results from Google Image searches and Google Autocomplete suggestions. This activity was based on "Algorithms of Oppression: How Search Engines Reinforce Racism" by Safiya Umoja Noble. This lesson plan was Part 1 of an hour-long workshop that also included a 30-minute Google Scholar activity. Please see Jennifer Masanaga's Google Scholar activity for Part 2: https://www.projectcora.org/assignment/exploring-google-scholar-summer-b...

Attachments: 
  • Lesson Plan (154.64 KB)
  • Presentation slides (3.37 MB)
  • Worksheet (326.34 KB)
  • Suggested Readings (65.96 KB)
Learning Outcomes: 

  1. Students will discuss the effects of algorithm bias in order to articulate how some individuals or groups of individuals may be misrepresented or systematically marginalized in search engine results.
  2. Students will develop an attitude of informed skepticism in order to critically evaluate Google search results.

Course Context (e.g. how it was implemented or integrated): 

The Computer Science Summer Institute Extension Program, or CSSIX, is a 3-week on-campus summer experience for first-year students studying computer science and related STEM fields. This program is designed for groups underrepresented in computing (i.e., women, underrepresented minorities in STEM, and first-generation or low-income college students). https://cssiextension.withgoogle.com/

Potential Pitfalls and Teaching Tips: 

Incoming first-year students were shy and quiet. I revised the lesson plan to include more Think-Pair-Share and less all-class discussion. The instructor should model the Google Images activity first (Professor Style), then let students do the second activity (Computer Scientist) together in pairs. The students liked “partner time.” This was a summer bridge program, so we decided to keep the worksheets short and the activities social (students talking to each other).

Suggested Citation: 
Acosta, Elisa. "Exploring Algorithmic Bias with a Summer Bridge Program." CORA (Community of Online Research Assignments), 2018. https://projectcora.org/assignment/exploring-algorithmic-bias-summer-bridge-program.