
Submitted by Carolyn Schubert on May 6th, 2021
Short Description: 

We use Google every day, but do we really understand why we get certain results? This event will explain what an algorithm is, how search engines use them, and how bias shows up in our search results. Attendees will have a chance to reflect on the ways biased results can echo larger societal biases in representation. Access this site at your convenience at: https://jmu.libwizard.com/f/algorithms-bias
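
For readers who want a concrete picture of what "algorithm" means here, the sketch below is a hypothetical illustration (not part of the original tutorial): a toy ranking rule that rewards past clicks pushes already-popular results to the top, so any bias in user behavior feeds back into the results.

    # Hypothetical illustration only: a ranking rule that adds a bonus for
    # historical clicks lets existing user behavior, including biased
    # behavior, decide which results surface first.
    results = [
        {"title": "Result A", "relevance": 0.9, "clicks": 120},
        {"title": "Result B", "relevance": 0.9, "clicks": 4500},
    ]

    def rank(results, click_weight=0.001):
        # Score each result by text relevance plus a popularity bonus.
        return sorted(
            results,
            key=lambda r: r["relevance"] + click_weight * r["clicks"],
            reverse=True,
        )

    for r in rank(results):
        # Result B outranks Result A despite identical relevance scores.
        print(r["title"])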

Co-creators: Malia Willey and Alyssa Young.

Attachments: 
Tutorial Outline.docx (36.47 KB)
Learning Outcomes: 

Learning goals: 

  • Defining a broader context for algorithms 

  • Analyzing Google results for algorithmic bias 

  • Identifying actions for countering algorithmic bias

Suggested Citation: 
Schubert, Carolyn. "What’s Behind a Web Search? Bias and Algorithms." CORA (Community of Online Research Assignments), 2021. https://projectcora.org/assignment/what%E2%80%99s-behind-web-search-bias-and-algorithms.
Submitted by Tara Cataldo on September 29th, 2020
Short Description: 

The assignment has students search the same topic in Google and the Web of Science or BIOSIS database. They are asked to pick one result from each search, identify its components (title, author, year) and identify the container of the information (journal, book, news, etc.). They are then asked to compare and reflect on the different results. 

Attachments: 
Google vs WoS assignment.pdf (93.55 KB)
Learning Outcomes: 
  1. Examine the difference between searching the open web and a literature database
  2. Identify the containers of digital information
  3. List the parts of a scholarly citation 

 

Suggested Citation: 
Cataldo, Tara. "Google vs. Web of Science." CORA (Community of Online Research Assignments), 2020. https://projectcora.org/assignment/google-vs-web-science.
Submitted by Alexandria Chisholm on March 23rd, 2020
Short Description: 

The Penn State Berks Privacy Workshop Series focuses on privacy issues for students in the past, present, and future. The Privacy Workshop spotlights privacy practices and concerns in the current moment; Digital Leadership explores future implications of past and current digital behaviors; Digital Shred provides tools to evaluate and mitigate the damage of past digital behaviors; and Digital Wellness focuses on privacy across the lifespan, bringing together the past, present, and future by finding a balance of technology and wellness while aligning habits and goals. Each workshop is grounded in theory, countering approaches that overpromise user control in the face of information asymmetries and the control paradox, and embraces students’ autonomy and agency by avoiding prescribed solutions and instead encouraging decision-making frameworks.

Attachments: 
PersonalDataIntegrityPlan_DigitalShred_PennStateBerks.pdf (625.92 KB)
DamageAssessment_IdealPortfolio_DigitalShred_PennStateBerks.pdf (786.93 KB)
DigitalShredLessonPlan_Chisholm_HartmanCaverly_Glenn.pdf (165.59 KB)
Learning Outcomes: 

In the Digital Shred Workshop, students will be able to:

  1. Reflect on and describe their digital privacy priorities in order to articulate the benefits and risks of their digital dossier
  2. Apply a growth mindset to critically examine their current data exhaust / digital footprint and recognize when change is needed
  3. Develop a Personal Data Integrity Plan that makes routine the process of auditing and updating their digital dossier in alignment with their privacy values
  4. Describe “digital shred” and its importance.
Discipline: 
Multidisciplinary

Suggested Citation: 
Chisholm, Alexandria. "Digital Shred Workshop." CORA (Community of Online Research Assignments), 2020. https://projectcora.org/assignment/digital-shred-workshop.
Submitted by Carolyn Caffrey on December 10th, 2018
Short Description: 

This assignment was created for a credit-bearing course for first-year students. It's designed to help students take what they've learned about algorithmic bias from the course lectures and readings and apply it to their own search practices. They also critically analyze search results for advertisements and compare DuckDuckGo to Google. [You could also look at this assignment as an adaptation of Jacob Berg's wonderful "Googling Google" assignment at https://www.projectcora.org/assignment/googling-google-search-engines-ma... ]

Attachments: 
analyzingsearchengines_assignment.docx (534.31 KB)
Learning Outcomes: 

Students will be able to:

  • Identify advertisements within a list of search results
  • Discuss the role advertising plays in how search results are ordered
  • Describe how search results are impacted by human biases in their ranking algorithms

Discipline: 
Multidisciplinary

Course Context (e.g. how it was implemented or integrated): 

This assignment occurred early in the semester as we discussed algorithms, bias, and filter bubbles. Students were asked to draw on class discussions and lectures on PageRank, the history of search engines, and filter bubbles. Other assigned material going into this assignment included the IRL podcast episode "Social Bubble Bath" and Eli Pariser's TED talk on filter bubbles. Students commented that they enjoyed this assignment, weren't aware that Google was an advertising company, and were unfamiliar with DuckDuckGo. The course itself was designed and taught by me (a librarian) as part of our first-year seminar program.

Assessment or Criteria for Success
Assessment Short Description: 
Assignments were evaluated using the rubric from the attached assignment sheet. In general, students had difficulty identifying all of the advertisements. While students had no difficulty analyzing gender bias or racism in the image results, they did struggle with the search term "god" when identifying how the results may privilege particular narratives and identities over others.
Potential Pitfalls and Teaching Tips: 

Be careful with the choose-your-own image search: several students picked topics such as our institution name or vague concepts like "music," which didn't illustrate the course concepts as clearly. In the future I would remove the choose-your-own option for the image component. This assignment was designed with first-year students in mind.

Suggested Citation: 
Caffrey, Carolyn. "Analyzing search engines: What narrative is told through the algorithm." CORA (Community of Online Research Assignments), 2018. https://projectcora.org/assignment/analyzing-search-engines-what-narrative-told-through-algorithm.
Submitted by Alexandria Chisholm on December 6th, 2018
Short Description: 

This workshop delivers an action-oriented introduction to personal data privacy designed for new college students. The session is designed to reveal the systems in place to collect and analyze online behavioral data, and to unveil the real-world consequences of online profiling in contexts like sentiment shaping, consumer preferences, employment, healthcare, personal finance, and law enforcement. In lieu of a prescriptive approach, students analyze case studies to observe how online behaviors impact real-world opportunities and reflect on the benefits and risks of technology use to develop purposeful online behaviors and habits that align with their individual values. Developing knowledge practices regarding privacy and the commodification of personal information and embodying the core library values of privacy and intellectual freedom, the workshop promotes a proactive rather than reactive approach and presents a spectrum of privacy preferences across a range of contexts in order to respect students’ autonomy and agency in personal technology use.

Attachments: 
PersonalDataPlan_PennStateBerks.pdf (622.24 KB)
PrivacyWorkshopLessonPlan_Chisholm_Hartman-Caverly.pdf (189.3 KB)
Learning Outcomes: 

Students will be able to:

  1. Recognize how their personal data and metadata are collected, along with the potential implications of such data collection
  2. Assess how their data is shared and make informed, intentional choices to safeguard their privacy
  3. Identify privacy issues facing our society
  4. Describe the positive case for privacy as a human right fundamental to individual well-being

Discipline: 
Multidisciplinary

Suggested Citation: 
Chisholm, Alexandria. "Privacy Workshop." CORA (Community of Online Research Assignments), 2018. https://projectcora.org/assignment/privacy-workshop.
Submitted by Elisa Acosta on October 28th, 2018
Short Description: 

This 30-minute activity was a quick introduction to algorithmic bias and the importance of critically evaluating search engine results. Algorithms increasingly shape modern life and can perpetuate bias and discrimination. In pairs, students analyzed the results from Google Image searches and Google Autocomplete suggestions. This activity was based on “Algorithms of Oppression: How Search Engines Reinforce Racism,” by Safiya Umoja Noble. This lesson plan was Part 1 of an hour-long workshop that also included a 30-minute Google Scholar activity. Please see Jennifer Masanaga's Google Scholar activity for Part 2: https://www.projectcora.org/assignment/exploring-google-scholar-summer-b...

Attachments: 
Lesson Plan (154.64 KB)
Presentation slides (3.37 MB)
Worksheet (326.34 KB)
Suggested Readings (65.96 KB)
Learning Outcomes: 

  1. Students will discuss the effects of algorithmic bias in order to articulate how some individuals or groups of individuals may be misrepresented or systematically marginalized in search engine results.
  2. Students will develop an attitude of informed skepticism in order to critically evaluate Google search results.

Course Context (e.g. how it was implemented or integrated): 

The Computer Science Summer Institute Extension Program, or CSSIX, is a 3-week on-campus summer experience for first-year students studying computer science and related STEM fields. This program is designed for groups underrepresented in computing (i.e., women, underrepresented minorities in STEM, and first-generation or low-income college students). https://cssiextension.withgoogle.com/

Potential Pitfalls and Teaching Tips: 

Incoming first-year students were shy and quiet. I revised the lesson plan to include more Think-Pair-Share and less all-class discussion. The instructor should model the Google Images activity first (Professor Style), then let students do the second activity (Computer Scientist) together in pairs. The students liked “partner time.” This was a summer bridge program, so we decided to keep the worksheets short and the activities social (students talking to each other).

Suggested Citation: 
Acosta, Elisa. "Exploring Algorithmic Bias with a Summer Bridge Program." CORA (Community of Online Research Assignments), 2018. https://projectcora.org/assignment/exploring-algorithmic-bias-summer-bridge-program.