- Talk about your digital footprint and what it is.
- Why you should be aware of what is online about you
- Steps you can take to manage your online presence
- Google yourself
- Remove your profiles on people search engines
- Make sure your social media data isn’t searchable on search engines like Google
The term “data brokers” describes companies that collect personal data about individuals and sell it to other companies for a variety of purposes. Reports suggest many data brokers hold thousands of pieces of information on most Americans, including demographic details like your full name, address, age, gender, income, education, and occupation, as well as information about your health, likes and dislikes, and more.
Data brokers combine many sources of data together to create profiles for each person. Sources of data include social media, credit cards, browsing history, and government records. Depending on the amount and quality of data in each record, these profiles can be worth a lot of money to advertisers and others.
There are three main types of data brokers: people search sites like Spokeo, marketing brokers like Datalogix, and risk-mitigation brokers like ID Analytics.
- MORE ABOUT DIFFERENT TYPES
- Add info on how to get yourself removed and why you might want to do that. See https://www.vice.com/en_us/article/ne9b3z/how-to-get-off-data-broker-and-people-search-sites-pipl-spokeo
The Data Brokers: Selling your personal information: https://www.cbsnews.com/news/the-data-brokers-selling-your-personal-information/
What Are ‘Data Brokers,’ and Why Are They Scooping Up Information About You?: https://www.vice.com/en_us/article/bjpx3w/what-are-data-brokers-and-how-to-stop-my-private-data-collection
Privacy has become a buzzword in recent years as news headlines regularly cover data breaches, social media scandals, facial recognition programs, and more. But how do these headlines affect your day-to-day life?
You share data every day, through the apps you use on your phone, the websites you visit, and the credit cards you use, as well as cameras and other tools that capture your movement and location. A lot of the time, data is collected and shared in the background, with no way for you to know it’s happening.
This data is often used by companies to create targeted advertising, but we’re increasingly seeing examples of data being used to make important decisions. For example, Target used customer data to create an algorithm that predicts whether a shopper is pregnant and estimates her due date. Amazon recently patented a technology that would let its Alexa personal assistant act like a doctor and diagnose you based on verbal cues the device captures. Researchers have suggested that social media posts can predict a range of health conditions, from diabetes to alcoholism and mental health disorders.
While some of these advances may help with our health and wellness, there is a darker side to this data collection and analysis. What if your social media posts were used to make hiring and firing decisions? What if your HR department gave you a Fitbit, then used the data to increase your insurance premiums? These scenarios are not movie plotlines; they are already happening.
Whether you use social media, online banking, or any of the other digital tools out there, you should be able to make informed decisions about what data you share and what happens to that data once you share it.
The Incredible Story Of How Target Exposed A Teen Girl’s Pregnancy: https://www.businessinsider.com/the-incredible-story-of-how-target-exposed-a-teen-girls-pregnancy-2012-2
Amazon patents new Alexa feature that knows when you’re ill and offers you medicine: https://www.telegraph.co.uk/technology/2018/10/09/amazon-patents-new-alexa-feature-knows-offers-medicine/
Facebook posts could help doctors spot alcoholism, diabetes or depression, study says: https://www.cnet.com/news/facebook-activity-might-help-predict-mental-and-physical-health-study-says/
Libraries have become an important point of online access for many low-income families who use library resources to perform a range of activities from online shopping to job applications (Powell et al., 2010). Many of these online transactions involve entering sensitive personal information like social security numbers. Because of this, protecting patrons’ personal information on public library computers is critical. Librarians, therefore, must have a strong understanding of the risks patrons face when sharing personal information and be able to communicate these risks to patrons.
In this study, we build on our previous findings on librarians as information intermediaries to identify the privacy and security challenges library staff face when assisting patrons. In many cases, patrons have unrealistic expectations about librarians’ knowledge of various types of online transactions and the devices used to complete them, or assume that librarians can complete online forms on their behalf. Patrons are often more focused on completing the task at hand, with the librarian’s help, than on learning the skills to complete similar tasks on their own in the future.
A primary goal of this larger project is to develop resources that help librarians and patrons navigate online privacy and security concerns at the library. Our previous research has revealed the need for library staff to have clear policies to refer to when assisting patrons with online activities that involve sensitive information. Therefore, we are developing a policy framework to guide libraries in creating or updating their own policies on how library staff should approach privacy and security issues.
To develop the privacy framework, we used a cooperative inquiry method, ideating and iterating on the framework with library staff through participatory design (PD) techniques such as sticky noting and the big-paper approach (Druin, 1999, 2005; Guha et al., 2005). The PD process allowed us to develop the framework organically, taking into account the varied experiences and opinions of library staff, whose patrons come from a variety of demographic backgrounds and have different information needs.
Druin, A. (1999). Cooperative inquiry: Developing new technologies for children with children. In CHI ’99: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 592-599.
Druin, A. (2005). What children can teach us: Developing digital libraries for children. Library Quarterly, 75(1), 20-41.
Guha, M., Druin, A., Chipman, G., et al. (2005). Working with young children as technology design partners. Communications of the ACM, 48(1), 39-42.
Powell, A., Bryne, A., & Dailey, D. (2010). The Essential Internet: Digital Exclusion in Low-Income American Communities. Policy & Internet, 2(2), 159–190.
Leveraging Funds of Knowledge to Manage Privacy Practices in Families
ABSTRACT Information and communication technologies play a critical role at home, school, and work for people of all ages. At the same time, use of these technologies can present challenges to privacy and security. In this study, we apply the concept of funds of knowledge to understand how families develop knowledge and skills around using technology and protecting personal information. Funds of knowledge explains how people gain knowledge and highlights how learning happens in a variety of environments beyond the classroom. Through interviews with 52 families living in economically disadvantaged communities in the United States, we develop a typology of privacy funds of knowledge in families. We also explore how privacy funds of knowledge inform families’ privacy practices. We conclude the paper by identifying how these findings inform the development of resources for families to further enhance their digital practices.
Full paper link.
Our paper based on the family interviews has been accepted at CSCW 2019!
“I Knew It Was Too Good To Be True”: The Challenges Economically Disadvantaged Users Face in Assessing Trustworthiness, Avoiding Scams, and Developing Self-Efficacy Online
In the U.S., consumers increasingly turn to the internet and mobile apps to complete essential personal transactions, ranging from financial payments to job applications. This shift to digital transactions can create challenges for those without reliable home internet connections or with limited digital literacy by requiring them to submit sensitive information on public computers or on unfamiliar websites. Using interviews with 52 families from high-poverty communities in the mid-Atlantic region of the U.S., we explore the compounding privacy and security challenges that economically disadvantaged individuals face when navigating online services. We describe the real, perceived, and unknown risks they face as they navigate online transactions with limited technical skills, as well as the strategies and heuristics they employ to minimize these risks. The findings highlight a complex relationship between participants’ negative experiences and their general mistrust of sharing data through online channels. We also describe a range of strategies participants use to try to protect their personal information. Based on these findings, we offer design recommendations to inform the creation of educational resources that we will develop in the next phase of this project.
Our paper has been accepted by iConference 2018, Sheffield, UK!
Our paper, “Librarians as Information Intermediaries: Navigating Tensions Between Being Helpful and Being Liable,” has been accepted at iConference 2018 in Sheffield, UK. As part of the conference proceedings, it will be published in Lecture Notes in Computer Science.
Safe Data | Safe Families Project Awarded the 2016 IMLS Research Grant
The University of Maryland, partnering with the Maryland State Department of Education’s Division of Library Development & Services (DLDS), the American Library Association’s Center for the Future of Libraries (CFL), and CASA de Maryland, will identify the privacy and security challenges that librarians and families of low socioeconomic status face when using internet and communication technologies. After evaluating the challenges facing librarians, families of low socioeconomic status, and information intermediaries within families, such as the children and young adults who serve as information brokers, the partners will develop a suite of educational and professional development resources for librarians and families to enhance privacy-related digital skills and to minimize risks to the security of individuals’ personal information. Up to 50 local families and 40 librarians will participate in the study, and data, research reports, and resources will be made accessible to the broader field through a project website, webinars, conference presentations, stakeholder talks, and social media.