COAP:Privacy - part 2

Introduction

This is part two of a teaching module on privacy.

Lesson two will feature student presentations and discussion.

Instructions

  1. Each student must read one paper from the reading list below (or part of it). Work will be distributed at the end of lesson 1.
  2. Please come back with the following items found in your reading:
    1. One or two important idea(s) or fact(s) found in the article (issues)
    2. One guideline for either institutions that collect data or individuals that provide data
    3. One question you would like to discuss

Copies of these papers are available on the course intranet. The instructor will give you a login and password. Otherwise, you may try to obtain them through Webster's library online service.

Lesson 2 preparation - reading list

Pick one text from the following (to be negotiated at the end of lesson 1).

Defining (Internet) Privacy

  • Privacy, Stanford Encyclopedia of Philosophy, First published Tue May 14, 2002; substantive revision Fri Aug 9, 2013
    • Student:
    • Summary: This article discusses the multiple facets of privacy. Good, but somewhat difficult reading.
    • Read all


  • Daniel J. Solove (2006). A Taxonomy of Privacy, University of Pennsylvania Law Review.

Predictive modeling

  • Honghao Wei, Fuzheng Zhang, Nicholas Jing Yuan, Chuan Cao, Hao Fu, Xing Xie, Yong Rui, and Wei-Ying Ma. 2017. Beyond the Words: Predicting User Personality from Heterogeneous Information. In Proceedings of the Tenth ACM International Conference on Web Search and Data Mining (WSDM '17). ACM, New York, NY, USA, 305-314. DOI: https://doi.org/10.1145/3018661.3018717
    • Read the summary: Beyond the words: predicting user personality from heterogeneous information (February 16, 2017). A schematic code sketch of this kind of feature combination appears at the end of this section.
    • Student: TO
    • Quote: Previous studies have demonstrated that language usage in social media is effective in personality prediction. However, except for single language features, a less researched direction is how to leverage the heterogeneous information on social media to have a better understanding of user personality. In this paper, we propose a Heterogeneous Information Ensemble framework, called HIE, to predict users' personality traits by integrating heterogeneous information including self-language usage, avatar, emoticon, and responsive patterns.


  • Michal Kosinski, David Stillwell, and Thore Graepel, Private traits and attributes are predictable from digital records of human behavior, PNAS 2013 110 (15) 5802-5805; published ahead of print March 11, 2013, doi:10.1073/pnas.1218772110
    • Student: TG
    • Quote: We show that easily accessible digital records of behavior, Facebook Likes, can be used to automatically and accurately predict a range of highly sensitive personal attributes including: sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender.
    • See also: MyPersonality Database
    • Read the whole article
    • Kosinski-PNAS-2013.pdf (access restricted)
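
The following is a schematic sketch (in Python, assuming numpy and scikit-learn are available) of the idea described in the Wei et al. quote: one simple model per information source (language use, avatar, emoticons, responsiveness), whose predictions are then combined. The feature groups and data are synthetic stand-ins; this illustrates the general approach only and is not the authors' HIE implementation.

  # Hypothetical sketch: combining heterogeneous feature groups to predict a
  # binary personality trait. Data and feature groups are synthetic stand-ins.
  import numpy as np
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split
  from sklearn.metrics import roc_auc_score

  rng = np.random.default_rng(1)
  n = 800
  groups = {                                  # one feature block per information source
      "language": rng.normal(size=(n, 30)),   # e.g. word/topic usage statistics
      "avatar": rng.normal(size=(n, 10)),     # e.g. colour/face features of the avatar
      "emoticon": rng.normal(size=(n, 5)),    # e.g. emoticon frequencies
      "responsive": rng.normal(size=(n, 3)),  # e.g. reply delay and frequency
  }
  # Toy trait that depends a little on every group
  trait = ((groups["language"][:, 0] + groups["avatar"][:, 0]
            + groups["emoticon"][:, 0] + groups["responsive"][:, 0]) > 0).astype(int)

  idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3, random_state=1)

  # Train one simple model per group, then average the predicted probabilities.
  probas = []
  for X in groups.values():
      model = LogisticRegression(max_iter=1000).fit(X[idx_train], trait[idx_train])
      probas.append(model.predict_proba(X[idx_test])[:, 1])
  ensemble = np.mean(probas, axis=0)
  print("ensemble AUC:", round(roc_auc_score(trait[idx_test], ensemble), 3))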

Young people's behavior

  • Boyd, Danah and Marwick, Alice E., Social Privacy in Networked Publics: Teens’ Attitudes, Practices, and Strategies (September 22, 2011). A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, September 2011. Available at SSRN: http://ssrn.com/abstract=1925128
    • Student: AT
    • This paper is an ethnographic study of how teens understand and manage privacy.
    • Read at least "Privacy in Public" (last section) plus another section on a topic that is of interest.
    • boyd-marwick-2011.pdf (access restricted)
  • Hoofnagle, Chris Jay and King, Jennifer and Li, Su and Turow, Joseph, How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies? (April 14, 2010). Available at SSRN: http://ssrn.com/abstract=1589864 or http://dx.doi.org/10.2139/ssrn.1589864
    • Student: AA
    • Quote: We conclude then that that young-adult Americans have an aspiration for increased privacy even while they participate in an online reality that is optimized to increase their revelation of personal data.
    • Skim the whole paper
    • hoofnagle-et-al-2010.pdf (access restricted)


  • Marwick, Alice E. and Murgia-Diaz, Diego and Palfrey, John G., Youth, Privacy and Reputation (Literature Review). Berkman Center Research Publication No. 2010-5; Harvard Public Law Working Paper No. 10-29. Available at SSRN: http://ssrn.com/abstract=1588163 (80 pages)
    • Student: DLDP
    • Quote: The scope of this literature review is to map out what is currently understood about the intersections of youth, reputation, and privacy online, focusing on youth attitudes and practices. We summarize both key empirical studies from quantitative and qualitative perspectives and the legal issues involved in regulating privacy and reputation. This project includes studies of children, teenagers, and younger college students.
    • Read pages 60-65
    • marwick-et-al-2010.pdf (access restricted)

Privacy on the Internet - practical and technical issues


  • Seda Gürses. 2014. Can you engineer privacy?. Commun. ACM 57, 8 (August 2014), 20-23. DOI=10.1145/2633029 http://doi.acm.org/10.1145/2633029
    • Student:
    • Quote: We cannot engineer society, but neither are our societies independent of the systems we engineer. Hence, as practitioners and researchers we have the responsibility to engineer systems that address privacy concerns.
    • Read the whole article (pages 20-23)
    • gurses-2014.pdf (access restricted)


  • Paul Weiser and Simon Scheider. 2014. A civilized cyberspace for geoprivacy. In Proceedings of the 1st ACM SIGSPATIAL International Workshop on Privacy in Geographic Information Collection and Analysis (GeoPrivacy '14), Carsten Kessler, Grant D. McKenzie, and Lars Kulik (Eds.). ACM, New York, NY, USA, Article 5, 8 pages. DOI=10.1145/2675682.2676396 http://doi.acm.org/10.1145/2675682.2676396


  • Fabian, B., Bender, B., & Weimann, L. (2015). E-Mail Tracking in Online Marketing - Methods, Detection, and Usage. Wirtschaftsinformatik (pp. 1100-1114).
    • Student: AF
    • Quote: E-Mail tracking uses personalized links and pictures for gathering information on user behavior, for example, where, when, on what kind of device, and how often an e-mail has been read. This information can be very useful for marketing purposes. On the other hand, privacy and security requirements of customers could be violated by tracking.
    • Read all (a minimal sketch of the tracking-pixel mechanism appears below this entry)
    • fabian-2015.pdf (access restricted)
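
The Fabian et al. quote above describes tracking via personalized links and pictures. The sketch below (Python standard library only; tracker.example.com, recipient_id, and port 8080 are made-up illustrative names, not anything from the paper) shows the basic mechanism: a per-recipient 1x1 image URL is embedded in the HTML mail body, and a tiny server logs each time that image is fetched, revealing when, from which address, and with which client the message was opened.

  # Hypothetical sketch of an e-mail tracking pixel (illustration only).
  from http.server import BaseHTTPRequestHandler, HTTPServer
  from datetime import datetime, timezone
  from urllib.parse import urlparse, parse_qs

  # What a sender would embed in the HTML body of the message
  # (recipient_id is a made-up per-recipient token):
  HTML_SNIPPET = ('<img src="https://tracker.example.com/pixel.gif?recipient_id=abc123"'
                  ' width="1" height="1">')

  # Smallest transparent GIF, returned as the "pixel"
  PIXEL_GIF = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00!"
               b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00\x00"
               b"\x02\x02D\x01\x00;")

  class PixelHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          query = parse_qs(urlparse(self.path).query)
          # Every request for the pixel reveals an "open" event for that recipient.
          print("opened by:", query.get("recipient_id", ["?"])[0],
                "| at:", datetime.now(timezone.utc).isoformat(),
                "| from:", self.client_address[0],
                "| client:", self.headers.get("User-Agent", "unknown"))
          self.send_response(200)
          self.send_header("Content-Type", "image/gif")
          self.send_header("Content-Length", str(len(PIXEL_GIF)))
          self.end_headers()
          self.wfile.write(PIXEL_GIF)

  if __name__ == "__main__":
      HTTPServer(("0.0.0.0", 8080), PixelHandler).serve_forever()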


  • Solon Barocas and Helen Nissenbaum. 2014. Big data's end run around procedural privacy protections. Commun. ACM 57, 11 (October 2014), 31-33. DOI=10.1145/2668897 http://doi.acm.org/10.1145/2668897
    • Student: VF
    • Quote: When consent is given (or not withheld) or the data is anonymized, virtually any information practice becomes permissible.
    • Read all
    • barocas-et-al-2014.pdf (access restricted)


  • Tanmay Sinha, Vrns Srikanth, Mangal Sain, and Hoon Jae Lee. 2013. Trends and research directions for privacy preserving approaches on the cloud. In Proceedings of the 6th ACM India Computing Convention (Compute '13). ACM, New York, NY, USA, Article 21, 12 pages. DOI=10.1145/2522548.2523138 http://doi.acm.org/10.1145/2522548.2523138



  • Robert Faris and David R. O’Brien, Data and Privacy, in Gasser, Urs and Zittrain, Jonathan and Faris, Robert and Heacock Jones, Rebekah, Internet Monitor 2014: Reflections on the Digital World: Platforms, Policy, Privacy, and Public Discourse (December 15, 2014). Berkman Center Research Publication No. 2014-17. Available at SSRN: http://ssrn.com/abstract=2538813.
    • Student:
    • Quote: The mismatch between traditional mechanisms for preserving privacy and the realities of digital networks are more apparent each day. The Internet, “the world’s biggest copy machine,”1 has eliminated the principal mechanism for preserving privacy; it used to be expensive to record and maintain information on the everyday comings and goings of citizens.
    • Read the introduction (p. 63-65) plus 2-3 following ultra-short articles
    • gasser-et-al-2014.pdf

Mobile apps and other data from your mobile


Privacy in Internet-supported research

  • John Leslie King. 2015. Humans in computing: growing responsibilities for researchers. Commun. ACM 58, 3 (February 2015), 31-33. DOI=10.1145/2723675 http://doi.acm.org/10.1145/2723675
    • Student:
    • Quote: Open issues regarding human welfare will not be settled using an authoritarian approach. Computing researchers in universities and companies cannot do whatever they like. Doctoral students and postdoctoral fellows should be aware of science and engineering ethics. Ethical concerns must lead professional practice and regulation, not the other way around.
    • Read all
    • king-2015.pdf (access restricted)

Political action, law and opinions

  • Liberty in the age of technology. ACLU, 2014, (3 pages)
    • Student: ILS
    • Quote: Increasing government surveillance worldwide raises tough questions for democracy and civil liberty. Left unchecked, the deployment of intrusive new technologies poses a profound threat to individual privacy. What we need, says Barry Steinhardt, is stronger regulation to ensure that such technology is used fairly – by governments and businesses alike.
    • Read the whole article



  • Bolton, Robert Lee, The Right to Be Forgotten: Forced Amnesia in a Technological Age (October 15, 2014). 31 J. Marshall J. Info. Tech. & Privacy L. 133 (2015); John Marshall Journal of Computer & Information Law, Forthcoming. Available at SSRN: http://ssrn.com/abstract=2513652
    • Quote: In much of Europe, among the citizenry’s rights is a legal concept referred to as le droit à l’oubli. This “right to be forgotten” is a nebulous term whose exact meaning varies by country, but can generally be defined as the right of an individual to control data pertaining to them and have it destroyed if they so desire
    • Student: PAB
    • Read sections "Introduction", "The Law abroad" and Conclusion
    • bolton-2015.pdf (access restricted)

Use of medical e-health data

Apple vs FBI

  • FBI–Apple encryption dispute (Wikipedia). https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_dispute
    • Student: GSH
    • Quote: The FBI–Apple encryption dispute concerns whether and to what extent courts in the United States can compel manufacturers to assist in unlocking cell phones whose data are cryptographically protected. There is much debate over public access to strong encryption.
    • Read the whole article

Day two organization

  • Each student will present three items (an important issue, a guideline, and a question)

List of student presentations:

  1. Student: TO - Beyond the words: predicting user personality from heterogeneous information
  2. Student: TG - Private traits and attributes are predictable from digital records of human behavior
  3. Student: AT - Social Privacy in Networked Publics: Teens’ Attitudes, Practices, and Strategies
  4. Student: AA - How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies?
  5. Student: DLDP - Youth, Privacy and Reputation
  6. Student: FP - Who Should Take Care of Identity, Privacy and Reputation
  7. Student: AF - E-Mail Tracking in Online Marketing-Methods, Detection, and Usage
  8. Student: VF - Big data's end run around procedural privacy protections
  9. Student: AT - Trends and research directions for privacy preserving approaches on the cloud
  10. Student: MH - Results of the 2014 Global Privacy Enforcement Network Sweep
  11. Student: ILS - Liberty in the age of technology
  12. Student: PAB - The Right to Be Forgotten: Forced Amnesia in a Technological Age
  13. Student: GSH - FBI–Apple encryption dispute

Summary of issues

Predicting user personality

- Information such as language use and avatars is used to predict personality traits.

- Companies also collect feedback via surveys.

What are the drawbacks of this information being saved and taken from individuals?

Everyone on social media leaves a footprint with likes. Through correlation, certain traits can be guessed, such as sexual preferences and intelligence. People should be careful with likes: everything you do online can be used against you.
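
A minimal sketch of how this kind of correlation-based prediction can work in practice (assuming numpy and scikit-learn; the like matrix and trait labels are synthetic, and this does not reproduce the exact pipeline of the papers above): a binary user-by-like matrix is compressed with a truncated SVD, and a logistic regression then predicts a single trait from the components.

  # Hypothetical sketch: predicting a binary trait from a user x like matrix.
  import numpy as np
  from sklearn.decomposition import TruncatedSVD
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split
  from sklearn.metrics import roc_auc_score
  from sklearn.pipeline import make_pipeline

  rng = np.random.default_rng(0)
  n_users, n_likes = 1000, 500
  likes = rng.integers(0, 2, size=(n_users, n_likes))  # 1 = user liked item j
  # Toy trait loosely correlated with a subset of likes
  trait = (likes[:, :20].sum(axis=1) + rng.normal(0, 2, n_users) > 10).astype(int)

  X_train, X_test, y_train, y_test = train_test_split(
      likes, trait, test_size=0.3, random_state=0)

  model = make_pipeline(
      TruncatedSVD(n_components=50, random_state=0),  # compress the like patterns
      LogisticRegression(max_iter=1000),              # predict the trait from components
  )
  model.fit(X_train, y_train)
  print("AUC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))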

Behavior of younger persons

Teenagers' attitudes toward social privacy: a common assumption is that they do not care about privacy, but they use different strategies, such as coded references or deleting and editing content. Teenagers are not very concerned about third-party companies looking at their information; they are more preoccupied by personal acquaintances reading their social media profiles. What is socially acceptable?


Who is concerned

Technical issues (e-mail tracking, big data, protection)

Privacy enforcement, liberty, rights

Privacy vs. crime

Ideas for guidelines

See also:

Additional resources
