COAP:Privacy - part 2

From EduTech Wiki

1 Introduction

This is part two of a teaching module on privacy.

Lesson two will feature student presentations and discussion.

Instructions

  1. Each student must read one paper from the reading list below (or part of it). Work will be distributed at the end of lesson 1.
  2. Please come back with the following items found in your reading:
    1. One or two important idea(s) or fact(s) found in the article (issues)
    2. One guideline for either institutions that collect data or individuals that provide data
    3. One question you would like to discuss

Copies of these papers are available through an Intranet. The instructor will give you a login + password. Otherwise, you may try to obtain them through Webster's library online service.

2 Lesson 2 preparation - reading list

Pick one text among the following (assignments will be negotiated at the end of lesson 1).

2.1 Defining (Internet) Privacy

  • Privacy, Stanford Encyclopedia of Philosophy, First published Tue May 14, 2002; substantive revision Fri Aug 9, 2013
    • Student:
    • Summary: This article discusses the multiple facets of privacy. Good, but somewhat difficult reading.
    • Read all


  • Daniel J. Solove (2006). Taxonomy Of Privacy, University of Pennsylvania Law Review.

2.2 Predictive modeling

  • Honghao Wei, Fuzheng Zhang, Nicholas Jing Yuan, Chuan Cao, Hao Fu, Xing Xie, Yong Rui, and Wei-Ying Ma. 2017. Beyond the Words: Predicting User Personality from Heterogeneous Information. In Proceedings of the Tenth ACM International Conference on Web Search and Data Mining (WSDM '17). ACM, New York, NY, USA, 305-314. DOI: https://doi.org/10.1145/3018661.3018717
    • Read the summary: Beyond the words: predicting user personality from heterogeneous information February 16, 2017
    • Student: TO
    • Quote: Previous studies have demonstrated that language usage in social media is effective in personality prediction. However, except for single language features, a less researched direction is how to leverage the heterogeneous information on social media to have a better understanding of user personality. In this paper, we propose a Heterogeneous Information Ensemble framework, called HIE, to predict users' personality traits by integrating heterogeneous information including self-language usage, avatar, emoticon, and responsive patterns.


  • Michal Kosinski, David Stillwell, and Thore Graepel, Private traits and attributes are predictable from digital records of human behavior, PNAS 2013 110 (15) 5802-5805; published ahead of print March 11, 2013, doi:10.1073/pnas.1218772110
    • Student: TG
    • Quote: We show that easily accessible digital records of behavior, Facebook Likes, can be used to automatically and accurately predict a range of highly sensitive personal attributes including: sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender.
    • See also: MyPersonality Database
    • Read the whole article
    • Kosinski-PNAS-2013.pdf (access restricted)

2.3 Young people's behavior

  • Boyd, Danah and Marwick, Alice E., Social Privacy in Networked Publics: Teens’ Attitudes, Practices, and Strategies (September 22, 2011). A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, September 2011. Available at SSRN: http://ssrn.com/abstract=1925128
    • Student: AT
    • This paper is an ethnographic study of what teens' privacy is and how it is managed
    • Read at least "Privacy in Public" (last section) plus another section on a topic that is of interest.
    • boyd-marwick-2011.pdf (access restricted)
  • Hoofnagle, Chris Jay and King, Jennifer and Li, Su and Turow, Joseph, How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies? (April 14, 2010). Available at SSRN: http://ssrn.com/abstract=1589864 or http://dx.doi.org/10.2139/ssrn.1589864
    • Student: AA
    • Quote: We conclude then that young-adult Americans have an aspiration for increased privacy even while they participate in an online reality that is optimized to increase their revelation of personal data.
    • Skim the whole paper
    • hoofnagle-et-al-2010.pdf (access restricted)


  • Marwick, Alice E. and Murgia-Diaz, Diego and Palfrey, John G., Youth, Privacy and Reputation (Literature Review). Berkman Center Research Publication No. 2010-5; Harvard Public Law Working Paper No. 10-29. Available at SSRN: http://ssrn.com/abstract=1588163 (80 pages)
    • Student: DLDP
    • Quote: The scope of this literature review is to map out what is currently understood about the intersections of youth, reputation, and privacy online, focusing on youth attitudes and practices. We summarize both key empirical studies from quantitative and qualitative perspectives and the legal issues involved in regulating privacy and reputation. This project includes studies of children, teenagers, and younger college students.
    • Read pages 60-65
    • marwick-et-al-2010.pdf (access restricted)

2.4 Privacy on the Internet - practical and technical issues


  • Seda Gürses. 2014. Can you engineer privacy?. Commun. ACM 57, 8 (August 2014), 20-23. DOI=10.1145/2633029 http://doi.acm.org/10.1145/2633029
    • Student:
    • Quote: We cannot engineer society, but neither are our societies independent of the systems we engineer. Hence, as practitioners and researchers we have the responsibility to engineer systems that address privacy concerns.
    • Read the whole article (pages 20-23)
    • gurses-2014.pdf (access restricted)


  • Paul Weiser and Simon Scheider. 2014. A civilized cyberspace for geoprivacy. In Proceedings of the 1st ACM SIGSPATIAL International Workshop on Privacy in Geographic Information Collection and Analysis (GeoPrivacy '14), Carsten Kessler, Grant D. McKenzie, and Lars Kulik (Eds.). ACM, New York, NY, USA, Article 5, 8 pages. DOI=10.1145/2675682.2676396 http://doi.acm.org/10.1145/2675682.2676396


  • Fabian, B., Bender, B., & Weimann, L. (2015). E-Mail Tracking in Online Marketing – Methods, Detection, and Usage. Wirtschaftsinformatik (pp. 1100-1114).
    • Student: AF
    • Quote: E-mail tracking uses personalized links and pictures for gathering information on user behavior, for example, where, when, on what kind of device, and how often an e-mail has been read. This information can be very useful for marketing purposes. On the other hand, privacy and security requirements of customers could be violated by tracking.
    • Read All
    • fabian-2015.pdf (access restricted)


  • Solon Barocas and Helen Nissenbaum. 2014. Big data's end run around procedural privacy protections. Commun. ACM 57, 11 (October 2014), 31-33. DOI=10.1145/2668897 http://doi.acm.org/10.1145/2668897
    • Student: VF
    • Quote: When consent is given (or not withheld) or the data is anonymized, virtually any information practice becomes permissible.
    • Read all
    • barocas-et-al-2014.pdf (access restricted)


  • Tanmay Sinha, Vrns Srikanth, Mangal Sain, and Hoon Jae Lee. 2013. Trends and research directions for privacy preserving approaches on the cloud. In Proceedings of the 6th ACM India Computing Convention (Compute '13). ACM, New York, NY, USA, Article 21, 12 pages. DOI=10.1145/2522548.2523138 http://doi.acm.org/10.1145/2522548.2523138



  • Robert Faris and David R. O’Brien, Data and Privacy, in Gasser, Urs and Zittrain, Jonathan and Faris, Robert and Heacock Jones, Rebekah, Internet Monitor 2014: Reflections on the Digital World: Platforms, Policy, Privacy, and Public Discourse (December 15, 2014). Berkman Center Research Publication No. 2014-17. Available at SSRN: http://ssrn.com/abstract=2538813.
    • Student:
    • Quote: The mismatch between traditional mechanisms for preserving privacy and the realities of digital networks are more apparent each day. The Internet, “the world’s biggest copy machine,” has eliminated the principal mechanism for preserving privacy; it used to be expensive to record and maintain information on the everyday comings and goings of citizens.
    • Read the introduction (p. 63-65) plus 2-3 following ultra-short articles
    • gasser-et-al-2014.pdf

2.5 Mobile apps and other data from your mobile


2.6 Privacy in Internet-supported research

  • John Leslie King. 2015. Humans in computing: growing responsibilities for researchers. Commun. ACM 58, 3 (February 2015), 31-33. DOI=10.1145/2723675 http://doi.acm.org/10.1145/2723675
    • Student:
    • Quote: Open issues regarding human welfare will not be settled using an authoritarian approach. Computing researchers in universities and companies cannot do whatever they like. Doctoral students and postdoctoral fellows should be aware of science and engineering ethics. Ethical concerns must lead professional practice and regulation, not the other way around.
    • Read all
    • king-2015.pdf (access restricted)

2.7 Political action, law and opinions

  • Liberty in the age of technology. ACLU, 2014, (3 pages)
    • Student: ILS
    • Quote: Increasing government surveillance worldwide raises tough questions for democracy and civil liberty. Left unchecked, the deployment of intrusive new technologies poses a profound threat to individual privacy. What we need, says Barry Steinhardt, is stronger regulation to ensure that such technology is used fairly – by governments and businesses alike.
    • Read the whole article



  • Bolton, Robert Lee, The Right to Be Forgotten: Forced Amnesia in a Technological Age (October 15, 2014). 31 J. Marshall J. Info. Tech. & Privacy L. 133 (2015); John Marshall Journal of Computer & Information Law, Forthcoming. Available at SSRN: http://ssrn.com/abstract=2513652
    • Quote: In much of Europe, among the citizenry’s rights is a legal concept referred to as le droit à l’oubli. This “right to be forgotten” is a nebulous term whose exact meaning varies by country, but can generally be defined as the right of an individual to control data pertaining to them and have it destroyed if they so desire
    • Student: PAB
    • Read sections "Introduction", "The Law abroad" and Conclusion
    • bolton-2015.pdf (access restricted)

2.8 Use of medical e-health data

2.9 Apple vs FBI

  • FBI–Apple encryption dispute (Wikipedia). https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_dispute
    • Student: GSH
    • Quote: The FBI–Apple encryption dispute concerns whether and to what extent courts in the United States can compel manufacturers to assist in unlocking cell phones whose data are cryptographically protected. There is much debate over public access to strong encryption.
    • Read the whole article

3 Day two organization

  • Each student will present three items (an important issue, a guideline, and a question)

List of student presentations:

  1. Student: TO - Beyond the words: predicting user personality from heterogeneous information
  2. Student: TG - Private traits and attributes are predictable from digital records of human behavior
  3. Student: AT - Social Privacy in Networked Publics: Teens’ Attitudes, Practices, and Strategies
  4. Student: AA - How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies?
  5. Student: DLDP - Youth, Privacy and Reputation
  6. Student: FP - Who Should Take Care of Identity, Privacy and Reputation
  7. Student: AF - E-Mail Tracking in Online Marketing-Methods, Detection, and Usage
  8. Student: VF - Big data's end run around procedural privacy protections
  9. Student: AT - Trends and research directions for privacy preserving approaches on the cloud
  10. Student: MH - Results of the 2014 Global Privacy Enforcement Network Sweep
  11. Student: ILS - Liberty in the age of technology
  12. Student: PAB - The Right to Be Forgotten: Forced Amnesia in a Technological Age
  13. Student: GSH - FBI–Apple encryption dispute

4 Summary of issues

4.1 Predicting user personality

Information such as language use and avatars can be used to predict personality traits.

Companies obtain feedback via surveys they provide to users.

What are the drawbacks of this information being saved and extracted from individuals?

Everyone on social media leaves a footprint with their likes. Through correlation, traits such as sexual orientation and intelligence can be inferred. People should be careful with likes: everything you do can be used against you.
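
The correlation idea summarized above (as in Kosinski et al.) can be illustrated with a minimal, self-contained sketch. The data, page names, and scoring scheme here are invented for illustration; the actual study used far larger datasets and regression models:

```python
# Toy sketch: score each page by the share of its likers who exhibit a
# (binary) trait, then predict a new user's trait as the average score
# of the pages they liked. All data below is made up.
users = {
    "u1": {"likes": {"pageA", "pageB"}, "trait": 1},
    "u2": {"likes": {"pageA", "pageC"}, "trait": 1},
    "u3": {"likes": {"pageC", "pageD"}, "trait": 0},
    "u4": {"likes": {"pageB", "pageD"}, "trait": 0},
}

def page_scores(users):
    """Fraction of each page's likers who have the trait."""
    totals, counts = {}, {}
    for u in users.values():
        for p in u["likes"]:
            totals[p] = totals.get(p, 0) + u["trait"]
            counts[p] = counts.get(p, 0) + 1
    return {p: totals[p] / counts[p] for p in totals}

def predict(likes, scores):
    """Average association score over the pages a new user liked."""
    known = [scores[p] for p in likes if p in scores]
    return sum(known) / len(known) if known else 0.5  # 0.5 = no evidence

scores = page_scores(users)
print(predict({"pageA", "pageB"}, scores))  # → 0.75
```

Even this toy version shows the privacy concern: the new user never stated the trait, yet their likes alone yield a confident guess.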

4.2 Behavior of younger persons

Teenagers' attitudes toward social privacy: there is an assumption that they do not care about privacy, but they have various strategies, such as using coded references or deleting and editing content. Teenagers are not concerned about third-party companies looking at their information; they are more preoccupied by personal acquaintances reading their social media profiles. What is socially acceptable?

Young adults (18-24) do not care about what they post on the internet, unlike older people.

Younger adults are more aware of their online privacy.

Teenagers do not care about their online privacy, as they post pictures of their private lives.

How to deal with teenagers or young adults who do not care about their privacy?

What types of government policies are used to monitor private information? According to statistics, children whose parents inform them how to use the internet in the right way have fewer problems than less informed children. A guideline might be to use filters that restrict youngsters' access to inappropriate sources.

4.3 Who is concerned

Reputation and privacy:

Adolescents' online activities are investigated in order to create profiles of them. Character is formed during adolescence, while they are still discovering their personality.

Requirement: users must be 13 years old, though they are not informed of this. Profiles of users under 13 can be deleted at a parent's request; if the user is older than 13, the request will be rejected.

Establish a law forbidding the recording of children's activities, so they need not worry about being investigated. Record data in aggregate rather than keeping a detailed digital profile per person. Spread information about the permanence of such records.

What do identity and privacy mean to young people, and what data should be collected about them?

4.4 Technical issues (e-mail tracking, big data, protection)

Companies track individuals by e-mail, either via an embedded image that records the reader's IP address, or via personalized links.
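
The image-based technique works by embedding a tiny, per-recipient image URL in the e-mail body; when the mail client fetches the image, the sender's server logs the request (time, IP address, device). A minimal sketch, where the endpoint URL and token scheme are hypothetical:

```python
import uuid

def tracking_pixel_html(base_url: str) -> tuple[str, str]:
    """Build a unique 1x1 tracking image tag for one recipient.

    When the recipient's mail client downloads the image, the server
    behind base_url can log the request and map the token back to the
    recipient who opened the message.
    """
    token = uuid.uuid4().hex                    # unique per recipient
    url = f"{base_url}/pixel/{token}.gif"       # hypothetical endpoint
    tag = f'<img src="{url}" width="1" height="1" alt="">'
    return token, tag

token, tag = tracking_pixel_html("https://mail.example.com")
print(tag)
```

Note that blocking remote images in the mail client, as many clients do by default, defeats this kind of tracking; personalized links still work unless the reader avoids clicking them.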

A guideline would be not to click on such links, to avoid being tracked and targeted.

Is it a good/bad thing that companies are trying to sell you things you are interested in?

Marketing should be limited especially in terms of email targeting, in order to not invade someone's privacy.

Companies track customers and then generalize them into groups, drawing inferences about individuals without consent. There is a lack of anonymity: companies may lack personal details, but they know characteristics. Individuals should be aware of this. Should there be legal actions or protections governing how companies can target?

4.5 Privacy enforcement, liberty, rights

Nowadays we all know about cloud services. Privacy is important, yet companies analyze your data; there should be no unauthorized review of your information. Consumers place their faith in these companies; therefore, companies should not leak information.

Privacy will become a growing problem in the future as technology becomes more advanced. Everyone is entitled to their privacy, and it is important to keep it this way.

75% or more of apps request access to location, pictures, camera, etc. People often do not pay attention to this.

50% of the apps are rated by 3.

Survey for individuals providing data.

Apps should find a way to provide the best service to users without accessing their private data.

Are we devaluing our private data?

Some apps ask for data they do not need, in order to sell what they obtain from users.

There is no barrier between technology and our personal privacy.

Certain states implement programs on computers to restrict an individual's privacy.

Concern of privacy vs. state security.

How can we reach a balance, is it possible?

"The right to be forgotten: several countries have tried to pass laws on this subject. Conflict of interests is a big issue, and every country regulates as it pleases. Sweden, for instance, banned looking up potential employees. Or is it up to people to be cautious online?" - proudly transcribed by Alexander Tugashev

4.6 Privacy vs. crime

Main ideas ::

Clash between Apple and the FBI: in February 2016, the FBI asked Apple to create new software to access a locked device. Apple refused, and the case was dropped after a third party provided the FBI a way to unlock the phone without Apple's help.

Apple believed that if it created such software, it would endanger customers' privacy. Apple didn't want to open a Pandora's box and start a trend of standardized device access by the state.

Guideline ::

Apple should resist but at the same time be able to cooperate with the FBI when needed (e.g., opening only a terrorist's iPhone and nobody else's) without threatening customers' privacy.

Question ::

Which should take higher priority with regard to Apple products: the privacy of customers or national security?

5 Ideas for guidelines

See also:

6 Additional resources

Classes
Classes (recent past)