Technology and Delegation, Fall 2019
This is the syllabus for INFO 239: Technology and Delegation Lab, Fall 2019, taught by Deirdre K. Mulligan (dmulligan@berkeley.edu) and Daniel Griffin (daniel.griffin@berkeley.edu).
- Class meeting
- Tuesday and Thursday 3:30-5:00 PM; 202 South Hall
- Office hours
- Deirdre: Wednesdays 2:15pm-3:15pm, 303b South Hall or by appointment
- Daniel (to be confirmed): Thursdays 11am-noon, Rm 2 South Hall (PhD Office at the south end of the lower floor. Knock if the door is locked), or by appointment.
- mailing list
- #techdel_class on Slack
- (email Daniel if you have problems joining either; non-I School students will need to email him to get access to the Slack channel)
- 1 Course description
- 2 Schedule
- 2.1 Week 0: Introduction
- 2.2 Week 1: Values
- 2.3 Assignment 1
- 2.4 Week 2: Why Does Technology Matter?
- 2.5 Assignment 2
- 2.6 Week 3: Protecting Values
- 2.7 Week 4: Legitimacy and Governance cont'd
- 2.8 Assignment 3
- 2.9 Week 5: Studying Values
- 2.10 Week 6: Values in Design
- 2.11 Week 7: Designing for Law & Regulation
- 2.12 Week 8: User Centered Design & Value Sensitive Design
- 2.13 Week 9: Critically Oriented Design Approaches
- 2.14 Week 10: Engaging Stakeholders
- 2.15 Week 11: Impact Assessments & Frameworks
- 2.16 Assignment 4
- 2.17 Week 12: Analyzing Systems
- 2.18 Week 13: Organizational Change
- 2.19 Week 14: Final Project Presentations
- 2.20 Week 15 (RRR): No Class, Work on Final Projects
- 3 Course policies
Technology is often put forth as inevitable progress toward modernization and as a value-neutral means for implementing the policies of law, agency rules, or corporate planning. The introduction of technology increasingly delegates responsibility for some function — sensing, sensemaking, deciding, acting — to technical actors. These handoffs of function between people, processes and technologies are rarely as inconsequential as they are initially made out to be. Often these shifts reduce traditional forms of transparency—as black boxes embed rules and make decisions less visible—and challenge traditional methods for accountability.
Meanwhile, policymakers are asking those who design technology to build in values such as privacy and fairness—handing off some responsibility for their expression and protection. It is rarely a simple case of a function being handed wholesale from people to technology, or from one technology to another, but rather a complicated interleaving in which a value is shaped and constrained by multiple modes of regulation.
We will draw on a wide range of literature, including: design, science and technology studies, computer science, law, and ethics, as well as primary sources in policy, standards and source code. We will explore the interaction between technical design and values including: privacy, accessibility, fairness, and freedom of expression. We will investigate approaches to identifying the value implications of technical designs and use methods and tools for intentionally building in values at the outset. And we will be not only critical, but also constructive: this lab will give us a hands-on opportunity to try out technologies for ourselves and experiment with building alternatives that address rights and values.
First, we will look at the theory and examples of social values and the effects of technological delegation on those values; next, we will review and test the practices, tools and methodologies to design for values we care about; and finally, we will consider some ongoing case studies to discover what affirmative agenda or common principles we can elicit for embedding values in design.
Typically, Tuesday meetings will be discussions based on the readings and Thursday meetings will focus more on lab exercises, but this will vary.
Week 0: Introduction
Thursday, August 29
- We will introduce the content and methods of the course, and introduce ourselves to one another.
- Winner, Langdon. "Do Artifacts Have Politics?" Daedalus 109.1, Modern Technology: Problem or Opportunity? (Winter 1980): 121-136. The MIT Press on behalf of the American Academy of Arts & Sciences.
- Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. "Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks." ProPublica, May 23, 2016.
- "Technical Flaws of Pretrial Risk Assessments Raise Grave Concerns"
Week 1: Values
Tuesday, September 3; Thursday, September 5
- What are values? How do they relate to rights, preferences, valuation? How do different disciplines view and situate values-related work?
- Shilton, Katie. "Values and Ethics in Human-Computer Interaction." Foundations and Trends® in Human–Computer Interaction 12.2 (2018): 107-171. Read Sections 1-3 (pgs 108-137).
- Koops, Bert-Jaap. "Criteria for Normative Technology: An Essay on the Acceptability of 'Code as Law' in Light of Democratic and Constitutional Values." REGULATING TECHNOLOGIES, Brownsword & Yeung, eds (2007): 157-174. Read § 4-7 ONLY
- Winner, Langdon. Chapter 9: "Brandy, cigars and human values." The Whale and the Reactor (1986): 155-63.
- United Nations. Guiding Principles on Business and Human Rights: Implementing the United Nations "Protect, Respect and Remedy" Framework. UN. pp. 13-22.
- Graeber, David. (2001). Three ways of talking about value. In Toward An Anthropological Theory of Value (pp. 1-22). Palgrave Macmillan, New York.
- Before class: work with a partner to find a value that interests you, and see how that value is discussed in different ways, or by different fields. (sign up on the class sign up sheet)
- Class: Provide a short oral presentation on your findings. (3 minutes maximum, followed by 4-5 minutes of feedback and discussion by the class)
Due Wednesday September 11 at 11:59pm
- With your lab partner, provide a write-up on your value from this week's lab (which will be posted to the TechDel repository), covering the multiple ways it gets discussed and your preferred way of positioning it (value, right, preference, etc.). 2-4 pages (single spaced). Make sure references are cited (use any citation style you’d like, but be consistent). Submit as a pair via email to Deirdre and Daniel.
- If you need help thinking about what values to look at, consider the examples listed on the TechDel repository, Koops paper, or the following paper (see Table 2 in particular): Cheng, An-Shou, and Kenneth R. Fleischmann. 2010. "Developing a meta-inventory of human values." Proceedings of the 73rd ASIS&T Annual Meeting on Navigating Streams in an Information Ecosystem-Volume 47.
Week 2: Why Does Technology Matter?
Tuesday, September 10; Thursday, September 12
- We'll continue discussing artifacts and politics, as well as what things regulate.
- Latour, Bruno, "The Moral Dilemmas of a Safety-belt." 1989
- Akrich, M. (1992). The description of technical objects. In Shaping technology/building society, studies in socio technical change, edited by WE Bijker and J. Law, 205-224. Cambridge, MA: MIT Press.
- Kleinberg, Jon, Sendhil Mullainathan, and Manish Raghavan. "Inherent trade-offs in the fair determination of risk scores." arXiv preprint arXiv:1609.05807 (2016). (Read the Abstract, Section 1, and Section 5 -- if you're really digging the math, feel free to read the other sections as well, but they are by no means necessary)
- Christin, A. (2017). "Algorithms in practice: Comparing web journalism and criminal justice." Big Data & Society, 4(2), 2053951717718855.
- Before Thursday's class: work with a partner to interrogate a consumer device, app, system, or service, using the readings as guidance and inspiration. What is the script? How has the rearrangement of the activity by the technical system shifted responsibility for morality? What, if any, values have to be computationally addressed in this new configuration? (sign up on the class sign up sheet)
- Class: Provide a short oral presentation with appropriate visual aids (slides or something else) (3-4 minutes max, with 5-6 minutes of discussion) on your findings.
- Document your methods: how you explored the artifact, and why you chose those methods
- Address the questions
- What did you find?
Due Friday September 20 by 10:00 pm
- With your lab partner, provide a write-up of your analysis from lab, with improvements based on feedback during class (which will be posted to the TechDel repository). 2-4 pages (single spaced). Make sure references are cited (use any citation style you’d like, but be consistent). Submit as a pair (presentation slides or other materials, plus the write-up) via email to Deirdre and Daniel.
Week 3: Protecting Values
Tuesday, September 17; Thursday, September 19
- 9/17 We'll explore the implications of using different modalities—technical designs, legal rules, norms, markets—to regulate.
- Lessig, Larry, Code, Chapter 7: "What Things Regulate"
- Surden, Harry, Structural Rights in Privacy. SMU Law Review, Vol. 60, pp. 1605-1629, 2007.
- Cohen, Julie E., "Pervasively Distributed Copyright Enforcement". Georgetown Law Journal, Vol. 95, 2006. Read SECTIONS I and IV ONLY.
- [Skim] Shah, Rajiv C., and Jay P. Kesan. "Manipulating the governance characteristics of code." info 5.4 (2003): 3-9.
- Stevenson, Megan. "Assessing risk assessment in action." Minn. L. Rev. 103 (2018): 303. Read pp. 333-341 ONLY.
- 9/19 Legitimacy and Governance
- Brownsword, Roger. "Lost in translation: Legality, regulatory margins, and technological management." Berkeley Technology Law Journal 26.3 (2011): 1321-1365.
- Freeman, Jody. "Private parties, public functions and the new administrative law." Administrative Law Review (2000): Read § I and Conclusion (skip § II)
- Doty, Nick, and Deirdre K. Mulligan. 2013. “Internet Multistakeholder Processes and Techno-Policy Standards: Initial Reflections on Privacy at the World Wide Web Consortium”. Journal on Telecommunications and High Technology Law 11. Read Section III, but skip III.C.3 (read pp. 154-174); Skim Section II (pp. 140-153) just to get an idea of what the W3C, P3P, and TPWG are
- Deirdre K. Mulligan & Kenneth A. Bamberger, Procurement as Policy: Administrative Process for Machine Learning (excerpt)
Week 4: Legitimacy and Governance cont'd
Tuesday, September 24; Thursday, September 26 (Deirdre away Tuesday)
- We'll do some brainstorming activities to think about potential Final Project ideas
And file a Public Records Act request for details on an algorithmic tool in government use
- 9/26 Discussion of search and content moderation issues.
- Letter from David Kaye, Special Rapporteur on the Promotion & Prot. of the Right to Freedom of Op. & Expression, United Nations Human Rights Council, to Hon. Sheri Pym (Mar. 2, 2016)
- Deirdre Mulligan, Daniel Griffin (2018) If Google goes to China, will it tell the truth about Tiananmen Square? The Guardian
- Human Rights Council, (2018) Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression - Read Sections 1 & 5 (Introduction and Recommendations)
- Geiger, R. S. (2011). The lives of bots. From Critical Point of View: A Wikipedia Reader.
- RFC 7725: An HTTP Status Code to Report Legal Obstacles. Edited by Tim Bray. February 2016. (2 pages)
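For a concrete sense of what RFC 7725 specifies, here is a minimal sketch in Python of an HTTP 451 ("Unavailable For Legal Reasons") response with its `rel="blocked-by"` Link header; the URL and function name are hypothetical, used only for illustration:

```python
# Minimal sketch of the HTTP 451 status defined in RFC 7725
# ("Unavailable For Legal Reasons"). The URL below is hypothetical.

def legal_block_response(blocked_by_url: str) -> tuple[int, dict, str]:
    """Build the pieces of an RFC 7725-style response."""
    status = 451  # status code registered by RFC 7725
    headers = {
        # rel="blocked-by" identifies the entity implementing the block
        "Link": f'<{blocked_by_url}>; rel="blocked-by"',
        "Content-Type": "text/plain",
    }
    body = "Unavailable For Legal Reasons"
    return status, headers, body

status, headers, body = legal_block_response("https://authority.example/legal")
print(status)  # 451
```

The RFC's motivating idea, which connects to this week's governance readings, is that a machine-readable signal of legal blocking makes censorship more transparent and auditable than a generic 403 response.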
Due Friday, October 25 at 10:00 pm
- Individually, write a short analysis looking at governance issues in relation to a specific case (could be something related to your final project ideas, or could be another case you’re interested in). Your analysis should deeply engage with (not just summarize) at least one of the readings from the course (we suggest using the readings from week 4 or 7). You’re also welcome to engage with additional readings from other weeks.
- 2-4 pages (single spaced). Make sure references are cited (use any citation style you’d like, but be consistent). Submit individually via email to Deirdre and Daniel.
Week 5: Studying Values
Tuesday, October 1; Thursday, October 3
- Shilton, Katie, Jes A. Koepfler, and Kenneth R. Fleischmann. "How to see values in social computing: methods for studying values dimensions." Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing. ACM, 2014.
- Nissenbaum, Helen. "Values in technical design." Encyclopedia of science, technology, and ethics (2005): 66-70.
- Knobel, Cory, and Geoffrey C. Bowker. "Values in design." Communications of the ACM 54.7 (2011): 26-28. 
- We'll do some activities with the Moral Machine and Moderation Machine websites. Bring your laptop/tablet to class.
Week 6: Values in Design
Tuesday, October 8; Thursday, October 10
- Working with values. What values might we want to support through design? What do we mean by “by design”? Given their contested and contextual nature, how can we model values?
- Dombrowski, Lynn, Ellie Harmon, and Sarah Fox. "Social Justice-Oriented Interaction Design: Outlining Key Design Strategies and Commitments." Proceedings of the 2016 ACM Conference on Designing Interactive Systems. ACM, 2016. 656-671 
- Hirsch, Tad, et al. "Designing contestability: Interaction design, machine learning, and mental health." Proceedings of the 2017 Conference on Designing Interactive Systems. ACM, 2017. 95-99 
- Deirdre K. Mulligan, Colin Koopman and Nick Doty. (2016) “Privacy is an Essentially Contested Concept: A Multidimensional Analytic for Mapping Privacy.” Philosophical Transactions of the Royal Society A: Mathematical, Physical & Engineering Sciences, 1-17
- Excerpts from Bringing Design to the Privacy Table: Broadening "Design" in "Privacy by Design" Through the Lens of HCI Read Sections 1, 4, 5, 7 (~6.5 pgs)
- Deirdre Mulligan, Joshua Kroll, Nitin Kohli, Richmond Wong, "This Thing Called Fairness" Forthcoming at CSCW 2019
- Working session thinking through values and design choices for a particular technical system. Please come with something you and 3-4 classmates want to interrogate using Tuesday’s readings.
Week 7: Designing for Law & Regulation
Tuesday, October 15; Thursday, October 17 (Deirdre away Thursday)
- What are different ways to think about the relationship between law and technological design?
- Nissenbaum, H. (2011). From preemption to circumvention: if technology regulates, why do we need regulation (and vice versa). Berkeley Tech. LJ, 26, 1367-1386.
- T.D. Breaux, M. Vail and A.I. Anton, "Towards Regulatory Compliance: Extracting Rights and Obligations to Align Requirements with Regulations," 14th IEEE International Requirements Engineering Conference (RE'06), Minneapolis / St. Paul, Minnesota. 1-10. September 2006.
- Friedman, B., Smith, I., Kahn, P. H., Consolvo, S., & Selawski, J. (2006). Development of a privacy addendum for open source licenses: Value Sensitive Design in industry. In International Conference on Ubiquitous Computing (pp. 194-211). Springer, Berlin, Heidelberg. Read Only Sections 1-3, 6 (pgs 194-196; 204-207)
- Mulligan, Deirdre K. and Bamberger, Kenneth A., Saving Governance-by-Design 106 California Law Review 697 (2018) pp. 722-742 only
Lab (First half):
- By 2pm Thursday, add 1 slide to the shared Google Slide deck with your idea for a final project: TBD
- Everybody will have 60 seconds to pitch an idea for their final project (we'll use a timer!)
- If you aren't going to be in class physically, you can still put in a slide and we'll share it with the class
- We'll give some time for people to ask questions
Lab (Second half):
- We'll look at ways in which Twitter relates to law vis-à-vis content moderation (thinking about their content moderation policies, the technologies used to moderate, and content moderation laws such as the DMCA, RTBF, CDA, etc.).
- https://help.twitter.com/en/rules-and-policies/twitter-rules (General rules)
- https://help.twitter.com/en/rules-and-policies/media-settings (User posting)
- https://help.twitter.com/en/rules-and-policies/tweet-withheld-by-country (Government withholding)
- https://transparency.twitter.com/en/gov-tos-reports.html (Government TOS violations Transparency Report)
- https://transparency.twitter.com/en/removal-requests.html (Government removal request transparency report)
- https://lumendatabase.org/ (Lumen Database)
- How does the platform frame content moderation? (norms, compliance, legality, other?)
- How has the platform implemented this?
- How is technology used?
- What tasks has the platform assigned to its users? What affordances and tools have they provided to users?
- How do you think the particular law you chose shapes the policy/tech used?
- What does the law require of the company with respect to defining, identifying, and taking action on the content?
Week 8: User Centered Design & Value Sensitive Design
Tuesday, October 22; Thursday, October 24
- Look through IDEO’s Human Centered Design Kit Methods
- Read the methods page of the Public Policy Lab and skim their report on Designing Busing Services (68 pages, but most are images and diagrams).
- Friedman, B., Hendry, D. G., & Borning, A. (2017). A survey of value sensitive design methods. Foundations and Trends® in Human–Computer Interaction, 11(2), 63-125. Read Sections 1-3, 5 (pgs 64-101, 109-112) (Alternate Link)
- Friedman, B., Smith, I., Kahn, P. H., Consolvo, S., & Selawski, J. (2006). Development of a privacy addendum for open source licenses: Value Sensitive Design in industry. In International Conference on Ubiquitous Computing (pp. 194-211). Springer, Berlin, Heidelberg. These are other sections of the piece from last week - Read Section 4 (pg 197), Skim section 5 (pgs 198-204) for an example of VSD.
- Norman, Don. (1988). The Psychopathology of Everyday Things, Chapter 1 in The Design of Everyday Things (1-33)
- Maguire, Martin. (2001). Methods to support human-centred design. International Journal of Human-Computer Studies, 55(4). Read Sections 1-3 (pgs 587-593) and skim the rest.
- Friedman, Batya, et al. "Value sensitive design and information systems." In Early Engagement and New Technologies: Opening up the Laboratory. Springer Netherlands, 2013. 55-95. Read 4.1-4.3 (pgs 55-61), read 1 case study in section 4.4 & skim the others, read 4.6 (pgs 74-79), and read "4.8.1: Practical Value Sensitive Design Challenges" (pgs 85-88) after the citations.
- Let’s try out some of the Value Sensitive Design tools (envisioning cards, etc)
Week 9: Critically Oriented Design Approaches
Tuesday, October 29; Thursday, October 31 (Deirdre away Tuesday)
- Sims, Christo. "The Politics of Design, Design as Politics." The Routledge Companion to Digital Ethnography (2017), 1-9.
- Sengers, Phoebe, Kirsten Boehner, Shay David, and Joseph “Jofish” Kaye. 2005. “Reflective Design.” In Proceedings of the 4th Decennial Conference on Critical Computing: Between Sense and Sensibility, 49–58. CC ’05. New York, NY, USA: ACM. 
- Dunne, Anthony and Raby, Fiona. 2013. Speculative Everything Read Chapter 1: Beyond Radical Design? (pg 1-9) and the A/B chart in the Preface (pg vii) 
- Auger, James. "Speculative design: crafting the speculation." Digital Creativity 24.1 (2013): 11-35. Read Sections 3-4, pgs 12-32 (about half of this is images!) 
- Hansson, K., Forlano, L., Choi, J. H. J., DiSalvo, C., Pargman, T. C., Bardzell, S., Lindtner, S., & Joshi, S. (2018). Provocation, conflict, and appropriation: The role of the designer in making publics. Design Issues, 34(4), 3–7.
- Dunne, Anthony, and Fiona Raby. 2001. Design Noir: The Secret Life of Electronic Objects. Springer Science & Business Media. Chapter 4.
- Gaver, William, and Heather Martin (2000). "Alternatives: Exploring Information Appliances through Conceptual Design Proposals." In Proc. CHI 2000.
- Gaver (2011) "Making Spaces: How Design Workbooks Work." In Proc. CHI 2011
- Malpass, Matt. "Critical Design Practice: Theoretical Perspectives and Methods of Engagement." The Design Journal 19.3 (2016): 473-489.
- Using critically oriented design approaches
- Thing from the future cards
- Let’s interrogate and redesign search
Week 10: Engaging Stakeholders
Tuesday, November 5; Thursday, November 7
- Lassana Magassa, Meg Young, Batya Friedman. (2017) Diverse Voices: A How-To Guide for Facilitating Inclusiveness in Tech Policy. Focus on Sections 1-5 (pgs 1-30), skim the rest.
- Geiger, R. Stuart. “Bot-Based Collective Blocklists in Twitter: The Counterpublic Moderation of Harassment in a Networked Public Space.” Information, Communication & Society 19, no. 6 (June 2, 2016): 787–803. 
- Lilly C. Irani and M. Six Silberman. 2013. Turkopticon: interrupting worker invisibility in amazon mechanical turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). 
- Before class, meet with your team and identify stakeholders, models and tools for engagement for your final project
- Prepare a brief presentation for class to get feedback
Week 11: Impact Assessments & Frameworks
Tuesday, November 12; Thursday, November 14 (Deirdre away Thursday)
- Cooper, A., Tschofenig, H., Aboba, B., Peterson, J., Morris, J., Hansen, M., & Smith, R. RFC 6973: Privacy Considerations for Internet Protocols. 2013.
- Department of Homeland Security. Privacy Impact Assessment template. Introduction and the Template (just skim, no need to print these out).
- Reisman, Dillon, Jason Schultz, Kate Crawford, and Meredith Whittaker. Algorithmic Impact Assessments: A Practical Framework for Public Agency Accountability. AI Now Institute (2018).
- Andrew D. Selbst, danah boyd, Sorelle A. Friedler, Suresh Venkatasubramanian, and Janet Vertesi. "Fairness and abstraction in sociotechnical systems." In Proceedings of the Conference on Fairness, Accountability, and Transparency. ACM, 59–68, 2019.
Choose 1 of these to look at (will be updated with more):
- Natalie Cadranel, Anqi Li, An Xiao Mina, Caroline Sinders. 2018. Digital Security and Privacy Protection UX Checklist
- Ethical OS Toolkit
- EFF Know your customer standards applied to ICE 
- Markkula Center for Applied Ethics, An Ethical Toolkit for Engineering/Design Practice 
- Ethics and Algorithms Tool kit 
- IEEE Ethically Aligned Design (feel free to look at the general principles and a section or two if you don't want to read the entire doc; it's long)
- CDT's Digital Decisions Tool  and backgrounder 
- Model Cards for Model Reporting 
Tuesday we'll be discussing the variety of toolkits and frameworks that folks have looked at. Lab: Thursday we'll try implementing them to compare and contrast their utility.
- Doty, Nick. “Reviewing for Privacy in Internet and Web Standard-Setting.” International Workshop on Privacy Engineering; May 21, 2015.
- Privacy Patterns: look at the homepage, the about page, and the patterns generally, and explore two patterns
- National Institute of Standards and Technology. Ed.: Sean Brooks, Michael Garcia, Naomi Lefkovitz, Suzanne Lightman and Ellen Nadeau. NISTIR 8062: Privacy Risk Management for Federal Information Systems. January 2017.
- ENISA (2013) Recommendations for a methodology of the assessment of severity of personal data breaches, European Union Agency for Network and Information Security
Due Tuesday, November 26 at 10pm
- During weeks 5-12, we’ve discussed different methods and tools for thinking about values in design. Individually, apply one of the methods or tools to a problem or artifact. Write up an analysis, including:
- The problem/artifact you are applying the method/tool to.
- The method/tool you used, and why you chose it.
- What did you find?
- What was useful about this method/tool? What was less useful or difficult?
- What outstanding issues have not been addressed by this method/tool?
- This does not have to be focused on your final project topic. However, you might find it helpful to use this as an opportunity to apply some of the methods/or tools to your final project topic
- 2-4 pages (single spaced). Make sure references are cited (use any citation style you’d like, but be consistent). Submit individually via email to Deirdre and Daniel.
Note: You may optionally complete the assignment in pairs, but if you do so, we expect the analysis to go a little deeper, and you should submit a single paper in the 4-8 page range.
Week 12: Analyzing Systems
Tuesday, November 19; Thursday, November 21
- We've talked about a lot of tools and approaches to use in the design of systems, but how might we evaluate the tradeoffs of the choices we have? And how might we critique and analyze existing systems?
- Bowker, Geoffrey C., Baker, Karen, Millerand, Florence, and David Ribes. 2010. “Toward Information Infrastructure Studies: Ways of Knowing in a Networked Environment” In International Handbook of Internet Research. (Read Intro & Parts 1-2, Skip Part 3)
- Neyland, Daniel, and Norma Möllers. "Algorithmic IF… THEN rules and the conditions and consequences of power." Information, Communication & Society (2016). 
- Deirdre K. Mulligan & Helen Nissenbaum, The Concept of Handoff as a Model for Ethical Analysis and Design, Oxford Handbook of Ethics of Artificial Intelligence (Markus D. Dubber, Frank Pasquale & Sunit Das, eds., forthcoming Oxford University Press 2020).
- Musiani, F., & Ermoshina, K. (2017). What is a Good Secure Messaging Tool? The EFF Secure Messaging Scorecard and the Shaping of Digital (Usable) Security. Westminster Papers in Communication and Culture, 12(3).
- Let's apply the handoffs model to a system
Week 13: Organizational Change
Tuesday, November 26; [no class 11/28, Thanksgiving]
- How might we consider the organizational, social, and cultural contexts in which we try to apply the methods and tools we've discussed? What challenges might occur?
- Agre, Philip. "Toward a critical technical practice: Lessons learned in trying to reform AI." Social Science, Technical Systems and Cooperative Work: Beyond the Great Divide. Erlbaum (1997).
- Think about how this piece relates to the toolkits we discussed 2 weeks ago
- Abu-Salma, R., Sasse, M. A., Bonneau, J., Danilova, A., Naiakshina, A., & Smith, M. (2017, May). Obstacles to the adoption of secure communication tools. In Security and Privacy (SP), 2017 IEEE Symposium on (pp. 137-153). IEEE.
- O’Mahony, Siobhán, and Beth A. Bechky. 2008. “Boundary Organizations: Enabling Collaboration among Unexpected Allies.” Administrative Science Quarterly 53 (3): 422–59.
- Raina, Kalinda. (2017) Creating a Culture of Privacy. LinkedIn. Watch the online course (1hr) or read the transcripts.
- No Lab, Happy Thanksgiving!
Week 14: Final Project Presentations
Tuesday, December 3; Thursday, December 5
- Each group will give a short presentation on their final project for class feedback.
Week 15 (RRR): No Class, Work on Final Projects
Thursday, December 12
- Final projects are due December 20
This is a 3-unit class. Meetings each week will include seminar-style discussion, lab-style exercises, and collaborative design and technical work. Readings should be read carefully. Throughout the semester students will complete short writing assignments, lead class discussions, and critique artifacts. During the second half of the semester students will work collaboratively in teams of 2-4 on a larger design-oriented final project.
Your grade is based on class participation (15%) and assignments (85%). This class is designed to hone your critical inquiry skills. You are expected to fully participate—present, actively listen, engage with your classmates and the materials, bring your own insights to the discussion, share your experience and knowledge. Please come prepared to argue, explain, revise, borrow, refine, and of course junk your ideas. Thinking out loud is encouraged. This is how one learns. The success of this class depends upon students' diligent preparation and active participation—listening, speaking, designing, building—in class. Readings will be assigned throughout the semester. Everyone is expected to read and reflect on the readings.
Assignments account for the remainder of your grade. They will be graded primarily on substance; however, a small portion of each grade will reflect organizational clarity, grammar, and presentation style, as appropriate. The assignments are staggered throughout the semester.
1. Written Assignments 50% (4 assignments, 12.5% each) The written assignments are an invitation to apply both the theoretical and practical learning from the course to new problems. They are designed to develop your skills as readers: critiquing, building upon, and relating the various pieces we read. Reflection pieces should synthesize readings and ideas from class discussion, and use the resulting insights to analyze an issue or object of interest to you, critique readings, or anything else you would like. Specific requirements are described in each assignment, but for all assignments, you must 1) seriously engage with the readings (could be one, could be two, could be more); and 2) write about something that interests you.
Do not, under any circumstances, provide a summary of the articles. You've read them. We've read them. We know you've read them. The form is in between short essay and journal entry. This is a playful style. We want to get a sense of what you are taking away from the class and what sorts of thoughts, ideas, questions it is raising for you. 2-4 normal single spaced pages typed. More details for each assignment can be found here:
2. Critiques 10% During the second half of the semester we will begin each class with a student-led critique of an information or physical artifact. The students assigned (teams of 2) will select an interface, object, algorithm, design, instructable, kickstarter, toy, etc. and offer a brief—4 minute maximum—critique that introduces the item and reflects on its values implications, drawing on class readings and assignments up to that point. The class will then collectively critique the artifact.
3. Final Project 25% DUE December 20 (hard deadline) During the second half of the semester students will work collaboratively in teams of 2-4 on a larger final project. While we are flexible on the type of project, projects might include: a design-oriented project where something is analyzed or prototyped, writing out a research protocol for further exploration, or writing a part of a paper aimed toward publication.
Late assignments will be penalized: each day an assignment is late will result in a half-grade deduction. Recognizing that emergencies arise, exceptions will be made on a case-by-case basis.
Scheduling: Use [TBD] to choose which week you'll do a design critique, and to list who you're working with for Assignments 1 and 2.
The high academic standard at the University of California, Berkeley, is reflected in each degree that is awarded. As a result, every student is expected to maintain this high standard by ensuring that all academic work reflects unique ideas or properly attributes the ideas to the original sources. Academic integrity for this course depends on clear citation of ideas, text, and code. At this level, we harbor no romantic illusions that work is born whole from your mind alone; instead, we want you to engage with the assigned readings and ideas you connect from other classes or the world around you, and to make those connections clear. Programmers often copy-paste code or re-use libraries in order to build upon the shoulders of giants without reinventing the wheel. This is encouraged, but we expect students to be careful in their assignments to note what they wrote themselves and to attribute code snippets and libraries to their original authors. These are some basic expectations of students with regard to academic integrity:
- Any work submitted should be your own individual thoughts, and should not have been submitted for credit in another course unless you have prior written permission from this instructor to re-use it in this course.
- All assignments must use "proper attribution," meaning that you have identified the original source and extent of words or ideas that you reproduce or use in your assignment. This includes drafts and homework assignments!
- If you are unclear about expectations, ask.
- Do not collaborate or work with other students on assignments or projects unless you have been directed to do so.
For more information visit: https://sa.berkeley.edu/conduct/integrity
Class projects for the purpose of learning are typically exempt from review by Berkeley’s Committee for the Protection of Human Subjects (our Institutional Review Board). However, if you are conducting a research project that might result in publishable work (and many of you may!), keep in mind that UCB policy and publication venues will typically require you to have gone through the IRB process, even if only to receive an exemption. In addition, being exempt from IRB review does not mean that you are somehow exempt from ethical norms or that you wouldn’t benefit from advice or review. Feel free to come to the instructors for advice on a project that involves intervention with, data collection from, or use of data about human subjects.
Diversity and Inclusion
We value diversity in this course. Diversity can stem from disciplinary training, political or religious beliefs, identity (including race, gender, nationality, class, sexuality, religion, ability, caregiving roles, etc.), and professional and personal experiences. We welcome it all. We aim to create an inclusive classroom. The syllabus includes a wide range of scholars and practitioners representing diverse perspectives, but given the current composition of the relevant fields, it is not as diverse as we would like. If there are articles, papers, or other materials relevant to this course that you think would enhance the goals of diversity, please share them with us, and feel free to share them (without asking me first) on the course Slack channel.
If you go by a name that differs from what appears in your official UC-Berkeley records, or if you have preferred pronouns, please let me know. I struggle with names, so please do not hesitate to remind me publicly or privately if I mispronounce yours.
I welcome accommodation letters from the Disabled Students’ Program. Know that I read them and adjust my course design and/or course policies accordingly.
I acknowledge that having a personal computer available to you makes keeping up with course readings and completing class assignments much easier. If you are struggling with the expense of technology or with unreliable equipment, or if you are having any trouble accessing materials (books, websites) for this course, I want to know. The UC-Berkeley library has a laptop loan program.
Life doesn’t stop because you are a student. If experiences outside of school are interfering with your studies, please don’t hesitate to come and talk with me. If you prefer to speak with someone outside of the course or the program, our campus-wide diversity office is available (email: firstname.lastname@example.org, phone: 510-642-7294).
I (like many people) am still in the process of learning about diverse perspectives and identities. If something was said in class (by anyone, including me) that you feel was unfair, ill-informed, or personally hurtful, please talk to me about it. To report any incident of intolerance, hate, harassment or exclusion on campus or by members of the campus community, you can start here.
Finally, as a participant in course discussions, you should also strive to honor the diversity of your classmates.
UC Berkeley Statement on Diversity
These principles of community for the University of California, Berkeley are rooted in a mission of teaching, research, and public service and will be enforced in our classroom this term:
- We place honesty and integrity in our teaching, learning, research, and administration at the highest level.
- We recognize the intrinsic relationship between diversity and excellence in all our endeavors.
- We affirm the dignity of all individuals and strive to uphold a just community in which discrimination and hate are not tolerated.
- We are committed to ensuring freedom of expression and dialogue that elicits the full spectrum of views held by our varied communities.
- We respect the differences as well as the commonalities that bring us together and call for civility and respect in our personal interactions.
- We believe that active participation and leadership in addressing the most pressing issues facing our local and global communities are central to our educational mission.
- We embrace open and equitable access to opportunities for learning and development as our obligation and goal.
For more information, visit UC Berkeley's Division of Equity, Inclusion & Diversity page: https://diversity.berkeley.edu/about
Learning Accommodations & Access
If you need accommodations for any physical, psychological, or learning disability, please speak to us after class or during office hours. If appropriate, please obtain an accommodation letter from the Disabled Students’ Program.
Additional Campus Resources
These additional campus units may, at times, prove helpful during the course of the semester: