Center on Privacy and Technology

Formation: 2014
Founder: Alvaro Bedoya
Type: Policy think tank
Location: Washington, D.C.
Director: Emily Tucker
Parent organization: Georgetown University Law Center
Website: www.law.georgetown.edu/privacy-technology-center/

The Georgetown Center on Privacy and Technology is a think tank at Georgetown University in Washington, DC, dedicated to the study of privacy and technology. Established in 2014, it is housed within the Georgetown University Law Center. [1] The Center's goal is to conduct research and support legal and legislative advocacy on issues of privacy and surveillance, with a focus on how those issues affect people of different social classes and races. [2] In May 2022, the Center's founding director, Alvaro Bedoya, was confirmed as a commissioner of the United States Federal Trade Commission. [3]

Activities

Surveillance

From 2016 to 2019, the Center hosted an annual conference titled "The Color of Surveillance" which explored how government and technological surveillance affected different marginalized populations, including Black Americans, immigrants to the United States, religious minorities, and poor and working people. [4]

Facial recognition

The Center has collaborated with many advocacy organizations, including the ACLU, the Algorithmic Justice League, and the Electronic Frontier Foundation, on campaigns raising awareness about government use of facial recognition. In 2016, the Center published a report, The Perpetual Line-Up: Unregulated Police Face Recognition in America, documenting the widespread, unregulated use of facial recognition by law enforcement across the United States. [5] [6] In 2018, a Freedom of Information Act lawsuit brought by the Center against the New York Police Department revealed that facial recognition scans were being run on mugshots of every arrestee. [7] A subsequent 2019 report, "Garbage In, Garbage Out: Face Recognition on Flawed Data", documented multiple cases of police departments attempting to identify suspects using hand-drawn sketches, highly edited photos, and photos of celebrity lookalikes. [8] [9]

Related Research Articles

Closed-circuit television: Use of video cameras to transmit a signal to a specific place on a limited set of monitors

Closed-circuit television (CCTV), also known as video surveillance, is the use of video cameras to transmit a signal to a specific place, on a limited set of monitors. It differs from broadcast television in that the signal is not openly transmitted, though it may employ point-to-point (P2P), point-to-multipoint (P2MP), or mesh wired or wireless links. Even though almost all video cameras fit this definition, the term is most often applied to those used for surveillance in areas that require additional security or ongoing monitoring.

Surveillance: Monitoring something for the purposes of influencing, protecting, or suppressing it

Surveillance is the monitoring of behavior, activities, or information for the purpose of information gathering, influencing, managing, or directing. This can include observation from a distance by means of electronic equipment, such as closed-circuit television (CCTV), or interception of electronically transmitted information, such as Internet traffic. It can also include simple technical methods, such as human intelligence gathering and postal interception.

Facial recognition system: Technology capable of matching a face from an image against a database of faces

A facial recognition system is a technology potentially capable of matching a human face from a digital image or a video frame against a database of faces. Such a system is typically employed to authenticate users through ID verification services, and works by pinpointing and measuring facial features from a given image.
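
As an illustration of the "pinpointing and measuring facial features" approach described above, the sketch below encodes faces as feature vectors and compares a probe image against a known face. It uses the open-source face_recognition Python library purely as an example; the library choice and file names are illustrative assumptions and do not represent the systems used by any organization discussed in this article.

    # Illustrative sketch only: matches faces in a probe image against one
    # known face, using the open-source "face_recognition" library.
    # File names ("known_person.jpg", "probe.jpg") are hypothetical.
    import face_recognition

    # Encode a known face as a 128-dimensional feature vector.
    known_image = face_recognition.load_image_file("known_person.jpg")
    known_encoding = face_recognition.face_encodings(known_image)[0]

    # Encode every face detected in the probe image (e.g., a video frame).
    probe_image = face_recognition.load_image_file("probe.jpg")
    probe_encodings = face_recognition.face_encodings(probe_image)

    for encoding in probe_encodings:
        # compare_faces applies a Euclidean-distance threshold between vectors;
        # face_distance returns the raw distance for inspection.
        is_match = face_recognition.compare_faces([known_encoding], encoding)[0]
        distance = face_recognition.face_distance([known_encoding], encoding)[0]
        print(f"match={is_match}, distance={distance:.3f}")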

Center for Democracy & Technology (CDT) is a Washington, D.C.–based 501(c)(3) nonprofit organisation that advocates for digital rights and freedom of expression. CDT seeks to promote legislation that enables individuals to use the internet for beneficial purposes while reducing its potential for harm. It advocates for transparency, accountability, and limits on the collection of personal information.

Big Brother Watch is a non-party British civil liberties and privacy campaigning organisation. It was founded in 2009 by Matthew Elliott, with Alex Deane as founding director, to campaign against state surveillance and threats to civil liberties. Silkie Carlo has been its director since January 2018.

Domain Awareness System

The Domain Awareness System, part of the Lower Manhattan Security Initiative, is the largest digital surveillance system in the world, built in a partnership between the New York Police Department and Microsoft to monitor New York City. It allows the NYPD to track surveillance targets and gain detailed information about them, and is overseen by the counterterrorism bureau.

Ring (company): Home security products manufacturer

Ring LLC is a manufacturer of home security and smart home devices owned by Amazon. It manufactures an eponymous line of smart doorbells, home security cameras, and alarm systems. It also operates Neighbors, a social network that allows users to discuss local safety and security issues and share footage captured with Ring products. Via Neighbors, Ring may also provide footage and data to law enforcement agencies to assist in investigations.

Mass surveillance in India: Overview of mass surveillance in India

Mass surveillance is the pervasive surveillance of an entire population or a substantial fraction of one. Mass surveillance in India includes surveillance, telephone tapping, open-source intelligence, lawful interception, and surveillance under the Indian Telegraph Act, 1885.

Mass surveillance in China: Network of monitoring systems used by the Chinese government

Mass surveillance in the People's Republic of China (PRC) is the network of monitoring systems used by the Chinese central government to monitor Chinese citizens. It is primarily conducted through the government, although corporate surveillance in connection with the Chinese government has been reported to occur. China monitors its citizens through Internet surveillance, camera surveillance, and through other digital technologies. It has become increasingly widespread and grown in sophistication under General Secretary of the Chinese Communist Party (CCP) Xi Jinping's administration.

Unmanned aerial vehicles (UAVs) have been used for domestic police work in various countries around the world since the mid-2000s. Their appeal comes from their small size, lack of crew, and lower cost compared to police helicopters. UAVs may be used for search and rescue operations, aerial patrols, and other roles usually served by crewed police aircraft. UAVs can be powerful surveillance tools, carrying camera systems capable of license plate scanning and thermal imaging, as well as radio equipment and other sensors. While the vast majority of law enforcement UAVs are unarmed, documents obtained by the digital rights group Electronic Frontier Foundation indicated that U.S. Customs and Border Protection would consider arming its UAVs with "non-lethal weapons designed to immobilize" targets.

DeepFace is a deep learning facial recognition system created by a research group at Facebook. It identifies human faces in digital images. The program employs a nine-layer neural network with over 120 million connection weights and was trained on four million images uploaded by Facebook users. The Facebook Research team has stated that the DeepFace method reaches an accuracy of 97.35% ± 0.25% on the Labeled Faces in the Wild (LFW) data set, where human beings score 97.53%, meaning that DeepFace is sometimes more successful than humans. As a result of growing societal concerns, Meta announced plans to shut down Facebook's facial recognition system and delete the face scan data of more than one billion users, one of the largest shifts in facial recognition usage in the technology's history. Facebook planned to delete more than one billion facial recognition templates, which are digital scans of facial features, by December 2021. However, it did not plan to eliminate DeepFace, the software that powers the facial recognition system, and according to a Meta spokesperson the company has not ruled out incorporating facial recognition technology into future products.

Your papers, please: Expression associated with police state functionaries

"Your papers, please" is an expression or trope associated with police state functionaries demanding identification from citizens during random stops or at checkpoints. It is a cultural metaphor for life in a police state.

Police surveillance in New York City

The New York City Police Department (NYPD) actively monitors public activity in New York City, New York, United States. Historically, the NYPD has used surveillance for a range of purposes, including crime prevention and counter-terrorism, as well as for controversial ends such as monitoring political demonstrations, activities, and protests, and even entire ethnic and religious groups.

Internet Freedom Foundation: Indian digital liberties organisation

Internet Freedom Foundation (IFF) is an Indian non-governmental organisation that conducts advocacy on digital rights and liberties, based in New Delhi. IFF files petitions and undertakes advocacy campaigns to defend online freedom, privacy, net neutrality, and innovation.

Amazon Rekognition is a cloud-based software as a service (SaaS) computer vision platform that was launched in 2016. It has been sold to, and used by, a number of United States government agencies, including U.S. Immigration and Customs Enforcement (ICE) and Orlando, Florida police, as well as private entities.

Clearview AI is an American facial recognition company that provides software to companies, law enforcement, universities, and individuals. The company's algorithm matches faces to a database of more than 20 billion images indexed from the Internet, including social media applications. Founded by Hoan Ton-That and Richard Schwartz, the company maintained a low profile until late 2019, when its use by law enforcement was first reported. Multiple reports have identified Clearview's association with far-right personas dating back to 2016, when the company claims to have severed ties with two employees.

DataWorks Plus LLC is a privately held biometrics systems integrator based in Greenville, South Carolina. The company started in 2000 and originally focused on mugshot management, adding facial recognition beginning in 2005. Brad Bylenga is the CEO, and Todd Pastorini is the EVP and GM.

Hyper-surveillance: Form of surveillance

Hyper-surveillance is the highly targeted and detailed observation and monitoring of an individual, a group of people, or a faction, and it can extend to an entire population or a substantial fraction of one. It relies on technology, and sometimes on security breaches, to access information. As reliance on the internet economy grows, smarter technology and broader data collection also mean increased surveillance of workers in the workplace.

Algorithmic Justice League: Digital advocacy non-profit organization

The Algorithmic Justice League (AJL) is a digital advocacy non-profit organization based in Cambridge, Massachusetts. Founded in 2016 by computer scientist Joy Buolamwini, the AJL uses research, artwork, and policy advocacy to increase societal awareness of the use of artificial intelligence (AI) and the harms and biases it can pose. The AJL has engaged in a variety of open online seminars, media appearances, and tech advocacy initiatives to communicate information about bias in AI systems and to promote industry and government action against the creation and deployment of biased AI systems. In 2021, Fast Company named the AJL one of the 10 most innovative AI companies in the world.

Rashida Richardson: American attorney and scholar

Rashida Richardson is a visiting scholar at Rutgers Law School and the Rutgers Institute for Information Policy and the Law and an attorney advisor to the Federal Trade Commission. She is also an assistant professor of law and political science at the Northeastern University School of Law and the Northeastern University Department of Political Science in the College of Social Sciences and Humanities.

References

  1. Ho, Catherine (11 January 2015). "Georgetown Law, MIT team up to tackle topic of privacy in the age of big data". Washington Post. Retrieved 12 September 2021.
  2. "Center on Privacy and Technology". www.law.georgetown.edu. Retrieved 12 March 2021.
  3. "U.S. Senate: U.S. Senate Roll Call Votes 117th Congress - 2nd Session". www.senate.gov. 11 May 2022. Retrieved 29 November 2022.
  4. "The Color of Surveillance: Government Monitoring of the African American Community". www.law.georgetown.edu. Retrieved 26 March 2021.
  5. "The Perpetual Line-Up". Center on Privacy & Technology at Georgetown Law. Archived from the original on 2016-10-18. Retrieved 2021-09-18.
  6. Williams, Patricia J. (7 November 2016). "Americans Are Finding New Ways to Join the Surveillance State". The Nation. Retrieved 13 March 2021.
  7. Brown, Stephen Rex (1 March 2018). "NYPD ripped for abusing facial-recognition tool". NY Daily News. Retrieved 12 September 2021.
  8. "Garbage In. Garbage Out. Face Recognition on Flawed Data". Center on Privacy & Technology at Georgetown Law. Archived from the original on 2019-05-16. Retrieved 2021-09-18.
  9. Ng, Alfred. "Police are using flawed data in facial recognition searches, study finds". CNET. Retrieved 13 March 2021.