Cross-device tracking

Cross-device tracking is technology that enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers. [1] [2]

More specifically, cross-device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one. [3] For example, one such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device. [3]

This form of tracking is used primarily by technology companies and advertisers who use this information to piece together a cohesive profile of the user. [3] These profiles inform and predict the type of advertisements the user receives. [3]

Background

There are many ways in which online tracking has manifested itself. Historically, when companies wanted to track users' online behavior, they simply had users sign in to their website. [4] This is a form of deterministic cross-device tracking, in which the user's devices are associated with their account credentials, such as their email or username. [5] Consequently, while the user is logged in, the company can keep a running history of which sites the user has visited and which ads the user interacted with across computers and mobile devices. [5]
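A minimal sketch of this deterministic linking (the class and names below are hypothetical illustrations, not any company's actual system): each login event ties the device's identifier to the account credentials, so activity from any of the user's devices rolls up into a single profile.

```python
from collections import defaultdict

class AccountDeviceIndex:
    """Deterministic cross-device linking: devices seen at login are
    tied to the account credentials, so one profile spans them all."""

    def __init__(self):
        self._devices = defaultdict(set)   # account -> device ids
        self._history = defaultdict(list)  # account -> browsing events

    def record_login(self, account: str, device_id: str) -> None:
        self._devices[account].add(device_id)

    def record_visit(self, account: str, device_id: str, url: str) -> None:
        # Events are keyed by account, not by device: visits from a
        # phone and a laptop land in the same running history.
        self.record_login(account, device_id)
        self._history[account].append((device_id, url))

    def profile(self, account: str):
        return sorted(self._devices[account]), list(self._history[account])
```

The approach only works while the user stays signed in, which is why advertisers later sought probabilistic techniques that need no login at all.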

Eventually, advertisers deployed cookies, providing each user with a unique identifier in his or her browser so that the user's preferences could be monitored. [6] This unique identifier informs the placement of relevant, targeted ads the user may receive. [6] Companies also used cookies to improve the user experience, enabling users to pick up where they left off on websites. [7] However, as users began using multiple devices, up to around five, advertisers struggled to track, manage, and consolidate this data across devices, since the cookie-based model treated each device, whether a phone, computer, or tablet, as a different person. [6]
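The cookie model described above can be sketched with Python's standard `http.cookies` module (a hypothetical server-side fragment, not any tracker's actual code): the server mints a random identifier on first contact and reads it back on every later request, but only for that one browser on that one device.

```python
import uuid
from http.cookies import SimpleCookie

def issue_tracking_cookie(cookie_header: str = "") -> SimpleCookie:
    # Parse whatever cookie the browser sent back; mint a fresh unique
    # identifier only for first-time visitors.
    cookie = SimpleCookie(cookie_header)
    if "uid" not in cookie:
        cookie["uid"] = uuid.uuid4().hex
        cookie["uid"]["max-age"] = 60 * 60 * 24 * 365  # persist ~1 year
    return cookie

first = issue_tracking_cookie()                            # new visitor
back = issue_tracking_cookie("uid=" + first["uid"].value)  # returning visitor
```

Because the identifier lives in a single browser's cookie jar, the same person on a second device looks like a brand-new visitor, which is precisely the limitation that motivated cross-device techniques.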

Other technologies, such as supercookies, which persist on computers long after the user deletes his or her cookies, and web beacons, which are unique images embedded via a URL, are also used by trackers and advertisers to gain increased insight into users' behavior. [6] However, advertisers remained limited in that each of these techniques could track, and associate with a user, only one device. [6]

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is browser fingerprinting, which occurs when browsers, which are modifiable to the user's tastes, produce a unique signal that companies or advertisers can use to single out the user. [6] Browser fingerprinting has been a cause for concern because of its effectiveness and because it does not allow users to opt out of the tracking. [6]
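A toy illustration of the idea (the attribute names and values are invented for this sketch): enough individually innocuous browser attributes, hashed together, yield an identifier that is stable across visits without anything being stored on the user's machine, so there is no cookie to clear.

```python
import hashlib
import json

def browser_fingerprint(attributes: dict) -> str:
    # Canonicalize the attribute set, then hash it. Identical
    # configurations always yield the same identifier.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

profile = {
    "user_agent": "Mozilla/5.0 ...",
    "screen": "2560x1440",
    "timezone": "UTC-5",
    "fonts": ["Arial", "Consolas", "Noto Sans"],
}
```

Real fingerprinting scripts harvest far more signals (canvas rendering, audio stack, installed plugins), but the principle is the same: the combination, not any single attribute, is what singles the user out.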

Another tactic used by Google is called AdID and works on smartphones in tandem with cookies on a user's computer to track behavior across devices. [4]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers. It works through the use of audio beacons, or inaudible sound, emitted by one device and recognized through the microphone of another device, usually a smartphone. [3] In addition, cross-device tracking may presage the future of the Internet of things (IoT), in which all types of environments, such as offices, cars, and homes, are seamlessly interconnected via the internet. [4]

Ultrasonic tracking

Humans interpret sound by picking up on different frequencies. [3] Humans can only hear frequencies within a certain range, generally from 20 Hz to 20 kHz, and by the age of 30, most humans cannot hear sounds above 18 kHz. [3]

Ultrasound, sound at frequencies of 20 kHz or greater (and thus with shorter wavelengths than audible sound), enables the rapid transmission of data necessary for cross-device tracking to occur. [3]

Another integral component of cross-device tracking is the usage of audio beacons: short identifying signals embedded into ultrasound so that they cannot be heard by humans. [3] These audio beacons are used to surreptitiously track a user's location and monitor online behavior by connecting with the microphone on another device without the user's awareness. [3]
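The data-transmission mechanism can be illustrated with a toy frequency-shift-keying sketch (not the proprietary encoding any real beacon system uses; the sample rate, carrier frequencies, and tone length are illustrative assumptions): each bit is emitted as a short near-ultrasonic tone, and the receiver recovers it by comparing the signal's energy at the two candidate frequencies using the Goertzel algorithm.

```python
import math

SAMPLE_RATE = 48_000
TONE_SEC = 0.05                 # duration of one bit's tone
F0, F1 = 18_500.0, 19_500.0     # near-ultrasonic carriers for bits 0 and 1

def encode(bits):
    """Emit each bit as a pure tone at its carrier frequency."""
    n = int(SAMPLE_RATE * TONE_SEC)
    samples = []
    for bit in bits:
        f = F1 if bit else F0
        samples.extend(math.sin(2 * math.pi * f * i / SAMPLE_RATE)
                       for i in range(n))
    return samples

def goertzel_power(chunk, freq):
    """Goertzel algorithm: signal power in a single frequency bin."""
    k = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s1 = s2 = 0.0
    for x in chunk:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def decode(samples):
    """Recover bits by comparing energy at the two carriers."""
    n = int(SAMPLE_RATE * TONE_SEC)
    bits = []
    for i in range(0, len(samples) - n + 1, n):
        chunk = samples[i:i + n]
        bits.append(1 if goertzel_power(chunk, F1) > goertzel_power(chunk, F0)
                    else 0)
    return bits
```

Carriers around 18.5–19.5 kHz sit above the hearing threshold of most adults yet within the range that ordinary TV speakers and phone microphones can reproduce, which is what makes this side channel practical.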

In October 2015, the Center for Democracy and Technology submitted comments to the Federal Trade Commission (FTC) regarding cross-device tracking technology, specifically mentioning SilverPush. [8] [9]

Audio "beacons" can be embedded into television advertisements. In a similar manner to radio beacons, these can be picked up by mobile apps. [10] This allows the behavior of users to be tracked, including which ads were seen by the user and how long they watched an ad before changing the channel. [11]

In March 2016, the FTC issued warning letters to 12 app developers using cross-device tracking in their apps. [12] The FTC warned these developers that they may be violating the FTC Act if they state or imply that their apps are not tracking television viewing habits when they in fact are.

Applications

One study found that 234 Android applications eavesdrop on these ultrasonic channels without the user's awareness. [3]

Applications such as SilverPush, Shopkick, and Lisnr are part of an "ultrasonic side-channel" in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user's environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them. [3]

Another study suggested that Apple, Google, and the Bluetooth Special Interest Group need to do more to prevent cross-device tracking. [13]

Privacy and surveillance concerns

Ultrasonic tracking

Cross-device tracking has privacy implications and allows for more detailed tracking of users than traditional tracking methods. Data can be collected from multiple devices used by a single user and correlated to form a more accurate picture of the person being tracked. [11] Moreover, malicious actors may use variants of the technology to de-anonymize anonymity network users. [14]

Ultrasonic tracking technologies can pose massive threats to a user's privacy. There are four primary privacy concerns associated with this new form of tracking:

Panoptic surveillance and the commodification of users' digital identity

From cookies to ultrasonic trackers, some argue that invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners could be seen at all times by guards but could not tell when, or even whether, they were being watched, creating a sense of paranoia that drove prisoners to carefully police their own behavior. [15] Similarly, scholars have drawn parallels between Bentham's panopticon and today's pervasive internet tracking, noting that individuals lack awareness of the vast disparities of power between themselves and the corporations to which they willingly give their data. [15] In essence, companies are able to gain access to consumers' activity when consumers use a company's services. [15] The usage of these services often is beneficial, which is why users agree to exchange personal information. [15] However, because users participate in this unequal environment, in which corporations hold most of the power and the user is obliged to accept the bad-faith offers made by the corporations, users operate in an environment that ultimately controls, shapes, and molds them to think and behave in a certain way, depriving them of privacy. [15]

In direct response to the panoptic and invasive forms of tracking manifesting within the digital realm, some have turned to sousveillance: a form of inverse surveillance in which users can record those who are surveilling them, thereby empowering themselves. [16] This form of counter-surveillance, often practiced with small wearable recording devices, enables the subversion of corporate and government panoptic surveillance by holding those in power accountable and giving people a voice, a permanent video record, with which to push back against government abuses of power or malicious behavior that may go unchecked. [16]

The television, along with the remote control, has also been argued to condition humans into habitually repeating what they enjoy without experiencing genuine surprise or even discomfort, a critique of the television similar to those made against information silos on social media sites today. [17] In essence, this technological development led to egocasting: a world in which people exert extreme amounts of control over what they watch and hear. [17] As a result, users deliberately avoid content they disagree with in any form: ideas, sounds, or images. [17] In turn, this siloing can drive political polarization and stoke tribalism. [17] In addition, companies like TiVo analyze how TV show watchers use their remote and DVR capability to skip over programming, such as advertisements, a privacy concern users may also lack awareness of. [17]

Some scholars have even contended that in an age of increased surveillance, users now participate online through the active generation and curation of online images, a form of control. [3] In so doing, users can be seen as rejecting the shame associated with their private lives. [3] Other scholars note that surveillance is fundamentally dependent upon location, in both physical and virtual environments. [18] This form of surveillance can be seen on travel websites that enable the user to share their vacation with a virtual audience. [18] The person's willingness to share their personal information online is validated by the audience, since the audience holds the user accountable and the user vicariously experiences pleasure through the audience. [18] Further, users' mobile data is increasingly being shared with third parties online, potentially underscoring the regulatory challenges inherent in protecting users' online privacy. [19]

In addition, scholars argue that users have the right to know the value of their personal data. [20] Increasingly, users' digital identity is becoming commodified through the selling and monetizing of their personal data for profit by large companies. [20] Unfortunately, many people appear to be unaware that their data holds monetary value that can potentially be used towards other products and services. [20] Thus, scholars argue for users to have increased awareness of and transparency into this process, so that they can become empowered, informed consumers of data. [20]

Surveillance capitalism

The increased usage of cross-device tracking by advertisers is indicative of the rise of a new era of data extraction and analysis as a form of profit, or surveillance capitalism, a term coined by Shoshana Zuboff. [21] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user. [21] Zuboff suggests that this new era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive as, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold. [21] Thus, since cross-device tracking seeks to create a profile of a user across multiple devices, big tech companies, such as Google, could use this behavioral data to make predictions about the user's future behavior without the user's awareness. [21]

Scholars are beginning to discuss the possibility of quantifying the monetary value of users' personal data. Notably, the algorithms used to extract and mine user data are increasingly seen as business assets and thus protected via trade secrets. [20] Indeed, the usage of free online services, such as public Wi-Fi, often comes at the unknown cost to the user of being tracked and profiled by the company providing the service. [20] In essence, a transaction is occurring: users' personal data is being exchanged for access to a free service. [20] Increasingly, scholars are advocating for users' right to understand the fundamental value of their personal data more intimately so as to be more savvy, informed consumers who have the ability to protect the privacy of their online information and not be manipulated into unwittingly giving away personal information. [20]

Health and wellness applications

Health and wellness applications also have a dearth of privacy protections: a study found that many health apps lacked encryption and argued that regulators should enforce stronger data privacy protections. [22] The study stated that of the 79 apps tested, none locally encrypted the users' personal information, and 89% of the applications pushed the data online. [22] The lack of adequate privacy and security measures surrounding users' personal medical data on mobile applications underscores the lessening degree to which users can trust mobile app developers to safeguard their personal information online. [22] While mobile application developers continue to confront privacy and security concerns, users are increasingly looking for ways to visualize their data through wearable devices and applications that track their workout and exercise routines. [23] Indeed, researchers discovered that these self-tracking devices play a role as a tool, a toy, and a tutor in users' lives. [24] In the tool role, the self-tracking device functions as a mechanism to help the user in some capacity, often to achieve personal health goals. [24] The toy role underscores how some users see self-tracking as a fun game, particularly with regard to rewards and viewing the visualized data. [24] Lastly, the tutor role reflects how users gain insights into, and motivation from, their activity through the apps themselves. [24] Other scholars have characterized self-tracking as performing for the system (controlling what is or is not recorded), performing for the self (tracking oneself to gain insight into one's behavior), and performing for other people (the importance of how others view the person being tracked, as well as the control that person has over their data and thus over how they are perceived). [25]

Cookies, flash cookies, and web beacons

Additionally, privacy concerns surround cookies, flash cookies, and web beacons on websites today. [25] According to one study, five main concerns surround the usage of cookies, flash cookies, and web beacons. [7]

Data capitalism

Other scholars have defined a similarly extractive and destructive phenomenon called data capitalism. [26] Data capitalism is an economic system enabling the redistribution of power towards those who have access to the information, namely big corporations. [26] Three fundamental theories describe how large companies engage users in virtual communities, reflecting the power of data capitalism over users today.

Solutions

Scholars contend that the current notice-and-consent model for privacy policies is fundamentally flawed because it assumes users intuitively understand all of the facts in a privacy policy, which is often not the case. [27] Instead, scholars emphasize the imperative role of creating a culture in which privacy becomes a social norm. [27] In effect, users of online technologies should identify the social activities they engage in on the internet and begin questioning websites' governing norms as a natural outgrowth of their web browsing. [27] These norms need to prevent websites from collecting and sharing users' personal information. [27] In addition, starting with a user's personal values and seeing how those values correlate with online norms may be another way to assess whether privacy norms are being violated in edge cases. [27] Ultimately, scholars believe these privacy norms are vital to protecting both individuals and social institutions. [27]

While the United States lacks extensive privacy rights, the Fourth Amendment provides some privacy protections. [7] The Fourth Amendment states that "the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated"; however, it constrains only the government, meaning individuals are not legally protected from private companies or individuals with malicious intent. [7]

There are large implications for this technology within the legal field. Legally, the Federal Trade Commission has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury. [28] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance. [28] For instance, in the realm of invasive web tracking, the FTC has brought lawsuits against companies that engage in history sniffing, a technique that enables companies to ascertain which sites a user has visited by inspecting the color of links in the browser. [28] Concerning tracking in physical space, the FTC has also cracked down on Nomi, a company that scans the MAC addresses of customers' phones in stores. [28] MAC addresses function as a unique identifier, enabling the connection to wireless networks. [28] In the case of malware, the FTC has placed pressure on companies such as CyberSpy, which sold keylogging software delivered as an email attachment that secretly recorded users' key presses. [28] The FTC has also cracked down on Compete, a browser-toolbar company, because the toolbar collected users' personal information on the internet without adequately securing it, putting users at risk. [28] Lastly, in cases where deception is used to engage in surveillance, the FTC has investigated private investigators, who surveil individuals on another person's behalf. [28] In addition, audio beacon technology, such as that used by SilverPush, could violate the FTC's policies because users were not made aware of when the ultrasonic signals were being recorded. [28]
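Why a MAC address functions as a persistent identifier can be shown with a short hypothetical sketch (the function and store names are invented; this is not Nomi's actual pipeline): even if a tracker stores only a hash of the address rather than the address itself, the digest is still a stable per-device key, so repeat visits by the same phone remain linkable.

```python
import hashlib

def device_key(mac: str) -> str:
    # Normalize the MAC address, then hash it. The raw address need not
    # be stored, but the digest is still a stable per-device identifier.
    normalized = mac.strip().lower().replace("-", ":")
    return hashlib.sha256(normalized.encode()).hexdigest()

visits: dict[str, list[str]] = {}

def record_visit(mac: str, store: str) -> None:
    # The same phone observed at two locations collapses to one key.
    visits.setdefault(device_key(mac), []).append(store)
```

This is why regulators treat hashed identifiers as personal data: hashing hides the value but not the linkability.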

Another scholar believes that the convergence between lived experience and online technology is creating a phenomenon termed mixed reality, in which people and things are replaced with virtual experiences. [29] Mixed reality technologies can pose legal challenges in that laws which govern the online world will also extend to the real world. [29] In addition, data tagging, often through GPS, location-based services, or even near-field communication (NFC), is the new technology at the heart of mixed reality, since people's data is determined in part by their location. [29] Near-field communication enables devices to transmit data to each other within a short range. [29] Virtual reality can become a privacy issue because it attempts to immerse users in the virtual environment by recording a user's every sensation. [29] In turn, mixed reality's amalgamation with daily tasks suggests that it will be implicated in numerous legal issues, ranging from copyright law to intellectual property law. [29] Customers are also denied a voice in contracts, since only corporations set the rules by which individuals' private information is mined and extracted. [29] The solution to these issues, according to scholars, is opt-in controls that police users' privacy and allow balance to be restored to the law, particularly as it stands regarding contracts. [29]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences as well as increased surveillance––which is sometimes hidden––in everyday life as violating users' rights to privacy. [21] The usage of surreptitious methods, in which the user is unaware of the extent to which he or she is being tracked, brings tracking mechanisms––such as cookies, flash cookies, and web beacons––into the ethical realm as well since users are not being informed of this tracking perhaps as often as they should. [7]


References

  1. Jebara, Tony; Bellovin, Steven M.; Kim, Hyungtae; Li, Jie S.; Zimmeck, Sebastian (2017). "A Privacy Analysis of Cross-device Tracking" (PDF). USENIX. S2CID 23378463.
  2. Yuan, H.; Maple, C.; Chen, C.; Watson, T. (1 July 2018). "Cross-device tracking through identification of user typing behaviours". Electronics Letters. 54 (15): 957–959. Bibcode:2018ElL....54..957Y. doi:10.1049/el.2018.0893. ISSN 0013-5194. S2CID 55463759.
  3. Arp, Daniel (2017). "Privacy Threats through Ultrasonic Side Channels on Mobile Devices". 2017 IEEE European Symposium on Security and Privacy (EuroS&P). pp. 1–13. doi:10.1109/EuroSP.2017.33. ISBN 978-1-5090-5762-7. S2CID 698921 – via IEEE Xplore.
  4. Jebara, Tony; Bellovin, Steven M.; Kim, Hyungtae; Li, Jie S.; Zimmeck, Sebastian (2017). A Privacy Analysis of Cross-device Tracking. USENIX Association. pp. 1391–1408. ISBN 978-1-931971-40-9.
  5. Brookman, Justin (2017). "Cross-Device Tracking: Measurement and Disclosures" (PDF). Proceedings on Privacy Enhancing Technologies. 2017 (2): 133–148. doi:10.1515/popets-2017-0020.
  6. "Comments for November 2015 Workshop on Cross-Device Tracking" (PDF).
  7. Sipior, Janice C.; Ward, Burke T.; Mendoza, Ruben A. (30 March 2011). "Online Privacy Concerns Associated with Cookies, Flash Cookies, and Web Beacons". Journal of Internet Commerce. 10 (1): 1–16. doi:10.1080/15332861.2011.558454. ISSN 1533-2861. S2CID 154250015.
  8. "Re: Comments for November 2015 Workshop on Cross-Device Tracking" (PDF). Center for Democracy and Technology. Retrieved 1 April 2016.
  9. "How TV ads silently ping commands to phones: Sneaky SilverPush code reverse-engineered". The Register. Retrieved 1 April 2016.
  10. "FTC letter to app developers" (PDF). Retrieved 1 April 2016.
  11. "Beware of ads that use inaudible sound to link your phone, TV, tablet, and PC". Ars Technica. 13 November 2015. Retrieved 31 March 2016.
  12. "FTC Issues Warning Letters to App Developers Using 'Silverpush' Code". 17 March 2016. Retrieved 1 April 2016.
  13. Korolova, Aleksandra; Sharma, Vinod (2018). "Cross-App Tracking via Nearby Bluetooth Low Energy Devices". Proceedings of the Eighth ACM Conference on Data and Application Security and Privacy. CODASPY '18. New York, NY, USA: ACM. pp. 43–52. doi:10.1145/3176258.3176313. ISBN 978-1-4503-5632-9. S2CID 3933311.
  14. Mavroudis, Vasilios; et al. "On the Privacy and Security of the Ultrasound Ecosystem" (PDF). ubeacsec.org. Proceedings on Privacy Enhancing Technologies. Retrieved 30 November 2017.
  15. Campbell, John Edward; Carlson, Matt (2002). "Panopticon.com: Online Surveillance and the Commodification of Privacy". Journal of Broadcasting & Electronic Media. 46 (4): 586–606. doi:10.1207/s15506878jobem4604_6. ISSN 0883-8151. S2CID 144277483.
  16. Wellman, Barry; Nolan, Jason; Mann, Steve (2003). "Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments". Surveillance & Society. 1 (3): 331–355. doi:10.24908/ss.v1i3.3344. ISSN 1477-7487.
  17. Rosen, Christine (2004). "The Age of Egocasting". The New Atlantis (7): 51–72. ISSN 1543-1215. JSTOR 43152146.
  18. Molz, Jennie Germann (2006). "'Watch us wander': mobile surveillance and the surveillance of mobility". Environment and Planning A. 38 (2): 377–393. doi:10.1068/a37275. ISSN 0308-518X. S2CID 145772112.
  19. Razaghpanah, Abbas; Nithyanand, Rishab; Vallina-Rodriguez, Narseo; Sundaresan, Srikanth; Allman, Mark; Kreibich, Christian; Gill, Phillipa. "Apps, Trackers, Privacy and Regulators: A Global Study of the Mobile Tracking Ecosystem". icsi.berkeley.edu. Retrieved 11 April 2019.
  20. Malgieri, Gianclaudio; Custers, Bart (April 2018). "Pricing privacy – the right to know the value of your personal data". Computer Law & Security Review. 34 (2): 289–303. doi:10.1016/j.clsr.2017.08.006. hdl:1887/72422. S2CID 64962762.
  21. Zuboff, Shoshana (2015). "Big other: Surveillance Capitalism and the Prospects of an Information Civilization". Journal of Information Technology. 30 (1): 75–89. doi:10.1057/jit.2015.5. ISSN 0268-3962. S2CID 15329793.
  22. Huckvale, Kit; Prieto, José Tomás; Tilney, Myra; Benghozi, Pierre-Jean; Car, Josip (25 September 2015). "Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment". BMC Medicine. 13 (1): 214. doi:10.1186/s12916-015-0444-y. PMC 4582624. PMID 26404673.
  23. Malgieri, Gianclaudio; Custers, Bart (April 2018). "Pricing privacy – the right to know the value of your personal data". Computer Law & Security Review. 34 (2): 289–303. doi:10.1016/j.clsr.2017.08.006. hdl:1887/72422. S2CID 64962762.
  24. Lyall, Ben; Robards, Brady (1 March 2018). "Tool, toy and tutor: Subjective experiences of digital self-tracking". Journal of Sociology. 54 (1): 108–124. doi:10.1177/1440783317722854. ISSN 1440-7833.
  25. Gross, Shad; Bardzell, Jeffrey; Bardzell, Shaowen; Stallings, Michael (2 November 2017). "Persuasive Anxiety: Designing and Deploying Material and Formal Explorations of Personal Tracking Devices". Human–Computer Interaction. 32 (5–6): 297–334. doi:10.1080/07370024.2017.1287570. ISSN 0737-0024. S2CID 2557583.
  26. West, Sarah Myers (2019). "Data Capitalism: Redefining the Logics of Surveillance and Privacy". Business & Society. 58 (1): 20–41. doi:10.1177/0007650317718185. ISSN 0007-6503. S2CID 157945904.
  27. "A Contextual Approach to Privacy Online". American Academy of Arts & Sciences. October 2011. Retrieved 18 April 2019.
  28. Hoofnagle, Chris Jay (1 September 2017). "FTC Regulation of Cybersecurity and Surveillance". SSRN 3010205.
  29. Fairfield, Joshua A.T. (2012). "Mixed Reality: How the Laws of Virtual Worlds Govern Everyday Life". Berkeley Technology Law Journal. 27 (1): 55–116. ISSN 1086-3818. JSTOR 24119476.