Privacy settings are "the part of a social networking website, internet browser, piece of software, etc. that allows you to control who sees information about you". [1] With the growing prevalence of social networking services, [2] [3] opportunities for privacy exposures also grow. [4] Privacy settings allow a person to control what information is shared on these platforms.
Many social networking services (SNS), such as Facebook, have default privacy settings that leave users more prone to sharing personal information. [5] Privacy settings are shaped by users, companies, and external forces. Factors that influence user activity in privacy settings include the privacy paradox [6] and the third-person effect. [7] The third-person effect helps explain why privacy settings can remain unchanged over time. [7] Companies can enforce a principle of reciprocity (PoR), in which users must decide what information they are willing to share in exchange for others' information. [8]
With the growing focus on internet privacy, technologies and programs have been designed to encourage and streamline privacy-setting activity. Applications such as the Personal Data Manager (PDM) are used to make privacy-setting management more efficient. [9] Privacy by design can enhance privacy settings by incorporating privacy notifications or by prompting users to periodically review their privacy settings. [10]
SNS are designed to connect people online, where users share information and build relationships. [3] Privacy leaks can still occur even with privacy settings in place. [3] Users' connections on SNS can reveal personal information; for example, having many friends from the same university can lead to the inference that a person attends that university. [3] Furthermore, even if a person has strict privacy settings enabled, their information can still leak through connections who have weaker privacy settings in place. [3] This calls for enhanced privacy settings that can tolerate differing privacy preferences while still allowing online connections. [3]
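The following minimal Python sketch (hypothetical, not from the cited work) illustrates how such an inference can be drawn from connections alone: a hidden attribute is guessed from the values a user's friends share publicly.

```python
from collections import Counter

def infer_university(friends_universities, min_share=0.5):
    """Guess a hidden attribute (here, a university) from the values
    publicly shared by a user's connections. Purely illustrative."""
    known = [u for u in friends_universities if u is not None]
    if not known:
        return None
    value, count = Counter(known).most_common(1)[0]
    # Only infer when a majority of friends share the same value.
    return value if count / len(known) >= min_share else None

# The user hides their own university, but most friends list the same one.
friends = ["State University", "State University", None, "Tech College", "State University"]
print(infer_university(friends))  # -> "State University"
```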
The ability to control who views their content influences users' decisions to share or not share images on SNS such as WeChat or Qzone. [11] Different communities call for different levels of privacy. [11] For example, an individual is more likely to share a photo of themselves and a close friend with their close-friend circle and family than with strangers. [11] This reveals a need for fine-grained privacy settings that give users more flexibility in what they share on SNS. [11] Hu Xiaoxu et al. suggest that privacy settings should encourage social networking on SNS while simultaneously protecting user privacy. [11] [12]
SNS privacy settings have defaults that set users up to automatically share the personal information they have entered. [13] For example, Twitter accounts are public by default when first created. [14] Furthermore, SNS privacy policies have been shown to be too complex for consumers to fully understand, leading to personal information being shared regardless of user awareness. [13] Even after a user deletes their Facebook profile, Facebook can still use and sell the user's information according to its privacy policy. [13] Facebook's default settings allow friends to view a person's profile and anyone to search for it. [5] Default settings are often kept out of convenience: accepting them requires less effort than personalizing privacy settings. [15]
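The sketch below illustrates how permissive defaults take effect unless a user explicitly overrides them; the setting names and values are hypothetical and do not reflect any platform's actual options.

```python
# Hypothetical permissive defaults; field names are illustrative only.
DEFAULT_SETTINGS = {
    "profile_visibility": "friends",     # friends can view the profile
    "searchable_by": "everyone",         # anyone can find the profile
    "posts_default_audience": "public",
}

def effective_settings(user_overrides=None):
    """Start from the permissive defaults and apply any explicit choices
    the user has made; unset options silently keep the default."""
    settings = dict(DEFAULT_SETTINGS)
    settings.update(user_overrides or {})
    return settings

print(effective_settings())                              # all defaults apply
print(effective_settings({"searchable_by": "friends"}))  # one explicit override
```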
Privacy settings are situated within the framework of communication privacy management theory. [14] This theory states that privacy management involves setting boundaries and agreements with other individuals, highlighting that once information is shared with others, it becomes their information as well. [16] A study of teenagers and their privacy revealed that privacy concern was the biggest contributor to managing both personal and interpersonal privacy (see Privacy concerns with social networking services). [17] Privacy settings can be inaccessible or effortful to implement. [10] Teenagers who feel fatalistic about their personal privacy are more likely to depend on interpersonal privacy techniques. [14] De Wolf's study revealed that teenagers used personal privacy techniques more than interpersonal privacy management, which emphasizes the need for accessible, clear privacy settings. [14]
Voluntary servitude is the idea that people knowingly give their support to authority figures by subjecting themselves to servitude. [13] In the context of social media, voluntary servitude describes how users expose their information to companies and thereby perpetuate data collection and monetization. [13] Romele et al. offer a possible explanation for voluntary servitude through system justification theory. [13] This theory states that people learn to internalize societal hierarchies and perpetuate their existence. [13] Users comply with the power hierarchy by allowing these companies to extract their information; for example, personal information shared with close family and friends can be harvested by companies and third parties. [13]
The theory of planned behavior links beliefs, attitudes, norms, and behavior. [2] In a survey of undergraduate students about their Facebook use, a majority responded that when their close friends consider privacy protection important or engage in privacy-protective activity, they themselves are more likely to intend to engage in privacy-protective behavior as well. [18]
By accepting privacy policies, and therefore the default settings put in place by companies, users are prone to oversharing information. [19] However, users usually do not change their privacy settings unless they personally experience a privacy invasion or unintentionally share information. [19] In a study exploring the connection between Facebook attitudes and behaviors, having a friend experience a privacy invasion (e.g. someone hacking into their profile) did not lead users to make privacy changes of their own. [7] A possible explanation is the third-person effect: the belief that one is less likely than others to experience a privacy invasion, and is therefore safer. [7] This mindset is flawed because anyone is susceptible to privacy invasions, and managing privacy settings can help decrease that risk. [7]
The privacy paradox is the observation that individuals' privacy attitudes and beliefs do not match their privacy behavior. [6] This concept offers an explanation for why individuals may be complacent about privacy-setting management. [6] The privacy paradox intertwines with the third-person effect because individuals believe privacy is important but do not believe a privacy-related incident will happen to them rather than to others. [6] [7] Recognizing personal privacy as important is low-cost, but actually taking measures to protect one's privacy may be too costly for individuals, which explains the privacy paradox. [6]
An extension of the privacy paradox is the discrepancy between the "nothing to hide" claim and the use of privacy settings. [4] In a study of Canadian teenagers' use of SNS, a majority of participants who claimed they had nothing to hide still used privacy settings such as blocking other users. [4] In addition, participants portrayed different selves across different SNS, such as Facebook and Snapchat, which is itself a form of privacy. [4]
Another reason users do not alter their privacy settings is that they do not know the settings exist. [20] Facebook users who know privacy settings exist are more likely to change them than users who do not. [7] Furthermore, some Facebook users explain their lack of privacy-setting changes by arguing that choosing who to accept as a Facebook friend is already a form of privacy. [7] However, Debatin et al. emphasize that the criteria individuals use to decide who counts as a Facebook friend are typically relaxed, posing privacy risks for users. [7] [21] A different study showed that the more of a user's friends had a private profile, the more likely the user was to adopt a private profile. [5] Spending more time on Facebook and being a woman were also associated with having a private profile. [5] A private Facebook profile was defined as changing the default settings so that non-friends cannot find the profile through search. [6] Users say they are more likely to engage in privacy behavior if the data is valuable, privacy is prevalent on the app, and implementing privacy settings is easy. [22]
In May 2020, Facebook began rolling out an option that lets users delete or archive past posts from a certain time period or involving certain people. [18] This "Manage Activity" option gives users more security and privacy control. [18] The tool is only accessible through the mobile app and has yet to be adapted to the web version of Facebook. [18]
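As a rough illustration of the kind of bulk filtering such a tool performs, the hypothetical sketch below selects past posts by date or by the people involved; it is not Facebook's implementation or API.

```python
from datetime import date

# Each post is a simple record; the structure is hypothetical.
posts = [
    {"id": 1, "created": date(2015, 6, 1), "tagged": {"alice"}},
    {"id": 2, "created": date(2019, 3, 9), "tagged": {"bob"}},
    {"id": 3, "created": date(2021, 1, 5), "tagged": {"alice", "bob"}},
]

def select_posts(posts, before=None, involving=None):
    """Select posts older than a cutoff date and/or involving a given person,
    e.g. as candidates for archiving or deletion."""
    selected = []
    for post in posts:
        if before and post["created"] >= before:
            continue
        if involving and involving not in post["tagged"]:
            continue
        selected.append(post)
    return selected

print(select_posts(posts, before=date(2020, 1, 1), involving="bob"))  # only post 2
```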
However, users are not the sole party involved in privacy: companies are also responsible for implementing default privacy settings and creating the available options. Romele et al. acknowledge that social media companies often play a significant role in perpetuating the voluntary servitude of users. [13] Social media privacy policies can be complex, and default settings are set up to collect data that is profitable to companies. [13] In a study that examined Facebook's privacy policy from 2005 to 2015, Shore et al. found that the policy became increasingly unclear and unaccountable. [23] The portion regarding the use of personal data and third-party involvement was found to be increasingly confusing. [23] Besides portraying their privacy protections as clear and beneficial for users, companies also do little to raise awareness of constant data collection. [13] An example is Facebook's ad preferences. [13] Facebook presents personalized ads in users' feeds as a specialized, beneficial option but does not explicitly state that these ads largely benefit the company and the third parties involved. [13]
Default settings put users on a specific trajectory regarding their privacy. The principle of reciprocity (PoR) plays out in terms of privacy and sociability on these networks. [8] PoR is the concept that users who give up some privacy receive application utility in return. [8] In a WhatsApp study testing user privacy choices (see Reception and criticism of WhatsApp security and privacy features), users were interviewed regarding the Last Seen Online (LSO) and Read Receipt (RR) options. [8] Participants showed a stronger preference for keeping RR on than LSO. [8] Individuals were more willing to share whether they had read a message in exchange for seeing their recipients' read status, an example of the principle of reciprocity. [8] LSO was not as practical as RR because being online did not directly translate to having read a message. [8] In the study, younger participants tended to have LSO turned off, indicating that older participants were less likely to have restrictive privacy settings. [8] This could be because older users have a closer circle of contacts and/or do not place as much emphasis on what they share. [8] Designing privacy settings to be reciprocal forces users to decide what private information they are willing to exchange for others' private information. [8]
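A minimal sketch of a reciprocal setting is shown below: a user sees read status only when both parties share it. The code is illustrative and is not WhatsApp's actual implementation.

```python
def can_see_read_receipt(viewer_shares_receipts, sender_shares_receipts):
    """Reciprocal rule: read status is visible only when both parties
    have read receipts enabled, so giving up some privacy buys utility."""
    return viewer_shares_receipts and sender_shares_receipts

# A user who disables read receipts also loses the ability to see others'.
print(can_see_read_receipt(viewer_shares_receipts=True, sender_shares_receipts=True))   # True
print(can_see_read_receipt(viewer_shares_receipts=False, sender_shares_receipts=True))  # False
```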
SNS companies that want to increase advertising revenue can do so by increasing their number of users and their retention rate. [24] Lin et al. revealed that both informational and territorial privacy are user concerns with SNS. [24] Territory coordination, that is, control over access to an individual's virtual territory such as a Facebook profile or a Twitter page, influences user privacy management more than informational disclosure. [24] Lin et al. recommend more fine-grained privacy settings to better accommodate the wide range of territorial and informational privacy preferences. [24] By targeting user preferences and needs, companies can increase the number of users on their platforms and thereby their revenue. [24] This serves as a monetary motivation for companies to adjust their privacy settings to better support users. [24]
Cultural differences, such as living in a collectivistic versus an individualistic society, can influence the general privacy settings users choose. [14] A study that analyzed Twitter privacy behaviors across the globe found cultural differences between countries. [14] In this study, culture was measured through individualism and uncertainty avoidance. According to Hofstede's cultural dimensions, individualism places a heavy focus on the person, while collectivism is based more on group effort; uncertainty avoidance describes how uncomfortable an individual is with the unknown. Collectivistic societies tend to be less individualistic and less avoidant of uncertainty, while individualistic societies tend to be more individualistic and more avoidant of uncertainty. The results show that collectivistic cultural values support more permissiveness, as measured by sharing geographic location on a person's profile. [14] Privacy settings in collectivistic communities supported more information sharing than in individualistic communities. [14] The distinction between in-group and out-group could also play a role in this relationship: collectivistic societies, such as Japan or China, distinguish more sharply between in-group and out-group than their individualistic counterparts. Users in collectivistic countries typically keep a smaller inner circle, which fosters a more intimate, trusting environment for sharing personal information. [14] Individualistic societies, such as the United States or much of Europe, typically keep a bigger social circle, and the in-group and out-group do not differ as much. Internet penetration, the proportion of a country's population that uses the Internet, also predicted user privacy. [14] In areas with low internet penetration, users were likely to have private accounts but did not conceal their location. [14] This study reveals how cultural values and access to the Internet can affect a person's privacy settings. [14]
Societal norms can also contribute to privacy-setting behavior. [2] Close friends' privacy behavior can influence a person's intentions to engage in similar behavior. [2] However, intentions do not always lead to action, as seen in Saeri et al.'s study. [2] Two weeks after indicating on a survey that they intended to engage in privacy-protective behavior, most participants had done nothing. [2] One explanation lies in the habitual nature of Facebook use, which makes it difficult for some to change their behavior. [2] Facebook also encourages users to share information, which shapes norms that in turn influence privacy behavior. [2]
With the growing prevalence of social media, the risk of privacy leaks also grows. Privacy settings can help protect users if designed and used in specific ways. [25] Privacy settings could be redesigned with the intention of protecting the user as much as possible, an approach called privacy by design. [25] Privacy by design aims to limit the risks of information sharing while maintaining or even increasing the benefits. [25] Privacy policies can be complex and unclear, which is an obstacle to users understanding their privacy. [13] [26] Potential changes include simpler privacy policies that are concise, clear, and understood by the user. Incorporating occasional reminders for users to check, and possibly update, their privacy settings may increase privacy awareness. [10] Because default settings are designed for users to keep an open profile, Watson et al. offered a different default-setting design: Facebook's default privacy settings could be made more conservative (e.g. not being searchable by anyone on Facebook) to help prevent unintentional information sharing. [19] However, utility needs to be balanced against default privacy settings. [19] If default privacy settings were too strict and closed off, the functionality of social media apps could decrease. [19] Default settings should therefore balance protecting the user from unwanted privacy leaks with allowing users to socialize and interact online.
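The following sketch illustrates two privacy-by-design touches discussed above, a periodic reminder to review settings and a more conservative set of defaults; the interval and setting names are hypothetical.

```python
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=90)  # illustrative cadence, not a standard

# A conservative alternative default, in the spirit of Watson et al.'s suggestion.
CONSERVATIVE_DEFAULTS = {"searchable_by": "no one", "posts_default_audience": "friends"}

def needs_privacy_review(last_review, today=None):
    """Return True when the user should be prompted to revisit their settings."""
    today = today or date.today()
    return last_review is None or today - last_review >= REVIEW_INTERVAL

if needs_privacy_review(last_review=date(2024, 1, 1), today=date(2024, 6, 1)):
    print("Reminder: review your privacy settings.", CONSERVATIVE_DEFAULTS)
```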
When first choosing privacy settings, it may be useful to choose from pre-made profiles with varying levels of privacy. [27] Sanchez et al.'s study revealed that such profile examples accurately reflect user privacy preferences and beliefs, which limits the discrepancy between privacy beliefs and privacy behavior. [27] Along with the profile examples, users could then manually adjust their privacy settings afterwards. [27] This keeps the process flexible and more convenient than manually choosing every privacy-setting option. [27] Furthermore, a simulation tool that informs users about the visibility of posts and comments can encourage greater use of privacy settings. [28] Sayin et al. created a Facebook simulation tool that showed users how a post's audience setting would affect the privacy of those who comment on it. [28] For example, if a user comments on a post that was initially viewable only by friends but is later changed to public, the user's privacy is put at risk because Facebook does not notify commenters of the audience change. [28] Ninety-five percent of participants who used the simulation tool believed it would help increase Facebook privacy awareness. [28]
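A minimal sketch of the check such a simulation could perform is shown below: because a comment inherits its post's audience, widening the post's audience exposes every commenter. The audience scale and logic are illustrative, not Sayin et al.'s tool.

```python
# Audiences ordered from most to least restrictive (illustrative scale).
AUDIENCE_RANK = {"only_me": 0, "friends": 1, "friends_of_friends": 2, "public": 3}

def affected_commenters(old_audience, new_audience, commenters):
    """A comment inherits its post's audience, so widening the post's
    audience silently widens every commenter's exposure."""
    if AUDIENCE_RANK[new_audience] > AUDIENCE_RANK[old_audience]:
        return list(commenters)  # everyone who commented is now more exposed
    return []

# A friends-only post is later made public; its commenters should be warned.
print(affected_commenters("friends", "public", ["carol", "dave"]))  # ['carol', 'dave']
```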
Any time a new connection is made on a social media app, users could be prompted to set privacy settings for that specific individual. [10] This may be tedious and too effortful for some users to do effectively, but it can be balanced with the assistance of software tools such as a personal data manager, which takes a user's privacy preferences into account and applies matching privacy settings across that individual's accounts. [9] More research is needed to ensure such software can accurately translate privacy preferences into privacy settings. Personal data managers have the potential to help users become more involved in their privacy and to lessen the effort of setting privacy controls. Another tool, the Adaptive Inference Discovery Service (AID-S), personalizes each user's privacy preferences, since what is considered private varies from individual to individual. [29] Torre et al. found that AID-S can be used to find a user's preferred privacy settings and to help users make more informed privacy decisions, including about third-party involvement. [29] Furthermore, a framework for smart-home information processing has been created that includes two layers of security to improve user privacy. [30] This framework incorporates user privacy preferences, similar to the personal data manager, [9] and uses the Data Encryption Standard (DES) and Top Three Most Significant Bit (TTMSB) technique to safely transmit data from smart-home devices. [30] TTMSB takes user privacy preferences into account and conceals sensitive information during data transmission. [30]
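The core idea of a personal data manager, keeping one central statement of preferences and translating it into each platform's settings, can be sketched as follows; the platform names and setting keys are hypothetical.

```python
# One central statement of the user's preferences (hypothetical keys).
user_preferences = {"share_location": False, "profile_visibility": "friends"}

# How each platform names the corresponding options (also hypothetical).
PLATFORM_SETTING_NAMES = {
    "examplebook": {"share_location": "location_sharing", "profile_visibility": "who_can_see_profile"},
    "examplegram": {"share_location": "geo_tagging", "profile_visibility": "account_visibility"},
}

def settings_for(platform, preferences):
    """Translate the user's central preferences into one platform's settings."""
    mapping = PLATFORM_SETTING_NAMES[platform]
    return {mapping[pref]: value for pref, value in preferences.items()}

for platform in PLATFORM_SETTING_NAMES:
    print(platform, settings_for(platform, user_preferences))
```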
Data science has been used to extract inferred information from databases of collected SNS user data. [25] However, the same inference capabilities can be used to better inform users about how their information may be used, supporting more informed privacy decisions. [25] Control over one's privacy and transparency about how one's information will be used can help reconcile data science and privacy. [25]
Trust-based negotiation is based on a contingency of acceptance or rejection from the user. With respect to privacy, trust-based negotiation has been proposed as an alternative to the binary choice of accepting or rejecting a privacy policy in full, allowing acceptance or rejection of specific parts of the policy. [9] This gives users more control over their privacy and allows interaction between the two parties. The General Data Protection Regulation (GDPR) requires that third parties (TPs) use concise, clear language in their privacy policies and that the user gives a clear response of acceptance or rejection. [9] There must be a consensual agreement between the user and the TPs. [9] If either side is unsatisfied with the privacy terms, they can negotiate. [9] The PDM plays a major role here by mediating between the user and the TP. [9] The PDM applies the user's privacy preferences to the TP's privacy statements and either accepts or rejects them on the user's behalf. [9] If there is a rejection, both parties can enter negotiation. [9] Such an interactive privacy model attempts to make privacy settings a one-time action that is then applied across the Internet of Things (IoT). [9]
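A minimal sketch of the clause-by-clause matching a PDM might perform is shown below: each clause of a third party's policy is accepted or rejected from the user's preferences, and rejected clauses become topics for negotiation. The clause names are hypothetical.

```python
# The user's stance on individual policy clauses (hypothetical clause names).
user_preferences = {"analytics": True, "ad_targeting": False, "resale_to_brokers": False}

# Clauses a third party's privacy policy asks the user to accept.
third_party_clauses = ["analytics", "ad_targeting"]

def evaluate_policy(clauses, preferences):
    """Accept or reject each clause based on the user's preferences; anything
    rejected (or unknown) is returned as a topic for negotiation."""
    accepted = [c for c in clauses if preferences.get(c) is True]
    to_negotiate = [c for c in clauses if c not in accepted]
    return accepted, to_negotiate

accepted, to_negotiate = evaluate_policy(third_party_clauses, user_preferences)
print("accepted:", accepted)        # ['analytics']
print("negotiate:", to_negotiate)   # ['ad_targeting']
```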
The privacy paradox is driven by a lack of privacy knowledge, fatigue, and feeling distant from the prospect of a privacy invasion. [31] Education and increased intrinsic motivation can help alleviate these contributors and, in turn, the privacy paradox. [31] In a study that tested the effectiveness of an interactive privacy smartwatch game, players were more likely to engage in privacy behavior such as enabling a lock screen. [31] The game was personalized: users could create their own avatar and had to complete time-sensitive tasks. [31] The time constraint encouraged continued engagement with the game, and the personalization helped players connect with it. [31] The tasks were split into levels according to difficulty. [31] For example, tasks included checking app permissions, enabling the screen lock, disabling GPS, and turning off SMS permissions. [31] Alongside the tasks, players also had to answer questions about privacy settings, such as "How can you stop your contacts from being used by apps?" [31] A player's digital character had a health level determined by task performance and by answering questions correctly. [31] This study highlights the potential of interactive privacy games to increase privacy literacy and encourage greater use of privacy settings. [31]
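The game mechanics described above, levelled tasks, quiz questions, and an avatar whose health tracks performance, can be sketched as follows; the task list and scoring are illustrative, not the study's implementation.

```python
# Tasks grouped into levels by difficulty (illustrative selection).
TASKS = {
    1: ["enable screen lock", "check app permissions"],
    2: ["disable GPS", "turn off SMS permissions"],
}

class PrivacyGame:
    def __init__(self, health=100):
        self.health = health  # the avatar's health reflects performance

    def complete_task(self, done_in_time):
        # Completing a task on time restores health; missing the deadline costs it.
        self.health += 10 if done_in_time else -10

    def answer_question(self, correct):
        self.health += 5 if correct else -5

game = PrivacyGame()
game.complete_task(done_in_time=True)
game.answer_question(correct=False)
print(game.health)  # 105
```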
Privacy is the ability of an individual or group to seclude themselves or information about themselves, and thereby express themselves selectively.
Internet privacy involves the right or mandate of personal privacy concerning the storage, re-purposing, provision to third parties, and display of information pertaining to oneself via the Internet. Internet privacy is a subset of data privacy. Privacy concerns have been articulated from the beginnings of large-scale computer sharing and especially relate to mass surveillance.
The Platform for Privacy Preferences Project (P3P) is an obsolete protocol allowing websites to declare their intended use of information they collect about web browser users. Designed to give users more control over their personal information while browsing, P3P was developed by the World Wide Web Consortium (W3C) and officially recommended on April 16, 2002. Development ceased shortly thereafter and there have been very few implementations of P3P. Internet Explorer and Microsoft Edge were the only major browsers to support P3P, and Microsoft ended support in Windows 10; as of 2016, Internet Explorer and Edge on Windows 10 no longer support P3P. W3C officially obsoleted P3P on August 30, 2018. The president of TRUSTe has stated that P3P has not been implemented widely due to its difficulty and lack of value.
A social networking service (SNS), or social networking site, is a type of online social media platform which people use to build social networks or social relationships with other people who share similar personal or career content, interests, activities, backgrounds or real-life connections.
Facebook has been the subject of criticism and legal action since it was founded in 2004. Criticisms include the outsize influence Facebook has on the lives and health of its users and employees, as well as Facebook's influence on the way media, specifically news, is reported and distributed. Notable issues include Internet privacy, such as use of a widespread "like" button on third-party websites tracking users, possible indefinite records of user information, automatic facial recognition software, and its role in the workplace, including employer-employee account disclosure. The use of Facebook can have negative psychological and physiological effects that include feelings of sexual jealousy, stress, lack of attention, and social media addiction that in some cases is comparable to drug addiction.
Digital footprint or digital shadow refers to one's unique set of traceable digital activities, actions, contributions, and communications manifested on the Internet or digital devices. Digital footprints can be classified as either passive or active. The former is composed of a user's web-browsing activity and information stored as cookies. The latter is often released deliberately by a user to share information on websites or social media. While the term usually applies to a person, a digital footprint can also refer to a business, organization or corporation.
Targeted advertising is a form of advertising, including online advertising, that is directed towards an audience with certain traits, based on the product or person the advertiser is promoting.
Men and women use social network services (SNSs) differently and with different frequencies. In general, several researchers have found that women tend to use SNSs more than men and for different and more social purposes.
The social data revolution is the shift in human communication patterns towards increased personal information sharing and its related implications, made possible by the rise of social networks in the early 2000s. This phenomenon has resulted in the accumulation of unprecedented amounts of public data.
Communication privacy management (CPM), originally known as communication boundary management, is a systematic research theory developed by Sandra Petronio in 1991. CPM theory aims to develop an evidence-based understanding of the way people make decisions about revealing and concealing private information. It suggests that individuals maintain and coordinate privacy boundaries with various communication partners depending on the perceived benefits and costs of information disclosure. Petronio believes disclosing private information will strengthen one's connections with others, and that we can better understand the rules for disclosure in relationships through negotiating privacy boundaries.
Digital privacy is often used in contexts that promote advocacy on behalf of individual and consumer privacy rights in e-services and is typically used in opposition to the business practices of many e-marketers, businesses, and companies to collect and use such information and data. Digital privacy, a crucial aspect of modern online interactions and services, can be defined under three sub-related categories: information privacy, communication privacy, and individual privacy.
Since the arrival of early social networking sites in the early 2000s, online social networking platforms have expanded exponentially, with the biggest names in social media in the mid-2010s being Facebook, Instagram, Twitter and Snapchat. The massive influx of personal information that has become available online and stored in the cloud has put user privacy at the forefront of discussion regarding the database's ability to safely store such personal information. The extent to which users and social media platform administrators can access user profiles has become a new topic of ethical consideration, and the legality, awareness, and boundaries of subsequent privacy violations are critical concerns in advance of the technological age.
Individualistic cultures are characterized by individualism, which is the prioritization or emphasis of the individual over the entire group. In individualistic cultures, people are motivated by their own preferences and viewpoints. Individualistic cultures focus on abstract thinking, privacy, self-dependence, uniqueness, and personal goals. The term individualistic culture was first used in the 1980s by Dutch social psychologist Geert Hofstede to describe countries and cultures that are not collectivist; Hofstede coined the term when he created a measurement for the five dimensions of cultural values.
A user profile is a collection of settings and information associated with a user. It contains critical information that is used to identify an individual, such as their name, age, portrait photograph and individual characteristics such as knowledge or expertise. User profiles are most commonly present on social media websites such as Facebook, Instagram, and LinkedIn; and serve as voluntary digital identity of an individual, highlighting their key features and traits. In personal computing and operating systems, user profiles serve to categorise files, settings, and documents by individual user environments, known as ‘accounts’, allowing the operating system to be more friendly and catered to the user. Physical user profiles serve as identity documents such as passports, driving licenses and legal documents that are used to identify an individual under the legal system.
Cross-device tracking is technology that enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.
Social profiling is the process of constructing a social media user's profile using his or her social data. In general, profiling refers to the data science process of generating a person's profile with computerized algorithms and technology. There are various platforms for sharing this information with the proliferation of growing popular social networks, including but not limited to LinkedIn, Google+, Facebook and Twitter.
The gathering of personally identifiable information (PII) refers to the collection of public and private personal data that can be used to identify individuals for various purposes, both legal and illegal. PII gathering is often seen as a privacy threat by data owners, while entities such as technology companies, governments, and organizations utilize this data to analyze consumer behavior, political preferences, and personal interests.
The 2018 Google data breach was a major data privacy scandal in which the Google+ API exposed the private data of over five hundred thousand users.
The advent of social networking services has led to many issues spanning from misinformation and disinformation to privacy concerns related to public and private personal data.
Meta Platforms Inc., or Meta for short, has faced a number of privacy concerns. These stem partly from the company's revenue model, which involves selling information collected about its users for purposes including advertisement targeting. Meta has also been involved in a number of data breaches. These and other issues are described further, including user data concerns, vulnerabilities in the company's platform, investigations by pressure groups and government agencies, and even issues involving students. In addition, employers and other organizations and individuals have been known to use Meta's platforms for their own purposes. As a result, individuals' identities and private information have sometimes been compromised without their permission. In response to these growing privacy concerns, some pressure groups and government agencies have increasingly asserted users' right to privacy and to control their personal data.