Reputation system

Reputation systems are programs or algorithms that allow users to rate each other in online communities in order to build trust through reputation. Common uses of these systems can be found on e-commerce websites such as eBay, Amazon.com, and Etsy, as well as on online advice communities such as Stack Exchange. [1] These reputation systems represent a significant trend in "decision support for Internet mediated service provisions". [2] With the popularity of online communities for shopping, advice, and the exchange of other important information, reputation systems are becoming vitally important to the online experience. The idea behind reputation systems is that even if the consumer cannot physically try a product or service, or see the person providing information, they can still be confident in the outcome of the exchange through the trust that such systems build. [2]

Collaborative filtering, used most commonly in recommender systems, is related to reputation systems in that both collect ratings from members of a community. [2] The core difference between the two lies in how they use user feedback: collaborative filtering aims to find similarities between users in order to recommend products to customers, whereas a reputation system gathers a collective opinion in order to build trust between users of an online community.
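As a toy illustration of this difference (not drawn from the cited survey), the sketch below applies both approaches to the same hypothetical rating data: collaborative filtering compares users to each other, while a reputation score aggregates everyone's feedback about a single rated entity. All names and numbers are illustrative.

```python
from math import sqrt

ratings = {  # user -> {rated entity: rating}
    "alice": {"seller_a": 5, "seller_b": 2},
    "bob":   {"seller_a": 4, "seller_b": 1},
    "carol": {"seller_a": 1, "seller_b": 5},
}

def cosine_similarity(u, v):
    """Collaborative filtering: how alike are two users' rating vectors?"""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = sqrt(sum(u[i] ** 2 for i in common))
    norm_v = sqrt(sum(v[i] ** 2 for i in common))
    return dot / (norm_u * norm_v)

def reputation_score(entity):
    """Reputation system: one collective score per rated entity."""
    scores = [r[entity] for r in ratings.values() if entity in r]
    return sum(scores) / len(scores)

print(cosine_similarity(ratings["alice"], ratings["bob"]))  # similarity between two users
print(reputation_score("seller_a"))                         # community-wide trust signal
```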

Types

Online

Howard Rheingold states that online reputation systems are "computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait". [3] Rheingold says that these systems arose as a result of the need for Internet users to gain trust in the individuals they transact with online. The trait he notes in human groups is that social functions such as gossip "keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important". Internet sites such as eBay and Amazon, he argues, seek to make use of this social trait and are "built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site".

Reputation banks

The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services. [4] Users can build up reputation and trust in individual systems but usually do not have the ability to carry those reputations over to other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine Is Yours (2010) [5] that "it is only a matter of time before there is some form of network that aggregates reputation capital across multiple forms of Collaborative Consumption". These systems, often referred to as reputation banks, try to give users a platform for managing their reputation capital across multiple systems.

Maintaining effective reputation systems

The main function of reputation systems is to build a sense of trust among users of online communities. As with brick-and-mortar stores, trust and reputation can be built through customer feedback. Paul Resnick and colleagues, writing in Communications of the ACM, describe three properties that are necessary for reputation systems to operate effectively. [2]

  1. Entities must have a long lifetime and create accurate expectations of future interactions.
  2. They must capture and distribute feedback about prior interactions.
  3. They must use feedback to guide trust.

These three properties are critically important in building reliable reputations, and all revolve around one important element: user feedback. User feedback in reputation systems, whether it be in the form of comments, ratings, or recommendations, is a valuable piece of information. Without user feedback, reputation systems cannot sustain an environment of trust.
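A minimal sketch of how these properties might translate into code, under assumptions that are not taken from the cited sources: entities are identified by a stable, long-lived ID (property 1), feedback about prior interactions is captured in a log (property 2), and the published aggregate is what other users consult when deciding whom to trust (property 3). The names and the simple averaging rule are illustrative.

```python
from collections import defaultdict

feedback_log = defaultdict(list)  # stable, long-lived entity ID -> list of (rating, comment)

def record_feedback(entity_id, rating, comment=""):
    """Capture feedback about a prior interaction (property 2)."""
    feedback_log[entity_id].append((rating, comment))

def published_reputation(entity_id):
    """Distribute an aggregate that others use to guide trust decisions (property 3)."""
    ratings = [rating for rating, _ in feedback_log[entity_id]]
    return sum(ratings) / len(ratings) if ratings else None

# Because "seller_42" is a persistent identity, its published score creates
# expectations about future interactions (property 1).
record_feedback("seller_42", 5, "fast shipping")
record_feedback("seller_42", 4)
print(published_reputation("seller_42"))  # 4.5
```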

Eliciting user feedback presents three related problems.

  1. The first of these problems is the willingness of users to provide feedback when the option to do so is not required. If an online community has a large stream of interactions happening, but no feedback is gathered, the environment of trust and reputation cannot be formed.
  2. The second problem is gaining negative feedback from users. Many factors contribute to users' reluctance to give negative feedback, the most prominent being fear of retaliation when feedback is not anonymous.
  3. The final problem related to user feedback is eliciting honest feedback from users. Although there is no concrete method for ensuring the truthfulness of feedback, if a community of honest feedback is established, new users will be more likely to give honest feedback as well.

Other pitfalls of effective reputation systems described by A. Jøsang et al. include change of identities and discrimination. These issues again tie back to regulating user actions in order to obtain accurate and consistent user feedback. When analyzing different types of reputation systems, it is important to examine these specific features in order to determine the effectiveness of each system.

Standardization attempt

The IETF proposed a protocol for exchanging reputation data. [6] It was originally aimed at email applications, but it was subsequently developed as a general architecture for a reputation-based service, followed by an email-specific part. [7] The workhorse of email reputation, however, remains the DNSxL, which does not follow that protocol. [8] Those specifications do not say how to collect feedback (in fact, the granularity of email-sending entities makes it impractical to collect feedback directly from recipients) but are concerned only with reputation query/response methods.
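Because the DNSxL convention (RFC 5782) is the de facto mechanism mentioned above, a minimal sketch of how a client queries such a list may help. RFC 5782 specifies that an IPv4 address is looked up with its octets reversed under the list's zone, and that a listing is answered with an A record in 127.0.0.0/8. The zone name below is a placeholder, not a real list.

```python
import socket

def dnsbl_listed(ipv4, zone="dnsbl.example.net"):
    """Return True if `ipv4` appears in the DNS-based blocklist `zone` (RFC 5782 convention)."""
    query = ".".join(reversed(ipv4.split("."))) + "." + zone
    try:
        answer = socket.gethostbyname(query)  # e.g. "127.0.0.2" when listed
    except socket.gaierror:                   # NXDOMAIN -> not listed
        return False
    return answer.startswith("127.")          # listings are encoded in 127.0.0.0/8

# RFC 5782 requires real lists to include the test entry 127.0.0.2 and to omit 127.0.0.1.
# With the placeholder zone above, this prints False because no such list exists.
print(dnsbl_listed("127.0.0.2"))
```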

Notable examples of practical applications

Reputation as a resource

High reputation capital often confers benefits upon the holder. For example, a wide range of studies have found a positive correlation between seller rating and selling price on eBay, [10] indicating that high reputation can help users obtain more money for their items. High product reviews on online marketplaces can also help drive higher sales volumes.

Abstract reputation can be used as a kind of resource, to be traded away for short-term gains or built up by investing effort. For example, a company with a good reputation may sell lower-quality products for higher profit until their reputation falls, or they may sell higher-quality products to increase their reputation. [11] Some reputation systems go further, making it explicitly possible to spend reputation within the system to derive a benefit. For example, on the Stack Overflow community, reputation points can be spent on question "bounties" to incentivize other users to answer the question. [12]

Even without an explicit spending mechanism in place, reputation systems often make it easier for users to spend their reputation without harming it excessively. For example, a ridesharing company driver with a high ride acceptance score (a metric often used for driver reputation) may opt to be more selective about his or her clientele, decreasing the driver's acceptance score but improving his or her driving experience. With the explicit feedback provided by the service, drivers can carefully manage their selectivity to avoid being penalized too heavily.

Attacks and defense

Reputation systems are in general vulnerable to attacks, and many types of attack are possible. [13] Because a reputation system tries to generate an accurate assessment in the face of various factors, including but not limited to an unpredictable user base and potentially adversarial environments, attack and defense mechanisms play an important role in reputation systems. [14]

Attacks on reputation systems are classified by identifying which system components and design choices they target, while defense mechanisms are drawn from existing reputation systems.

Attacker model

The capability of the attacker is determined by several characteristics, for example the location of the attacker relative to the system (insider attacker vs. outsider attacker). An insider is an entity that has legitimate access to the system and can participate according to the system specifications, while an outsider is any unauthorized entity in the system, which may or may not be identifiable.

Because outsider attacks closely resemble attacks in other computer-system environments, insider attacks receive more attention in work on reputation systems. Two assumptions are usually made: attackers are motivated by either selfish or malicious intent, and they can work either alone or in coalitions.
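The dimensions of this attacker model can be made explicit as a small data type. This is a minimal sketch with illustrative names that do not come from the cited survey; it simply encodes the distinctions described above.

```python
from dataclasses import dataclass
from enum import Enum

class Location(Enum):
    INSIDER = "legitimate access, participates per the system specifications"
    OUTSIDER = "unauthorized entity, may or may not be identifiable"

class Intent(Enum):
    SELFISH = "selfish"
    MALICIOUS = "malicious"

class Organization(Enum):
    ALONE = "acts independently"
    COALITION = "colludes with other identities"

@dataclass
class AttackerModel:
    location: Location
    intent: Intent
    organization: Organization

# Example: the insider case that receives the most attention in reputation-system research.
print(AttackerModel(Location.INSIDER, Intent.SELFISH, Organization.COALITION))
```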

Attack classification

Attacks against reputation systems are classified based on the goals and methods of the attacker. Hoffman et al. group them into self-promoting, whitewashing, slandering, orchestrated, and denial-of-service attacks. [14]

Defense strategies

A number of strategies can be used to defend against the above attacks; Hoffman et al. survey them alongside the corresponding attack techniques. [17]
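One widely used defensive idea, sketched below under illustrative assumptions (it is not a specific scheme from the cited survey), is to weight each rating by the rater's own standing, so that feedback from newly created or low-reputation identities carries little influence.

```python
def weighted_reputation(ratings, rater_reputation, default_weight=0.1):
    """ratings: list of (rater_id, score); rater_reputation: rater_id -> weight in [0, 1]."""
    num = den = 0.0
    for rater, score in ratings:
        w = rater_reputation.get(rater, default_weight)  # unknown raters count for little
        num += w * score
        den += w
    return num / den if den else None

# Illustrative data: two established raters give high scores, three fresh pseudonyms badmouth.
ratings = [("veteran_1", 5), ("veteran_2", 4), ("sybil_a", 1), ("sybil_b", 1), ("sybil_c", 1)]
weights = {"veteran_1": 0.9, "veteran_2": 0.8}  # fresh identities fall back to default_weight
print(weighted_reputation(ratings, weights))    # 4.0: the slandering attempt is damped
```

Such weighting damps slandering and self-promotion by throwaway identities, but on its own it does not stop a patient attacker who first builds up reputation before misbehaving.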

See also

  Trust metric
  Sybil attack
  Reputation management
  Feedback loop (email)
  Online participation
  Q&A software
  Proof of personhood

References

  1. "What is reputation? How do I earn (and lose) it? - Help Center". Stack Overflow. Retrieved 2022-11-15.
  2. 1 2 3 4 Josang, Audun (2000). "A survey of trust and reputation systems for online service provision". Decision Support Systems. 45 (2): 618–644. CiteSeerX   10.1.1.687.1838 . doi:10.1016/j.dss.2005.05.019. S2CID   209552.
  3. Books in Print Supplement. R. R. Bowker Company. 2002. ISBN   978-0-8352-4564-7.
  4. Tanz, Jason (May 23, 2014). "How Airbnb and Lyft Finally Got Americans to Trust Each Other". Wired.
  5. Botsman, Rachel (2010). What's Mine is Yours . New York: Harper Business. ISBN   978-0061963544.
  6. Nathaniel Borenstein; Murray S. Kucherawy (November 2013). An Architecture for Reputation Reporting. IETF. doi: 10.17487/RFC7070 . RFC 7070 . Retrieved 20 April 2017.
  7. Nathaniel Borenstein; Murray S. Kucherawy (November 2013). A Reputation Response Set for Email Identifiers. IETF. doi: 10.17487/RFC7073 . RFC 7073 . Retrieved 20 April 2017.
  8. John Levine (February 2010). DNS Blacklists and Whitelists. IETF. doi: 10.17487/RFC5782 . RFC 5782 . Retrieved 20 April 2017.
  9. Dencheva, S.; Prause, C. R.; Prinz, W. (September 2011). Dynamic self-moderation in a corporate wiki to improve participation and contribution quality (PDF). Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Aarhus, Denmark. Archived from the original (PDF) on 2014-11-29.
  10. Ye, Qiang (2013). "In-Depth Analysis of the Seller Reputation and Price Premium Relationship: A Comparison Between eBay US And Taobao China" (PDF). Journal of Electronic Commerce Research. 14 (1). Archived from the original (PDF) on 2017-08-08. Retrieved 2015-04-30.
  11. Winfree, Jason, A. (2003). "Collective Reputation and Quality" (PDF). American Agricultural Economics Association Meetings.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  12. "What is a bounty? How can I start one? - Help Center". stackoverflow.com.
  13. Jøsang, A.; Golbeck, J. (September 2009). Challenges for Robust of Trust and Reputation Systems (PDF). Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France.
  14. Hoffman, K.; Zage, D.; Nita-Rotaru, C. (2009). "A survey of attack and defense techniques for reputation systems" (PDF). ACM Computing Surveys. 42: 1–31. CiteSeerX   10.1.1.172.8253 . doi:10.1145/1592451.1592452. S2CID   2294541. Archived from the original (PDF) on 2017-04-07. Retrieved 2016-12-05.
  15. Lazzari, Marco (March 2010). An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz. Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal. Archived from the original on 2016-03-07. Retrieved 2014-08-28.
  16. Srivatsa, M.; Xiong, L.; Liu, L. (2005). TrustGuard: countering vulnerabilities in reputation management for decentralized overlay networks (PDF). Proceedings of the IADIS International Conference e-Society 2010the 14th international conference on World Wide Web. Porto, Portugal. doi:10.1145/1060745.1060808. S2CID   1612033. Archived from the original (PDF) on 2017-10-18.
  17. Hoffman, Kevin; Zage, David; Nita-Rotaru, Cristina (2009-12-14). "A survey of attack and defense techniques for reputation systems". ACM Computing Surveys. 42 (1): 1:1–1:31. CiteSeerX   10.1.1.172.8253 . doi:10.1145/1592451.1592452. ISSN   0360-0300. S2CID   2294541.