Social norm design for information exchange systems with limited observations

Jie Xu, Mihaela Van Der Schaar

Research output: Contribution to journal › Article

11 Citations (Scopus)

Abstract

Information exchange systems such as BitTorrent, Yahoo Answers, Yelp, and Amazon Mechanical Turk differ in many ways, but all share a common vulnerability to selfish behavior and free-riding. In this paper, we build incentive schemes based on social norms. Social norms prescribe a social strategy for the agents in the system to follow and deploy reputation schemes that reward or penalize agents depending on whether they follow or deviate from the prescribed strategy when selecting actions. Because agents in these systems often have only a limited capability to observe global system information, e.g., the reputation distribution of the agents participating in the system, their beliefs about the reputation distribution are heterogeneous and biased. Such belief heterogeneity causes a positive fraction of agents not to follow the social strategy. In such practical scenarios, the standard equilibrium analysis used in the economics literature is no longer directly applicable, and hence the system design needs to account for these differences. To investigate how the system design needs to change, we focus on a simple social norm with binary reputation labels but allow the punishment severity to be adjusted through randomization. First, we model the belief heterogeneity using a suitable Bayesian belief function. Next, we formalize the agents' optimal decision problems and derive the scenarios in which they follow the prescribed social strategy. Then we study how the system state is determined by the agents' strategic behavior. We are particularly interested in the robust equilibrium, where the system state becomes invariant when all agents strategically optimize their decisions. By rigorously studying two specific cases, in which the agents' belief distribution is constant or is linearly influenced by the true reputation distribution, we prove that the optimal reputation update rule is to choose the mildest possible punishment. Simulations confirm this result for more sophisticated belief influences. In conclusion, our proposed design framework enables the development of optimal social norms for various deployment scenarios with limited observations.
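The abstract outlines the main ingredients of the framework: a binary-label reputation scheme, a punishment severity that can be adjusted through randomization, heterogeneous and biased beliefs about the reputation distribution, and a robust equilibrium at which the system state is invariant under the agents' optimal decisions. The toy sketch below (Python) illustrates those ingredients only; the payoff parameters b and c, the discount factor delta, the belief model, and the one-shot-deviation decision rule are illustrative assumptions, not the paper's model, and the sketch does not reproduce the paper's optimality result about the mildest punishment.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) parameters -- not taken from the paper.
b, c, delta = 1.0, 0.3, 0.8   # benefit of being served, cost of serving, discount factor
n_agents = 10_000

def perceived_fraction(x_true, bias_scale=0.2):
    # Heterogeneous, biased beliefs about the true good-reputation fraction x_true.
    return np.clip(x_true + bias_scale * rng.standard_normal(n_agents), 0.0, 1.0)

def comply_decisions(x_belief, beta):
    # Illustrative one-shot-deviation check: comply if the perceived discounted
    # value of keeping a good reputation, scaled by the punishment probability
    # beta, outweighs the one-shot gain c from shirking.
    future_value = delta / (1.0 - delta) * b * x_belief
    return beta * future_value >= c

def next_state(x_true, beta):
    # One round of reputation updates: compliers keep the good label; deviators
    # are demoted to the bad label with probability beta (randomized punishment).
    comply = comply_decisions(perceived_fraction(x_true), beta)
    demoted = (~comply) & (rng.random(n_agents) < beta)
    return 1.0 - demoted.mean()

def robust_equilibrium(beta, x0=0.9, iters=200):
    # Iterate the state map until the good-reputation fraction is (roughly) invariant.
    x = x0
    for _ in range(iters):
        x = next_state(x, beta)
    return x

for beta in (0.1, 0.3, 0.6, 1.0):
    print(f"punishment severity beta={beta:.1f} -> "
          f"good-reputation fraction ~ {robust_equilibrium(beta):.3f}")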

Original language: English (US)
Article number: 6354271
Pages (from-to): 2126-2135
Number of pages: 10
Journal: IEEE Journal on Selected Areas in Communications
Volume: 30
Issue number: 11
DOIs: 10.1109/JSAC.2012.121205
State: Published - 2012
Externally published: Yes

Fingerprint

  • Systems analysis
  • Labels
  • Information systems
  • Economics

Keywords

  • game theory
  • limited observations
  • reputation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Computer Networks and Communications

Cite this

Social norm design for information exchange systems with limited observations. / Xu, Jie; Van Der Schaar, Mihaela.

In: IEEE Journal on Selected Areas in Communications, Vol. 30, No. 11, 6354271, 2012, p. 2126-2135.

Research output: Contribution to journal › Article

@article{e29c32149eb946fca09f65e897f939b9,
title = "Social norm design for information exchange systems with limited observations",
abstract = "Information exchange systems, such as BitTorrent, Yahoo Answers, Yelp, Amazon Mechanical Turk, differ in many ways, but all share a common vulnerability to selfish behavior and free-riding. In this paper, we build incentives schemes based on social norms. Social norms prescribe a social strategy for the agents in the system to follow and deploy reputation schemes to reward or penalize agents depending on whether they follow or deviate from the prescribed strategy when selecting actions. Because agents in these systems often have only limited capability to observe the global system information, e.g. the reputation distribution of the agents participating in the system, their beliefs about the reputation distribution are heterogeneous and biased. Such belief heterogeneity causes a positive fraction of agents to not follow the social strategy. In such practical scenarios, the standard equilibrium analysis deployed in the economics literature is no longer directly applicable and hence, the system design needs to consider these differences. To investigate how the system designs need to change, we focus on a simple social norm with binary reputation labels but allow adjusting the punishment severity through randomization. First, we model the belief heterogeneity using a suitable Bayesian belief function. Next, we formalize the agents' optimal decision problems and derive in which scenarios they follow the prescribed social strategy. Then we study how the system state is determined by the agents' strategic behavior. We are particularly interested in the robust equilibrium where the system state becomes invariant when all agents strategically optimize their decisions. By rigorously studying two specific cases where agents' belief distribution is constant or is linearly influenced by the true reputation distribution, we prove that the optimal reputation update rule is to choose the mildest possible punishment. This result is further confirmed for more sophisticated belief influences in simulations. In conclusion, our proposed design framework enables the development of optimal social norms for various deployment scenarios with limited observations.",
keywords = "game theory, limited observations, reputation",
author = "Jie Xu and {Van Der Schaar}, Mihaela",
year = "2012",
doi = "10.1109/JSAC.2012.121205",
language = "English (US)",
volume = "30",
pages = "2126--2135",
journal = "IEEE Journal on Selected Areas in Communications",
issn = "0733-8716",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "11",

}

TY - JOUR

T1 - Social norm design for information exchange systems with limited observations

AU - Xu, Jie

AU - Van Der Schaar, Mihaela

PY - 2012

Y1 - 2012

KW - game theory

KW - limited observations

KW - reputation

UR - http://www.scopus.com/inward/record.url?scp=84870264859&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84870264859&partnerID=8YFLogxK

U2 - 10.1109/JSAC.2012.121205

DO - 10.1109/JSAC.2012.121205

M3 - Article

AN - SCOPUS:84870264859

VL - 30

SP - 2126

EP - 2135

JO - IEEE Journal on Selected Areas in Communications

JF - IEEE Journal on Selected Areas in Communications

SN - 0733-8716

IS - 11

M1 - 6354271

ER -