
Reputation Effects in Public and Private Interactions

  • Hisashi Ohtsuki ,

    ohtsuki_hisashi@soken.ac.jp

    Affiliation Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, SOKENDAI (The Graduate University for Advanced Studies), Hayama, Kanagawa, Japan

  • Yoh Iwasa,

    Affiliation Department of Biology, Faculty of Sciences, Kyushu University, Fukuoka, Japan

  • Martin A. Nowak

    Affiliations Program for Evolutionary Dynamics, Harvard University, Cambridge, Massachusetts, United States of America, Department of Mathematics, Harvard University, Cambridge, Massachusetts, United States of America, Department of Organismic and Evolutionary Biology, Harvard University, Cambridge, Massachusetts, United States of America

Abstract

We study the evolution of cooperation in a model of indirect reciprocity where people interact in public and private situations. Public interactions have a high chance of being observed by others and always affect reputation. Private interactions have a lower chance of being observed and only occasionally affect reputation. We explore all second-order social norms and study conditions for evolutionary stability of action rules. We observe the competition between “honest” and “hypocritical” strategies. The former cooperate both in public and in private. The latter cooperate in public, where many others are watching, but try to get away with defection in private situations. The hypocritical idea is that in private situations it does not pay off to cooperate, because there is a good chance that nobody will notice it. We find simple and intuitive conditions for the evolution of honest strategies.

Author Summary

We study the evolution of cooperation based on reputation. This mechanism is called indirect reciprocity. In a world of binary reputations, people help a good individual but do not help a bad one. They also monitor their own reputation to receive reciprocation from others. We propose a novel model of indirect reciprocity where two types of interactions exist. In a public interaction your behavior is always observed by others. In a private interaction, your behavior is less likely to be observed. We study the competition between honest and hypocritical strategies. The former always help good individuals, whereas the latter do so only in public interactions. We describe conditions for the evolution of honest strategies.

Introduction

Most human interactions occur in situations where repetition is possible and reputation is at stake. Repeated interactions in a group of players facilitate evolution of cooperation via indirect reciprocity [1, 2]: here players use conditional strategies that depend on what has happened between others. Cooperation is costly but can establish a good reputation. Others might preferentially cooperate with those who have a good reputation. Many studies explore theoretical [3–49] and empirical [50–70] aspects of indirect reciprocity. Experiments reveal that people help those who help others [50, 52–57, 59, 60, 62–67, 70]. Reputation is a strong driving force of prosocial behavior [54, 61, 71–80]. Internet commerce is based to a large extent on reputation systems: buyers are sensitive to sellers’ reputation [76, 81–87].

The standard framework of indirect reciprocity assumes that each interaction consists of a donor and a recipient. The donor can choose between cooperation and defection. If the donor cooperates, her cost is c and the benefit for the recipient is b. If the donor defects, there is no cost and no benefit. A crucial parameter is the benefit-to-cost ratio, b/c.

Many studies of indirect reciprocity so far assumed that social interactions are public, which means that everyone is informed about the outcome of these interactions. In reality, however, social information is often incomplete. A previous study [6] showed that if donors remember the reputation of their recipient only with probability q, cooperation evolves if b/c > 1/q, suggesting that incomplete information hinders indirect reciprocity.

In this paper, however, we study another source of incomplete information: the absence of observers. Engelmann & Fischbacher [63] found that donors helped recipients substantially more when their reputation score was seen by others than when it was not publicly announced. Accumulating evidence suggests that humans are very sensitive to cues of being observed, and change their social behavior accordingly [72, 73, 75, 77, 79]. These facts motivate us to study strategies in indirect reciprocity when two types of interactions that differ in observability are mixed. In public interactions the donor’s action always affects his reputation, but in private ones it affects his reputation only with probability q and otherwise his reputation is unchanged. In our framework people can use different strategies depending on whether they are in private or in public situations. Moreover, observers can evaluate private and public interactions with different assessment rules. For example, they could be indifferent to displays of public cooperation but strongly reward private cooperation, if they hear about it, or vice versa. Crucial questions such as these have not yet been studied in the context of indirect reciprocity.

Models

We have constructed a theoretical model of indirect reciprocity to study the effect of the mixture of public and private interactions. We consider an infinitely large population of players who are engaged in indirect reciprocity interactions. In each time step, they are randomly matched to form a pair that consists of a donor and a recipient. The donor can choose between cooperation and defection. If the donor cooperates, her cost is c and the benefit for the recipient is b. If the donor defects, there is no cost and no benefit.

In addition, each interaction within a pair is public with probability p and private with probability 1 − p. For simplicity we assume that a public interaction is always observed, while a private interaction is observed only with probability q. Therefore, on average, an interaction is observed with probability p + (1 − p)q.
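As a numerical illustration (ours, using the first parameter set later employed in Fig 4), take p = 0.25 and q = 0.25:

\[
p + (1-p)q \;=\; 0.25 + 0.75 \times 0.25 \;=\; 0.4375,
\]

so roughly 44% of all interactions are observed.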

We consider a world with binary reputation scores: each player has either a good or a bad reputation, depending on his previous actions. We assume that everyone knows and agrees on others’ reputation scores. A strategy in indirect reciprocity games is called an action rule. We have 16 different action rules. Each action rule specifies whether to cooperate or to defect given that the situation is either public or private and given that the reputation of the recipient is either good or bad (Fig 1). For example, the action rule CDCD (honest) means: cooperate with good recipients and defect with bad ones, no matter whether the situation is public or private. In contrast, CDDD (hypocrite) means: in public situations the actor will cooperate with good recipients and defect with bad ones, but in private situations the actor will always defect. There is also unconditional cooperation, CCCC, and unconditional defection, DDDD.
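To make the encoding concrete, here is a minimal sketch in Python (our own illustration, not code from the paper); the function name `prescribed_action` and the digit order (public-good, public-bad, private-good, private-bad) are assumptions that are consistent with the examples CDCD and CDDD above.

```python
# Minimal sketch (ours): encoding the 16 action rules as 4-letter strings.
# Assumed digit order: (public-good, public-bad, private-good, private-bad),
# matching the examples CDCD (honest) and CDDD (hypocrite) in the text.
from itertools import product

ACTION_RULES = ["".join(bits) for bits in product("CD", repeat=4)]  # 16 rules

def prescribed_action(rule, situation, recipient_reputation):
    """Return 'C' or 'D' prescribed by `rule` in the given context."""
    index = {("public", "good"): 0, ("public", "bad"): 1,
             ("private", "good"): 2, ("private", "bad"): 3}
    return rule[index[(situation, recipient_reputation)]]

# The honest rule cooperates with good recipients even in private,
# whereas the hypocrite defects there.
assert prescribed_action("CDCD", "private", "good") == "C"
assert prescribed_action("CDDD", "private", "good") == "D"
assert len(ACTION_RULES) == 16
```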

Fig 1. An action rule specifies how a donor behaves in ‘public’ and ‘private’ interactions with recipients who have either a ‘good’ or ‘bad’ reputation.

The donor’s choice is binary: cooperate or defect. There are 16 different action rules. A social norm describes how (potential) observers evaluate an interaction. The donor’s reputation is updated dependent on the type of interaction (public or private), the recipient’s reputation (good or bad) and the donor’s action (cooperate or defect). There are 256 such second-order social norms.

https://doi.org/10.1371/journal.pcbi.1004527.g001

If the other people in the population are informed about an interaction, then they update the reputation of the actor by using their social norm. We consider second-order social norms [14, 15], which depend on the action of the donor, the reputation of the recipient and whether the interaction was public or private. Thus, there are 256 possible social norms (Fig 1). We assume that all people in the population use the same social norm. After a single game interaction, players leave the pair and return to the population pool to wait for another recruitment. After a sufficiently large number of interactions, natural selection acts on action rules according to their payoffs. A goal of our analysis is to identify which of the 16 strategies (action rules) are evolutionarily stable for each of the 256 norms.
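The reputation bookkeeping can be sketched in the same style (again our own illustration; `scoring_norm` and `update_reputation` are hypothetical names). A second-order norm assigns “good” or “bad” to each of the 2 × 2 × 2 = 8 combinations of situation, recipient reputation and donor action, which is where the count of 2^8 = 256 norms comes from.

```python
# Minimal sketch (ours): a second-order social norm maps
# (situation, recipient reputation, donor action) -> 'G' or 'B'.
# With 8 binary inputs there are 2**8 = 256 possible norms.
import random
from itertools import product

CONTEXTS = list(product(("public", "private"), ("good", "bad"), ("C", "D")))

def scoring_norm():
    """The 'Scoring' norm: cooperation is good, defection is bad, always."""
    return {ctx: ("G" if ctx[2] == "C" else "B") for ctx in CONTEXTS}

def update_reputation(current, norm, situation, recipient_rep, action, q, rng=random):
    """Donor's reputation after one interaction.

    Public interactions are always observed; private interactions are
    observed only with probability q, otherwise reputation is unchanged.
    """
    observed = (situation == "public") or (rng.random() < q)
    if not observed:
        return current
    return "good" if norm[(situation, recipient_rep, action)] == "G" else "bad"

# Example: a private defection toward a 'good' recipient is judged bad
# under Scoring only if it happens to be observed (probability q).
norm = scoring_norm()
```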

Because of the assumption of an infinitely large population, no two players ever meet more than once, so any form of direct reciprocity is excluded from the model. We further assume that the time scale of natural selection is much slower than that of social interactions and reputation updates. Throughout the analysis, therefore, we can assume that the fraction of time during which one has a good reputation is at an equilibrium level. The model presented here can further be extended by incorporating the effects of errors and a finite time horizon. Details of the present model as well as those extensions are described in S1 Text.

Our analysis uses the method of dynamic programming [18, 27]. Details are provided in S1 Text, but to grasp the idea, here we show one example. Consider the social norm that regards cooperation as good and defection as bad irrespective of the other factors. Such a norm is called “Scoring” [11, 15]. We ask, for example, if the action rule CDCD (honest) is an evolutionarily stable strategy (ESS) under this norm. For that purpose, we temporarily assume that everyone adopts CDCD.

Dynamic programming allows us to calculate the value of a good reputation, v, which reflects the advantage of possessing a good reputation over a bad one. In our example this value is positive (see S1 Text for the derivation), suggesting that keeping a good reputation is advantageous; qualitatively, however, this is a trivial consequence of everyone using CDCD.

With the value v, we check whether each digit of the action rule CDCD is the optimal behavior; if any digit is not, CDCD is not an ESS. In our example, cooperation in public interactions costs c to the donor but makes his reputation good, by which he obtains the future benefit v. Therefore, the relative size of c and v matters. If v < c then cooperation in public interactions does not pay, so the first digit of CDCD is not optimal. If v > c then defection in public interactions does not pay, so the second digit of CDCD is not optimal. Either way, CDCD does not follow the optimal behavior, so it is not an ESS. An analysis of this kind is systematically repeated for each of the 256 social norms and for each of the 16 action rules to find all ESSs.
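The digit-by-digit check in this example can be written out as follows. This is a simplified sketch of our own (not the full dynamic program in S1 Text), under the illustrative assumptions that a good reputation is worth v, a bad one is worth 0, and that under Scoring an observed cooperation always yields a good reputation while an observed defection yields a bad one.

```python
# Simplified sketch (ours) of the digit-wise best-response check under Scoring.
# Assumptions for illustration: good reputation worth v, bad worth 0; observed
# cooperation -> good, observed defection -> bad; private interactions are
# observed with probability q; unobserved actions leave reputation unchanged.

def best_public_action(v, c):
    # cooperate: -c + v (reputation good); defect: 0 (reputation bad)
    return "C" if v > c else "D"

def best_private_action(v, c, q):
    # the unobserved branch contributes equally to both choices and cancels
    return "C" if q * v > c else "D"

def cdcd_is_best_response(v, c, q):
    """True only if every digit of CDCD is optimal under Scoring."""
    pub, priv = best_public_action(v, c), best_private_action(v, c, q)
    return (pub == "C"        # public, good recipient
            and pub == "D"    # public, bad recipient: same comparison under Scoring
            and priv == "C"   # private, good recipient
            and priv == "D")  # private, bad recipient

# `pub` cannot equal both "C" and "D", so CDCD is never a best response to
# itself under Scoring, reproducing the argument in the text.
```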

Results

We find that the notorious DDDD is always an ESS. The question is which other strategies are ESS and under what conditions. We note that DDDD is an ESS for any social norm, whereas all cooperative strategies require specific social norms to be ESS. In Fig 2, we show the social norms that allow CDDD (hypocrite) and CDCD (honest), respectively, to be ESS.

Fig 2. There are 9 social norms that make CDCD evolutionarily stable, 3 social norms that stabilize CCCD, and 3 social norms that stabilize CDCC.

There are 48 social norms that make CDDD evolutionarily stable. Note that CDDD leads only to partial cooperation. The other three action rules achieve perfect cooperation, except for CDCD when used with the social norm in which both wild cards are BB. The hypocrite strategy CDDD is robust for a wider range of social norms, but the honest strategy CDCD achieves higher levels of cooperation.

https://doi.org/10.1371/journal.pcbi.1004527.g002

We obtain the following results. If b/c is less than both 1 + q(1 − p)/p and 1 + p(1 − q)/q then the only ESS is DDDD. If b/c > 1 + q(1 − p)/p then CDDD is an ESS. CDDD achieves only partial cooperation. If b/c > 1 + p(1 − q)/q then CDCD is an ESS; this is the crucial condition for the evolution of an honest strategy. Social norms that support the evolutionary stability of CDCD turn out to be combinations of the “Simple-Standing”, “Kandori” (also known as “Stern-Judging”), and “Shunning” social norms [15, 19, 21] (see Fig 2). More lenient strategies can also evolve if stronger conditions are fulfilled. If b/c > 1 + p/[q(1 − p)] then CCCD is an ESS. If b/c > 1/p + 1/q − 1 then CDCC is an ESS. All three strategies, CDCD, CCCD and CDCC, achieve full cooperation under appropriate social norms. S1 Text describes all the ESSs we found in our analysis. Note that our model allows multiple ESSs. In Fig 3, we show the ESS that achieves the highest level of cooperation for the given parameters, because either group competition [17, 19, 22, 30] or intergroup contingent movement by players [27] is likely to favor such an ESS. Fig 3 also distinguishes between the two cases b/c > 2 and 1 < b/c < 2. Quite interestingly, in the latter case we find a pocket in our parameter space (shown in gray in Fig 3) in which no action rule other than DDDD is evolutionarily stable. This result is an unexpected consequence of the interplay between public and private interactions; the reputation that one acquires in a public interaction affects one’s private interactions, and vice versa.
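These threshold conditions can be checked mechanically for any parameter combination. The following sketch is our own illustration (not the authors’ code); it simply rewrites the curve formulas listed in the caption of Fig 3 as lower bounds on b/c.

```python
# Sketch (ours): which cooperative action rules meet their ESS threshold for
# given p (frequency of public interactions), q (observability of private
# interactions) and r = b/c. DDDD, which is always an ESS, is omitted.

def cooperative_ess(p, q, r):
    thresholds = {
        "CDDD": 1 + q * (1 - p) / p,       # curve 4 of Fig 3
        "CDCD": 1 + p * (1 - q) / q,       # curve 3 of Fig 3
        "CCCD": 1 + p / (q * (1 - p)),     # curve 1 of Fig 3
        "CDCC": 1 / p + 1 / q - 1,         # curve 2 of Fig 3
    }
    return [rule for rule, threshold in thresholds.items() if r > threshold]

# The four parameter sets used in Fig 4 (b/c = 5), restricted to the two
# rules stabilized by the norm studied there (CDCD and CDDD):
for p, q in [(0.25, 0.25), (0.8, 0.1), (0.1, 0.8), (0.8, 0.8)]:
    ess = cooperative_ess(p, q, 5.0)
    print((p, q), [rule for rule in ("CDCD", "CDDD") if rule in ess])
# -> both rules at (0.25, 0.25) and (0.8, 0.8), CDDD only at (0.8, 0.1),
#    CDCD only at (0.1, 0.8), matching the purple, red and blue regions
#    described in the caption of Fig 4.
```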

Fig 3. Analytical classification of indirect reciprocity with public and private interactions.

Parameter regions for the most cooperative, evolutionarily stable action rules are shown. p denotes the frequency of public interactions; q denotes the probability that a private interaction is observed. Perfect cooperation (using CDCD) is realized above the solid line labeled as ‘3’, whereas only partial cooperation (using CDDD) is achieved below. There are also parameter regions where the lenient strategies, CCCD and CDCC, are ESS. For b/c < 2 there exists a parameter region where no ESS other than DDDD is found (shown in gray). The curves are given by 1: q = p/[(r − 1)(1 − p)], 2: q = p/[(r + 1)p − 1], 3: q = p/(p + r − 1), and 4: q = (r − 1)p/(1 − p), where r = b/c.

https://doi.org/10.1371/journal.pcbi.1004527.g003

Our results have simple intuitive justifications. Consider the crucial condition, b/c > [p + (1 − p)q]/q, that needs to hold for the honest strategy, CDCD, to be the most cooperative ESS. The most dangerous invader is CDDD. Let us now examine the situation where you have a good reputation and meet another person with a good reputation in a private situation. If you cooperate then you lose c, but maintain a good reputation. If you defect then you save c, but obtain a bad reputation with probability q. Once that occurs you will not receive cooperation (losing b) until you recover a good reputation. To regain a good reputation your help must be observed by a third party, which occurs with probability p + (1 − p)q per interaction. Therefore you must wait 1/[p + (1 − p)q] rounds on average. Thus, cooperation costs c, whereas defection costs qb/[p + (1 − p)q]. Cooperation is less costly if c < qb/[p + (1 − p)q], yielding the desired condition b/c > [p + (1 − p)q]/q, which is equivalent to b/c > 1 + p(1 − q)/q. Similar intuitions exist for the other conditions and are described in S1 Text.
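For completeness, the verbal comparison above can be written as one line of algebra (our own restatement): the left-hand side is the cost of cooperating, the right-hand side is the probability of being caught defecting times the benefit forgone while one’s reputation is bad,

\[
c \;<\; q \cdot \frac{b}{p + (1-p)q}
\quad\Longleftrightarrow\quad
\frac{b}{c} \;>\; \frac{p + (1-p)q}{q} \;=\; 1 + \frac{p(1-q)}{q},
\]

which, solved for q, is exactly curve 3 of Fig 3, q = p/(p + b/c − 1).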

When comparing with previous models of incomplete information, one may wonder why the evolutionary condition for the honest strategy, CDCD, is now more relaxed. Note that b/c > [p + (1 − p)q]/q requires a lower benefit-to-cost ratio than the previously known condition, b/c > 1/q. The explanation is as follows. In previous models, observers are always present and your reputation as a donor is always updated, but people sometimes fail to remember your reputation. In contrast, our current model assumes that observers are sometimes absent. The absence of observers could be good news for defectors, but not necessarily so; once you obtain a bad reputation, it carries over to your future interactions until your next interaction is observed. Therefore, defectors experience more hardship in the current model.
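That the new condition is indeed the weaker one can be verified directly (our own check):

\[
\frac{p + (1-p)q}{q} \;=\; \frac{p}{q} + (1-p) \;\le\; \frac{p}{q} + \frac{1-p}{q} \;=\; \frac{1}{q},
\]

with equality only when q = 1 (or p = 1), so the threshold for CDCD in the present model is never more demanding than b/c > 1/q.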

In Fig 4 we compare our analytical results with numerical simulations of evolutionary dynamics. We examine a particular social norm under which defection against a ‘good’ recipient leads to a bad reputation in public and private situations (when they are observed), while all other actions lead to a good reputation. The pairwise invasion plots confirm our evolutionary stability analysis for the various parameter regions. We also study convergence to equilibrium points starting from various initial conditions. The system behaves in accordance with our analytical results. Additional numerical tests are performed in S1 Text.
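For readers who want to reproduce the pairwise invasion test, its overall structure is sketched below (our own schematic, not the authors’ code). The function `payoffs` is a hypothetical placeholder: in the actual analysis the expected payoffs of wildtype and mutant follow from the reputation dynamics under the chosen social norm, as detailed in S1 Text, and that computation is omitted here.

```python
# Schematic sketch (ours) of the pairwise invasion test used for Fig 4.
# `payoffs(x)` is a hypothetical, user-supplied function returning the
# expected payoffs (f_wildtype, f_mutant) when the mutant has frequency x.

def invades(payoffs, x0=0.01, steps=100_000, dt=0.01):
    """Deterministic replicator dynamics for a wildtype-mutant pair.

    Returns True if the mutant's frequency exceeds its initial value x0
    after a long run, the criterion described in the caption of Fig 4.
    """
    x = x0
    for _ in range(steps):
        f_wild, f_mut = payoffs(x)
        f_mean = (1 - x) * f_wild + x * f_mut
        x += dt * x * (f_mut - f_mean)     # replicator equation
        x = min(max(x, 0.0), 1.0)          # keep the frequency in [0, 1]
    return x > x0
```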

Fig 4. Numerical tests of evolutionary dynamics for a particular social norm which stabilizes both CDCD and CDDD (in addition to DDDD).

Top left: the social norm that is studied. Top right: our theoretical prediction of cooperative ESS for various p and q for b/c = 5. CDCD is ESS in the blue region. CDDD is ESS in the red region. Both are ESS in the purple region. Middle and Bottom rows: results of deterministic computer simulations. The chance of erroneous reputation assignment is set to e = 0.03. Middle row: pairwise comparison between a wildtype (initial frequency = 0.99) and a mutant (= 0.01). Invasion is deemed successful if the mutant’s frequency exceeds 0.01 after a long run. We observe no neutrality. Bottom row: competition among 16 action rules, where one of them is initially abundant (= 0.99) and the others are rare (= 0.01/15 each), or all 16 are initially equally abundant (= 1/16; ‘all equal’ treatment). Our theoretical prediction is in agreement with this numerical analysis. Parameter values (p, q) are I:(0.25, 0.25), II:(0.8, 0.1), III:(0.1, 0.8) and IV:(0.8, 0.8).

https://doi.org/10.1371/journal.pcbi.1004527.g004

Discussion

We have studied indirect reciprocity in public and private situations. Our system has three parameters: the benefit-to-cost ratio, b/c, the frequency of public interactions, p, and the probability that private interactions are revealed, q. Of fundamental interest is the competition between the honest strategy, CDCD, which cooperates with deserving recipients both in public and in private, and the hypocritical strategy, CDDD, which cooperates only in public and never in private. We find that CDCD can prevail over CDDD even for reasonably small values of q. The critical q for CDCD to be the most cooperative ESS is a declining function of b/c and an increasing function of p. Specifically, we can derive the following prediction: helping in a private interaction is suppressed (i) if the observability q is low and (ii) if private interactions are rare (i.e., p is large); see Fig 3. Empirical studies that examine a mixture of public and private interactions, in the laboratory or in the field, are needed to test this prediction.
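In terms of the curve formulas listed in the caption of Fig 3, this critical observability is q* = p/(p + b/c − 1), and the stated monotonicity can be checked directly (our own calculation):

\[
q^{*} \;=\; \frac{p}{p + b/c - 1}, \qquad
\frac{\partial q^{*}}{\partial p} \;=\; \frac{b/c - 1}{(p + b/c - 1)^{2}} \;>\; 0, \qquad
\frac{\partial q^{*}}{\partial (b/c)} \;=\; \frac{-\,p}{(p + b/c - 1)^{2}} \;<\; 0,
\]

so the critical q rises as public interactions become more frequent and falls as the benefit-to-cost ratio increases (for b/c > 1).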

One is tempted to associate the two key strategies with different motives: CDCD behaves ‘properly’ irrespective of the probability of being observed, while CDDD cooperates (with deserving recipients) only when there is certainty that others will notice the good deed; in contrast, it tries to get away with defection when no one is watching. Thus CDCD seems to have ‘higher morals’, while CDDD represents a more utilitarian approach. There is, however, also a cynical interpretation of CDCD: as we have noted, this strategy is only stable if the probability of being observed in private interactions is sufficiently high; therefore a CDCD player cooperates in private situations because, on average, it pays off to do so. Our theoretical results suggest that strategic reputation building [15, 57, 63] is a strong driving force for the evolution of action rules in indirect reciprocity.

Supporting Information

S1 Text. Full analysis of the model.

The supporting information is structured as follows. Section 1 describes our full model with maximal generality. Section 2 describes the analytical framework to analyze this full model. Section 3 shows the results of our ESS search. Section 4 explains a reduced model, which directly corresponds to the model described in the main text. Section 5 describes numerical simulations.

https://doi.org/10.1371/journal.pcbi.1004527.s001

(PDF)

Author Contributions

Conceived and designed the experiments: HO YI MAN. Performed the experiments: HO YI MAN. Analyzed the data: HO YI MAN. Contributed reagents/materials/analysis tools: HO YI MAN. Wrote the paper: HO YI MAN.

References

  1. Sugden R. The Economics of Rights, Cooperation and Welfare. Oxford: Basil Blackwell; 1986.
  2. Alexander RD. The Biology of Moral Systems. New York: Aldine de Gruyter; 1987.
  3. Kandori M. Social norms and community enforcement. Review of Economic Studies. 1992; 59(1):63–80.
  4. Okuno-Fujiwara M, Postlewaite A. Social norms and random matching games. Games Econ Behav. 1995; 9(1):79–109.
  5. Nowak MA, Sigmund K. Evolution of indirect reciprocity by image scoring. Nature. 1998; 393(6685):573–577. pmid:9634232
  6. Nowak MA, Sigmund K. The dynamics of indirect reciprocity. J Theor Biol. 1998; 194(4):561–574. pmid:9790830
  7. Leimar O, Hammerstein P. Evolution of cooperation through indirect reciprocation. Proc Biol Sci. 2001; 268(1468):745–753. pmid:11321064
  8. Fishman MA. Indirect reciprocity among imperfect individuals. J Theor Biol. 2003; 225(3):285–292. pmid:14604582
  9. Mohtashemi M, Mui L. Evolution of indirect reciprocity by social information: the role of trust and reputation in evolution of altruism. J Theor Biol. 2003; 223(4):523–531. pmid:12875829
  10. Panchanathan K, Boyd R. A tale of two defectors: the importance of standing for evolution of indirect reciprocity. J Theor Biol. 2003; 224(1):115–126. pmid:12900209
  11. Brandt H, Sigmund K. The logic of reprobation: assessment and action rules for indirect reciprocation. J Theor Biol. 2004; 231(4):475–486. pmid:15488525
  12. Ohtsuki H, Iwasa Y. How should we define goodness?—reputation dynamics in indirect reciprocity. J Theor Biol. 2004; 231(1):107–120. pmid:15363933
  13. Panchanathan K, Boyd R. Indirect reciprocity can stabilize cooperation without the second-order free-rider problem. Nature. 2004; 432(7016):499–502. pmid:15565153
  14. Brandt H, Sigmund K. Indirect reciprocity, image scoring, and moral hazard. Proc Natl Acad Sci USA. 2005; 102(7):2666–2670. pmid:15695589
  15. Nowak MA, Sigmund K. Evolution of indirect reciprocity. Nature. 2005; 437(7063):1291–1298. pmid:16251955
  16. Brandt H, Sigmund K. The good, the bad and the discriminator—errors in direct and indirect reciprocity. J Theor Biol. 2006; 239(2):183–194. pmid:16257417
  17. Chalub FA, Santos FC, Pacheco JM. The evolution of norms. J Theor Biol. 2006; 241(2):233–240. pmid:16388824
  18. Ohtsuki H, Iwasa Y. The leading eight: social norms that can maintain cooperation by indirect reciprocity. J Theor Biol. 2006; 239(4):435–444. pmid:16174521
  19. Pacheco JM, Santos FC, Chalub FACC. Stern-judging: a simple, successful norm which promotes cooperation under indirect reciprocity. PLoS Comput Biol. 2006; 2(12):e178. pmid:17196034
  20. Takahashi N, Mashima R. The importance of subjectivity in perceptual errors on the emergence of indirect reciprocity. J Theor Biol. 2006; 243(3):418–436. pmid:16904697
  21. Ohtsuki H, Iwasa Y. Global analyses of evolutionary dynamics and exhaustive search for social norms that maintain cooperation by reputation. J Theor Biol. 2007; 244(3):518–531. pmid:17030041
  22. Santos FC, Chalub FACC, Pacheco JM. A multi-level selection model for the emergence of social norms. In: Almeida e Costa F, et al., editors. Advances in Artificial Life, Lecture Notes in Computer Science Volume 4648. Berlin: Springer; 2007. pp. 525–534.
  23. Suzuki S, Akiyama E. Evolution of indirect reciprocity in groups of various sizes and comparison with direct reciprocity. J Theor Biol. 2007; 245(3):539–552. pmid:17182063
  24. Suzuki S, Akiyama E. Three-person game facilitates indirect reciprocity under image scoring. J Theor Biol. 2007; 249(1):93–100. pmid:17714735
  25. Fu F, Hauert C, Nowak MA, Wang L. Reputation-based partner choice promotes cooperation in social networks. Phys Rev E. 2008; 78(2 Pt 2):026117.
  26. Roberts G. Evolution of direct and indirect reciprocity. Proc Biol Sci. 2008; 275(1631):173–179. pmid:17971326
  27. Ohtsuki H, Iwasa Y, Nowak MA. Indirect reciprocity provides only a narrow margin of efficiency for costly punishment. Nature. 2009; 457(7225):79–82. pmid:19122640
  28. Rankin DJ, Taborsky M. Assortment and the evolution of generalized reciprocity. Evolution. 2009; 63(7):1913–1922. pmid:19222566
  29. Iwagami A, Masuda N. Upstream reciprocity in heterogeneous networks. J Theor Biol. 2010; 265(3):297–305. pmid:20478314
  30. Scheuring I. Coevolution of honest signaling and cooperative norms by cultural group selection. Biosystems. 2010; 101(2):79–87. pmid:20444429
  31. Sigmund K. The calculus of selfishness. Princeton NJ: Princeton University Press; 2010.
  32. Uchida S. Effect of private information on indirect reciprocity. Phys Rev E. 2010; 82(3 Pt 2):036111.
  33. Uchida S, Sigmund K. The competition of assessment rules for indirect reciprocity. J Theor Biol. 2010; 263(1):13–19. pmid:19962390
  34. Berger U. Learning to cooperate via indirect reciprocity. Games Econ Behav. 2011; 72(1):30–37.
  35. Masuda N. Clustering in large networks does not promote upstream reciprocity. PLoS One. 2011; 6(10):e25190. pmid:21998641
  36. Nakamura M, Masuda N. Indirect reciprocity under incomplete observation. PLoS Comput Biol. 2011; 7(7):e1002113. pmid:21829335
  37. Panchanathan K. Two wrongs don’t make a right: the initial viability of different assessment rules in the evolution of indirect reciprocity. J Theor Biol. 2011; 277(1):48–54. pmid:21329700
  38. Peña J, Pestelacci E, Berchtold A, Tomassini M. Participation costs can suppress the evolution of upstream reciprocity. J Theor Biol. 2011; 273(1):197–206. pmid:21216253
  39. Masuda N. Ingroup favoritism and intergroup cooperation under indirect reciprocity based on group reputation. J Theor Biol. 2012; 311:8–18. pmid:22796271
  40. Nakamura M, Masuda N. Groupwise information sharing promotes ingroup favoritism in indirect reciprocity. BMC Evol Biol. 2012; 12(1):213. pmid:23126611
  41. Sigmund K. Moral assessment in indirect reciprocity. J Theor Biol. 2012; 299:25–30. pmid:21473870
  42. Martinez-Vaquero LA, Cuesta JA. Evolutionary stability and resistance to cheating in an indirect reciprocity model based on reputation. Phys Rev E. 2013; 87(5):052810.
  43. Oishi K, Shimada T, Ito N. Group formation through indirect reciprocity. Phys Rev E. 2013; 87:030801(R).
  44. Suzuki S, Kimura H. Indirect reciprocity is sensitive to costs of information transfer. Sci Rep. 2013; 3:1435. pmid:23486389
  45. Tanabe S, Suzuki H, Masuda N. Indirect reciprocity with trinary reputations. J Theor Biol. 2013; 317:338–347. pmid:23123557
  46. Berger U, Grüne A. Evolutionary stability of indirect reciprocity by image scoring; 2014. Department of Economics Working Paper No. 168. Available: https://epub.wu-wien.ac.at/4087/1/wp168.pdf.
  47. Matsuo T, Jusup M, Iwasa Y. The conflict of social norms may cause the collapse of cooperation: indirect reciprocity with opposing attitudes towards in-group favoritism. J Theor Biol. 2014; 346:34–46. pmid:24380777
  48. Nakamura M, Ohtsuki H. Indirect reciprocity in three types of social dilemmas. J Theor Biol. 2014; 355:117–127. pmid:24721479
  49. Ghang W, Nowak MA. Indirect reciprocity with optional interactions. J Theor Biol. 2015; 365:1–11. pmid:25305556
  50. Wedekind C, Milinski M. Cooperation through image scoring in humans. Science. 2000; 288(5467):850–852. pmid:10797005
  51. Dufwenberg M, Gneezy U, Güth W, van Damme E. Direct vs indirect reciprocation: an experiment. Homo Oeconomicus. 2001; 18:19–30.
  52. Milinski M, Semmann D, Bakker TC, Krambeck HJ. Cooperation through indirect reciprocity: image scoring or standing strategy? Proc Biol Sci. 2001; 268(1484):2495–2501. pmid:11747570
  53. Milinski M, Semmann D, Krambeck HJ. Reputation helps solve the ‘tragedy of the commons’. Nature. 2002; 415(6870):424–426. pmid:11807552
  54. Milinski M, Semmann D, Krambeck HJ. Donors to charity gain in both indirect reciprocity and political reputation. Proc Biol Sci. 2002; 269(1494):881–883. pmid:12028769
  55. Wedekind C, Braithwaite VA. The long-term benefits of human generosity in indirect reciprocity. Curr Biol. 2002; 12(12):1012–1015. pmid:12123575
  56. Bolton GE, Katok E, Ockenfels A. Cooperation among strangers with limited information about reputation. J Public Econ. 2005; 89(8):1457–1468.
  57. Semmann D, Krambeck HJ, Milinski M. Reputation is valuable within and outside one’s own social group. Behav Ecol Sociobiol. 2005; 57(6):611–616.
  58. Bshary R, Grutter AS. Image scoring and cooperation in a cleaner fish mutualism. Nature. 2006; 441(7096):975–978. pmid:16791194
  59. Rockenbach B, Milinski M. The efficient interaction of indirect reciprocity and costly punishment. Nature. 2006; 444(7120):718–723. pmid:17151660
  60. Seinen I, Schram A. Social status and group norms: indirect reciprocity in a repeated helping experiment. Eur Econ Rev. 2006; 50(3):581–602.
  61. Milinski M, Rockenbach B. Spying on others evolves. Science. 2007; 317(5837):464–465. pmid:17656712
  62. Sommerfeld RD, Krambeck HJ, Semmann D, Milinski M. Gossip as an alternative for direct observation in games of indirect reciprocity. Proc Natl Acad Sci USA. 2007; 104(44):17435–17440. pmid:17947384
  63. Engelmann D, Fischbacher U. Indirect reciprocity and strategic reputation-building in an experimental helping game. Games Econ Behav. 2009; 67(2):399–407.
  64. Ule A, Schram A, Riedl A, Cason TN. Indirect punishment and generosity toward strangers. Science. 2009; 326(5960):1701–1704. pmid:20019287
  65. Pfeiffer T, Tran L, Krumme C, Rand DG. The value of reputation. J R Soc Interface. 2012; 9(76):2791–2797. pmid:22718993
  66. Kato-Shimizu M, Onishi K, Kanazawa T, Hinobayashi T. Preschool children’s behavioral tendency toward social indirect reciprocity. PLoS One. 2013; 8(8):e70915. pmid:23951040
  67. Molleman L, van den Broek E, Egas M. Personal experience and reputation interact in human decisions to help reciprocally. Proc Biol Sci. 2013; 280(1757):20123044. pmid:23446528
  68. Rand DG, Nowak MA. Human cooperation. Trends Cogn Sci. 2013; 17(8):413–425. pmid:23856025
  69. Yoeli E, Hoffman M, Rand DG, Nowak MA. Powering up with indirect reciprocity in a large-scale field experiment. Proc Natl Acad Sci USA. 2013; 110(Suppl 2):10424–10429. pmid:23754399
  70. Watanabe T, Takezawa M, Nakawake Y, Kunimatsu A, Yamasue H, Nakamura M, et al. Two distinct neural mechanisms underlying indirect reciprocity. Proc Natl Acad Sci USA. 2014; 111(11):3990–3995. pmid:24591599
  71. Fehr E. Don’t lose your reputation. Nature. 2004; 432(7016):449–450.
  72. Haley KJ, Fessler DMT. Nobody’s watching? Subtle cues affect generosity in an anonymous economic game. Evol Hum Behav. 2005; 26(3):245–256.
  73. Bateson M, Nettle D, Roberts G. Cues of being watched enhance cooperation in a real-world setting. Biol Lett. 2006; 2(3):412–414. pmid:17148417
  74. Milinski M, Semmann D, Krambeck HJ, Marotzke J. Stabilizing the earth’s climate is not a losing game: supporting evidence from public goods experiments. Proc Natl Acad Sci USA. 2006; 103(11):3994–3998. pmid:16537474
  75. Mifune N, Hashimoto H, Yamagishi T. Altruism toward in-group members as a reputation mechanism. Evol Hum Behav. 2010; 31(2):109–117.
  76. Tennie C, Frith U, Frith CD. Reputation management in the age of the world-wide web. Trends Cogn Sci. 2010; 14(11):482–488. pmid:20685154
  77. Ernest-Jones M, Nettle D, Bateson M. Effects of eye images on everyday cooperative behavior: a field experiment. Evol Hum Behav. 2011; 32(3):172–178.
  78. Jacquet J, Hauert C, Traulsen A, Milinski M. Shame and honour drive cooperation. Biol Lett. 2011; 27(6):899–901.
  79. Oda R, Niwa Y, Honma A, Hiraishi K. An eye-like painting enhances the expectation of a good reputation. Evol Hum Behav. 2011; 32(3):166–171.
  80. dos Santos M, Rankin DJ, Wedekind C. Human cooperation based on punishment reputation. Evolution. 2013; 67(8):2446–2450. pmid:23888865
  81. Melnik M, Alm J. Does a seller’s eCommerce reputation matter? Evidence from eBay auctions. J Indust Econom. 2002; 50(3):337–349.
  82. Resnick P, Zeckhauser R. Trust among strangers in internet transactions: empirical analysis of eBay’s reputation system. In: Baye MR, editor. The economics of the internet and e-commerce. Amsterdam: Elsevier Science; 2002. pp. 127–157.
  83. Bolton GE, Katok E, Ockenfels A. Trust among internet traders: a behavioral economics approach. Analyse und Kritik. 2004; 26(1):185–202.
  84. Bolton GE, Katok E, Ockenfels A. How effective are electronic reputation mechanisms? An experimental investigation. Management Sci. 2004; 50(11):1587–1602.
  85. Houser D, Wooders J. Reputation in auctions: theory, and evidence from eBay. J Econom Management Strategy. 2006; 15(2):353–369.
  86. Resnick P, Zeckhauser R, Swanson J, Lockwood K. The value of reputation on eBay: a controlled experiment. Exp Econ. 2006; 9(2):79–101.
  87. Bolton G, Greiner B, Ockenfels A. Engineering trust: reciprocity in the production of reputation information. Management Sci. 2013; 59(2):265–285.