
Tit for tat

For other uses, see Tit for Tat (disambiguation).

Tit for tat is an English saying dating to 1556, from "tip for tap", meaning "blow for blow", i.e., retaliation in kind, or more broadly an equivalent given in return for an action. It has related meanings and uses as a concept in biology, social psychology, and business, as well as in the mathematical field of game theory. The concept in its various forms has been used to explain a form of reciprocal altruism in animal communities, and as a strategy for managing interactions in technological systems such as peer-to-peer networks.

In speech and linguistics

Tit for tat is an English expression that is used to refer to "retaliation in kind", or more broadly, for any "equivalent [to an action] given in return."[1] It is thought to have evolved from the earlier expression, "tip for tap," where the connotation of "tip" is "blow", as in to strike physically (e.g., as in "blow for blow"); its reported first appearance was in 1556.[1]

In biology

The title phrase has been used to describe the concept behind how groups of animals have come to live in largely or entirely cooperative societies, rather than the individualistic "red in tooth and claw" way that might be expected from individuals engaged in a Hobbesian state of nature. This, and particularly its application to human society and politics, is the subject of Robert Axelrod's book The Evolution of Cooperation.

In social psychology

The tit-for-tat strategy has also been of use to social psychologists and sociologists in studying effective techniques for reducing conflict. Research has indicated that when individuals who have been in competition for a period of time no longer trust one another, the most effective way to reverse the competition is use of the tit-for-tat strategy. Individuals commonly engage in behavioral assimilation, a process in which they tend to match their own behaviors to those displayed by cooperating or competing group members. Therefore, if the tit-for-tat strategy begins with cooperation, cooperation ensues. On the other hand, if the other party competes, the tit-for-tat strategy leads the alternate party to compete as well. Ultimately, each action by the other member is countered with a matching response: competition with competition and cooperation with cooperation.

In conflict resolution, the tit-for-tat strategy is effective for several reasons: the technique is recognized as clear, nice, provocable, and forgiving. First, it is a clear and recognizable strategy; opponents quickly recognize its contingencies and adjust their behavior accordingly. It is considered nice because it begins with cooperation and only defects in response to a competitive move. The strategy is provocable because it retaliates immediately against those who compete. Finally, it is forgiving because it immediately returns to cooperation should the competitor make a cooperative move.

In business

Individuals who employ the tit-for-tat strategy are generally considered to be tough but fair—a disposition that is often respected in the business/organization world. Those who always cooperate with a competitor are often viewed as weak, while those who consistently compete are perceived as unfair. In any case, the implications of the tit-for-tat strategy have been of relevance to conflict research, resolution and many aspects of applied social science.[2][full citation needed]

In game theory


In game theory, tit for tat is a strategy for a problem called the iterated prisoner's dilemma. It was first introduced as a strategy by Anatol Rapoport in Robert Axelrod's two tournaments, held around 1980.[3] Notably, on both occasions it was both the simplest strategy entered and the most successful in direct competition.[4]

An agent using this strategy will cooperate on the first move, then replicate the opponent's previous action: if the opponent was cooperative, the agent is cooperative; if not, the agent is not. The success of tit for tat, which is largely cooperative even though its name suggests an adversarial stance, took many by surprise. Arrayed against strategies produced by various teams, it won both competitions; after the first competition, new strategies formulated specifically to beat tit for tat failed because of their negative interactions with each other. A successful alternative would have had to be formulated with both tit for tat and itself in mind.
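The rule is simple enough to sketch in a few lines of Python. The payoff values below are the standard illustrative prisoner's-dilemma numbers, not values from the tournaments themselves:

```python
# A minimal sketch of tit for tat in the iterated prisoner's dilemma.
# Moves are 'C' (cooperate) or 'D' (defect).

def tit_for_tat(my_history, opponent_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    if not opponent_history:
        return 'C'
    return opponent_history[-1]

def always_defect(my_history, opponent_history):
    return 'D'

# Standard illustrative payoffs (row, column): mutual cooperation 3 each,
# mutual defection 1 each, lone defector 5, exploited cooperator 0.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strategy_a, strategy_b, rounds):
    """Run an iterated game and return the two cumulative scores."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Tit for tat loses only the opening round to an unconditional defector,
# but earns the full cooperative payoff against itself.
print(play(tit_for_tat, always_defect, 10))  # (9, 14)
print(play(tit_for_tat, tit_for_tat, 10))    # (30, 30)
```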


While Axelrod showed empirically that the strategy is optimal in some cases of direct competition, two agents playing tit for tat remain vulnerable. A one-time, single-bit error in either player's interpretation of events can lead to an unending "death spiral". In this symmetric situation, each side perceives itself as preferring to cooperate, if only the other side would, but each is forced by the strategy into repeatedly punishing an opponent who continues to attack despite being punished in every game cycle. Both sides come to see themselves as innocent and acting in self-defense, and their opponent as either malicious or too stupid to learn to cooperate. This situation arises frequently in real-world conflicts, from schoolyard fights to civil and regional wars. Tit for two tats can be used to mitigate this problem; see the description below.[5]
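The echo effect can be seen in a minimal simulation, assuming a one-time misread by one player while both were cooperating:

```python
# Sketch of the "death spiral": two tit-for-tat players, one misreads a
# single move, and the retaliation echoes back and forth forever.

def next_move(opponent_last):
    """Tit for tat after the opening: copy the opponent's previous move."""
    return opponent_last

a_last, b_last = 'C', 'C'      # both players have been cooperating
b_last_as_seen_by_a = 'D'      # one-time perception error by player A

# A retaliates against the phantom defection; B, seeing A's real history,
# still cooperates. From then on the punishment alternates endlessly.
a, b = next_move(b_last_as_seen_by_a), next_move(a_last)
rounds = [(a, b)]
for _ in range(5):
    a, b = next_move(b), next_move(a)
    rounds.append((a, b))

print(rounds)
# [('D', 'C'), ('C', 'D'), ('D', 'C'), ('C', 'D'), ('D', 'C'), ('C', 'D')]
```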

"Tit for tat with forgiveness" is sometimes superior. When the opponent defects, the player will occasionally cooperate on the next move anyway, allowing recovery from a cycle of mutual defections. The exact probability with which a player should respond with cooperation depends on the line-up of opponents.
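A sketch of the forgiving variant follows; the default 10% forgiveness probability is purely illustrative, since, as noted, the best value depends on the opponents:

```python
import random

def tit_for_tat_with_forgiveness(opponent_history, forgive_prob=0.1):
    """Copy the opponent's last move, but after a defection occasionally
    cooperate anyway, which can break a mutual-retaliation cycle."""
    if not opponent_history:
        return 'C'  # open cooperatively, as plain tit for tat does
    if opponent_history[-1] == 'D' and random.random() < forgive_prob:
        return 'C'  # forgive this defection
    return opponent_history[-1]
```

With `forgive_prob=0` this reduces to plain tit for tat; with `forgive_prob=1` every defection is forgiven (unconditional cooperation).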

The reason for these issues is that tit for tat is not a subgame perfect equilibrium, except under knife-edge conditions on the discount rate.[6] If one agent defects and the opponent cooperates, both agents then alternate cooperation and defection, yielding a lower payoff for each than continual mutual cooperation. While this subgame is not directly reachable by two agents playing tit for tat, a strategy must be a Nash equilibrium in every subgame to be subgame perfect, and the subgame can be reached if any noise is allowed in the agents' signaling. A subgame perfect variant of tit for tat, known as "contrite tit for tat", can be created by employing a basic reputation mechanism.[7]
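The payoff arithmetic behind the alternating subgame can be checked directly, again using the standard illustrative values:

```python
# Why the alternating subgame hurts: with the usual prisoner's-dilemma
# payoffs T=5 (temptation), R=3 (reward), P=1 (punishment), S=0 (sucker),
# two out-of-phase tit-for-tat players take turns exploiting each other,
# earning (T + S) / 2 per round instead of the cooperative R per round.

T, R, P, S = 5, 3, 1, 0

alternating_avg = (T + S) / 2   # one round exploiting, one round exploited
cooperating_avg = R             # steady mutual cooperation

print(alternating_avg, cooperating_avg)  # 2.5 3
assert alternating_avg < cooperating_avg
```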

Furthermore, tit for tat has not been proved optimal in situations short of total competition; for example, when the parties are friends, it may be best for the friendship if a player cooperates at every step despite occasional deviations by the other. Most real-world situations are less competitive than the total competition in which the tit-for-tat strategy won its tournaments.

Tit for two tats

Tit for two tats is similar to tit for tat in that it is nice, retaliating, forgiving and non-envious, the only difference between the two being how forgiving the strategy is. In a tit for tat strategy, once an opponent defects, the tit for tat player immediately responds by defecting on the next move. This has the unfortunate consequence of causing two retaliatory strategies to continuously defect against one another resulting in a poor outcome for both players. A tit for two tats player will let the first defection go unchallenged as a means to avoid the "death spiral" of the previous example. If the opponent defects twice in a row, the tit for two tats player will respond by defecting.
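The rule described above can be sketched as:

```python
def tit_for_two_tats(opponent_history):
    """Defect only if the opponent defected on BOTH of the last two
    moves; a single, isolated defection is forgiven."""
    if opponent_history[-2:] == ['D', 'D']:
        return 'D'
    return 'C'

print(tit_for_two_tats([]))          # 'C' — opens cooperatively
print(tit_for_two_tats(['C', 'D']))  # 'C' — one defection is forgiven
print(tit_for_two_tats(['D', 'D']))  # 'D' — two in a row trigger retaliation
```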

This strategy was put forward by Robert Axelrod during his second round of computer simulations at RAND. After analyzing the results of the first experiment, he determined that had a participant entered the tit for two tats strategy, it would have emerged with a higher cumulative score than any other program. As a result, he entered it himself in the second tournament with high expectations. Unfortunately, owing to the more aggressive nature of the programs entered in the second round, which were able to exploit its highly forgiving nature, tit for two tats did significantly worse (in the game-theoretic sense) than tit for tat.[8]

Real world use

Explaining reciprocal altruism in animal communities

Studies in the prosocial behaviour of animals have led many ethologists and evolutionary psychologists to apply tit-for-tat strategies to explain why altruism evolves in many animal communities. Evolutionary game theory, derived from the mathematical theories formalised by von Neumann and Morgenstern (1953), was first devised by Maynard Smith (1972) and explored further in bird behaviour by Robert Hinde. Their application of game theory to the evolution of animal strategies launched an entirely new way of analysing animal behaviour.

Reciprocal altruism works in animal communities where the cost to the benefactor in any transaction of food, mating rights, nesting or territory is less than the gain to the beneficiary. The theory also holds that the act of altruism should be reciprocated if the balance of needs reverses. Mechanisms to identify and punish "cheaters" who fail to reciprocate, in effect a form of tit for tat, are important in regulating reciprocal altruism. For example, tit for tat has been suggested as the mechanism behind cooperative predator-inspection behavior in guppies.

Explaining war

Tit-for-tat dynamics, in which neither side can back away from conflict for fear of being perceived as weak or as cooperating with the enemy, have been the source of many conflicts throughout history. However, the tit-for-tat strategy has also been detected by analysts in the spontaneous non-violent behaviour, called "live and let live", that arose during trench warfare in the First World War. Troops dug in only a few hundred feet from each other would evolve an unspoken understanding. If a sniper killed a soldier on one side, the other could expect an equal retaliation. Conversely, if no one was killed for a time, the other side would acknowledge this implied "truce" and act accordingly. This created a "separate peace" between the trenches.[9]

Peer-to-peer file sharing

BitTorrent peers use a tit-for-tat strategy to optimize their download speed.[10] More specifically, most BitTorrent peers use a variant of tit for two tats, called regular unchoking in BitTorrent terminology. BitTorrent peers have a limited number of upload slots to allocate to other peers; when a peer's upload bandwidth is saturated, it uses a tit-for-tat strategy, achieving cooperation by exchanging upload bandwidth for download bandwidth. When a remote peer does not upload in return, the BitTorrent client chokes the connection with the uncooperative peer and reallocates that upload slot to a hopefully more cooperative one. Regular unchoking corresponds to always cooperating on the first move in the prisoner's dilemma. Periodically, a peer will also allocate an upload slot to a randomly chosen choked peer, a behavior called optimistic unchoking, which searches for more cooperative peers and gives previously non-cooperative peers a second chance. The optimal threshold values of this strategy are still a subject of research.
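The idea can be illustrated with a toy model. Everything here, including the peer names, slot count, and function name, is illustrative rather than the real client's algorithm or API:

```python
# Toy sketch of BitTorrent-style choking: keep upload slots for the peers
# that currently upload the most to us (reciprocation, i.e. tit for tat),
# plus one "optimistic unchoke" given to a random choked peer so that new
# cooperative partners can be discovered. All names/values are illustrative.

import random

def choose_unchoked(download_rate, regular_slots=3, rng=random):
    """download_rate: {peer_id: bytes/s currently received from that peer}.
    Returns the set of peers to unchoke this interval."""
    # Regular unchoke: reciprocate with the best uploaders.
    ranked = sorted(download_rate, key=download_rate.get, reverse=True)
    unchoked = set(ranked[:regular_slots])
    # Optimistic unchoke: one random remaining peer gets a free slot,
    # giving previously non-cooperative peers a second chance.
    choked = [p for p in ranked if p not in unchoked]
    if choked:
        unchoked.add(rng.choice(choked))
    return unchoked

rates = {'peer_a': 900, 'peer_b': 500, 'peer_c': 50,
         'peer_d': 0, 'peer_e': 0}
print(choose_unchoked(rates))  # the top three uploaders plus one random peer
```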

Cockney rhyming slang

In Cockney rhyming slang a tit for tat or titfer is a hat.

References

  1. ^ a b "Tit for tat," Merriam-Webster Dictionary (online), see [], accessed 14 April 2015.
  2. ^ Forsyth, D. R. (2010). Group Dynamics.
  3. ^ The Axelrod Tournaments
  4. ^ Shaun Hargreaves Heap, Yanis Varoufakis (2004). Game Theory: A Critical Text. Routledge. p. 191. ISBN 0-415-25094-3.
  5. ^ Dawkins, Richard (1989). The Selfish Gene. Oxford University Press. ISBN 978-0-19-929115-1.
  6. ^ Gintis, Herbert (2000). Game Theory Evolving. Princeton University Press. ISBN 0-691-00943-0.
  7. ^ Boyd, Robert (1989). "Mistakes Allow Evolutionary Stability in the Repeated Prisoner's Dilemma Game". Journal of Theoretical Biology 136 (1): 47–56. PMID 2779259. doi:10.1016/S0022-5193(89)80188-2.
  8. ^ Axelrod, Robert (1984). The Evolution of Cooperation. Basic Books. ISBN 0-465-02121-2.
  9. ^ Nice Guys Finish First. Richard Dawkins. BBC. 1986.
  10. ^ Cohen, Bram (2003-05-22). "Incentives Build Robustness in BitTorrent" (PDF). Retrieved 2011-02-05.
