Honesty is a mental illness
Last reviewed: 23.04.2024
In early June, Dan Ariely, professor of behavioral economics at Duke University, published a book in the US: "The (Honest) Truth About Dishonesty: How We Lie to Everyone, Especially Ourselves." Its main thesis is this: people rarely cheat in a big way, but almost everyone cheats a little, and this second kind of dishonesty is much more harmful, reports The Wall Street Journal, to which the author presented excerpts from the book.
At the outset, Dr. Ariely recalls a story told by one of his students about getting a lock changed. The locksmith he called turned out to be a philosopher, who said that locks on doors exist only to keep honest people honest. One percent of people will always behave honestly and never steal. Another one percent will always behave dishonestly and will constantly try to pick your lock and carry off your TV; locks are unlikely to save you from hardened thieves, who will find a way into your house if they really want to. The purpose of locks, the locksmith said, is to protect you from the 98% of mostly honest people who might be tempted to try your door if it had no lock on it.
So what is the nature of dishonesty? Ariely and his colleagues conducted an experiment in which participants were asked to solve as many problems as possible in 5 minutes, for money. The researchers varied the size of the reward and concluded that this factor did not have the expected influence on the outcome: people did not cheat more when the stakes were higher. On the contrary, when the highest price was assigned to a single solved problem, the amount of cheating decreased. Perhaps, under such conditions, it was harder for participants to cheat while retaining a sense of their own honesty, Ariely suggests.
Changing the likelihood of being caught red-handed also had no effect on the final results. To confirm this, the scientists introduced a "blind" experimenter and allowed the subjects to take their payment from a common basket in accordance with their self-reported results.
In the second part of the experiment, payment was issued not in money but in tokens (which could later be exchanged for money). It turned out that the more indirect the benefit to be gained from cheating, the greater the chance that a person would succumb to the temptation to cheat.
People are also pushed toward lying by the belief that they are not the only ones doing it. At a certain stage, the script included a planted "student named David" who, a minute after the start of the experiment, declared that he had solved all the problems and, winking happily, walked off with a wad of cash. After such impudence, the "performance" of the participants, compared with the control group, jumped three-fold. The reasoning seems to be: if he can, why can't I?
Among other factors that increase the tendency to cheat, Ariely names mental exhaustion, when it is easier for a person to cheat on trifles than to honestly see hard work through to the end. He also points to the belief that the lie will benefit not the deceiver but some "team," and to white lies, when a person grows accustomed to "embellishing reality" for the sake of goals he considers good.