Information Hazards in Career and Life

This site was named after the idea that we need to face unpleasant facts. More specifically, it takes its name from a quote in George Orwell’s Why I Write.

“I knew that I had a facility with words and a power of facing unpleasant facts, and I felt that this created a sort of private world in which I could get my own back for my failure in everyday life.”

Now, I lack Orwell’s facility with words, but I have always thought of my ability to face and discuss unpleasant facts as an asset. After encountering Nick Bostrom’s work on information hazards, I am less certain that this is the case. According to Bostrom, an information hazard is:

A risk that arises from the dissemination or the potential dissemination of (true) information that may cause harm or enable some agent to cause harm

One basic example of this dynamic: once people know that nuclear bombs or certain bioweapons are possible, agents inclined to cause harm might figure out how to make such weapons for themselves. For the world at large, the advancement of science and the spread of scientific knowledge are not always the unalloyed goods we assume them to be.

Bostrom’s paper does a good job categorizing different types of hazards, but I want to focus on how individuals might use the concept. One example of a potential information hazard: a person who would be much happier never knowing their partner cheated on them. It is only an information hazard if they could have lived their life happily without ever learning of the infidelity. If they would have found out eventually anyway, because other people laughed at them for it, because they caught an STD, because they genetically tested their children, or because their partner later left them for the mistress, then the revelation is just bad news arriving earlier, giving the person more time to adjust, which is not an information hazard.

On an individual level, one way to frame this is the classic Matrix dilemma, where Morpheus asks Neo whether he wants to take the red pill and see reality for what it is, or take the blue pill and go back to believing everything is normal. Many people immediately think the red pill is the obvious option, but those who take it can no longer relate to their previous friends. And worse, by taking the red pill Neo made enemies of immensely powerful people.

Information hazards are not necessarily a red-versus-blue choice, since taking the red pill might give you superpowers. Sometimes the hazard is more like a black pill: it gives you a slightly more accurate understanding of how the world works but saps your motivation to navigate it effectively.

Social media is one way information hazards spread through the world. Here is an example that applies to everyone who uses it: we know intellectually that some people hold horrible views, but today more and more people are saying offensive things in ways that can spread online. When a partisan commentator highlights some random person on the other team saying something ridiculous about the world, a practice often referred to as nut-picking, their partisan fans become less inclined to interact in good faith with anyone on the other side. Everyone is worse off for being exposed to true but unrepresentative information about how some people see the world.

It might be helpful to think through a few more scenarios.

Knowledge that makes you unpopular

One type of knowledge that is often worse than useless is knowledge that makes you unpopular. It might be fun to be the one who tells your classmates that Santa Claus is not real, but they will not like you for it. Similarly, knowing that something your colleagues ardently believe is false can be dangerous for anyone with a predisposition for correcting others.

An accurate vision of a future point in time without considering the paths that will lead there

There have been many economic bubbles, and not participating early on has always been the wrong choice from a financial perspective.

Imagine a situation where people are getting excited about a technology. A pessimist interested in the area knows it will not be viable for another decade, while others in the field insist that success is just a couple of years away. If the pessimist decides not to join or invest in a group of otherwise very smart people working on the problem, they miss out on working with people at the cutting edge of the field. The venture might still be a success if the group can convince someone else to buy the company, whether public market investors or a CEO who wants exposure to the cool new technology and is willing to buy early even if the numbers do not add up. The pessimist could have been right about the timeline and the fundamental economics and still miss out massively by not joining that initial team.

Knowing that a problem is really hard

Similarly, industry insiders often miss out on new ideas that remake their industry because they are too aware of the little details that new solutions do not address. Focused on those details, they fail to appreciate what can be built on top of new technologies.

Understanding that what they are doing is not helping the world (or their customers)

It is debatable whether this is a true information hazard for the world at large. But from an organization’s or individual’s perspective, if they or their employees shift from believing they are helping the world to believing they are a negative force, morale will take a hit and productivity will fall. Like bad news about infidelity, this is not an information hazard if there are effective paths to resolving the issues. But if an employee is naïve enough to put information about the problems into an email, it could later be used as evidence against the company.

Google’s quest to keep antitrust-related phrases out of any written communications is an example where explicit knowledge of market dominance is effectively an information hazard.

When team members understand that they are not aligned with the firm

Many organizations have incentives set up in ways that clever employees can abuse. When people come to understand these mismatched incentives, the organization can suffer. An employee who notices a misalignment and points it out publicly without thinking might not be helping the company or themselves: other employees may now be tempted to exploit the loophole, and some people in management may resent them for raising an issue when nothing seemed broken before they mentioned it.

Complete knowledge of costs, incomplete knowledge of communication norms

Let’s say a project manager knows that certain types of projects run into cost overruns, and they incorporate those projections into their presentation. First, the project will look expensive compared to others. Second, observers might not give them credit for incorporating the overruns ahead of time and might add an additional premium on top of their estimates, making the project look even worse in comparison. The project manager suffers for acknowledging the true costs ahead of time. From a societal perspective, it might be beneficial if fewer boondoggles like high-speed rail projects get started under extremely rosy projections, but the people working on a project will hurt their careers if accurate information about its actual costs becomes widely known ahead of time.

Knowledge that a person acted badly towards someone else

Knowing that another person treated someone badly is another type of information that is sometimes protective but can also backfire. People are already subject to the fundamental attribution error, ignoring situational factors when judging others’ behavior. And if the person who did something questionable is going to remain part of their social circle or business world regardless, then it would be better not to hear about a one-off instance of bad behavior unless that behavior is likely to be persistent or extreme enough to be worth noting.

Information about who is to blame during a crisis

Figuring out who to blame can backfire if everyone is going to remain on the team and will be working together to fix things for the time being. If there is a big leak in the boat, hunting for clues about who caused it is a distraction from the main goal of bailing out the water and plugging the hole. If two parents leave their child at the mall, establishing who was supposed to bring them home does not help. Until the crisis is solved, information about who is at fault is likely to do more harm than good. Like other potential information hazards, this only holds to the extent that the person at fault is not an active saboteur. (Or, in the parent analogy, it is possible that proof that one parent was at fault lets them stop fighting and focus on solving the problem.)

General solutions for personal information hazards: optimism, social grace, and esotericism

One puzzle that the existence of personal information hazards helps explain is why pessimists are generally more accurate than optimists, yet optimists succeed more often. About the only career in which pessimists do better is law, where understanding downside scenarios is particularly valued. Developing a bias towards optimism helps you avoid dwelling on information hazards that are likely to bring you or other people down.

One aspect of social grace is developing the habit of not spewing low-grade information hazards at people. If that is too difficult, and I can relate, then being a little more esoteric in your speech and writing might be an alternative solution.

Knowledge in any amount can sometimes be a dangerous thing. But thinking through some of these dynamics can help us understand when and why we might want to hold off on making the world, or at least our social circles, face certain unpleasant facts.
