
Making Informed Decisions in Imperfect Situations with Bayes’ Theorem – Part 2


https://ift.tt/30O0DG9

In my blog “Making Informed Decisions in Imperfect Situations”, I discussed the importance of properly and objectively framing the decision we seek to make, and how that framing shapes the data we gather (and ignore) in an effort to make an informed decision. That is:

Are you trying to gather data to determine the right decision, or are you gathering data to support the decision that you have already made?

In that blog, I introduced two tools that can help us make informed decisions using the best available data, even when that data might be incomplete, conflicting, and/or distorted by others. The first tool is the Informed Decision-making Framework (special thanks to Erik Burd), with the iterative process outlined in Figure 1.

Figure 1: Tool #1: Informed Decision-making Framework

The second tool is the Decision Matrix (special thanks to Craig “Doc” Savage for sharing the Decision Matrix). We can use the Decision Matrix to help us make an informed decision by following the steps outlined in Figure 2.

Figure 2: Tool #2: The MECE Decision Matrix

Objectivity is everything. If you come into this process with your mind already made up about the decision that you want to prove, then you will only find data that supports your position and find reasons to ignore the data that runs counter to what you already believe. Many folks succumb to this confirmation bias – the tendency to interpret new evidence as confirmation of one's existing beliefs – and only seek data that supports the decision that they have already made.

We can use Bayes’ Theorem and Bayesian Inference to help combat confirmation bias in our decision-making.

Using Bayes’ Theorem to Combat Confirmation Bias

Understanding how to use Bayes’ Theorem (and Bayesian Inference) to help combat confirmation bias starts with a basic understanding of Conditional Probabilities.

Conditional Probability is a measure of the probability of an event occurring, given that another event has already occurred (Figure 3).

Figure 3: Conditional Probabilities
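Written out, the conditional probability of event A given event B is P(A | B) = P(A and B) / P(B). As a quick illustration with made-up numbers: if P(B) = 0.30 and P(A and B) = 0.12, then P(A | B) = 0.12 / 0.30 = 0.40.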

Conditional Probability introduces the concept of using prior knowledge to make an informed decision in an environment where data accuracy and completeness are changing. That sets up Bayes’ Theorem, which determines the probability of an event occurring based on prior knowledge of conditions related to that event (Figure 4).

Figure 4: What is Bayes’ Theorem?
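In symbols, Bayes’ Theorem says:

P(A | B) = P(B | A) × P(A) / P(B)

where P(A) is the prior probability of event A, P(B | A) is the likelihood of observing evidence B when A is true, P(B) is the overall probability of observing B, and P(A | B) is the updated (posterior) probability of A once B has been observed.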

For example, if a disease is related to age, then using Bayes' theorem, a person's age can be used to more accurately assess the probability that they have the disease, compared to the assessment of the probability of disease made without knowledge of the person's age.
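As a hypothetical illustration (the numbers are made up purely to show the mechanics): if 1% of the population has the disease, 20% of the population is in the older age group, and 60% of people with the disease are in that age group, then P(disease | older) = 0.60 × 0.01 / 0.20 = 0.03. Knowing the person’s age triples the estimated probability relative to the 1% prior.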

In another example, we can determine the likelihood that it is going to rain given that the day started cloudy. Figure 5 shows the Bayes’ Theorem framing and the associated calculations (plus a cool website that helps with the Bayes’ Theorem calculation).

Figure 5: Bayes’ Theorem in Action!
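Since the figure itself isn’t reproduced here, a minimal sketch of the calculation might look like the following; the probabilities are assumptions chosen for illustration, not the values from Figure 5.

```python
# Illustrative only: these probabilities are assumptions, not the values from Figure 5.
p_rain = 0.10                # prior probability that it rains on any given day
p_cloudy_given_rain = 0.50   # probability the day starts cloudy, given that it rains
p_cloudy = 0.40              # probability that any given day starts cloudy

# Bayes' Theorem: P(rain | cloudy) = P(cloudy | rain) * P(rain) / P(cloudy)
p_rain_given_cloudy = p_cloudy_given_rain * p_rain / p_cloudy
print(f"P(rain | cloudy) = {p_rain_given_cloudy:.3f}")  # 0.125
```

With these assumed numbers, a cloudy start raises the probability of rain from the 10% prior to 12.5%.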

Finally, Bayesian Inference is a method of statistical inference in which Bayes’ Theorem is used to update the probability for a hypothesis as more data becomes available. Bayesian Inference can be used to gradually update the probability of the occurrence of an event as more data is gathered or updated, such as what is happening today as data about COVID keeps accumulating.
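A minimal sketch of that idea is below; the hypothesis and likelihood values are assumptions chosen only to show the mechanics of feeding each posterior back in as the prior for the next piece of evidence.

```python
# Sequential Bayesian updating: the posterior after each observation becomes the
# prior for the next one. All likelihood values here are illustrative assumptions.

def update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | evidence) for hypothesis H via Bayes' Theorem."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

belief = 0.50  # starting prior for the hypothesis
# Each tuple holds (P(evidence | H), P(evidence | not H)) for one new observation.
for likelihoods in [(0.8, 0.3), (0.7, 0.4), (0.9, 0.2)]:
    belief = update(belief, *likelihoods)
    print(f"Updated P(H | data so far) = {belief:.3f}")
```

Each pass through the loop plays the role of “new data arriving”: the belief moves up or down depending on how strongly the latest evidence favors the hypothesis.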

Summary: Using Bayes’ Theorem to Make Informed Decisions

We live in a world of probabilities and can use probabilities to improve the likelihood of making better decisions, especially decisions about preventative actions. But using probabilities to make decisions does not guarantee that you will always make the right decision.

Resulting, a term that comes from the world of poker, is the tendency to equate the quality of a decision with the quality of its outcome. In other words: if we got the result we wanted, we assume it must be because we did something right; if we didn’t get the result we wanted, we assume it must be because we did something wrong. But a bad outcome does not mean the decision was wrong based on what was known at the time it was made.

And that’s where Bayesian Inference can help. As we get more data, we can revise our probabilities of making the right decision. Again, that doesn’t guarantee that we’ll get the desired outcome, but we can improve our chances of the desired outcome (surviving a car crash, surviving a bike accident, surviving COVID) by employing Bayesian concepts to factor updated data into our decision-making models.

The ability to update the likelihood of an event occurring is critical in situations where new data is being generated and new insights are being uncovered. The willingness to ingest new facts, toss out outdated ones, and ignore the droning in our ears of other people’s distorted versions of the truth is the key to survival, not only as professionals, but as a species.

 



from Featured Blog Posts - Data Science Central https://ift.tt/3x90g4Y
via RiYo Analytics
