
Biases and Medical Market Investing: Part One


Risk management is older than Homo sapiens. Our early ancestors did not have to ponder how best to invest in international emerging markets or analyze medical device markets; they had to ponder how to hunt game without becoming game themselves.

Mankind, in its hubris and for motivational reasons, defines itself by its reliance on reason. Yet the human brain, for all its wonders, is “a flawed belief machine”. Errors in judgment are the product of the interplay between necessary shortcomings in our information-processing strategies and motivational factors.

Medical technology investments, when allocated with proper due diligence and analysis by market makers, can significantly improve the odds of success for early-stage companies raising new capital. Data has traditionally been treated historically, but just as the internet has moved technologies forward at light speed, detailed analysis now requires 21st-century processing, not 20th-century processing. Computers process information at a phenomenal and ever-increasing rate. Evolution, by contrast, takes millions of years to change the brain, and the brain cannot be shut down for renovation. In short, we rely on heuristics: quick and dirty ways of dealing with information. We use historical industry processes and systems. If we accept these human shortcomings, we must accept that we are prone to error.

Overwhelmed with stimuli, we are forced to make snap judgments, cognitive shortcuts that often work and often do not; this tunnel vision shapes traditional, learned forms of investment decision-making. In addition, unlike a computer, we have motivational biases. We function like a government-run press: all new information is filtered, interpreted, and censored to confirm our prejudices, preconceived notions, and dogmas, and to make us look good to ourselves. Combined with our processing limits, this explains much about human behavior. All of it is equally true in risk management.

The quick summary of cognitive heuristics and biases below will make their relevance to medical technology assessment clear.

1. Availability heuristic – We make judgments based on what comes to mind easily. For example, more people are afraid to fly than to drive, and think flying is more dangerous than driving. Driving is actually more dangerous, but one commercial plane crash receives heavy worldwide press coverage, while a thousand fatal car crashes each rate at most one sentence in a local newspaper.

2. Illusory correlation – We see chance occurrences that happen together as related, and then notice whenever they occur together again. An evolved cognitive fallacy is to attribute meaning and significance to coincidences. In fact, even in 2013, some otherwise intelligent and educated people do not believe in “coincidences”. Quantitative analysis is the only defense against this, ironically grounded in accepting randomness (see the sketch after this list).

3. Confirmation bias – We notice and recall events that support our beliefs more than events that contradict them.

4. Magical thinking – Even in modern “western” society, many people believe wishing makes things happen. It is even the idea behind a recent bestseller, “The Secret”. Evaluating cutting-edge life science technologies does not make one immune to this.

5. Planning fallacy – We underestimate how long a task will take and how hard it will be. Here again, economic quantitative analysis can help keep projects on schedule and within budget. Whether one is a researcher creating new healthcare solutions or a sixth grader doing homework, the planning fallacy rules!

6. Omission bias – We notice an action more than a failure to act. We judge an action that produces harmful consequences more harshly than harmful consequences produced by failing to act.

7. Illusion of control – We believe we are in control of events that actually occur by chance. Dice players have been found to believe they can control dice outcomes by talking to the dice! This is why lotteries allow people to pick their own numbers rather than assigning them.
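To make the point about randomness in item 2 concrete, here is a minimal, purely illustrative Python sketch (not from the original article; the event labels, probabilities, and observation window are hypothetical). It simulates two completely unrelated event streams and shows how often they co-occur by chance alone.

    # Illustrative sketch (hypothetical example): two independent event
    # streams will still "co-occur" by chance, which is exactly what
    # illusory correlation latches onto.
    import random

    random.seed(42)

    N_DAYS = 1000        # hypothetical observation window
    P_EVENT_A = 0.10     # e.g., "favorable analyst mention" on a given day
    P_EVENT_B = 0.10     # e.g., "device stock closes up" on a given day

    # Two independent (unrelated) streams of daily events.
    a = [random.random() < P_EVENT_A for _ in range(N_DAYS)]
    b = [random.random() < P_EVENT_B for _ in range(N_DAYS)]

    observed = sum(1 for x, y in zip(a, b) if x and y)
    expected = N_DAYS * P_EVENT_A * P_EVENT_B

    print(f"Days both events occurred: {observed}")
    print(f"Expected from chance alone: {expected:.1f}")

Seeing roughly ten such co-occurrences over a thousand days means nothing by itself; a simple expected-value check like this is the quantitative discipline that keeps a coincidence from becoming a conviction.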

By Alexander Nussbaum, PhD
Adjunct Full Professor
St. John’s University

Edited by Ken Peters PhD for Analytic MedTek Consultants, LLC
