Wednesday, June 28, 2017

A thought on research methodology, anomalies and outliers


I am presenting on continual innovation at the ASQ Innovation Conference this Fall. I have been gathering some interesting reading materials on innovation. Clayton Christensen has written excellent books on innovation and is a true Guru in the field. Flipping through the first chapter of his book, "The Innovator's Solution", I found this gem, pictured above.

If you are like me, time and again you re-read the fundamentals and still go, "ah, how pithy!" Some of you also probably wonder why such fundamentals need to be repeated over and over. It never hurts to go back to the basics, even if we have read them many, many times. Because certain concepts are fundamental, we tend not to think deeply about them. So, it doesn't hurt to sit back, relax and think about these concepts and what they mean.

By rethinking key concepts, we can strengthen our understanding and our methodology. We can also examine how well we actually adhere to those fundamentals, and maybe adjust the sails a little bit from time to time. So, join me...

Research Methodology

Just recently I interacted with someone who had collected some device data and was really worried about an outlier skewing the results. They were so concerned, even as they were collecting the data (!), that they created an additional data set (quite expensive, just so you know). When I helped with the analysis, I did agree to throw out the outlier. The analysis went fine, the data fit the assumptions to an acceptable extent, and they moved on.

However, that interaction, and reading Christensen, made me think about this more deeply last night.

Why do people fear, and consequently loathe, outliers? I remember, when I was first taught about linear regression and the box-and-whisker plot, being told that it is common to find outliers, explain them, and then either eliminate them from the analysis or represent them in a certain way. So, there was a certain effort to get us to explain what we should do with values outside the expected range, but we were never really told that we might be biasing ourselves. While the way different people are taught about data handling differs, I think we have all been exposed to this idea at some point.
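To make that concrete, here is a minimal sketch (in Python, with made-up numbers) of the 1.5 x IQR rule that box-and-whisker plots typically use to flag points outside the expected range. The point is to flag and investigate such values, not to silently delete them.

    # A minimal sketch of the 1.5*IQR rule behind box-and-whisker plot outliers.
    # The measurement values here are hypothetical.
    import numpy as np

    measurements = np.array([4.9, 5.1, 5.0, 5.2, 4.8, 5.1, 7.9])

    q1, q3 = np.percentile(measurements, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

    # Flag, rather than drop, anything outside the whiskers
    flagged = measurements[(measurements < lower) | (measurements > upper)]
    print("Flagged for investigation:", flagged)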

Later, just like me, you must have come across the caveat: don't just give an oversimplified explanation and run away from the data. It might have been an instructor, a lab manager or some speaker, urging you to look much, much closer at the outliers. This is true. A theory, many times, is best shaped by the phenomena it cannot explain, because it then grows from a crude form into a more refined form, which makes the theory stronger. An oversimplified example is Newton's Third Law. Yes, every action has an equal and opposite reaction, but in practice the reaction you observe is divided between signal and noise - your frictional losses, and so on. Without that qualifier, the theory would be hard to disseminate and use.

Evolutionary Principles?

So, yes, we have all seen both sides of the coin, but why do people want to pretend the outliers never happened? Why do others want to pass quick judgment and run away from the issue? It appears humans evolved to form communities and categories, and to fit everyone tightly into these strata. This is why we have people who try very hard to force everyone to belong, and who once even burned people at the stake for being different!

I recently spoke to a recruiter who was shocked that I have a résumé with color and graphics in it. He wanted me to change it to the boring version (!) and sensing my reluctance to do so, he appears to have become uncommunicative! Well, in my defense, in other industries, they want graphics, numbers, data and so on, and I put in the energy to innovate my résumé.

Yes, it looks different. But, if you can't even handle a document that looks different, how could you handle disruptive products, and at that point, why claim you want to hire people to innovate? It appears you want to hire people to do more of the same - throw a stent, a balloon or RF signals at the problem and then wonder loudly why Google and Apple are about to take over the industry!!

Unfortunately, though, it appears that this is human nature: shun anything different. And that instinct seems to have crept into people's research activities. That is not good!

Root Cause Analysis

It is important to know the whys and whats of unexpected results. At times, yes, the explanation is simple - a measurement error, a bad sample, and so on - but there are instances where the explanation is not simple, and you need to dig deeper. Even when you know a measurement error was made or the sample was "bad", you need to know why. That is how we got penicillin: Fleming was dogged about an anomalous, contaminated culture plate. And that is just one example.

Plus, in chasing down the problem, you could find new and exciting opportunities. This is what Christensen and Raynor are getting at. Innovation has many origins, and looking closely at a problem is one of them. So, what can you and I do about all this?

What do we do?

In our own research, we need to go after the outliers and anomalies. Just as there is a push to include negative results in publications, we must encourage publications that clearly lay out their outliers and try to provide a deeper explanation for those data points.
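One simple way to do that in a report or paper - sketched below in Python, with made-up numbers and a hypothetical flagged point - is to present the analysis both with and without the suspect value, so readers can see exactly how much it moves the result.

    # A minimal sketch of reporting a fit with and without a flagged outlier,
    # instead of silently removing it. The data and the flag are hypothetical.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([2.1, 4.0, 6.2, 7.9, 10.1, 20.0])   # the last point is the suspect one
    flagged = np.array([False, False, False, False, False, True])

    slope_all, intercept_all = np.polyfit(x, y, 1)
    slope_trim, intercept_trim = np.polyfit(x[~flagged], y[~flagged], 1)

    # Report both fits, and explain the flagged point, rather than hiding it
    print(f"With the outlier:    slope = {slope_all:.2f}")
    print(f"Without the outlier: slope = {slope_trim:.2f}")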

When we mentor new and young researchers, whether in an instructional setting or in a professional setting, we should discourage the masking or elimination of data that doesn't fit a model. Removing bias from research is not easy, but it is a fundamental necessity. And it can lead to your next business idea. It can help you develop a better device or drug. It can save lives!

And it has a funny effect. It helps you root out bias in other areas of your life - how you interview candidates, judge people, promote and mentor professionals and much more!

Did you enjoy this post? Please subscribe for more updates, using the sidebar. Have ideas or blog posts you'd like to see here? Contact me at yamanoor at gmail dot com.

Reference:

1. Christensen, Clayton M., and Michael E. Raynor. The Innovator's Solution: Creating and Sustaining Successful Growth.

