Analysts often presume that the users of the findings from their analytics-based studies will immediately understand and appreciate their impressive work. They may be wrong. The analysis may be too complex, or it may not answer the question the user is actually asking. How can analysts focus more on people than on the technology they have mastered?
Have you ever noticed how we often get things backward? At a restaurant, some diners order their entrée before their appetizer. During a job interview, some employers are biased by their first impression of a candidate, such as by the candidate's attire, before learning the individual's skills and competencies.
Please allow me to provide some background before I make my case as to why some less experienced "data scientists" get things backward.
What we learn from the military
I recently had a stimulating meeting in Washington DC with Larry Carr, a retired US Air Force jet pilot with a systems engineering background. Larry specializes in the design of systems that interface humans with technology for NTVI in Falls Church, Virginia.
An observation Larry shared with me was provocative. He said that design engineers often put a technology's capabilities first and only subsequently consider how a human will use it. Carr's point was that the "human-machine" interface is sub-optimized when you do not think about people before the machine. He cited examples of jets designed beyond a pilot's capability to fly them. Hence my title: equip the man, not man the equipment.
Analytics and the human factor -- people
I confess. I am analytical. When I was a kid I was into numbers. In elementary and high school I loved math. My university degree was in operations research. Those of you who know of me are aware of my pride and joy: being honored in the Baseball Hall of Fame in Cooperstown, New York, for programming the oldest computer baseball game. But now at age 64, recently retired from SAS, the large analytics and business intelligence software vendor, I have some reflections and hindsight about analysts and how some of them relate to people.
All analysts are not geeks
Larry Carr's observation was salient. If an analyst's hypothesis test is too obscure or too complex for those who are meant to learn from its outcome, there will be insufficient buy-in to, and acceptance of, its findings.
For example, a business analyst in a bank may have developed an elegant equation to measure the "customer lifetime value" (CLV), the future potential profit, of individual customers using a discounted cash flow (DCF) calculation with dozens of variables. Examples of variables are the forecast customer demand volume, product unit costs, and how long the customer is expected to remain with the bank. The analyst's purpose may be to provide the marketing team, which is responsible for generating higher profit lift from customers, with a list of customers rank-ordered from highest to lowest CLV.
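A real bank model would have dozens of variables, but the mechanics can be sketched in a few lines. The numbers, discount rate, and two-variable profile below are hypothetical, purely to show how a DCF-based CLV produces a rank-ordered customer list:

```python
# Simplified, hypothetical CLV sketch: discount each customer's forecast
# annual profit over the years the customer is expected to stay with the bank.

def clv(annual_profit, retention_years, discount_rate=0.10):
    """DCF customer lifetime value: sum of profit / (1 + r)**t for t = 1..n."""
    return sum(annual_profit / (1 + discount_rate) ** t
               for t in range(1, retention_years + 1))

# (forecast annual profit, expected retention in years) -- made-up figures
customers = {
    "A": (1200, 5),
    "B": (800, 10),
    "C": (2000, 2),
}

# Rank customers from highest to lowest CLV, as the analyst's report would.
ranked = sorted(customers, key=lambda name: clv(*customers[name]), reverse=True)
print(ranked)  # → ['B', 'A', 'C']
```

Note that customer C, with the highest annual profit, ranks last: a short expected retention outweighs a large yearly figure once the cash flows are discounted and summed.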
What is the analyst's implied point? It is for marketing to focus on and target the highest-CLV customers. Wrong! It should be to determine which incremental offer, deal, or discount will generate the relatively highest incremental profit from any customer. That may come from customers ranked low in baseline CLV, because the offer or deal may result in a greater profit lift than it would from those ranked high.
The bank's goal is to maximize the enterprise's profit and not to sub-optimize by focusing on the illusion that the already most profitable customers are the path to maximizing future profits.
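A toy comparison, with made-up baseline and lift figures, shows how the two targeting rules diverge. Targeting by baseline CLV picks one customer; targeting by the predicted incremental profit from a specific offer picks another:

```python
# Hypothetical customers: baseline CLV vs. the incremental profit ("lift")
# a specific offer is predicted to generate from each of them.
customers = [
    {"name": "A", "baseline_clv": 9000, "offer_lift": 150},
    {"name": "B", "baseline_clv": 2500, "offer_lift": 900},
    {"name": "C", "baseline_clv": 6000, "offer_lift": 300},
]

# Rule 1: target the already-most-profitable customer (the analyst's implied point).
by_clv = max(customers, key=lambda c: c["baseline_clv"])["name"]

# Rule 2: target whoever the offer moves the most (the enterprise-profit view).
by_lift = max(customers, key=lambda c: c["offer_lift"])["name"]

print(by_clv, by_lift)
```

With these numbers the CLV rule targets customer A, while the lift rule targets customer B, the customer ranked last on baseline CLV. The bank earns more by spending its offer budget where the incremental response is largest.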
Will analysts get intoxicated with technology?
If you have not learned a little about in-memory processing technology, I suggest that you do. It is a game changer. An analyst's scenario test on large amounts of "Big Data" (e.g., a million products sold to 500,000 customers through five distribution channels) used to require hours of computer time. With in-memory technology, it now takes minutes, even seconds. What does this mean? It means that analysts no longer have to carefully "frame the problem" to minimize the number of their time-consuming tests, their what-if scenarios. They can now fail quickly and often and not feel embarrassed. Trial and error can become an accepted, even superior, way to gain insights.
But will some analysts continue to get things backward? Will they keep attempting to solve the wrong problem? For example, for CFOs, will the analyst fiddle with economic value added (EVA) equations and DuPont decomposition models of the elements of income statement and balance sheet general ledger accounts? Or will analysts go to the primary source of shareholder wealth creation and view each customer as an investment in a portfolio, maximizing the profit lift from each customer?
Analysts need to equip the man before manning the equipment. Consider who must be convinced to use the results from applying analytics. Get their buy-in. Then take them deeper to solve even more complex problems or pursue unimaginable opportunities.