Predictum Blog

Oct 30, 2016

How to empower scientists and engineers to make discoveries in data with JMP

Wayne J. Levin, President, Predictum Inc. www.predictum.com

Business and production systems have become much more capable of collecting data. Equipment collects a variety of sensor and parametric data, and today all kinds of information on buying habits and consumer preferences are available. This level of detail cannot be analyzed and comprehended with static, conventional reporting. Instead, business analysts, engineers and scientists can unlock insights with the leverage provided by interactive, visual analytical software such as JMP.

Characteristics of New Analytics

New analytical software has opened up a new world of analytics, characterized by these important traits:

  • Data are “self-provisioned.” Users are able to get the data they need without assistance and without delay.
  • The analytics are visual and interactive. As a result …
  • Users can now conduct advanced analytics without a PhD in statistics.
  • Analysts conduct their work “in-the-moment.” Insights often surface questions that analysts explore immediately, creating an active dynamic that spawns further discovery.
  • Analytical thinking is completely coupled to the business thinking.
  • More than descriptive, analytics are inferential.

Example of Analytics in Insurance

Before split: JMP partition analysis shows the overall conversion rate is about 12.5% across the demographics.

Consider this insurance example. Here, demographic information from many thousands of current and potential clients was collected and maintained in a database. The insurance company was able to download the data into a spreadsheet and summarize it, but did they get the best exploitable insights? Answering even the simplest questions took days of acquiring, splicing and arranging the data. Today, with integrated, interactive and visual analytics, insights are revealed in seconds.

The big question when it comes to prospective clients is how many of them convert to new business, and what factors drive the conversion? By knowing this, focus can be brought to the business practices that lead to higher rates of success.

We started by loading the data. With only a few clicks, tens of thousands of prospective client encounters, including demographic information such as income, education, age, marital status, etc., were loaded. You can see from the image above that overall about 12.5% (the blue area) of these prospects were converted into paying customers.

One Click and Ah-ha!

Now to the question at hand, what factors determine success in winning new business? One more click (on the Split button in the lower-left) and an “ah-ha” moment ensued.

After 1 split, ah-ha! There's a demographic that is an easy win.

After 1 click in JMP’s partition platform, there’s a demographic that’s an easy win. This insight was hidden in the data prior to this analysis.

The chart above shows that a particular factor (which, due to confidentiality, I can’t disclose, so we’ll call it “factor Xn”) leads to an incredibly high conversion rate (about 90%, as seen in the blue bar on the right) for a good number of prospects, and that the remaining prospects had little chance of converting. The analysts were stunned at seeing this. This insight had eluded them because the overall conversion rate was masking a major distinction, identified by factor Xn, among the prospects.

Keep in mind that these analysts spend day-in and day-out poring over data, but this important insight, and others that were to follow, remained locked within.
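The masking effect described above can be sketched outside of JMP with a one-split decision tree (the core of recursive partitioning). This is not the insurer's data or JMP's implementation; the synthetic numbers and the `factor_x` name are invented for illustration, with scikit-learn standing in for JMP's partition platform.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical segmenting factor (the article's confidential "factor Xn"):
# roughly 12% of prospects have it, and they convert at ~90%; the rest at ~2%.
factor_x = rng.random(n) < 0.12
converted = np.where(factor_x,
                     rng.random(n) < 0.90,
                     rng.random(n) < 0.02)

# The blended rate lands near 12.5% and hides the subgroup entirely.
print(f"Overall conversion: {converted.mean():.1%}")

# One split (max_depth=1) plays the role of the "one click" on Split.
tree = DecisionTreeClassifier(max_depth=1)
tree.fit(factor_x.reshape(-1, 1), converted)

probs = tree.predict_proba([[0], [1]])[:, 1]
print(f"Conversion without factor Xn: {probs[0]:.1%}")
print(f"Conversion with factor Xn:    {probs[1]:.1%}")
```

The single split separates a ~90% segment from a ~2% segment, which is exactly the kind of distinction an overall average conceals.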

Analytical Insights Spawn Questions

This insight spawned a bunch of questions. First, it appears changes to sales representative instructions are in order. Second, why is it that the conversion rate for other customers is so incredibly low? This leads to questions about pricing, packaging and the like, in combination with demographics, that are to be investigated with designed experiments.

Looking back at the six traits above, we can see that:

  • IT established systems that allowed users to get the data themselves: “self-provisioned data”.
  • Indeed the analytics were highly visual. Yes, all the statistical information is provided but it is made accessible through graphics and interactivity.
  • No PhD in statistics was necessary. The analysis above involves recursive partitioning with cross-validation. A mouthful to be sure, but that complexity (and statistical jargon) does not keep a business analyst or engineer from gaining the greatest number and quality of exploitable insights. They can focus on their subject matter unfettered. In fact, my experience is that the tool almost becomes invisible as the focus is on the subject matter.
  • Unlike the old days, when I started in this game, there is no need to submit a request that instructs programmers in IT to amend a report that will arrive several days later. The elapsed time between question and answer is gone, and so is the dependency.
  • The old division of labor between analytics and business is gone. They must be welded together to be effective and efficient at finding exploitable business, engineering and scientific insights.
  • Notice that the analysis is not simply descriptive, as it was in the old days. It is inferential because it leads analysts to predict future outcomes and ask further questions.
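The “recursive partitioning with cross-validation” mentioned in the list above can be sketched as follows. The data, column layout and depth values here are all invented for illustration, and scikit-learn again stands in for JMP; the point is only that cross-validation scores each candidate tree on held-out data, so the analyst is protected from over-splitting without needing the statistical jargon.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 5_000

# Synthetic demographics: column 0 is genuinely predictive, the rest are noise.
X = rng.random((n, 4))
y = np.where(X[:, 0] < 0.12,
             rng.random(n) < 0.90,
             rng.random(n) < 0.02)

# Score each tree depth on 5 held-out folds; deeper trees that merely
# memorize noise will not improve the cross-validated accuracy.
for depth in (1, 2, 4, 8):
    scores = cross_val_score(DecisionTreeClassifier(max_depth=depth), X, y, cv=5)
    print(f"max_depth={depth}: mean CV accuracy {scores.mean():.3f}")
```

Because only one column carries signal, the shallow tree already scores well, and extra depth adds little, which is the decision cross-validation automates.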

Not only were the analysts impressed with the insight – they were also excited about how readily it was derived.

IT & the New Analytics

What does it take to bring the new world of analytics into your organization and support a culture of analytics?

This is where IT comes in – obviously, they have a major role to play. IT no longer needs to worry about conducting analytics. It’s best left to the analysts. Instead, IT are now enablers of analytics. They can do this by:

  • Maintaining the hardware and software infrastructure that supports operational and analytical needs.
  • Making data available in an analytically-friendly way so that data may be self-provisioned. We do lots of work in this area to ensure that analytical data demands do not affect operations. For example, in pharmaceutical, semiconductor, solar and other industries, unimpeded real-time data must be collected for traceability. Analytical demand on IT infrastructure cannot affect operational systems.
  • Supporting the likes of our company, Predictum, in developing integrated analytical applications that further facilitate analysis, store and transfer knowledge and insights, and gain other efficiencies and cost savings in operations, research and compliance.
  • Securing all systems.

Securing systems is a rapidly growing and increasingly demanding responsibility for IT, so much so that we find they are usually very happy to be relieved of the burden of conducting analytics, or of involving themselves in analytics that analysts can better support themselves. Their enabling role is much more consistent with their other activities and responsibilities. For example, IT supports order/shipping/billing systems, but they do not order, ship or bill themselves, so why should they conduct business, science or engineering analytics?

Expanding Analytical Opportunities with the Internet of Things

With the Internet of Things, new, more capable equipment and the internet’s expanding reach, we can expect an exponential increase in the amount and quality of data well into the future. It’s best to prepare for these opportunities by building a culture of analytics now. That involves designing the right data architecture, providing JMP and enabling business analysts, scientists and engineers to advance their subject matter expertise with analytics.

