MIT Sloan Management Review
Big Idea: Data & Analytics Interview
January 27, 2014
Philip Kim (GE Measurement & Control), interviewed by
Renee Boucher Ferguson
GE’s marketing division uses data to continuously improve
performance — and democratize analytics.
Philip Kim, (former) marketing operations leader for
Measurement & Control
General Electric (GE) is a massive conglomerate that encompasses a number of separate businesses: Power & Water, Oil & Gas, Energy Management, Aviation, Transportation, Healthcare, and Home & Business Solutions.
Oil & Gas is the company’s fastest-growing business, with revenues of $15 billion. It competes in high-growth markets and creates products that draw on GE’s broad technical capabilities, such as the recently launched first subsea compressor. Measurement & Control, a division of Oil
& Gas, covers a swath of industries and applications, according to its
website, including sensing, asset condition monitoring, controls and
instrumentation.
But Oil & Gas, along with the rest of GE, is also betting heavily on analytics. This summer, the company announced a first-of-its-kind cloud platform for collecting, storing and analyzing large-scale machine data, built to handle the massive data volumes of the emerging Industrial Internet.
GE is also applying that analytic rigor to innovate
internally and drive commercial change. Philip Kim, (former) marketing operations leader for Measurement & Control,1 talks with MIT Sloan Management Review contributing editor Renee Boucher
Ferguson about the process of innovation through analytics, driving commercial
change, and what others can do to get there.
How are you using analytics within GE Oil & Gas?
General Electric is a very large conglomerate. So when you
use the word “analytics,” [providing] context is probably the paramount thing
you can and must do, in order to make sure that people understand where you’re
coming from. When we talk about analytics within the context of our business —
Measurement & Control, which is a part of Oil & Gas — we segment
analytics into two large categories.
One is what I would call “big machine data” or “big data” — applications that try to identify, from a series of data points, whether there’s a technical issue with a customer asset: for example, detecting when rotating machinery might fail by combining a lot of sensor data with software and analytics.
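To make that concrete, here is a minimal sketch of the general idea, not GE’s actual monitoring system; the readings, window sizes and thresholds below are invented for illustration:

```python
import numpy as np
import pandas as pd

# Illustrative only: simulated hourly vibration readings (mm/s) from one
# rotating asset, with a slow fault developing near the end.
rng = np.random.default_rng(0)
vibration = pd.Series(rng.normal(2.0, 0.2, 500))
vibration.iloc[450:] += np.linspace(0, 1.5, 50)

# Rolling z-score: how far is each reading from its recent baseline?
window = 48
z = (vibration - vibration.rolling(window).mean()) / vibration.rolling(window).std()

# Flag sustained excursions (4 of the last 6 readings beyond 3 sigma)
# rather than single noisy spikes.
alerts = (z > 3).astype(int).rolling(6).sum() >= 4
print("first alert at reading:", alerts.idxmax() if alerts.any() else "none")
```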
The second category that comes up with analytics is: how do you take the data you have available and drive it toward commercial
objectives? Whether it be to grow sales, decrease costs, increase productivity
— all intended to help provide solutions that are not only optimal, but that
continuously improve our performance.
What my team works on mostly is the second category. So,
what we’ll do, often, is figure out what the business problem is that we can
help resolve. We’re kind of a jack-of-all-trades; we don’t turn away anyone
from any part of the business, even though we might not necessarily have a lot
of skills in that particular vertical area. But we will bring the analytical
tool sets and help drive an outcome that we think would advance some strategic objective.
What are you working on now?
In terms of the actual specifics, we’ve been working very hard on a lot of visualization techniques, as well as incorporating the data methodology that GE’s famous for — Lean Six Sigma — and using that to build and prototype actionable use cases, which can then help drive changes in behavior. So we spend a lot of our time not only getting what data we can, doing a lot of mash-ups, and exploring what orthogonal datasets might be useful to help answer the question, but ultimately driving some sort of business strategy improvement with that, and measuring it.
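As a hedged illustration of the “mash-up” step (the datasets, column names and numbers here are hypothetical, not GE’s): join an internal metric with an orthogonal external dataset on a shared key, then check whether the outside signal tracks your own numbers closely enough to be worth charting.

```python
import pandas as pd

# Hypothetical internal data: quarterly orders by country.
orders = pd.DataFrame({
    "country": ["BR", "NO", "US"],
    "orders_musd": [12.5, 30.1, 55.0],
})

# Hypothetical orthogonal dataset: rig counts from an external market source.
rigs = pd.DataFrame({
    "country": ["BR", "NO", "US"],
    "rig_count": [38, 17, 1750],
})

# Mash-up: merge on the shared key and test for a usable relationship.
merged = orders.merge(rigs, on="country")
print(merged)
print("correlation:", merged["orders_musd"].corr(merged["rig_count"]))
```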
How is it that data and analytics have enabled you to
innovate and drive business strategy?
I think analytics in the context of innovation is really bounded by what’s different. One of the things that we always strive for is: if we do an analysis, and that analysis basically just confirms what we already knew, we’ve failed the test for innovation. So one of the things that we always look for in a particular project is, why is an analytics approach superior? And typically, it’s because the scale or complexity of the problem escapes simple human intuition, or the data reveals something that is fairly counterintuitive.
So we work toward projects that humans just can’t process very well. To give you an example, we’ll use very sophisticated modeling and statistical multivariate analysis to identify the key leading indicators for particular growth segments around the world — by vertical, by product — just so we can understand the question: is it really the market, or is it us? That kind of information can then be used to determine our best strategy for resource allocation.
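A minimal sketch of that “market or us” normalization, assuming a standard multivariate regression (the indicators, weights and data are invented; statsmodels stands in for whatever tooling is actually used): regress your own growth on the market indicators, and read the residual as the part the market doesn’t explain, i.e., you.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical leading indicators per region (e.g., rig-count growth,
# capex index, regional GDP growth) and our own sales growth.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))                 # 40 regions, 3 indicators
our_growth = X @ np.array([0.6, 0.3, 0.1]) + rng.normal(0, 0.2, 40)

# Regress our growth on the market indicators: the fitted part is
# "the market"; the residual is what's left over, i.e. "us".
model = sm.OLS(our_growth, sm.add_constant(X)).fit()
print(model.params)      # weight of each leading indicator
print(model.resid[:5])   # region-specific over/under-performance
```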
If we hadn’t provided that kind of normalization function, it would be much harder to know whether or not you made the right decision. It also allows us to plan a lot of our workforce in advance, as well as other things that are very hard to prove out quantitatively. What we’re trying to do here is make the case that data helps and bolsters strategic thinking and innovation. But we’re very cautious about that, because that is a very hard thing to do.
Can you give me an example of an analytics project that
helps bolster strategic thinking? And why is it a hard thing to do?
We did a pretty large study of what kinds of segments and customer sites we could map around the world today. And could we identify, using that data, where our best opportunities are, in the hopes that our sales team will spend more of its time identifying and triaging the best opportunities in the time frame it has (rather than prospecting)? Our projects around that have been pretty successful.
One thing we do in our group, versus what we see in other
parts of General Electric or even outside of General Electric, is [incorporate]
very strong, results-based analytics. We’re very conscious about driving
change and about measuring the benefits. Some of the projects that we’ve run
through have really helped drive significant incremental changes in sales,
because we’re answering a very important question in a very simple way.
I think that’s probably something that’s lacking today in a
lot of analytics projects. You have these very complicated models, but they
don’t translate very well to the commercial operation. Therefore they get lost,
or they remain in the academic modeling world, the analytics world, and [are]
not driving commercial change. We’ve been very good at scoping out what makes a
difference. The one piece of feedback we get consistently is not to increase the sophistication of our models, but to make [them] simpler and easier to use. I
think that’s a lesson we’re still learning.
We do measure the ROI, so obviously we have to expend
effort, time and money on that. And we do want to make sure that the right
projects get the right prioritization. So we let people bid up our services,
depending on what kind of benefit they might be able to provide for us.
That’s a slightly different approach than I think you’ll find in a lot of other
businesses, even within General Electric. We’ll challenge business users with
the benefit up front. But the General Electric culture can accept that. That’s
something that we are all very comfortable with, as a principle: projects with more ROI, more benefit deserve to get to the front of the line. And
there’s not a lot of issue with that.
It sounds like different departments are competing for
your attention. How did that come about? And how does that process play
out?
Let me give you the quick background, because it’s a very
strange process that we’ve amalgamated. The first thing that we realized, very
early on, is that we were absolutely terrible at predicting when we would be
done with any one project. We were just bad at it. When you’re trying to do
analytics, it’s really hard to know when you’re going to be done. It’s really
hard to understand what the outcome might be in a certain time frame,
especially when the problem is more complicated or complex, or when the datasets are not readily available.
So what we decided to do was basically borrow and steal the
concept of agile methodology, which is, you write very simple stories and you
try to scope out the work in a three-week or four-week chunk of work. We
incorporated that, and as part of that you get something called sprint
planning. And what that means is that you decide what you’re going to work on,
you decide what your availability is. You decide what the risk and the other
challenges might be in that project.
But you work on it very collaboratively, and then you have
daily calls to make sure that everything’s working really well, or you’re
making progress. And what it does for us is it simplifies the traditional
waterfall requirements that might be driving a lot of IT deployments, and
instead focuses on the business problem: the stories that the end users and stakeholders are most interested in.
It’s taking a project and scoping it down in such a way that
we can do very fast prototyping and delivery. Without that, we could get into
a situation where we’d go after the biggest benefit for the project, and it’s a
huge project, and it’s very feasible, but it just takes time. Then you realize
6 or 12 months down the road that we’re not going to be successful and just
basically scrap the project. What this allows us to do is be very, very
tactical, with very fast prototyping. The adage of “fail fast, fail early” —
that’s something that we live by. But we incorporate that within the context of
analytics, which I think is kind of rare.
What I push my team to always do is to think about, “how do
you know you’re going to be successful, or how do you know you’re not going to
be successful,” very early in the process. I push for that very quickly, very
early. And that’s not typical for an analytics group, because they think they can
solve everything. We try very hard to bound and constrain it, and integrate
that with a deployment or delivery model. I think that’s probably our greatest strength to date.
How does GE’s culture enable analytics innovation?
One thing that we have going for us is that innovation and analytics are really very contingent on the culture of the organization as a
whole. And General Electric is, if nothing else, known for data-based decision
[making] within our culture, within our operating rhythm. We’re also known
obviously for some innovation and a lot of the R&D that gets done. And we’re known for grooming leaders who need to look at problems in new and challenging ways, because we’re so big that we have to grow using these kinds of large decisions. We can’t go after minimal changes and
expect to sustain the growth that we’re accustomed to.
What can other organizations do to better utilize
analytics for innovation?
What I find is, many organizations are underestimating the
importance of culture in adopting analytics as a strategy. It comes back down
to, you need to have some leaders who are fairly bold, who are willing to take on data and analytics as a kind of untrusted or unvalidated initiative, where you don’t have a lot to go on. You just take it as potentially valuable.
What’s happening is that the business environment is
changing much more rapidly than people were anticipating. And so they’re trying
to figure out: “What’s that signal? Why are we not able to predict as well as we could have or would have in the past?” That’s an interesting thing that I
don’t have an answer for, but it’s my hypothesis as to why people are increasingly
looking to data and analytics as a potential solution.
But that’s also leading to the mythology of big data,
because organizations are seeing it as, “oh, it’ll solve everything.” The
closest [analogy] I would offer is the power of Six Sigma and Lean Six Sigma.
Everyone knows that it’s potentially very valuable. It could drive huge
benefits to the bottom line. But if you were to survey the folks who believe
that it’s important and the folks who actually kind of pulled it off, I think
you’d see a fairly large drop-off or a die-off rate. That’s something to factor
in with analytics as well. The folks in the businesses who do make the
investment have a much better chance of surviving, versus the folks who say
that’s important but they’re not investing at the same rate or the same
proportion.
I also believe that where analytics lives is a factor. I’m
not in IT, and if analytics is in IT or in BI [Business Intelligence], I
personally believe that it’s got a more challenging road ahead than a group
that is kind of charged with looking around corners, which is what we do in
marketing. We’ve got a bold leader at Measurement & Control who believes in
this kind of stuff, and I’m a headquarters function. I’ve got personal
experience, at least anecdotally, that says that that’s a big driver for us.
Early on in the conversation, you mentioned putting analytics outcomes into commercial use. Can you talk about that process within
your group?
I call it a democratization of analytics. Even now, it’s a very rarefied skill. It’s a very uncommon activity that most people don’t know how to do, or don’t do very well.
A lot of analysts are terrible at explaining analytics needs
or insights in business terms, and I think “terrible” is being polite to them.
So, you have this missing link, where you have a need, but they can’t
articulate it. You have a bunch of folks who can do it, but they don’t know how
to explain it. Or the data quality is so poor that they don’t feel right telling you, because there’s not enough R² or p-value confidence to say, “I can do this.” Whereas a lot of folks would be willing to accept “good enough,”
right?
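A minimal sketch of that “good enough” tension, using an ordinary least-squares fit; the 0.5 R² and 0.05 p-value bars are arbitrary examples of a threshold a business might agree on up front, not GE’s:

```python
import numpy as np
from scipy import stats

# Illustrative fit: does an indicator explain an outcome well enough to
# act on? The data below is simulated with a deliberately noisy relationship.
rng = np.random.default_rng(2)
x = rng.normal(size=60)
y = 0.7 * x + rng.normal(0, 1.0, 60)

fit = stats.linregress(x, y)
r_squared = fit.rvalue ** 2
print(f"R^2 = {r_squared:.2f}, p = {fit.pvalue:.4f}")

# An explicit, agreed-upon bar for "good enough" defuses the standoff
# between statistical purists and commercial users.
if r_squared >= 0.5 and fit.pvalue < 0.05:
    print("good enough to act on")
else:
    print("flag the uncertainty, but share the direction anyway")
```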
So, these kinds of conflicts arise, I think, because we’ve not had nearly as much maturity or time in this field as we have had in, say, pure statistics or sales. You can have people with 20 or 30 years’ experience in each of those categories, but there’s not nearly as large a population in the middle, between the two.
What our team has really worked to do is bridge that gap, and we’ve tried to use these very iterative development cycles to come up with a picture — a technique and a process method — that works for both, and that measures the changes through the same analytics and visualization that we’ve constructed, to make sure that we are making a difference. We spend a lot of our time in that middle.
REFERENCES
1. Kim moved on from his position at GE after
this interview was conducted.
ABOUT THE AUTHOR
Renee Boucher Ferguson is a contributing editor for the Data
& Analytics Big Ideas Initiative at MIT Sloan Management Review.