
Practical Steps to Building an Analytics Culture

Building an analytics culture in the enterprise is incredibly important. It’s far more important than any single capability, technology or technique. But building culture isn’t easy. You can’t buy it. You can’t proclaim it. You can’t implement it.

There is, of course, a vast literature on building culture in the enterprise. But if the clumsy, heavy-handed, thoroughly useless attempts to “build culture” that I’ve witnessed over the course of my working life are any evidence, that body of literature is nearly useless.

Here’s one thing I know for sure: you don’t build culture by talk. I don’t care whether it’s getting teenagers to practice safe sex or getting managers to use analytics, preaching virtue doesn’t work, has never worked and will never work. Telling people to be data-driven, proclaiming your commitment to analytics, touting your analytics capabilities: none of this builds analytics culture.

If there’s one thing that every young employee has learned in this era, it’s that fancy talk is cheap and meaningless. People are incredibly sophisticated about language these days. We can sit in front of the TV and recognize in a second whether we’re seeing a commercial or a program. Most of us can tell the difference between a TV show and movie almost at a glance. We can tune out advertising on a Website as effortlessly as we put on our pants. A bunch of glib words aren’t going to fool anyone. You want to know what the reaction is to your carefully crafted, strategic consultancy driven mission statement or that five year “vision” you spent millions on and just rolled out with a cool video at your Sales Conference? Complete indifference.

That’s if you’re lucky…if you didn’t do it really well, you got the eye-roll.

But it isn’t just that people are incredibly sensitive – probably too sensitive – to BS. It’s that even true, sincere, beautifully reasoned words will not build culture. Reading moral philosophy does not create moral students. Not because the words aren’t right or true, but because behaviors are, for the most part, not driven by those types of reasons.

That’s the whole thing about culture.

Culture is lived, not read or spoken. To create it, you have to ingrain it in people’s thinking. If you want a data-driven organization, you have to create good analytic habits. You have to make the organization (and you too) work right.

How do you do that?

You do it by creating certain kinds of process and behaviors that embed analytic thinking. Do enough of that, and you’ll have an analytic culture. I guarantee it. The whole thrust of this recent series of posts is that by changing the way you integrate analytics, voice-of-customer, journey-mapping and experimentation into the enterprise, you can drive better digital decision making. That’s building culture. It’s my big answer to the question of how you build analytics culture.

But I have some small answers as well. Here, in no particular order, are practical ways you can create good analytics habits in the enterprise.

Analytic Reporting

What it is: Changing your enterprise reporting strategy by moving from reports to tools. Analytic models and forecasting allow you to build tools that integrate historical reporting with forecasting and what-if capabilities. Static reporting is replaced by a set of interactive tools that allow users to see how different business strategies actually play out.

Why it builds analytics culture: With analytic reporting, you democratize knowledge, not data. It makes all the difference in the world. The analytic models capture your best insight into how a key business works and what levers drive performance. Building this into tools not only operationalizes the knowledge, it creates positive feedback loops to analytics. When the forecast isn’t right, everyone knows it and the business is incented to improve its understanding and predictive capabilities. This makes for better culture in analytics consumers and analytics producers.
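To make the reports-to-tools idea concrete, here’s a toy what-if sketch (the revenue model, lever names and elasticities are all invented for illustration, not a real model):

```python
# A static report would just show last quarter's number. A what-if tool
# lets users move the levers and see how a strategy plays out.
def forecast_revenue(baseline, spend_change_pct, price_change_pct,
                     spend_elasticity=0.3, price_elasticity=-1.2):
    """Hypothetical response model: percent changes scaled by elasticity."""
    return (baseline
            * (1 + spend_elasticity * spend_change_pct / 100)
            * (1 + price_elasticity * price_change_pct / 100))

# What if we raise ad spend 10% and cut price 5%?
print(round(forecast_revenue(1_000_000, 10, -5)))  # 1091800
```

The model itself is trivial here; the point is the interaction. A decision-maker who can turn the dials owns the forecast in a way no static report ever produces.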

 

Cadence of Communications

What it is: Setting up regular briefings between analytics and your senior team and decision-makers. This can include review of dashboards but should primarily focus on answers to previous business questions and discussion of new problems.

Why it builds analytics culture: This is actually one of the most important things you can do. It exposes decision-makers to analytics. It makes it easy for decision-makers to ask for new research and exposes them to the relevant techniques. Perhaps even more important, it lets decision-makers drive the analytics agenda, exposes analysts to real business problems, and forces analysts to develop better communication skills.

 

C-Suite Advisor

What it is: Create an Analytics Minister-without-portfolio whose sole job is to advise senior decision-makers on how to use, understand and evaluate the analytics, the data and the decisions they get.

Why it builds analytics culture: Most senior executives are fairly ignorant of the pitfalls in data interpretation and the ins-and-outs of KPIs and experimentation. You can’t send them back to get a modern MBA, but you can give them a trusted advisor with no axe to grind. This not only raises their analytics intelligence, it forces everyone feeding them information to up their game as well. This tactic is also critical because of the next strategy…

 

Walking the Walk

What it is: Senior Leaders can talk till they are blue in the face about data-driven decision-making. Nobody will care. But let a Senior Leader even once use data or demand data around a decision they are making and the whole organization will take notice.

Why it builds analytics culture: Senior leaders CAN and DO have a profound impact on culture but they do so by their behavior not their words. When the leaders at the top use and demand data for decisions, so will everyone else.

 

Tagging Standards

What it is: A clearly defined set of data collection specifications that ensure that every piece of content on every platform is appropriately tagged to collect a rich set of customer, content, and behavioral data.

Why it builds analytics culture: This ends the debate over whether tags and measurement are optional. They aren’t. This also, interestingly, makes measurement easier. Sometimes, people just need to be told what to do. This is like choosing which side of the road to drive on – it’s far more important that you have a standard than which side of the road you pick. Standards are necessary when an organization needs direction and coordination. Tagging is a perfect example.
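One nice property of a written standard is that it can be enforced mechanically. A toy sketch (the field names here are invented, not a real spec):

```python
# Hypothetical tagging standard: every page must emit these fields.
REQUIRED_TAGS = {"page_id", "content_type", "audience", "campaign_id"}

def validate_tags(tags):
    """Return the required fields a page's tag payload is missing."""
    return REQUIRED_TAGS - set(tags)

# A page tagged without audience or campaign info fails the standard:
missing = validate_tags({"page_id": "home", "content_type": "landing"})
print(sorted(missing))  # ['audience', 'campaign_id']
```

Run a check like this in the release pipeline and the “is measurement optional?” debate simply never comes up.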

 

CMS and Campaign Meta-Data

What it is: The definition of and governance around the creation of campaign and content meta-data. Every piece of content and every campaign element should have detailed, rich meta-data around the audience, tone, approach, contents, and every other element that can be tuned and analyzed.

Why it builds analytics culture: Not only is meta-data the key to digital analytics – providing the meaning that makes content consumption understandable – but rich meta-data definition guides useful thought. These are the categories people will think about when they analyze content and campaign performance. That’s as it should be, and by providing these pre-built, populated categorizations, you’ll greatly facilitate good analytics thinking.
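As a sketch of what “definition and governance” might look like in practice (every name and category below is made up for illustration):

```python
from dataclasses import dataclass, field

# Hypothetical content meta-data record: these are the categories analysts
# will later slice performance by (audience, tone, approach, topics).
@dataclass
class ContentMetadata:
    content_id: str
    audience: str   # e.g. "new-visitor", "returning-customer"
    tone: str       # e.g. "informational", "promotional"
    approach: str   # e.g. "case-study", "how-to"
    topics: list = field(default_factory=list)

    def is_governed(self):
        """Governance check: no content ships with blank categories."""
        return all([self.content_id, self.audience, self.tone, self.approach])

item = ContentMetadata("blog-042", "practitioner", "informational", "how-to",
                       ["analytics culture"])
print(item.is_governed())  # True
```

The schema does double duty: it populates reporting dimensions, and it tells content creators which levers the organization believes matter.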

 

Rapid VoC

What it is: The technical and organizational capability to rapidly create, deploy and analyze surveys and other voice-of-customer research instruments.

Why it builds analytics culture: This is the best capability I know for training senior decision-makers to use research. It’s so cheap, so easy, so flexible and so understandable that decision-makers will quickly get spoiled. They’ll use it over and over and over. Well – that’s the point. Nothing builds analytics muscle like use and getting this type of capability deeply embedded in the way your senior team thinks and works will truly change the decision-making culture of the enterprise.

 

SPEED and Formal Continuous Improvement Cycles

What it is: The use of a formal methodology for digital improvement. SPEED provides a way to identify the best opportunities for digital improvement, the ways to tackle those opportunities, and the ability to measure the impact of any changes. It’s the equivalent of Six Sigma for digital.

Why it builds analytics culture: Formal methods make it vastly easier for everyone in the organization to understand how to get better. Methods also help define a set of processes that organizations can build their organization around. This makes it easier to grow and scale. For large enterprises, in particular, it’s no surprise that formal methodologies like Six Sigma have been so successful. They make key cultural precepts manifest and attach processes to them so that the organizational inertia is guided in positive directions.

 

Does this seem like an absurdly long list? In truth I’m only about half-way through. But this post is getting LONG. So I’m going to save the rest of my list for next week. Till then, here are some final thoughts on creating an analytics culture.

The secret to building culture is this: everything you do builds culture. Some things build the wrong kind of culture. Some things the right kind. But you are never not building culture. So if you want to build the right culture to be good at digital and decision-making, there’s no magic elixir, no secret sauce. There is only the discipline of doing things right. Over and over.

That being said, not every action is equal. Some foods are empty of nutrition but empty, too, of harm. Others positively destroy your teeth or your waistline. Still others provide the right kind of fuel. The things I’ve described above are not just a random list of things done right, they are the small to medium things that, done right, have the biggest impacts I’ve seen on building a great digital and analytics culture. They are also targeted to places and decisions which, done poorly, will deeply damage your culture.

I’ll detail some more super-foods for analytics culture in my next post!

 

[Get your copy of Measuring the Digital World – the definitive guide to the discipline of digital analytics – to learn more].

Building and Measuring Analytics Culture

Culture – how to measure it and how to build it – has been much on my mind lately.

At least when it comes to the measurement part – something we don’t normally have to do – the reason is…different.

My Counseling Family team is going to be doing another fun project – participating in the 538 Oscar Modelling challenge – and our approach is to try and model each nominated movie’s fit to the current Hollywood zeitgeist. The theory behind the approach is simple. It seems fairly reasonable to suggest that while qualitative differences between nominated movies and non-nominated movies might be fairly large, when it comes to selecting between a small set of relatively high-quality choices the decision is fairly arbitrary. In such situations, political and personal concerns will play a huge role, but so, presumably, will simple preference. Our thought is that preference is likely more a function of worldview than artistry – in much the same manner that people watching a political debate almost always believe that the person who most nearly echoes their opinion-set won. But how do you measure the cultural fit of a movie to a community? It’s no easy task. Our challenges include deciding how to capture a cultural zeitgeist in general, how to focus that capture on Hollywood, how to capture the spirit and themes of each movie, and how to score a match. And, of course, there is the challenge that the Hollywood zeitgeist might be more hot air than great wind and altogether too thin to be captured!
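For what it’s worth, the crudest possible version of the match-scoring step – and I stress this is an illustrative sketch, not our actual method – would be bag-of-words cosine similarity between a movie’s theme text and a zeitgeist corpus:

```python
import math
from collections import Counter

def cosine_fit(movie_text, zeitgeist_text):
    """Score thematic fit as cosine similarity of word-count vectors."""
    a = Counter(movie_text.lower().split())
    b = Counter(zeitgeist_text.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Invented corpus: a movie sharing the corpus's themes scores higher.
zeitgeist = "diversity inclusion streaming auteur revival"
print(cosine_fit("streaming auteur drama", zeitgeist)
      > cosine_fit("monster truck sequel", zeitgeist))  # True
```

Anything real would need richer representations than word counts, but the shape of the problem – represent both sides, then score a match – is the same.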

Should be interesting.

Equally, though, I have been thinking a lot about how to build culture – specifically when it comes to analytics. A constant theme running through my recent posts on enterprise transformation has been the challenge of doing digital well in the large enterprise. As I’ve repeatedly pointed out, that challenge is less a matter of technology or people than it is of organization and culture. Enterprises have and understand the core capabilities to do digital well. Listen to my recent video series and you’ll hear this refrain:

  • The typical enterprise does analytics, they just don’t use it.
  • They have testing, they just don’t learn.
  • They talk voice-of-customer, they just don’t listen.
  • They do Agile, but they aren’t.

To fix those problems requires changes in the way the organization is structured and the way those capabilities are done. Even more, it requires changes in the way people need to think. That’s culture.

So I’ve been pondering how to build a culture of analytics driven decision-making and, of course, how to measure whether you’re successful. Now while my particular problem – building the proper sort of enterprise for digital transformation – may not be the standard one, the problems and challenges of building culture and measuring culture are hardly unique. And since this isn’t my specialty at all, I’ve been trying to read up on common approaches.

By and large, it’s pretty disappointing.

From a building culture perspective, so much of the literature seems to focus on top-down approaches: ways that a senior leader can communicate and encourage cultural change. That’s clearly important and I’m not going to dispute both the need and the challenges around top-down change. But this type of approach often seems to degenerate into leadership self-help advice or cheerleading, neither of which seem likely to be useful. Nor am I much impressed by the idea of carefully crafting a mission statement and promulgating it through the organization. I’ve sat in more than one excruciating mission statement meeting and all I ever got out of it was a sore butt. I’ve said before that if you have to create an innovation capability in your enterprise, you’re already defeated. And if you’re looking to a carefully crafted corporate mission statement to provide a shared vision, you’ve already lost your way.

I wasn’t much more impressed with attempts to measure culture.

It’s hard, obviously.

Most approaches seem to rely on survey instruments and involve categorization of the organization into pre-defined types (e.g. hierarchical) or score a number of separate variables (e.g. perceived alignment of vision). This seems like a close corollary to personality measurement tests. Lots of people love these tests, but they don’t strike me as particularly rigorous.

With regards to categorization, in particular, I’m skeptical that it means much and very skeptical that it might be a useful trigger to action. I can see value – and perhaps even triggers to action – in learning that there are differing perceptions of organization mission or differing perceptions around how aligned the organization is. It’s easy to fool yourself with a view from the top and this type of cultural survey instrument might help correct attitudes of corporate complacency. It’s much less clear to me, however, that such measurement would be a useful part of a continuous program designed to improve analytics (or any other) culture.

I’d very much like to see measures of culture that are behavioral and amenable to continuous measurement, but at least so far I haven’t come across anything very interesting.

It may be that culture is one of those things that is so challenging to measure that the subsequent benefits in clarity and decision-making – at least outside the world of academia – aren’t worth the investment. Perhaps the best way to measure culture is by digital success. If it works, it’s working. You could easily take that point of view from this extended series on digital transformation and I don’t think it’s implausible.

Maybe we just haven’t found the right methods.

Or maybe I just haven’t read the right articles. Indeed, if you have thoughts on either of these issues (how to build or measure culture) or can point me to interesting research, I’d love to hear about it.

Right now, I have more ideas about how to build analytics culture than I do how to measure your success building it. Some of those ideas are implicit in the recommendations I’ve been making about the integration of analytics, the use of voice-of-customer and the role of experimentation, but they don’t end there.

In my next post, I’ll explain some concrete actions the enterprise can take to build analytics culture and why I think they are both more practical and more impactful than most of what passes for culture building.

At the same time, I’m going to be thinking more about measuring culture and I hope – eventually – to have something interesting to say about that too.

 

Measuring the Digital World is OFFICIALLY RELEASED. Order here – and just let me know if you’d like an autographed copy!

Measuring the Digital World

After several months in pre-order purgatory, my book, Measuring the Digital World is now available. If you’re even an occasional reader of this blog, I hope you’ll find the time to read it.

I know that’s no small ask. Reading a professional book is a big investment of time. So is reading Measuring the Digital World worth it?

Well, if you’re invested in digital optimization and analytics, I think it is – and here’s why. We work in a field that is still very immature. It’s grown up, as it were, underneath our feet. And while that kind of organic growth is always the most exciting, it’s also the most unruly. I’m betting that most of us who have spent a few years or more in digital analytics have never really had a chance to reflect on what we do and how we do it. Worse, most of those who are trying to learn the field have to do so almost entirely by mentored trial-and-error. That’s hard. Having a framework for how and why things work makes the inevitable trial-and-error learning far more productive.

My goal in Measuring the Digital World wasn’t so much to create a how-to book as to define a discipline. I believe digital analytics is a unique field. A field defined by a few key problems that we must solve if we are to do it well. In the book, I wanted to lay out those problems and show how they can be tackled – irrespective of the tools you use or the type of digital property you care about.

At the very heart of digital analytics is a problem of description. Measurement is basic to understanding. We are born with and soon learn to speak and think in terms of measurement categories that apply to the physical world. Dimensionality, weight, speed, direction and color are some of the core measurement categories that we use over and over and over again in understanding the world we live in. These things don’t exist in the digital world.

What replaces them?

Our digital analytics tools provide the eyes and ears into the digital world. But I think we should be very skeptical of the measurement categories they suggest. Having lived through the period when those tools were designed and took their present shape, I’ve seen how flawed were the measurement conceptions that drove their form and function.

It’s not original, but it’s still true to say that our digital analytics tools mostly live at the wrong level and have the wrong set of measurement categories – that they are far too focused on web assets and far too little on web visitors.

But if this is a mere truism, it nevertheless lays the groundwork for a real discipline. Because it suggests that the great challenge of digital is how to understand who people are and what they are doing using only their viewing behavior. We have to infer identity and intention from action. Probably 9 out of every 10 pages in Measuring the Digital World are concerned with how to do this.

The things that make it hard are precisely the things that define our discipline. First, to make the connection between action and both identity and intention, we have to find ways to generate meaning based on content consumption. This means understanding at a deep level what content is about – it also means making the implicit assumption that people self-select the things that interest them.

For the most part, that’s true.

But it’s also where things get tricky. Because digital properties don’t contain limitless possibilities and they impose a structure that tries to guide the user to specific actions. This creates a push-pull in every digital world. On the one hand, we’re using what people consume to understand their intention and, at the very same time, we’re constantly forcing their hand and trying to get them to do specific actions! Every digital property – no matter its purpose or design – embodies this push-pull. The result? A complex interplay between self-selection, intention and web design that makes understanding behavior in digital a constant struggle.

That’s the point – and the challenge – of digital analytics. We need to have techniques for moving from behavior to identity and intention. And we need to have techniques that control for the structure of digital properties and the presence or absence of content. These same challenges are played out on Websites, on mobile apps and, now, on omni-channel customer journeys.

This is all ground I’ve walked before, but Measuring the Digital World embodies an orderly and fairly comprehensive approach to describing these challenges and laying out the framework of our discipline. How it works. Why it’s hard. What challenges we still face. It’s all there.

So if you’re an experienced analyst and just want to reflect your intuitions and knowledge against a formal description of digital analytics and how it can be done, this book is for you. I’m pretty sure you’ll find at least a few new ideas and some new clarity around ideas you probably already have.

If you’re relatively new to the field and would like something that is intellectually a little more meaty than the “bag of tips-and-tricks” books that you’ve already read, then this book is for you. You’ll get a deep set of methods and techniques that can be applied to almost any digital property to drive better understanding and optimization. You’ll get a sense, maybe for the first time, of exactly what our discipline is – why it’s hard and why certain kinds of mistakes are ubiquitous and must be carefully guarded against.

And if you’re teaching a bunch of MBA or Business Students about digital analytics and want something that actually describes a discipline, this book is REALLY for you (well…for your students). Your students will get a true appreciation for a cutting edge analytics discipline, they’ll also get a sense of where the most interesting new problems in digital analytics are and what approaches might bear fruit. They’ll get a book that illuminates how the structure of a field – in this case digital – demands specific approaches, creates unique problems, and rewards certain types of analysis. That’s knowledge that cuts deeper than just understanding digital analytics – it goes right to the heart of what analytics is about and how it can work in any business discipline. Finally, I hope that the opportunity to tackle deep and interesting problems illuminated by the book’s framework, excites new analysts and inspires the next generation of digital analysts to go far beyond what we’ve been able to do.

 

Yes, even though I’m an inveterate reader, I know it’s no trivial thing to say “read this book”. After all, despite my copious consumption, I delve much less often into business or technical books. So many seem like fine ten-page articles stretched – I’m tempted to say distorted – into book form. You get their gist in the first five pages and the rest is just filler. That doesn’t make for a great investment of time.

And now that I’ve actually written a book, I can see why that happens. Who really has 250 pages worth of stuff to say? I’m not sure I do…actually I’m pretty sure there’s some filler tucked in there in a spot or two. But I think the ratio is pretty good.

With Measuring the Digital World I tried to do something very ambitious – define a discipline. To create the authoritative view of what digital analytics is, how it works, and why it’s different than any other field of analytics. Not to answer every question, lay out every technique or solve every problem. There are huge swaths of our field not even mentioned in the book. That doesn’t bother me. What we do is far too rich to describe in a single book or even a substantial collection. Digital is, as the title of the book suggests, a whole new world. My goal was not to explore every aspect of measuring that world, but only to show how that measurement, at its heart, must proceed. I’m surely not the right person to judge to what extent I succeeded. I hope you’ll do that.

Here’s the link to Measuring the Digital World on Amazon.

[By the way, if you’d like signed copy of Measuring the Digital World, just let me know. You can buy a copy online and I’ll send you a book-plate. I know it’s a little silly, but I confess to extreme fondness for the few signed books I possess!]

Analytics with a Strategic Edge

The Role of Voice of Customer in Enterprise Analytics

The vast majority of analytics effort is expended on problems that are tactical in nature. That’s not necessarily wrong. Tactics gets a bad rap, sometimes, but the truth is that the vast majority of decisions we make in almost any context are tactical. The problem isn’t that too much analytics is weighted toward tactical issues, it’s really that strategic decisions don’t use analytics at all. The biggest, most important decisions in the digital enterprise nearly always lack a foundation in data or analysis.

I’ve always disliked the idea behind “HIPPOs” – with its Dilbertian assumption that executives are idiots. That isn’t (mostly) my experience at all. But analytics does suffer from what might be described as “virtue” syndrome – the idea that something (say taxes or abstinence) is good for everyone else but not necessarily for me. Just as creative folks tend to think that what they do can’t be driven by analytics, so too is there a perception that strategic decisions must inevitably be more imaginative and intuitive and less number-driven than many decisions further down in the enterprise.

This isn’t completely wrong though it probably short-sells those mid-level decisions. Building good creative takes…creativity. It can’t be churned out by machine. Ditto for strategic decisions. There is NEVER enough information to fully determine a complex strategic decision at the enterprise level.

This doesn’t mean that data isn’t useful or should not be a driver for strategic decisions (and for creative content too). Instinct only works when it’s deeply informed about reality. Nobody has instincts in the abstract. To make a good strategic decision, a decision-maker MUST have certain kinds of data to hand and without that data, there’s nothing on which intuition, knowledge and experience can operate.

What data does a digital decision-maker need for driving strategy?

Key audiences. Customer Journey. Drivers of decision. Competitive choices.

You need to know who your audiences are and what makes them distinct. You need (as described in the last post) to understand the different journeys those audiences take and what journeys they like to take. You need to understand why they make the choices they make – what drives them to choose one product or service or another. Things like demand elasticity, brand awareness, and drivers of choice at each journey stage are critical. And, of course, you need to understand when and why those choices might favor the competition.

None of this stuff will make a strategic decision for you. It won’t tell you how much to invest in digital. Whether or not to build a mobile app. Whether personalization will provide high returns.

But without fully understanding audience, journey, drivers of decision and competitive choices, how can ANY digital decision-maker possibly arrive at an informed strategy? They can’t. And, in fact, they don’t. Because for the vast majority of enterprises, none of this information is part-and-parcel of the information environment.

I’ve seen plenty of executive dashboards that are supposed to help people run their business. They don’t have any of this stuff. I’ve seen the “four personas” puffery that’s supposed to help decision-makers understand their audience. I’ve seen how limited is the exposure executives have to journey mapping and how little it is deployed on a day-to-day basis. Worst of all, I’ve seen how absolutely pathetic is the use of voice of customer (online and offline) to help decision-makers understand why customers make the choices they do.

Voice of customer as it exists today is almost exclusively concerned with measuring customer satisfaction. There’s nothing wrong with measuring NPS or satisfaction. But these measures tell you nothing that will help define a strategy. They are at best (and they are often deeply flawed here too) measures of scoreboard – whether or not you are succeeding in a strategy.

I’m sure that people will object that knowing whether or not a strategy is succeeding is important. It is. It’s even a core part of ongoing strategy development. However, when divorced from particular customer journeys, NPS is essentially meaningless and uninterpretable. And while it truly is critical to measure whether or not a strategy is succeeding, it’s even more important to have data to help shape that strategy in the first place.

Executives just don’t get that context from their analytics teams. At best, they get little pieces of it in dribs and drabs. It is never – as it ought to be – the constant ongoing lifeblood of decision-making.

I subtitled this post “The Role of Voice of Customer in Enterprise Analytics” because of all the different types of information that can help make strategic decisions better, VoC is by far the most important. A good VoC program collects information from every channel: online and offline surveys, call-center, site feedback, social media, etc. It provides a continuing, detailed and sliceable view of audience, journey distribution and (partly) success. It’s by far the best way to help decision-makers understand why customers are making the choices they are, whether those choices are evolving, and how those choices are playing out across the competitive set. In short, it answers the majority of the questions that ought to be on the minds of decision-makers crafting a digital strategy.

This is a very different sort of executive dashboard than we typically see. It’s a true customer insights dashboard. It’s also fundamentally different than almost ANY VoC dashboard we see at any level. The vast majority of VoC reporting doesn’t provide slice-and-dice by audience and use-case – a capability which is absolutely essential to useful VoC reporting. VoC reporting is almost never based on and tied into a journey model so that the customer insights data is immediately reflective of journey stage and actionable arena. And VoC reporting almost never includes a continuous focus on exploring customer decision-making and tying that into the performance of actual initiatives.
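To make “slice-and-dice by audience and use-case” concrete, here’s a minimal sketch (the segments and scores are invented). NPS is simply % promoters (scores 9-10) minus % detractors (scores 0-6), but computed per segment rather than as one enterprise-wide number:

```python
def nps(scores):
    """NPS = percent promoters (9-10) minus percent detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))

responses = [  # (audience, journey_stage, score) - illustrative data only
    ("researcher", "evaluate", 9), ("researcher", "evaluate", 4),
    ("buyer", "purchase", 10), ("buyer", "purchase", 9),
]

# Group responses by (audience, journey stage) and score each slice.
by_segment = {}
for audience, stage, score in responses:
    by_segment.setdefault((audience, stage), []).append(score)
for segment, scores in sorted(by_segment.items()):
    print(segment, nps(scores))
```

A blended NPS over this data would hide the fact that buyers are delighted while researchers in the evaluate stage are split – which is exactly the kind of journey-specific signal a strategy discussion needs.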

It isn’t just a matter of a dashboard. One of the most unique and powerful aspects of digital voice-of-customer is the flexibility it provides to rapidly, efficiently and at very little cost tackle new problems. VoC should be a core part of executive decision-making with a constant cadence of research, analysis, discussion and reporting driven by specific business questions. This open and continuing dialog where VoC is a tool for decision-making is critical to integrating analytics into decisioning. If senior folks aren’t asking for new VoC research on a constant basis, you aren’t doing it right. The single best indicator of a robust VoC program in digital is the speed with which it changes.

Sadly, what decision-makers mostly get right now (if they get anything at all) is a high-level, non-segmented view of audience demographics, an occasional glimpse into high-level decision-factors that is totally divorced from both segment and journey stage, and an overweening focus on a scoreboard metric like NPS.

It’s no wonder, given such thin gruel, that decision-makers aren’t using data for strategic decisions better. If our executives mostly aren’t Dilbertian, they aren’t miracle workers either. They can’t make wine out of information water. If we want analytics to support strategy – and I assume we all do – then building a completely different sort of VoC program is the single best place to start. It isn’t everything. There are other types of data (behavioral, benchmark, econometric, etc.) that can be hugely helpful in shaping digital strategies. But a good VoC program is a huge step forward – a step forward that, if well executed – has the power to immediately transform how the digital enterprise thinks and works.

 

This is probably my last post of the year – so see you in 2016! In the meantime, my book Measuring the Digital World is now available. Could be a great way to spend your holiday down time (ideally while you’re resting up from time on the slopes)! Have a great holiday…