
Building Analytics Culture – One Decision at a Time

In my last post, I argued that much of what passes for "building culture" in corporate America is worthless. It's all about talk. And whether that talk is about diversity, ethics or analytics, it's equally arid. Because you don't build culture by talking. You build culture through actions. By doing things right (or wrong, if that's the kind of culture you want). Not only are words ineffective in building culture, they can be positively toxic. When words and actions don't align, the dishonesty casts other – possibly more meaningful – words into disrepute. Think about which is worse: a culture where bribery is simply the accepted, normal way of getting things done (and is cheerfully acknowledged), or one where bribery is ubiquitous but cloaked behind constant protestations of disinterest and honesty? If you're not sure about your answer, take it down to a personal level and ask yourself the same question. Do we not like an honest villain better than a hypocrite? If hypocrisy is the compliment vice pays to virtue, it is a particularly nasty form of flattery.

What this means is that you can't build an analytics culture by telling people to be data-driven. You can't build an analytics culture by touting the virtues of analysis. You can't even build an analytics culture by hiring analysts. You build an analytics culture by making good (data-driven) decisions.

That’s the only way.

But how do you get an organization to make data-driven decisions? That’s the art of building culture. And in that last post, I laid out seven (a baker’s half-dozen?) tactics for building good decision-making habits: analytic reporting, analytics briefing sessions, hiring a C-Suite analytics advisor, creating measurement standards, building a rich meta-data system for campaigns and content, creating a rapid VoC capability and embracing a continuous improvement methodology like SPEED.

These aren't just random parts of making analytic decisions. They are tactics that seem to me particularly effective in driving good habits in the organization and building the right kind of culture. But seven tactics don't nearly exhaust my list. Here's another set of techniques that are equally important in helping drive good decision-making in the organization (my original list wasn't in any particular order, so it's not like the previous list had all the important stuff):

Yearly Agency Performance Measurement and Reviews

What it is: An independent annual analysis of your agency's performance. This should include a review of goals and metrics, consideration of the appropriateness of KPIs, and analysis of variation in campaign performance along three dimensions (inside the campaign by element, over time, and across campaigns). This must not be done by the agency itself (duh!) or by the owners of the relationship.

Why it builds culture: Most agencies work by building strong personal relationships. There are times and ways that this can work in your favor, but from a cultural perspective it both limits and discourages analytic thinking. I see many enterprises where the agency is so strongly entrenched that you literally cannot criticize them. Not only does the resulting marketing nearly always suck, but this drains the life out of an analytics culture. This is one of many ways in which building an analytic culture can conflict with other goals, but here I definitely believe analytics should win. You don't need an overly cozy relationship with your agency. You do need objective measurement of their performance.
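As one small illustration of the "three dimensions of variation" analysis, here is a minimal sketch in Python (pandas, with made-up campaign data and column names); a real review would of course run against your own reporting extracts:

```python
import pandas as pd

# Hypothetical campaign results: one row per campaign element per month.
perf = pd.DataFrame({
    "campaign":  ["C1"] * 4 + ["C2"] * 4,
    "element":   ["hero", "email", "hero", "email"] * 2,
    "month":     ["Jan", "Jan", "Feb", "Feb"] * 2,
    "conv_rate": [0.031, 0.012, 0.028, 0.015, 0.044, 0.019, 0.041, 0.022],
})

# Slice performance along the three review dimensions: by element within
# campaigns, over time, and across campaigns. Large spreads are exactly the
# variation an independent reviewer should make the agency explain.
for dim in ("element", "month", "campaign"):
    means = perf.groupby(dim)["conv_rate"].mean()
    cv = means.std() / means.mean()  # coefficient of variation across groups
    print(f"{dim:>8}: {means.round(3).to_dict()}  CV={cv:.2f}")
```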


Analytics Annotation / Collaboration Tool like Insight Rocket

What it is: A tool that provides a method for rich data annotation and for the creation and distribution of analytic stories across the analytics team and into the organization. In Analytic Reporting, I argued for a focus on democratizing knowledge, not data. Tools like Insight Rocket are part of that strategy, since they provide a way to create and rapidly disseminate a layer of meaning on top of powerful data-exploration tools like Tableau.

Why it builds culture: There aren't many places where technology makes much difference to culture, but there are a few. As some of my other suggestions make clear, you get a better analytics culture the more you drive analytics across and into the organization (analytic reporting, C-Suite Advisor, SPEED, etc.). Tools like Insight Rocket have three virtues: they help disseminate analytic thinking, not just data; they boost analytics collaboration, making for better analytic teams; and they provide a repository of past analysis, which increases long-term leverage in the enterprise. Oh, and here's a fourth advantage: they force analysts to tell stories, meaning they have to engage with the business. That makes this piece of technology a really nice complement to my suggestion about a regular cadence of analytics briefings, and a rare instance of technology deepening culture.


In-sourcing

What it is: Building analytics expertise internally instead of hiring it out and, most especially, instead of off-shoring it.

Why it builds culture: I’d be the last person to tell you that consulting shouldn’t have a role in the large enterprise. I’ve been a consultant for most of my working life. But we routinely advise our clients to change the way they think about consulting – to use it not as a replacement for an internal capability but as a bootstrap and supplement to that capability. If analytics is core to digital (and it is) and if digital is core to your business (which it probably is), then you need analytics to be part of your internal capability. Having strong, capable, influential on-shore employees who are analysts is absolutely necessary to analytics culture. I’ll add that while off-shoring, too, has a role, it’s a far more effective culture killer than normal consulting. Off-shoring creates a sharp divide between the analyst and the business that is fatal to good performance and good culture on EITHER side.


Learning-based Testing Plan

What it is: Testing plans that include significant focus on developing best design practices and resolving political issues instead of on micro-optimizations of the funnel.

Why it works: Testing is a way to make decisions. But as long as its primary use is deciding whether to show image A or image B, or a button in this color or that, it will never be used properly. To illustrate learning-based testing, I've used the example of video integration – testing different methods of on-page video integration, different lengths, different content types and different placements against each key segment and use-case to determine UI parameters for ALL future videos. When you test this way, you resolve hundreds of future questions and save endless future debate about what to do with this or that video. That's learning-based testing. It's also about picking key places in the organization where political battles determine design – things like home-page real estate and the amount of advertising load on a page – and resolving them with testing; that's learning-based testing, too. Learning-based testing builds culture in two ways. First, in and of itself, it drives analytic decision-making. Almost as important, it demonstrates the proper role of experimentation and should help set the table for decision-makers to ask for more interesting tests.
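To make the video example concrete, here is a minimal sketch in Python of how a learning-based plan enumerates the whole design space per segment and use-case. All factor names and the assign_variant helper are hypothetical, and in practice you would trim the 81 combinations per cell with a fractional design:

```python
from itertools import product
import hashlib

# Hypothetical design factors for on-page video integration. The point of a
# learning-based test is to cover the whole design space once, so the answers
# apply to ALL future videos rather than to a single page.
FACTORS = {
    "integration": ["inline", "lightbox", "sidebar"],
    "length": ["30s", "2min", "5min"],
    "content_type": ["demo", "testimonial", "explainer"],
    "placement": ["above_fold", "mid_page", "below_fold"],
}

SEGMENTS = ["prospect", "customer"]   # who the visitor is
USE_CASES = ["research", "support"]   # what the visit is for

# Every (segment, use-case) cell gets the full factorial design, because the
# right video treatment may differ by audience and task.
CELLS = {
    cell: list(product(*FACTORS.values()))
    for cell in product(SEGMENTS, USE_CASES)
}

def assign_variant(visitor_id: str, segment: str, use_case: str) -> tuple:
    """Deterministically bucket a visitor into one variant of their cell."""
    variants = CELLS[(segment, use_case)]
    digest = hashlib.sha256(f"{visitor_id}|{segment}|{use_case}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("visitor-123", "prospect", "research"))
```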


Control Groups

What it is: Use of control groups to measure effectiveness whenever new programs (operational or marketing) are implemented. A control group is a small subset chosen randomly from the target population and given either no experience or a neutral (existing) experience instead. Nearly all tests feature a baseline control group, but the use of control groups transcends A/B testing tools. Control groups are common in traditional direct-response marketing and can be used in a wide variety of online and offline contexts (most especially, as I recently saw Elea Feit of Drexel hammer home at the DAA Symposium, as a much more effective approach to attribution).

Why it works: One of the real barriers to building culture is a classic problem in education: when you first teach students something, they almost invariably use it poorly, and that can sour others on the value of the knowledge itself. When people in an organization first start using analytics, they are, quite inevitably, going to fall into the correlation trap. Correlation is not causation. But in many cases it sure looks like it is, and this leads to many, many bad decisions. How do you prevent the most common error in analytics? Control groups. Control groups build culture because they get decision-makers thinking the right way about measurement, and because they protect the organization from mistakes that will otherwise sour the culture on analytics.
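The mechanics are simple enough to sketch. Below is a minimal, simulated illustration in Python (all names, rates and the 10% holdout are made up): randomize the holdout before the program launches, then read effectiveness as the difference between groups rather than as a before/after correlation:

```python
import random

def split_holdout(population, holdout_rate=0.1, seed=42):
    """Randomly reserve a small control group that keeps the existing experience."""
    rng = random.Random(seed)
    shuffled = population[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * holdout_rate)
    return shuffled[cut:], shuffled[:cut]   # (treatment, control)

# Hypothetical program: 10,000 customers, with conversion simulated at 6%
# for the treated group and 5% for the untouched control.
treatment, control = split_holdout([f"cust-{i}" for i in range(10_000)])
treated = set(treatment)
converted = {c: random.random() < (0.06 if c in treated else 0.05)
             for c in treatment + control}

t_rate = sum(converted[c] for c in treatment) / len(treatment)
c_rate = sum(converted[c] for c in control) / len(control)
print(f"treatment {t_rate:.2%}  control {c_rate:.2%}  lift {t_rate - c_rate:+.2%}")
```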


Unified Success Framework

What it is: A standardized, pre-determined framework for content and campaign success measurement that includes definition of campaign types, description of key metrics for those types, and methods of comparing like campaigns on an apples-to-apples basis.

Why it works: You may not be able to make the horse drink, but leading it to water is a good start. A unified success framework puts rigor around success measurement – a critical part of building good analytics culture. On the producer side, it forces the analytics team to make real decisions about what matters and, one hopes, pushes them to prove that proxy measures (such as engagement) are real. On the consumer side, it prevents that most insidious destroyer of analytics culture, the post hoc success analysis. If you can pick your success after the game is over, you’ll always win.
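Such a framework need not be elaborate. Here is a minimal sketch in Python (the campaign types, KPIs and guardrail names are all hypothetical) whose one essential property is that success is declared before the campaign runs:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CampaignType:
    """Pre-registered success definition for one class of campaign."""
    name: str
    primary_kpi: str        # the single metric that decides success
    guardrails: tuple = ()  # metrics that must not degrade
    comparison_basis: str = "cost_per_primary_outcome"  # apples-to-apples basis

# Hypothetical framework -- the real one is whatever your analytics team
# ratifies, but the key property is that it exists BEFORE campaigns launch.
FRAMEWORK = {
    "acquisition": CampaignType("acquisition", "new_accounts",
                                guardrails=("unsubscribe_rate",)),
    "retention":   CampaignType("retention", "repeat_purchase_rate",
                                guardrails=("support_contacts",)),
    "awareness":   CampaignType("awareness", "aided_recall_lift"),
}

def success_definition(campaign_type: str) -> CampaignType:
    """Look up success before launch; a KeyError beats post hoc invention."""
    return FRAMEWORK[campaign_type]

print(success_definition("retention").primary_kpi)
```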


The Enterprise VoC Dashboard

What it is: An enterprise-wide, state-of-the-customer dashboard that provides a snapshot and a trended look at how customer attitudes are evolving. It should include built-in segmentation, so that attitudinal views are ALWAYS shown sliced by key customer types, with additional segmentation possible.

Why it works: There are so many good things going on here that it's hard to enumerate them all. First, this type of dashboard is one of the best ways to instill customer-first thinking in the organization. You can't think customer-first until you know what the customer thinks. Second, this type of dashboard enforces a segmented view of the world. Segmentation is fundamental to critical thinking about digital problems, and this sets the table for better questions and better answers in the organization. Third, opinion data is easier to absorb and use than behavioral data, making this type of dashboard particularly valuable for encouraging decision-makers to use analytics.
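As a minimal sketch of the "always segmented" rule (Python/pandas, with invented survey rows and an invented 1-10 satisfaction score), the dashboard query never produces a single blended number:

```python
import pandas as pd

# Hypothetical survey extract: one row per VoC response.
voc = pd.DataFrame({
    "period":       ["2016-Q1"] * 4 + ["2016-Q2"] * 4,
    "segment":      ["prospect", "customer"] * 4,
    "satisfaction": [7, 8, 6, 9, 7, 9, 5, 8],   # 1-10 attitudinal score
})

# The dashboard rule: attitudes are ALWAYS sliced by segment and trended
# over time -- never shown as one blended average.
trend = (voc.groupby(["segment", "period"])["satisfaction"]
            .agg(["mean", "count"])
            .unstack("period"))
print(trend)
```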


Two-Tiered Segmentation

What it is: A method that creates two levels of segmentation in the digital channel. The first level is the traditional "who" someone is – whether in terms of persona, business relationship or key demographics. The second level captures "what" they are trying to accomplish. Each customer touch-point can then be described as the intersection of who a visitor is and what their visit was for.

Why it works: Much like the VoC Dashboard, Two-Tiered Segmentation makes for dramatically better clarity around digital-channel decision-making and evaluation of success. Questions like 'Is our website successful?' morph into the much more tractable and analyzable question 'Is our website successful for this audience trying to do this task?'. That's a much better question, and a big part of building analytics culture is getting people to ask better questions. This also happens to be the main topic of my book, Measuring the Digital World, where you can find a full description of both the power of and the methods behind Two-Tiered Segmentation.
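A minimal sketch of the idea in Python (pandas, with invented visit data and an invented task-success flag): once every visit carries both tiers, the success question decomposes naturally into cells:

```python
import pandas as pd

# Hypothetical visit log: tier one is who the visitor is, tier two is what
# the visit was for, plus a flag for whether the visit task succeeded.
visits = pd.DataFrame({
    "who":          ["prospect", "prospect", "customer", "customer", "customer"],
    "what":         ["research", "compare", "support", "support", "research"],
    "task_success": [1, 0, 1, 1, 0],
})

# 'Is our website successful?' becomes a success rate per (who, what) cell --
# every touch-point is evaluated at the intersection of visitor and intent.
success_by_cell = (visits.groupby(["who", "what"])["task_success"]
                         .mean()
                         .rename("success_rate"))
print(success_by_cell)
```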


I have more, but I’m going to roll the rest into my next post on building an agile organization since they are all deeply related to the integration of capabilities in the organization. Still, that’s fifteen different tactics for building culture. None of which include mission statements, organizational alignment or C-Level support (okay, Walking the Walk is kind of that but not exactly and I didn’t include it in the fifteen) and none of which will take place in corporate retreats or all-hands conferences. That’s a good thing and makes me believe they might actually work.

Ask yourself this: is it possible to imagine an organization that does even half these things and doesn't have a great analytics culture? I don't think it is. Because culture just is the sum of the way your organization works, and these are powerful drivers of good analytic thinking. You can imagine an organization that does these things and isn't friendly, collaborative, responsible, flat, diverse, caring or even innovative. There are all kinds of culture, and good decision-making isn't the only aspect of culture to care about*. But if you do these things, you will have an organization that makes consistently good decisions.

*Incidentally, if you want to build culture in any of these other ways, you have to think about similar approaches. Astronomers have a clever technique, called averted vision, for seeing very faint objects: you look just to the side of the object to get the most light-gathering power from your eyes. It's the same with culture. You can't tackle it head-on by talking about it. You have to build it just a little from the side!

Building and Measuring Analytics Culture

Culture – how to measure it and how to build it – has been much on my mind lately.

At least when it comes to the measurement part – something we don’t normally have to do – the reason is…different.

My Counseling Family team is going to be doing another fun project – participating in the 538 Oscar Modelling challenge – and our approach is to try to model each nominated movie's fit to the current Hollywood zeitgeist. The theory behind the approach is simple. It seems reasonable to suggest that while qualitative differences between nominated and non-nominated movies might be fairly large, when it comes to selecting among a small set of relatively high-quality choices the decision is fairly arbitrary. In such situations, political and personal concerns will play a huge role, but so, presumably, will simple preference. Our thought is that preference is likely more a function of worldview than artistry – in much the same manner that people watching a political debate almost always believe that the person who most nearly echoes their own opinion-set won. But how do you measure the cultural fit of a movie to a community? It's no easy task. Our challenges include deciding how to capture a cultural zeitgeist in general, how to focus that capture on Hollywood, how to capture the spirit and themes of each movie, and how to score a match. And, of course, there is the challenge that the Hollywood zeitgeist might be more hot air than great wind and altogether too thin to be captured!
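The last of those challenges, scoring a match, can at least be prototyped cheaply. Here is a toy sketch in Python (scikit-learn; the documents, titles and the idea of pooling trade-press text are invented stand-ins, not our actual method): represent the zeitgeist corpus and each movie's themes as TF-IDF vectors and score fit as cosine similarity.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical stand-ins: a 'zeitgeist' document (e.g. pooled trade-press
# text) and one thematic summary per nominated movie.
zeitgeist = ["industry disruption, streaming, diversity, the auteur comeback"]
movies = {
    "Movie A": "scrappy outsiders take on a corrupt institution",
    "Movie B": "a nostalgic celebration of classic Hollywood craft",
}

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(zeitgeist + list(movies.values()))

# Score each movie's fit as cosine similarity to the zeitgeist document.
for i, title in enumerate(movies, start=1):
    score = cosine_similarity(matrix[0], matrix[i])[0, 0]
    print(title, round(score, 3))
```

With corpora this tiny the scores are nearly meaningless; the hard work is assembling a corpus that genuinely captures the community's worldview.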

Should be interesting.

Equally, though, I have been thinking a lot about how to build culture – specifically when it comes to analytics. A constant theme running through my recent posts on enterprise transformation has been the challenge of doing digital well in the large enterprise. As I've repeatedly pointed out, that challenge is less a matter of technology or people than of organization and culture. Enterprises have, and understand, the core capabilities to do digital well. Listen to my recent video series and you'll hear this refrain:

  • The typical enterprise does analytics, they just don’t use it.
  • They have testing, they just don’t learn.
  • They talk voice-of-customer, they just don’t listen.
  • They do Agile, but they aren’t.

To fix those problems requires changes in the way the organization is structured and the way those capabilities are executed. Even more, it requires changes in the way people think. That's culture.

So I’ve been pondering how to build a culture of analytics driven decision-making and, of course, how to measure whether you’re successful. Now while my particular problem – building the proper sort of enterprise for digital transformation – may not be the standard one, the problems and challenges of building culture and measuring culture are hardly unique. And since this isn’t my specialty at all, I’ve been trying to read up on common approaches.

By and large, it’s pretty disappointing.

From a building-culture perspective, so much of the literature seems to focus on top-down approaches: ways that a senior leader can communicate and encourage cultural change. That's clearly important, and I'm not going to dispute either the need for or the challenges of top-down change. But this type of approach often degenerates into leadership self-help advice or cheerleading, neither of which seems likely to be useful. Nor am I much impressed by the idea of carefully crafting a mission statement and promulgating it through the organization. I've sat in more than one excruciating mission-statement meeting, and all I ever got out of it was a sore butt. I've said before that if you have to create an innovation capability in your enterprise, you're already defeated. And if you're looking to a carefully crafted corporate mission statement to provide a shared vision, you've already lost your way.

I wasn’t much more impressed with attempts to measure culture.

It’s hard, obviously.

Most approaches seem to rely on survey instruments and involve either categorizing the organization into pre-defined types (e.g. hierarchical) or scoring a number of separate variables (e.g. perceived alignment of vision). This seems like a close cousin of personality-measurement tests. Lots of people love those tests, but they don't strike me as particularly rigorous.

With regard to categorization, in particular, I'm skeptical that it means much and very skeptical that it might be a useful trigger to action. I can see value – and perhaps even triggers to action – in learning that there are differing perceptions of the organization's mission or of how aligned the organization is. It's easy to fool yourself with a view from the top, and this type of cultural survey instrument might help correct attitudes of corporate complacency. It's much less clear to me, however, that such measurement would be a useful part of a continuous program designed to improve analytics (or any other) culture.

I’d very much like to see measures of culture that are behavioral and amenable to continuous measurement, but at least so far I haven’t come across anything very interesting.

It may be that culture is one of those things that is so challenging to measure that the subsequent benefits in clarity and decision-making – at least outside the world of academia – aren’t worth the investment. Perhaps the best way to measure culture is by digital success. If it works, it’s working. You could easily take that point of view from this extended series on digital transformation and I don’t think it’s implausible.

Maybe we just haven’t found the right methods.

Or maybe I just haven't read the right articles. Indeed, if you have thoughts on either of these issues (how to build or how to measure culture) or can point me to interesting research, I'd love to hear about it.

Right now, I have more ideas about how to build analytics culture than I do how to measure your success building it. Some of those ideas are implicit in the recommendations I’ve been making about the integration of analytics, the use of voice-of-customer and the role of experimentation, but they don’t end there.

In my next post, I’ll explain some concrete actions the enterprise can take to build analytics culture and why I think they are both more practical and more impactful than most of what passes for culture building.

At the same time, I’m going to be thinking more about measuring culture and I hope – eventually – to have something interesting to say about that too.


Measuring the Digital World is OFFICIALLY RELEASED. Order here – and just let me know if you’d like an autographed copy!