
The Agile Organization

I’ve been meandering through an extended series on digital transformation: why it’s hard, where things go wrong, and what you need to be able to do to be successful. In this post, I intend to summarize some of that thinking and describe how the large enterprise should organize itself to be good at digital.

Throughout this series, I’ve emphasized the importance of being able to make good decisions in the digital realm. That is, of course, the function of analytics, and it’s my own special concern when it comes to digital. But there are people who will point out that decision-making is not the be-all and end-all of digital excellence. They might suggest that being able to execute is important too.

If you’re a football fan, it’s easy to see the dramatic difference between Peyton Manning – possibly the finest on-field decision-maker in the history of the game – with a good arm and without one. It’s one thing to know where to throw the ball on any given play, quite another to be able to get it there accurately. If that weren’t the case, many of my readers would probably be making millions in the NFL!

On the other hand, this divide between decision-making and execution tends to break down if you extend your view to the entire organization. If the GM is doing the job properly, then the decision about which quarterbacks to draft or sign will appropriately balance their physical and decision-making skills. That’s part of what’s involved in good GM decisioning. Meanwhile, the coach has an identical responsibility on a day-to-day basis. A foot injury may limit Peyton to the point where his backup becomes a better option. Then it may heal and the pendulum swings back. The organization makes a series of decisions and if it can make all of those decisions well, then it’s hard to see how execution doesn’t follow along.

If, as an organization, I can make good decisions about the strategy for digital, the technology to run it on, the agencies to build it, the people to optimize it, the way to organize it, and the tactics to drive it, then everything is likely to be pretty good.

Unfortunately, it’s simply not the case that the analytics, organization and capabilities necessary to make good decisions across all these areas are remotely similar. To return to my football analogy, it’s clear that very few organizations are set up to make good decisions in every aspect of their operations. Some organizations excel at particular functions (like game-planning) but are very poor at drafting. Indeed, sometimes success in one area breeds disaster in another. When a coach like Chip Kelly becomes very successful in his role, there is a tendency for the organization to expand that role so that the coach has increasing control over personnel. This almost always works badly in practice. Even knowing it will work badly doesn’t prevent the problem. Since the coach is so important, it may be that an organization will cede much control over personnel to a successful coach even when everyone (except the coach) believes it’s a bad idea.

If you don’t think similar situations arise constantly in corporate America, you aren’t paying attention.

In my posts in this series, I’ve mapped out the capabilities necessary to give decision-makers the information and capabilities they need to make good decisions about digital experiences. I haven’t touched on (and don’t really intend to touch on) broader themes like deciding who the right people to hire are or what kind of measurement, analysis or knowledge is necessary to make those sorts of meta-decisions.

There are two respects, however, in which I have tried to address at least some of these meta-concerns about execution. First, I’ve described why it is and how it comes to pass that most enterprises don’t use analytics to support strategic decision-making. This seems like a clear miss and a place where thoughtful implementation of good measurement, particularly voice-of-customer measurement of the type I’ve described, should yield high returns.

Second, I took a stab at describing how organizations can think about and work toward building an analytics culture. In these two posts, I argue that most attempts at culture-building approach the problem backwards. The most common culture-building activities in the enterprise are all about “talk”. We talk about diversity. We talk about ethics. We talk about being data-driven in our decision-making. I don’t think this talk adds up to much. I suggest that culture is formed far more through habit than talk; that if an organization wants to build an analytics culture, it needs to find ways to “do” analytics. The word may precede the deed, but it is only through the force of the deed (good habits) that the word becomes character/culture. This may seem somewhat obvious – no, it is obvious – but people somehow manage to miss the obvious far too often. Those posts don’t just formulate the obvious, they also suggest a set of activities that are particularly efficacious in creating good enterprise habits of decision-making. If you care about enterprise culture and you haven’t already done so, give them a read.

For some folks, however, all these analytics actions miss the key questions. They don’t want to know what the organization should do. They want to know how the organization should work. Who owns digital? Who owns analytics? What lives in a central organization? What lives in a business unit? Is digital a capability or a department?

In the context of the small company, most of these questions aren’t terribly important. In the large enterprise, they mean a lot. But acknowledging that they mean a lot isn’t to suggest that I can answer them – or at least most of them.

I’m skeptical that there is an answer for most of these questions. At least in the abstract, I doubt there is one right organization for digital or one right degree of centralization. I’ve had many conversations with wise folks who recognize that their organizations seem to be in constant motion – swinging like an enormous pendulum between extremes of centralization followed by extremes of decentralization.

Even this peripatetic motion – which can look so irrational from the inside – may make sense. If we assume that centralization and decentralization have distinct advantages, then not only might changing circumstances drive a change in the optimal configuration, but swinging the organization from one pole to the other might even help capture the benefits of each.

That seems unlikely, but you never know. There is sometimes more logic in the seemingly irrational movements of the crowd than we might first imagine.

Most questions about digital organization are deeply historical. They depend on what type of company you are, in what kind of market, with what culture and what strategic imperatives. All of which is, of course, Management 101. Obvious stuff that hardly needs to be stated.

However, there are some aspects of digital about which I am willing to be more directive. First, that some balance between centralization and decentralization is essential in analytics. The imperative for centralization is driven by these factors: the need for comparative metrics of success around digital, the need for consistent data collection, the imperatives of the latest generation of highly-complex IT systems, and the need/desire to address customers across the full spectrum of their engagement with the enterprise. Of these, the first and the last are primary. If you don’t need those two, then you may not care about consistent data collection or centralized data systems (this last is debatable).

On the other hand, there are powerful reasons for decentralization of which the biggest is simply that analytics is best done as close to the decision-making as possible. Before the advent of Hadoop, I would have suggested that the vast majority of analytics resources in the digital space be decentralized. Hadoop makes that much harder. The skills are much rarer, the demands for control and governance much higher, and the need for cross-domain expertise much greater in this new world.

That will change. As the open-source analytics stack matures and the market over-rewards skilled practitioners – drawing in more folks – it will become much easier to decentralize again. This isn’t the first time we’ve been down the IT path that goes from centralization to gradual diffusion as technologies become cheaper, easier, and better supported.

At an even more fundamental level than the question of centralization lives the location and nature of digital. Is digital treated as a thing? Is it part of Marketing? Or Operations? Or does each thing have a digital component?

I know I should have more of an opinion about this, but I’m afraid that the right answers seem to me, once again, to be local and historical. In a digital pure-play, to even speak of digital as a thing seems absurd. It’s the core of the company. In a gas company, on the other hand, digital might best be viewed as a customer service channel. In a manufacturer, digital might be a sub-function of brand marketing or, depending on the nature of the digital investment and its importance to the company, a unit unto itself.

Obviously, one of the huge disadvantages of treating digital as a unit unto itself is working out how it can then interact correctly with the non-digital functions that share the same purpose. If you have digital customer servicing and non-digital customer servicing, does it really make sense to have one in a digital department and the other in a customer-service department?

There is a case, however, for incubating digital capabilities within a small, compact, standalone entity that can protect and nourish the digital investment with a distinct culture and resourcing model. I get that. Ultimately, though, it seems to me that unless digital OWNS an entire function, separating that function across digital and non-digital lines is arbitrary and likely to be ineffective in an omni-channel world.

But here’s the flip side. If you have a single digital property and it shares marketing and customer support functions, how do you allocate real-estate and who gets to determine key things like site structure? I’ve seen organizations where everything but the homepage is owned by somebody and the home page is like Oliver Twist. “Home page for sale, does anybody want one?”

That’s not optimal.

So the more overlap there needs to be between the functions and your digital properties, the more incentive you have to build a purely digital organization.

No matter what structure you pick, there are some trade-offs you’re going to have to live with. That’s part of why there is no magic answer to the right organization.

But far more important than the precise balance you strike around centralization or even where you put digital is the way you organize the core capabilities that belong to digital. Here, the vast majority of enterprises organize along the same general lines. Digital comprises some rough set of capabilities including:

  • IT
  • Creative
  • Marketing
  • Customer
  • UX
  • Analytics
  • Testing
  • VoC

In almost every company I work with, each of these capabilities is instantiated as a separate team. In most organizations, the IT folks are in a completely different reporting structure all the way up. There is no unification till you hit the C-Suite. Often, Marketing and Creative are unified. In some organizations, all of the research functions are unified (VoC, analytics) – sometimes under Customer, sometimes not. UX and Testing can wind up almost anywhere. They typically live under the Marketing department, but they can also live under a Research or Customer function.

None of this, to me, makes any sense.

To do digital well requires a deep integration of these capabilities. What’s more, it requires that these teams work together on a consistent basis. That’s not the way it’s mostly done.

Almost every enterprise I see not only siloes these capabilities, but puts in place budgetary processes that fund each digital asset as a one-time investment and require pass-offs between teams.

That’s probably not entirely clear so let me give some concrete examples.

You want to launch a new website. You hire an agency to design the website. Then your internal IT team builds it. Now the agency goes away. The folks who designed the website no longer have anything to do with it. What’s more, the folks who built it get rotated onto the next project. Sometimes, that’s all that happens. The website just sits there – unimproved. Sometimes the measurement team will now pick it up. Keep in mind that the measurement team almost never had anything to do with the design of the site in the first place. They are just there to report on it. Still, they measure it and if they find some problem, who do they give it to?

Well, maybe they pass it on to the UX team or the testing team. Those teams, neither of which has ever worked with the website or had anything to do with its design, are now responsible for implementing changes on it. And, of course, they will be working with developers who had nothing to do with building it.

Meanwhile, on an entirely separate track, the customer team may be designing a broader experience that involves that website. They enlist the VoC team to survey the site’s users and find out what they don’t like about it. Neither team (of course) had anything to do with designing or building the site.

If they come to some conclusion about what they want the site to do, they work with another(!) team of developers to implement their changes. That these changes may be at cross-purposes to the UX team’s changes or the original design intent is neither here nor there.

Does any of this make sense?

If you take continuous improvement to heart (and you should because it is the key to digital excellence), you need to realize that almost everything about the way your digital organization functions is wrong. You budget wrong and you organize wrong.

[Check out my relatively short (20 min) video on digital transformation and analytics organization – it’s the perfect medium for distributing this message through your enterprise!]

Here’s my simple rule about building digital assets. If it’s worth doing, it’s worth improving. Nothing you build will ever be right the first time. Accept that. Embrace it. That means you budget digital teams to build AND improve something. Those teams don’t go away. They don’t rotate. And they include ALL of the capabilities you need to successfully deliver digital experiences. Your developers don’t rotate off, your designers don’t go away, your VoC folks aren’t living in a parallel universe.

When you do things this way, you embed a commitment to continuous improvement deep in your core organizational processes. It almost forces you to do it right. All those folks in IT and creative will demand analytics and tests to run or they won’t have anything to do.

That’s a good thing.

This type of vertical integration of digital capabilities is far, far more important than the balance around centralization or even the home for digital. Yet it gets far less attention in most enterprise strategic discussions.

The existence or lack of this vertical integration is the single most important factor in driving analytics into digital. Do it right, and you’ll do it well. Do what everyone else does and…well…it won’t be so good.

Building Analytics Culture – One Decision at a Time

In my last post, I argued that much of what passes for “building culture” in corporate America is worthless. It’s all about talk. And whether that talk is about diversity, ethics or analytics, it’s equally arid. Because you don’t build culture by talking. You build culture through actions. By doing things right (or wrong if that’s the kind of culture you want). Not only are words not effective in building culture, they can be positively toxic. When words and actions don’t align, the dishonesty casts other – possibly more meaningful – words into disrepute. Think about which is worse – a culture where bribery is simply the accepted and normal way of getting things done (and is cheerfully acknowledged) or one where bribery is ubiquitous but is cloaked behind constant protestations of disinterest and honesty? If you’re not sure about your answer, take it down to a personal level and ask yourself the same question. Do we not like an honest villain better than a hypocrite? If hypocrisy is the compliment vice pays to virtue, it is a particularly nasty form of flattery.

What this means is that you can’t build an analytics culture by telling people to be data driven. You can’t build an analytics culture by touting the virtues of analysis. You can’t even build an analytics culture by hiring analysts. You build an analytics culture by making good (data-driven) decisions.

That’s the only way.

But how do you get an organization to make data-driven decisions? That’s the art of building culture. And in that last post, I laid out seven (a baker’s half-dozen?) tactics for building good decision-making habits: analytic reporting, analytics briefing sessions, hiring a C-Suite analytics advisor, creating measurement standards, building a rich meta-data system for campaigns and content, creating a rapid VoC capability and embracing a continuous improvement methodology like SPEED.

These aren’t just random parts of making analytic decisions. They are tactics that seem to me particularly effective in driving good habits in the organization and building the right kind of culture. But seven tactics don’t nearly exhaust my list. Here’s another set of techniques that are equally important in helping drive good decision-making in the organization (my original list wasn’t in any particular order, so it’s not like the previous list had all the important stuff):

Yearly Agency Performance Measurement and Reviews

What it is: Having an independent annual analysis of your agency’s performance. This should include review of goals and metrics, consideration of the appropriateness of KPIs and analysis of variation in campaign performance along three dimensions (inside the campaign by element, over time, and across campaigns). This must not be done by the agency itself (duh!) or by the owners of the relationship.

Why it builds culture: Most agencies work by building strong personal relationships. There are times and ways that this can work in your favor, but from a cultural perspective it both limits and discourages analytic thinking. I see many enterprises where the agency is so strongly entrenched you literally cannot criticize them. Not only does the resulting marketing nearly always suck, but this drains the life out of an analytics culture. This is one of many ways in which building an analytic culture can conflict with other goals, but here I definitely believe analytics should win. You don’t need a too cozy relationship with your agency. You do need objective measurement of their performance.

 

Analytics Annotation / Collaboration Tool like Insight Rocket

What it is: A tool that provides a method for rich data annotation and the creation and distribution of analytic stories across the analytics team and into the organization. In Analytic Reporting, I argued for a focus on democratizing knowledge not data. Tools like Insight Rocket are a part of that strategy, since they provide a way to create and rapidly disseminate a layer of meaning on top of powerful data exploration tools like Tableau.

Why it builds culture: There aren’t that many places where technology makes much difference to culture, but there are a few. As some of my other suggestions make clear, you get better analytics culture the more you drive analytics across and into the organization (analytic reporting, C-Suite Advisor, SPEED, etc.). Tools like Insight Rocket have three virtues: they help disseminate analytics thinking not just data, they boost analytics collaboration making for better analytic teams, and they provide a repository of analytics which increases long-term leverage in the enterprise. Oh, here’s a fourth advantage: they force analysts to tell stories – meaning they have to engage with the business. That makes this piece of technology a really nice complement to my suggestion about a regular cadence of analytics briefings and a rare instance of technology deepening culture.

 

In-sourcing

What it is: Building analytics expertise internally instead of hiring it out and, most especially, instead of off-shoring it.

Why it builds culture: I’d be the last person to tell you that consulting shouldn’t have a role in the large enterprise. I’ve been a consultant for most of my working life. But we routinely advise our clients to change the way they think about consulting – to use it not as a replacement for an internal capability but as a bootstrap and supplement to that capability. If analytics is core to digital (and it is) and if digital is core to your business (which it probably is), then you need analytics to be part of your internal capability. Having strong, capable, influential on-shore employees who are analysts is absolutely necessary to analytics culture. I’ll add that while off-shoring, too, has a role, it’s a far more effective culture killer than normal consulting. Off-shoring creates a sharp divide between the analyst and the business that is fatal to good performance and good culture on EITHER side.

 

Learning-based Testing Plan

What it is: Testing plans that include significant focus on developing best design practices and resolving political issues instead of on micro-optimizations of the funnel.

Why it works: Testing is a way to make decisions. But as long as its primary use is to decide whether to show image A or image B or a button in this color or that color, it will never be used properly. To illustrate learning-based testing, I’ve used the example of video integration – testing different methods of on-page video integration, different lengths, different content types and different placements against each key segment and use-case to determine UI parameters for ALL future videos. When you test this way, you resolve hundreds of future questions and save endless future debate about what to do with this or that video. That’s learning-based testing. It’s also about picking key places in the organization where political battles determine design – things like home page real-estate and the amount of advertising load on a page – and resolving them with testing; that’s learning-based testing, too. Learning-based testing builds culture in two ways. First, in and of itself, it drives analytic decision-making. Almost as important, it demonstrates the proper role of experimentation and should help set the table for decision-makers to ask for more interesting tests.
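To make that concrete, here’s a minimal sketch of what a learning-based test matrix might look like. The dimensions and values are purely illustrative assumptions, not a prescription:

```python
from itertools import product

# Illustrative dimensions for a learning-based video test.
# The specific values are assumptions for this sketch, not from the post.
lengths = ["30s", "90s", "3min"]
placements = ["above_fold", "below_fold", "modal"]
segments = ["new_visitor", "returning_customer"]

# Each cell becomes a test variant measured against every key segment,
# so the results generalize to ALL future videos, not to one page.
test_matrix = [
    {"length": l, "placement": p, "segment": s}
    for l, p, s in product(lengths, placements, segments)
]

print(f"{len(test_matrix)} cells to test")  # 3 * 3 * 2 = 18
```

The point of the matrix isn’t micro-optimization; it’s that once you’ve run it, you have a reusable answer for every future video decision.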

 

Control Groups

What it is: Use of control groups to measure effectiveness whenever new programs (operational or marketing) are implemented. Control groups use small population subsets, chosen randomly from a target population, who are given either no experience or a neutral (existing) experience instead. Nearly all tests feature a baseline control group as part of the test, but the use of control groups transcends A/B testing tools. Use of control groups is common in traditional direct response marketing and can be used in a wide variety of on- and offline contexts (most especially – as I recently saw Elea Feit of Drexel hammer home at the DAA Symposium – as a much more effective approach to attribution).

Why it works: One of the real barriers to building culture is a classic problem in education. When you first teach students something, they almost invariably use it poorly. That can sour others on the value of the knowledge itself. When people in an organization first start using analytics, they are, quite inevitably, going to fall into the correlation trap. Correlation is not causation. But in many cases, it sure looks like it is and this leads to many, many bad decisions. How to prevent the most common error in analytics? Control groups. Control groups build culture because they get decision-makers thinking the right way about measurement and because they protect the organization from mistakes that will otherwise sour the culture on analytics.
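If you want to see how simple the core idea is, here’s a toy sketch of a control-group lift read. The numbers are fabricated, and the two-proportion z-test is just one reasonable way to check that the difference isn’t noise:

```python
import math

def lift_vs_control(conv_treated, n_treated, conv_control, n_control):
    """Compare a new program against a randomly held-out control group.

    Returns the incremental lift and a two-proportion z-score; a large
    |z| suggests the difference is unlikely to be random noise.
    """
    p_t = conv_treated / n_treated
    p_c = conv_control / n_control
    p_pool = (conv_treated + conv_control) / (n_treated + n_control)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_treated + 1 / n_control))
    return p_t - p_c, (p_t - p_c) / se

# Hypothetical numbers: 50,000 customers got the new program; a random
# 2,500-person holdout saw the existing experience instead.
lift, z = lift_vs_control(2600, 50_000, 110, 2_500)
print(f"lift={lift:.4f}, z={z:.2f}")
```

Because the holdout was randomized, the lift is causal, not correlational. That is exactly the trap the control group exists to avoid.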

 

Unified Success Framework

What it is: A standardized, pre-determined framework for content and campaign success measurement that includes definition of campaign types, description of key metrics for those types, and methods of comparing like campaigns on an apples-to-apples basis.

Why it works: You may not be able to make the horse drink, but leading it to water is a good start. A unified success framework puts rigor around success measurement – a critical part of building good analytics culture. On the producer side, it forces the analytics team to make real decisions about what matters and, one hopes, pushes them to prove that proxy measures (such as engagement) are real. On the consumer side, it prevents that most insidious destroyer of analytics culture, the post hoc success analysis. If you can pick your success after the game is over, you’ll always win.
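As a rough illustration, a unified success framework can be as simple as a pre-declared mapping from campaign type to KPI and baseline, indexed apples-to-apples. The types, KPIs and baseline numbers below are invented for this sketch:

```python
# A sketch of a unified success framework: campaign types, their
# pre-declared KPIs, and an apples-to-apples index against each type's
# historical baseline. All types, KPIs and baselines are illustrative.
FRAMEWORK = {
    "acquisition": {"kpi": "cost_per_new_customer", "baseline": 42.0, "lower_is_better": True},
    "retention":   {"kpi": "repeat_purchase_rate",  "baseline": 0.18, "lower_is_better": False},
}

def success_index(campaign_type: str, observed: float) -> float:
    """Index a campaign against its type's baseline (100 = baseline).

    Because the KPI is fixed before launch, nobody gets to pick a
    flattering metric afterwards -- no post hoc success analysis.
    """
    spec = FRAMEWORK[campaign_type]
    ratio = (spec["baseline"] / observed if spec["lower_is_better"]
             else observed / spec["baseline"])
    return round(100 * ratio, 1)

print(success_index("acquisition", 38.5))  # beat baseline cost -> 109.1
```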

 

The Enterprise VoC Dashboard

What it is: An enterprise-wide state-of-the-customer dashboard that provides a snapshot and trended look at how customer attitudes are evolving. It should include built-in segmentation so that attitudinal views are ALWAYS shown sliced by key customer types, with additional segmentation possible.

Why it works: There are so many good things going on here that it’s hard to enumerate them all. First, this type of dashboard is one of the best ways to instill customer-first thinking in the organization. You can’t think customer-first until you know what the customer thinks. Second, this type of dashboard enforces a segmented view of the world. Segmentation is fundamental to critical thinking about digital problems and this sets the table for better questions and better answers in the organization. Third, opinion data is easier to absorb and use than behavioral data, making this type of dashboard particularly valuable for encouraging decision-makers to use analytics.
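Under the hood, the data shape is simple. Here’s a toy sketch (fabricated numbers, pandas for illustration) of the always-segmented, always-trended view such a dashboard serves up:

```python
import pandas as pd

# Sketch of the data behind a VoC dashboard: attitudinal scores always
# sliced by customer type and trended by quarter. Data is fabricated.
voc = pd.DataFrame({
    "quarter": ["2016Q1"] * 4 + ["2016Q2"] * 4,
    "segment": ["new", "loyal", "new", "loyal"] * 2,
    "metric":  ["satisfaction", "satisfaction", "nps", "nps"] * 2,
    "score":   [7.8, 8.4, 12, 31, 7.5, 8.6, 9, 35],
})

# Trended, segmented view -- never a single blended number.
dashboard = voc.pivot_table(index=["metric", "segment"],
                            columns="quarter", values="score")
print(dashboard)
```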

 

Two-Tiered Segmentation

What it is: A method that creates two levels of segmentation in the digital channel. The first level is the traditional “who” someone is – whether in terms of persona or business relationship or key demographics. The second level captures “what” they are trying to accomplish. Each customer touch-point can be described in this type of segmentation as the intersection of who a visitor is and what their visit was for.

Why it works: Much like the VoC Dashboard, Two-Tiered Segmentation makes for dramatically better clarity around digital channel decision-making and evaluation of success. Questions like ‘Is our Website successful?’ get morphed into the much more tractable and analyzable question ‘Is our Website successful for this audience trying to do this task?’. That’s a much better question, and a big part of building analytics culture is getting people to ask better questions. This also happens to be the main topic of my book “Measuring the Digital World”, in which you can get a full description of both the power and the methods behind Two-Tiered Segmentation.
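Here’s a minimal sketch of the idea in code. The data is fabricated, but it shows how the who/what intersection turns one vague success number into a grid of answerable questions:

```python
import pandas as pd

# Two-tiered segmentation: every visit is classified by WHO the visitor
# is and WHAT they came to do. The rows below are fabricated examples.
visits = pd.DataFrame({
    "who":     ["prospect", "prospect", "customer", "customer", "customer"],
    "what":    ["research", "buy",      "support",  "buy",      "support"],
    "success": [1,          0,          1,          1,          0],
})

# "Is the site successful?" becomes "successful for this who/what cell?"
by_segment = visits.groupby(["who", "what"])["success"].agg(["mean", "count"])
print(by_segment)
```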

 

I have more, but I’m going to roll the rest into my next post on building an agile organization since they are all deeply related to the integration of capabilities in the organization. Still, that’s fifteen different tactics for building culture. None of which include mission statements, organizational alignment or C-Level support (okay, Walking the Walk is kind of that but not exactly and I didn’t include it in the fifteen) and none of which will take place in corporate retreats or all-hands conferences. That’s a good thing and makes me believe they might actually work.

Ask yourself this: is it possible to imagine an organization that does even half these things and doesn’t have a great analytics culture? I don’t think it is. Because culture just is the sum of the way your organization works and these are powerful drivers of good analytic thinking. You can imagine an organization that does these things and isn’t friendly, collaborative, responsible, flat, diverse, caring or even innovative. There are all kinds of culture, and good decision-making isn’t the only aspect of culture to care about*. But if you do these things, you will have an organization that makes consistently good decisions.

*Incidentally, if you want to build culture in any of these other ways, you have to think about similar approaches. Astronomers have a clever technique for seeing very faint objects called averted vision. The idea is that you look just to the side of the object if you want to get the most light-gathering power from your eyes. It’s the same with culture. You can’t tackle it head-on by talking about it. You have to build it just a little from the side!

Practical Steps to Building an Analytics Culture

Building an analytics culture in the enterprise is incredibly important. It’s far more important than any single capability, technology or technique. But building culture isn’t easy. You can’t buy it. You can’t proclaim it. You can’t implement it.

There is, of course, a vast literature on building culture in the enterprise. But if the clumsy, heavy-handed, thoroughly useless attempts to “build culture” that I’ve witnessed over the course of my working life are any evidence, that body of literature is nearly useless.

Here’s one thing I know for sure: you don’t build culture by talk. I don’t care whether it’s getting teenagers to practice safe sex or getting managers to use analytics: preaching virtue doesn’t work, has never worked and will never work. Telling people to be data-driven, proclaiming your commitment to analytics, touting your analytics capabilities: none of this builds analytics culture.

If there’s one thing that every young employee has learned in this era, it’s that fancy talk is cheap and meaningless. People are incredibly sophisticated about language these days. We can sit in front of the TV and recognize in a second whether we’re seeing a commercial or a program. Most of us can tell the difference between a TV show and a movie almost at a glance. We can tune out advertising on a Website as effortlessly as we put on our pants. A bunch of glib words aren’t going to fool anyone. You want to know what the reaction is to your carefully crafted, strategic-consultancy-driven mission statement or that five-year “vision” you spent millions on and just rolled out with a cool video at your Sales Conference? Complete indifference.

That’s if you’re lucky…if you didn’t do it really well, you got the eye-roll.

But it isn’t just that people are incredibly sensitive – probably too sensitive – to BS. It’s that even true, sincere, beautifully reasoned words will not build culture. Reading moral philosophy does not create moral students. Not because the words aren’t right or true, but because behaviors are, for the most part, not driven by those types of reasons.

That’s the whole thing about culture.

Culture is lived, not read or spoken. To create it, you have to ingrain it in people’s thinking. If you want a data-driven organization, you have to create good analytic habits. You have to make the organization (and you too) work right.

How do you do that?

You do it by creating certain kinds of process and behaviors that embed analytic thinking. Do enough of that, and you’ll have an analytic culture. I guarantee it. The whole thrust of this recent series of posts is that by changing the way you integrate analytics, voice-of-customer, journey-mapping and experimentation into the enterprise, you can drive better digital decision making. That’s building culture. It’s my big answer to the question of how you build analytics culture.

But I have some small answers as well. Here, in no particular order, are practical ways you can create good analytics habits in the enterprise.

Analytic Reporting

What it is: Changing your enterprise reporting strategy by moving from reports to tools. Analytic models and forecasting allow you to build tools that integrate historical reporting with forecasting and what-if capabilities. Static reporting is replaced by a set of interactive tools that allow users to see how different business strategies actually play out.

Why it builds analytics culture: With analytic reporting, you democratize knowledge, not data. It makes all the difference in the world. The analytic models capture your best insight into how a key business works and what levers drive performance. Building this into tools not only operationalizes the knowledge, it creates positive feedback loops to analytics. When the forecast isn’t right, everyone knows it, and the business is incented to improve its understanding and predictive capabilities. This makes for better culture in analytics consumers and analytics producers.
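A trivial sketch of the difference: instead of a static table, the “report” is a function the business can interrogate. The model and coefficients below are placeholders, not a real fit:

```python
# A toy report-as-tool: the user moves a lever (paid media spend) and
# the forecast responds. All coefficients are assumed placeholders for
# whatever model your analysts actually fit.
BASE_VISITS = 120_000          # organic baseline per month (assumed)
CONV_RATE = 0.021              # fitted conversion rate (assumed)
PAID_VISITS_PER_DOLLAR = 0.9   # fitted media response (assumed)

def forecast_orders(paid_spend: float) -> float:
    """What-if forecast: orders as a function of paid media spend."""
    visits = BASE_VISITS + PAID_VISITS_PER_DOLLAR * paid_spend
    return visits * CONV_RATE

for spend in (0, 50_000, 100_000):
    print(f"${spend:>7,} -> {forecast_orders(spend):,.0f} orders")
```

When the real numbers come in under the forecast, the model is visibly wrong, and someone has to figure out why. That feedback loop is the culture-building part.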

 

Cadence of Communications

What it is: Setting up regular briefings between analytics and your senior team and decision-makers. This can include review of dashboards but should primarily focus on answers to previous business questions and discussion of new problems.

Why it builds analytics culture: This is actually one of the most important things you can do. It exposes decision-makers to analytics. It makes it easy for decision-makers to ask for new research and exposes them to the relevant techniques. Perhaps even more important, it lets decision-makers drive the analytics agenda, exposes analysts to real business problems, and forces analysts to develop better communication skills.

 

C-Suite Advisor

What it is: Create an Analytics Minister-without-portfolio whose sole job is to advise senior decision-makers on how to use, understand and evaluate the analytics, the data and the decisions they get.

Why it builds analytics culture: Most senior executives are fairly ignorant of the pitfalls in data interpretation and the ins-and-outs of KPIs and experimentation. You can’t send them back to get a modern MBA, but you can give them a trusted advisor with no axe to grind. This not only raises their analytics intelligence, it forces everyone feeding them information to up their game as well. This tactic is also critical because of the next strategy…

 

Walking the Walk

What it is: Senior Leaders can talk till they are blue in the face about data-driven decision-making. Nobody will care. But let a Senior Leader even once use data or demand data around a decision they are making, and the whole organization will take notice.

Why it builds analytics culture: Senior leaders CAN and DO have a profound impact on culture but they do so by their behavior not their words. When the leaders at the top use and demand data for decisions, so will everyone else.

 

Tagging Standards

What it is: A clearly defined set of data collection specifications that ensure that every piece of content on every platform is appropriately tagged to collect a rich set of customer, content, and behavioral data.

Why it builds analytics culture: This ends the debate over whether tags and measurement are optional. They aren’t. This also, interestingly, makes measurement easier. Sometimes, people just need to be told what to do. This is like choosing which side of the road to drive on – it’s far more important that you have a standard than which side of the road you pick. Standards are necessary when an organization needs direction and coordination. Tagging is a perfect example.
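In practice, a tagging standard is just a contract that can be checked. Here’s a toy sketch; the required fields are hypothetical, not any vendor’s spec:

```python
# Sketch of a tagging standard as an enforceable data layer contract.
# Field names here are hypothetical illustrations.
REQUIRED_FIELDS = {
    "page_type": str,        # e.g. "product", "support", "checkout"
    "content_id": str,
    "visitor_segment": str,
    "platform": str,         # "web", "ios", "android"
}

def validate_data_layer(data_layer: dict) -> list[str]:
    """Return a list of violations; an empty list means compliant."""
    errors = []
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in data_layer:
            errors.append(f"missing: {field}")
        elif not isinstance(data_layer[field], ftype):
            errors.append(f"wrong type: {field}")
    return errors

print(validate_data_layer({"page_type": "product", "content_id": "sku-123"}))
```

Run a check like this in your build or QA pipeline and “did we tag it?” stops being a debate.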

 

CMS and Campaign Meta-Data

What it is: The definition of and governance around the creation of campaign and content meta-data. Every piece of content and every campaign element should have detailed, rich meta-data around the audience, tone, approach, contents, and every other element that can be tuned and analyzed.

Why it builds analytics culture: Not only is meta-data the key to digital analytics – providing the meaning that makes content consumption understandable – but rich meta-data definition also guides useful thought. These are the categories people will think about when they analyze content and campaign performance. That’s as it should be, and by providing these pre-built, populated categorizations, you’ll greatly facilitate good analytics thinking.
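A governed vocabulary is the heart of it. Here’s an illustrative sketch (field names and values are assumptions, not a real taxonomy) of what enforcement might look like:

```python
# Illustrative campaign meta-data schema with controlled vocabularies:
# the pre-built categories analysts will later slice performance by.
# All field names and values are assumptions for this sketch.
VOCAB = {
    "audience": {"prospect", "customer", "partner"},
    "tone": {"informational", "promotional", "emotional"},
    "approach": {"video", "long_form", "interactive"},
}

def tag_asset(asset_id: str, **meta: str) -> dict:
    """Attach governed meta-data to a content or campaign element."""
    for field, value in meta.items():
        if value not in VOCAB.get(field, set()):
            raise ValueError(f"{field}={value!r} is not in the governed vocabulary")
    return {"asset_id": asset_id, **meta}

print(tag_asset("cmp-001", audience="prospect", tone="emotional", approach="video"))
```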

 

Rapid VoC

What it is: The technical and organizational capability to rapidly create, deploy and analyze surveys and other voice-of-customer research instruments.

Why it builds analytics culture: This is the best capability I know for training senior decision-makers to use research. It’s so cheap, so easy, so flexible and so understandable that decision-makers will quickly get spoiled. They’ll use it over and over and over. Well – that’s the point. Nothing builds analytics muscle like use and getting this type of capability deeply embedded in the way your senior team thinks and works will truly change the decision-making culture of the enterprise.

 

SPEED and Formal Continuous Improvement Cycles

What it is: The use of a formal methodology for digital improvement. SPEED provides a way to identify the best opportunities for digital improvement, the ways to tackle those opportunities, and the ability to measure the impact of any changes. It’s the equivalent of Six Sigma for digital.

Why it builds analytics culture: Formal methods make it vastly easier for everyone in the organization to understand how to get better. Methods also help define a set of processes that the enterprise can build its organization around. This makes it easier to grow and scale. For large enterprises, in particular, it’s no surprise that formal methodologies like Six Sigma have been so successful. They make key cultural precepts manifest and attach processes to them so that organizational inertia is guided in positive directions.

 

Does this seem like an absurdly long list? In truth, I’m only about halfway through. But this post is getting LONG. So I’m going to save the rest of my list for next week. Till then, here are some final thoughts on creating an analytics culture.

The secret to building culture is this: everything you do builds culture. Some things build the wrong kind of culture. Some things the right kind. But you are never not building culture. So if you want to build the right culture to be good at digital and decision-making, there’s no magic elixir, no secret sauce. There is only the discipline of doing things right. Over and over.

That being said, not every action is equal. Some foods are empty of nutrition but empty, too, of harm. Others positively destroy your teeth or your waistline. Still others provide the right kind of fuel. The things I’ve described above are not just a random list of things done right, they are the small to medium things that, done right, have the biggest impacts I’ve seen on building a great digital and analytics culture. They are also targeted to places and decisions which, done poorly, will deeply damage your culture.

I’ll detail some more super-foods for analytics culture in my next post!

 

[Get your copy of Measuring the Digital World – the definitive guide to the discipline of digital analytics – to learn more].

Building and Measuring Analytics Culture

Culture – how to measure it and how to build it – has been much on my mind lately.

At least when it comes to the measurement part – something we don’t normally have to do – the reason is…different.

My Counseling Family team is going to be doing another fun project – participating in the 538 Oscar Modeling challenge – and our approach is to try and model each nominated movie’s fit to the current Hollywood zeitgeist. The theory behind the approach is simple. It seems fairly reasonable to suggest that while qualitative differences between nominated movies and non-nominated movies might be fairly large, when it comes to selecting between a small set of relatively high-quality choices the decision is fairly arbitrary. In such situations, political and personal concerns will play a huge role, but so, presumably, will simple preference. Our thought is that preference is likely more a function of worldview than artistry – in much the same manner that people watching a political debate almost always believe that the person who most nearly echoes their opinion-set won.

But how do you measure the cultural fit of a movie to a community? It’s no easy task. Our challenges include deciding how to capture a cultural zeitgeist in general, how to focus that capture on Hollywood, how to capture the spirit and themes of each movie, and how to score a match. And, of course, there is the challenge that the Hollywood zeitgeist might be more hot air than great wind and altogether too thin to be captured!
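Just to show the flavor of scoring a match, here’s one naive approach: TF-IDF cosine similarity between a movie description and a stand-in zeitgeist corpus. This is emphatically a sketch, not our actual method, and every text below is a placeholder:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder "zeitgeist" text and movie descriptions. A real model
# would need far richer sources on both sides of the comparison.
zeitgeist = ["inequality survival redemption journalism corruption frontier"]
movies = {
    "Movie A": "a journalist uncovers institutional corruption",
    "Movie B": "a lone survivor endures the brutal frontier",
}

vec = TfidfVectorizer()
matrix = vec.fit_transform(zeitgeist + list(movies.values()))

# Score each movie's fit as cosine similarity to the zeitgeist vector.
scores = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
for title, score in zip(movies, scores):
    print(f"{title}: fit={score:.2f}")
```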

Should be interesting.

Equally, though, I have been thinking a lot about how to build culture – specifically when it comes to analytics. A constant theme running through my recent posts on enterprise transformation has been the challenge of doing digital well in the large enterprise. As I’ve repeatedly pointed out, that challenge is less a matter of technology or people than it is of organization and culture. Enterprises have and understand the core capabilities to do digital well. Listen to my recent video series and you’ll hear this refrain:

  • The typical enterprise does analytics, they just don’t use it.
  • They have testing, they just don’t learn.
  • They talk voice-of-customer, they just don’t listen.
  • They do Agile, but they aren’t.

To fix those problems requires changes in the way the organization is structured and the way those capabilities are done. Even more, it requires changes in the way people need to think. That’s culture.

So I’ve been pondering how to build a culture of analytics driven decision-making and, of course, how to measure whether you’re successful. Now while my particular problem – building the proper sort of enterprise for digital transformation – may not be the standard one, the problems and challenges of building culture and measuring culture are hardly unique. And since this isn’t my specialty at all, I’ve been trying to read up on common approaches.

By and large, it’s pretty disappointing.

From a building-culture perspective, so much of the literature seems to focus on top-down approaches: ways that a senior leader can communicate and encourage cultural change. That’s clearly important and I’m not going to dispute either the need for or the challenges of top-down change. But this type of approach often seems to degenerate into leadership self-help advice or cheerleading, neither of which seems likely to be useful. Nor am I much impressed by the idea of carefully crafting a mission statement and promulgating it through the organization. I’ve sat in more than one excruciating mission statement meeting and all I ever got out of it was a sore butt. I’ve said before that if you have to create an innovation capability in your enterprise, you’re already defeated. And if you’re looking to a carefully crafted corporate mission statement to provide a shared vision, you’ve already lost your way.

I wasn’t much more impressed with attempts to measure culture.

It’s hard, obviously.

Most approaches seem to rely on survey instruments and involve categorization of the organization into pre-defined types (e.g. hierarchical) or scoring of a number of separate variables (e.g. perceived alignment of vision). This seems like a close cousin to personality measurement tests. Lots of people love these tests, but they don’t strike me as particularly rigorous.

With regards to categorization, in particular, I’m skeptical that it means much and very skeptical that it might be a useful trigger to action. I can see value – and perhaps even triggers to action – in learning that there are differing perceptions of organization mission or differing perceptions around how aligned the organization is. It’s easy to fool yourself with a view from the top and this type of cultural survey instrument might help correct attitudes of corporate complacency. It’s much less clear to me, however, that such measurement would be a useful part of a continuous program designed to improve analytics (or any other) culture.

I’d very much like to see measures of culture that are behavioral and amenable to continuous measurement, but at least so far I haven’t come across anything very interesting.

It may be that culture is one of those things that is so challenging to measure that the subsequent benefits in clarity and decision-making – at least outside the world of academia – aren’t worth the investment. Perhaps the best way to measure culture is by digital success. If it works, it’s working. You could easily take that point of view from this extended series on digital transformation and I don’t think it’s implausible.

Maybe we just haven’t found the right methods.

Or maybe I just haven’t read the right articles. Indeed, if you have thoughts on either of these issues (how to build or measure culture) or can point me to interesting research, I’d love to hear about it.

Right now, I have more ideas about how to build analytics culture than I do how to measure your success building it. Some of those ideas are implicit in the recommendations I’ve been making about the integration of analytics, the use of voice-of-customer and the role of experimentation, but they don’t end there.

In my next post, I’ll explain some concrete actions the enterprise can take to build analytics culture and why I think they are both more practical and more impactful than most of what passes for culture building.

At the same time, I’m going to be thinking more about measuring culture and I hope – eventually – to have something interesting to say about that too.

 

Measuring the Digital World is OFFICIALLY RELEASED. Order here – and just let me know if you’d like an autographed copy!