
Segmentation is the Key to Marketing Analytics

The equation in retail today is simple. Evolve or die. But if analytics is one of the core tools to drive successful  evolution, we have a problem. From an analytics perspective, we’re used to a certain view of the store. We know how many shoppers we get (door counting) and we know what we sold. We know how many Associates we had. We (may) know what they sold. This isn’t dog food. If you had to pick a very small set of metrics to work with to optimize the store, most of these would belong. But we’re missing a lot, too. We’re missing almost any analytic detail around the customer journey in the store. That’s a particularly acute lack (as I noted in my last post) in a world where we’re increasingly focused on delivering (and measuring) better store experiences. In a transaction-focused world, transactions are the key measures. In an experience world? Not so much. So journey measurement is a critical component of today’s store optimization. And there’s the problem. Because the in-store measurement systems we have available are tragically limited. DM1, our new platform, is designed to fix that problem.

People like to talk about analytics as if it just falls out of data. As if analysts can take any data set and any tool and somehow make a tasty concoction. It isn’t true. Analytics is hard work. A really great analyst can work wonders, but some data sets are too poor to use. Some tools lock away the data or munge it beyond recognition.  And remember, the most expensive part of analytics is the human component. Why arm those folks with tools that make their job slow and hard? Believe me, when it comes to getting value out of analytics, it’s hard enough with good tools and good data. You can kid yourself that it’s okay to get by with less. But at some point you’re just flushing your investment and your time away. In two previous posts, I called out a set of problems with the current generation of store customer measurement systems. Sure, every system has problems – no analytics tool is perfect. But some problems are much worse than others. And some problems cripple or severely limit our ability to use journey data to drive real improvement.

When it comes to store measurement tools, here are the killers: lack of segmentation, lack of store context, inappropriate analytics tools, inability to integrate Associate data and interactions, inability to integrate into the broader analytics ecosystem and an unwillingness to provide cleaned, event-level data that might let analysts get around these other issues.

Those are the problems we set out to solve when we built DM1.

Let’s start with Segmentation. Segmentation can sound like a fancy add-on. A nice to have. Important maybe, but not critical.

That isn’t right. Marketing analytics just is segmentation. There is no such thing as an average customer. And when it comes to customer journeys, trying to average them makes them meaningless. One customer walks in the door, turns around and leaves. Another lingers for twenty minutes, shopping intensively in two departments. Averaging the two? It means nothing.

Almost every analysis you’ll do, every question you’ll try to answer about store layout, store merchandising, promotion performance, or experience will require you to segment. To be able to look at just the customers who DID THIS. Just the customers who experienced THAT.

Think about it. When you build a new experience, and want to know how it changed behavior you need to segment. When you change a greeting script or adjust a presentation and want to know if it improved store performance you need segmentation. When you change Associate interaction strategies and want to see how it’s impacting customer behavior you need segmentation. When you add a store event and want to see how it impacted key sections, you need segmentation. When you want to know what other stuff shoppers interested in a category cared about, you need segmentation. When you want to know how successful journeys differed from unsuccessful ones, you need segmentation. When you want to know what happens with people who do store pickup or returns, you need segmentation.

In other words, if you want to use customer journey tracking tools for tracking customer journeys, you need segmentation.

If your tool doesn’t provide segmentation and it doesn’t give the analyst access to the data outside its interface, you’re stuck. It doesn’t matter how brilliant you are. How clever. Or how skilled. You can’t manufacture segmentation.

Why don’t most tools deliver segmentation?

If it’s so important, why isn’t it there? Supporting segmentation is actually kind of hard. Most reporting systems work by aggregating the data. They add it up by various dimensions so that it can be collapsed into easily accessible chunks delivered up into reports. But when you add segmentation into the mix, you have to chunk every metric by every possible combination of segments. It’s messy and it often expands the data so much that reports take forever to run. That’s not good either.
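To see how fast that explosion gets out of hand, here’s a back-of-the-envelope count. Every dimension and segment cardinality below is invented purely for illustration, not taken from any real deployment:

```python
# Rough illustration of pre-aggregation blow-up. Every combination of reporting
# dimensions gets a stored cell; adding segments multiplies the cell count again.
dimensions = {"store": 50, "area": 40, "day": 365, "hour": 14}
segments = {"visited_area": 40, "dwell_bucket": 5, "purchased": 2, "associate_interaction": 2}

base_cells = 1
for cardinality in dimensions.values():
    base_cells *= cardinality

segmented_cells = base_cells
for cardinality in segments.values():
    segmented_cells *= cardinality

print(f"cells without segments: {base_cells:,}")                   # 10,220,000
print(f"cells with segment pre-aggregation: {segmented_cells:,}")  # 8,176,000,000
```

That multiplicative growth is exactly why most pre-aggregating reporting systems either skip segmentation entirely or slow to a crawl when they try to support it.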

We engineered DM1 differently. In DM1, all the data is stored in memory. What does that mean? You know how on your PC, when you save something to disk or first load it from the hard drive it takes a decent chunk of time? But once it’s loaded everything goes along just fine? That’s because memory is much faster than disk. So once your PowerPoint or spreadsheet is loaded into memory, things run much faster. With DM1, your entire data set is stored in-memory. Every record. Every journey. And because it’s in-memory, we can scan all of your data for every query, really fast. But we didn’t stop there. When you run a query on DM1, that query is split into lots of chunks, each handled by its own thread that processes a small range of data – usually a day or two. The threads then combine their answers and deliver the result back to you.

That means DM1 not only delivers reports almost instantaneously, it can also run even pretty complex queries without pre-aggregating anything and without having to worry about performance. Things like…segmentation.
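For the curious, here’s a toy sketch of that query pattern in Python: journeys held in memory and partitioned by day, one worker per partition, partial counts merged at the end. It’s purely illustrative (it is not DM1’s actual engine), and the record fields and names are assumptions:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
from datetime import date, timedelta

# Hypothetical in-memory store: {date: [journey_record, ...]}, loaded once at startup.
journeys_by_day = {}

def scan_partition(day, predicate):
    """Scan a single day's journeys and count those matching the segment predicate."""
    return Counter(matched=sum(1 for j in journeys_by_day.get(day, []) if predicate(j)))

def run_query(start, end, predicate, workers=8):
    """Fan the scan out across day partitions, then merge the partial results."""
    days = [start + timedelta(days=n) for n in range((end - start).days + 1)]
    total = Counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(lambda d: scan_partition(d, predicate), days):
            total += partial
    return total

# Example: count March visits that lingered in a (hypothetical) "Jackets" area on a weekend.
result = run_query(date(2017, 3, 1), date(2017, 3, 31),
                   lambda j: "Jackets" in j.get("areas", ()) and j.get("weekend", False))
print(result)
```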

Segmentation and DM1

In DM1, you can segment on quite a few different things. You can segment on where in the store the shopper spent time. You can segment on how much time they spent. You can segment on their total time in the store. You can segment on when they shopped (both by day of week and time of day). You can segment on whether they purchased or not. And even whether they interacted with an Associate.

If, for example, you want to understand potential cross-sells, you can apply a segment that selects only visitors who spent a significant amount of time shopping in a section or department. Actually, this undersells the capability because it’s in no way limited to any specific type of store area. You can segment on any store area down to the level of accuracy achieved by the collection architecture.

What’s more, DM1 keeps track of historical meta-data for every area of the store. Meaning that even if you changed, moved or re-sized an area of the store, DM1 still tracks and segments on it appropriately.

So if you want to see what else shoppers who looked at, for example, Jackets also considered, you can simply apply the segmentation. It will work correctly no matter how many times the area was re-defined. It will work even in store roll-ups with fundamentally different store types. And with the segment applied, you can view any DM1 visualization, chart or table. So you can look at where else Jacket Shoppers passed through, where they lingered, where they engaged more deeply, what else they were likely to buy, where they exited from, where they went first, where they spent the most time, etc. etc. You can even answer questions such as whether shoppers in Jackets were more or less likely to interact with Sales Associates in that section or another.
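To make the historical metadata idea concrete, here’s a minimal sketch of effective-dated area definitions. The zone ids, names and dates are invented; the point is just that a raw zone reading resolves to whatever logical area was in effect on the visit date, so a “Jackets” segment keeps working after the floor plan changes:

```python
from datetime import date

# Hypothetical area history: each physical zone maps to a logical area name
# for a date range. Zone ids and dates below are made up for illustration.
AREA_HISTORY = [
    # (zone_id, logical_area, valid_from, valid_to)
    ("Z-14", "Jackets",   date(2016, 1, 1),  date(2016, 9, 30)),
    ("Z-14", "Outerwear", date(2016, 10, 1), date(9999, 12, 31)),
    ("Z-22", "Jackets",   date(2016, 10, 1), date(9999, 12, 31)),
]

def logical_area(zone_id, visit_date):
    """Resolve a raw zone reading to the logical area in effect on that date."""
    for zid, name, start, end in AREA_HISTORY:
        if zid == zone_id and start <= visit_date <= end:
            return name
    return None

def zones_for(area_name, visit_date):
    """All physical zones that carried this logical area on a given date."""
    return [zid for zid, name, start, end in AREA_HISTORY
            if name == area_name and start <= visit_date <= end]

print(logical_area("Z-14", date(2016, 6, 1)))   # Jackets (before the re-layout)
print(zones_for("Jackets", date(2017, 2, 1)))   # ['Z-22'] (after the re-layout)
```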

Want to see if Jacket shoppers are different on weekdays and weekends? If transactors are different from browsers? If having an Associate interaction significantly increases browse time? Well, DM1 lets you stack segments. So you can choose any other filter type and apply it as well. I think the Day and Time-part segmentations are particularly cool (and unusual). They let you seamlessly focus on morning shoppers or late-afternoon shoppers, weekend shoppers, or even just shoppers who come in over lunchtime. Sure, with door-counting you know your overall store volume. But with day and time-part segmentation you know volume, interest, consideration, and attribution for every measured area of the store and every type of customer for every hour and day of week.
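Here’s what stacked segments might look like expressed directly against a per-visit table. This is a hedged pandas sketch with made-up column names, not DM1’s interface, but it shows the idea: each filter type contributes a mask and the masks are combined:

```python
import pandas as pd

# Hypothetical per-visit summary table; column names are invented for illustration.
visits = pd.DataFrame([
    {"visit_id": 1, "areas": {"Jackets", "Footwear"}, "weekend": True,  "purchased": True,  "associate": True},
    {"visit_id": 2, "areas": {"Jackets"},             "weekend": False, "purchased": False, "associate": False},
    {"visit_id": 3, "areas": {"Denim"},               "weekend": True,  "purchased": True,  "associate": False},
])

# Stacked segments: one boolean mask per filter type, ANDed together.
jacket_shoppers = visits["areas"].apply(lambda a: "Jackets" in a)
weekend_visits  = visits["weekend"]
with_associate  = visits["associate"]

segment = visits[jacket_shoppers & weekend_visits & with_associate]
print(segment[["visit_id", "purchased"]])
```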

DM1’s segmentation capability makes it easy to see whether merchandise is grouped appropriately. How different types of visitor journeys play out. Where promotional opportunities exist. And how and where the flow of traffic contradicts the overall store layout or associate plan. For identified shoppers, it also means you can create extraordinarily rich behavioral profiles that capture in near real-time what a shopper cares about right now.

It comes down to this. Without segmentation, analytics solutions are just baby toys. Segmentation is what makes them real marketing tools.

The Roadmap

DM1 certainly delivers far more segmentation than any other product in this space. But it’s still quite a bit short of what I’d like to deliver. I mean it when I say that segmentation is the heart and soul of marketing analytics. A segmentation capability can never be too robust.

Not only do we plan to add even more basic segmentation options to DM1, we’ve also roadmapped a full segmentation builder (of the sort that the more recent generation of digital analytics tools includes). Our current segmentation interface is simple. Implied “ors” within a category and implied “ands” across segmentation types. That’s by far the most common type of segmentation analysts use. But it’s not the only kind that’s valuable. More advanced logic and groupings, customized thresholds, and time-based concepts (visited before / after) are all valuable for certain types of analysis.
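For reference, the current implied-OR / implied-AND behavior can be expressed in a few lines. This is a hypothetical sketch of the evaluation logic, with invented field names, not DM1’s actual code:

```python
# Values chosen within one segmentation type are ORed; the types themselves are ANDed.
def matches(visit, spec):
    """spec maps a segmentation type to the set of accepted values, e.g.
    {"area": {"Jackets", "Footwear"}, "daypart": {"weekend"}}"""
    for seg_type, accepted in spec.items():
        values = visit.get(seg_type, set())
        values = values if isinstance(values, set) else {values}
        if not (values & accepted):   # implied OR within the type
            return False              # implied AND across types
    return True

visit = {"area": {"Jackets", "Checkout"}, "daypart": {"weekend"}, "purchased": {True}}
print(matches(visit, {"area": {"Jackets", "Footwear"}, "daypart": {"weekend"}}))  # True
```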

I’ve also roadmapped basic machine learning to create data-driven segmentations and a UI that provides a more persona-based approach to understanding visitor types and tracking them as cohorts.

The beauty of our underlying data structures is that none of this is architecturally a challenge. Creating a good UI for building segmentations is hard. But if you can count on high-performance processing of event-level detail in your queries (and by high-performance I mean sub-second – check out my demos if you don’t believe me), you can support really robust segmentation without having to worry about the data engine or the basic performance of queries. That’s a luxury I plan to take full advantage of in delivering a product that segments. And segments. And segments again.

In-Store Customer Journey Tracking: Can You Really Do This?

When I describe my new company Digital Mortar to folks, the most common reaction I get is: “Can you really do this?”

Depending on their level of experience in the field, that question has one of two meanings. If they haven’t used existing in-store customer tracking solutions, the question generally means: is the technology practical and is it actually OK to use it (i.e. can it be used without violating privacy policies)? If they have experience with existing in-store customer tracking solutions, what they mean is: “does your stuff actually work as opposed to the garbage I’ve been using?”

I’m going to tackle the first question today (is the technology practical and legal) and leave the second for next time.

Is the Technology Practical?

Yes. As my post last week made clear, the various technologies for in-store customer tracking have challenges. Data quality is a real problem. There are issues with positional accuracy, visitorization, journey tracking, and even basic reliability. This is still cutting-edge or even bleeding-edge technology. It’s like digital analytics circa 2005, not digital analytics circa 2017. But the technologies work. They can be deployed at scale and for a reasonable cost. The data they provide needs careful cleaning and processing. But so does almost any data set. If chosen appropriately and implemented well, the technologies provide data that is immediately valuable and can drive true continuous improvement in stores.

How Hard is it to Deploy In-Store Tracking?

Unfortunately, the in-store customer tracking technologies that don’t require at least some physical in-store installation (Wi-Fi Access Point based measurement and piggybacking off of existing security cameras) are also the least useful. Wi-Fi measurement is practical for arenas, airports, malls and other very large spaces with good Wi-Fi opt-in rates. For stores, it just doesn’t work well enough to support serious measurement. Security cameras can give you inaccurate, zone-based counts and not much else. Good in-store measurement will require you to install either measurement-focused cameras or passive sniffers. Of the two, sniffers are a lot easier. You need far fewer of them. The placement is easier. The power and cabling requirements are lower. And they are quite a bit cheaper.

Either way, you should expect that it will take a few weeks to plan out the deployment for a new store layout. This will also involve coordination with your installation partner. Typically, the installation is done over one or two evenings. No special closing is required. With sniffers, the impact on the store environment is minimal. The devices are about the size of a deck of playing cards, can be painted to match the environment and any necessary wiring is usually hidden.

After a couple-week shakedown, you’ll have usable measurement and a plan you can roll out to other stores. Subsequent stores with the same or similar layout can be done as quickly as your installation partner will schedule them. And the post-install shakedown period is shorter.

So if you’re planning a Pilot project, here’s the timeline we use at Digital Mortar:

Month 1

  • Select Store Targets: We typically recommend 3 stores in a Pilot – one test and two control stores with similar layout and market.
  • Select Initial Store
  • Design Implementation for the Initial Store
  • Train Installation Partner
  • Do initial 1 store installation

Month 2

  • Test the initial installation and tune plan if necessary
  • Rollout to additional stores
  • Provide initial reporting
  • Targeted analysis to develop store testing plan

Month 3

  • Run initial test(s)
  • Analyze control vs. test
  • Assess findings and make optimization recommendations
  • Evaluate pilot program

This kind of Pilot timeline gets you live, production data early in Month 2 with initial store findings not long after. And it gets you real experience with the type of analysis, testing and continuous improvement cycle that make for effective business use.

Is it Ok to Use Location Analytics?

Yes. In-store tracking technology is already widely used. The majority of major retailers have tried it in various forms. There is an established community of interest focused on privacy and compliance in location analytics (the Future of Privacy Forum) that is supported by the major technology players (including giants like Cisco who do this routinely), major retailers, most of the vendors specific to the space, and plenty of heavy-hitters from a political standpoint. They’ve published guidelines (with input from the FTC) on how to do this. In many respects, the landscape is similar to digital. To do this right, you must have a documented and published privacy policy and you MUST adhere to your own privacy policy. If you promise an online opt-out, you must actually provide and honor it. The same goes for an in-store opt-out. To abide by the privacy standards, you must treat the visitor’s phone MAC address as PII. You must not keep and match the visitor’s MAC address without opt-in, and you should make sure it is hashed or transformed when stored.
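As a concrete illustration of that last point, here’s a minimal sketch of one common approach: storing a keyed hash of the MAC address rather than the address itself. The key and function names are placeholders for illustration, not a description of any particular vendor’s pipeline:

```python
import hashlib
import hmac

# Placeholder secret: in practice this would be managed and rotated securely.
SECRET_KEY = b"rotate-me-and-keep-me-out-of-source-control"

def pseudonymize_mac(mac: str) -> str:
    """Return a stable, non-reversible identifier for a device MAC address."""
    normalized = mac.lower().replace("-", ":").encode("utf-8")
    return hmac.new(SECRET_KEY, normalized, hashlib.sha256).hexdigest()

print(pseudonymize_mac("AA:BB:CC:DD:EE:FF"))
```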

And, of course, in the EU the tracking guidelines are significantly more restrictive.

In almost all respects, this is identical to the use of cookies in the digital world. And, as with the digital world, it’s not hard to see where the blurry lines are. Using in-store customer journey tracking to improve the store is non-controversial – the equivalent of using first-party cookies to analyze and improve a website. Using appropriately described opt-ins to track and market to identified customers is fine as long as the usage is appropriately disclosed. Selling customer information begins to touch on gray areas. And identifying and marketing to users without opt-in using any kind of device fingerprinting is very gray indeed.

Bottom line? In-store customer tracking and location analytics is ready for prime-time. The technologies work. They can be deployed reasonably and provide genuinely useful data. Deployment is non-trivial but is far from back-breaking. And the proper uses of the data are understood and widely accepted.

In my next post, I’ll take up the analytic problems that have crippled existing solutions and explain how we’ve solved them.

The Strategic Uses of In-Store Customer Journey Measurement

Store layout, promotion and staff optimization are the immediate and obvious ways to use the core data from customer journey analytics. Together, they comprise the “you” part of the equation – optimizing your operational and marketing strategies. But the uses of in-store tracking don’t end there. There’s tremendous strategic value in being able to understand customer journeys – a lesson we’ve learned over and over again in digital. When it comes to omni-channel, store and experience design, and the integration of new technologies into the store, you simply can’t do the job right without in-store journey measurement.

I cover the fundamentals of why the in-store journey matters and how to build in-store customer journey data in this new post on Digital Mortar.

 

What is in-store customer journey data for?

In my last post, I described what in-store customer data is. But the really important question is this – what do you do with it? Not surprisingly, in-store customer movement data serves quite a range of needs that I’ll categorize broadly as store layout optimization, promotion planning and optimization, staff optimization, digital experience integration, omni-channel experience optimization, and customer experience optimization. I’ll talk about each in more detail, but you can think about it this way. Half of the utility of in-store customer journey measurement is focused on you – your store, your promotions and your staff. When you can measure the in-store customer journey better, you can optimize your marketing and operations more effectively. It’s that simple. The other half of the equation is about the customer. Mapping customer segments, finding gaps in the experience, figuring out how omni-channel journeys work. This kind of data may have immediate tactical implications, but its real function is strategic. When you understand the customer experience better, you can design better stores, better marketing campaigns, and better omni-channel strategies.

I’m going to cover each area in a short post, starting with the most basic and straightforward (store layout) and moving up into the increasingly strategic uses.

 

Store Layout and Merchandising Optimization

While brick-and-mortar hasn’t had the kind of measurement and continuous improvement systems that drive digital, it has had a long, arduous and fruitful journey to maturity. Store analysts and managers know a lot. And while in-store customer journey measurement can fill in some pretty important gaps, you can do a lot of good store optimization based on a combination of well-understood best practices, basic door-counting, and PoS information. At a high level, retailers understand how product placement drives sales, what the value of an end-cap/feature is, and how shelf placement matters. With PoS data, they also understand which products tend to be purchased together. So what’s missing? Quite a bit, actually, and some of it is pretty valuable. With customer journey data you can do true funnel analysis – not just at the store level (PoS/Door Counting) but at a detailed level within the store. You’ll see the opportunity each store area had, what customer segments made up that opportunity, and how well each section of the store is engaging customers and converting on that opportunity. Funnel analysis forever changed the way people optimized websites. It can do the same for the store. When you make a change, you can see how patterns of movement, shopping and segmentation all shift. You can isolate specific segments of customers (first-time, regular, committed shopper, browser) and see how their product associations and navigation patterns differ. If this sounds like continuous improvement through testing…well, that’s exactly what it is.
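To show what that detailed, within-store funnel looks like, here’s a hedged sketch built on a made-up per-visit, per-area table (the column names and engagement threshold are invented): opportunity is everyone who reached the area, engagement is dwell above a threshold, and conversion is a purchase attributed to the area.

```python
import pandas as pd

# Hypothetical per-visit, per-area events; fields are invented for illustration.
events = pd.DataFrame([
    {"visit_id": 1, "area": "Jackets", "dwell_sec": 240, "purchased_in_area": True},
    {"visit_id": 2, "area": "Jackets", "dwell_sec": 15,  "purchased_in_area": False},
    {"visit_id": 3, "area": "Denim",   "dwell_sec": 90,  "purchased_in_area": False},
])

ENGAGED_THRESHOLD = 60  # seconds of dwell before we call it engagement (illustrative)

funnel = events.groupby("area").agg(
    opportunity=("visit_id", "nunique"),
    engaged=("dwell_sec", lambda s: (s >= ENGAGED_THRESHOLD).sum()),
    converted=("purchased_in_area", "sum"),
)
funnel["engagement_rate"] = funnel["engaged"] / funnel["opportunity"]
funnel["conversion_rate"] = funnel["converted"] / funnel["opportunity"]
print(funnel)
```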

Questions you can Answer

  • How well is each area and section of the store performing?
  • How do different customer segments use the store differently?
  • How effective are displays in engaging customers?
  • How did store layout changes impact opportunity and engagement?
  • Are there underutilized areas of the store?
  • Are store experiences capturing engagement and changing shopping patterns?
  • Are there unusual local patterns of engagement at a particular store?

Next up? Optimizing promotions and in-store marketing campaigns.

 

Why do we need to track customers when we know what they buy?

Digital Mortar is committed to bringing a whole new generation of measurement and analytics to the in-store customer journey. What I mean by that “new generation” is that our approach embodies more complete and far more accurate data collection. I mean that it provides far more interesting and directive reports. And I mean that our analytics will make a store (or other physical space) work better. But how does that happen, and why do we need to track customers inside the store when we know what they buy? After all, it’s not as if traditional stores are unmeasured. Stores have, at minimum, PoS data and store merchandising and operations data. In other words, we know what we had to sell, we know how many people we used to sell it, and we know how much we actually sold (and what, and at what profit).

That stuff is vital and deeply explanatory.

It constitutes the data necessary to optimize assortment, manage (to some extent) staffing needs, allocate staff to areas, and understand which categories are pulling their weight. It can even, with market basket analysis, help us understand which products are associated in customers’ shopping behaviors and can form the basis for layout optimization.

We come from a digital analytics background – analyzing customer experience on eCommerce sites, we often had a similar situation. The back-office systems told us which products were purchased, which were bought together, which categories were most successful. You didn’t need a digital analytics solution to tell you any of that. So if you bought, implemented and tried to use a digital analytics solution and those were your questions…well, you were going to be disappointed. Not because a digital analytics solution couldn’t provide answers, it just couldn’t provide better answers than you already had.

It’s the same with in-store tracking systems, which is why, when we’re building our system, evaluating reports or doing analysis for clients at Digital Mortar, I find myself using the PoS test. The PoS test is just this pretty simple question: does using the customer in-store journey to answer the question provide better, more useful information than simply knowing what customers bought?

When the answer is yes, we build it. But sometimes the answer is no – and we just leave well enough alone.

Let me give you some examples from real life to show why the PoS test can help clarify what in-store tracking is for. Here are three different reports based on understanding the in-store customer journey:

#1: There are regular in-store events hosted by each location. With in-store tracking, we can measure the browsing impact of these events and see if they encourage people to shop products.

#2: There are sometimes significant category performance differences between locations. With in-store tracking, we can measure whether the performance differences are driven by layout, by traffic type, by weather or by area shopper preferences.

#3: Matching staffing levels to store traffic can be tricky. Are there times when a store is understaffed, leaving sales, literally, on the table? With in-store tracking we can measure associate / customer ratios, interactions and performance, and we can identify whether and how often lowered interaction rates lost sales.

I think all three of these reports are potentially interesting – they’re perfectly reasonable to ask for and to produce.

With #1, however, I have to wonder how much value in-store tracking will add beyond PoS data. I can just as easily correlate PoS data to event times to see if events drive additional sales. What I don’t know is whether event attendees browse but don’t buy. If I do this analysis with in-store tracking data, the first question I’ll get is “But did they buy anything?” If, on the other hand, I do the analysis with PoS data, I’m much less likely to hear “But did they browse the store?” So while in-store tracking adds a little bit of information to the problem, it’s probably not the best or the easiest way to understand the impact of store events. We chose not to include this type of report in our base report set, even though we do let people integrate and view this type of data.

Question #2 is quite different. The question starts with sales data. We see differences in category sales by store. So more PoS data isn’t going to help. When you want to know why sales are different (by day, by store, by region, etc.), then you’ll need other types of data. Obviously, you’ll need square footage to understand efficiency, but the type of store layout data you can bring to bear is probably even more critical than measures of efficiency. With in-store tracking you can see how often a category functions as a draw (where customers go first), how it gets traffic from associated areas, how much opportunity it had, and how well it actually performed. Along with weather and associate interaction data, you have almost every factor you’re likely to need to really understand the drivers of performance. We made sure this kind of analytics is easy in our tool. Not just by integrating PoS data, but by making sure that it’s possible to understand and compare how store layouts shape category browsing and buying.

Question #3 is somewhere in between. By matching staffing data to PoS data, I can see if there are times when I look understaffed. But I’m missing significant pieces of information if I try to optimize staff using only PoS data. Door-counting data can take this one step further and help me understand when interaction opportunities were highest (and most underserved). With full in-store journey tracking, I can refine my answers to individual categories / departments and make sure I’m evaluating real opportunities and not, for example, mall pass-throughs. So in-store journey tracking deepens and sharpens the answer to staffing gaps well beyond what can be achieved with only PoS data or even PoS and door-counting data. Once again, we chose to include staff optimization reports (actually a whole bunch of them) in the base product. Even though you can do interesting analysis with just PoS data, there’s too much missing to make decision-makers informed and confident enough to make changes. And making changes is what it’s all about.

 

We all know the old saying about everything looking like a nail when your only tool is a hammer. But the truth is that we often fixate on a particular tool even when many others are near to hand. You can answer all sorts of questions with in-store journey tracking data, but some of those questions can be answered as well or better using your existing PoS or door-counting data. This sort of analytics duplication isn’t unique to in-store tracking. It’s ubiquitous in data analytics in general. Before you start buying systems, using reports or delving into a tool, it’s almost always worth asking if it’s the right/easiest/best data for the job. It just so happens that with in-store tracking data, asking how and whether it extends PoS data is almost always a good place to start.

In creating the DM tool, we’ve tried to do a lot of that work for you. And by applying the PoS test, we think we’ve created a report set that helps guide you to the best uses of in-store tracking data. The uses that take full advantage of what makes this data unique and that don’t waste your time with stuff you already (should) know.

 

Digital Transformation Dialogues – Part 4 – Creating the Right Culture around Collaboration Tools

[Here’s more from my ongoing dialogue with transformation expert and friend Scott K Wilder. In the last post, we discussed ways to make an older workforce more digitally savvy. Scott ended that post with this: “Personally, I would rather be HipChatted vs. Slacked. But technology is sometimes like religion. You have to find out what people are most comfortable with. At Marketo, it was Slack. At Salesforce, it is Chatter. For me, I prefer to be Skyped! How about you?”]

GA: I’m a reluctant video user. I was always the kid who liked to sit in the very back of the class hunched down behind somebody who played right guard on the football team. That being said, I have some issues with chat too. It’s a very interruptive technology. I know that it’s super popular with developers – and I see the point, particularly in Agile teams. But I always viewed serious code writing as essentially monastic. That may seem ludicrous, but writing large-scale software is a real intellectual undertaking – requiring you to hold hundreds of thousands of lines of code in your head and have at least a general sense of how they fit together and what’s there. I’m not convinced you can do that while you’re regularly dropping in and out of chat sessions (or, for that matter, having meetings every 30 minutes). When I was writing large-scale code I pretty much talked to no one. Of course, a vanishingly tiny percentage of people are writing serious code. But I feel the same way about writing – something I do regularly. When I’m writing a piece I care about, I seriously don’t want to be interrupted. So my question really is about protecting culture. You’ve talked about adoption and creating a culture of usage. I agree that’s important – in fact, it’s a far more common failure point. Life being what it is, though, we also have to worry about too much success (and part of adoption is assuring people that culture won’t change too much – even if it will). So how do you create an etiquette culture around collaborative technologies that protects other types of behavior we value? After all, no company wants the family equivalent of everyone whipping out their iPhones at the dinner table…

SW: Ah, now we are getting into a little psychology and ethnography. For me, there are two ways to approach this (business) issue:

  1. Constantly try to understand the different personalities in your company
  2. Consistently establish and communicate company values throughout the organization

In every organization, there are many personality types. Each responds to new challenges in different ways, especially when it comes to adopting new technologies. Individual or team behaviors can be looked at through a Myers Briggs lens. Or you can examine various personas involved.

Ironically, 80% of companies do market segmentation with personas or some other kind of approach, but few take the time to do the same thing when trying to figure out how to work with their own employees. Few companies step back and look at the different ways their own people adopt technology. There is often little conversation about how new processes and technologies diffuse throughout an organization. So what’s my point about all of this psycho mumbo jumbo?

Before you can create a culture around the adoption of a new technology, you need to understand the different personality types in your organization. And it helps if you leverage a typology like Myers-Briggs to help understand how people learn or adopt new technologies.

For example, some people might prefer to learn on their own, either studying a user manual or watching videos before kicking the tires and testing out a new product. Others might prefer to learn with a mentor or teacher to guide them. And others might want to learn by participating with others. The important thing is to first understand how an individual responds and adapts to new ways of doing business.

After you know the different types of people/personalities you are dealing with, you can begin to shape a culture that fosters the adoption of new technologies while protecting people’s values (or how they want to start using the new technology).

Finally, the challenge is getting these different types of people to function on a day-to-day basis with each other. This will be easier if you provide a comfortable and safe environment for them to learn at their own speed and in their own way.

Secondly, when creating an etiquette culture around collaborative technologies, it’s important to present them to your employees by showing how they map to your core company values (this assumes you have company values). Atlassian, my current employer, has very strong values, which are reviewed every time the company works on a project. Some of them include:

  • Don’t #@!% the customer: This statement promotes honesty and transparency. The company knows that its customers are its lifeblood. Without happy customers, it is doomed.
  • Play, as a team: As they say “We spend a huge amount of our time at work. So the more that time doesn’t feel like “work,” the better. We can be serious, without taking ourselves too seriously. We strive to put what’s right for the team first – whether in a meeting room or on a football pitch.”

These are just two of the values. There are others, but each one is used to help keep every employee aligned and heading towards the company’s True North, especially when adopting a new collaborative technology or trying to change behavior across the organization.

Finally, collaboration has no beginning or end. It is a continuous journey that involves multiple parts of your organization.

GA: There’s a lot here to respond to. I’m totally on board with your thoughts around corporate culture and values. Most companies pretend to have values – some actually do. And while I’ve argued in some other cases that you can drive analytics without necessarily having top-down support (though it sure does help), culture building is either hierarchic or anarchic – and anarchic rarely works as a model. That isn’t to say that individual managers can’t create micro-cultures inside a larger organization. They do – and pretty constantly. But those micro-cultures – for good or ill – are always getting worn down and eroded by the broader culture. There’s no place where the impact of senior folks is more pronounced than in setting the tone for this kind of culture building – and, as I’ve argued elsewhere, culture building isn’t done with words. In the beginning was the deed! You can talk “Don’t #@!% the customer” till you’re blue in the face, but the first time an executive makes a decision to the contrary, all that talk will be less than worthless (and I do mean less, since it creates negative value in the company). That’s one good reason why it’s important to have values you A) actually care about and B) can reasonably live up to.

I’m less comfortable with tests like Myers-Briggs for employee segmentation. I’ve never been confident that personality tests capture anything real. I know they have a lot of fans (and a lot of fans among people whose opinions I respect) – but I’m unconvinced. Sure, we all see ourselves in the results of these tests. But we see ourselves in our horoscopes too. Self-identification isn’t objective verification. But I’ll grant you the validity of personality types and still question whether they’re a good tool to help drive cultural adoption (and proper etiquette) around social technologies. I’ll buy that segmentation would bring something to crafting a change management and adoption strategy – but would I use personality types, or would I use things like rank, role, and behavior?

Convince me if you can!

Finally, let’s talk technology. I’d love to get your thoughts on what types of collaborative technologies make the biggest difference in an organization. And I’d also like your thoughts on whether that’s even the right question. Do you need to think about a collaborative suite? Will one tool likely die on the vine where a constellation of tools might work? I’ve seen both approaches fail – but that’s never conclusive. We live in a “baseball” world where failure is always the most common outcome.

 

Digital Transformation Dialogues – Part 3 – Bringing an Older Workforce up to Speed and Driving Adoption of Digital Tools

[Here’s more from my ongoing dialogue with transformation expert and friend Scott K Wilder. In the last post, we discussed the role of Millennials in balancing an older workforce. But I wanted a little more detail on how to get an older workforce more digitally aware…]

SW: I probably forgot this one because I am an older guy, but I’m also someone who thinks it’s every marketer’s responsibility to learn digital technology. Before I directly answer the question, let me give you an example. My son is really into drones and wants me to take him to some national parks so he can fly his drone. Before I make a road trip with him, however, I want to master drones, so I hired a drone coach. After all, I am the one who is ultimately responsible for my son’s safety. Working as a Digital Marketer or Digital Employee requires the same commitment. The only difference, however, is that companies need to play a bit of the parental role and provide a clear path for their employees to learn about technology.

This can be done by paying for courses (Marketo, my former employer, pays for its employees to take courses at Lynda.com). It can be done by making learning certain technologies a requirement for the job. Instead of saying you learn it or you lose it (your job), position the change as an opportunity to skill up, and make it clear that the company is investing in the future (in its best asset, its employees).

Companies also should provide career guidance, whether to help older employees find other opportunities within their company or with a company’s partner. Training and career guidance are not only great retention tools, they also build loyalty after an employee moves on.

Companies also need to gently require that digital technologies be used in their everyday business practices. If the older person wants to remain part of the company, they will have to hop on the digital bus. And like the Magic School Bus (a book my kid loves), it will be a journey into the unknown, with lots of opportunity to learn, a bit of uncertainty and a fun adventure. You know what? Even outside the office, they will feel as if they are on the Magic School Bus, because by learning technology, older folks can have a more enriched life. My son Facetimes and Skypes with his Grandma twice a week.

Why should companies do this? Why should they make this investment? Several reasons: older workers tend to be loyal, and they already know your business. Companies should also build incentive systems and gamify career development, so employees will be motivated to take on the exciting challenge of improving their skills.

Final note: Being Digital is more than just using the internet and Facebook. Companies should also figure out what digital technologies will help these older workers do their job better. If they need to be on social media, teach them Hootsuite. If they need to manage email programs, teach them Autopilot or Exact Target. If they need to collaborate better, be their guide while they learn Slack or HipChat.

GA: There are a couple of points I want to particularly call out there. One is that companies aren’t taking full advantage of the explosion in high-quality educational courseware that’s available these days. Sure, lots of folks will do this on their own, but not everyone is sufficiently motivated. I’ve always said my number one guiding principle – and the reason transformation is so hard – is that EVERYONE IS FUNDAMENTALLY LAZY. Giving people real incentives and formal guidance on courseware, so that it’s part of an employee’s basic career development, is really easy to do and I think pays tremendous dividends. If your company hasn’t curated public courseware for specific career tracks and incentivized your employees to take advantage, you should be kicking your HR team’s butt (just my humble opinion).

I’m also a huge fan of the idea (as you know) that people have to DO stuff. And I’m glad you brought up the technologies, because that’s the next (and last) area I wanted to explore. A lot of digital technologies are fundamentally collaborative. But that can make adoption critical to their success. I know you’ve been living this problem – how do you get a team (and keep those older, non-digital workers in mind) to adopt tools like Slack?

SW: Gary, why are you always asking me the hard questions? I think you’re correct in focusing on ‘the team’ vs. ‘the company’ rather than trying to mandate on day 1 that a whole company start using something like Slack.

The key is to start with one group. Pick a team that seems receptive to taking on new ways of doing things — especially when it comes to digital technology. And within that group, you should also identify a few key digital change agents, early adopters, who are willing to not only try out the new technology, but also be champions for it.

Create a program for these digital champions. It can be rewards-focused, but even better, show them how sharing their knowledge and experience will help them learn a new technology even better and make them more marketable. Intuit, where I spent almost a full decade, has a philosophy called “Learn Teach Learn.” The only way to really learn something is to teach it to others (Intuit has a great learning culture!).

Of course, there is another option. You could see if any group in the company is currently using Slack and make that group your change agents. At Marketo, it was actually the company’s commuters — employees who took a small shuttle bus that looked like one of those vans old age homes use to transport their frail residents — who started using Slack. They let their fellow workers know if they wanted the bus to wait for them or if they wanted the van to turn around and pick up someone they had forgotten. My group of commuters called our Slack group The Purple Lobster.

The Slack group was called the Purple Lobster because that’s what we called the van. We picked purple because that was Marketo’s company color. And lobster because it wasn’t the fastest moving vehicle on Highway 101.

And like a lobster slithering in the sand (sorry about pushing the poetic envelope here), slowly but surely, other commuting groups started using Slack. Eventually, product teams started using it. And finally, the CTO and his team made the call not to fight the crowd and force the company to use another tool, like Chatter. Instead, he convinced his fellow executives to adopt Slack across the company. It was a brilliant ‘if you can’t beat them, then join them’ strategy.

If you identify a group using Slack, challenge them to go completely cold turkey. See if they are willing to use only Slack (no email) for a week or so. At Atlassian, I had my hand Slacked when I tried to send an email to someone with a simple question. They recommended I use their Slack-like product, HipChat. And now, I only have 20 emails in my inbox. How many of you have only 20 emails in your corporate email inbox?

If you are not lucky enough to find early adopters, you need to find a group of people who are most likely to use the new technology. If there are some older folks on the team, pair them up with the younger whippersnappers. Or provide some training.

The key in all this is not to focus on technology. Instead, treat the change to Slack or any other digital technology as a change management exercise. Focus on adoption — education, onboarding and engagement. None of this should be done in a vacuum. You need someone to shepherd the process. Someone who can be a guide, a teacher, a problem solver and yes, a true change agent.

Other considerations include rewarding people for their efforts and successes. Gamify the process! In doing so, make sure to acknowledge people’s efforts for trying. Don’t make the same mistake most schools make and only pass people for knowing the answer. As Carol Dweck, the well-known motivation researcher, points out, children praised for hard work chose problems that promised increased learning (vs. just getting the right answer). This also applies to adults. Really!

The key here is to alter someone’s mindset. Instead of rewarding someone (just giving them a bonus) or punishing them (not promoting them) for adopting a new technology, recognize their effort and hard work. The end result is that they’ll be more willing to take on new challenges and succeed at them. Even if it means learning and using something like Slack.

Finally, good old training is important. It always amazes me how many companies introduce a new technology and offer one-time training, usually during a three-hour class. If you are licensing a technology like Slack, see if the vendor can conduct monthly webinars to answer questions (if not, offer them yourself). Also have videos and Q&As available for your staff.

Personally, I would rather be HipChatted vs. Slacked. But technology is sometimes like religion. You have to find out what people are most comfortable with. At Marketo, it was Slack. At Salesforce, it is Chatter. For me, I prefer to be Skyped! How about you?