
Measuring the Digital World – The Movie!

I’ve put together a short, 20-minute video that’s a companion piece to Measuring the Digital World. It’s a guided tour through the core principles of digital analytics and a really nice introduction to the book and the field:

Measuring the Digital World: Introduction

The video introduces the unique challenges of measuring the digital world. It’s a world where none of our traditional measurement categories and concepts apply. And it doesn’t help that our tools mostly point us in the wrong direction – introducing measurement categories that are unhelpful or misleading. To measure the digital world, we need to understand customer experiences, not websites. That isn’t easy when all you know is what web pages people looked at!

But it’s precisely that leap – from consumption to intent – that underlies all digital measurement. The video borrows an example from the book (Conan the Librarian) to show how this works and why it can be powerful. This leads directly to the concepts of 2-Tiered segmentation that are central to MTDW and are the foundation of good digital measurement.

Of course, it’s not that easy. Not only is making the inference from consumption to intent hard, it’s constantly undermined by the nature of digital properties. Their limited real-estate and strong structural elements – designed to force visitors in particular directions – make it risky to assume that people viewed what they were most interested in.

This essential contradiction between the two most fundamental principles of digital analytics is what makes our discipline so hard and (also) so interesting.

Finally, the video introduces the big data story and the ways that digital data – and making the leap from consumption to intent – challenges many of our traditional IT paradigms (not to mention our supposedly purpose-built digital analytics toolkit).

Give it a look. Even if you’re an experienced practitioner I think you’ll find parts of it illuminating. And if you’re new to the field or a consumer of digital reporting and analytics, I don’t think you could spend a more productive 20 minutes.

Afterward (when you want to order the book), here’s the link to it on Amazon!

Digital Transformation of the Enterprise (with a side of Big Data)

Since I finished Measuring the Digital World and got back to regular blogging, I’ve been writing an extended series on the challenges of digital in the enterprise. Like many analysts, I’m often frustrated by the way our clients approach decision-making. So often, they lack any real understanding of the customer journey, any effective segmentation scheme, any real method for either doing or incorporating analytics into their decisioning, anything more than a superficial understanding of their customers, and anything more than the empty façade of a testing program. Is it any surprise that they aren’t very good at digital? This would be frustrating but understandable if companies simply didn’t invest in these capabilities. They aren’t magic, and no large enterprise can do these things without making a significant investment. But, in fact, many companies have invested plenty with very disappointing results. That’s maddening. I want to change that – and this series is an extended meditation on what it takes to do better and how large enterprises might truly gain competitive advantage in digital.

I hope that reading these posts is useful to people, but I know, too, that it’s hard to get the time. Heaven knows I struggle to read the stuff I’d like to. So I took advantage of the slow time over the holidays to do something that’s been on my wish list for about 2 years now – take some of the presentations I do and turn them into full online webinars. I started with a whole series that captures the core elements of this series – the challenge of digital transformation.

There are two versions of this video series. The first is a set of fairly short (2-4 minute) stories that walk through how enterprise decision-making gets done, what’s wrong with the way we do it, and how we can do better. It’s a ten(!) part series and meant to be tackled in order. It’s not really all that long…like I said, most of the videos are just 2-4 minutes long. I’ve also packaged up the whole story (except Part 10) in a single video that runs just a little over 20 minutes. It’s shorter than viewing all 10 of the others, but you need a decent chunk of uninterrupted time to get at it. If you’re really pressed and only want to get the key themes without the story, you can just view Parts 8-10.

Here’s the video page that has all of these laid out in order:

Digital Transformation Video Series

Check it out and let me know what you think! To me it seems like a faster, better, and more enjoyable way to get the story about digital transformation and I’m hoping it’s very shareable as well. If you’re struggling to get analytics traction in your organization, these videos might be an easy thing to share with your CMO and digital channel leads to help drive real change.

I have to say I enjoyed doing these a lot and they aren’t really hard to do. They aren’t quite professional quality, but I think they are very listenable and I’ll keep working to make them better. In fact, I enjoyed doing the digital transformation ones so much that I knocked out another this last week – Big Data Explained.

This is one of my favorite presentations of all time – it’s rich in content and intellectually interesting. Big data is a subject that is obscured by hype, self-interest, and just plain ignorance; everyone talks about it but no one has a clear, cogent explanation of what it is and why it’s important. This presentation deconstructs the everyday explanation about big data (the 4Vs) and shows why it misses the mark. But it isn’t designed to merely expose the hype, it actually builds out a clear, straightforward and important explanation of why big data is real, why it challenges common IT and analytics paradigms, and how to understand whether a problem is a big data problem…or not. I’ve written about this before, but you can’t beat a video with supporting visuals for this particular topic. It’s less than fifteen minutes and, like the digital transformation series, it’s intended for a wide audience. If you have decision-makers who don’t get big data or are skeptical of the hype, they’ll appreciate this straightforward, clear, and no-nonsense explication of what it is.

You can get it on my video page or directly on YouTube.

This is also a significant topic toward the end of Measuring the Digital World, where I try to lay out a forward-looking plan for digital analytics as a discipline.

I’m planning to do a steady stream of these videos throughout the year so I’d love thoughts/feedback if you have suggestions!

Next week I hope to have an update on my EY Counseling Family’s work in the 538 Academy Awards challenge. We’ve built our initial Hollywood culture models – it’s pretty cool stuff and I’m excited to share the results. Our model may not be as effective as some of the other challengers (TBD), but I think it’s definitely more fun.

Practical Steps to Building an Analytics Culture

Building an analytics culture in the enterprise is incredibly important. It’s far more important than any single capability, technology or technique. But building culture isn’t easy. You can’t buy it. You can’t proclaim it. You can’t implement it.

There is, of course, a vast literature on building culture in the enterprise. But if the clumsy, heavy-handed, thoroughly useless attempts to “build culture” that I’ve witnessed over the course of my working life are any evidence, that body of literature is nearly useless.

Here’s one thing I know for sure: you don’t build culture by talk. I don’t care whether it’s getting teenagers to practice safe-sex or getting managers to use analytics, preaching virtue doesn’t work, has never worked and will never work. Telling people to be data-driven, proclaiming your commitment to analytics, touting your analytics capabilities: none of this builds analytics culture.

If there’s one thing that every young employee has learned in this era, it’s that fancy talk is cheap and meaningless. People are incredibly sophisticated about language these days. We can sit in front of the TV and recognize in a second whether we’re seeing a commercial or a program. Most of us can tell the difference between a TV show and movie almost at a glance. We can tune out advertising on a Website as effortlessly as we put on our pants. A bunch of glib words aren’t going to fool anyone. You want to know what the reaction is to your carefully crafted, strategic consultancy driven mission statement or that five year “vision” you spent millions on and just rolled out with a cool video at your Sales Conference? Complete indifference.

That’s if you’re lucky…if you didn’t do it really well, you got the eye-roll.

But it isn’t just that people are incredibly sensitive – probably too sensitive – to BS. It’s that even true, sincere, beautifully reasoned words will not build culture. Reading moral philosophy does not create moral students. Not because the words aren’t right or true, but because behaviors are, for the most part, not driven by those types of reasons.

That’s the whole thing about culture.

Culture is lived, not read or spoken. To create it, you have to ingrain it in people’s thinking. If you want a data-driven organization, you have to create good analytic habits. You have to make the organization (and you too) work right.

How do you do that?

You do it by creating certain kinds of process and behaviors that embed analytic thinking. Do enough of that, and you’ll have an analytic culture. I guarantee it. The whole thrust of this recent series of posts is that by changing the way you integrate analytics, voice-of-customer, journey-mapping and experimentation into the enterprise, you can drive better digital decision making. That’s building culture. It’s my big answer to the question of how you build analytics culture.

But I have some small answers as well. Here, in no particular order, are practical ways you can create importantly good analytics habits in the enterprise.

Analytic Reporting

What it is: Changing your enterprise reporting strategy by moving from reports to tools. Analytic models and forecasting allow you to build tools that integrate historical reporting with forecasting and what-if capabilities. Static reporting is replaced by a set of interactive tools that allow users to see how different business strategies actually play out.

Why it builds analytics culture: With analytics reporting, you democratize knowledge, not data. It makes all the difference in the world. The analytic models capture your best insight into how a key business works and what levers drive performance. Building this into tools not only operationalizes the knowledge, it creates positive feedback loops to analytics. When the forecast isn’t right, everyone knows it and the business is incented to improve its understanding and predictive capabilities. This makes for better culture in analytics consumers and analytics producers.

 

Cadence of Communications

What it is: Setting up regular briefings between analytics and your senior team and decision-makers. This can include review of dashboards but should primarily focus on answers to previous business questions and discussion of new problems.

Why it builds analytics culture: This is actually one of the most important things you can do. It exposes decision-makers to analytics. It makes it easy for decision-makers to ask for new research and exposes them to the relevant techniques. Perhaps even more important, it lets decision-makers drive the analytics agenda, exposes analysts to real business problems, and forces analysts to develop better communication skills.

 

C-Suite Advisor

What it is: Create an Analytics Minister-without-portfolio whose sole job is to advise senior decision-makers on how to use, understand and evaluate the analytics, the data and the decisions they get.

Why it builds analytics culture: Most senior executives are fairly ignorant of the pitfalls in data interpretation and the ins-and-outs of KPIs and experimentation. You can’t send them back to get a modern MBA, but you can give them a trusted advisor with no axe to grind. This not only raises their analytics intelligence, it forces everyone feeding them information to up their game as well. This tactic is also critical because of the next strategy…

 

Walking the Walk

What it is: Senior Leaders can talk till they are blue in the face about data-driven decision-making. Nobody will care. But let a Senior Leader even once use data or demand data around a decision they are making and the whole organization will take notice.

Why it builds analytics culture: Senior leaders CAN and DO have a profound impact on culture but they do so by their behavior not their words. When the leaders at the top use and demand data for decisions, so will everyone else.

 

Tagging Standards

What it is: A clearly defined set of data collection specifications that ensure that every piece of content on every platform is appropriately tagged to collect a rich set of customer, content, and behavioral data.

Why it builds analytics culture: This ends the debate over whether tags and measurement are optional. They aren’t. This also, interestingly, makes measurement easier. Sometimes, people just need to be told what to do. This is like choosing which side of the road to drive on – it’s far more important that you have a standard than which side of the road you pick. Standards are necessary when an organization needs direction and coordination. Tagging is a perfect example.
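As a sketch of what enforcing a tagging standard can look like in practice, here’s a minimal validator. The required field names are hypothetical, purely for illustration – a real standard would enumerate its own fields:

```python
# Illustrative tagging standard: every tag payload must carry these fields.
# The field names here are hypothetical, not a published specification.
REQUIRED_TAG_FIELDS = {"page_id", "content_type", "audience", "platform"}

def validate_tag(tag):
    """Return the set of required fields missing from a tag payload."""
    return REQUIRED_TAG_FIELDS - tag.keys()

# A tag missing 'audience' and 'platform' fails validation:
missing = validate_tag({"page_id": "home", "content_type": "landing"})
```

A check like this can run in a build pipeline or a tag-audit crawler, which is exactly how a standard stops being a debate and becomes a default.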

 

CMS and Campaign Meta-Data

What it is: The definition of and governance around the creation of campaign and content meta-data. Every piece of content and every campaign element should have detailed, rich meta-data around the audience, tone, approach, contents, and every other element that can be tuned and analyzed.

Why it builds analytics culture: Meta-data is the key to digital analytics, providing the meaning that makes content consumption understandable. Rich meta-data definition also guides useful thought. These are the categories people will think about when they analyze content and campaign performance. That’s as it should be, and by providing these pre-built, populated categorizations, you’ll greatly facilitate good analytics thinking.

 

Rapid VoC

What it is: The technical and organizational capability to rapidly create, deploy and analyze surveys and other voice-of-customer research instruments.

Why it builds analytics culture: This is the best capability I know for training senior decision-makers to use research. It’s so cheap, so easy, so flexible and so understandable that decision-makers will quickly get spoiled. They’ll use it over and over and over. Well – that’s the point. Nothing builds analytics muscle like use and getting this type of capability deeply embedded in the way your senior team thinks and works will truly change the decision-making culture of the enterprise.

 

SPEED and Formal Continuous Improvement Cycles

What it is: The use of a formal methodology for digital improvement. SPEED provides a way to identify the best opportunities for digital improvement, the ways to tackle those opportunities, and the ability to measure the impact of any changes. It’s the equivalent of Six Sigma for digital.

Why it builds analytics culture: Formal methods make it vastly easier for everyone in the organization to understand how to get better. Methods also help define a set of processes that an enterprise can build its organization around. This makes it easier to grow and scale. For large enterprises, in particular, it’s no surprise that formal methodologies like Six Sigma have been so successful. They make key cultural precepts manifest and attach processes to them so that organizational inertia is guided in positive directions.

 

Does this seem like an absurdly long list? In truth I’m only about half-way through. But this post is getting LONG. So I’m going to save the rest of my list for next week. Till then, here’s some final thoughts on creating an analytics culture.

The secret to building culture is this: everything you do builds culture. Some things build the wrong kind of culture. Some things the right kind. But you are never not building culture. So if you want to build the right culture to be good at digital and decision-making, there’s no magic elixir, no secret sauce. There is only the discipline of doing things right. Over and over.

That being said, not every action is equal. Some foods are empty of nutrition but empty, too, of harm. Others positively destroy your teeth or your waistline. Still others provide the right kind of fuel. The things I’ve described above are not just a random list of things done right, they are the small to medium things that, done right, have the biggest impacts I’ve seen on building a great digital and analytics culture. They are also targeted to places and decisions which, done poorly, will deeply damage your culture.

I’ll detail some more super-foods for analytics culture in my next post!

 

[Get your copy of Measuring the Digital World – the definitive guide to the discipline of digital analytics – to learn more].

Measuring the Digital World

After several months in pre-order purgatory, my book, Measuring the Digital World is now available. If you’re even an occasional reader of this blog, I hope you’ll find the time to read it.

I know that’s no small ask. Reading a professional book is a big investment of time. So is reading Measuring the Digital World worth it?

Well, if you’re invested in digital optimization and analytics, I think it is – and here’s why. We work in a field that is still very immature. It’s grown up, as it were, underneath our feet. And while that kind of organic growth is always the most exciting, it’s also the most unruly. I’m betting that most of us who have spent a few years or more in digital analytics have never really had a chance to reflect on what we do and how we do it. Worse, most of those who are trying to learn the field have to do so almost entirely by mentored trial-and-error. That’s hard. Having a framework for how and why things work makes the inevitable trial-and-error learning far more productive.

My goal in Measuring the Digital World wasn’t so much to create a how-to book as to define a discipline. I believe digital analytics is a unique field. A field defined by a few key problems that we must solve if we are to do it well. In the book, I wanted to lay out those problems and show how they can be tackled – irrespective of the tools you use or the type of digital property you care about.

At the very heart of digital analytics is a problem of description. Measurement is basic to understanding. We are born with, and soon learn to speak and think in terms of, measurement categories that apply to the physical world. Dimensionality, weight, speed, direction and color are some of the core measurement categories that we use over and over and over again in understanding the world we live in. These things don’t exist in the digital world.

What replaces them?

Our digital analytics tools provide the eyes and ears into the digital world. But I think we should be very skeptical of the measurement categories they suggest. Having lived through the period when those tools were designed and took their present shape, I’ve seen how flawed were the measurement conceptions that drove their form and function.

It’s not original, but it’s still true to say that our digital analytics tools mostly live at the wrong level and have the wrong set of measurement categories – that they are far too focused on web assets and far too little on web visitors.

But if this is a mere truism, it nevertheless lays the groundwork for a real discipline. Because it suggests that the great challenge of digital is how to understand who people are and what they are doing using only their viewing behavior. We have to infer identity and intention from action. Probably 9 out of every 10 pages in Measuring the Digital World are concerned with how to do this.

The things that make it hard are precisely the things that define our discipline. First, to make the connection between action and both identity and intention, we have to find ways to generate meaning based on content consumption. This means understanding at a deep level what content is about – it also means making the implicit assumption that people self-select the things that interest them.

For the most part, that’s true.

But it’s also where things get tricky. Because digital properties don’t contain limitless possibilities and they impose a structure that tries to guide the user to specific actions. This creates a push-pull in every digital world. On the one hand, we’re using what people consume to understand their intention and, at the very same time, we’re constantly forcing their hand and trying to get them to do specific actions! Every digital property – no matter its purpose or design – embodies this push-pull. The result? A complex interplay between self-selection, intention and web design that makes understanding behavior in digital a constant struggle.

That’s the point – and the challenge – of digital analytics. We need to have techniques for moving from behavior to identity and intention. And we need to have techniques that control for the structure of digital properties and the presence or absence of content. These same challenges are played out on Websites, on mobile apps and, now, on omni-channel customer journeys.

This is all ground I’ve walked before, but Measuring the Digital World embodies an orderly and fairly comprehensive approach to describing these challenges and laying out the framework of our discipline. How it works. Why it’s hard. What challenges we still face. It’s all there.

So if you’re an experienced analyst and just want to reflect your intuitions and knowledge against a formal description of digital analytics and how it can be done, this book is for you. I’m pretty sure you’ll find at least a few new ideas and some new clarity around ideas you probably already have.

If you’re relatively new to the field and would like something that is intellectually a little more meaty than the “bag of tips-and-tricks” books that you’ve already read, then this book is for you. You’ll get a deep set of methods and techniques that can be applied to almost any digital property to drive better understanding and optimization. You’ll get a sense, maybe for the first time, of exactly what our discipline is – why it’s hard and why certain kinds of mistakes are ubiquitous and must be carefully guarded against.

And if you’re teaching a bunch of MBA or Business Students about digital analytics and want something that actually describes a discipline, this book is REALLY for you (well…for your students). Your students will get a true appreciation for a cutting-edge analytics discipline; they’ll also get a sense of where the most interesting new problems in digital analytics are and what approaches might bear fruit. They’ll get a book that illuminates how the structure of a field – in this case digital – demands specific approaches, creates unique problems, and rewards certain types of analysis. That’s knowledge that cuts deeper than just understanding digital analytics – it goes right to the heart of what analytics is about and how it can work in any business discipline. Finally, I hope that the opportunity to tackle the deep and interesting problems illuminated by the book’s framework excites new analysts and inspires the next generation of digital analysts to go far beyond what we’ve been able to do.

 

Yes, even though I’m an inveterate reader, I know it’s no trivial thing to say “read this book”. After all, despite my copious consumption, I delve much less often into business or technical books. So many seem like fine ten-page articles stretched – I’m tempted to say distorted – into book form. You get their gist in the first five pages and the rest is just filler. That doesn’t make for a great investment of time.

And now that I’ve actually written a book, I can see why that happens. Who really has 250 pages worth of stuff to say? I’m not sure I do…actually I’m pretty sure there’s some filler tucked in there in a spot or two. But I think the ratio is pretty good.

With Measuring the Digital World I tried to do something very ambitious – define a discipline. To create the authoritative view of what digital analytics is, how it works, and why it’s different than any other field of analytics. Not to answer every question, lay out every technique or solve every problem. There are huge swaths of our field not even mentioned in the book. That doesn’t bother me. What we do is far too rich to describe in a single book or even a substantial collection. Digital is, as the title of the book suggests, a whole new world. My goal was not to explore every aspect of measuring that world, but only to show how that measurement, at its heart, must proceed. I’m surely not the right person to judge to what extent I succeeded. I hope you’ll do that.

Here’s the link to Measuring the Digital World on Amazon.

[By the way, if you’d like a signed copy of Measuring the Digital World, just let me know. You can buy a copy online and I’ll send you a book-plate. I know it’s a little silly, but I confess to extreme fondness for the few signed books I possess!]

SPEED: A Process for Continuous Improvement in Digital

Everyone always wants to get better. But without a formal process to drive performance, continuous improvement is more likely to be an empty platitude than a reality in the enterprise. Building that formal process isn’t trivial. Existing methodologies like Six Sigma illustrate the depth and the advantages of a true improvement process versus an ad hoc “let’s get better” attitude, but those methodologies (largely birthed in manufacturing) aren’t directly applicable to digital. In my last post, I laid out six grounding principles that underlie continuous improvement in digital. I’ll summarize them here as:

  • Small is measurable. Big changes (like website redesigns) alter too much to make optimization practical
  • Controlled Experiments are essential to measure any complex change
  • Continuous improvement will broadly target reduction in friction or improvement in segmentation
  • Acquisition and Experience (Content) are inter-related and inter-dependent
  • Audience, use-case, prequalification and target content all drive marketing performance
  • Most content changes shift behavior rather than drive clear positive or negative outcomes

Having guiding principles isn’t the same thing as having a method, but a real methodology can be fashioned from this sub-structure that will drive true continuous improvement. A full methodology needs a way to identify the right areas to work on and a process for improving those areas. At minimum, that process should include techniques for figuring out what to change and for evaluating the direction and impact of those changes. If you have that, you can drive continuous improvement.

I’ll start where I always start: segmentation. Specifically, 2-tiered segmentation. 2-tiered segmentation is a uniquely digital approach to segmentation that slices audiences by who they are (traditional segmentation) and what they are trying to accomplish (this is the second tier) in the digital channel. This matrixed segmentation scheme is the perfect table-set for continuous improvement. In fact, I don’t think it’s possible to drive continuous improvement without this type of segmentation. Real digital improvement is always relative to an audience and a use-case.
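To make the matrixed scheme concrete, here’s a minimal sketch of a 2-tiered cross-tab in plain Python. The personas and use-cases are made up for illustration; a real scheme would come from your own segmentation work:

```python
from collections import Counter

# Hypothetical visit records: tier one is who the visitor is (persona),
# tier two is what they're trying to accomplish (use_case).
visits = [
    {"persona": "researcher", "use_case": "compare-products"},
    {"persona": "researcher", "use_case": "compare-products"},
    {"persona": "researcher", "use_case": "get-support"},
    {"persona": "buyer",      "use_case": "purchase"},
]

def two_tier_matrix(visits):
    """Cross-tab visits into (persona, use_case) cells."""
    return Counter((v["persona"], v["use_case"]) for v in visits)

matrix = two_tier_matrix(visits)
```

Each (persona, use_case) cell then becomes the unit against which improvement is measured.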

But segmentation on its own isn’t a method for continuous improvement. 2-tiered segmentation gives us a powerful framework for understanding where and why improvement might be focused, but it doesn’t tell us where to target improvements or what those improvements might be. To have a real method, we need that.

Here’s where pre-qualification comes in. One of the core principles is that acquisition and experience are inter-related and inter-dependent. This means that if you want to understand whether or not content is working (creating lift of some kind), then you have to understand the pre-existing state of the audience that consumes that content. Content with a 100% success rate may suck. Content with a 0% success rate may be outstanding. It all depends on the population you give them. Every single person in line at the DMV will stay there to get their license. That doesn’t mean the experience is a good one. It just means that the self-selected audience is determined to finish the process. We need that license! Similarly, if you direct garbage traffic to even the best content, it won’t perform at all. Acquisition and content are deeply interdependent. It’s impossible to measure the latter without understanding the former.

Fortunately, there’s a simple technique for measuring the quality of the audience sourced for any given content area that we call pre-qualification. To understand the pre-qualification level of an audience at a given content point, we use a very short (typically no more than 3-4 questions) pop-up survey. The pre-qualification survey explores what use-case visitors are in, where they are in the buying cycle, and how committed they are to the brand. That’s it.
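A sketch of how such survey responses might be collapsed into a coarse qualification level. The answer options, weights, and thresholds below are my own illustration, not taken from the book:

```python
# Hypothetical scoring for a short pre-qualification pop-up survey.
# Answer options and weights are illustrative only.
STAGE = {"just-browsing": 0, "researching": 1, "ready-to-buy": 2}
COMMITMENT = {"new-to-brand": 0, "considering": 1, "committed": 2}

def prequalification(response):
    """Collapse survey answers into a coarse qualification level."""
    score = STAGE[response["buying_stage"]] + COMMITMENT[response["brand_commitment"]]
    if score >= 3:
        return "high"
    return "medium" if score >= 1 else "low"
```

The point is less the exact weights than that every surveyed visitor lands in a small number of qualification buckets you can hold constant in later analysis.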

It may be simple, but pre-qualification is one of the most powerful tools in the digital analytics arsenal and it’s the key to a successful continuous improvement methodology.

First we segment. Then we measure pre-qualification. With these two pieces we can measure content performance by visitor type, use-case and visitor quality. That’s enough to establish which content and which marketing campaigns are truly underperforming.

How?

Hold the population, use-case and pre-qualification level constant and measure the effectiveness of content pieces and sequences in creating successful outcomes. You can’t effectively measure content performance unless you hold these three variables constant, but when you control for these three variables you open up the power of digital analytics.

We now have a way to target potential improvement areas – just pick the content with the worst performance in each cell (visitor type x visit type x qualification level).
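Under the hood this is just a grouped success-rate calculation. A sketch in plain Python, with hypothetical field names standing in for your analytics export:

```python
from collections import defaultdict

def success_rates(observations):
    """Success rate per content item within each
    (persona, use_case, qualification) cell."""
    tally = defaultdict(lambda: [0, 0])  # key -> [successes, trials]
    for o in observations:
        key = (o["persona"], o["use_case"], o["qual"], o["content"])
        tally[key][0] += o["success"]
        tally[key][1] += 1
    return {k: s / n for k, (s, n) in tally.items()}

def worst_content_per_cell(rates):
    """Pick the lowest-performing content item in each cell."""
    worst = {}
    for (persona, use_case, qual, content), rate in rates.items():
        cell = (persona, use_case, qual)
        if cell not in worst or rate < worst[cell][1]:
            worst[cell] = (content, rate)
    return worst
```

Because persona, use-case and qualification are all part of the grouping key, the comparison between content items is apples-to-apples within each cell.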

But there is much more that we can do with these essential pieces in place. By evaluating whether content underperforms across all pre-qualification levels equally or is much worse for less qualified visitors, you can determine if the content problem is because of friction (see guiding principle #3).

Friction problems tend to impact less qualified visitors disproportionately. So if less qualified visitors within each visitor type perform even worse than expected after consuming a piece of content, then some type of friction is likely the culprit.
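One way to operationalize that heuristic is to compare a content item’s gap between high- and low-qualified visitors against a comparable baseline gap. The structure below is my own sketch, not a formula from the post:

```python
def likely_friction(content_rates, baseline_rates):
    """content_rates / baseline_rates: {"high": r, "low": r} success rates
    by qualification level. If the high-vs-low gap for this content is wider
    than the baseline gap, friction is the likely culprit."""
    content_gap = content_rates["high"] - content_rates["low"]
    baseline_gap = baseline_rates["high"] - baseline_rates["low"]
    return content_gap > baseline_gap
```

A sensible baseline would be the site-wide (or cell-wide) gap between qualification levels, so you’re flagging content where low-qualified visitors do even worse than expected.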

Further, by evaluating content performance across visitor type (within use-case and with pre-qualification held constant), you have strong clues as to whether or not there are personalization opportunities to drive segmentation improvement.

Finally, where content performs well for qualified audiences but receives a disproportionate share of unqualified visitors, you know that you have to go upstream to fix the marketing campaigns sourcing the visits and targeting the content.

Segment. Pre-Qualify. Evaluate by qualification for friction and acquisition, and by visitor type for personalization.

Step four is to explore what to change. How do you do that? Often, the best method is to ask. This is yet another area for targeted VoC, where you can explore what content people are looking for, how they make decisions, what they need to know, and how that differs by segment. A rich series of choice/decision questions should create the necessary material to craft alternative approaches to test.

You can also break up the content into discrete chunks (each with a specific meta-data purpose or role) and then create a controlled experiment that tests which content chunks are most important and deliver the most lift. This is a sub-process for testing within the larger continuous improvement process. Analytically, it should also be possible to do a form of conjoint analysis on either behavior or preferences captured in VoC.
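Evaluating whether a content chunk delivers lift ultimately comes down to a standard controlled-experiment comparison. A minimal sketch using a pooled two-proportion z-test (the variant labels and counts are made up):

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for a controlled experiment comparing two content variants."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B (with the extra content chunk) vs. variant A (without it):
z = two_proportion_z(50, 1000, 80, 1000)  # 5% vs. 8% success
```

With |z| above roughly 1.96 you’d call the lift significant at the usual 95% level; run one such comparison per chunk (or use a factorial design) to see which chunks matter.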

Segment. Pre-Qualify. Evaluate. Explore.

Now you’re ready to decide on the next round of tests and experiments based on a formal process for finding where problems are, why they exist, and how they can be tackled.

Segment. Pre-Qualify. Evaluate. Explore. Decide.

SPEED.

Sure, it’s just another consulting acronym. But underneath that acronym is a real method. Not squishy and not contentless. It’s a formal procedure for identifying where problems exist, what class of problems they are, what type of solution might be a fit (friction reduction or personalization), and what that solution might consist of. All wrapped together in a process that can be endlessly repeated to drive measurable, discrete improvement for every type of visitor and every type of visit across any digital channel. It’s also specifically designed to be responsive to the guiding principles enumerated above that define digital.

If you’re looking for a real continuous improvement process in digital, there’s SPEED and then there’s…

Well, as far as I know, that’s pretty much it.

 

Interested in knowing more about 2-Tiered Segmentation and Pre-Qualification, the key ingredients to SPEED? Measuring the Digital World provides the most detailed descriptions I’ve ever written of how to do both and is now available for pre-order on Amazon.