Data Analytics – Why The C-Suite Ignores Data

The data is too math-intensive to hold their interest.

Eyes glaze over. Gazes wander. Executives shift in seats. Expensive pens tap on conference tables. Math does that to people. You remember high school math class when you had to turn word problems into equations? Well, now you have to do that in reverse.

They really believe in their hunches.

C-Level executives are paid a lot of money on the assumption that they know stuff mere mortals don’t. And they’ve got swagger. Who the hell are you to tell them their intuition doesn’t reflect reality? Well, it’s actually your job to tell them, isn’t it? Break the ice by asking if they saw Moneyball. Then slip Nate Silver’s name into the conversation at some point during the presentation. You’re doing them a huge favor by correcting untrue assumptions. But telling the truth isn’t enough – you have to sell the truth. Get your own swagger on.

They get a conflicting story from their own tribe.

Department heads like to say “Don’t bring me problems, bring me solutions”. So unfavorable data within a department will often be put through the “spin cycle” before it is presented to the C-level person. Or the data may be buried altogether so as not to make the middle managers look bad. For instance, maybe the IT team presents the CTO only cumulative data on users of the new app – the arrow on the graph is a never-ending march upward. It’s no wonder, then, that the CTO may not embrace Marketing telling her that customers don’t like the new app, and never come back after their first visit. It may be the first time she’s heard it.

How Data-Driven Marketing Contributes to the Class Divide

Once on a calibration call, I heard a customer service rep actually tell a customer who was doggedly asking for a better rate something like “You’re one of our low-segment customers. No matter how many times you ask, none of you guys can get that rate.”

Clearly, that rep was seriously off-script. But the segmentation score on her CRM screen told her that the company didn’t value this customer very much, so she didn’t either. I built the segmentation model that told her that. For me, it brought into stark relief how marketing segmentation can affect dynamics far down the road.

In many years in the field, I’ve seen the good that data-driven marketing can do. It makes online life more relevant. It helps businesses stock products their customers want to buy. But the intrinsic power of data analytics to segment a population can also be wielded to divide it.

The last twenty years, in which database marketing has hit the big time, coincide with a period of increasing polarization between rich and poor. I’m not suggesting that segmentation causes this polarization. Rather, it drives deeper a wedge already in place due to a complex mix of social, economic and political factors.

The objective of segmentation is to enable businesses to target their marketing capital toward the acquisition and retention of those customers yielding the greatest profit. There isn’t anything wrong with this, per se. Making money is what businesses are supposed to do, and it is the responsibility of their marketing organizations to help make that happen.

Customers and prospects are identified by their potential to enhance the bottom line, and strategies are crafted to reward the more desirable segments for doing business with them and not reward less desirable groups for it (or even subtly discourage them from it). The most profitable customers are not always the wealthiest – but let’s face it, it’s often the way predictive models will tell you to bet.

Predictive and yield models tell builders how to market and build most profitably. A prospect who can only afford a $195K house is courted by no one and can’t find a new house to buy. A prospect who can afford a $950K house is courted by everyone and has plenty of choices.

Profiling will tell businesses which customers are likely to have the wherewithal to pay on time and upgrade to more profitable products. This insight will be incorporated into the firms’ CRM systems. Those segments will receive the best offers, the special concierge customer service phone lines, the waived fees. There might also be “aspirational” or “elite-in-training” groups that get slightly better treatment in hopes that they will start behaving like the elite groups. And the other segments?

It costs them more to do business. They pay more for products. They have to wait in a longer phone queue for customer service. As for the service they do get when the phone is answered, there is no scripting in the CRM system that explicitly says “you don’t have to go the extra mile to treat this customer well”. But it’s pretty much guaranteed that some harried customer service reps will (perhaps in a rush to minimize the Average-Handle-Time metrics they’re bonused on) interpret it that way.

Before analytics, businesses often had policies that every customer should be treated like they’re the best customer – because absent the data, the assumption was that every customer had that potential. But in the data age, there is no more benefit of the doubt. When people complain that customer service doesn’t exist anymore, they’re wrong. It’s still alive and well – it’s just heavily up-market.

To reiterate, marketing segmentation and analytics did not cause the class divide. That has existed for millennia. Let’s at least be aware of how marketing analytics contributes to it in the present day.

A Marketer’s Guide to Agile Development – Testing…Testing…

I don’t know if this is a universal trait – and I’m not entirely sure why – but my experience with Agile dev professionals is that there is a tendency to resist testing. I’m not talking about standard software Quality Assurance (incredibly, that’s not always a given either). I’m talking about A/B or multivariate testing, and true UX testing. These are things I have actually heard in my career as a marketing professional from dev groups.

“We don’t have the resources to do A/B testing – it would double development time.”

The person who said this misunderstood what A/B testing is. In most cases, you’re testing something new against your existing site – which is already developed and in production. So you’re only developing the new thing – and you were doing that anyway. Where’s the extra work?

If you’re introducing multivariate testing, developing multiple versions of something new will create extra work for a dev team that is used to developing only one version at a time. But that’s the right way to do it for maximum iterative improvement.
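The mechanics really are that lightweight. Here’s a minimal sketch of how traffic typically gets split for an A/B test – hypothetical function and test names, just to show that the “extra” piece is a few lines of bucketing logic, not a second build:

```python
import hashlib

def assign_variant(user_id: str, variants=("control", "B"), salt="homepage-test"):
    """Deterministically bucket a visitor into a test variant.

    Hashing user_id + salt gives a stable, roughly uniform split,
    so a returning visitor always sees the same version.
    """
    digest = hashlib.md5(f"{salt}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# "control" is the existing site, already in production;
# only variant "B" – the new thing – needs developing.
print(assign_variant("visitor-1234"))
```

The salt matters: change it per test so the same visitors don’t always land in the same bucket across every experiment you run.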

“We are strapped for time – we don’t have time to develop things that won’t make it into production.”

You’re saying you don’t have time to find out if the change you’re working on to enhance click-thru rates will ratchet up the bounce rate instead. That’s dangerous to your website’s health. It means the whole enterprise needs to guess right 100% of the time – which just doesn’t happen. You’re also essentially saying you only have time to develop and deliver something once – and that’s it. Not very iterative, is it?
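Finding out isn’t exotic math, either. A back-of-the-envelope check on whether a bounce-rate shift is real is a two-proportion z-test – here’s a plain-Python sketch with made-up numbers (a hypothetical jump from 40% to 45% bounce on 1,000 sessions per version):

```python
from math import sqrt, erf

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    p_pool = (hits_a + hits_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 400/1000 bounces before, 450/1000 after the change
z, p = two_proportion_z(400, 1000, 450, 1000)
print(round(z, 2), round(p, 4))
```

With those made-up numbers the difference clears the conventional 5% significance bar – which is exactly the kind of thing you’d want to know before the change goes site-wide.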

“Our dev team/marketing team/product manager would be demoralized if all their hard work never saw the light of day.”

That’s why they call it work. That’s why they have to pay you to do it. Certainly, we can make it fun. But ultimately, decisions on what does and doesn’t go into the final software release MUST be based on how that software accomplishes business goals. Team members who only feel validated by having their work seen can start a blog. Or therapy.

“Doing UX testing on only six people isn’t statistically significant.”

The same person who told me this also told me that he wanted a study on the top 1% of revenue-generators because that would be enough customers to be statistically significant (but surely you understand that the top 1% isn’t a representative sample for…aw, screw it, where’s my propeller hat?). UX testing is not the same as quantitative testing – five or six subjects are plenty. If you’re not sure why, read Jeff Sauro’s great blog post about user testing and sample sizes.
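The arithmetic behind the small-sample claim is the classic problem-discovery formula: if a usability problem trips up a share p of users, the chance that at least one of n test subjects hits it is 1 − (1 − p)ⁿ. Using the commonly cited 31% average visibility figure:

```python
def discovery_rate(p_problem: float, n_users: int) -> float:
    """Probability that at least one of n_users encounters a problem
    affecting a p_problem share of all users: 1 - (1 - p)^n."""
    return 1 - (1 - p_problem) ** n_users

# A problem hitting 31% of users, tested on 1, 3, 5 and 6 subjects
for n in (1, 3, 5, 6):
    print(n, round(discovery_rate(0.31, n), 2))
```

Five subjects already surface such a problem roughly 84% of the time – which is why UX practitioners keep saying five or six users per round is plenty.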

“We don’t need to test – we have a good feel for what users want.”

Robust, qualitative user testing tells you where you need to tweak to enhance your software’s usability. User testing is necessary, in part, because you may be too much of an insider, too in love with your own work, or too defensive about your work to identify the usability problems that can torpedo software adoption. It is also necessary because it helps remove our own biases and opinions (the dreaded “I think users would want”) as impediments to truly successful software. Skip it and you’ll risk shipping software that works but that users won’t use.

“I had my frat buddy/luddite office mate/mother test it, and they said it was great!”

User testing can’t be successful with insiders. Or insiders’ friends. Or insiders’ mothers. Real users won’t forgive you if it worked on your machine but doesn’t work on theirs. Users won’t let you explain the jargon that makes no sense to them and perfect sense to you. UX is a professional discipline – and no, you can’t do it just as well.

A Marketer’s Guide to Agile Development – Why the Numbers Still Don’t Match

So you got a business intelligence system. It’s gonna be great! No more frustrating meetings where you spend half an hour wrangling over whose sales number is right! One source of truth, pure and simple! Suckerrrrr. Like Oscar Wilde said, the truth is never pure and rarely simple.

For instance, your BI system will have selectable date ranges – Harry will pick Monday through Sunday, Sally will opt for Sunday through Saturday, and they’ll both label it as “last week” on their slides. And then, there’s the problem of departmental myopia.
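The Harry-and-Sally split is trivially easy to reproduce. A quick sketch of the two “last week” conventions, using Python’s datetime with a hypothetical reporting date:

```python
from datetime import date, timedelta

def last_week(today: date, week_starts_monday: bool = True):
    """Return (start, end) of 'last week' under two common conventions."""
    # Days elapsed since this week's start under the chosen convention
    offset = today.weekday() if week_starts_monday else (today.weekday() + 1) % 7
    this_week_start = today - timedelta(days=offset)
    start = this_week_start - timedelta(days=7)
    return start, start + timedelta(days=6)

today = date(2015, 6, 17)        # a Wednesday, picked for illustration
print(last_week(today, True))    # Harry: Mon Jun 8 - Sun Jun 14
print(last_week(today, False))   # Sally: Sun Jun 7 - Sat Jun 13
```

Same BI system, same “last week” label on the slide, different seven days of sales. No wonder the numbers don’t match.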


The Simple Question: How many widgets did we sell last week?

BUSINESS INTELLIGENCE TEAM

Ten. You put ‘WDGT’ in the prompt box. There’s ten of those. Where did the other two go? I don’t know, but feel free to put in a ticket and we’ll look into it after tomorrow’s sprint 5 release.

FINANCE

Four. The ones with the MarginBuster promo are zero margin, so they aren’t really sales. Damn those Marketing guys…

MARKETING

Eight. Our MarginBuster promo code brought in eight sales. Gotta run, getting my chest waxed this afternoon.

WEB ANALYTICS

Three. WebTrends says three. Offline sales? Uh, geez, I seem to have left my abacus at home, bro. No taggie, no countie.

OPERATIONS

Eleven. One canceled. Sales guaranteed him the widget would get him first position on Google for the term “cool”.

FULFILLMENT

Four. We don’t count the load if it ain’t in their abode. If it ain’t come to fruition, you don’t get your commission. If it ain’t off the truck, it’s not worth a…well, we don’t count it yet.

SALES

Twelve. Gimme my bonus.

A Marketer’s Guide to Agile Development – The A Word

Marketers sometimes encounter Agile developers so full of themselves they’re leaking hubris all over the Kanban board. There is no part of the Agile Manifesto that says “It’s nothing personal – it’s just that we’re better than you”. But since most marketers are not that familiar with Agile principles, it’s not surprising that they blame the practice and not the practitioner when they encounter this behavior.

Does Agile breed arrogance? No more than ships cause barnacles. Agile in its best form fosters empowerment, and empowerment is very different from arrogance. It’s empowerment to use the best idea, not “I reject your reality and substitute my own.” In my opinion, arrogance is a perversion of Agile methodology, and an enemy of Agile acceptance and integration. The real evangelists of Agile are the ones who can communicate to the business what’s good about Agile in terms other than what’s good for developers.

But other Agile evangelists, in their zeal to bring the good news to those perceived to be waterfall worshippers, will assign Agile cult status. They then think they have license to shun and ridicule those who are not in the cult.

When a member of an Agile development team treats a marketer like a superfluous throwback, talks to them condescendingly, uses lean requirements as an excuse to build to their own vision instead of collaborating, and rolls their eyes and smirks when disagreements arise, the marketer blames this abuse on Agile. A product manager who works diligently with a team, sees her input ignored, then ends up with a product that technically works but users hate, concludes that Agile doesn’t work for product development. A brand manager who discovers shortly before launch that the development team was only pretending to work with him to prevent any interference in their workflow (via a process known as blocking), sees Agile as the justification for bad behavior. Things you might hear:

“I’m not changing my design just because UX said it’s not customer-friendly. That’s just their opinion.”

I don’t buy the premise that collaboration means that all opinions are created equal. I’m a database guy who is reasonably well-informed on user-centered interactive design and architecture. But branding myself a UX expert because I once listened to a Jared Spool podcast is as intellectually honest as saying I’m a nuclear expert because I stayed at a Holiday Inn Express. There are professionals that do UX for a living, and there’s a boatload of best practices that they know about and I don’t. Inclusion is great, and we’re all special, but neither my opinion nor the developer’s should carry more weight than the expert in the room. Nobody’s an expert on everything.

“We can’t take sprint time away from features coding to work on analytics – you don’t really need the data anyway.”

What the what? Do you build a car without a speedometer so you can get the leather seats just right? Do you refuse to take your temperature when you’re sick, because your intuition is better at detecting a fever? The business flies blind without success metrics, and the developer shouldn’t have line-item veto power over them. Of course, the analytics people shouldn’t be the only ones insisting on tracking. The business should be the ones to push back on any pressure to jettison analytics in order to cram more features into the sprint.

“You’ll have to prove to me that this enhancement is worth my time.”

Everyone who touches a project should know its purpose, its goals, its worth to the business, etc. If the business is not using proper discipline to prioritize projects, the development team may be within its rights to question whether an item belongs on a sprint. That doesn’t mean developers have unilateral authority to pick what they’re going to work on.

Agile can energize marketers and allow them to respond more nimbly and decisively to the needs of the business. But marketers must trust the process and trust the players to make Agile collaboration a success. If the trust is not there, or has been broken, it’s not a small “grumble at the watercooler” type of problem. It’s an emergency that must be resolved before real progress can be made.

A Marketer’s Guide to Agile Development – Backlog Bingo

My own requests to get on sprint backlogs are often analytical – that whole measuring-success thing – for a variety of reasons: (1) enhanced analytics the business didn’t realize would be needed when a feature or function originally went live; (2) basic tagging that should have been included when a feature or function went live but didn’t make it into the original sprint; (3) tag modifications made necessary by site changes; (4) repairs to tags that worked in test but for some reason broke in production.

Save your comments, I know Scenario (4) shouldn’t happen. Neither should petroleum-coated pelicans, decaf espresso or the Heidi & Spencer union, but hey, what can I tell ya?

Delivering working software is priority numero uno in an Agile development environment. Now analytical tagging is software, but it’s not the kind you can point to on a screen when your buddies are over and say “See that page? That was me.” So how will you get it prioritized over the other coding projects that result in something you can screenshot and put into a slide presentation? Or a developer’s resume? Or a fellow marketer’s resume?

Marketing has to lead the way in fostering a corporate culture that values the measurement of success as an integral and non-removable part of any website initiative. A healthy Agile environment begets continuous improvement – and continuous improvement requires baselines and monitoring to keep score.

If coding that enables measurement isn’t considered valuable software, then your analytical requests simply may not make it. Sexier coding tasks – the ones you can see, hear, click, monetize – will rise to the top of the sprint list – and your analytics request will go to the shady, grassy section of the backlog to live with Jesus and Ronnie James Dio.

Tracking the success of what you build is not an enhancement or a feature or an add-on. It’s what marketers and developers need so they will know what to do next to dynamically enhance performance. And what could be sexier than enhanced performance?