Playing Digital Jenga with Modern Marketing Analytics: Interview with Christopher Penn


Today’s interview with Christopher Penn, Chief Data Scientist and Co-founder of Trust Insights, created a buzz before it was even published. So grab some popcorn or a cup of hot tea and read this conversation about analytics and analysts, modern martech, and where everything is headed.

As always, thanks to Mariia Bocheva, who conducted this great conversation with Christopher.

Christopher Penn

Source: Christopher Penn’s official Facebook page

Mariia Bocheva: Can you give us a quick intro about yourself?

Christopher Penn: My background was in IT. My master’s degree is in management of information systems. I started working at a financial services startup in the early 2000s. “Update the web server” became “update the website,” and “fix the email” became “send the email newsletter.”

What happened at that time was marketing became marketing technology. Then in the late 2000s and the early part of this decade, I moved to marketing full-time and started working with a lot more data. And that was when I became much more focused on data science. I went in-house at a PR firm for about five years and really started looking more at machine learning and AI.

I parted ways with that company about two years ago now because they were going in a different direction. I really wanted to focus on data science and machine learning.

Skills to thrive

MB: You’ve mentioned data science, machine learning, and analytics. What do you think are the hard skills that are most important today?

CP: There are three buckets of skills. If you go back to 1964, there was a famous business strategist named Harold Leavitt who posited that there were three areas everybody needed to be competent in: people, process, and technology.

For a skilled analyst or a skilled data scientist, you need capabilities in all three areas. You need to understand the people side of things, how people work, the overall business strategy and stuff like that, and you need to have a good handle on the problems the business wants to solve. You also need to have a handle on the process — on how things are done — and be able to translate that to the work that you do. And then, of course, you need to have access to the technology, the skills, the mathematical knowledge to do data analysis — and you need to be able to code.

Those are the three big buckets that very few people have in equal and robust amounts. A lot of people have skills in one area and are weak in the other two. And that’s probably one of the most important things an analyst has to be able to do — to identify their weaknesses. To be able to say, Yep, this is what I need to work on to bring my levels up to where I’m strong.

MB: That partially covers soft skills, as I would barely call knowing how people work a “hard skill.” What do you think about that? What soft skills do you think are important?

CP: The difference between hard and soft skills is really about external and internal. Your external skills are the ones you would call soft skills: the ability to talk to another human being, to read the expressions on their face, and so on.

But internally, that requires a tremendous amount of self-awareness. It requires understanding your own ego, the things that make you more or less effective as a business professional, your weaknesses that you have to accommodate for. For some people, it’s their ego, their ability to say, Yeah, I always have to be right. But no, you don’t.

And in fact, it’s a deadly sin in data analytics, right? I remember I was working with one research firm whose pitch was “whatever point you want to make, we can build data to support it.” That’s not how this works. [Laughing] And yet they’re considered a reputable firm because they help companies invent data to defend what they want to prove. People do business with them because they say what you want to hear.

Another super important skill is to be able to explain things to other people in ways that they can understand. So again, there’s that sort of empathy. And then the third thing is really being able to teach at multiple levels, being able to know where somebody is (the empathy part) and then adapt to the audience you’re working with.

I was at Content Marketing World recently. And in one audience, I had people who were coders. So I told the coders, Okay, this is Markov chain modeling for you, but for everyone else it’s digital Jenga. Adapting your teaching to where somebody is: that’s a super important soft skill. It’s very difficult if you don’t have a background in instructional design or any experience with it at all. I’ve been helping out in my teacher’s martial arts school for 25 years, and that’s a great skill I’ve picked up there: being able to see that someone is struggling with something, to see what they need help with, and to identify where somebody is in their own progression in their career. And it’s a skill that’s not taught. You have to learn it through your own experiences.
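For readers curious what the “digital Jenga” Penn mentions looks like in code, here is a minimal, illustrative Markov chain built from word-to-word transitions. This is a toy sketch, not the model from his talk; the function names (`build_chain`, `generate`) are ours.

```python
import random
from collections import defaultdict

def build_chain(words):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length, seed=0):
    """Walk the chain: from each word, hop to one of its observed successors."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the quick brown fox jumps over the lazy dog".split()
chain = build_chain(corpus)
print(generate(chain, "the", 4))
```

The “Jenga” intuition: each generated word rests only on the word immediately before it, the way each Jenga move depends only on the tower’s current state.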

MB: Do you think it’s also connected with storytelling and the ability to present data in a way people can comprehend?

CP: I agree that it is. There’s an expression in psychology called “throwing mattresses” — everybody has a mental doorway that’s differently oriented. And if you fling a mattress at that doorway, there’s only a certain number of ways it will get through. But everybody’s doorway is different. So how is your audience shaped? Can you throw mattresses in a way that gets them through as many doorways as possible?

I dislike “storytelling.” And the reason I dislike that phrase is because marketers, especially marketers who don’t have a good grasp of data, have a tendency to be too self-centered, too company-centric, and say something like, We’re going to tell you a story about our wonderful product! But nobody cares, right? Tell the stories the audience wants. And that, again, goes back to empathy and self-awareness. It’s not about you or the company. It’s about what the audience needs.

MB: That’s a great point. I totally see what you mean. Speaking of throwing mattresses and how everyone is different, what do you think about miscommunication between analysts and marketing teams? Do you have any recommendations on how communication can be improved?

CP: I don’t think it’s much of a problem because not many marketing teams talk to analysts in the first place — or even have access to them. Most of the marketers I talk to are forced to try to be analysts because their company doesn’t have any [analysts] or the company has deployed them on more pressing business problems. So the bigger challenge is for those poor marketers to develop any kind of analytical skills to deal with the data they have.

I can’t tell you the number of times I’ve talked to a marketer who’s like, Oh, yeah, we have Google Analytics. I don’t know what it means. I don’t know anything in Google Analytics. It’s so weird to have those discussions. It’s not, you know, rocket surgery. But again, going back to that self-awareness, it is rocket surgery for them because they got into marketing to be creative, to be inspirational, to write well, to craft stuff. They didn’t get into marketing because they loved quantitative science. Otherwise, they would have become a data scientist in the first place.

But now, marketing technology and digital marketing have forced them to deal with this problem or ignore it as long as possible. When they do get access to data, or access to an analyst, their success will depend on their self-awareness.

Analysts don’t have a good marketing vocabulary, and marketers don’t have a good vocabulary in data science and data analytics. So probably the most important and easy thing to do is to get people together on Friday afternoons with the beverage of your choice and get them talking to each other. Because you can become familiar enough with a discipline just by hearing people talk about it.

Also, people have commutes in a lot of places. And it’s not a bad thing for your data scientists to be listening to marketing podcasts or for your marketers to be listening to data science podcasts to really start to hear the vocabulary, hear things like regression and logistic regression and random forest. And people might say, Oh, I should ask my team what this means or There’s different data types in each of these, how do we address them?

It’s really getting everybody hearing the vocabulary of other disciplines as a way to start having those meaningful interactions.

MB: I really like how you put it together. I have my background in analytics, but I also work in marketing. For me, it was never a problem, because I had exposure to both. But a lot of people who’ve been working in one field aren’t exposed to the other.

CP: Exactly.

MB: Great. In terms of mistakes, what do you think is the biggest mistake an analyst can make? Maybe you can share something from your experience that you’ve run into?

CP: In terms of mistakes that analysts make… These are human problems, right? So the number one really is the presupposition bias. That is, I know what outcome I’m looking for and I need to prove it. Nope. The word for that is incurious. If you’re incurious, you’re not curious, you don’t want to find the right answer. You want to find an answer as quickly as possible, or you want to find an answer that justifies a bias that you already have, an outcome that you’ve already judged, and those are human problems, right? Those are not technology-related, and no software is going to fix that. No tool is going to fix your being incurious.

Christopher Penn on stage at Decode

So that’s something we have to train as best we can and hire for, more than anything. And you can hire for curiosity, right?

There’s one interview question I used to ask, which I always thought was fun, where I would hand a deck of regular playing cards to the person at an interview. I would say, Here’s a deck of cards. Put it in order. I never specified any kind of order. The goal is to watch how the person reacts to a directive like that. Do they panic? That’s probably not a good sign. Do they just say, Yep, I got it, I’ll put it in order? But I didn’t say which order I wanted, so again, they have a presupposition bias, and that’s bad. Do they ask questions? How many questions do they ask? When they dig, do they push you on it? Those are the attributes you want to see.

The human problem of making companies data-driven

MB: Let’s move to the next part of our interview about the same thing but from another point of view. There are a lot of buzzwords today. Everyone is talking about companies being data-driven, but not a lot of companies succeed in that. Why do you think that is? Is there anything we’re missing in terms of analytics or marketing that could take companies to another level and help them grow and be able to use the data they have?

CP: This is a human problem more than anything else.

To be data-driven means that you need to make decisions with data first — not experience, not intellect, not instinct, not this is the way we’ve always done it.

You need to look at the data and go, Okay, this is what the data says, and we’re confident in how it was processed, so let’s make decisions based on it.

That requires overcoming my way is always right — what we call the HiPPO problem [the highest-paid person’s opinion]. It’s such a cultural and human problem that it’s difficult to overcome, so becoming data-driven is really, really tough.

The number one thing — my friend Tom Webster of Edison Research says this — that holds somebody back from being data-driven is that they can’t deal with data and analysis and answers that they don’t like!

In the martial arts, we have an expression that you have to be comfortable being uncomfortable, which means you have to be comfortable with partial answers, with incomplete answers, with never having some of the data, and still be able to use the data in ways that deliver business impact. It’s not a problem of specialization or specific skills; you can teach any skill at all. The problem is: can you make the cultural shift to say, Yes, even if I don’t like the answer, even if I’m unclear on the answer, I will still use it to make a decision?

MB: So it’s more about a cultural shift.

CP: It really is.

MB: Talking about this cultural shift, do you have any recommendations on where to start?

CP: Work for companies that are already that way. It sounds flippant, but because it’s such a human thing, if the DNA of the company you’re working at isn’t that way, it’s extremely difficult to make that pivot, especially if leadership hasn’t bought in. If leadership is like, No, we’re going to do it the way we’ve always done it. Our company is 126 years old, and we have always done it this way! Well, you’re not going to change their mind.

I was having dinner with somebody at a conference recently. And they said, Our company’s 126 years old, and our CEO said we just want it to be 1950. Why can’t it be 1950, when people just buy and use our products and this whole internet thing goes away? Well, bad news, that’s not gonna happen. [Laughing]

MB: It’s quite difficult. A quick story here: We work with one company that has a long history. They were founded in 1908 or something like that as a family business, and they’ve produced lingerie since then. And they became pretty big. Now they sell in France, the Netherlands, the UK, Germany, and so on. And they have a lot of brick-and-mortar stores, and they have different chains. They have a luxury segment, mass market segment, and so on. And at some point, their ecommerce team came to the management and said, We need to increase our budget because our expenses are growing, we’re bringing in sales and this and that. And they said, you know what, guys, if we look at the overall revenue, you’re only responsible for 5%.

So the ecommerce team came to us and asked, Can you help us prove, considering research online purchase offline, that we actually produce more revenue for the company and that we influence bigger numbers? We ran a project with them, and it took a good five or six months. But at the end of the day, it turned out they were responsible for almost 30–35% of the revenue. And it helped them prove they were doing stuff and that helped them increase their budget. But yeah, it took a while talking to the board of directors, showing the numbers, and trying to shift this internal feeling. Exactly what you were talking about.

CP: Yeah. Culture is tough. People are tough.

MB: That brings me to another question. I’d like to circle back to the first question about hard skills. You were talking about people, process, and technology, but what do you think about more technical stuff, like simple Python and knowing how to build dashboards? Maybe statistics? What are the crucial things to start with for people who want to dig deeper into analytics?

CP: Okay, so I have a bit of a rant on this. Dashboards are visualization. Visualization is a critical part of data analysis. And it’s one of those things that’s an art unto itself. The technologies you mentioned — SQL, Python, R, etc. — are not visualization tools. They’re compute tools. 

And one of the biggest sins in all data analytics, but especially in marketing, is trying to make your visualization tools do computing.

It’s like people who try to mix their website’s code with its content and visualization. Your style sheets and your HTML are separate for a reason: visualization is separate from computing. You absolutely need SQL, and Python, and R, and SPSS, and OWOX, and whatever else you use to do the computing and to do it well, so that all the data analysis happens on the back end before anything goes to visualization.

At no point should your dashboard — should your visualizations — be doing any kind of computing. You should not be blending data, you should not be doing data manipulation. All the data that goes into visualizations should already be computed before it gets there.

Because what happens otherwise is you run into this massive governance problem. The boss needs a new change to something, but we don’t know how that data point got there. So we don’t know how to manipulate it. So we can’t make any changes. We’ve got to tell the boss, Sorry, it’s gonna take six weeks for us to untangle our spaghetti to understand what’s going to happen here.

When you keep compute separate from visualization and the boss says, Hey, I want this, you say, Great. I go into my compute layer, load up RStudio, make the changes in the code and the computation logic, and push out a new or changed data point. And on the dashboard it’s just, Yep, bar chart, new bar chart, there it is. It keeps changes clean, it keeps versions under control, and it makes all data manipulations so much easier to manage, especially at scale.
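The compute/visualization split Penn describes can be sketched in a few lines. This is an illustrative pattern, not any specific tool’s API: the compute layer does all joins, blends, and math, then hands the dashboard a finished table it can only render.

```python
import csv
import io

# --- Compute layer: all aggregation and math happen here ---
def compute_summary(raw_rows):
    """Aggregate raw event rows into the exact table the dashboard will show."""
    totals = {}
    for row in raw_rows:
        totals[row["channel"]] = totals.get(row["channel"], 0) + int(row["revenue"])
    return [{"channel": c, "revenue": r} for c, r in sorted(totals.items())]

def export_for_dashboard(summary):
    """Write the finished table; the visualization tool only ever reads this."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["channel", "revenue"])
    writer.writeheader()
    writer.writerows(summary)
    return buf.getvalue()

# --- Visualization layer: render what it is given, compute nothing ---
raw = [
    {"channel": "email", "revenue": "120"},
    {"channel": "search", "revenue": "300"},
    {"channel": "email", "revenue": "80"},
]
dashboard_feed = export_for_dashboard(compute_summary(raw))
```

When the boss asks for a change, only `compute_summary` is edited and versioned; the dashboard keeps reading the same clean feed.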

If you’re a marketing team of one, you can probably get away with a one-person, one-shop, one-tool kind of thing. And you won’t run into version control issues because you’re the only person responsible for it. However, if you leave or get fired, that company is totally hosed. It’s the worst practice.

If you’re at a larger company where there are multiple people in your analytics and marketing departments, you must keep computing and visualization separate. Otherwise, you’re going to waste so much time and money just trying to figure out where the data lives and who’s in charge of it.

The other thing — and this is so important for compliance these days with GDPR and CCPA and other regulations — is that if visualization and compute are blended together, you have an access control problem. You have a problem where you cannot create boundaries as to who has access to the data, which means you can leak data, you can have it exposed to employees who shouldn’t have access to certain data sources.

When compute is governed and handled separately, you can restrict it and say to your visualization team, You can visualize everything you’re allowed to have, and at no point does any sensitive personal information ever even reach the visualization team, which keeps it safe. It’s so important to keep this stuff separate and clean.

So does an analyst who’s responsible for visualization need to know those languages? Absolutely not. Does the compute person need to know? One hundred percent yes.

Does size matter? Analysts at big and small companies

MB: You do have a big point here. I really like the approach when we visualize ready-made data sets and there are no calculated fields in them.

And I’d like to clear up one question I already asked. There are different stages of business or different sizes of businesses or businesses operating at different scales. You can look at it from different angles. SMB, SME, enterprise — they all have different requirements for an analytics team, or at least it looks like they do. What do you think? Do you think there’s a difference in the requirements for analysts at companies of different types?

CP: Fundamentally, everyone has the same goal in marketing, which is to help drive revenue. Right? If your goal in marketing is not to help drive revenue, I’m not sure what you’re doing in marketing. But whatever it is you’re doing, you’re doing it wrong. Because at the end of the day, that’s what our charge is — to create impact. What happens is not that there are different things to do. As your company scales, the tasks become more specialized. When you’re a one-person show at a small company, you’re the web designer, the email guy or gal, the analyst, the statistician, the customer service person possibly.

As you grow, you hire. And as you hire, you specialize. You start splitting off individual tasks. And each task now has its own reporting and its own data, its own visualization. And so at the end of the day, the thing that companies do wrong most often is they lose sight of the fact that every one of these specializations, especially in the enterprise, still has to reach for that same goal that everybody has, which is Did we drive business impact? Do we help the company make money, save money, or save time? Grow customers? All these things.

So depending on where your company is in its life cycle, your role, if you’re a sensible analyst, is to ask, What am I doing that contributes to that big picture goal? What are my KPIs personally? Are my personal KPIs aligned with the department’s KPIs? Are they aligned with the business’s? The definition I use in all of my talks — you heard it at the MAICON conference [Marketing Artificial Intelligence Conference] — is that a KPI is a number that determines whether you get a bonus or get fired, right? That’s a KPI. If it’s a number and you’re not going to get a bonus or get fired for it, it’s a metric.

So when you look from that perspective, what are the KPIs that you’re working towards, that your team is working towards, that your department is working towards, that your company is working towards? And you look at those numbers, and sometimes they’re not gonna make any sense. And if that’s the case, then you as an analyst need to be aware of that and raise the flag and say, Guys, I’m not sure this is going to drive any business impact. And if the company says you still have to do it, that’s when you start updating your LinkedIn profile. Because if a company is working towards nonsensical KPIs, they’re going to go out of business. It’s only a matter of time.

the conference with Christopher Penn

MB: So just to make sure I got you right, there’s no difference in the requirements for analysts at small companies versus big companies because the goal for everyone is revenue.

CP: Yeah, revenue and business impact — are you helping the company make money, save money, or save time? Those are the three functions that everybody wants and every business needs. It’s the same thing we’ve been talking about for 10 millennia: better, faster, cheaper… everybody wants that. So what are we doing to make our business better, faster, and cheaper?

The hardest challenges for the analytics market

MB: What analytical challenges do you currently have? What do you use to overcome them? Or how do you plan to solve them?

CP: My company’s biggest gap right now is in highly specialized knowledge in specific areas that we know are strategic priorities. We just need to create more time and revenue to acquire the knowledge, specifically around deep learning.

We’re proficient at classic machine learning, and we’re proficient at statistics. Deep learning, particularly once you start using things like transformers and super complex neural networks, is an area where we don’t have enough specialization yet, where we don’t have enough expertise and experience. That’s where we know we need to go, because that’s where the market is going. We see this in tools like OpenAI’s GPT-2 or the Grover model from the Allen Institute for AI. The cutting edge is moving so quickly: MelNet with its voice synthesis, for example. We need to build those capabilities internally so that we can offer them to customers.

Now the good news is that there’s a bit of a market maturity issue here. A company doesn’t need GPT-2 or MelNet today to drive serious marketing impact. Those are still early adopter technologies, but we know that we need to have specialization or capability in them. So when the market does catch up, we have a presence for those things. You need none of this to put together a good marketing dashboard. Zero of it. At most you need classic machine learning to do stuff like predictive analytics, time series, forecasting, and things like that.

But knowing where the technology is going, knowing where the market is going, knowing where the device makers like the Googles, the Apples, and the IBMs of the world are going gives us strategic guidance as to what we need to be able to do. So as the market catches up, we’re there.

MB: That sounds amazing. And really inspiring.

CP: And very challenging.

MB: For sure. Since you’ve started talking about the maturity of the market, how do you evaluate the current maturity of analytics and marketing analytics in particular? What do you think is the future of marketing analytics?

CP: The maturity of marketing analytics is still very, very far behind the maturity of analytics overall. Analytics and statistics and data science have very strongly proven techniques and models and methods that are 50, 60, 70 years old at this point. And they work brilliantly, right? They’re beyond proven.

But because of marketers’ lack of quantitative skills, many of these things are showing up as new in marketing. And I’m like, Come on, that’s 70 years old! And one of the challenges, I think, for marketers, is to be able to talk to a vendor, particularly someone who’s advertising a brand-new thing and say, No, that’s snake oil. That’s total BS. That is this technique. I know it’s this technique because of the outputs that come from it. And you’re selling somebody something that is, you know, $500 a month for something they can do for free with open-source software that’s existed for 15 years.

So there’s a lot of snake oil in the marketing analytics space right now because companies are figuring out that one or two legit data scientists at best and a programmer can create a standalone solution around one technique and sell it like it’s the new Ambrosia, and that’s not the case. So the marketing analytics space overall is pretty far behind, and the marketplace is catching up.

But the people are not [catching up]. The people are still stuck very far behind. But that is changing. When I go to conferences and I talk to people who are younger, in their early 20s, new in their career, they still have no quantitative skills. They still have that, for lack of a better term, arts and crafts mindset, which is great. You need that right brain creativity. Absolutely. But you also need the left brain. You need a whole brain marketer. And that’s not what is happening in the marketplace.

I was talking to a few folks last week at Content Marketing World who were like, Yeah, I’m new in my career. I’m two years in, I don’t know any of this analytics stuff. I’m like, What did you study in college? Were you drunk the whole time? What happened there that you didn’t take a Stats 101 class? It’s not like statistics is new. And it’s not like measuring marketing is new. Google Analytics has been on the market since 2005; that’s the age of my eldest child. At this point, you should know that these are strategic priorities for your career and have those capabilities. And I’d say we’re going to keep running into this problem in marketing analytics for a long time to come, because the folks who are brand new out of school still don’t have those capabilities.

MB: Yeah. I did have statistics at school, but I still have to go and revisit it from time to time.

CP: That’s totally fine. If you’re googling on Stack Overflow and stuff, totally fine. But at least you know it’s there.

MB: And what about the future of marketing analytics? Now we have these tools that are sort of old, or are using old approaches and trying to sell them as new. So people have got stuck at the previous level, so to speak. Where do you think it’s going? How is the analytics market going to develop? What trends do you see?

CP: I think the real interest is in the automated machine learning and auto AI space. IBM Watson Studio has Auto AI, H2O has AutoML. These tools are AI for AI, essentially. They take a data set, select the model, do hyperparameter optimization, do feature extraction, and spit out the best-performing model for your data. And while there are still very serious limitations to these tools, they considerably accelerate the process of doing hardcore data analysis on data sets.

And so what will happen, as there’s market priority, as there’s market demand, as people are willing to pay for it, is that you’ll see more analytics tools saying, You know what, you clearly have no idea what you’re doing. Still. So we’re just going to do it for you and give you a buffet of answers, and you pick the one that makes the most sense for your business. I think that’s where the tool sets will have to go. It’s happening already in the machine learning space: with a tool like Auto AI, you plunk in your data, it selects the algorithm, and you get the results.

And then you as the skilled data scientist look at it and go, Hmm, now we know these results still need some additional work, or Yeah, that’s good enough. That’s what I would have done. And it saves you an enormous amount of time because when you’re doing feature engineering and hyperparameter optimization, that can take so much time, like days, and to have a tool saying, I ran all 450 variations of this model and variation 73 is the one that works best, that’s cool. Because that took less than an hour. I can get on with my day as long as the output is good.
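The try-every-model-and-pick-the-winner loop at the heart of AutoML can be shown with a deliberately tiny sketch. Real tools like H2O AutoML and Watson Auto AI also do feature engineering and hyperparameter search, which this toy omits; all the names here are ours.

```python
# Candidate "models": each maps x to a predicted y. A real AutoML run would
# train hundreds of fitted models; here we just compare three fixed ones.
candidates = {
    "constant": lambda x: 5.0,
    "linear":   lambda x: 2.0 * x,
    "square":   lambda x: x * x,
}

def mean_squared_error(model, data):
    """Average squared gap between predictions and actual values."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

def auto_select(candidates, holdout):
    """Score every candidate on held-out data and return the best one."""
    scores = {name: mean_squared_error(m, holdout) for name, m in candidates.items()}
    best = min(scores, key=scores.get)
    return best, scores

# Held-out data generated by y = x^2, so "square" should win.
holdout = [(x, x * x) for x in range(1, 6)]
best_name, scores = auto_select(candidates, holdout)
```

This is the “variation 73 works best” moment Penn describes: the machine ranks the variations, and the data scientist sanity-checks the winner.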

Marketers already see a lot of this happening in their tools. Google Analytics, for example, has a little button in the upper right-hand corner called Insights. And all it is is an anomaly detection algorithm that Google runs on your data to say, Hey, this happened yesterday, you might want to pay attention to it. As vendors evolve, more and more of these things are just going to be built in, because the vendors know the market is not going to build the skills. They’re just not.
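An anomaly check of the kind Penn describes can be approximated with a simple z-score pass; Google’s actual Insights algorithm is more sophisticated and not public, so treat this as a minimal sketch.

```python
import statistics

def detect_anomalies(series, threshold=2.0):
    """Return indexes of points whose z-score exceeds the threshold."""
    mean = statistics.mean(series)
    stdev = statistics.pstdev(series)  # population standard deviation
    if stdev == 0:
        return []  # a flat series has no anomalies
    return [i for i, v in enumerate(series) if abs(v - mean) / stdev > threshold]

# A week of daily sessions; the last day spikes well above the rest.
daily_sessions = [100, 98, 103, 101, 99, 102, 250]
anomalies = detect_anomalies(daily_sessions)
```

For real traffic data you would also account for weekly seasonality and trend before scoring deviations, which this sketch ignores.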

MB: You know, from one point of view, I totally agree that that’s what marketers expect — that you’ll do all of this for them. I like how you put it — an AI for AI. But from another point of view, don’t these systems bring up even more questions because they operate as a black box? Some people who have an understanding of what’s under the hood and how it works — and also pretty deep domain knowledge — can definitely say, Yeah, that makes sense. But most people, if they see 400 different variations, will say, I still don’t know which one looks good.

Christopher Penn on machine learning

CP: Yep. And this is one of the biggest problems in machine learning and AI today — not only knowing what’s happening inside the box but interpreting and explaining it.

Explainability is a post-hoc explanation of what the model did, as in, these factors made it into the model and these didn’t. When you look at regulations like GDPR, that’s not good enough. GDPR says you have to be able to tell a customer in the European Economic Area how their data was used, which means you need to be able to unpack the algorithm itself and say, This is what happened with your data in this step, and this is what happened to your data in that step. And certainly in America we have some very serious problems with companies having absolutely no ethics when it comes to data. It’s not that they’re evil; they just have no ethics, period. And so what’s happening is that you’re creating all kinds of biases that reinforce existing system-wide structural problems.

The most prominent example of this is healthcare data on African Americans in America. The system is systemically corrupt: there is no good, reliable African American healthcare data, period. It doesn’t exist, because there are so many cultural and societal biases baked in that the data is all wrong. So you actually have to figure out how to compensate for that, by pegging health outcomes to the health outcomes of other ethnic groups, in order to get a working model that is fair.

And the same is true even in marketing. I was talking with an insurance company about their model for how they market their insurance and they said, We have all these things that help predict who the best customer is going to be. And I said, Okay, how will you guard against redlining? And they said something like, Oh, well, we wouldn’t ever draw lines on a map and say, we won’t sell insurance to these people. And I’m like, No, no, digital redlining means that in your model you may have features that you didn’t expect, like race, religion, things like that, that are disallowed. You’re not allowed to use those for decision-making. That’s kind of the point about data-driven marketing: you have to know what data you’re not allowed to use. And this company wasn’t doing that. So they were doing the digital equivalent of redlining. And they didn’t know it, because they didn’t have anyone with technical knowledge to look at the algorithm and say, This is not okay.
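Christopher’s point about disallowed features can be made concrete with a small sketch. This is a minimal illustration, not anything from the insurance company he describes; the column names and the disallowed list are entirely hypothetical:

```python
# Screen a model's feature list against attributes that are disallowed
# for decision-making. Names here are hypothetical examples.
DISALLOWED = {"race", "religion", "gender", "national_origin"}

def screen_features(features):
    """Return the subset of the given features that must not be used."""
    return sorted(f for f in features if f.lower() in DISALLOWED)

flagged = screen_features(["zip_code", "income", "race", "claim_history"])
print(flagged)  # ['race']
```

Note that a check like this only catches explicit features. The harder part of digital redlining is that an innocent-looking feature such as a zip code can act as a proxy for race, which no simple name-matching screen will detect.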

So one of the things that we have to take into account as we’re rolling out these technologies in marketing is that we have to be able to interpret and explain what the machines are doing and be able to stand in front of auditors or possibly in a court of law and explain how it works and why, for example, it is not discriminating against women. Last year, Amazon tried to build a hiring algorithm, and it immediately stopped recommending women because the training data set was 95% men, and the machine picked up on that. And it’s like, okay, so no more women. No, no, that’s not how that’s supposed to work. But no one thought it through.
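The Amazon anecdote is easy to reproduce in miniature. Here is a toy sketch, with entirely invented numbers, of how a naive frequency-based model absorbs the skew in its training set rather than anything about merit:

```python
# Toy training set of (gender, hired) pairs: 95% of the hires are men,
# mirroring the skew described above. The data is entirely invented.
training = (
    [("m", True)] * 95 + [("f", True)] * 5
    + [("f", False)] * 45 + [("m", False)] * 5
)

def hire_rate(gender):
    """Observed hire rate for one group in the training data."""
    outcomes = [hired for g, hired in training if g == gender]
    return sum(outcomes) / len(outcomes)

print(hire_rate("m"))  # 0.95
print(hire_rate("f"))  # 0.1
```

Any model that simply learns these base rates will favor men almost automatically, which is why the training data has to be examined before anyone trusts the predictions.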

MB: Amazing example. Thanks for that. Let’s dig deeper into our troubleshooting investigation. What problems do you see on the market today? One of them is ethics, for sure. And probably culture, employees, and the maturity level of companies. Maybe you can unpack that once again, or maybe you want to add something else to the list of problems?

CP: I think probably one of the most important things that companies are going to need to do is change how they hire. The reason you don’t have good people coming out of school with skills in statistics and data science as a standard — even if there are exceptions to every rule — is that you have professors at these schools who are, you know, in the later years of their careers, who also don’t have those skills and don’t know how to teach them. And the marketplace doesn’t demand them.

If the marketplace demanded them and said, Hey, even to be a marketing coordinator for this company, you have to have Statistics 101. You have to be able to distinguish between mean, median, and mode. If the marketplace demanded it, guess what? The hiring pipeline and the people coming out of school would have to adapt if they wanted to have jobs. It’s not a big deal right now, but it will be as we move into the next recession, which is coming.
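For the Statistics 101 point, the distinction Christopher mentions is a one-liner in Python’s standard library. The salary figures below are invented purely to show why the three measures disagree:

```python
import statistics

salaries = [40, 45, 45, 50, 200]  # invented numbers; note the outlier

print(statistics.mean(salaries))    # 76 -- pulled up by the outlier
print(statistics.median(salaries))  # 45 -- middle value, robust to it
print(statistics.mode(salaries))    # 45 -- most frequent value
```

A marketer who reports only the mean here would badly misrepresent what a typical value looks like, which is exactly the kind of basic judgment the hiring pipeline should demand.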

Companies will be able to afford to be a lot more selective because the talent pool will be much larger and the number of people employed will be much smaller. And so that will flip again for another five or six years. So now is the time to skill up so that when that flip happens and companies can be a lot more selective, you have the experience, you have the knowledge, and you may even have the portfolio that says yes, I can do what 99% of other marketers can’t. I can be creative and I can be quantitative at the same time. 

That’s a major problem that the marketplace will solve as we go through another economic cycle. But again, 99% of these marketers are not ready. So that’s something that they need to address themselves, becoming proficient in this stuff. And then all these vendors are creating all these analytics tools, which is great, but if you still don’t understand what you’re doing, it’s like putting somebody in a top-of-the-line kitchen with Viking ranges and Cuisinart appliances and Blendtec blenders and all the stuff, and they still don’t know how to cook. You can give somebody the best tools, the best software, the best everything. If they don’t know what they’re doing, it’s still useless.

Resources and recommendations for analysts

MB: I totally agree with you there. We touched on the theme of self-education. What professional resources or events can you recommend for analysts?

CP: I don’t recommend events if you’re trying to skill up. You should be taking courses and classes for that. And it depends on where you want to be.

If you want to do the visualization route, which is valuable for a data analyst, there are tons of really great books. Edward Tufte’s book The Visual Display of Quantitative Information is excellent. There’s all these great books that can teach you the art and science of visualization. That’s one side.

If you want to learn the data analytics side, I strongly recommend taking a stats course, because you need the math side. If you’re mathematically inclined and you love it, then you could progress into linear algebra, for example. If you’re not mathematically inclined, that’s a problem, and you absolutely should work on becoming so.

If you’re going to be on the compute side, you need to be doing either the R or Python programming languages, period. End of story. Those are the core languages in data science. And every vendor out there that is offering sort of an easy drag-and-drop interface — their technology is lagging behind what the open-source code is in the marketplace. If you want to use the latest version of GPT-2 from OpenAI, guess what? You better know how to use Python, because no vendor has it in their software yet. And if you want to be able to do advanced text analytics, that’s where you’re going to have to go. So you have to learn those things. If you want to take some courses, there’s a phenomenal resource hub from IBM called Cognitive Class, and it’s 100% free. You pay zero dollars and you can take all these different courses on every topic imaginable in data science and AI. I strongly recommend it.

MB: One very last question. How can an analyst have a greater impact on marketing or on business? I predict you’ll also say that it’s a human problem and an involvement problem, but maybe you could add something else.

CP: It’s somewhat related, but it’s having those KPIs and those goals and understanding your business’s KPIs. If you want to have an impact, you have to serve the overall goals of the company. And that means you have to understand them.

As a marketer, if you just blindly have that bias — I’m going to drive new leads, that’s what I’m really good at — but the company has a customer retention problem and needs your help creating content that retains customers and keeps them loyal, then you’re going to be doing the wrong thing.

Do you understand the goals of your company? Do you understand how your marketing relates to those goals? If you don’t, better update your LinkedIn profile, because you’re not going to last very long.

MB: Thanks a lot. That’s truly inspiring!

To sum up

Wow! There was a lot of stuff revealed in this interview! We’re still processing it all.

We totally agree with Christopher about the human and technology problems. Sometimes they’re hard to admit, but you should fight them. We’re glad to be one of the companies that’s moving in the direction of making data a tool for development, decision-making, and people.

Share this article with your colleagues, and stay tuned for our next interviews!