Video

Balancing the Modern and the Practical in Digital Marketing

E-commerce expert Rob Schmults joins Bain Partner Cesar Brea for a discussion of how, even though data and analytics have changed marketing as we know it, good fundamentals remain as important as ever.


As digital tools have evolved, so have marketing strategies. But tools alone aren't the key to effective marketing and e-commerce. Rob Schmults, an executive board member at private equity investment firm MidOcean Partners with extensive experience in e-commerce and digital marketing, joins Bain's Cesar Brea for a conversation about not losing sight of the importance of good marketing fundamentals amid the evolution of digital marketing tools, analytics and strategies.

Read a transcript of the conversation below:

CESAR BREA: Our guest today is Rob Schmults. Rob is currently a member of the executive board at the private equity investment firm MidOcean Partners. Most recently, he was SVP of e-commerce and CRM for Talbots, the multichannel women's apparel retailer. And he's a longtime board member at the National Retail Federation. Going back, all I'll say is that Rob is an e-commerce OG and a longtime friend, and it's my privilege to have the opportunity to talk with him today. So hello, Rob.

ROB SCHMULTS: Hello, Cesar. And can I? I want that clip so I can just use that as my standard intro.

BREA: Awesome.

SCHMULTS: Because it's better than I usually give myself, for sure, so thanks for that.

BREA: Good. Well, Venmo me the 20 bucks, and we'll be good. So you and I talk a lot about the topic today, which is really this issue of balancing the modern and the practical and managing by results. And, you know, essentially making sure that we stay within what we can get done, but stay open to the possibilities that some of these new capabilities provide. It's a topic we end up talking about a lot, both pre-Covid in person and now online, in various settings. So we thought it would be a good one to explore a little further.

But before we get into that, like, how's it going? How are you doing? The Strava feed suggests you're keeping fit, but maybe you're just channeling a lot of stress like me.

SCHMULTS: Yeah, yeah, exactly. Well, you know, my stock line is I'm sick of Covid but not sick with Covid. So I think generally I try to remind myself I'm very lucky and very thankful that, you know, me and my immediate family and everyone are all healthy and more or less doing well, et cetera. And that's certainly not the case for a lot of people, so very grateful. And hopefully, you know, same is true for both you and for the listeners today, because it's interesting times, as they say.

BREA: Good. I'm very glad to hear it. So we got to know each other pretty far back now. It's actually going on, like, 20 years, during kind of the Precambrian explosion or something of the Internet. I'm wondering if you could recap your evolution a little bit.

SCHMULTS: Yeah. And you're right. I think you and I probably first started having our off-camera chats about data and analytics probably more than 20 years ago when I think about it. So just full disclosure, Cesar and I have never actually worked together at the same place, but our paths crossed in what I think was the late '90s.

BREA: Yes.

SCHMULTS: And despite our youthful demeanors, we've been at it that long. And I know I've certainly learned a lot from you over the years about all things data and analytics. So it's fun to be having this conversation. In terms of my evolution, you know, there's that classic graphic showing the stages of humankind. I put myself somewhere in the middle, sort of pretty hairy knuckles starting to come off the ground, but, you know, moving up that chain there. So that's progress, I guess.

And then on the professional front, as you said in the intro, I've worked across both retailers, large and small, and also solution providers, again, large and small. And there's a fair bit of diversity in there despite the common thread of the Internet and certainly e-commerce. But I think the thing that's really linked all the roles together has always been data and analytics, not surprisingly. And whether that was the sort of central focus of my job, as it was sometimes, or just a portion of it, as it was in many cases, you know, all roads kind of led to and from there. So excited to have this conversation today and to the degree I've learned anything over my advanced years, hopefully, there's something of interest to the listeners.

BREA: Good. You're being very modest, but we'll leave it at that.

SCHMULTS: Let the audience decide.

BREA: Exactly, exactly. So when we talked about doing this, what we talked about was that the natural starting point was kind of the topic that we seem to gravitate to in our pre-Covid meetups. And I guess the way to describe it is basically swimming upstream through a lot of hype about tech and advanced analytics to actually get stuff working and get results. But one of the things that I think prompted this conversation was that we both noticed that we were most often on the same side of the argument.

And so it seemed like a good idea, even if we couldn't argue the other side with a straight face, to talk a little bit about how to keep that balance right. And more specifically, the balance between pushing for these new modern capabilities and yet trying to stay practical and focused on results and mitigating risks. At least that's how I thought about framing it. How would you shape it?

SCHMULTS: Yeah, so you already gave away the—you know, I could just be the worst guest ever and say, yes, I agree. But I'll elaborate a little further. One, it's been sort of interesting to see how e-commerce and online marketing, that sort of hype-over-substance balance, has, I think, gone from being very much hype to now being much more substance, right? And it's not that it's totally settled down, but that landscape's largely settled down. That is not the case with analytics, which, if anything, a few years ago was all big data, and now it's all AI.

And we're still in this mode that we've been in now for quite some time around analytics of it sort of being this automagical world with an almost aura of wizardry about it, which has a lot of downsides, right? Not the least that, by definition, hype usually represents a separation from substance. But I think there are a couple of things that are implicit there.

One is that when something's got a lot of attention and a lot of focus, and boards and CEOs understandably say, "hey, I want some of that" when they hear a fantastic story about how solution x or company y achieved unbelievable performance changes from this amazing, as I said earlier, wizardry, one thing that can happen is a focus on means rather than ends. So people start to say, "I need an AI strategy," right? As opposed to "I have a business strategy; does AI support that? And if so, where, and how can we use it to improve our underlying business?"

And so, again, that's something certainly I know you've seen, I've seen a lot over the past couple of decades, and very much is with us now in kind of all things analytics, where, again, people wanted to have a big data strategy. If you're a big data firm, that made sense. If you're a retailer or a brand or a manufacturer, that's not your business. Your business is your business. And then, again, whether it's big data then or AI now, it should be applied appropriately in service of that business.

And then the other thing—and this is such a tried-and-true statement, but so hard to stick to—is that, you know, for most companies, there's typically a lot of unharvested opportunity around the basics or the fundamentals, if you will. And I'm not throwing companies under the bus, right? It's just that there's always so much to do in any company. And those things tend to be more boring and less sexy, but often have a much higher return than the new new thing. So, anyway, I guess rather than just saying yes, I gave you a long-winded "yes, I agree with your construct, and I think it is one that we need to be aware of and kind of, I guess, on guard for."

BREA: Let's actually take it to a more specific level. I'm wondering if there's a couple of examples where you felt like you were trying to make this trade-off between bringing in something new while actually sticking to making sure you were harvesting the basics. Maybe one where you feel like you were a little bit too aggressive, maybe one where you were too conservative, and maybe the just-right Three Bears version here.

SCHMULTS: Well, I mean, Cesar, you know me well enough at this point to know I have never, ever made a mistake. And so unfortunately—

BREA: I will attest to that.

SCHMULTS: Actually, sarcasm aside, my problem here is probably narrowing down the number of things where I didn't get it quite right or outright got it wrong. I guess one story I'll tell is an example where, using our construct, you could say I bought into the hype. I was at a firm where we were actually able to build the coveted omnichannel, fully instrumented, fractional attribution model. So there are a whole bunch of words in there. But what it meant was that we were able to build a model that had awareness of all of our marketing touchpoints.

And then—and this is often the harder part—it also had awareness of where those customers and those prospects we touched then showed up across the various places that we sold our product, right? So most people can do that latter point around their digital site, but often, what's known as the match rate in the physical world is very, very low, because people will pay with cash or they use credit cards, where you can't get the data, et cetera, et cetera.

So even though I know, hey, it was Cesar—I know you, Cesar, you're in my file—because you showed up in my store, in a lot of places, I don't know if it was you; I just know there was a transaction. So this company that I was at, we had the data. We actually had a very high, a ridiculously high, match rate. So we could really see how each dollar that we spent in our marketing programs led to sales, or not, in our physical stores.

So this is not a small undertaking, even when you have the data. There's just a ton of work that needs to happen. And we were lucky we had a fantastic team. You know, people I would work with again in a heartbeat, just really strong performers. We had great third-party support. I think we picked the right platform, the external platform.

And we had great support from them. And we spent months wiring this thing up and then wringing out the gremlins, making it work, et cetera. We finally got to the point where, all right, everything's good, it worked. The big reveal: What did we discover? What new, amazing thing? Well, we found out that our prior methodology—which was sort of a commonsense-based, light-touch approach to the relative incrementality of our marketing programs and how we thought they performed—was indeed correct.

So we literally learned nothing new. And the team, understandably, was really bummed. They were expecting we were going to find—aha, there's this huge pile of money sitting over here that we didn't know about, and now we can go get it. Or—aha, there's this massive wastage going on over here. We didn't find either of those things. And so I guess in a way, I personally felt like, oh good, we thought we had a reasonably well-run ship, and we did.

So, you know, in a way, you could say, well, that doesn't sound like such a bad story. But the resources that we put against this—and I'm not talking the money, per se, but the people, and the time that we had those people spend on it—you know, what else could we have done with that that would have actually maybe found a pile of money over here or helped us remove a bunch of costs over there? It probably was not worth it. In hindsight, that was a mistake.

You know, the opportunity cost there exceeded the return. And, again, going back to what we talked about, that was a whizbangy-type thing. I mean, to this day, people still say, oh, we really need this. And don't get me wrong, I'm not at all saying don't have a clear understanding of the incrementality of your performance marketing—where you're spending, where it's working, et cetera. It's so critical to manage that appropriately. Not saying that.

But at least in this case, we built the sort of Wizard of Oz machine only to discover that our man behind the curtain, which was our old methodology, actually worked just fine, and we didn't have to invest time, money, and scarce resources on that. So anyway, it may be a longer story than you wanted to hear, but I think it illustrates the question.
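(For readers who want the mechanics: a fractional attribution model splits credit for each sale across the marketing touchpoints that preceded it, rather than giving it all to, say, the last click. Below is a minimal sketch of the idea in Python; the channel names, revenue figures, and simple equal-credit rule are hypothetical illustrations, far simpler than the model described here.)

```python
# Minimal sketch of fractional (here: linear, equal-credit) attribution.
# All channel names and revenue figures are hypothetical illustrations.
from collections import defaultdict

def linear_attribution(conversions):
    """Split each conversion's revenue equally across its touchpoints."""
    credit = defaultdict(float)
    for touchpoints, revenue in conversions:
        share = revenue / len(touchpoints)
        for channel in touchpoints:
            credit[channel] += share
    return dict(credit)

# Each conversion: (touchpoints seen before purchase, revenue).
# Matching in-store purchases back to these touchpoints is the hard,
# "match rate" part described above; here we assume it's already done.
conversions = [
    (["paid_search", "email"], 120.0),
    (["display", "paid_search", "email"], 90.0),
    (["email"], 60.0),
]

print(linear_attribution(conversions))
# {'paid_search': 90.0, 'email': 150.0, 'display': 30.0}
```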

BREA: Yeah.

SCHMULTS: On the other side—hopefully, I can do this in less time—same company, our website kept throwing 500 errors. And for anyone who knows what a 500 error is, it's literally your "other" bucket when it comes to website errors. It's sort of your undefined error.

And so these are like your worst nightmare, because your IT team is, like, well, I don't know what to do with that. Like, we can't replicate it. We don't know what it is. And the IT team in question here, by the way, was incredibly hardworking. They were so diligent. These guys sweated, you know, and worked nights, weekends. They couldn't have been more dedicated. And yet this 500 error thing was just one where they're, like, we don't even know if this is a real thing. We don't know how big a deal it is. It's just, sorry, we kind of can't help you. They said it maybe a little more politely than that.

I wasn't willing to let it go, because I was, like, it just doesn't feel right. So a data resource who also was incredibly dedicated—and Helen Vetrano, if you happen to be listening to this, yes, I'm talking about you—just rolled up her sleeves and dug in, and went through the muck of the server logs. You know, we're talking the down-and-dirtiest type of data there is out there—just parsing it out, following all the threads.

And sure enough, she was able to build a case that not only illustrated that, yes, these 500 errors are a material thing having a material impact, but also constructed it in such a way that IT was then able to, A, be convinced that, yep, the problem's real and therefore worth our time; and, B, say, got it, thanks to you, we can now go figure out how to fix it.
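(The mechanics of that digging are worth seeing: count how often each URL throws a 500 and tie it back to traffic, so the impact becomes undeniable. Here is a minimal sketch in Python, assuming combined-format access logs; the file name, regex, and ranking are hypothetical.)

```python
# Minimal sketch of quantifying 500 errors from raw access logs.
# Log path and format are hypothetical; real logs vary by server.
import re
from collections import Counter

# Typical combined-log-format fragment: "GET /checkout HTTP/1.1" 500
LINE_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def error_hotspots(log_path, status="500", top_n=10):
    """Count requests and 500s per URL path to show where errors cluster."""
    totals, errors = Counter(), Counter()
    with open(log_path) as f:
        for line in f:
            m = LINE_RE.search(line)
            if not m:
                continue
            totals[m["path"]] += 1
            if m["status"] == status:
                errors[m["path"]] += 1
    # Rank paths by error rate so IT can see it's material and where to look.
    rates = {p: errors[p] / totals[p] for p in errors}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

for path, rate in error_hotspots("access.log"):
    print(f"{path}: {rate:.1%} of requests returned 500")
```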

Now that's not, you know, I'm never going to get asked to come on stage (nor probably is Helen) to tell the story of solving the 500 errors, right? It's just, it's so not sexy.

BREA: Except here, where we just salivate over stories like that.

SCHMULTS: Yeah, yeah. Well, you know, you're indulging me by even having me on in the first place to tell a story like this. But no great sales pitch is going to be constructed by vendor x or solution provider y around this story, et cetera, et cetera. You know, I don't even think I mentioned it to our board—not that I was hiding it, but it just wasn't something that you would kind of claim credit for.

But arguably, this solve—this data exploration to figure it out, diagnose it, and help IT get it fixed—did more, certainly to improve our conversion rate. It took a bunch of calls and costs out of customer service. And in terms of loyalty and long-term value creation, I would bet that it did more than any new whiz-bang feature or function that we ever launched on that particular website. And we had a lot of wins there, by the way.

But it was just this really basic, boring, unsexy thing that's never going to get written up in Internet Retailer or be a blog post on Medium. That was a really big win. And so I think it's a good example—how much more fundamental can it get than a website that doesn't throw errors at customers? And yet it's so boring, and people typically don't want to spend time on it. And, you know, it was worth it.

And interestingly—sorry, just to tie the two stories together—Helen was also one of the resources on the fully instrumented, omnichannel, fractional attribution tool, right? So here's an example of someone who could work at either the hype end of the spectrum or the incredibly boring, unsexy, but really valuable end of the spectrum. And so, again, it kind of illustrates the point I was making about opportunity costs.

BREA: There's something in that about not mistaking the tool for the work, I think, right?

SCHMULTS: Yeah.

BREA: It's really resonating for me.

SCHMULTS: Hey, I know you're kind of interviewing me, but elaborate on that, because that's a really big point that I could go off on a riff on.

BREA: Yeah, I have my favorite story, too. You might recognize this one. But years ago, I had a chance to work with an iconic specialty retailer. It was a privilege to have a chance to do that. And one of the lessons that really got hammered home in that experience is the importance, when you're thinking about using one of these new capabilities, of really thinking it through end to end.

You know, obviously, lots of pieces to that story, but one of them that sticks with me is that part of that end-to-end experience was a really sophisticated configurator—rather, a suggestion tool, a recommendation tool—so that when you came to the website, you would see, for example, what other people were shopping for, or, if you looked for something, what other gifts might be suggested alongside it if it was a birthday or holiday or something. It was supposed to be really good at helping you navigate a very long tail of options.

And it was great. But when we actually went to diagnose the end-to-end customer experience, we noticed that the results it was suggesting were pretty nonsensical. And what we discovered was that when you implement this, there is a configuration file where you actually, through a variety of numbers, set the proposed sort order—you know, the order in which the products will be recommended: most popular, most relevant, whatever. It turns out that the default option in that tool was actually the date on which the product was added to the database.

SCHMULTS: Yes, newest first.

BREA: Right. So basically, it was returning completely nonsensical results—even stuff that was out of stock. And the lesson that I learned—you know, really relearned—from that is you're only as sophisticated as the weakest link in that chain, right? So if you go to implement one of these tools, but you forget to tell the IT folks, after reading the instruction manual, that it would be nice to go with most popular or most relevant or something else, they don't know any better—God bless them—they're just putting it up; they're technically installing it on the website. But the configuration choices are actually a business decision that needs to get made, not a technical decision.

And obviously, when we fixed that, we got much better results in terms of getting people navigating to the right products, and in conversion rates and so forth. So, for me, that was seminal—you know, in midlife, to rediscover something as obvious as that was kind of a breakthrough moment for me.
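(The culprit in stories like this is usually a handful of lines in a vendor settings file. Here is a hypothetical sketch of the kind of business choices buried there; the key names are invented, and every vendor's file differs.)

```python
# Hypothetical recommendation-engine configuration. Key names are invented;
# the point is that sort_order is a business decision, not an install step.
RECS_CONFIG = {
    # The vendor default in the story was effectively "date_added_desc",
    # which surfaces whatever was most recently loaded into the database.
    "sort_order": "most_popular",   # business choice: popularity, not recency
    "filter_out_of_stock": True,    # never recommend items customers can't buy
    "max_results": 8,
}
```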

SCHMULTS: There are so many things that that illustrates. But, you know, anyway.

BREA: Yeah.

SCHMULTS: Great, great classic story. And I was kind of smiling and nodding. Because yeah, I've lived exactly what you just described—more than once, frankly.

BREA: If you distill this out a little bit, then, think about some of the factors. You know, I could ask you, like, what did you learn from all this. But more specifically, as you're advising companies in your current role and the portfolio companies that you're working with, can you distill the set of factors that you consider in helping people make that trade-off between being aggressive or conservative? Like, for example, you and I have always talked about the importance of momentum—of kind of varying your level of ambition by how much of it you have, and where you are on the journey, and so forth. Can you say more about that?

SCHMULTS: Yeah, well, let me first start with the first part of your question. And for the second part, I actually may flip it back to you because I've heard you talk about a framework I really like about momentum. So I may, you know, flip the interviewer role back.

But, you know, like so many other things, the space of analytics and data and data-driven decisioning, et cetera, it's not like it's some completely alien form of business and management. And so, often, the rules and the lessons that we learned in the, whatever, 100 years before data and analytics was even a thing still apply, right?

So, you know, two really big interrelated ones are focus and prioritization. And I think, as I said a couple of minutes ago, everything has an opportunity cost. So, one, around focus: If you're spread like peanut butter in this space, just like anywhere else, you're going to be working on a lot of things, but you're not going to get a lot of those things done, right? So focus helps a lot. Pick your biggest, most important things.

And that's where prioritization comes in. So how do you pick the right things? And again, this is true whether it's technical or improving things in your warehouse or whatever. But it's very much the same with analytics, which is, first, as I already tipped my hand, I would tend to bias toward getting the fundamentals right first. If you haven't gotten those things down, running on to the new new thing, trying to do the advanced math, et cetera, not only is that likely to yield less, because maybe you're not quite ready to do it, but, generally, the basics and the fundamentals just have a higher ROI, right? Like that 500 error story I told, right? We weren't installing a neural network; we were just making a website work better, using data analytics to get us there. So pretty basic.

One of the things that helps a lot with the fundamentals and the basics, choose which word you will, is that your error bars tend to be tighter—both around how hard it's going to be (because human beings by nature tend to be optimistic, which is why we often start these projects that end up turning into sort of death marches) and around the return. You're less wrong about how hard it's going to be to get anything done. And similarly, your error bars around the return it's going to deliver tend to be less wildly optimistic.

And I think with the new new thing, the really advanced math, et cetera, human beings, again, by nature—lots of studies have shown this—tend to be very optimistic about how hard it's going to be. So, in other words, it ends up taking longer and being harder than was originally conceived at the beginning of the effort. And then they also tend to be really optimistic about how great the other side's going to look. So, again, the returns tend to be less. So starting with the basics and fundamentals, right away, (a) you're getting easier, closer-in things; and (b) you're also less likely to go completely awry on your calculations around them.

Related to that—and you talked about momentum and velocity—I also tend to recommend a strong bias toward doing things of shorter duration versus things of longer duration, right? And this isn't some crazy idea Rob came up with. I mean, it is one of the core premises behind the Agile development methodology, right? If you do shorter chunks of things, your success rate is likely to be higher, because there's just less room and less time and less scope for things to go wildly wrong and for surprises to come up. So bias toward things that are shorter.

The other thing that happens with projects that take less time, efforts that take less time, is you start getting whatever the return is much faster, right? So I'd rather start getting the yield off something, even if it's quote, unquote, "smaller," next month than wait until sometime next year for a promise that may or may not turn out to be bigger, right? So hopefully, that makes sense to people.

And then, lastly—and, as I said, I'm going to flip the interview over to have you chime in on this—it helps to have a filter to get you through all this, right? Both figuring out what the basics are, what the faster returns are, and what the relative opportunity costs are. That sounds like, oh, it must be some really complicated model. What I use—and for all I know, it's something you guys at Bain invented—is a really simple two-by-two.

On one axis, you have level of effort, and on the other axis, you have value, with the top-right box being "high value, low effort," right? So hopefully, that makes sense to people; they can visualize it. Sorry for not having a whiteboard.

And the terms "effort" and "value" are chosen very deliberately to be mushy, because that allows you, around effort, for example, to include the notion of time. And also to include, frankly, the notion of how cross-functional this is—how much help am I going to need from other parts of the organization, which, even in the best-functioning organization, creates friction and issues, et cetera, right? So effort is meant to be a little bit stretchy.

And the same with value. As soon as you turn it into "what's the ROI," people start going into the spreadsheets and trying to do, frankly, falsely precise calculations on the inputs and outputs, et cetera. I like value because it's just an approximation of how good the outcome is going to be. And then what allows this rather mushy framework to work is that you're putting everything in there. So even though you're a little bit imprecise using the terms effort and value, you're comparing all your different opportunities on that relative basis. And so I've just found it over the years to be a lightweight, flexible and, frankly, surprisingly effective way to wade through all the different things, and to bias toward the things that are most likely to give you the highest return on effort, as well as, frankly, return on investment.
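(To make the two-by-two concrete, here is a minimal sketch that buckets candidate initiatives by rough effort and value scores; the initiatives and scores are invented for illustration.)

```python
# Minimal sketch of the effort/value two-by-two. Initiatives and their
# rough scores (1 = low, 2 = high) are hypothetical examples.
initiatives = [
    ("Fix 500 errors on checkout", {"effort": 1, "value": 2}),
    ("Omnichannel attribution model", {"effort": 2, "value": 1}),
    ("Shorten email lookback window", {"effort": 1, "value": 1}),
    ("Replatform the data warehouse", {"effort": 2, "value": 2}),
]

def quadrant(score):
    v = "high value" if score["value"] == 2 else "low value"
    e = "low effort" if score["effort"] == 1 else "high effort"
    return f"{v}, {e}"

# "High value, low effort" is the top-right box: do these first.
for name, score in sorted(initiatives,
                          key=lambda i: (-i[1]["value"], i[1]["effort"])):
    print(f"{quadrant(score):>23}: {name}")
```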

Anyway, that was kind of long-winded. Hopefully, there was something useful there. But as I've threatened, there's a framework I know you use, which helps with this notion of time and momentum. And so rather than me do a bad job describing it, why don't you do a good job describing it? Because I personally found it to be really impactful.

BREA: Well, that's kind. Now I've got to think about how to express it. But basically, I call it 3-2-1, and let me explain what that is. You know, the metaphor that informs it for me is that if you're driving a car with a manual transmission, it's kind of knowing what gear you're in, and therefore, what your level of ambition for your portfolio of analytic efforts would be. The numbers 3-2-1—and these numbers are sort of notional; your numbers may vary—mean that in any given quarter, across your portfolio of analytic efforts, your objective is three insights, two of those insights promoted to run as tests, and one of those tests promoted to drive to scale. If I've got a portfolio of analytic efforts, and I sort of think like a venture capitalist, you know, my objective is to see the companies that I invest in move through those gates, right? And I understand, of course, that there will be sort of a winnowing in that process.

But two things should happen. One is that you should see that portfolio actually yield stuff as you go. And ideally, over time, your ambition expands, and you see more in it, right? So, the way this works is, in any given quarter, you have a governance process over this that really asks two questions. No. 1, are we focused on the most important problems, and what kinds of insight, test, and scale-to-production efforts do we have that are aimed at those things?

And then No. 2 is, what's the health of our portfolio? Let's stipulate that we're focused on the right things. Let's actually ask: Are we, from a capability perspective, actually helping to move the numbers through that way, right?

SCHMULTS: Yeah.

BREA: The advantage that has is that it keeps you from managing analytics like a task list that's passively received—where you just say, OK, I got asked to do a propensity study or a demand forecast or something else, and you just sort of knock it out. It also helps by, as you noted, time-bounding it: You're reviewing this, you know, say, monthly, and then adjusting it quarterly. Your schedule may vary.

But the idea is that you hedge the risk that any one of these efforts will try to get too cute by half with itself, right? You're constantly focused on, OK, have I learned something, am I trying it out, and am I actually able to put something to work end to end? And that keeps you very practically grounded. You know, back to the example I mentioned about that retailer. It could have been there was some really great algorithm behind it that said that people who see this product should be shown this other thing, because that will maximize their conversion.

But if you've got the configuration file wrong—you know, you didn't actually test it and try to get it to scale—then you really got no value. And so it sort of keeps the whole analytic effort grounded in this way. And it helps to keep that balance between the high math and the high art, you know, and the actual grungy work of putting it to work.
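(Here is a minimal sketch of what tracking a 3-2-1 portfolio against its quarterly targets might look like; the effort names and stages are invented for illustration.)

```python
# Minimal sketch of a 3-2-1 portfolio tracker: per quarter, aim for
# roughly 3 insights, 2 promoted to tests, 1 scaled to production.
# Effort names are hypothetical; your numbers and gates may vary.
from collections import Counter

TARGETS = {"insight": 3, "test": 2, "scaled": 1}

portfolio = [
    ("Churn-driver analysis", "insight"),
    ("Gift-recommendation relevance", "insight"),
    ("Free-shipping threshold study", "test"),
    ("Email send-time model", "test"),
    ("Site-error triage dashboard", "scaled"),
]

def portfolio_health(efforts):
    """Compare this quarter's pipeline against the 3-2-1 targets."""
    counts = Counter(stage for _, stage in efforts)
    return {stage: (counts[stage], target) for stage, target in TARGETS.items()}

for stage, (actual, target) in portfolio_health(portfolio).items():
    print(f"{stage:>8}: {actual} of {target} target")
```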

SCHMULTS: Well, the other thing that I think is so powerful about your 3-2-1 and time-bounding it is that any company is probably going to be bad at it at first. And I'm sure you've seen this when you've been working with companies, helping them. They're probably not great at it on day one. And so by time-bounding it, what you have is the ability to learn, and then bake those lessons in.

Whereas if you just say, like you said, you know, we're doing the propensity study, and you just start—well, it's not really working; well, we can keep working at it—you have these projects that sort of become an end in and of themselves versus delivering the value the propensity study was expected to deliver. And so by having these shorter kinds of sprints—you know, 90 days or less—I would expect you're seeing people that you're working with on this find that the first one didn't go so well. They got a lot of things wrong. But then they baked those lessons in, and the next one goes a little better.

And then, eventually, you've got this muscle going where you have this methodology and you're cranking things out. And, again, it's not that every one works out. Your 3-2-1 model allows for some things to be, eh, we didn't find anything—we were optimistic at the start.

BREA: Right.

SCHMULTS: Didn't work out, whatever. But you're probably getting more and more hits, as you said. The health of your portfolio just keeps getting better as you get better at each of those stages. Anyway.

BREA: Let me switch to a couple of different topics. One is this question of, as you try to assess what you should aspire to, right, what your ambition should be, how much of that is sort of internally driven versus externally driven by benchmarks? How have you made that assessment with the teams that you've managed?

SCHMULTS: Yeah, you know what's interesting, I was asked this question in a completely unrelated space just recently, which is good, because that means I now feel like I've got a somewhat thoughtful answer for you. Look, I'm a person who's benefited my entire career from asking other people for help, getting advice from others, learning from other people's lessons. So I don't want for a moment to suggest people shouldn't be doing that. Particularly in a space like this—whether it's anything related to digital or analytics—nobody knows everything, right? So being able to go and talk to people and learn from others' experiences is just so helpful.

That said, you don't start there. You've gotta start with your problems, your opportunities, your pinch points, et cetera. And identify those first so then you know what it is you're looking to get the answers around. Because if you do it the other way, if you say, I'm going to go to this conference and just hear all these speakers and I'm going to come back with a bunch of really good ideas, what you're doing is adopting someone else's problem, someone else's pinch point, someone else's constraints, and transposing them onto your business, and saying great, I'm going to go solve this now.

You may share a problem or an opportunity with whoever you spoke with, but going back to what we talked about earlier on prioritization, it may not belong at the top of your prioritization matrix, right? And so you need to be, if you start with correctly identifying your own opportunities, your own things you need to solve for, then go outside and look for help and inspiration and lessons learned, that's the best of both worlds.

BREA: Right. Got George Harrison going through my head, you know, "if you don't know where you're going, any road will get you there."

SCHMULTS: Yeah, yeah.

BREA: So. And that may even be a Dylan cover, and if it is, I'm embarrassed, because I'm not sure. But anyway, there's a lot of talk about how you just can't do any of this without clean data, but the data is never perfect. How do you—how did you—manage that? Are there specific stories about trade-offs you made: you know, when you waited, or when it was good enough and you got on with it?

SCHMULTS: Yeah. I mean, I guess the answer is get comfortable with imperfection, right? So, I've been lucky to work with really strong analytical talent. I am not an analyst or a math major or anything like that by training. And that's probably what made it easier for me to get comfortable with imperfection, right? Because if you are literally trained to always show your work and show that it's perfect as you're getting your educational credentials, et cetera, it can be jarring, if that's been your historic experience, to then be told, like, eh, I just need a sense of direction on what's correct, right?

But often, what we find is that directionally correct is all the business requires, right? And partly because of what you were talking about earlier, this notion of "rinse and repeat" type models also reinforces that notion of directionally correct. So I don't need to know the exact coordinates and mapping for some long journey; I just need to know it's kind of in that direction. And then I start. And then, again, I'm rinsing, repeating, rinsing, repeating. I'm going to get there.

Whereas if I stay at the starting line, waiting for the perfect map to be formed, I'll never get the voyage underway, because there is no perfect map, right? So I think that's a big part of it: You need to be comfortable with imperfection.

And then, going back to what I just said, a lot of companies seem to be waiting for the sort of data cavalry to arrive—whether it's some unicorn-type hire they're going to bring in, or this new system that we just talked to vendor x about and we're thinking about buying, and we'll probably be able to deploy sometime next year, and then everything's going to be great. The data cavalry never arrives, right? That unicorn hire either doesn't exist or isn't quite as all-knowing, all-powerful as you thought.

And the magical machine spits out the answer 42, and you're, like, well, what do we do with that? So I think it is important to start on the journey. Do the best with the data you have, and you'll probably be able to—again, with a heavy dollop of common sense and just knowing your business—you'll be able to get pointed in the right direction. And the important thing is you'll get going.

BREA: Very wise, hard-earned advice. So we can't leave 2020 without talking about 2020, right? In particular, what this means for your posture toward blackbox tech and also the assumptions that you make about what's feasible. Now in our prep call, you had a bunch of things to say about this. Wondering if you could elaborate on that.

SCHMULTS: Whew. Well, I think one is—and this is a point I know you've made all along—that, you know, look, there's nothing wrong with blackbox tech, per se, right? I mean, at the end of the day, you don't really want people climbing inside the sausage maker and messing with the gears, right, because you want it to work a certain way. Most people probably listening to this, myself included, don't have the ability to do that anyway, right? We shouldn't be going in and trying to, I don't know, update the core code in TensorFlow or, you know, ask Adobe for the source code so we can update their analytical platform, or you name it, right?

But I think humans do need to be very much involved at the front end and in terms of data in and then the back end in terms of insights and answers out. That's always true. I think what Covid, if nothing else, exposed is just how true it is, right? So if you take something like online marketing, there are a lot of great models out there that companies rely on successfully to drive their bid strategies, their business, predict who's likely to respond, et cetera.

Like many models, these models have a lookback window. And they're sort of saying, hey, based on everything we've seen in the past period of x weeks and months, we think this is what's going to happen today. And that's, you know, I'm oversimplifying, but that's kind of how the model works.

Obviously, Covid comes along, and your lookback windows, if they're looking back into January and February, and now it's March, April, they're just wrong. And so I think the companies that fared best were the ones who recognized, hey, we've had a fundamental break here. It's not that you need to throw out the models entirely; it could be as simple as going in and shortening the lookback windows. Sort of akin to the story you were telling about the retailer: Go in, and make sure this thing is doing what we actually want it to do, right? So that's kind of one example on the input side.
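(In code terms, shortening a lookback window is a small change. Here is a minimal sketch, assuming a model retrained on a trailing window of daily observations; the dates, fields, and window lengths are hypothetical.)

```python
# Minimal sketch of shortening a model's lookback window after a structural
# break. Dates, window lengths, and field names are hypothetical.
from datetime import date, timedelta

def training_window(history, as_of, lookback_days):
    """Keep only observations inside the trailing lookback window."""
    cutoff = as_of - timedelta(days=lookback_days)
    return [obs for obs in history if obs["day"] >= cutoff]

# history: daily observations, e.g. {"day": date(...), "conversions": ...}
history = [{"day": date(2020, 1, 1) + timedelta(days=i), "conversions": 100 + i}
           for i in range(120)]

as_of = date(2020, 4, 30)
# Pre-Covid habit: a 90-day lookback reaches back into January and February,
# which no longer resemble April. Shortening to 21 days drops the stale data.
stale = training_window(history, as_of, lookback_days=90)
fresh = training_window(history, as_of, lookback_days=21)
print(len(stale), len(fresh))  # fresh window excludes the pre-break regime
```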

The other is on the output side. You know, I don't know about you, but I've rarely, maybe even never, had an output where I thought, oh my gosh, that makes zero sense to me, but it must be right. Now that's not to say I haven't gotten a lot of valuable outputs—I'm not making that claim, and I'm not saying I always knew better than the algorithm. But with the valuable ones, it was, like, yep, that seems right to me.

And so if you're getting outputs—and this is more likely to happen in a time of disruption than otherwise—you're like, what, that doesn't make sense to me, something's not right here. You know what? Something probably isn't right here. And so, again, I think I've used this word a couple of times over the course of our conversation, you know, common sense is probably one of the most powerful analytical tools that's ever been created. And, you know, don't just sit back and think that the machine is all-knowing and will just spit out the answers. You know, make sure that you've got people tending it on the front end, and also tending it on the output side.

And that's even more important when we have the kind of disruption we're seeing, with lockdowns and economic dislocation, you know, people's feelings about the future wildly gyrating. And, you know, the geographic shifts that are occurring, point A versus point B, et cetera, et cetera. There's just a lot going on, and humans are even more important now, frankly—they always are, but they're even more important now.

BREA: Right. Pros, I think, call that human in the loop.

SCHMULTS: Yeah.

BREA: Exactly.

SCHMULTS: That's a great way to think about it, for sure.

BREA: Yeah. Well, there's so much here. But let's leave it at that. Rob, thank you, again, so much. There's so much that you've learned that I think folks can benefit from. And I hope we can get together again soon at a diner and push the conversation further. So thanks again.

SCHMULTS: Yeah. Thanks for having me—a good excuse to substitute this in, even if we don't get to watch each other eat while we try and talk.

BREA: Exactly.

SCHMULTS: Hope everyone listening is well. And thanks, Cesar. It was really fun catching up with you.

BREA: All right. Thank you.
