October 07, 2025
Article

Analytics: Even the most rigorous analysis fails without the right story

SUMMARY:

In this episode, I talked to Jiaxi Zhu, head of analytics for the SMB division at Google. He transparently shared lessons he’s learned throughout his career, lessons that help him inform decisions on his division’s several hundred-million-dollar budget. Consider these steps and apply this framework in your own career and campaigns.

Listen now to get lessons about data, influence, and storytelling with numbers.

by Daniel Burstein, Senior Director, Content & Marketing, MarketingSherpa and MECLABS Institute

Analytics: Even the most rigorous analysis fails without the right story

Action Box: Turn your knowledge assets into a new revenue stream

MarketingSherpa has teamed up with parent company MeclabsAI to produce a research study. We are granting 10 AI engineering vouchers worth $7,500 each to eligible companies. Apply for your $7,500 AI Engineering Voucher.

I am not a numbers guy. And I’ve been in my fair share of meetings with decks filled with endless charts and data, my eyes glazing over. I’m sure you have as well.

So were we the problem? Were we just not captivated enough by the correlation coefficient?

I think not. And I think the reason why can be best summed up by this lesson I read in a How I Made It In Marketing podcast guest application – Even the most rigorous analysis fails without the right story.

Numbers, even very impressive numbers, still need that marketer’s touch.

To hear the story behind that lesson, along with many more lesson-filled stories, I sat down with a numbers guy who also has some deep marketing and business insights to share with us – Jiaxi Zhu, head of analytics, SMB division, Google.

Google is part of Alphabet. In 2024, Alphabet reported $350 billion in revenue. Google’s SMB division works with five million customers. Zhu leads the analysis for how to manage and prioritize the division’s several-hundred-million-dollar budget.

Listen on Apple Podcasts | Listen on Spotify | Listen on Amazon Music

Lessons from the things he made

Even the most rigorous analysis fails without the right story

In Zhu’s role leading analytics for the Small & Medium Business division at Google, his team built what they thought was an exceptionally sophisticated scenario simulation model. The model was used to forecast budget allocation needs across 100+ countries, accounting for seasonality, business prioritization rules, and even regional macroeconomic patterns.
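To make the lesson concrete, here is a minimal sketch of what a scenario model of this kind might look like. Everything in it is hypothetical – the factors, weights, and figures are invented for illustration, not drawn from Google's actual model.

```python
# Hypothetical sketch of one scenario-simulation step for budget allocation.
# All factors, weights, and figures are invented for illustration.

def allocate_budget(total_budget, regions):
    """Split a budget across regions by a priority-weighted demand score."""
    scores = {
        name: r["baseline_demand"] * r["seasonality"] * r["priority"] * r["macro_index"]
        for name, r in regions.items()
    }
    total_score = sum(scores.values())
    return {name: total_budget * s / total_score for name, s in scores.items()}

regions = {
    "LATAM": {"baseline_demand": 120, "seasonality": 1.10, "priority": 1.3, "macro_index": 0.90},
    "EMEA":  {"baseline_demand": 200, "seasonality": 0.95, "priority": 1.0, "macro_index": 1.00},
    "APAC":  {"baseline_demand": 160, "seasonality": 1.05, "priority": 1.2, "macro_index": 1.05},
}

for region, budget in allocate_budget(100_000_000, regions).items():
    print(f"{region}: ${budget:,.0f}")
```

Even a far more rigorous version of this math fails in the room if the presentation leads with the weights instead of the answer executives actually need.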

The team was very satisfied with the model’s technical rigor and holistic approach.

But when the team presented it to senior leaders, it failed to resonate with the audience. Senior leaders did not seem to appreciate the complexity and rigor of the model, and instead, they were pushing for decision clarity. For example, the first question in the room was: “So what does this mean for the LATAM budget next quarter?” Instead of enabling decisions, the team’s presentation seemed to confuse and overwhelm the decision makers.

This experience taught Zhu that analytics must be designed for decision-making, not just for analysis. It was not enough to be technically correct. The team needed to embed storytelling at every layer, anticipating the context that executives deeply care about. This realization shaped his work from then on and led to a new narrative-embedded analytics framework that fundamentally changed how his team approached analytics design.

Landing solutions requires going deep and doing the hard work

During his time at McKinsey, Zhu led a project to build a marketing personalization solution for a Fortune 500 automotive brand. He worked day and night with cross-functional teams across marketing, sales and engineering at the headquarters. Together, they designed a recommendation engine to deliver the right offer at the right time, based on customer behaviors and preferences.

The concept and the model were very well received by executive decision makers and were quickly approved for deployment.

However, delivering impact meant going far beyond just building the model and getting leadership approvals. They had to go many layers deep into the business across technology and operations.

For example, Zhu spent many weeks and late nights working side by side with CRM system administrators and developers to embed it into the client’s CRM system, so that sales reps at dealerships could actually use the recommendations in their sales pitches.

At the same time, this required an entirely new approach to customer engagement, and they had to ensure buy-in from the frontlines. They made visits to over 100 dealerships across regions, talked to thousands of customer service agents, closely observed workflows, and adapted the solution to fit their operational processes and incentives.

What Zhu learned was that landing something goes far beyond an executive presentation. It is not enough to design a beautiful solution on a slide. You have to go deep into operational realities, connect with frontline teams, and sweat the unglamorous details to make sure that ideas actually translate into real impact.

Define your data with your business partners

In another Google project, his team was tasked with analyzing productivity by measuring the number of customer meetings. Initially, it seemed straightforward. But they quickly found inconsistencies in the data. Different regional teams had vastly different definitions of what counted as a “meeting,” ranging from quick email exchanges to full-day conferences.

In some instances, teams reported more than double the number of meetings for the same region, depending on the definition they used. This discrepancy resulted in wildly different resourcing recommendations.

When they presented initial numbers to executives, the meeting counts did not make sense, and that diminished trust in their recommendations. Questions such as “What are they even calling a meeting here?” derailed the entire discussion.

They had to go back and align cross-functional teams on a shared, business-relevant definition. They anchored it to what leadership valued, which was building substantive customer relationships. Based on this principle, they aligned that only scheduled phone or in-person meetings counted. They also built a unified data source categorizing meetings by channel and engagement depth.
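In code, a shared definition becomes something a team can enforce rather than re-debate. A minimal sketch of that idea, with hypothetical field names and records (the article does not describe the team's actual data source):

```python
# Illustrative only: enforcing one shared definition of a "meeting"
# (scheduled phone or in-person), using hypothetical field names.

QUALIFYING_CHANNELS = {"phone", "in_person"}

def count_qualifying_meetings(records):
    """Count only scheduled phone or in-person meetings."""
    return sum(1 for r in records
               if r["channel"] in QUALIFYING_CHANNELS and r["scheduled"])

records = [
    {"channel": "email",     "scheduled": False},  # quick exchange: excluded
    {"channel": "phone",     "scheduled": True},   # counts
    {"channel": "in_person", "scheduled": True},   # counts
    {"channel": "whatsapp",  "scheduled": False},  # excluded
]

print(count_qualifying_meetings(records))  # -> 2
```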

This experience taught him that data governance is the foundation to effective decision making. Without shared definitions and clarity, even the best analysis loses credibility and fails to drive action.

Lessons from the people he made things with

Real influence means accepting the world is not fair or rational by default

via Cade Massey, practice professor and faculty director, Wharton People Lab

In one of Zhu’s first MBA classes at Wharton, Professor Massey opened the course with a slide titled “The Just World Fallacy.” This is the notion held by many talented professionals that if they just do excellent work, they will naturally be recognized and rewarded. But in his opening statement, Professor Massey made it clear that the world does not work that way. Recognition and impact require deliberate communication and influence.

This lesson stuck with him. Reflecting on his experience in the workplace, he recalled instances where genuine impact was not properly recognized. For example, he built rigorous models, performed detailed analyses, and even put in long hours. He assumed that the sheer quantity and quality of his work would speak for itself.

But that was not always the case. There were multiple scenarios where genuinely valuable work was overlooked. This was not because it was not good enough, but because it was not positioned or communicated in the right way.

This experience initiated a mindset shift toward stakeholder influence and transformed his approach to analytics leadership. He adapted his approach toward embedding stakeholder alignment from the start, deeply understanding the full context, and making sure his insights were delivered in a compelling narrative that resonated with leadership teams.

The result was not just better presentations, but also faster decisions and improved partnerships across teams. It taught him that influence is earned not just through expertise, but through empathy, clarity, and deliberate communication.

True empathy is active work

via Maria van Hekken, Executive Leadership Coach, Yes2Yes

Van Hekken challenged Zhu’s perception of what empathy in leadership meant. Before their coaching sessions, Zhu thought he was already a good listener. When he engaged with others, he had always tried to be respectful and give them space to share their points of view. But van Hekken pushed him to rethink what it meant to be an empathetic people leader.

She asked him to think about the difference between listening to respond vs. listening to build meaningful relationships. She told him, “Empathy is not passively responding to what others have to say. It should be active work. It involves being truly invested in the other person’s emotional journey.”

At first, these points sounded very abstract to him. But the lesson resonated with him in a tangible way when he first stepped up to become a people manager. In the initial stages of this new role, he focused on being as supportive as possible. For example, he frequently checked in with his team to problem-solve and review outputs. He also gave them space to work, making sure they knew that he trusted their expertise.

Despite these efforts, he started to notice friction. Over time, some team members were burning out while others felt that they were outgrowing their roles.

He realized that just listening in 1:1s was not enough. He had to take the time to put himself in their shoes and seek to understand their personal motivations, goals, and challenges. So he changed his approach. He spent time discussing their career goals, what excited them, and how their personal journeys took them to where they were. He took long walks with them along the walking trails around the office, and reminded himself to take a step back and celebrate each person’s successes together with his team.

That mindset shift is now a core part of his leadership philosophy. He started defining projects more intentionally, aligning individuals with work they found meaningful while balancing it with broader business priorities. He became better at anticipating conflicts before they blew up. The result was a team that felt seen, valued, and motivated. It taught him that empathy as a manager is not just about being nice. It is about doing the work to understand your people deeply enough to set them up for success.

Intellectual curiosity means always asking “why” behind the observations

via Bruce Xia, Partner, McKinsey & Company

During Zhu’s consulting days at McKinsey, he worked with Xia, who deeply influenced how he thought about analysis. Early in his career, Zhu took a lot of pride in delivering polished, professional decks with neat charts and tables.

During one project, he was investigating customer acquisition trends to help identify further expansion opportunities. He put together a few slides showcasing historical customer acquisition rates across regions, thinking he had created the “money slide” that would move the project forward.

But in their internal reviews, Xia spent over an hour on the slides Zhu created, grilling him with questions he had no answer for. For example: “Why was there a dip in new customer acquisitions in Q2?” and “Why did they see a spike in Southeast Asia last year?” Zhu was frustrated with himself for not asking these questions earlier. He thought he was answering them by showing the trends.

But Xia’s questions forced him to see the difference between describing what was happening and uncovering why it was happening. Xia pressed further, “If you do not know the underlying drivers, how can the client act on this?”

He encouraged Zhu to take a step back, stare at the data and charts, and brainstorm questions about the trends and observations to learn more. He guided him to dig into operational details and speak with front-line teams. Instead of stopping at, “Customer acquisition is down 10%,” he learned to explore: “Could it be a targeting issue? Offering relevance? Competitive moves? Operational bottlenecks?”

This experience stuck with him and changed how he approached analytics. As he advanced in his career, he continued to apply this critical thinking and intellectual curiosity. Leading analytics projects, he emphasized not just building dashboards that showed KPIs moving up or down but developing analyses that explained why.

For example, when he saw lower customer engagement in a region, he challenged his team not just to report it but to trace it to specific drivers like reduced customer coverage, misaligned marketing messaging, or changes in local market signals, as in the sketch below.
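A rough illustration of that habit: before writing the headline number, compare how each candidate driver moved over the same period. The metrics and values below are invented.

```python
# Invented example: compare each candidate driver's quarter-over-quarter
# change to see which one moved along with the engagement decline.

q1 = {"customer_coverage": 0.82, "message_match_rate": 0.64, "market_signal_index": 1.02}
q2 = {"customer_coverage": 0.71, "message_match_rate": 0.63, "market_signal_index": 1.01}

for driver in q1:
    change = (q2[driver] - q1[driver]) / q1[driver]
    print(f"{driver}: {change:+.1%}")

# customer_coverage moved -13.4%, far more than the other drivers,
# so coverage is the first "why" to investigate.
```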

This approach helped him drive confidence in his recommendations and decisions. It taught him that intellectual curiosity is not just about surfacing data. It is about asking the right questions that lead to meaningful change.

Discussed in this episode

Transparent Marketing: How to make your product claims credible … not incredible

Customer-First Marketing: How The Global Leadership Summit grew attendance by 16% to 400,000

B2B Marketing Leadership: The higher you get in the organization, the more details you need to know (podcast episode #115)

Marketing and Brand: Embrace healthy friction (podcast episode #48)

Get more episodes

Subscribe to the MarketingSherpa email newsletter to get more insights from your fellow marketers. Sign up for free if you’d like to get more episodes like this one.

For more insights, check out...

This podcast is not about marketing – it is about the marketer. It draws its inspiration from the Flint McGlaughlin quote, “The key to transformative marketing is a transformed marketer” from the Become a Marketer-Philosopher: Create and optimize high-converting webpages free digital marketing course.

Full Transcript

Not ready for a listen yet? Interested in searching the conversation? No problem. Below is a rough transcript of our discussion.

Jiaxi Zhu: Don't fall for the just world syndrome. Don't assume that when you have the best data, when you have the best capabilities, when you have the best qualifications, you'll automatically get to where you aspire to be. You know, there are a lot of other things beyond your control, beyond your influence, that may have an impact on the final outcome.

How do we navigate that world? We all have to think more deeply about influencing. Obviously, building data and skills is important, but influencing is equally important as well.

Intro: Welcome to How I Made It In Marketing. From MarketingSherpa, we scour pitches from hundreds of creative leaders and uncover specific examples, not just trending ideas or buzzword-laden schmaltz. Real-world examples to help you transform yourself as a marketer. Now here's your host, the Senior Director of Content and Marketing at MarketingSherpa, Daniel Burstein, to tell you about today's guest.

Daniel Burstein: I am not a numbers guy, and I've been in my fair share of meetings with decks filled with endless charts and data. And, you know, my eyes just glazed over. I'm sure you've been there sometimes as well, right? So were we the problem? Were we just not captivated enough by the correlation coefficient? I think not, and I think the reason why can best be summed up by this lesson I read in a recent podcast guest application.

Even the most rigorous analysis fails without the right story. Numbers, even very impressive numbers, still need that marketer's touch. So here to share the story behind that lesson, along with many more lesson-filled stories, is a numbers guy who also has some deep marketing and business insights to share with us. Joining me now is Jiaxi Zhu, head of analytics for the SMB division at Google.

Thanks for joining me, Jiaxi.

Jiaxi Zhu: Yeah. Thank you Daniel. Very nice to meet you.

Daniel Burstein: Before we jump into all your lessons and stories, let me let people know who I'm talking to. Let's take a quick look at your background. Jiaxi has been a technology consulting associate at PwC, a digital business analyst at McKinsey, and for the past five years he has been at Google. Google. I'm sure you've heard of it. It's part of Alphabet.

In 2024, Alphabet reported $350 billion in revenue, and Google's SMB division works with five million customers. Jiaxi leads the analysis for how to manage and prioritize the division's several-hundred-million-dollar budget. So, Jiaxi, give us a sense. What is your day like as head of analytics?

Jiaxi Zhu: I would say my day typically starts either with a briefing on, for example, the latest modeling that we have done, or maybe some conversations with my team on some of the numbers that we're seeing. In some cases, I may step into another meeting with some of the other leaders in our sales and marketing divisions to talk through the latest recommendations, plans and budgeting.

So one example is that most recently we're just going through the road mapping process for next year. So a lot of the conversations have been about, you know, what are some of the key initiatives, what are some of the data that we want to fuel a compelling business case. And in addition to that, I also step into conversations and problem-solving sessions on past performance.

So understanding: are we driving the expected impact through our investments, campaigns, customer outreaches, and so on and so forth. And as part of that, obviously, there's lots of forensics with the data. But at the same time, also qualitative work with people who are actually on the front lines, to be able to add color to that data as well.

Daniel Burstein: I like that you mix in the qualitative, because a lot of times some people think numbers can answer everything. But the numbers, to me – well, you tell me what you think – to me, the numbers answer the what. And that's really important, the what. We still need to answer the why. And that's where some of the qualitative can come in.

What do you think?

Jiaxi Zhu: Yeah, I think that's a very important observation because, I think that's also one of the lessons I've learned personally as well. You know, as we are talking about numbers, right. So, for example, whether it's an impact measurement of our previous campaign, or whether it's, you know, the expected growth projection for the business as a result of certain investments.

We can't just start at, you know, just showing a number or what percentage that is. It's very important to contextualize it in the broader, I would say, background of the business decision. So one specific example was when we saw a decline in the impact that we were getting from certain investments year over year.

Obviously we saw that X percentage decline. But the more important thing is understanding what's driving that decline. And in many cases, even the best data science models won't be able to pinpoint specifically what decisions or operational gaps might be driving it. To answer that, we actually have to go two steps deeper into the business.

One is to understand how the business fundamentally works, and by understanding that, what are some of the other data sets we should look at to help us triangulate what is going right versus wrong. So in that specific case, we looked at secondary metrics, things like, you know, secondary growth metrics by market, by customer segment, by business model.

But at the same time, we also looked at some of the other qualitative inputs, as I previously alluded to. So we actually spoke with some of the, for example, marketers or salespeople and so on who had firsthand information on how specific customers responded in certain ways to our outreaches and engagement. That provided a much more robust picture of: these are all of the drivers.

This is what's within our control. This is what might have been driven by something that's just beyond what we can move. So one example was that there was a market where we saw a decline just because there was some political election, and that changed the customer spending behavior quite significantly.

Daniel Burstein: Yeah. And that's when the light bulb goes off. So let's take a look at, some of the lessons from your career of doing that. As I like to mention, that's a great thing about our jobs as marketers. I've never been in another industry. I've never been a podiatrist or an actuary. But we get to build things as marketers often.

We're building brands and campaigns. And for you, you're building models and analyses that help inform those brands and campaigns. You said the first lesson you learned was even the most rigorous analysis fails without the right story. So how did you learn this lesson?

Jiaxi Zhu: That's a great question. So as I previously mentioned, one of the most important qualities in data analytics is not only being able to show correct, robust, and interesting data, but at the same time also being able to contextualize it within the broader business and the decision at hand. So one example was, there was one instance where we were optimizing our budgets across various markets.

And when we presented the budgets, there were more questions than answers. So we were allocating budgets across different markets and segments based on past performance, but also future initiatives that we were aware of. But when we presented the numbers, some of the questions that came up were, you know: How did you come up with this 20% increase in that specific market?

Why did my market not receive the same increase? And tell me, why are we cutting the budget by, let's say, 2 million in this segment? What is driving that? So the initial meeting, I think, was a lesson learned for me, which was: going in, I thought the numbers would be enough to tell the story, along with some high-level, I would say, narrative on, you know, these are the business decisions that fed into those numbers.

And this is what the model says. Obviously that was not enough. So one of the things I had to do was actually go back and think more critically about: okay, the numbers are the numbers, but how do I use them to tell a compelling narrative, from the initial business context and higher-level decisions to the specific recommendations that we're making for this market, and really bridge the gap: here is the business decision at hand.

For example, if we want to aggressively grow our video exposure – versus: this is what it means for you in this specific market, which means that we have to more aggressively reach out to this type of customer, and there's a higher concentration of those customers in your market. That really helped to bridge the gap. And then in addition to that, I also built a very dynamic model which essentially showcased all of the assumptions going into the projections and the allocation algorithm.

We could tweak those assumptions live, and the model would actually spit out something else. So you could actually see, very directly, if we, you know, maybe tuned up the growth projection by 1%, this is what it's going to do to your budget next year, so that people have a very clear understanding of how sensitive the numbers are from a decision standpoint.

And then they can also take a step back to gauge whether increasing the growth by 1% is a realistic or not realistic goal. So I think that's, I would say, a concrete example of how I learned the lesson of: numbers are numbers, but you also have to put them into a coherent narrative.
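A stripped-down sketch of that "tweak the assumptions live" idea, far simpler than the model Zhu describes, with all figures invented:

```python
# Invented numbers: nudge one market's growth assumption and watch
# every market's allocation move, mimicking the live-model idea.

def allocate(total, projected_growth):
    """Allocate budget proportionally to each market's projected growth."""
    total_growth = sum(projected_growth.values())
    return {m: total * g / total_growth for m, g in projected_growth.items()}

growth = {"LATAM": 0.08, "EMEA": 0.05, "APAC": 0.07}

for bump in (0.00, 0.01):  # tune LATAM's growth projection up one point
    scenario = {**growth, "LATAM": growth["LATAM"] + bump}
    budgets = allocate(50_000_000, scenario)
    print(f"LATAM growth {scenario['LATAM']:.0%}:",
          {m: f"${b:,.0f}" for m, b in budgets.items()})
```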

Daniel Burstein: Yeah, that narrative is essential. Let's talk about a key part of that narrative. How do you communicate the limitations of a model while at the same time trying to convince decision makers to take an action? Right. So, for example, and I've written about transparent marketing before, I focused on how to make your product claims credible, not incredible. Right.

That's a real challenge for marketing to an end customer. They hear so much hype, right? Why should they believe you? But the same is true when you're talking to a business decision maker, right? They have so much data and so many numbers thrown at them, and none of it is perfect. And I think the more they understand what goes into creating that data, kind of like you were mentioning, the more it will stick out from the sea of noise as something they trust, not just hype with numbers. But it's a tricky balance, because you are, at the end of the day, trying to inform a decision.

So Jiaxi, how do you communicate the limitations of a model while at the same time trying to convince business decision makers to take an action?

Jiaxi Zhu: Yeah. So I think there are several steps, I would say, involved in this process. The very basic step is aligning with the key stakeholders and decision makers that the data that we're using is the correct data. Because one of the most common issues myself and my team encounter is that different teams use different data sources, and maybe they even have different ways of defining the same data.

And that could result in some pretty significant gaps in how people will interpret the results. So, you know, as is very commonly said in the data and analysis world: garbage in, garbage out. So the first step is preventing garbage from going into the model itself. So to that end, I actually spend a lot of time internally within Google, in our division, to align on a common data governance process.

And we also stood up a committee to run that. So what that is, is we have this meeting – before it was weekly, but now that it's more stable, it's more of a monthly meeting – to look at all of the key data sources and metrics across the business, involving all of the key cross-functional partners, from finance to marketing to sales, so that we can all hold hands on, you know, this is what we mean by, for example, customer satisfaction. And we have a very rigorous way of documenting that, so that every team that uses this data for their projections and analysis goes off of the same source.

So I would say that's the first step. The second step, in terms of building confidence in the model and explaining the limitations, is also overlaying the model results with the actual outcomes. So one of the most common things that I'm doing is testing. So let's say I'm building a forecasting model to forecast the business growth in a specific segment. Obviously I will run, for example – you know, previously it was regression.

Now we use AI to do that. But in order to instill confidence in the result, I typically run it on the previous period to show: let's say this is what the model would have said if I had applied it last year to forecast this year's growth, and then I overlay that against this year's actuals and see, you know, what the gap is.

I mean, if the gap is big, then it probably means there's some limitation – something the model is not picking up. And that will essentially start, I would say, a process of, number one, diagnostics, and number two, understanding if there's anything we can do to improve the model. So in one instance, we actually found that the model inherently had some limitations.

So it had a limitation in terms of picking out the largest customers in the market, because they are very unique and in some cases outliers. The model was very much optimized for the averages – maybe the 80% of customers – but the other 20% it was not able to capture. And that is not something that we were able to fix within, you know, a week or something.

Right. But there was a decision to be made. So what we did was take a very pragmatic approach, which is: we clearly explained to our stakeholders why this was happening in the projections. We came up with several options for how we could address this issue, and the option that we ultimately went for was: let's use this forecasting model for the 80% that we can identify using some basic analysis based on their revenue and their, I would say, spend diversification across various platforms, and so on and so forth.

And then for the remaining 20%, that fed a more, I would say, simplistic model, based on actual historical data and with some qualitative inputs from the market leaders, so that we were able to take a more customized approach for that one. So I would say that's my second step, which is backtesting, trying to explain what the real limitations are, and aligning on paths

by which we could potentially address those limitations. And then, yeah, go ahead, if you had another question.
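A bare-bones illustration of the backtest Zhu describes – fit a naive model on an earlier period, "forecast" a year whose actuals are already known, and inspect the gap. The model and all figures are invented:

```python
# Invented data: backtest a naive growth model by "forecasting" a year
# whose actuals we already know, then measuring the gap.

actual_2023 = [100, 104, 108, 111]   # quarterly figures, prior year
actual_2024 = [115, 118, 124, 129]   # the year we pretend to forecast

# Naive model: carry forward 2023's average quarter-over-quarter growth.
growth_rates = [b / a - 1 for a, b in zip(actual_2023, actual_2023[1:])]
avg_growth = sum(growth_rates) / len(growth_rates)

level, forecast = actual_2023[-1], []
for _ in actual_2024:
    level *= 1 + avg_growth
    forecast.append(level)

# Large errors flag limitations the model is not picking up.
for f, a in zip(forecast, actual_2024):
    print(f"forecast {f:6.1f}  actual {a}  error {(f - a) / a:+.1%}")
```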

Daniel Burstein: No no, no. Go on, go on.

Jiaxi Zhu: Yeah, I would say the last step I wanted to mention is overlaying qualitative inputs alongside the numbers. You know, as I mentioned, the numbers are just the numbers. But at the same time, there are also other considerations that we have to think about. So one example is operational complexity, or maybe even legal implications. So maybe when the model says we should do X, if it's not even operationally feasible, we would just pull that off the table.

So one example was, there was an instance where the model basically said we have to cut our budget in a certain market because the ROI was just not that great, right? That's true from a mathematical standpoint. But we also had a lot of contractual commitments to the vendors and folks in that market that prevented us from doing so.

But obviously, we cannot feed every single contract into the model – that would just blow it up. Instead, we basically added manual checks and steps after the model spits out numbers, to double-check some of these operational considerations and lay them out in front of our stakeholders as a holistic solution.
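A toy version of that manual-check layer, with hypothetical markets and constraints: after the model recommends budget changes, anything that violates a known commitment is flagged for human review instead of being applied automatically.

```python
# Hypothetical constraint check layered on top of model output:
# flag recommendations that violate known commitments for human review.

model_recommendation = {"market_a": -2_000_000, "market_b": 500_000}
committed_change_floor = {"market_a": 0}  # contractual: no cuts allowed

def apply_operational_checks(recommendations, floors):
    """Route each recommendation to OK or HOLD based on known constraints."""
    reviewed = {}
    for market, change in recommendations.items():
        if market in floors and change < floors[market]:
            reviewed[market] = ("HOLD FOR REVIEW", change)  # escalate to stakeholders
        else:
            reviewed[market] = ("OK", change)
    return reviewed

print(apply_operational_checks(model_recommendation, committed_change_floor))
```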

Daniel Burstein: That's great. I think what you're really illustrating here is all the work that goes into building these models. You know, in marketing, maybe we just see a number on a screen in a PowerPoint, but here's everything that goes into it. And you mentioned from your time at McKinsey that was something you learned: landing solutions requires going deep and doing the hard work.

So maybe you can take us into that story at McKinsey.

Jiaxi Zhu: Yeah. So that was one of my most exciting projects at McKinsey. I was essentially on a team of three people implementing, I would say, a personalized marketing strategy for a global auto company, specifically for their after-sales service. So essentially, you know, when somebody buys a car, there are going to be lots of services required after he or she purchases the car.

There are going to be regular maintenance visits. There are going to be spare parts that he or she might buy to fix something that breaks in the car. And they actually make a lot of money off of that, because it's not a one-off transaction. You know, people will go into the dealership and buy stuff on an ongoing basis as long as he or she owns the car.

Now, this was back in 2016, so it was still relatively early on in this whole, I would say, online digital economy space. And it was a traditional, I would say, dealership business. So they were trying to build their online presence, collecting data about past transactions and customers, and actually sending out personalized campaigns to all of these people based on their behaviors.

Now, one of the challenges that came up was, you know, they had a lot of legacy systems. A lot of things were just happening offline. So they might have a sheet of paper that recorded what the transaction was, and so on and so forth. They were also in the early days of implementing their own CRM system, where there was some data about some customers, and their salespeople were just using these to send promotions to people.

But the data and the infrastructure – all of that was very spotty. So, as with a very typical McKinsey project, we initially went in and thought, okay, this was going to be a blue-sky marketing strategy, maybe a five-year-vision type of thing. Which we did. I mean, we went in and put together this brilliant vision of how everything could be automated, how everyone could receive his or her own personalized campaigns within a matter of seconds.

And we presented this to the CEO of the after-sales division, and he was fully bought in, and he basically said, okay, this is great. Let's just get this implemented. I want to see the results ASAP. Now, that was all great. But then the real challenge came when we were going to put that vision into action.

We ran into those issues that I initially mentioned: there were legacy systems where we just didn't have consistent data about customers, and personalization, a lot of that is enabled by this data. People were not trained, because, for example, the people that used to send out emails and campaign calls were not trained on those new workflows.

As for what they were supposed to say, how they were going to introduce the products and offerings to people. So to that end, what was initially envisioned as a two-month project became an eight-month-long transformation. We had to start with the very basics: building all of those connections between those data infrastructure pieces so that the data started to talk to each other; holding workshops with the rank and file in those customer support teams and divisions so that they were trained on this new way of engaging with customers; and then also opening up, for example, APIs with external payment providers so that

people would have a more seamless experience – when they received an offer, they could automatically convert as they received it. So there was a lot of, I wouldn't say unglamorous, but just really deep work that had to be done for us to realize that vision. And I don't think I went in with that expectation. So the lesson learned from this experience was, you know, a vision may look great on paper, but enabling that vision does require doing the actual work.

Putting in the actual work to do some of the basic, foundational things that may initially look trivial to folks from the outside, but are actually core to enabling that vision.

Daniel Burstein: Well, what advice would you give to us marketers and business leaders as we work with data scientists and try to get these projects off the ground? Because earlier I asked you about, okay, the limitations of your models. But frankly, on this side too, as you just illustrated, there are limitations. We have these great ideas about what we can do with models or data.

And these things are real, on-the-ground limitations. And so I think when we go to talk to a data scientist, sometimes, if we're not well versed in it, it just seems like a black box of magic. Okay, we've got all this random stuff here, I guess this data – you do your magic and then we'll get these sorts of things going.

So what kind of advice would you give to us marketers and business leaders when we're looking to launch a data science project? What type of limitations, what type of communication should we be addressing up front?

Jiaxi Zhu: What I would say is, you know, even though the vision is not everything, it's still very important. So I would definitely start with the vision, so that people know why they're doing all of this and why we should be mobilizing everybody around that single goal. But obviously that's not enough. So as part of the implementation, it's also very important to think up front about some of the things that might go wrong.

Because if you were sitting in the office and working with the C-suite, they may not know what's going on in the CRM system in the XYZ dealership store. That is where we have to invite some of the front-line experience. From a data standpoint, for example, the data analysts who work day in, day out with the CRM systems will be able to tell you: if we were to do this, that's going to break, right?

Similarly, we'll have to invite, for example, the actual people who do the customer engagement to tell us: What do you find challenging about using this data? If we presented this dashboard to you, would you be able to figure out what you needed to say to the customers as you were making those outreaches?

I think these are all very important parts – inviting the input not just from the leadership, but also from the rank and file, so they can tell you about all of the potential operational implications as you're road mapping everything out. But again, that's not enough. The third step is, once you have a good understanding of this bottoms-up,

I would say, implication of your strategy, one core role of a successful data analyst or business analytics leader is to be able to synthesize all of these limitations and operational investments that we need into themes and narratives that the leadership can understand, because in some cases we do need leadership sponsorship to be able to make some of those changes offline.

So, for example, for us to maybe invest in a new CRM system or a new data storage system or something like that, it does require more budget, and that requires leadership sign-off. And in order for us to get that sign-off, we have to translate these practical limitations and needs that come directly from the front lines into something that's more exec-friendly.

So really serving as a bridge between the vision and the strategy, as well as the operational implementation.

Daniel Burstein: I like that – the top down and the bottom up, and giving everyone understanding. I think part of that understanding, too, is the frontline workers. If you're asking them to, let's say, add three extra steps to their day that they have to do every day, they need to understand why those are essential, or they'll just kind of be griping like, gosh, this takes me away from what I really want to be doing.

But another key part is on the up front, and you mentioned this a bit earlier. You know, sometimes in business we talk about that word data very generically, like it's all the same. And as you mentioned earlier, there is that famous garbage in, garbage out problem we have with models. So you said another lesson you learned was define your data with your business partners.

So take us into another Google project where you were able to do this.

Jiaxi Zhu: Yeah. So there was one project where we were thinking about, I would say, team investments in terms of headcount, because, you know, when we run campaigns or outreaches, it's not only just the creatives or the money – it's also about the people who make this possible. And in many cases, when we invite people, for example, to marketing campaigns, we'll have to invest in those customer engagement folks.

Now, the question is, you know, what's the optimal number for us to have that person on the ground? Should we have ten people in this market or 200 people in those markets? Right. As you probably heard me say earlier, we have millions of customers. Technically, we could just hire tens of thousands of people in a year to call these customers and invite them to all of these events.

But, you know, is that economical? Probably not. So we do have to come up with a smart approach to understand the optimal, I would say, capacity that we can allocate for this type of role, and what that means for the business outcome, whether it's revenue growth impact, ROI, and so on and so forth. So as part of the analysis, one key aspect was understanding the realistic capacity of such a person to be able to have those outreaches to potential customers

and to promote some of the campaigns and offerings. There are multiple data points we could look at. One is, you know, how many meetings can they have in a month or in a quarter? Or we can look at, based on past experience and inputs from experts on the front lines, how many can they realistically manage, based on the complexity of the workflow and the customers, and so on and so forth.

So we ultimately landed on the meetings, because that was one of the more standardized, I would say, data points. But even that was a very, I would say, sloppy process, in terms of different people counting meetings differently. So even though we assumed that a meeting is a meeting – you know, if we were talking on video, it would be a meeting –

in some cases, for example, people in the finance department had a very strict definition of meetings. It had to be in person, it had to be previously scheduled, for them to count it as a meeting. And that's what they had in their system in terms of the number of meetings. Whereas some of the more front-line folks counted everything from a four-day workshop to, in some cases and some markets, a WhatsApp message as a meeting.

So in those cases, we ended up with huge discrepancies in terms of how many meetings someone could realistically handle in that role. In some cases it differed by twice as much. So, as a result, you can probably imagine that depending on what definition we used, we could end up with a very different recommendation. So when we found out about this issue, we actually had to take a step back.

And to my earlier point, going back to that data governance process, we had to actually align on: what do we actually mean by a meeting? Because, you know, now we had a strong business case for aligning on that, since it was going to inform an important budget investment decision. So we had to sit down and align on something before we actually moved on.

And how do we define it? It always had to anchor on a business objective that we're driving. What we are driving is meaningful customer touchpoints, so that we can convince them to come to us. Now, what does that look like? Obviously a WhatsApp message, for example, is probably not a meaningful engagement to convince people to come, because in some cases you can't even send the marketing materials over

WhatsApp – it's probably not even, I would say, compatible. All right. So we ended up with: it had to be a meaningful engagement, and it could be in person, it could be over the phone, it could be over video, but definitely not over email or chats. So that's where we landed, anchored on the business objective.

And we were able to come up with a relatively, I would say, high-confidence recommendation based off of that.

Daniel Burstein: So when you're planning a project, or when you're being assigned a project like this, how much time – what percent – are you allocating to some of that front-end work? We talked about getting the data straight, getting everyone aligned on the business objectives and all of this. How much actually goes into, okay, here's the nitty-gritty of building the models, testing the models and all that?

Like what kind of ratio are we talking here?

Jiaxi Zhu: Yeah, I think it heavily depends on the readiness of the data for the specific business problem that we're looking at. So for example, if the data is already relatively mature – if we were looking at financial metrics and so on and so forth, these are relatively stable and well-defined metrics – in that case, you know, I don't think there's a lot of front-loading that needs to happen.

But let's say we were to push the frontier and introduce new signals or metrics. A recent example was introducing sort of a customer experience metric – so satisfaction and so on and so forth. That is something that we hadn't really looked into previously.

For those use cases, we actually have to spend a significant amount of time aligning on and defining what the data actually is, cleaning the data, and putting that into action. It could easily take months if the data is not ready, and if the data is sitting across multiple systems and maybe departments as well. So, yeah.

So I would say, as I mentioned, it would be dependent on how complex and how new the data is. But it could be anywhere from, you know, maybe a couple of weeks to maybe a few months.

Daniel Burstein: Okay. Well, in the first half of How I Made It In Marketing, we talk about some of the things we build – like Jiaxi was mentioning, these analyses and budgets. In the second half, we talk about some of the people we built them with, because that's what we get to do as marketers. We build cool things and we build them with other people.

But before we get there, I should mention that the How I Made It In Marketing podcast is brought to you by MeclabsAI, the parent company of MarketingSherpa. Transform your intellectual property into a revenue engine. In just 21 days, you can pilot your first AI-powered product. Learn more at MeclabsAI.com.

That's MeclabsAI.com. All right, Jiaxi, let's take a look at one of the people you mentioned that you learned from in your career, that you collaborated with. You mentioned Cade Massey, who's a practice professor and faculty director at the Wharton People Lab. And you learned from Cade: real influence means accepting the world is not fair or rational by default.

So how did you learn this from Cade?

Jiaxi Zhu: Yeah. So this goes back to my MBA program at Wharton. I took this class, Influence, with him, and I would say that class was by far the most useful class I took in that program. And if I remember, the first slide that Professor Massey put up in our first session was this one slide with that one sentence, which is: just don't fall for the just world syndrome.

Don't assume that when you have the best data, when you have the best capabilities, when you have the best qualifications, you automatically get to where you aspire to be. You know, there are a lot of other things beyond your control, beyond your influence, that may have an impact on the final outcome. How do we navigate that world?

We all have to think more deeply about influencing. Obviously, building data and skills is important, but influencing is equally important as well. So I would say that really stuck with me. And that is something I reference, you know, every now and then in my day-to-day work as well, because, as we discussed earlier, my work is partially based on building accurate data analysis, but more importantly, also using that to build narratives that drive influence across the organization.

Daniel Burstein: Yeah. When I hear about this, I think of value proposition. Right. And I wonder if you can give us an example of how you've communicated a value proposition, either for yourself or even for an internal project you're trying to win approval of. Right? Because, for example, I've shared many case studies about how a value proposition worked before. One of them was how The Global Leadership Summit grew attendance by 16% to 400,000.

It was thanks to the value proposition work they did. They were creating this event – it's a great event – but if people don't know about it, they don't buy it. Creating that value proposition and communicating that value proposition, that's how, as you mentioned, you get real influence. So, you know, do you have any examples you can think of in your career? Okay, we talk about data as kind of a fixed thing, but how did you actually build a value proposition for yourself or for an internal project to overcome this unfairness we talk about in life and business?

Jiaxi Zhu: Yeah. I would say, number one, obviously don't forget about the data. I mean, the data – well, it's not everything, but it's still a good starting place. So one example: as I was building a business case for an investment internally, I started with the data, which is, you know, this is how many customers this initiative will benefit.

This is how much revenue and impact this is going to drive. I mean, these are all important things to serve as a starting place, but it would be wrong to assume that, you know, you can win approval and support just by putting the number out there and maybe comparing: this is going to generate X million dollars and that's generating Y million dollars.

And X is more than Y, so you should invest in my project, not that project. Obviously that's not the whole story, because decisions rarely happen like that. I mean, if all decisions happened like that, the world would probably run much faster, right? But in our specific case, for example, in addition to putting together the data pack for the project, I also had to contextualize it.

So what that means is putting myself into the shoes of the stakeholders I'm trying to influence. One example is, let's say there's a finance person – you know, because they control the budgets at the overall corporate level – understanding how this will fit into the overall, I would say, financial outlook for the company.

Is it ROI positive or negative? So essentially I was speaking their language, because they care about ROI. But when I'm talking to somebody like, let's say, a marketing leader – or even the sales leaders – they really care about, you know, what does it do to my specific segment that I, you know, maybe have a target against, right?

So really explaining: okay, this is what it brings to the company, which is great, but then what is in it for you, and how is it aligned with the broader goal of the company? And let's say I'm talking about this to the operators who actually have to implement it. In that case, I would be more in listening mode, which is learning from them and leveraging them as subject matter experts as well.

What would it take for us to make this a reality, so that we don't just have numbers on paper – we also have credible numbers that we can actually realize through the changes. So in those cases, I would say the lesson learned here is, you know, the numbers are a starting point.

But influencing extends to speaking the same language as the folks that you're trying to influence, understanding who will have an impact on the ultimate decision, and getting their point of view woven into the recommendation as well, aside from the numbers alone. And then also, before actually having a meeting – I think this is something that a lot of people do now, which is something I also learned from, I would say, the influence class – always have the meeting before the meeting. Which is, you know, meet with those folks offline, even with just work-in-progress models and materials, to invite their input, so that you're actually co-creating,

I would say, the recommendation with them, rather than, you know, coming to them with a fully baked solution and trying to feed it to them. Ninety percent of the time, that's going to end up worse than if you were to co-create the vision with them.

Daniel Burstein: Yeah. I mean, from your stories I can hear, too, there's a real understanding that you're working with other people, and they're complex. I'm sure you work with some great AI and machine learning teams and all that, right? But you're working with human beings. And so one lesson you mentioned is true empathy is active work. And you said you learned this from Maria van Hekken,

executive leadership coach at Yes2Yes. So how did you learn that, and how do you practice empathy?

Jiaxi Zhu: Yeah. So first of all, I'm very thankful to Maria for being my first leadership coach. We had multiple coaching sessions specifically around how to build effective relationships in the business environment. I would say that is something that, I admit, does not come second nature to me, because I'm, I would say, instinctively driven by data and facts.

But, you know, as I previously mentioned, the human element is also very important. One of the things that Maria specifically mentioned to me, which I thought was very important, is: don't underestimate those informal relationships and ties, or even weak ties, that you can build throughout the organization.

Or maybe, you know, with critical partners elsewhere. If the first time that you're talking to that person is about an actual project or something that you're going to pitch to them, it's probably already a lost game, because it would have been much better if, let's say, I had built an informal relationship with that person before

I actually thought about the project, so I could come in with the relationship already warm, and the other person would be much more receptive to what I'm going to talk about. It also, I would say, lowers the risk and the stakes of that conversation, because in many cases it would be much easier for me to position that conversation as an exploration or problem solving, rather than something that I'm presenting and seeking that person's approval of.

So to that end, you know, one of the things that I took away was that I have to actually put in the active work to build those relationships, even when there's no natural collaboration as of now. So, for example, I have lunches and coffee chats and intro meetings with people who are in adjacent teams and organizations – not to push projects or do actual problem solving on actual content, but just to get to know them, because you never know when your paths will cross and when you'll actually have to pitch a new project to them and seek their support.

So actually having that investment, that informal network of, I would say, potential supporters or even problem-solving partners, is a powerful asset when it actually comes to landing something successfully.

Daniel Burstein: Is there any specific cadence that you figured out to be proactively empathetic? Because, for example, when I interviewed Jim Kruger, the executive vice president and chief marketing officer at Informatica, one of his lessons was people want to feel part of a team, and without empathy, that is very difficult to do. And he had a very specific cadence – he told the story – of one-to-ones, one-to-fews, one-to-manys.

Written, in person, virtually – a whole cadence he figured out. So I love what you're saying. But I think in reality we get busy, right? We get really busy. And this is a thing that we can just overlook or just fluff off to the side if we're not really diligent about it. So is there any specific cadence you figured out to make this work?

Jiaxi Zhu: Yeah. I would say, you know, what works for me might not work for everybody, but just from my personal experience, what I thought worked really well for me is, number one, we help each other out informally. So, for example, if some team was looking for some data or some insight, I'm more than happy to step in and, you know, provide my perspective informally, serving as a sounding board for some of the problems that they were trying to solve.

I think that not only builds an initial relationship, but also informally builds my credibility in front of those stakeholders. That's number one. Number two is that I also make an active effort to really connect with folks who I don't normally see around me. So, as I mentioned, we are a global business, and there are a lot of important stakeholders who are based in, for example, APAC or EMEA, whom I probably only see maybe once a quarter or once every six months.

So I take advantage of things like global events or global summits where we all travel to the same place, and I make a point of having coffee chats and in-person meetings with these people, because it's going to be much easier down the road if they can put a face behind a name rather than only ever seeing me through video chats.

That's number two. Number three is being direct but at the same time empathetic, balancing the two. One example: I provide feedback on things that I think are not working well, and I do that empathetically, which means I position it as suggestions, maybe my two cents, to those folks offline. I also hold problem-solving sessions where we can chat one-on-one, maybe do a whiteboarding session, and that tends to build camaraderie between me and the other person.

And finally, if it's a big team, I am a big believer in building a strong team culture. For example, as I mentioned, we're moving offices next week, so I started an impromptu team lunch in our old office to get everybody together, celebrate our time in this office over lunch, and share our fondest memories of it.

That is something relatively easy and low lift, but it brings people together based on a shared experience.

Daniel Burstein: Well, you mentioned giving people feedback politely and in the right way. You have a great story about someone earlier in your career who gave you feedback that still sits with you today. You mentioned Bruce, a partner at McKinsey & Company, and you said from Bruce you learned that intellectual curiosity means always asking the why behind the observation.

So how did you learn that from this partner at McKinsey?

Jiaxi Zhu: Yeah. That was relatively early in my career, and that feedback really stuck with me. It's become one of the things I go back to whenever I encounter new business problems I'm trying to solve. In that instance, we were doing a market scan for a real estate client who was trying to enter a new market.

As part of that, we were trying to understand whether that new commercial real estate market was viable for the client. So there was analysis on market trends, forecasts, projections, and so on. I was the analyst on the team, running much of the research and data work on the growth projections. At the time I was thinking: okay, I'll put together a chart, maybe a line chart or a bar chart, that shows the trend, draw a line, and say this market is growing by 20% year over year, so we have a pretty good chance at capturing some of that growth. But when I presented the chart to him, he essentially said, okay, this is good, thank you for putting the chart together. But can you explain why, say two years ago, there was a kink in the chart, where the market went down by 15% and then rebounded by 30% the year after?

Then he pointed to several other cases and said, okay, this is good, but this is what the market looks like in the US. If I look at the market trend in Australia, it follows a completely different trajectory. Can you help me understand what drove that?

And does that mean we have to think about all of these markets a little differently? As you'd expect, I did not have answers to any of these questions, and in the moment that was a pretty painful experience for me personally.

But it was a very powerful learning experience, now that I talk about it. When we look at data, whether it's from a report or from a model we run, we should always take a step back and really look at it.

Are we seeing any consistent trends? Are we seeing anything that stirs our curiosity, that we want to learn more about? I made it my personal rule to look at the data through the lens of a decision maker and write down a list of questions.

What would I be curious about, or puzzled by, in this data? Any kinks in the charts, any unexpected declines or growth, any absence of data, maybe in some years we just don't have the data. Then: what could be driving that? Is it a data error?

Is it a model limitation, as we previously discussed? Is it something specific to the segment or market we're looking at? And can we get more intelligence from the people who are actually close to that market, on the front lines? All of this adds much richer information to the decision, because growth is growth.

But if the growth is coming from a segment that's not relevant to the business, then obviously it's not useful from a decision-making standpoint. So: go deeper into the data, ask the right questions, and come up with credible drivers, credible hypotheses that you can then support with evidence and facts.

I think that is something that I took away from this experience.
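(Editor's note: to make Zhu's rule concrete, here is a minimal sketch of what that first pass might look like in code. It is our illustration, not Zhu's actual tooling; the market figures, the 10% threshold, and the questions_to_ask helper are all hypothetical.)

```python
# A first-pass screen for "asking why behind the observation": scan a yearly
# series for the kinks, swings, and gaps a decision maker would question.
# The market figures and the 10% threshold are hypothetical illustrations.

market_size = {  # hypothetical market size by year, in $M; note 2022 is missing
    2018: 100, 2019: 120, 2020: 102, 2021: 133, 2023: 160,
}

def questions_to_ask(series, swing_threshold=0.10):
    """Return the 'why?' questions this series should raise before it is presented."""
    questions = []
    years = sorted(series)
    for prev, curr in zip(years, years[1:]):
        missing = [str(y) for y in range(prev + 1, curr)]
        if missing:  # absence of data is itself worth a question
            questions.append(f"Why is there no data for {', '.join(missing)}?")
            continue  # growth math across a gap would mislabel the trend
        growth = series[curr] / series[prev] - 1
        if abs(growth) >= swing_threshold:  # a kink: unexpected decline or spike
            questions.append(
                f"What drove the {growth:+.0%} change from {prev} to {curr}: "
                "a data error, a model limitation, or something real in this market?"
            )
    return questions

for q in questions_to_ask(market_size):
    print(q)
```

Run against the sample data, this surfaces exactly the kinds of questions the McKinsey partner asked: the dip-and-rebound kink and the missing year.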

Daniel Burstein: Yes, I love what you're saying, and I agree with it, but I always like asking questions from the other side to see the other perspective. So how do you balance that curiosity, asking the why, with not getting stuck in analysis paralysis? Because, for example, when I interviewed Justin Herber, the CMO and chief brand officer at Tractor Beverage Company, one of his lessons was pursue the best version of what an idea could be.

And I love that idea, and it sounds great in theory, but in reality there are limitations. For me, as a writer, I love the great Bill Condon quote: no piece of writing is ever finished, it's just due. That's the thing that reins me in. I try to pursue the best version of an idea, but at some point it's due, and that is what it is. So for you, Jiaxi, I love this idea: we want to ask why, we want to go deep. But how do you not get caught in analysis paralysis when you do that?

Jiaxi Zhu: Yeah. The most important vehicle for doing that is coming up with a strong analytical agenda. What that means is that before I send my team off to do a bunch of data analysis and data pulling, we take a step back and list out all of our questions on the page.

Based on what we're seeing and hearing, can we categorize all of those questions into big buckets? Two questions might seemingly be asking about different things, but when we take a deeper look, they might actually be asking about a very similar theme in the data.

So the first step is to write down all those questions, categorize them, and then prioritize them. I always ask myself two things. Number one, does this question matter for the ultimate decision? If the answer doesn't really change our decision making, it's probably a lower priority. Number two, do we actually have data already available to answer that question?

How much lift is there for us to get to it? If it's a quick analysis, yes, it's a higher priority. But if it's a very complex analysis that probably nobody has ever done, then data is probably not the best way to drive to an answer. Maybe we'll have to talk to somebody who actually knows what the evidence may look like.

So in that agenda I have the list of questions, the priority, and the type of analysis and evidence we need to collect to answer those questions. Then I set my team off to do that, so we don't end up trying to answer and entertain every single question that might come up.
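(Editor's note: Zhu's analytical agenda lends itself to a simple, sortable structure. The sketch below is our illustration of his two-question triage, not his team's actual tool; the AgendaItem fields, example questions, buckets, and scoring are hypothetical.)

```python
# A sketch of the analytical agenda as a sortable structure. The two-question
# triage (does it change the decision? how much lift to answer?) is Zhu's;
# the example questions, buckets, and scoring below are hypothetical.

from dataclasses import dataclass

@dataclass
class AgendaItem:
    question: str
    bucket: str             # the shared theme after grouping similar questions
    changes_decision: bool  # Q1: would the answer change what we decide?
    lift: str               # Q2: "quick" analysis, or "complex" / thin data
    evidence: str = "TBD"   # the analysis or source that would answer it

    @property
    def priority(self) -> int:
        if not self.changes_decision:
            return 3  # doesn't move the decision: lower priority regardless of lift
        # complex or data-poor questions may be better answered by talking to
        # people close to the market than by pulling more data
        return 1 if self.lift == "quick" else 2

agenda = [
    AgendaItem("What drove the 15% dip two years ago?", "volatility", True, "quick"),
    AgendaItem("Why does Australia follow a different trend?", "regional divergence", True, "complex"),
    AgendaItem("Can we restate pre-2015 figures?", "data hygiene", False, "complex"),
]

for item in sorted(agenda, key=lambda i: i.priority):
    print(f"P{item.priority} [{item.bucket}] {item.question}")
```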

Daniel Burstein: I want to stop there. You said something really brilliant, and I don't want the audience to lose it: sometimes data isn't the best way to answer the question. So there is a step in your process where you're asking, are we even right to answer this question with the information we have, or should a totally different approach be taken?

Jiaxi Zhu: Yeah. One question I ask my team or stakeholders is: are we making this decision based on evidence or based on belief? In some cases, decisions are based on past successes or past observations. Say one specific type of marketing campaign worked really well last year; maybe we should consider doing it again this year.

Right? That's a decision based on evidence. In other cases we make decisions based on our belief system. Going back to the auto manufacturer's after-sales example, that's a decision based on belief. They believe they want to become a digitized business within the next two years.

So they decided to invest in all of this capability building: their data infrastructure, the sales force, the workflow improvements. And to support that decision, given that it's based on belief, yes, we can surface data on how the broader industry is evolving and what their competitors are doing, but there's no historical data we can actually reference, because it's not relevant to that new vision.

In that case, is it worthwhile to pull historical data on how their past marketing campaigns performed? Probably not that useful. So that's a very important distinction to draw: is the decision based on evidence and the past, or on belief and what we think the future should be?

Daniel Burstein: Well, from your lessons and stories, I clearly hear a lot of fundamentals for what it means to be a marketer, a data analyst, a business analyst: the curiosity, the narrative. If you had to break it down, what are the key qualities of an effective marketer?

Jiaxi Zhu: Number one is obviously being able to tell a compelling story, both to your audience, which is your customers, and to internal stakeholders, because in large part marketing is about telling a compelling story and driving folks to action, whether that's to buy your product, consider it, or even sign off on a new project.

That's number one. Number two is how you tell the story: being able to tell a story that's visionary, where you have your own compelling set of beliefs grounded in truth, facts, and trends, but that's at the same time coherent and clear, anchored in data and evidence.

That's number two. Number three is the sense of intellectual curiosity we just discussed. There's always something unknown, no matter how deep you go in your analysis or investigation, and asking the right questions is very important, not only to inform the decision but, in more forward-looking cases, to set your marketing campaign or your investment up for success, because those questions become the learning agenda you carry as you roll things out.

That gives you an opportunity to gauge how successful the project is, and it creates room to course correct.

Daniel Burstein: Well, then I've got to ask: did I ask the right questions today? If asking the right questions is key, is there a question I missed?

Jiaxi Zhu: No, I think you asked all of the right questions. I wish we had two more hours, because I really enjoyed this conversation. I think we could go much deeper into many of those case examples.

Daniel Burstein: That would be a lot of fun. Well, thank you for the time you did give us today, Jiaxi. I learned a lot.

Jiaxi Zhu: Yeah, thank you. Thank you, Daniel. It was a pleasure speaking with you.

Daniel Burstein: And thanks to everyone for listening.

Outro: Thank you for joining us for How I Made It In Marketing with Daniel Burstein. Now that you've got inspiration for transforming yourself as a marketer, get some ideas for your next marketing campaign from MarketingSherpa's extensive library of free case studies at MarketingSherpa.com. That's MarketingSherpa.com.


