Room For Growth
Episode 56 | July 31, 2024 (51 min)

Finding the North Star for Your AI Strategy feat. CMO and eCommerce Leader Pauline Reader

Apple Podcasts
Spotify
YouTube
Episode Description

In this episode of Room for Growth, Billie sits down with accomplished CMO Pauline Reader, whose career includes CMO and SVP marketing leadership positions with renowned brands such as eBay, Minted, and Stitch Fix, and most recently Podium, a communication SaaS platform for SMBs and larger enterprises. Pauline now runs Reader Consulting, where she offers fractional CMO and consulting services, and in this episode she brings valuable insights on the role of artificial intelligence and machine learning in B2C and B2B marketing strategies.

Pauline isn’t new to AI — for years she’s been actively leveraging traditional data science methodologies to enhance customer experiences (e.g., providing personalized clothing recommendations) and improve business operations. She shares pragmatic and tangible examples of interpreting data-driven insights and combining human judgment with technology for the most effective results.

Throughout the discussion, Billie and Pauline explore topics including data collection, personalization strategies, maintaining customer privacy, decision-making in technology investments, and finding a balance between short-term goals and long-term AI strategy.


Topics Discussed
  • The criticality of suppression, proper targeting, and brand storytelling in traditional paid media
  • Using AI and data science models to better understand customer preferences in eCommerce platforms
  • Overcoming obstacles with data compliance and privacy in AI-driven marketing strategies
  • Ways of leveraging unstructured data through machine learning to extract relevant information
  • Key considerations for rolling out AI features or changes in customer-facing technology
  • Connecting and measuring customer data within brand loyalty platforms
Host
Billie Loewen
Show Description

Join WillowTree’s Billie Loewen for a deep dive into growth marketing. In each episode, Billie discusses the latest news and topics in lifecycle marketing, chatting with a wide array of guests, including WillowTree colleagues, client-partners, and industry thought leaders. Let's grow!


Read the Transcript

Billie Loewen [00:00:04] Hi, everybody. Welcome back to another episode of Room For Growth. I'm your host, Billie Loewen. If you are watching this on YouTube, that is my dog Alister in the background, and he will be with us for the duration of today's episode. We have a phenomenal guest today. Her name is Pauline Reader, and I am going to tell you a bit about her incredible accomplishments, her educational background, and all of the places she has worked in a moment. But I'll warn you that today we're going to primarily talk about her time at two businesses. One is Podium, where she has served in a leadership role, and the other, where we're going to spend the most time, is Stitch Fix. For folks who aren't aware, Stitch Fix is an eCommerce business. It relies on AI insights, primarily traditional data science methodologies rather than generative AI, to recommend items of clothing that go into a curated box; a stylist then cross-references those recommendations against customer expectations and what they prefer in terms of the clothing items they receive. What a customer gets is a box of several clothing items that are really curated for their style and their preferences. So when Pauline talks about AI and the advent of it, and really the moment that we're in, I've been reflecting on her attitude toward AI, because as a leader, she has been doing AI for years and years. And when I say AI, what I really mean is building data science models to help improve the customer experience and to help improve business operations. It's not exactly these massive language models that are dominating headlines today from just a handful of companies that really have access to the types of data that you need to build them. She comes from Stitch Fix, where they had data scientists building relatively limited but critically important models for their own customer bases. And she just has this very pragmatic view on the state of AI today, and the state of data science and data governance. She doesn't get too distracted by big, huge, bright, shiny things. She also has a real down-to-earth approach to how she talks about why and how data science in particular, but also different forms of AI, should and can be integrated into your business. The result is this very tangible, tactical set of advice about where to start and how to think about getting your hands around opportunities coming from AI today. And I really love her attitude. She's got this real "been there, done that, did this before it was the shiniest thing in the world, and before it was scaring people into thinking the entire world was going to change overnight" sort of view. It's just easy to approach, easy to say, okay, I can pull one, two, three things from her advice that we could apply inside our own business. So Pauline Reader, as mentioned, is extremely accomplished. She just started her own business, where she's offering fractional Chief Marketing Officer consulting services. But her background, her education, is from Princeton University, where she got a bachelor's in economics, and she went to Columbia Business School. And then in addition to her roles as Chief Marketing Officer at Podium and Senior Vice President of Marketing at Stitch Fix, she has also worked for brands like Minted, which is an eCommerce brand, and for eBay, where she worked in marketing roles. She was also an analyst in financial services, particularly at Morgan Stanley.
So she's just got this incredible background in data. And she brings her marketing, her data experience, and her technology experience together in a way that I loved. So we are going to jump into that interview very shortly. Before we do, and maybe this is a bit backwards, but I want to talk about a few things that aren't AI that are incredibly important trends and needs we are seeing today from many of our partners. This is just a bit of my reminder that while, even in today's interview, we're going to focus on how you can start to apply AI in your marketing operations, your organization, or as part of how you build customer experience, there are so many things that are not AI, that are AI-adjacent, or that are AI building blocks, or that are just tried and true pillars of growth that should not be forgotten, because they're going to be extremely meaningful this year, this summer, and then into the future. So the first of those is just the criticality of suppression. I think a major opportunity for brands today, rather than focusing on getting personalization perfect or getting audience segmentation perfect, is doing sort of the inverse that creates the same result: how can you focus on better suppression of audiences? This is particularly important anywhere that money is involved in what's going to happen to the segmented experience that you're building. So, for example, in paid media, what we know about media algorithms and how platforms for paid media will spend your money is that they are going to optimize to spend any budget that you give them. So your job as the marketer and the strategist is really to determine how to make your audience as small as possible, so that anybody seeing your ads or your offer finds it hyper-relevant to them. Same thing if you are offering coupons or discounts, or things like loyalty or churn prevention campaigns where potentially you're offering a special promotion. Again, I think the real crux of this is: how do you get better and better at not giving that message to the wrong person? How can you start to eliminate more folks from your segmentation pools for those targets, to ensure that you are optimizing each and every ad that somebody sees to exactly an action that they're going to take? One of the things I still see today is that not enough brands are suppressing out, say, people who already made a purchase from the marketing they do to retarget an individual item back to consumers. Even with really savvy eCommerce brands, I'm still seeing this: you'll click on an item, purchase it, and then you'll see ads for that same item across social media, or you'll see that image replayed across their email messages to you. That is a really great suppression use case, where you want to make sure that you're eliminating those folks from the messages that you're sending. Another would be if you have geographic information, and perhaps you have people in your audience base who are not in a state where you have a store, or who can't take advantage of a certain offer, or you can assume that based on the preferences they've put in. How can you start to suppress some of the messaging that you're sending, so that there's less of it or none of it, so that even for certain periods of time you aren't targeting people with a message they can't take advantage of?
An example of this might be a fast food restaurant or a pizza restaurant that wants to offer coupons and discounts based on double points or double stars, or a dollar off of a special drink or an order that day, but is still sending that same message to people in a geographic area where it's not going to make sense for somebody to drive to that store in the 24-hour period. How can you get them out of that segment so that a) you're just reducing noise in their life, but b) you're making sure that your offers are more relevant and more optimized to just who's going to respond? So thing number one: where could you be suppressing today? We love the rule of subtraction. When in doubt around complexity, remove things, don't add things. So challenge one: think about suppression and how you can amp this up (see the small sketch after this intro). Second is around brand storytelling. We are about to be inundated with my favorite kind of marketing event: it is almost the Summer Olympics. I love the Olympics from a marketing perspective, because this is where you see brands that sometimes you don't even really hear from. They're just not relevant to you. But if you stop right now and think, who advertises heavily during the Olympics and what kind of story do they generally tell? Hopefully what comes to your mind, what your brain conjures, is brands like Johnson & Johnson or P&G telling really warm and heartfelt stories about athletes coming of age, or what it means to be a parent raising a high-performance athlete, or just what it means to be a family in a community cheering somebody on. And so I love these types of brand campaigns because they really are sticky. They are not the type of thing that you generally see talked about as much from a growth perspective, because they're harder to measure. They have longer tail effects. But the storytelling that goes into creating your brand around a character in particular, and building up characters as a representation of who you are as a brand, what your core values are, what your beliefs are, is such critical messaging. I mean, this goes back to caveman times. It's a tale as old as time, and it works for a reason. But in the age of AI, there's so much less of a focus on how we take characters in our business, characters that we invest in from a marketing perspective, and build stories that really go end-to-end in our marketing efforts. So even for brands who are doing this really well, there are huge opportunities to make sure that these storytelling campaigns, which help build that long-term sentiment towards your brand as well as some short-term results, really resonate across every channel where you're working. How can you bring your email and your retention-based marketing folks, your loyalty team, your app team, your web team, all to the table to make sure that when you are spending money on these big storytelling efforts that hit primarily TV screens, that campaign gets carried out through your entire customer experience so that it's even more resonant? So challenge number two: think about the story that you're telling, how you can add that human element, and then how you can carry that story more broadly across all of your channels today. We are already starting to do some of this work for our partners who are planning for holiday and thinking about how they can make the most resonant holiday experience. So it's never too early. And then the last thing: loyalty programs are in a heyday right now.
I think with reduced spend across technology generally, in the SaaS space and in the software space, something that we're seeing is a real push for brands to do more with what they have today. And a lot of that means: how do we put a facelift on our loyalty program? How do we make it more impactful to our customers? How do we become more cost-efficient in how we're motivating our customers to exhibit certain types of behaviors, or take certain steps with our brand, where we don't give as much value away upfront and we really challenge them to do more through the loyalty program? We're seeing lots of brand makeovers in loyalty. We are seeing lots of integration work to make sure that data from a loyalty platform and a point of sale system flows into the CRM, is all connected, and is more measurable. So I suspect this is a trend happening far more broadly than just what I get to see through the 50 to 100 or so clients that we work with on a daily basis: loyalty investment is going to be in sort of a v2 state, where we're going to see a real resurgence and interesting things happening to drive customer loyalty, to influence customer behavior by offering rewards, and to really measure the results of that. So challenge three: how is your loyalty program doing? Does it exist? Does it need a facelift? Does it need a rebrand? What is the customer experience in the app and on the website as a way to draw people in to see their points or rewards? How is it integrated across different points of sale, both in-person and digital? How are you measuring the campaigns you're sending and the promotions you're using? Have you reinvented them lately? Are you testing? There are so many ways to take advantage of loyalty and be doing more with less. So those are my three trends. Those are my three predictions. Those are the three areas where we're spending a tremendous amount of time this summer. I hope you will join us. And then, without further ado, let's get to our fabulous guest, Pauline Reader.
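
To make the suppression challenge concrete, here is a minimal sketch in Python. It is illustrative only, not code from the episode or from any particular marketing platform: it assumes a made-up audience record with purchase history and location, and builds a promo segment by removing anyone who already bought the promoted item or who lives too far from the store to redeem a same-day offer.

```python
"""Minimal, hypothetical sketch of the suppression idea above.
The Member shape, SKUs, and distances are invented for illustration."""
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class Member:
    email: str
    purchased_skus: set   # items this person already bought
    lat: float
    lon: float


def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))


def build_segment(audience, promoted_sku, store_lat, store_lon, max_miles=25):
    """Start from everyone, then suppress people the offer can't help."""
    segment = []
    for m in audience:
        if promoted_sku in m.purchased_skus:
            continue  # suppress: already purchased, don't retarget
        if miles_between(m.lat, m.lon, store_lat, store_lon) > max_miles:
            continue  # suppress: can't realistically reach the store today
        segment.append(m)
    return segment


# Example: a same-day promo; prior buyers and far-away members drop out.
audience = [
    Member("a@example.com", {"PIZZA-LG"}, 40.01, -83.00),
    Member("b@example.com", set(), 40.02, -83.01),
    Member("c@example.com", set(), 34.05, -118.24),
]
print([m.email for m in build_segment(audience, "PIZZA-LG", 40.00, -83.00)])
```

In a real stack the same filters would typically live in a customer data platform or email/ads tool as exclusion rules rather than in ad hoc code; the point is that suppression is subtraction applied before any money is spent.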

Billie Loewen [00:13:10] Hi Pauline, welcome to Room For Growth. We are so excited to have you here. We are going to spend a good chunk of time today talking about your role at Podium. But before we do, I'd love for you to introduce yourself to us. Tell us about your background and some of the experiences that led you to places like Minted and Stitch Fix, and then ultimately landed you at Podium. What got you where you are today?

Pauline Reader [00:13:36] Yeah. So, Podium, what it is, is a SaaS communication platform primarily for SMB — small, medium-sized businesses — but it also caters to enterprise. The business started out in the texting, sorry, the review space. Small businesses need to collect reviews, and the best way to do that was via texting. And that kind of prompted the business to go into more of the texting communication, and later it added phone communication. And so if I think about the biggest challenges at Podium, on the marketing side, the company is now focused on a handful of different verticals, like 5 or 6 verticals. And so I think some of the biggest challenges were finding those small businesses, because there are 17 million small businesses in the United States, and only some of them are the right target for Podium. And then really reaching those SMBs. SMBs are a notoriously hard segment of the market to reach and to market to, and so I think just finding the right channels and the right avenues to reach them can be challenging. They're also not very confident, I would say, as a whole, around technology. And so another part of the challenge was not just getting to them, but also explaining the product and the solution in a way that they didn't feel intimidated by, and that they felt was for them. Not that we would say we were a SaaS technology platform, but when they hear anything like that, they often feel, "Well, this isn't for me. This is for someone bigger than that." So I think convincing them, or explaining to them, that they could really find value from it, and that their peers find value from it, was another challenge.

Billie Loewen [00:15:31] So interesting. Definitely want to dig into that more later. It's such an interesting conundrum: how to say we're SaaS for exactly the size of business that you need. Places like HubSpot have done that really well. But to your point, it's such an interesting challenge when the amount of sales time you can put toward it, which traditionally would have been a face-to-face conversation to say, "Here's why we're the right solution for you," now has to happen through digital means relatively quickly. But you are also at the cornerstone of what is probably the largest and most important set of acronyms in technology today, which is AI and ML. So with your experience and your expertise in both marketing and in the growth of these brands and how they go to market, how did you see AI and machine learning playing a role in how you delivered personal experiences, both for direct-to-consumer brands like Stitch Fix and Minted, but then at Podium as well?

Pauline Reader [00:16:34] Yeah, I think Stitch Fix is a really interesting example, because it was a company that was built on machine learning. I mean, in essence, it's a machine learning company. Everything about what they do is driven by models. So just to explain what the service is for those who don't know the customer experience: you fill out a style profile in which you explain what you like, what styles of clothing you like, and your sizing. And then there are machine learning algorithms that effectively rank, or assign a score to, every piece of clothing for you. And what that score is, is the probability that you buy it. Right. And so everyone's score is different, of course, because the probability that you would buy something is different; you and I would have different scores. And then there's a human stylist that uses that information and layers on their human judgment to decide what to send you. And you would get a box of five items. You keep what you like and you send what you don't like back. And so it's really that intersection of machine learning and human judgment that is the company. But I think as the company grew and evolved, we found machine learning applications throughout the company. So in my area, in marketing, we used it in multiple ways. One way was to create what we call style cards, or outfits, where it would show five different items. We made thousands and thousands of these versions, and then we used them to power something called a Dynamic Product Feed in Facebook ads, which is traditionally more for an eCommerce company to upload their inventory. But we didn't expose our inventory directly to customers, so this was kind of our way around it. And that was driven by machine learning and was quite successful. When we think about building personalized emails, or push notifications, or even pieces of direct mail, it's really, really important to be relevant to people. It's more effective, especially in apparel, where if you don't show people things that they're interested in, it really doesn't matter what you say at that point; you've lost them. And we had all this amazing data on people, and when I first got there, everyone was kind of getting the same thing. It was this batch and blast approach. And using these personalization models, these machine learning models, we evolved the program such that everyone was getting an email that was unique to them. Same with push notifications. And these are millions of customers, so millions of unique messages. And it was really powered by machine learning algorithms. So those are just some of the examples.
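
As a rough illustration of the rank-then-style workflow Pauline describes (a model scores every item per customer, then a stylist applies judgment), here is a hypothetical sketch. The scoring function is a stand-in, not Stitch Fix's models; the real system uses trained purchase-probability models, and a human makes the final picks.

```python
"""Illustrative shape of the rank-then-style hand-off described above.
The scoring logic, tags, and profile fields are invented stand-ins."""

def score(item, profile):
    """Stand-in for a model estimating P(customer buys item): here it just
    rewards overlap between item tags and stated style preferences."""
    overlap = len(set(item["tags"]) & set(profile["liked_styles"]))
    return overlap / max(len(item["tags"]), 1)


def shortlist_for_stylist(inventory, profile, shortlist_size=30):
    """Rank every item for this one customer and hand the stylist a
    manageable shortlist; the stylist, not the model, picks the final five."""
    ranked = sorted(inventory, key=lambda item: score(item, profile), reverse=True)
    return ranked[:shortlist_size]


# Two customers get different rankings for the same inventory,
# because the score is computed per customer.
inventory = [
    {"sku": "A1", "tags": ["casual", "denim"]},
    {"sku": "B2", "tags": ["formal", "linen"]},
]
profile = {"liked_styles": ["casual", "denim"]}
print([item["sku"] for item in shortlist_for_stylist(inventory, profile)])
```

The point of the shape is the hand-off: the model narrows thousands of items to a shortlist, and the human chooses what actually ships.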

Billie Loewen [00:19:27] Such a fun industry. I remember when Stitch Fix was new on the scene, and the notion of renting or borrowing clothes, having them curated for you, and returning what you don't want was such a new and novel concept. In fact, I have a cousin who was a data scientist who worked at Stitch Fix in the early days on some of those algorithms, and I just loved learning from her about how Stitch Fix was thinking about being one of the first on the scene to really leverage AI and machine learning. I also find it really interesting that you started as a technology eCommerce company leveraging AI for how you sell, but then figured out how to use things like machine learning to drive business efficiency and personalization in the customer experience. So all of that is really cool. It's also now a really crowded space. Stitch Fix is no longer the only player. There are lots of different models, everything from Rent the Runway to Nuuly to all sorts of players. So it's a rapidly changing industry. I'm really curious, as the technology improved and became more widely adopted, what else you did to try to stay cutting edge or stay ahead of copycats in your industry?

Pauline Reader [00:20:42] I think, well, on the machine learning front, honestly the most valuable thing is the data. Because if you don't have the data, then it doesn't really matter how good your model is; it can't do a whole lot. And so a lot of the evolution at the time was trying to figure out low-cost ways, and I mean low cost to the customer, of getting more information. So a while ago, this is some time ago, something called Style Shuffle was created. It was basically this fun little application where a piece of clothing would appear and you would say whether you liked it or you didn't like it. And people love rating things. I don't remember the number, but it was millions and millions of votes, and customers liked doing it. And then we would get the data that would then better power these algorithms to deliver an even better customer experience. And so part of the evolution was finding additional ways of collecting data. And I think another example that's been more recent is the feedback people provide, a lot of the time to their stylist. You know what they bought, you know what they didn't buy. But then they'll leave notes and comments in free-form, non-structured text about why they didn't like something or why they did like something. And that is very, very hard. A human can take that and read that and use it just fine. But when you think about millions of data points like that in free-form text, how do you extract, in a scalable way, what that is saying? And so I think some of the new advancements in machine learning are allowing for that kind of extraction, making structure out of something that's unstructured. So that's another example of how Stitch Fix is continually trying to stay ahead with AI and machine learning.
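
Pauline's point about making structure out of unstructured feedback can be sketched with a toy example. A simple keyword pass stands in here for the machine learning or language-model extraction she describes; the taxonomy and patterns below are invented for illustration.

```python
"""Toy sketch of turning free-form feedback into structured fields.
Real systems would be far more robust than this keyword pass."""
import re

# Hypothetical taxonomy: map a structured reason code to trigger phrases.
REASON_PATTERNS = {
    "fit_too_small": r"\b(too (tight|small)|runs small)\b",
    "fit_too_large": r"\b(too (big|loose|large)|runs large)\b",
    "fabric": r"\b(itchy|scratchy|thin)\b",
    "price": r"\b(too expensive|overpriced)\b",
}


def structure_note(note):
    """Return one structured record per free-form note."""
    text = note.lower()
    reasons = [code for code, pattern in REASON_PATTERNS.items()
               if re.search(pattern, text)]
    return {"note": note, "reasons": reasons or ["other"]}


notes = [
    "Loved the color but it was way too tight in the shoulders.",
    "Fabric felt itchy and honestly overpriced for what it is.",
]
for record in map(structure_note, notes):
    print(record)
```

Once every note carries structured reason codes, millions of comments become something you can count, trend, and feed back into the recommendation models.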

Billie Loewen [00:22:56] Well, perhaps being cutting edge without realizing it. Stitch Fix was, of course, one of the first to take AI modeling and pair it with high human touch. I was always really curious how much of Stitch Fix's magic was the human component of what was happening, this notion of a personal stylist, adding dimensionality to the recommendations, versus the AI component, which was really trying to understand out of thousands of options, which few would you funnel down to for an individual consumer? Can you talk a little bit about what you learned about the importance of both, and how to bring people and AI together to help make the most intelligent recommendations to customers?

Pauline Reader [00:23:43] Yeah. I mean, I think, even now versus when I was there, there are just always limits to any kind of data modeling or any kind of AI. There's just no way around it. And there are some things that humans are just better at doing, and some things that they're a lot worse at doing. And so it was really that combination, having the two working together, that really made it powerful. So, you know, a stylist would be looking and choosing what five items to send, and they'd be looking at scores on outfits. But that didn't take into account what the customer just told you last week or a month ago, or what they put in their last note. And so it was the human layering on top some judgment that was hard for the models to really discern, and the two worked together. I think each by itself would have been far less valuable to the customer, or wouldn't have delivered as good results. Now, I think as these models evolve, you could imagine that over time, perhaps the human element becomes less. You could picture that. But to me, that would just mean you would use people to raise the bar on your service even more, not just replace them and do everything via the models.

Billie Loewen [00:25:14] Well, and to your point, any kind of personalized marketing really relies on data that is actionable, that provides some kind of actionable insight. It's not just collecting for collection's sake, which, you know, I love the conversation around what you do with unstructured data and this notion of judgment. There's not a replacement for that at the moment, at least not at the speed so many technologists hope AI will eventually reach. What was some of your experience in making sure that you were delivering a personalized experience and were really able to leverage data to do something actionable with it? What considerations or strategies do you have for the line between collecting data for data's sake and truly creating something that's more personalized or more actionable?

Pauline Reader [00:26:04] I mean, I think we wouldn't collect data if it wasn't for some kind of value or some kind of purpose. And that almost always was about delivering a better customer experience. I mean, there are probably some exceptions that I can't think of, but by and large it really was, because asking people for information creates friction, and that makes your customer experience worse. So you really don't want to ask for, or collect, things that are extraneous. Because if you're not going to use it to provide a better experience, then they're just going to get friction, and no one really benefits from that. So yeah, I would say pretty much all the time it was not just collecting for the fun of it, but for a purpose.

Billie Loewen [00:26:59] I think that's so interesting, though, that your answer is sort of like, well, of course, the only reason to collect anything is: what is the purpose and what are you going to do with it in the end? We work with a lot of partners where the people who are creating, say, the naming conventions in the architecture for analytics, or considering what data to capture in some kind of central repository, or a customer data platform, or wherever that data is stored, are fairly separated from folks making the highest business decisions about where to invest in customer experience or where to improve personalization capabilities. And so often, the methodologies that go into planning for how to use data don't have that kind of clarity. There's not necessarily a mechanism to say, why are we collecting this? What will it be used for? And then make the judgment call of, well, if that question can't be answered, let's just not like-- so many of our partners just default into: collect everything. What advice might you give those companies, especially coming from a place like Stitch Fix, which is relatively large in size, it wasn't as if it was like a total startup in terms of scale. But from what I'm hearing, you had a lot more capability to be very close to the methodologies going into this data collection, and that was key to the strategy.

Pauline Reader [00:28:20] I guess it depends. I mean, there are kind of different kinds of data collection, right? There's the kind where someone's using your product and you're collecting how they're using it. And with that, you don't always know what you're going to need. So I think collecting that kind of information is super important, super valuable, because you're going to want to look at it and understand how customers use your product so you can build a better product. The kind that has friction, or is explicitly asking people for things, is the kind that you really want to be thoughtful about. I mean, you should be A/B testing that. You should be very thoughtful about everything you ask, because you're going to lose people; people drop off, right? You're going to lose some amount of customers by doing that. And so I guess what I would say is, make sure to test it, make sure you need it, before just asking for it.

Billie Loewen [00:29:30] And how do you think about the line in terms of customer privacy, where even if you can collect it, should you?

Pauline Reader [00:29:40] I guess the way I view privacy is, if it's about how people are using your application, and you're asking information from them to better understand that, and you're keeping it for yourself to make a better product, to me, I think that's pretty fair game. I think any rational consumer or customer would assume that's happening. I think where it becomes a stickier issue is then sharing that information with other platforms in anonymous ways. That's where you have to be a lot more thoughtful and give customers the ability to opt out, for you not to send it, or even sometimes maybe not collect it. I think that's where the line gets crossed, and you have to think a whole lot more about what the right decision is and what the right balance is.

Billie Loewen [00:30:38] Yeah, it makes total sense. You've talked a bit about some of the challenges and roadblocks that you face when you are establishing the data infrastructure to be doing AI or any kind of machine learning. I'm curious about other kinds of roadblocks or challenges that you have seen, that you might have a bit of a guidebook for, beyond just "how do we constantly generate enough data?" And then, even on that topic, maybe take us a layer deeper: when you start to run into, oh my goodness, we need so much data to keep this model working for us, how do you start to solve some of those challenges?

Pauline Reader [00:31:18] I mean, I think the challenge that I would see the most with machine learning types of applications was the difficulty in interpreting the models. I think that's one sort of weakness or flaw of something like machine learning. The whole point of a machine learning model is that it's making inferences and figuring out patterns that a straightforward model, one someone could easily interpret, can't capture. That's the whole point of it. But then the downside is that it's not always intuitive why you're getting the answer you are, or why it's predicting the answer it is. And so sometimes a fair amount of work has to go into unpacking the models and understanding why they're saying what they're saying, so you have confidence in them. And that's an issue I saw quite a bit at both eBay and at Stitch Fix. It's something that took a lot of time. And then oftentimes the data scientists that are building these models are giving answers to people who don't necessarily understand them, and maybe don't always necessarily like the answers. And so I think that's where I've seen some of the biggest issues.

Billie Loewen [00:32:44] Do you have an example of that that would make this-- There are so many marketing leaders right now who I think want to better understand how they're supposed to be leading in this age of AI takeover, really and truly. But even the building blocks for things like, "hey, when you're building these models, build in time to interpret their meaning." What's an example of that? What would be a practical step of what to expect?

Pauline Reader [00:33:12] Like an example where they're hard to interpret. You mean?

Billie Loewen [00:33:15] Yeah, totally.

Pauline Reader [00:33:16] Yeah. I think where I saw it come up a lot is in these types of media-mix attribution models, or combining attribution with causal A/B tests. We had this at Stitch Fix, and we had had this at eBay too, where we would basically be trying to figure out how much to spend based on what the CAC was looking like relative to our ATV, what it looked like last week, what it typically looked like this season, and what the allocation between channels should be. So you're making constant spend decisions, both on the amount to spend and where to allocate it. And that's where this comes up a lot, probably because that's also somewhat of a controversial area to begin with, in terms of what to do, or "the right answer." And it's not as black and white as any of the numbers purport it to be. And so, you know, I think sometimes the models would suggest, and I'm exaggerating here, radically increasing Facebook or search or one channel, and sometimes it was a little bit of a head-scratcher, because it's not really obvious why we would do that. That kind of thing happened a fair amount, I would say. And at eBay, within paid search, it was a version of that as well.
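
To show the kind of arithmetic hiding behind the spend-allocation debates Pauline mentions, here is a deliberately transparent toy: it computes CAC per channel and nudges budget toward channels beating a target CAC. Real media-mix models are far more involved, which is exactly why their recommendations can be hard to interpret; all numbers and names below are invented.

```python
"""Toy version of the spend-allocation question discussed above.
Channel figures, the target CAC, and the step size are all hypothetical."""

channels = {
    # spend and attributed new customers over some window (made-up numbers)
    "paid_search": {"spend": 50_000, "new_customers": 1_000},
    "paid_social": {"spend": 40_000, "new_customers": 500},
    "direct_mail": {"spend": 10_000, "new_customers": 100},
}

TARGET_CAC = 70.0          # what the business can afford given order value
REALLOCATION_STEP = 0.10   # shift at most 10% of a channel's budget per cycle


def propose_budgets(channels, target_cac=TARGET_CAC, step=REALLOCATION_STEP):
    proposals = {}
    for name, c in channels.items():
        cac = c["spend"] / max(c["new_customers"], 1)
        if cac < target_cac:
            change = 1 + step   # beating target: propose spending more
        else:
            change = 1 - step   # missing target: propose pulling back
        proposals[name] = {"cac": round(cac, 2),
                           "proposed_spend": round(c["spend"] * change)}
    return proposals


print(propose_budgets(channels))
```

Unlike this toy, a real model's recommendation is not a two-line rule you can eyeball, which is why the interpretation work Pauline describes ends up mattering so much.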

Billie Loewen [00:34:46] I'm almost chuckling just because I love that example. Attribution remains incredibly challenging to figure out. What should spend on a given medium be? Is AI ever going to be capable of solving some of these "there's no clear answer" kinds of questions for marketing leaders, or for anybody trying to drive customer experience? It strikes me that in the world of personalization you're trying to win in the market because you're bidding on keywords, but you're also figuring out where and how to optimize the customer experience. There's a lot of gray area where you can make a very intentional decision to go left or right, but there's not always a perfect black and white. So even within the space of attribution, though examples outside of it are great too, where do you see AI continuing to advance, versus where is human judgment and discernment going to remain really critical to any marketer's strategy?

Pauline Reader [00:35:46] I think the areas where AI can go a very, very, very long way, and maybe lessen or minimize the role of people, are understanding sentiment and making structure out of unstructured data. That seems to be one area where AI has been quite good, and it's only somewhat early days. And I could see that being a place where less and less human involvement is needed. And then writing content, too. As an example, at Podium, one of the things we were doing was trying to write content leveraging AI. And a lot of the human work was less in the writing and more about iterating and getting the right prompts. What do you ask it? How do you ask it such that you get the right information? That's where a lot of the human time went, getting that right. And once we got that right, you didn't need to do it again; it was fairly scalable. But the setup and the initial part definitely did take human iterations. And then, I also imagine you're always going to want some kind of process where there's human evaluation or human judgment at the end of it. Part of the reason to do that is to give the model feedback and information on how well it did, so it can take that in and learn better. I mean, I think where AI is never really going to get to play a role is more in human relationships, that trust. Obviously the tools around that, which we all use, can be powered by AI, but the actual way I'm talking or conveying myself on this podcast, AI is not going to help that. Maybe it can get me here faster, but it's hard for me to see a model substituting for that kind of human relationship type of activity.

Billie Loewen [00:38:04] Totally. I'm curious. And again, staying kind of close to AI and machine learning at the moment, so many businesses are trying to understand where to even get started, what their long term strategy should be. They're trying to think about how to leverage AI next week and five years down the road. How do you strike the balance between the appropriate short-term goals and a longer-term strategy in this space?

Pauline Reader [00:38:33] Yeah, I mean, I think you probably hit on the question that everyone wrestles with almost every day to some extent. And ultimately, I don't think there's ever some kind of magic bullet or silver bullet or magic answer there. One framework, or way to think about it, is being very clear on your end goal and your vision and your North Star. And that should rarely, rarely, rarely change. But the way you get there, the path to get there, that is where you're more willing to be flexible, to find that Path A doesn't work, so you need to maybe try Path B, or maybe try Path A in a different way. So that's sort of the thinking: the long-term vision, what you're trying to get to, should be fairly immovable. But on how to get there, you should be pretty flexible and open to cutting. Not everything is going to work, and you kind of have to let things go and move on to the next thing.

Billie Loewen [00:39:38] How do you think about or advise business leaders to be investing in technology right now? I think I remain cognizantly aware that almost every SaaS company in the world is going to start building some kind of add-on AI feature that may or may not be a core part of what their platform was built to do. There's going to be the emergence of lots of new, very AI-specific companies and tools and technologies. I'm curious if you have a methodology for decision-making around what is a good piece of technology to invest in or trust in, versus where to have some caution?

Pauline Reader [00:40:17] I mean, I think the approach you should take is not just doing AI for AI's sake or machine learning for machine learning's sake. But really, what is the problem that you're trying to solve? What are you trying to solve? And really leading with that first, which sounds maybe pretty obvious, but I think it can also be hard, and you can get caught up in feeling like you need to keep up. But really keep at the forefront of your mind: what are you trying to solve for? And I always find it funny. I find a lot of companies out there are trying to plaster the AI word everywhere. And I find it so odd, because by itself, I don't see how that's at all valuable to anyone. It's what it's doing for you that matters. So I find it quite curious that I'm seeing so much of that these days. I mean, I guess it's understandable, but also odd at the same time.

Billie Loewen [00:41:19] Yeah, "AI Included" is the new "non-GMO" label. That's just on every product. Just a nice, like, sticker. And I think you're totally right. So many companies aren't even sure what problems they could be solving with AI, or could be solving really well, or frankly, how much to invest in those experiences. I think I've said the words more frequently lately, like, yikes, this seems like too big of a roll out of a feature that you're not totally sure how humans are going to react to something that's not human. Maybe we bring it back a little bit. Maybe we plan to over-staff humans to interpret what's happening a bit. I'm also curious if you have advice in that space, if you're going to market with something new that involves AI, what's the right size of a test to do with that? Is it something that should be rolled out to all customers? Do you recommend testing in small increments?

Pauline Reader [00:42:18] I think you always want to test. There are rare circumstances, I think, where you don't want to, and this would definitely, definitely be an example where you want to make sure that you have it right and you've nailed the customer experience. I mean, at Podium, I think it was about a year ago. Some of what Podium does is allow businesses to talk to their customers, and the communication would happen via text, for instance to thank them for a review or to ask for a review. And a lot of businesses were having to hand-write these messages. So what we did was basically ask, what's the action you want to take? And it would give you a template or something to start with. But yeah, we absolutely tested that before rolling it out. So I think, especially around new technology that is customer facing, I would definitely advise testing it.

Billie Loewen [00:43:27] You recently started your consulting business, Reader Consulting. Congratulations. And I know you are still within the first year, the first six months, but I'm curious what types of market challenges you are seeing most frequently, or what kinds of challenges you're being called in to spend the most time on?

Pauline Reader [00:43:51] Yeah, to be honest, it's pretty wide-ranging. I mean, I'm working with companies who are just barely getting their marketing program set up, and it's very much in its infancy, and then others, like public companies, that have pretty large marketing teams and are trying to address a certain area. So I'd say, at least so far in my experience, it's pretty wide-ranging. I would say most tech-forward companies are actually trying to figure out this AI question: how we can better leverage this technology, how we can become more efficient, how we can reduce costs, or maybe not reduce costs but keep costs the same and do more, produce more output. And so for the more tech-forward companies, I think that's definitely a common one.

Billie Loewen [00:44:57] A fun set of cross-functional challenges. None of your days are the same, it sounds like.

Pauline Reader [00:45:02] Yeah, yeah.

Billie Loewen [00:45:05] And then I always ask this question, because so much of this podcast is about how brands create experiences that customers become truly loyal to, and we spend quite a bit of time grappling with what loyalty even is and how you build it. I'm curious what brands you are loyal to and why.

Pauline Reader [00:45:29] Maybe I'll give you a different one, not a common one. I have two kids, and there's this kids' bike company called Woom. I think they're from Austria. And it's the kind of thing, if you don't have kids, you would never know them, and if you do have kids, you probably see them constantly. But I'm very loyal to them because they just make an amazing product. It's more expensive, but they're incredibly lightweight. And when you have a four-year-old that's learning to ride a bike, the difference between something that's 11 pounds and something that's 20 pounds is really, really significant. And I think what they do is kind of simple, all they do is bikes, but they just do it really, really well. And so I'm an admirer of theirs. I think another one, just speaking about personalization and who does it really well, is Netflix. They just do a fantastic, amazing job. I've noticed, when I wanted to watch something, a movie or a documentary or something, I used to go to Metacritic or Rotten Tomatoes to look stuff up. And I've kind of stopped doing that, because I find the recommendations are so dead-on for me that it's just a better way to discover content. And so I think they have it down pretty well and do a pretty amazing job. And I'm never without something to watch; I can always find something, and almost always like it too. So I think they've done a pretty amazing job.

Billie Loewen [00:47:17] Yeah. Netflix is a constant one to watch, to see how they're thinking about the content hierarchy in the scroll of their home feed, because it's really fascinating. I think my one gripe with them is they occasionally forget that a show I used to binge-watch, one entirely out of my normal realm of content, will get a new season, and then they won't immediately and promptly scream it at me.

Pauline Reader [00:47:41] Really?

Billie Loewen [00:47:42] Like, I really like the show. There's a survival show called Alone; it's a very rugged outdoor show.

Pauline Reader [00:47:49] Oh, I love that show!

Billie Loewen [00:47:50] Yeah, I love that show. Yeah, yeah, yeah. As soon as it comes out, I want to, like, there's something so ethereal about it. I want to binge it. And they always forget to tell me there's a new season. Or, let me think if there's another example. Oh, like Couples Counseling. I think it's such an incredible show.

Pauline Reader [00:48:07] That one. Yeah.

Billie Loewen [00:48:08] And actually I think that's on a different network, but I'm always annoyed that I'm like, how do I explain that? I am so passionate about this show that as soon as it's coming out, I want to be made so promptly aware?

Pauline Reader [00:48:20] That's funny. I remember, this is probably ten years ago or so, that documentary Making a Murderer. I remember I was at my parents' place, I was in Canada, and I got a notification that this was coming, and it was completely 100% up my alley. And I think I watched it the day it came out. And it was on a recommendation from them. I probably would have eventually heard about it because it became so popular, but I started watching it the first day it became available because of the recommendation.

Billie Loewen [00:48:59] Yeah. Very fascinating. They always do such a great job of setting the bar for customer expectations. I love all the brands that everyone else today has to mimic because they're doing something so good.

Pauline Reader [00:49:12] Yeah. I'm also a big fan of Etsy, just because they have products that are hard to find, or at least hard to find in the quantity that you find on their platform. I always wonder why I don't shop more with them, because they just have so much and it's so good and unique.

Billie Loewen [00:49:38] Yeah. As far as marketplaces go, I don't know that there is a better one in terms of the quality and quantity of items sold. I think they don't perform nearly as well in search as a larger brand with a narrower focus would, which makes it really challenging to find their items when you're searching for something.

Pauline Reader [00:49:58] Yeah.

Billie Loewen [00:49:59] You have to remember Etsy.

Pauline Reader [00:50:01] Oh, yes. That kind of thing, yeah. I think for inspiration and for ideas, and just the uniqueness of the items they sell, I find them to be really good.

Billie Loewen [00:50:15] Totally. Well Pauline, it has been great to have you here. I think for so many, as much as we talk about AI and machine learning, it is still a bit of a black box and you have left us with some really practical tips, tricks, things to think about, especially as somebody who for a decade, dare I say, has been working in AI. So I love your sort of very practical "of course this is what we're doing. It's not quite as complicated all the time as we make it out to be."

Pauline Reader [00:50:47] Thank you. Well, thanks for having me.

Want to get in touch?

Is there a topic you're dying for us to cover? Or do you have a guest nomination? If so, shoot us an email.