6 Common Mistakes Companies Make When Launching GenAI Pilots


Upcoming

July 17, 2024, 11:00 AM EST

6 Common Mistakes Companies Make When Launching GenAI Pilots

Ready to kick-start your genAI pilot but anxious about potential pitfalls? Learn common missteps to avoid and strategies to successfully pilot your genAI use cases.


Meet the speakers

Jessica Hreha

Director, AI Transformation Office, Jasper

What we'll cover

  1. Navigating Common Challenges: Learn about the six critical mistakes that can derail your genAI initiatives right from the start and how to avoid them.

  2. Strategic Implementation: Get practical tips on choosing the right focus areas, assembling the ideal team, and setting up effective training and feedback mechanisms to ensure your genAI projects succeed.

  3. Achieving Tangible Results: Understand the importance of baseline documentation, clear goal setting, and continuous improvement to measure success and drive real business outcomes with your genAI projects.

Don't miss this opportunity to gain the knowledge and tools you need to lead successful genAI pilots in your organization. Register now to secure your spot and take the first step towards transforming your marketing strategies with the power of AI.

Replay

Missed the webinar?

Fill out this form to watch the replay.

Full Transcript

Welcome and Introductions

Krista Doyle: Welcome, everybody making their way into the room. Welcome to Avoiding Pitfalls: 6 Common Mistakes Organizations Make in Getting Started with Their GenAI Pilots. I'm Krista Doyle. I help lead content here at Jasper, and your host for today will be Jasper's very own Head of Marketing AI Strategy and Client Transformation, Jessica Hreha. So obviously Jessica is an expert on this topic. She will tell you all about herself in a bit. She has a really exciting talk planned for you all today, but I wanted to go over some quick housekeeping while we're waiting for everybody to finish loading in. So feel free to use chat today as much as you want to ask questions and add commentary. We will have Carissa from the Customer Success team here with us today, interacting with you from time to time and helping to answer some questions. And then at the end we will do a Q&A, so please send in any questions you might have for Jessica toward the end of the talk. With that, I will turn it over to Jessica. Welcome, and everybody have fun.

Jessica Hreha: Sounds good. Thank you so much, Krista. Welcome, everyone, to our session today. So happy to have you all here, and maybe watching on demand later. I look forward to hanging out with you for the next, call it, 40 to 45 minutes or so. As I said, feel free to connect with others in the chat and submit your questions in the Q&A box. If you do have other resources to share, or if you're a former colleague of mine and you have examples or experiences from your own pilots, feel free to use the chat. This is a community. We're all here to learn together. There's no single right way to do things, and I learn from others every day as well. So with that, let's get started.

About Jessica Hreha and VMware GenAI Pilot

Jessica Hreha: I want you all to know I am not years into an AI career, even though Krista just called me an expert, and you don't have to be either to take action, step up, and lead right now. At my previous job, I spent three years at VMware. In the last position I held there, I led global demand content strategy, and we embarked on a pilot with Jasper, which I'll go into in more detail in a few minutes. But with the introduction of ChatGPT, the message was clear, right? This was going to have a much more sweeping impact on our entire marketing team, and we needed to involve other team members in our pilot and our AI strategy to really figure this all out together. So I started a cross-functional Marketing AI Council, officially in February of 2023. That team created AI usage guidelines for marketing, established a steady beat of AI literacy and enablement programming, and we expanded our Jasper licenses from just my team of nine to over 750 of our global marketers at VMware at that time. Now I'm here at Jasper, super honored to be helping other enterprise clients at various stages of their adoption journey and to continue to evangelize and advocate for responsible AI adoption. The point is, everyone has to start somewhere, with one use case. Pre-ChatGPT, when we were introduced to Jasper, we looked at it around kind of one use case, all coming back to our team's KPIs. It's a great place to start because we all have KPIs for a reason, right? An area that needs improvement. For us, the production of our promotional campaign bill of materials wasn't quite meeting the timelines required by our stakeholders: regional field marketing and our digital media planning and execution teams. So we were missing media deadlines and not getting content into market in that quarter. The global teams were waiting too long for translated bills of materials, and the partner teams had even longer to wait for their partner-personalized BOMs as well.
So we hypothesized that by using genAI, we could reduce this time frame, because that current content creation model was just too agency-dependent, overly complex, and inefficient. So we were really wasting time and money, and often at the expense of quality, because our time was so squeezed. So our initial pilot goals were this: by decreasing time spent waiting for agencies to produce content, we could spend our own time drafting and then more of our own time actually editing. With that, we hypothesized we could cut agency fees from our team's spend by 50% while increasing production velocity and getting our campaign packages into market faster. And then when other marketing functions started getting involved with our Marketing AI Council, we realized we weren't just promoting a writing assistant; we were testing out the capabilities of a tool that could really revolutionize the way our marketing team operated: boosting efficiency, saving time, and fostering creativity. And none of this information, I'm sure, is new to you all by now, but thus the Marketing AI Council was born. We created this charter, educating and empowering our marketers by focusing on tools, governance, and education. And that's how it all scaled from there. So I've been at Jasper for the past five, close to six, months now, talking with companies, but Jasper's been doing this for a few years now, talking to companies as they adopt generative AI. And even as I'm catching up with customers at their various phases, we're spotting definite trends in this path to adoption, and points along the way, points along this path, where companies can stall out. And this stall-out usually happens at a stage when a shift in strategy is necessary, or leadership, or a different type of leadership, needs to emerge. So companies start out by trying out AI for individual acceleration. A couple of content marketers use it to help get them through blog posts, for example. Right? But they never get past those individual productivity gains.
Team acceleration looks like rolling out an AI system across a team: maybe adjusting the way that you work and your workflows to incorporate AI, using a brand and style guide for consistency across content creation, maybe even in your editorial review process. Business acceleration looks like when you combine generative AI with the analytical side of AI, machine learning, which has been around for years, right? But connected to your data sources to spot trends in performance and implement continuous optimization with content that drives better performance outcomes. Really revenue-impacting. But again, this takes a shift in strategy and a different type of leadership to make that jump. I want you to know, though, most people, most companies, are still stuck in that individual experimentation phase at the beginning. Which is why I want to talk to you about going from individual to team acceleration with what we call managed experimentation. I've been talking at other events this year about these six steps to getting started, but I thought it would be interesting to turn it around and start with the pitfalls, as more of an explanation as to why these steps are so important.

Common Pitfalls in GenAI Pilots

Jessica Hreha: So let's talk about what I see as how to avoid common pitfalls as you prepare to set up your team-based genAI pilot. There are so many opportunities and ideas and ways to apply genAI in marketing, right? And I hear from leaders all the time about all of the use cases they're excited to tackle, to the point of overwhelm, because we're having the same conversation months down the road as they're struggling with exactly where to get started. So they don't, right? Thinking too broadly, boiling the metaphorical ocean, can really stall your ability to get off the ground, even though there are so many things I know we are excited to tackle. But the best way to get started is really to choose small, focused use cases: one use case, or a few use cases, in one part of the business. Which team or BU experiences the biggest bottlenecks in content creation workflows, for example? Or which area of the business has large amounts of content that need refreshing or repurposing? You want to choose an area of the business that can commit the time to test, and then you can scale workflows across the org. If you were to map out all of your different use case ideas on a quadrant of value versus effort, you would want to choose high-value, low-effort use cases as your pilot candidates. Because my advice is to keep it simple, just for now. Integrations, for example, are an important part of streamlining and scaling workflows with AI, especially in startups or smaller companies. I talk with leaders who say, "But I can create it all integrated from the beginning," right? But you don't necessarily need to start there, and that still takes time to set up. In the meantime, you could be missing out on leveraging and benefiting from use cases that you can start now, and then in parallel set up those advanced and more integrated use cases.
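The value/effort mapping described above can be sketched as a quick script. This is a hypothetical illustration only: the use case names, the 1-to-5 scores, and the midpoint cutoff are invented for the example, not taken from the talk.

```python
# Hypothetical sketch: ranking candidate genAI use cases on a value/effort
# quadrant. Names and scores (1-5 scales) are invented for illustration.
use_cases = [
    {"name": "Blog repurposing",         "value": 4, "effort": 2},
    {"name": "Fully integrated CMS bot", "value": 5, "effort": 5},
    {"name": "Email subject lines",      "value": 3, "effort": 1},
    {"name": "Translated partner BOMs",  "value": 5, "effort": 4},
]

def quadrant(uc, midpoint=3):
    """Label a use case with its value/effort quadrant."""
    value = "high value" if uc["value"] >= midpoint else "low value"
    effort = "low effort" if uc["effort"] < midpoint else "high effort"
    return f"{value}, {effort}"

# Pilot candidates are the high-value, low-effort quadrant.
pilots = [uc for uc in use_cases if uc["value"] >= 3 and uc["effort"] < 3]

for uc in use_cases:
    print(f'{uc["name"]}: {quadrant(uc)}')
print("Pilot candidates:", [uc["name"] for uc in pilots])
```

The point of writing it down, even this crudely, is that the prioritization becomes explicit and debatable rather than a gut call in a meeting.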
You want to begin with a small pilot to validate the technology and gather the data to build that stronger business case for scaling. So: high value, low risk. Demonstrate the value of AI and then build momentum for scaling across your organization. So what does your team do repeatedly? Where can you save time while increasing output and quality? Which parts of any workflow or process slow production or are squeezed? Where do you actually need more time in the process but don't have it, because you're spending more time in other parts? Everything from research and ideation, distilling research reports and insights to inform your strategy, to building a better, more informed campaign brief, to creating campaign content and then repurposing that content across your buyer journey. Start with defining your use cases. What are your business priorities? How can AI help you get there? Remember, AI is not a strategy in and of itself. It shouldn't be a KPI to "do AI" this year, right? It's a way to accelerate the goals and strategies you already have. But to get started, we've got to choose one or a small set of pilots, get off the ground in a meaningful way, and then build on your use cases from there. Number two. I don't know about you, but does anyone relate to this: how often does a new workstream or center of excellence get established, and then people are just appointed to lead it? With AI especially, is this someone who's really excited about rolling up their sleeves and uncovering use cases, or did they just get told to do something with AI this year, right? Are they leading because they're already the head of marketing ops or martech, or because the expectations of whatever role they're in say that they're supposed to do this? Especially if this person is higher level, they're further removed from that hands-to-keyboard work, right? Which can sometimes slow progress and stall the culture and change management that's really needed here.
And yes, I'm covering two topics in one here, but has anyone ever run up against a defensive marketing ops or legal, IT, or security team who maybe didn't act as warmly as you expected when this project just got thrown over to their side of the fence? Right, let's talk about this. But first I need to sidestep here for a moment. And if you are tracking, or have a bell for, every time someone says "change management" nowadays, now's the time. But I want to touch on the importance of this idea of a guiding coalition in change management, because Dr. Kotter, professor of leadership at Harvard Business School, has an article on change management that confirms what we saw at VMware and what I'm seeing with our customers now across all industries and segments: this idea that a change management coalition need not include members of senior management, and that it does tend to operate outside of the normal hierarchy, because if the existing hierarchy were working well, there would be no need for major transformation. And this comes into play when you're setting up the people involved in these initiatives. For this pilot team, look for the hand-raisers rather than appointing logical hierarchical leaders, or team members who do this function but may not be excited about doing it, because they have to be willing to spend time outside of their normal day-to-day obligations to run tests and document experiences. You need a lead, and people involved, who raise their hands and are excited to dig in. And if you have a use case that covers multiple teams, for example, I would recommend cross-functional participation, because then it allows you to gather outcomes from more than one group, which is important and can really accelerate things once you get to that scaling phase. But you need to include IT, security, privacy, legal, and marketing ops, of course, from the very beginning. Our CIO at VMware at one point told me to get his team in as early as possible so they knew what was coming. Grease the skids for them.
And the only reason I was even talking to our CIO is because my CMO came on board as our exec sponsor to help make those connections across the business. Just double-clicking on the importance of an exec sponsor: they can be at different levels of the organization based on your size and based on where your team is in its piloting or AI adoption journey. At one point it might be one VP, at another point it might be another VP. And it wasn't until five or six months later that our CMO came on board. So they all played various roles, but the idea is that they have the ability to create urgency and accelerate decision-making. They're bringing legitimacy, funding, and resources, rallying cross-functional support, making those introductions you might not have had otherwise, really being that C-suite champion and influencing the cultural shift needed in this level of change management. Number three. How many times have we gotten to the end of a project, or even the beginning of a discussion about changing the way we do something, when we realize we don't actually have documentation on what was done today or before? Right? Before we could apply AI to our global demand content operations at VMware, we had to understand the process that we had today, and most importantly, the time spent in each stage, to have a foundational baseline to even be able to show improvement. And we had to establish KPIs around what that improved timeline needed to look like. So I call this documenting the status quo of your workflow today. How many people or agencies are involved? How long does it take? How much does it cost? What's the level of quality? And then define your success criteria. Make a hypothesis on what success looks like and document that in terms of clear, measurable goals. So your success criteria can range from cost savings (how much can you save by bringing something in-house?) to time savings (how many hours are saved per project?).
Can you show a salary average for those hours, for added cost savings? Even to streamlining production: more time spent in review or editorial, faster time to market. And with that time saved, what can you do now that you didn't have time for before? And I love this one, because we all have things we know we should do, wish we could do, but we just don't have the resources, headcount, or hours in the day to get done. What if AI gave us that time? So I already talked about this for our team last year, right? I thought if we could draft content ourselves, we'd save a huge amount of time waiting for agencies to get back to us. We could spend more time on editing to improve content quality, save money on agency fees, and get our campaign BOMs out faster to our stakeholders. Ultimately, I signed up for a 50% reduction in agency fees, because that was the metric that mattered to my leadership to onboard our vendor. But if you're piloting webcast copy, you're measuring webinar engagement, right? And registration lift. If you're looking at email subject lines, you're measuring open and conversion rates, et cetera. Here's an example of how to document baselines in your use cases and the results. You can put this in whatever format you want, whether it's Word, or maybe you like Excel or Sheets, or you live in PowerPoint or Slides, right? But what's the status quo process today: people, time, cost? And then how many times do you do that per week, month, quarter, or year? This way you're not scrambling at the end to think backwards and figure out what it was at the time, when you really would rather be focusing on the results. And this is something that I talk to companies about months into their pilot: realizing that they didn't actually take this step at the start to document what they were using their AI system for. Right? So what are these use cases that you're going to do, and what does this process look like today? Then you can apply the AI results later. Do not skip this step.
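The "document the status quo" step above boils down to a few numbers per workflow: people, time, cost, and how often you run it. A tiny sketch like this makes the baseline concrete. All figures here are invented for illustration; they are not VMware's actual numbers.

```python
# Hypothetical baseline record for one workflow, captured BEFORE the pilot
# so post-pilot results have something to be compared against.
baseline = {
    "workflow": "Campaign BOM production",
    "people_involved": 4,
    "agency_cost_per_run": 8_000,  # USD per BOM package (invented figure)
    "hours_per_run": 60,           # internal hours per package (invented)
    "runs_per_year": 24,           # how often the workflow executes
}

# Annualize the per-run figures so the scale of the opportunity is visible.
annual_cost = baseline["agency_cost_per_run"] * baseline["runs_per_year"]
annual_hours = baseline["hours_per_run"] * baseline["runs_per_year"]

print(f'{baseline["workflow"]}: ${annual_cost:,} and '
      f'{annual_hours:,} hours per year at status quo')
```

Whether this lives in a script, a spreadsheet, or a slide doesn't matter; what matters is that the per-run and per-year numbers exist before any AI results come in.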
Don't just jump right in. You've got to figure out what your priority use cases are and why you're doing this, and then you'll discover other things it works for along the way. But this is a crucial step to showing measured results at the end. So, number four. How many times do we hear about, or have we maybe ourselves experienced, an AI output that is, let's call it, underwhelming? And the sad part is the likelihood of someone going back to try it again, or pushing through that barrier to get the results they need, is low. Right? It's another reason why adoption suffers. You hear it all the time: "Oh, I tried it once and it sucked." Right. A lot of this can be fixed by giving AI systems the right context to better inform and improve the accuracy and quality of outputs. And you can talk a lot about prompting and prompt engineering, but that's essentially what you're doing, right? You're adding context. The cool thing about Jasper is, and this is meant to be more thought leadership, right, but it actually is pulling the context out of you as you go. But in addition to that, hopefully you're using a system that has the ability to leverage, understand, and learn your brand voice and tone guides. So this is a stage where you're gathering those and training your system within your pilot, so it's all set up there. Along the way, you have your list of grammar terms and rules, along with your brand guidelines, all set up before you start your pilot. You've got your system learning your company messaging, solution and product briefs, information briefs, and outlines that your content creators can then use to create content assets grounded in the foundational content important to your company. And then, do you have a set of quality standards for your organization, and who is checking for quality along the way? What does that look like? And how are you thinking about continuous improvement, refining the context you're giving AI, training it on more of what good looks like?
And this is again something Jasper does for our clients: continuing to refine workflows and outputs to get you closer to meeting your content quality needs and goals. And we have a full-time prompt engineer and content specialist, or writer, on the product team dedicated to this continuing improvement and high-quality output. So whether you're doing this on your own or in partnership with a vendor, this is a really important stage: making sure that the system has the context it needs to produce quality output, and then continuing to improve and refine along the way. I like to say now that content's reign is over, because context is the new king or queen in town. And then I said something similar in London and they were like, well, at least content is still a duchess. Right? So take it as you will, but here is an example of a quality framework from one of our global pharmaceutical clients that I share all the time now, because I love it. They used an independent human review team that looked at their existing content and their AI-generated content and assessed both on a quality score of 1 to 5, including turnaround time and cost. And they measured all of that. Not only did Jasper outperform here, but they couldn't identify which was which. So this framework allowed them their own sort of Turing test to prove their pilot's effectiveness, which again gave them the data they needed to continue to build and scale across the organization. Number five: just give them the licenses and they'll figure it out, right? That's what we're all doing with Copilot right now, I'm sure, right? But are we really setting up our teams for success here? Because it's not only about training on your AI system of record: where's your team at in terms of foundational AI literacy? Are they ready from a mindset perspective to just jump into a tool and experiment, and are they inspired to do so?
The Marketing AI Institute's State of Marketing AI report from last year said that the leading barrier to AI adoption, and we see this in other reports, right, continues to be lack of education and training, and close behind that is lack of understanding and awareness. And most respondents to this survey said their org had no AI-focused education or training. A quick plug: their new State of Marketing AI report is coming out later this month. There's a webinar you can sign up for now at marketingaiinstitute.com. I love their data. I'm hoping that this stat improves from last year, right? But education is really important. I believe AI literacy is foundational to responsible adoption. And this is where inspiring that mindset shift really comes in. We want to empower and motivate employees to really cultivate that innovative culture that's ready to adopt genAI. But first we have to make sure these employees understand what genAI is. They can't accelerate if they don't know how to drive. Once they're informed, then they can more quickly connect the dots and experiment. So now we introduce our tool-based training. For example, Jasper has learning paths in Jasper Academy and in the community. And you can do that right before you hand out your licenses, so that when your users do get their licenses, they already have a surface-level understanding of how it works. And then you can jump into your use case or workflow-specific training in smaller groups, based on those use cases or marketing functions. And then we wrap all of this with an active, open internal comms channel, like a Slack or Teams channel, for all of your AI users or pilot users, for everyone to share and learn. Include your vendor in this Teams or Slack channel as well, to reduce friction and answer questions along the way, really ensuring no one gets stuck, paving the way to faster time to value. I still talk to companies about setting up a Slack or Teams channel. If this is something you haven't done yet, this is really important.
Every use case shared inspires another use case. One thing worked for someone but didn't work for another, and you can talk about it and figure out why. Or your vendor can come in and give you the tip on how best to do what you're trying to do. But this open, engaged user community that's sharing and learning together, this is culture building, and this is your rising tide. So here's an example of a customized plan I like to do, including that foundational genAI education. For example, the Intro to AI for Marketers course from the Marketing AI Institute, which they offer every month for free as a webinar, but we also have on-demand access to it for our customers. But then you're getting into your tool-specific training, and then your use-case-specific training based on what your priorities are. Then you're entering this testing, learning, continuous feedback loop, and this is something that your vendor should be able to help you with as well. Lastly, without regular feedback from your users and stakeholders, you have the potential to miss opportunities to identify areas for improvement or optimizations that could enhance your AI system's performance and effectiveness. And we've seen pilots where it's unclear what the successes were because the results weren't collected along the way. So now you're having to go back and remember what you did and whether it worked. So there's verbal feedback that it's working and everyone loves it, right? But nothing documented that would even pass the red-face test with organizational leadership or the board. People aren't going to tell us as much about how it's going if we're not asking, or maybe if we're not regularly engaging in that open Slack or Teams channel. So we need to have this continuous feedback loop to document experiences and results, because you'll find small wins along the way if you ask, or as people are sharing. And then you can immediately start collecting those early results with time and cost savings.
And it can be really simple things, like, "Wow, I can't believe this worked for me." And then you can ask them, well, how long did it take you before? Or if you see something that worked, just have a parking lot where you're starting to document that along the way as well. But then in 30, 60, 90 days, of course, you're going back formally to check and see if your hypothesis aligned with your results, or whether you need to make adjustments, or maybe the content needs more time in market. You can also formally survey your users before and after to collect and compare results. And your vendor can help you with this too, because the right vendor will want to partner with you to ensure you have the data needed to make those decisions. And if you're here as a Jasper customer, ask your CSM for support, because we have tools and templates along the way to help you. So remember that baseline documentation we did? Now we just add in the AI-assisted results, multiplied by how many times you do that per year, for example. So if you're saving $6,000 per customer advocacy package and your business targets 75 a year, that's over $400,000 in savings. That's going to get some leadership attention, right? But also, this is where you can really start to peel back the onion on the results and find out what matters to your leadership team. Maybe they don't care as much about hours saved, but they would care about getting to that project, that personalization, those vertical landing pages you've been saying you don't have the time or resources to do. So, in terms of seeing tangible results, we typically see this in a linear path, starting with time savings.
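The roll-up arithmetic above is simple, and a two-line script makes it reproducible. The $6,000-per-package and 75-packages-per-year figures come from the example in the talk; the structure of the calculation is a sketch of my own.

```python
# Annualizing a per-project saving, as in the customer advocacy example:
# $6,000 saved per package, 75 packages targeted per year.
savings_per_package = 6_000   # USD, from the example above
packages_per_year = 75        # annual business target, from the example

annual_savings = savings_per_package * packages_per_year
print(f"Annual savings: ${annual_savings:,}")  # $450,000, i.e. "over $400,000"
```

The same multiplication works for hours: per-run hours saved times runs per year, optionally times an average loaded hourly rate, gives the cost-savings figure leadership tends to ask for.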
So, 90% time savings compared to status quo, followed by cost savings, $6,000 saved per project or work hours turned into cost savings, and then finally, once your A/B tests have run or content's in market, you can start to see performance improvements, whether it's customer engagement, ad conversion, CPL, increased velocity, et cetera.

Step-by-Step: Avoiding the Pitfalls

Jessica Hreha: I get asked what it looks like to share results. So these next few slides are just how we did it at VMware, and this was an example of things we pulled out of our Slack channel. So this is the idea of sharing early and often: nothing is too small to share, and it can be as simple as this early time savings. This slide made it all the way up to our board, because they wanted to see positive impacts of AI aside from the conversations they were having around risk. And my former CMO still talks about happier employees to this day, even at her new company and in all of the other external speaking she's doing, because we received user quotes saying Jasper was increasing their overall job satisfaction, quote, "happier than I have been in years." And all of that came through our survey work and our Slack channels. Or here's a way to show use-case-specific results, for this customer story pilot package. We had laid it all out: what our objective was, what our results were, and you can see all of the hard results as well as the fact that improved quality through the workflow cut churn. Or if you have a Marketing AI Council or multiple pilot leads, you're rolling up your pilot progress in a slide that looks like this. Now that content's in market, you can see CTR increases, CPL decreases, maybe your A/B tests outperforming your controls so far. And here are more examples of those time, cost, and performance metrics from other customers: 7,500 product descriptions in less than 24 hours. On this VMware one, we did hit our 50% agency savings goal, but because we documented our baseline foundation, you know, our content operations process, we found out that it could take our agency up to six weeks to start one of our social kits, and we could draft it with Jasper in one day. And with Morningstar here, I'll dive into this one a little more.
Their research marketing team identifies and promotes high-impact research content for their investors and investment professionals. So they have to distill all these complex investment research reports and papers into a variety of derivative content, like summary articles, blog content, gated landing pages, organic and paid media, et cetera. Right? So they use Jasper to scale this creation of derivative investment research content, help them break through writer's block, and generate SEO-optimized content for their digital channels. This team works across marketing teams to launch these organic and paid media campaigns to generate reach and engagement and influence leads, and in 2022 they achieved a 40% year-over-year increase in content downloads and saw 70% growth in marketing qualified leads. That's revenue-impacting results, right? Here are some more. The VMware team further refined their campaign BOM workflow, got it to 95% good-at-one-shot output, and projected half a million dollars in annual savings in agency fees. Wakefield frees up 10,000 hours on their research reports every year: more time spent on strategy and not on the page. No performance results quite yet, but Sage Publishing saw a 50% decrease in marketing spend on translations and 99% faster turnaround for their textbook descriptions, and then, what you see here from a performance perspective, an increase in their campaign impressions, which was the goal of their campaign, plus an increase in content production while saving time. These six steps to managed experimentation can get your team from individual acceleration to team acceleration.
Really important steps: leveraging the right use cases and the right pilot team, documenting your baseline and goals, having the right context to improve output quality so it's on brand and therefore more usable for your organization, having a pilot team that is AI literate and trained to leverage AI systems to uncover those impactful use cases, and having the feedback and documentation to show your success and prepare to scale. There's a lot more I could get into about the scaling process and the importance of a cross-functional marketing AI council, or whatever you want to call it, a workforce task group, and the importance of this group in establishing the innovative culture and change management necessary. But that is another talk for another day. I will mention, though, that our marketing AI council was very purposeful in including at least one AI enthusiast, that hand raiser, someone willing to roll up their sleeves and uncover what works, from every marketing function in our global marketing team. We saw use cases popping up in all of the different functions beyond the original ones we focused on. One example was when a leader from our martech team, not normally a user of end-user marketing software for content creation, became one of our top Jasper users. She was able to use Jasper to create a martech decision framework and did a platform comparison analysis that gave her the same results as a third-party consultant engagement a few months prior, in just one prompt. Because we saw proof of value in our initial pilot, we made the decision to provide equitable access to our entire marketing team. So teams were using it for exec speech drafts, video script writing, even on site at an event: taking a live customer interview, transcribing it into Google Docs, and then having Jasper pull an immediate summary as well as additional content derivatives within five minutes of the interview.
Something that would have taken weeks to execute after getting back from the event. So this is what I call the gravy on top of the core measurable use cases that you start with for those efficiency, time, and cost savings. This is what happens when you enable an entire team and give them the support for constant experimentation and an innovative culture. And remember, it's not just about time savings. This marketer is spending the same amount of time on research, but instead of four hours searching and four hours reading and analyzing, they're spending seven hours analyzing. And quote, "the quality of work improves because I can focus more on analysis instead of rushing through it." I get asked a lot about advanced use cases, and I didn't have time to share all of those with you today. But this is probably the most advanced level: the ability to think and apply AI differently. This is from Mark Wolney, the AI champion and leader of the AI Council at Merge, a global, full-service marketing agency, who said, "We're not just integrating AI into the existing workflows we have today, we're exploring AI's capabilities to define entirely new ones." What have you always wanted to do? What do you know you should do but never had time to do in the past? What if AI gave you that time? What if we finally figured out hyper-personalization and one-to-one ABM at scale, an entirely new workflow that wasn't possible before? I had a client say they've always wanted to come up with more creative experiential campaigns but never had time. What if automating other processes gave us that time? Doing things that would have otherwise been impossible in the past, like creating separate assets specific to each buyer in a B2B buying group. That's an average of 25 different buyers, by the way. What can you do now that was impossible before? Let's think about AI as not necessarily replacing work, but also creating something entirely new.
So this cycle is not forever, but it's important to start here. It allows you to start small, ensuring that you follow these steps along the way so you can fail and move on, or prove success and scale. There's so much opportunity right now, a lot of reasons to be inspired. We need to inspire our marketing teams and remind ourselves to keep an open mind, consider that there might be a smarter way of doing things, and take action. It's time to get started. This continues to be a brand new space for us in marketing. The sole reason I'm here at Jasper now, surrounded by the most passionate, AI-literate, customer-obsessed people I know, is to help you. My second job, that AI role last year at VMware, was incredibly vulnerable for many reasons, but the entire time I felt like the people at Jasper were right alongside me, partnering with me every step of the way, and had my back. So yes, it's an exciting time to be in marketing. We would be honored to help you with that next step in your AI journey, and just know that we are here to help. Let's get you to that next step. So thank you for being here. We're going to continue with some Q&A, but I hope that going through these steps and some of these lessons was helpful. I'm happy to connect on LinkedIn or by email as well, but we are ready to head over to our Q&A. Krista?

Q&A Session

Krista Doyle: Awesome. Well, thank you, Jessica. First off, it was just so encouraging hearing this process outlined and simplified. I think it makes things seem so much more approachable for anyone just starting off in that pilot phase and figuring out what the first steps are. Hearing from one of the best in the field, who saw real results at her previous company, and having a clear place to start is so valuable. So thank you for sharing your insights. Moving into the Q&A: if anyone has specific questions that you want Jessica to answer live, drop those in the chat or Q&A. But I'm seeing one question stick out that a few people are asking: they want to lead their company's AI pilot initiative but don't feel like they know enough about AI to effectively start. Do you have any recommended courses or resources to help get people up to speed when they're just starting off?

Jessica Hreha: Yes, there is a lot of content out there and a lot of training and resources, but the Marketing AI Institute is, for me, one of the best ones out there. They're not an overnight course; they've been around for a decade. Paul Roetzer and Mike Kaput lead that organization; it's marketingaiinstitute.com. They're the resource we used to really get started. There's an Intro to AI course that's free, offered once a month; you can have a global watch party for your team as well. They also offer a Piloting AI for Marketers certification, and we actually got funding to get this certification across our entire marketing AI council so that we were all on the same page about piloting. But even if it's just you starting out, there are lots of templates and resources there as well. And then this year they've taken it beyond that to offer a Scaling AI for Marketers course, plus lots of resources, webinars, and events. I'm going to be speaking at their Marketing AI Conference, MAICON, in September in Cleveland as well. It's just a really great community of other people who are stepping into the marketing AI space, so you're surrounded by people who are also getting started, but also people who have been running pilots and adoption for a while now.

Krista Doyle: Yeah, I think you were the one that introduced that resource to me as well, and it's been so impactful in getting up to speed with generative AI and keeping up with trends. So thank you for that insight. I see another question in here about documenting the status quo: while it's a really crucial step, it's also kind of a high barrier to entry. Do you have any tips for making that process less daunting, or where even to begin with documenting your current processes?

Jessica Hreha: So you have to start out with a use case in mind, right? Especially if you're bringing on a tool that is not free, you've got to say, okay, I hypothesize that AI could help me with this, because for these reasons it's not the most perfect process in the world, or I could benefit from it being faster, better, cheaper, whatever. So you have to think it through, and you can make it complex, but you could also make it really simple. Document: how do I do this today? Even just telling somebody, or stepping through it, or maybe screen recording yourself doing it, to capture what all the steps are that you're taking today and how long it took, even if it's just a guesstimate. Say I hypothesize that we could use an AI system for executive speech writing or executive content. The problem is, and I'm making this up, similar to a use case we have with a customer, there are only two people in the company who can do this for our C-suite, so we literally feel like we can't take a vacation. So then it's, okay, what is the process? Well, you have to know the executive's voice. We have to write all of this content ourselves. Here's how long it takes me to write a blog post or a speech, and there's also this almost intangible knowledge that only two people in the company hold, so you're time restricted, or you can't take vacation. So I hypothesize that by using a brand voice or executive voices, we can now have four people learning this content and trusting that the AI can do it. Now you know the intangible: we can take a vacation, we can rely on other people. And because it's drafting the content in the voice to begin with, we're seeing that our approvals from our executives are going from two to three days to 20 minutes, because we're getting there much faster.
That's a really broad one, but I'm trying to give you an example that's a bit less cut and dry. Hopefully that helps.

Krista Doyle: Yeah, I think that's super helpful and super well laid out. It makes me think of one of our customers, commercetools, where Ashley, their champion, recorded everyone's processes just to give them an understanding of how long it took to write a blog or create a campaign. It really forced people to slow down and say, wow, this is actually taking longer than we thought it would. Having that baseline was super helpful, so that just reminded me of it.

Jessica Hreha: Yeah, I mean, we had no idea that it was taking six weeks, for example, for one of our agencies to pick up one of our social kits, because there was just a long line of content in their process. We actually started using a project management tool and pulled it in there, and yeah, it did take a couple of months for us to do that, but we wouldn't have had the results to share without that baseline knowledge. So yes, you can jump into a use case and write something faster, but for those larger workflows or use cases that you're trying to really improve, it's really important to know what the process looks like today.

Krista Doyle: Yeah, for sure. Just for the sake of time, I know we're running up to the end, so I have one more question that I'm seeing in the chat that we'll end on. Someone is asking how to approach testing AI with a team that's maybe more skeptical and hesitant to use it. How do you recommend overcoming those barriers?

Jessica Hreha: Yeah, I think the thing is to find people who are your AI enthusiasts, because it's really hard to experiment with somebody who doesn't want to experiment. You've got to figure out who those hand raisers are on your team. It's such a great leadership visibility opportunity for the people who are the doers but who are also hungry to make change and implement new processes in the organization. I have seen people getting promoted and getting bonuses, all from implementing AI councils or implementing new use cases. So those are the people you want to look for; I call them the AI enthusiasts. And those are the people you also want to build these coalitions or councils around, because like I said, they're the ones who take time above and beyond their workflow. You have to pivot, swivel-chair, and figure it out. But once they figure it out, then they can take it back to their team and say, you know this workflow you're all doing in a similar way? Watch me. And then physically show them on screen, in a meeting room, wherever, how they're doing it. I guarantee you'll see some light bulbs or some eyes pop up from there. That's one way to do it. You can't force everyone to do something, but why wouldn't you want to do something in a faster, better way? That's where you also want to remember to incorporate your team in the process, so you're not just shoving new processes down their throats, where people will say, I don't want to do this. Make them a part of the process. Say, okay, there's got to be a better way to do this; I'm looking for people to help us figure this out. Then look for those hand raisers to help you through that. And those people on your team, taking it back to their own teams, have the trust and rapport to be able to talk about those AI adoption barriers too.
Real quick, it makes me think of someone on our digital execution team who, working really closely with her teammates, went so far as to have one-on-ones with each of them to ask, what's holding you back? Or, talk to me about the things you hate doing in your job; let's figure out if there's a way AI can improve that. All on her own, because she's an AI enthusiast and wanted to lead this part of her organization. So you've got to look for those people first.

Krista Doyle: Yeah, I love it. That's amazing. I just wanted to say thank you again for your insights. I think this is so helpful, even for me, as I'm still learning about generative AI; I feel like there's never enough that you can learn. So it's really helpful to hear your thought process behind all of this. One thing I did is drop a link to our webinars page for everyone to navigate to after this. That's where you can stay up to date with all of our webinars here at Jasper, and if you really liked this one, we are starting to host these monthly, so you can navigate there after this call. I'm seeing some last few questions rolling in, but we are at time. If you do want to get connected with our team, you can book a demo with us or message us on LinkedIn. And Jessica is saying feel free to message her; she is one of our top AI experts, so I would highly recommend leveraging our team as a resource. Any last words, Jessica?

Closing Remarks

Jessica Hreha: Yeah, thank you all for being here. Remember, no one is an expert; we're all just at different phases of our AI journey, and everyone's learning. The more that we share and lift up this community as an industry, the more we can all do together. So I encourage you, if you're not in a community that's regularly talking about this, to join one. The Marketing AI Institute has a community, and Jasper has a community as well. It's just a really great way to connect with others, because every use case shared inspires another use case; even if we're all doing different things, we can all learn from and inspire each other. Thanks so much, everyone.


https://www.jasper.ai/resources/webinars/avoiding-pitfalls-with-genai-pilots