How to Use AI Responsibly in Marketing

Generative AI is beginning to transform marketing and advertising, and brand managers are scrambling to keep up. Eighty-two percent of marketers are familiar with generative AI tools and 65 percent have incorporated them into their tech stack, according to "The Highs, Lows, and 'Whoas' of AI," a survey of 317 marketing professionals. Yet many marketers are struggling to tap the technology's full potential: 70 percent say they feel inundated by the pace of AI development and its incorporation into their marketing strategies, and 42 percent still haven't received any formal training on AI and its applications in marketing.


"Establishing governance and compliance protocols within organizations to address data privacy and ethical concerns is essential," says Monica Ho, CMO at the comarketing cloud platform SOCi, which conducted the survey. "Aligning AI use with industry regulations and company values will be a huge effort this year."


With AI presenting several potential sand traps, from bias and copyright infringement to plagiarism, brands have to make sure they use the technology responsibly and mitigate the risks.


Citi, for example, recently created a group to explore how the company might use AI, while Disney launched an AI task force to implement the technology across the company.


"As marketers and as an enterprise as a whole, we're still learning how AI can improve our processes," says Kinjal Parikh, head of media sciences at ANA member Citi, and a member of the ANA's AI Forum, which examines use case opportunities that marketers are pursuing in applying AI and helps to address the issues and risks concerning governance, ethics, and intellectual property protection. "We all have to learn how to use these tools and interpret the data that we get from them. You can't just let them run wild."


Phase One


Industrywide efforts are also starting to take shape; the ANA, for example, is developing an AI playbook and other resources to assist marketers as they embed generative AI into their day-to-day operations.


"Our strategy is to educate and enable our members to safely and effectively embrace the use of AI in support of their business objectives," says Michael Donnelly, EVP of AI, marketing technology, and marketing futures at the ANA. "AI is going to affect every stage of the marketing process. Recognizing that there are some unknowns, we are working with members to proactively develop guidelines that allow them to safely and effectively embrace this at scale."


Marketers are eager to join the fray. Indeed, 69 percent already believe that marketers leveraging AI will replace those who fail to leverage the technology.


"We're going from the learning to experimental phase now, but we're not quite ready for scale yet," says Alan Schulman, cofounder, managing partner, and chief experience officer at UpperRight, a customer experience consultancy, and author of "Generative AI in Creative and Content Generation," a playbook published by the ANA that provides an overview of how to leverage generative AI for ad creative and content creation.


Internal Affairs


Experiments should be controlled and guided by clear and transparent policies, which are sorely lacking. Only 22 percent of companies have generative AI policies, and just 21 percent have an AI ethics policy or responsible AI principles, according to the "2023 State of Marketing AI Report." The survey, based on the responses of roughly 900 marketing professionals working in professional services, software, media, education, and other sectors, was conducted by the Marketing AI Institute.


"You absolutely need to experiment, but there are a lot of pitfalls out there," says Mike Kaput, chief content officer at the Marketing AI Institute. "You don't know what you don't know, and there are probably people at your organization that are using [generative AI] and not admitting it. You really need policies in place."


Establishing an internal body to evaluate use cases and promote learning can help create a culture of responsible AI use. "It can't just be marketing or sales," Kaput says. "You need a handful of leaders from across functions who are tasked with figuring this out, including legal, HR, and executive support. Otherwise, you will end up with every single department or function doing their own thing or trying to move it forward on their own."


AI policies need to spell out how employees should (and should not) use AI. Readily available templates, for instance, can help companies shore up their approach to vendor selection and security and make disclosure of AI usage transparent.


It's also an opportunity for marketers to lead the charge. "We have to make sure our company cultures are even more driven by curiosity and early adoption, as this change is happening very fast," says Matt Garbutt, creative director at Brave Bison, whose clients include KFC, New Balance, and the World Wildlife Fund.


The Lookout


AI should complement marketing skills, not replace them, and marketers need to be cautious about using proprietary data for training AI models. "With strict regulations like GDPR, ensuring that AI tools comply with legal standards for data handling and privacy is crucial," SOCi's Ho says. "Marketers should also ensure that they are working with a solution or partner that ensures legal, privacy, and compliance standards are followed."

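To make that advice concrete, here is a minimal sketch of what one pre-submission safeguard could look like: a small Python routine that masks obvious personal identifiers (emails and phone numbers) before any text is sent to an external generative AI tool. The patterns, function name, and workflow are illustrative assumptions rather than a recommended compliance solution; GDPR readiness requires vetted privacy tooling and legal review.

import re

# Minimal, hypothetical redaction pass: mask obvious personal identifiers
# (emails and phone-like numbers) before text leaves the company for an
# external generative AI tool. Real GDPR compliance requires far more.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(text: str) -> str:
    """Replace recognizable personal identifiers with placeholders."""
    text = EMAIL.sub("[EMAIL REDACTED]", text)
    return PHONE.sub("[PHONE REDACTED]", text)

if __name__ == "__main__":
    prompt = "Draft a follow-up note to jane.doe@example.com, +1 (555) 012-3456."
    print(redact(prompt))
    # Draft a follow-up note to [EMAIL REDACTED], [PHONE REDACTED].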

It's essential to train AI tools with sufficient data to avoid bias in the results and make sure that population segments are not underrepresented or overrepresented. "You need to have someone review the data training plan to look specifically for bias," says Jenny Kelly, head of content at ANA member Deloitte Digital.

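As a rough illustration of what that review might involve, the Python sketch below compares how audience segments are represented in a training set against a reference baseline and flags large gaps. The segment names, the 10 percent tolerance, and the sample data are invented for the example; a real bias review would be broader and human-led.

from collections import Counter

def representation_gaps(train_segments, reference_shares, tolerance=0.10):
    """Flag segments whose share of the training data differs from a
    reference share by more than `tolerance` (absolute difference)."""
    counts = Counter(train_segments)
    total = sum(counts.values())
    gaps = {}
    for segment, expected in reference_shares.items():
        actual = counts.get(segment, 0) / total if total else 0.0
        if abs(actual - expected) > tolerance:
            gaps[segment] = {"expected": expected, "actual": round(actual, 3)}
    return gaps

# Hypothetical example: age bands in a first-party dataset vs. census-style shares.
train = ["18-34"] * 700 + ["35-54"] * 200 + ["55+"] * 100
reference = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
print(representation_gaps(train, reference))
# Flags all three bands: 18-34 is overrepresented, the other two underrepresented.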

Brands should be transparent about how they use AI and integrate watermarks or labels to clearly flag images that could otherwise mislead. "Brands need to make sure that consumers feel like they're being transparent," Kelly says. "When a consumer thinks that a brand is using generative AI, or it's not done in a transparent way, it reduces trust in that brand."

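One simple form of that labeling is a visible disclosure stamped onto any AI-generated image before it ships. The sketch below, which assumes the Pillow imaging library and hypothetical file names, adds an "AI-generated" caption bar; many teams will also layer on provenance metadata such as C2PA content credentials.

from PIL import Image, ImageDraw, ImageFont  # Pillow; assumed to be installed

def label_ai_image(src_path, dst_path, note="AI-generated image"):
    """Append a visible disclosure bar to the bottom of an image."""
    img = Image.open(src_path).convert("RGB")
    bar_height = 28
    labeled = Image.new("RGB", (img.width, img.height + bar_height), "black")
    labeled.paste(img, (0, 0))
    draw = ImageDraw.Draw(labeled)
    draw.text((8, img.height + 7), note, fill="white", font=ImageFont.load_default())
    labeled.save(dst_path)

# Hypothetical usage on a generated campaign visual:
# label_ai_image("campaign_visual.png", "campaign_visual_labeled.png")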

Marketing teams should take generative AI's outputs with a grain of salt, and any integration of company data into AI tools should be handled carefully. "We should have people experimenting and creating stuff in our own walls right now, but before we deploy anything, [we have to] make sure we've got the right governance and cybersecurity around it to make sure we're not going to allow somebody to hack our customer experience," Schulman says.


No matter how far along they are in integrating generative AI, marketers should focus on learning and get involved with industry-level conversations, particularly as standards and regulations evolve. "Everybody needs to lean into it," Parikh says. "Think of it as an aid to human intelligence and creativity, rather than a threat. Learn how we can leverage it to our advantage."


Full article: https://www.ana.net/magazines/show/id/ana-2024-02-ai-responsibly?st3=240229smartbrief&utm_medium=email&utm_source=smartbrief&utm_campaign=mkc2402-bm