Remember the first time you used ChatGPT? What did you type? Maybe something about a celebrity or TV show? A how-to question or work-related request perhaps.

Your prompting skills may have advanced to the point that you feel confident you understand how ChatGPT and other AI tools work.

But have you considered how AI impacts the way we work?

At the recent Rep Cap webinar “From Pilot Projects to Operational Ally: The Role of AI in the Enterprise in 2024,” Rep Cap CEO and Managing Editor Mary Ellen Slayter brought together a panel of industry leaders to find out how they’re implementing AI in their organizations — and what they expect AI to do next. 

Webinar participants included: 

Embrace Productivity Gains with Generative AI

Generative AI is no mere buzzword in the enterprise. The panelists’ companies are using genAI to automate countless low-level, repetitive tasks and to generate insights from mountains of data. Content marketers should take note: all the panelists agreed that AI has dramatically changed content creation.

But genAI isn’t without its challenges. We’re still grappling with questions about security, intellectual property rights and transparency around AI-generated content. Within content marketing teams, what happens to junior team members in an AI-dominated world? And who will train employees to use AI effectively? 

Our panelists argued that rather than reducing headcount, AI will largely change roles and responsibilities. Automating repetitive tasks will allow more time for strategic work. As John says, “Generative AI is not going to replace people. It’s going to replace people that don’t know how to use generative AI.”

Maintain Human Oversight Over AI Outputs

It’s a given that enterprises are embracing AI to enhance productivity and efficiency. They increasingly have no choice if they don’t want to fall behind.

But enterprise companies can’t completely turn over operations to AI. Human oversight is essential, especially when producing content. Lauren from Starmind noted that many genAI tools cannot access the latest information and facts. And don’t even think about trusting these tools to accurately fact-check or confirm knowledge on their own. AI can accelerate content creation, but marketers must manually review and validate anything produced.

To that point, Michael Carden urged content marketers to look harder at how they implement AI. Does it actually improve workflows? Or does it merely generate more mediocre or questionable content? “If you kind of lower the cost of producing bullshit, then you’ll produce a lot more bullshit,” he says. 

Tailor AI Tools to Find the Right Fit

AI isn’t a one-size-fits-all solution. Panelists stressed the importance of understanding your business needs and challenges before integrating AI. Custom solutions that rely on internal data tend to outperform generalist models like ChatGPT. “We’re creating specific language models geared towards a certain vertical or subject,” Mike Zack says. 

Humans still play an essential role in using genAI effectively. The training data and the prompts you provide are crucial. As Mary Ellen explains, “The more your prompt up front is specific and pulls data that’s unique to us … it is unlikely to give me something generic.”

Customer interview transcripts, for example, can help machine learning models create content that truly resonates with your audience versus generic keyword-based blog posts. While AI can help you connect the dots between interviews and uncover insights humans might have missed, time savings isn’t the primary benefit. 

“I’m just gonna go ahead and say that is not really less work,” Mary Ellen said. “It’s more like I shifted my time, and [AI] made this possible.” 

Prioritize Visibility and Transparency

Speakers also discussed the value of transparency and visibility. How is AI reaching its conclusions? Is your data safeguarded and secure? You need users to trust AI-generated insights, whether they’re your employees, your customers or anyone else. 

Acterys helps companies manage their financial data. As Mike Zack notes, “These are all metrics based on data that has garbage in, garbage out. … The data has to be there and has to be accurate first, and that’s something that we pride ourselves on doing.”

Transparency in AI is king. It enables productive human/AI collaboration and is part of good governance around responsible data usage. And when you’re transparent about AI, John says, your employees might even trust AI more than they do other people. 

“The idea that people worry about something like biased performance reviews from generative AI,” he said, “is almost laughable when you think about how biased performance reviews are when humans do them.”

Finally, who will keep an eye on AI? While content marketing leaders handle the day-to-day details within their teams, Lauren from Starmind sees a bigger role being added to the C-suite: the Chief AI Officer. 

“I think we’re going to see the birth of the chief AI officer,” she says. “They’re really gonna decide, OK, we’ve got to remove this from IT” and dedicate a team to AI needs.