Don't forget the non-technical costs of generative AI projects; there's a bevy of ongoing maintenance costs to consider, said Lori Walters, Vice President, Claims and Operations Data Science at The Hartford.

Speaking at an Amazon Web Services (AWS) financial services event for analysts this week, Walters provided several takeaways about generative AI efforts and how they fit into the broader picture. I'll have more takeaways from a broader range of financial services CxOs, but Walters' comments stood out.

Simply put, genAI projects carry human costs: expertise and ongoing maintenance that are often overlooked. Walters said:

"We spend a lot of time talking about the cost to build, about the training costs and the inference cost. But what we're seeing is the human capital associated with genAI is significant. Do not underestimate it. It's not just the initial build, but how do you sustain these solutions. Prompt engineering is really critical, but we're finding there's a lot of work enhancing and maintaining models. Prompts are brittle and don't extend well from one model to another. There's a maintenance cycle to re-engineer prompts. We're not just talking about moving from GPT-4 to Claude. There's a lot of engineering even moving from GPT-3.5 to GPT-4.

"The other aspect of human capital is the subject matter expert component. We have SMEs from the business who have to define what a good summary needs to look like. What's the ground truth around that? We don't have any labeled data, and our SMEs are working with us to develop the ground truth. And then as we are producing outcomes, they're having to validate it, test it and develop accuracy metrics so we know it is safe to put in production. I think planning on that human capital is something we're not talking about."

Those words of wisdom deserve a callout, since the technology sector isn't really talking about the human capital involved. We're certainly not hearing about the prompt re-engineering involved in swapping models. The other notable item: humans in the loop have ongoing validation chores.
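One way teams handle both problems Walters describes -- brittle prompts and ongoing SME validation -- is to treat prompts like code: version them per model and run a regression suite against SME-defined ground truth whenever the underlying model changes. Here is a minimal sketch of that idea; the model call is stubbed out, and the prompt templates, golden answers and function names are all hypothetical, not The Hartford's actual tooling:

```python
# Illustrative prompt-regression harness (sketch, not production code).
# In practice call_model() would hit a real LLM API; it's stubbed here
# so the harness can be shown end to end.

PROMPTS = {
    # Prompts are versioned per model, since they rarely transfer cleanly
    # even between versions of the same model family.
    "model-a": "Summarize the claim below in one sentence:\n{document}",
    "model-b": "You are a claims analyst. One-sentence summary:\n{document}",
}

# SME-defined ground truth: the "golden" cases business experts agreed on.
GOLDEN = [
    {"document": "Water damage to kitchen ceiling after pipe burst.",
     "expected_keywords": {"water", "pipe", "kitchen"}},
]

def call_model(model: str, prompt: str) -> str:
    """Stub standing in for a real LLM call."""
    return "Burst pipe caused water damage to the kitchen ceiling."

def regression_pass(model: str, case: dict) -> bool:
    """Cheap keyword check; real teams add SME review and metrics like ROUGE."""
    prompt = PROMPTS[model].format(document=case["document"])
    output = call_model(model, prompt).lower()
    return all(kw in output for kw in case["expected_keywords"])

def run_suite(model: str) -> float:
    """Fraction of golden cases a model's prompt still passes."""
    return sum(regression_pass(model, c) for c in GOLDEN) / len(GOLDEN)
```

Swapping from model-a to model-b then means rerunning the suite and re-engineering only the prompts that fail, which is the maintenance cycle Walters is pointing at.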

Also see: Intuit’s Bet on Data, AI, AWS Pays Off Ahead of Generative AI Transformation | Rocket Companies’ strategy: Generative AI transformation in turbulent market

Thinking through the human costs is just one of the takeaways from Walters worth highlighting. Here are a few others from her AWS talk in New York.

Building blocks that need to be in place before genAI

Walters said the generative AI journey is smoother if there are other transformational building blocks already in place.

Preprocessing data. Technologies like Optical Character Recognition (OCR) are still critical to get documents in digital form so the LLMs can read them. "A lot of the work is actually in that pre-processing," said Walters.
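The preprocessing Walters mentions typically means cleaning OCR output and chunking it to fit a model's context window. A small sketch of that downstream cleanup, assuming the OCR step itself has already run through a tool such as Tesseract or a cloud OCR service (the function names here are illustrative):

```python
import re

# Illustrative post-OCR cleanup before document text reaches an LLM.

def clean_ocr_text(raw: str) -> str:
    """Normalize common OCR artifacts in extracted document text."""
    text = raw.replace("\x0c", " ")               # form feeds from page breaks
    text = re.sub(r"(\w)-\n(\w)", r"\1\2", text)  # rejoin words hyphenated across lines
    text = re.sub(r"\s+", " ", text)              # collapse runs of whitespace
    return text.strip()

def chunk_for_llm(text: str, max_chars: int = 200) -> list[str]:
    """Split cleaned text into chunks that fit a model's context window."""
    words, chunks, current = text.split(), [], ""
    for w in words:
        if len(current) + len(w) + 1 > max_chars:
            chunks.append(current)
            current = w
        else:
            current = f"{current} {w}".strip()
    if current:
        chunks.append(current)
    return chunks
```

Unglamorous steps like these are where, as Walters put it, "a lot of the work" actually sits.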

The cloud. "The cloud is a means to the end. It's not really the end, but has been a very important accelerator. We are in the middle of a very aggressive technology agenda focused on bringing the power of data and AI together to transform our end-to-end business," said Walters.

Bringing the analytics, data and technology ecosystems to the cloud paid off with a faster product cadence and the ability to spot areas that needed improvement.

Machine learning is the precursor. "We have several hundred models in production deployed across all of our business segments. Our business leaders have been able to see and feel not only the potential but how to put it to work," said Walters. "One of the most important investments we've made over the past few years with our move to AWS was MLOps, the machine learning equivalent of DevOps, and it allows us to automate and standardize the life cycle of a model."

Then it's AI before genAI. Walters said mature AI practices are a good starting point for scaling into genAI. "There was an early tendency to treat genAI as something different, but you still need the platform, operating models and governance," she said.

Flexible platforms. "The foundation models are evolving daily, so having the flexibility of model choice and a modular ecosystem is critical. Plug and play is a necessity here, more than we've ever seen before. The state of the art today is not the state of the art tomorrow," she said.

Foundation models are just one piece of a more complicated orchestra. "What we're finding is genAI is often the smaller and maybe the easier piece. You need a platform that plays well with the rest of the ecosystem and integrates with data and AI services," said Walters.
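In engineering terms, the "plug and play" modularity Walters describes is usually achieved by coding the application against one narrow interface and putting a thin adapter in front of each foundation model, so swapping vendors doesn't touch business logic. A minimal sketch, with hypothetical names standing in for real vendor SDKs:

```python
from typing import Protocol

# Sketch of a modular model ecosystem: the application depends only on the
# TextModel interface; each foundation model gets a thin adapter behind it.

class TextModel(Protocol):
    def generate(self, prompt: str) -> str: ...

class ModelA:
    """Adapter that would wrap vendor A's SDK; stubbed for illustration."""
    def generate(self, prompt: str) -> str:
        return f"[model-a] {prompt}"

class ModelB:
    """Adapter that would wrap vendor B's SDK; stubbed for illustration."""
    def generate(self, prompt: str) -> str:
        return f"[model-b] {prompt}"

def summarize(model: TextModel, document: str) -> str:
    """Business logic depends only on the interface, so models are swappable."""
    return model.generate(f"Summarize: {document}")
```

When tomorrow's state of the art arrives, it gets a new adapter registered behind the same interface rather than a rewrite of everything downstream.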

Governance. The Hartford has taken some existing governance frameworks and extended them to genAI, but the effort is in the experimental phase. Walters said she wants to automate governance, but the fluid regulatory environment makes that a challenge.

Business buy-in. "Early in our journey we were focused on building buy-in on the art of the possible. We crossed that seven or eight years ago where our business leaders wanted to start investing more in machine learning and AI. From there the focus was on how do you scale," said Walters.

The willingness to experiment with discipline. She said:

"We are approaching generative AI with disciplined urgency. There's a lot of hype. There's a lot of noise. And the environment is changing minute by minute. So, we've really focused on being intentional about priorities and focus. Generative AI is just another tool in the toolkit--a very powerful tool. But it is one that we are validating that complements our existing AI capabilities well. It is allowing us to tap into unstructured data that was largely untapped by traditional models."

More on genAI: