Interview with Lucy O’Boyle: How are we using AI in fintech content writing


19th March, 2024

We got to speak to fintech content creator and founder of Lumen Labs, Lucy O’Boyle. We chatted about the rise of AI in marketing content, the traps and pitfalls, and the best way to make the most of the new technology…


Lucy, can you tell us a bit about yourself?
I’m a former journalist, previously working in London’s newsrooms. Working on live shows gave me a great adrenaline rush, but I soon realised that the 4 am starts for the early morning shifts were not for me. I felt a strong pull towards the start-up scene in London and knew my experience in journalism could be put to good use.

I first worked with an EdTech on their content and production, then moved to a fintech called Circa5000, which specialised in impact investing.

The content I produced served two very different audiences: retail investors and professional investors, each of which required a distinct approach. Sadly, the team I was working alongside were all made redundant last year, and I decided to put my entrepreneurial skills to the test by launching my strategic content agency, Lumen Labs.

What was your favourite thing about journalism before you made the switch to content?
I loved the fast pace of journalism. There was something exhilarating about stepping into a newsroom in the morning and having that set agenda of exciting stories that were going to play out across the day. The biggest change when I moved into start-ups was the pace: the news cycle had meant daily deadlines, and I had to adapt to slowing down a little, which is amusing because everyone always jokes about how fast-paced the start-up world is.

So we don’t have to look very far to see stories around AI and content production. As a content producer yourself, and a former journalist, how do you feel about it?
My initial thought is to adopt with caution. New technologies demand careful consideration, particularly in sectors like fintech and financial services, where there’s a significant responsibility in handling people’s finances. It’s crucial to align content and marketing strategies with compliance regulations, perhaps even more diligently than usual. If you’re using AI in any customer-facing content, it is not at the stage where you can blindly trust the output, either from a quality perspective or from a compliance point of view in terms of whether the information it’s sourcing is correct. Make sure there is a set of eyes scrutinising every output.

I had a case when ChatGPT launched where I was looking for a very specific statistic on market share which we were going to put out for a crowdfund, and ChatGPT returned a number I was a bit suspicious of. I prompted for the source and it gave me the name of a report which, when I checked online, didn’t actually exist.

AI tools at the moment rarely respond with, “I don’t have the answer”. That’s a problem.

It’s difficult because you need a sense of what an appropriate number is to catch that. There’s an element of experience and expertise you need just to edit and review what comes out, which suggests the stories about AI “replacing” writers aren’t thinking these situations through…

Do people ever ask you if you think AI will replace you?
Some companies saw the launch of ChatGPT and Google Gemini as an opportunity to cut costs and let go of a large proportion of their marketing team. I think those companies will pay the price in terms of the quality of engagement and how they build trust with their audiences. AI does a lot of things very well, but creating connections and empathy aren’t its strong suits.

Interestingly, I’ve seen that some companies are now recruiting freelance writers to edit or rewrite AI-produced content. What they’re realising is that while they might be able to produce content at scale using AI, it’s all quite generic.

If companies take an AI-only approach, they will find it hard to differentiate themselves as a brand and in their content. That’s not to say AI shouldn’t play a central role as a tool for increasing the quality of output and reducing the time it takes to create differentiated content.

The companies that succeed here will strike the right balance between human strategy and expertise and using AI efficiently and effectively.

So how can we use AI to improve content, and support the process?
The stronger the inputs, the better the output. If you combine someone who has a strategic marketing approach with AI which can unlock extremely technical knowledge, you can create some seriously impressive content.

AI can also help with ideation, proofreading, and challenging ideas or coming up with “devil’s advocate” arguments for your content, which are areas people sometimes forget to mine.

If you are creating a blog, AI is now pretty strong at producing the technical aspects and following the right framework. Your team are then in a strong position to take that content to the next level by, say, interviewing the CEO and weaving in quotes to strengthen the blog post.

And what about using AI for personal brands? We’re seeing a lot of LinkedIn prompts like “rewrite with AI” and LinkedIn Voices offering AI ‘support’ when drafting answers. Should we apply the same thinking to this?
I think you need to consider the sort of posts that you engage with on LinkedIn and why you find them interesting or relatable. I don’t know about you, but I can spot a post written solely by AI a mile away.

Humans have an excellent radar for inauthenticity. Really good human writing has personality, and it also has flaws. On LinkedIn, for example, people want to feel like they’re having a conversation with someone, and sometimes that means the author subverting the classical rules of grammar in their posts. That’s something I haven’t seen AI quite crack when it comes to personal branding. Maybe it will get there someday, but that’s certainly not today.

And my last question, for those who are thinking SEO-first: will Google penalise AI-produced content?
I always say to clients that it’s something they should proceed with caution on. When I’ve looked into Google’s guidance on this, it’s about creating original, high-quality, people-first content. So if you are creating content strategically, answering questions that your customers are asking, and you have them in mind while you’re creating that content, then as far as we can see, you’re not going to be penalised for using AI.

However, to get original, high-quality, people-focused content, you are likely going to need some human input. For example, it’s best to use humans to collect information such as quotes, or to interview a senior individual with specialist insights for an article. Your content team can spend time gathering this sort of information and making high-quality edits.

As AI is now a tool used in most content creation, I think it’s unlikely that Google will outright penalise businesses for using it, but the content must genuinely provide value and answer search intent.

