
Designing Voice AI for Home Loans: Humanistic conversations without the deception
By Benjamin Baume, Co-Founder of Craggle
Introduction
In a time when trust in the home loan industry is eroding and customers are navigating ever more complex lending landscapes, there is a growing call for change. According to a recent Deloitte study, 54% of Australian consumers say they do not fully trust their bank or broker to act in their best interests. Yet, more than three-quarters of customers still stay with the same home loan provider even when better deals are available elsewhere.
At Craggle, we believe there’s a better way. We’ve developed an AI-driven voice agent that helps Australians discover tailored, unbiased home loan options—without the pressure, complexity, or long wait times. But building this agent wasn’t just a technical challenge; it was a philosophical one too. The question at the heart of our journey was: how do we make voice-based AI interactions feel natural and humanistic without ever misleading the user into thinking they’re speaking with a real human?
The answer, as it turns out, came from a conversation I had over a few glasses of whisky with the Shackleton Rescue Committee.
A Conversation Over Scotch: Ethics and Expectations
The Shackleton Rescue Committee is a small but mighty group of professionals from varying industries, united by a shared love of scotch, the history of whisky, innovation, and an ever-changing debate on the top 10 movies of all time. Our backgrounds span a broad spectrum: information technology (including software development and networking), financial services, risk management, commercial construction, rigging and building, security and crisis management, military, painting, teaching—to name just a few.
We meet regularly under that tongue-in-cheek name to discuss everything from personal growth to industry disruption to survival strategies (literal and metaphorical). It’s a place where no idea is too bold to be explored. Conversations often drift into philosophical territories: what made certain ventures successful? What did we learn from failure? What does ethical leadership look like in a tech-saturated world? It’s also a sanctuary—a space where we support one another across different stages of life and career, challenging assumptions and lending our respective skillsets.
During one such session, the topic of Craggle’s AI Agent came up. I shared that we were building a voice-based generative AI that would speak to home loan customers on the phone, helping them understand their options, assess eligibility, and compare deals. We discussed what this AI should be able to do, how it should respond, and—most importantly—what lines should never be crossed.
Our group’s resident risk management expert chimed in with a passionate objection: “Whatever you do, don’t let the AI pretend to be human.” And he was right. It’s a line of deception that’s not just ethically dubious—it risks breaking user trust before it even forms.
The Dilemma: Human Experience vs Ethical Transparency
Here’s the paradox: for a voice-based AI to be successful, it must sound natural. That means mimicking human conversational patterns—pauses, intonation, empathy cues, adaptive phrasing. If it doesn’t, people struggle to engage with it. But the moment it sounds too human, we risk creating the illusion that there is a human on the other end.
The solution isn’t to make the AI sound robotic; it’s to make it clear what it is, why it exists, and how it helps—without trickery. The voice agent can still use humanistic speech patterns and natural language understanding, but with full transparency. We start each interaction with a greeting that clearly introduces the AI as a digital assistant powered by Craggle, and we reiterate this periodically to keep expectations clear.
It’s the difference between respectful design and manipulative design. And it matters.
The Design Approach: Conversational AI That Listens Like a Human
One of the biggest advantages of generative AI is its ability to interpret nuance. Let’s say a user says, “I want a loan where I can make extra repayments if I have a good month.” A static form or rule-based bot would miss this entirely unless the user selected the exact checkbox labelled “additional repayments.”
But our voice AI listens like a human. It understands the intention behind the statement and can match it to the appropriate feature. This doesn’t just reduce friction; it empowers people to express themselves in their own words, just like they would in a natural conversation.
Importantly, we recognised early on that not all customers speak in financial terms. In fact, many don’t know the technical name of the specific feature they want. So it’s not just about detecting nuance—it’s about enabling people to describe their needs without jargon. When customers say, “I want to be able to redraw some of my repayments if I need cash later,” our voice AI understands they’re talking about a redraw facility. This lifts the level of transparency by allowing users to express themselves naturally and helps reduce the discomfort many feel when they don’t know the ‘right’ financial term to use.
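The mapping described above can be illustrated with a deliberately simplified sketch. The feature names and trigger phrases below are hypothetical stand-ins, not a real product taxonomy; in practice a generative model interprets meaning semantically rather than matching fixed phrases:

```python
# Hypothetical sketch: mapping jargon-free customer phrases to loan features.
# Feature names and trigger phrases here are illustrative only.

FEATURE_HINTS = {
    "additional_repayments": ["extra repayments", "pay more", "good month"],
    "redraw_facility": ["redraw", "take money back out", "need cash later"],
    "offset_account": ["offset", "savings against the loan"],
}

def match_features(utterance: str) -> list[str]:
    """Return the loan features whose trigger phrases appear in the utterance."""
    text = utterance.lower()
    return [
        feature
        for feature, hints in FEATURE_HINTS.items()
        if any(hint in text for hint in hints)
    ]

print(match_features(
    "I want to be able to redraw some of my repayments if I need cash later"
))
# -> ['redraw_facility']
```

A phrase list like this is where rule-based bots stop; the point of the generative approach is that the model resolves intent even when the customer uses none of the expected wording.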
Why is this important?
“We don’t ask humans to learn how to talk to bots. We build bots that know how to listen to humans.” – Benjamin Baume
This principle is at the heart of Craggle’s user experience philosophy. We’re not trying to trick people. We’re trying to make it easier for them to make better financial decisions.
CLARA: Bringing the Vision to Life
Much of what we’ve learned has culminated in the development of CLARA—our Craggle Lending and Rate Assistant. Originally built as a text-based AI chat experience, CLARA demonstrated how generative AI could guide home loan conversations in a natural, personalised, and unbiased way. Building on this foundation, we’ve been working to bring CLARA to life through voice—a conversational AI Agent capable of real-time support.
CLARA is still in development and undergoing testing. While CLARA is not yet available to the public, its design and testing phases have already revealed the power and flexibility of generative AI when used to deliver transparent, human-like interactions. Through this journey, we’ve been able to prototype and test how voice-based AI can extend the reach of our digital services, offering clients a hands-free, accessible experience that meets them on their terms.
Bringing CLARA into a voice format has meant reimagining not just how people talk to AI—but how AI can make complex processes feel simple. Real-time support removes the need for endless form-filling and repetitive inputs. With natural conversation, we reduce not just paperwork but digital friction—replacing clicks with simple, direct engagement. More than a tool, CLARA becomes a guide.
And perhaps most crucially, CLARA delivers this service without any of the unconscious biases a human may carry. There’s no sales pressure. There’s no judgment. It’s a consistent, unbiased service experience every time. By allowing users to move at their own pace, and by helping them build confidence through clarity and transparency, CLARA gives Australians the tools to make the next step toward their financial goals—without fear or friction.
Privacy Boundaries and Customer Comfort
Another insight from our Shackleton Rescue Committee discussion was around privacy. I hypothesised that while people may be hesitant to share personal details with an AI agent, they would be more comfortable discussing financial scenarios. This turned out to be true.
Most members agreed: they wouldn’t want to give their full name, address, or email to a machine. But they’d be happy to say, “I earn about $120K a year, and I’m looking to refinance my home to save on repayments.”
This aligns with broader market research. A 2024 survey by PwC found that 62% of Australians are more open to digital financial tools if they believe their data is anonymised and used responsibly. When building Craggle’s AI Chat Agent, data security and privacy were paramount. We developed an automated PII obfuscation solution, which allows sensitive data to flow through the system without leaving a trace. This solution is still ahead of the curve today and has been seamlessly integrated into CLARA. The only exception to this obfuscation rule is the client’s first name, retained to support personalised interactions and natural human-like engagement.
We’ve also introduced several natural speech features into CLARA based on real-world testing and user feedback. For instance, users can interrupt CLARA mid-sentence to ask for clarification or shift direction—something that naturally happens in human conversations. This helps build trust and reduces frustration.
CLARA also supports dynamic interaction flow. If a client wants to start by talking about a particular concern, like switching from interest-only to principal and interest repayments, CLARA will follow their lead—then circle back to collect the remaining required details in a structured way.
Perhaps most impressive is CLARA’s active listening ability. Just like a skilled human advisor, CLARA can detect when a piece of information shared in one part of the conversation is relevant elsewhere. If a customer answers one question but includes a detail useful for another later on, CLARA recognises this, stores it, and applies it at the right time. It’s not just keyword or intent matching—it’s contextual recall that evolves with the conversation.
Why This Matters for Home Loan Accessibility
For too long, the home loan process has been inaccessible, overly complex, and riddled with jargon. Many Australians feel uncertain about where to begin or anxious that asking a question may reveal what they don’t know.
Our work with CLARA aims to flip that script. Natural language processing allows customers to speak in their own terms, not in financial terminology. Real-time voice interaction replaces friction with flow. CLARA doesn’t push users toward a particular product or pathway—it guides them with information they can understand and at a pace they’re comfortable with.
This is critical not only for transparency, but for inclusion. When the AI listens, interprets, and responds like a human—without pretending to be one—it levels the playing field for borrowers who have felt excluded by traditional systems. And in doing so, it makes the path forward clearer for everyone.
Industry Trends Supporting AI-Led Conversations
The direction of consumer expectations is clear:
- 80% of consumers say the experience a company provides is as important as its products or services (Salesforce, 2024).
- 47% of Australians now say they prefer digital-first interactions for financial services, up from 32% in 2021 (Roy Morgan).
- Voice AI adoption in financial services is projected to grow by 24% YoY through 2027 (Gartner), especially in use cases where decision support is required.
Craggle’s AI isn’t just keeping up with the trend—it’s setting a new benchmark for ethical, humanistic interaction.
Conclusion: A Better Way Forward
Technology should always serve people—not the other way around. At Craggle, we’ve built a voice-based AI Agent that does just that. It listens. It understands. It never pretends.
Our goal is to simplify the home loan process and remove the pressure and bias that often come with it. We believe that by designing AI that communicates clearly, transparently, and humanistically, we create a better experience for all Australians—not just those fluent in bank-speak.
“It’s not about replacing humans. It’s about restoring humanity to the process.” – Benjamin Baume
CLARA is more than just an AI. She’s the embodiment of a promise: that technology can be used to increase access, reduce complexity, and empower better financial decisions—anytime, anywhere.
As we continue to refine and scale Craggle’s voice AI, we hold tight to the insights from our Scotch-fuelled brainstorming sessions: transparency matters, empathy matters, and how we build is just as important as what we build.
If we can do that right, then perhaps the future of lending doesn’t just look more efficient—but feels more transparent too.