AI-powered patient information
Ask Eve Assist. A trusted voice, when it matters most
The challenge
In 2024, a leading UK cervical cancer trust sadly closed, destabilising established patient advocacy services that thousands of women relied on. The impact of that loss fell directly on The Eve Appeal.
Overnight, their Ask Eve nurse service saw a 1,141% monthly increase in women making contact and seeking help. People needed information, often outside office hours, in private, and in language they could understand.
They needed a trusted, always-on source of accredited information, available out of hours and designed with clinical safety in mind.
Our solution
An AI powered information tool, driven by clinical content only.
We started with governance: before building anything, we completed a full Data Protection Impact Assessment (DPIA) and used it to shape the design from the outset. Those in the sector know that health information is a special category of data under GDPR. Users may choose to share symptoms, family cancer history, emotional distress, and other sensitive concerns, so the tool was designed for data minimisation and safety first.
We set up a three-tier governance structure to guide the design and build:
• Steering group — Trustee oversight and board-level approval.
• Partners group — Lived experience and patient community representation, to ensure the tool was built to answer the questions people really asked.
• Delivery team — Clinical and technical, to ensure product alignment throughout.
We assessed the tool against emerging AI governance frameworks and classified it as limited risk. In practice, that meant clear disclosure of AI use, responses grounded in PIF TICK-accredited content, and nurse oversight for escalation and review.
Ask Eve Assist is an AI-powered information tool embedded in The Eve Appeal’s website. It provides clear, clinically sound answers about gynaecological health, drawing only on The Eve Appeal’s PIF TICK-accredited content and operating within tightly defined safety parameters.
Available 24/7 with no login required, the tool always identifies itself as AI and signposts to human support (the nurse service, a GP, or crisis resources) when needed. It does not diagnose, prescribe, or guess.
If no relevant accredited content is found, it says so rather than generating an answer from inference.
34 patients ran 132 evaluations across 15 scenarios during testing. Scenario-based testing with real users was the key to creating something that really delivers when it matters: it enabled us to validate tone and clarity, and to verify safety behaviour.
It was only during user testing that we surfaced an important safety edge case we hadn't seen in development.
A negative statement containing a crisis keyword was initially treated as a positive match and triggered escalation, generating a false positive. In response, we added a two-stage negation check, reducing false positives while still prioritising safety, and validated the change with 43 test cases.
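To illustrate the idea, here is a minimal sketch of a two-stage negation check: stage one looks for a crisis keyword, stage two checks whether the keyword is negated in its immediate context before escalating. The keyword and negation lists are hypothetical examples, not The Eve Appeal's actual word lists or production logic.

```python
# Illustrative crisis keywords and negation cues (hypothetical examples,
# not the actual Ask Eve Assist word lists).
CRISIS_KEYWORDS = {"suicidal", "hurt myself", "end my life"}
NEGATION_CUES = {"not", "never", "don't", "dont", "without"}

def should_escalate(message: str) -> bool:
    """Two-stage check: (1) is a crisis keyword present?
    (2) if so, is it negated in the words just before it?"""
    text = message.lower()
    for keyword in CRISIS_KEYWORDS:
        idx = text.find(keyword)
        if idx == -1:
            continue  # stage 1: this crisis keyword is not present
        # Stage 2: inspect the three words preceding the keyword.
        window = text[:idx].split()[-3:]
        if any(cue in window for cue in NEGATION_CUES):
            continue  # negated mention, e.g. "I am not suicidal"
        return True  # un-negated crisis keyword -> escalate to a nurse
    return False
```

A stage-one-only matcher would flag "I am not suicidal" as a crisis; the second stage catches the negation while leaving genuine crisis statements escalated.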
What the client said
"The process for scoping and developing how an AI tool can provide accurate health information and support people in their information seeking journey has been guided by Simon at every stage. The design process and collaborating and engaging a range of stakeholders has been important – to understand their needs and make sure that Eve is delivering to both its values and our information standards. It’s been brilliant working with the team at Digital Wonderlab to get us to this stage."
Technical approach
We built Ask Eve Assist to be highly dependable for people in the moment, and auditable for the clinical team. It runs on GPT-4.1 via Azure OpenAI and is tuned for consistency (a low “temperature” setting of 0.07, which reduces random variation in answers).
When someone asks a question, the assistant searches The Eve Appeal’s PIF TICK-accredited articles in Azure AI Search, which is updated nightly, and uses the most relevant passages to produce a response. This means it’s grounded in approved content, not guesswork. Safety is layered in: Azure Content Safety checks outputs, crisis signals trigger nurse escalation, and a two-stage negation check reduces false alarms without lowering safeguards.
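The grounding flow can be sketched as: retrieve accredited passages, answer only from them, and decline rather than infer when nothing relevant is found. This is an illustrative outline, not the production code; the threshold, reply text, and the `search_fn`/`generate_fn` stand-ins (for the Azure AI Search query and the GPT-4.1 call) are assumptions.

```python
RELEVANCE_THRESHOLD = 0.75  # hypothetical minimum search score

NO_MATCH_REPLY = (
    "I couldn't find accredited information on that. "
    "Please contact the Ask Eve nurse service for support."
)

def answer(question: str, search_fn, generate_fn) -> str:
    # search_fn stands in for an Azure AI Search query over the
    # nightly-updated index of PIF TICK-accredited articles.
    passages = [p for p in search_fn(question)
                if p["score"] >= RELEVANCE_THRESHOLD]
    if not passages:
        # No sufficiently relevant accredited content: say so,
        # rather than letting the model answer from inference.
        return NO_MATCH_REPLY
    context = "\n\n".join(p["text"] for p in passages)
    prompt = (
        "Answer ONLY from the accredited content below. "
        "If it does not answer the question, say so.\n\n"
        f"Content:\n{context}\n\nQuestion: {question}"
    )
    # generate_fn stands in for the low-temperature GPT-4.1 call.
    return generate_fn(prompt)
```

The key design choice is the explicit refusal branch: a missing retrieval result produces a signpost to human support instead of a free-form model answer.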
We designed it DPIA-first with data minimisation by default, including a 30-day retention for conversations and a 365-day archive for escalations. All processing and storage stay in Azure UK data centres (UK South/West) to support GDPR compliance.
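The retention rules stated above (30 days for conversations, 365 days for escalation archives) can be expressed as a simple policy table; the function name and structure here are illustrative, not the deployed implementation.

```python
from datetime import date, timedelta

# Retention periods from the case study: standard conversations are kept
# for 30 days; escalated conversations are archived for 365 days.
RETENTION_DAYS = {"conversation": 30, "escalation": 365}

def purge_date(created: date, record_type: str) -> date:
    """Date on which a record becomes eligible for deletion."""
    return created + timedelta(days=RETENTION_DAYS[record_type])
```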
The result
In the 30-day pilot, usage peaked during Cervical Cancer Prevention Week: although the week made up around 25% of the monitoring period, it accounted for 44% of all conversations.
Over the same period, the Ask Eve nurse helpline received only 39% of the contacts forecast. Routine information queries were now being handled by Ask Eve Assist, freeing nurses to focus on complex, high-need conversations.
Ask Eve Assist works because we anchored the build in community need, governance, and patient testing, using AI as the delivery mechanism, not the starting point.
It has also received national recognition, named Overall Winner in Ovid Health's Patient Partnership Index 2026. It was recognised for responding to an "urgent and pressing patient need" and for using AI without compromising on trusted, high-quality information.
93%
Knowledge base utilisation
0%
system errors
4.72 / 5
ease of use
Need a bespoke AI solution to digitise your operations?
Digital Wonderlab partners with charities, social enterprises, and purpose-driven organisations to build responsible AI and digital services that support the people they’re designed for.
