
A Contextual AI System for Personalised AI Agents Supporting Younger Adults With Cancer

Large Reasoning Models (LRMs) · Multimodal Foundation Models · Heterogeneous Time Series Data

Project Video - Digital Health Week 2026

This work is being presented at Digital Health Week 2026 in Melbourne, Australia. The video below provides an overview of the project, titled: "Empowering younger adults living with cancer: A mobile health app combining AI and wearable technology for personalised digital support."

Challenges faced by younger adults with cancer

The rising global incidence of cancer in adults under 50 years of age (early-onset cancer) is cause for concern. A cancer diagnosis at a younger age often comes as an unexpected shock and can feel like having your world turned upside down.

Many younger cancer patients have spoken about the disorientating impact of a cancer diagnosis at a stage of life when careers, relationships, social connections, and financial and geographical stability are still being established (6). Moreover, cancer treatments can cause long-term health issues that make it even more difficult for patients to resume normal life activities.

79.1%
increase in the global incidence of early-onset cancer between 1990 and 2019 (1)
40%
of 2-year cancer survivors (aged 15–39 at diagnosis) had multiple chronic comorbidities 10 years later (2)

While the quality of cancer treatment and care in Australia is outstanding, it surprised me to learn that patients are left on their own to navigate the challenges of returning to everyday life after a cancer diagnosis.

Harsh choices faced by younger cancer patients with an uncertain future

Health or financial stability
Quicker recovery or career opportunities

One of many cruel dilemmas faced by younger cancer patients is that of "financial toxicity". Imagine waking up one day to face an impossible decision: keep working through a serious, unpredictable illness to maintain your financial stability and career prospects, or prioritise your health and recovery at the cost of financial hardship and lost career opportunities. Australia's means-tested (or "all or nothing") welfare system means that many younger cancer patients who were healthy and employed before an unexpected cancer diagnosis do not qualify for any financial support.

Productivity studies add up the aggregate impact of these issues across the population and demonstrate that the economic burden of cancer is substantial:

Breast cancer case study (3)

10,372
working-age Australian women diagnosed with breast cancer in 2022
$3.2 billion
projected cost in lost productivity over a 10-year period

AI and wearables for personalised digital support

Many younger cancer patients are highly motivated to use technology to help them manage the complexities of their condition and regain control of their lives. The aim of this project is to equip patients with the tools to achieve this, building on my PhD research in artificial intelligence (AI) and machine learning (ML) for health and genetics.

This project developed a contextual AI system that was found to support patients in:

  1. Monitoring and understanding physiological changes during cancer treatment
  2. Notifying patients about health issues requiring clinical intervention, so these can be managed in a timely manner before they escalate into more serious problems
  3. Automating appointment preparation for effective communication with health professionals to resolve ongoing health concerns
  4. Balancing medical treatment, side effects and quality of life

We (Sophie Wharrie and Elliot Tikhomirov) created a solution that (1) utilises digital tools that patients already own (including smartphones and wearable devices) and (2) provides long-term, tailored support by dynamically adapting to the changing needs of the individual. This includes the period following diagnosis and treatment with curative intent, post-treatment survivorship (which involves managing long-term side effects), monitoring for potential recurrence, and metastatic disease if it arises.

To achieve this, we built a mobile app that consists of (1) a data layer that collects longitudinal data from multiple sources over time, which forms the basis of (2) what we refer to as a "contextual AI system" that acts as a personalised AI agent for cancer patients.

We developed a digital solution that seamlessly integrates with digital tools you already own:

  1. iOS mobile app compatible with Apple Health
  2. Apple Health integration for wearable device data and medication tracking
  3. Calendar integration for appointment and treatment scheduling
  4. Designed for the Australian healthcare system

Breast cancer case study

The app has been used continuously for over 12 months by a 27-year-old patient receiving care for stage 3 triple-negative breast cancer (TNBC), an aggressive subtype of breast cancer.

Triple-negative breast cancer (TNBC) accounts for around 10–15% of breast cancers and does not express the oestrogen, progesterone or HER2 receptors that are commonly used to guide breast cancer treatment. Treatment is therefore very intensive: for this patient it involved over 12 months of cancer treatment, including chemotherapy, immunotherapy, surgery and radiation therapy.

This patient noticed that the medical system and cancer support services in Australia are designed for a much older demographic and do not address their needs as a younger adult with cancer.

This starts at the initial diagnosis and extends throughout treatment and beyond. Younger breast cancer patients often present with a symptomatic diagnosis (rather than one found incidentally through a screening program), when the disease has advanced to a stage that is harder to treat. At the time of writing, women aged over 40 can have a free two-yearly mammogram for breast cancer screening in Australia, and only women aged 50–74 are actively invited to do so (4). This is just one example of a gap in the system for younger patients, despite research recognising that cancer is a heterogeneous disease that affects each patient differently and could benefit from targeted approaches. For example, this could include using more advanced technologies, such as the genetic risk scoring I spent part of my PhD working on, to identify patients who would benefit from earlier screening (5).

A foundational principle of this project is that an AI system utilising longitudinal data monitored over time, connected with clinical and biological data, can facilitate personalised digital biomarkers to more effectively support younger adults through each stage of their cancer journey.

Foundations of advanced AI: the app's data layer

The data layer of the app consolidates multiple, multimodal health data sources into a common format used by downstream AI models. The data model was designed around three types of data inputs:

  1. Treatments: represent decisions made by medical professionals and actions taken to manage the patient's condition;
  2. Responses: represent the state of the patient's condition and reaction to a treatment (e.g., symptoms, vitals, activity levels);
  3. Additional context: includes evidence-based information that provides further background about the patient's medical situation.
A significant amount of effort went into figuring out the best way to capture these data inputs for patients undergoing cancer treatment in the Australian healthcare system. For example, a single cancer patient can accumulate hundreds of pages of lab and imaging results, specialist letters, hospital discharge summaries and other personal health documents over the course of their treatment. These documents act as a record of the patient's medical history and are crucial for grounding an AI system with reliable details about the patient's specific diagnosis, treatment plan and evolving needs.

However, the Australian electronic health record system (My Health Record) is not used consistently by all healthcare providers, so these documents end up scattered across digital and physical formats. Furthermore, doctors and nurses will often provide patients with printed copies of additional information about their diagnosis and treatment plan during consultations. For example, patients can find information about their specific chemotherapy regimen from eviQ (7). These documents provide crucial details about potential side effects of cancer treatments and when to seek medical attention, but the large volume of information can be overwhelming for patients and can be better managed with digital tools.

Therefore, we designed the app to allow patients to upload PDF documents (e.g., from My Health Record and eviQ), or take photos if they have physical copies.

Conceptual data flow for the data layer:

[Diagram: treatment inputs (medication tracking via Apple Health; medical appointments via the iOS Calendar), response inputs (patient-reported symptoms as free-text input; wearable data via Apple Health; patient-reported vitals such as body temperature, blood pressure and weight; personal health documents uploaded as PDFs or images) and additional context (non-personal health documents uploaded as PDFs or images) all flow into the EventVariable data structure, which yields quantitative variables and health-related events.]

The backend implementation of the data layer uses Firestore, a NoSQL document database, to store each data point in an "EventVariable" data structure. It's named this way because every data point collected represents a health-related event (a medical appointment, a side effect experienced after treatment, a vital sign measurement, an imaging test, etc.). For an AI-heavy application, it was important to have this unifying data structure to record key attributes of each data point in a consistent way, while also being flexible enough to accommodate a wide variety of data types and sources.

I used multimodal foundation models (as of December 2025: Gemini 3 Pro, Gemma 3n, and Gemini embedding models for creating vector representations of text data) to convert unstructured data into structured EventVariable records and extract measurable, quantitative variables from these data sources for downstream AI models to use. For example, if the patient uploads a PDF document containing lab results, the model will automatically extract relevant quantitative variables (e.g., haemoglobin level, white blood cell count, platelet count) and create EventVariable records for each of them.

Illustrative Python class for the unifying data structure:

from datetime import datetime
from enum import Enum
from typing import Any, Dict

from pydantic import BaseModel

class AllowedEventType(str, Enum):
    # Hypothetical categories for illustration; the app's full set is larger
    TREATMENT = "treatment"
    RESPONSE = "response"
    CONTEXT = "context"

class EventVariable(BaseModel):
    EventName: str
    EventDatetime: datetime
    EventType: AllowedEventType
    EventDetails: Dict[str, Any] | None = None
    EventQuantity: float | None = None
    EventQuantityUnit: str | None = None
    # … additional attributes omitted
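As a rough sketch of the extraction and storage steps described above, the snippet below uses the google-genai and google-cloud-firestore Python SDKs; the model id, prompt, collection name and function name are illustrative assumptions rather than the app's actual implementation:

from google import genai
from google.genai import types
from google.cloud import firestore

client = genai.Client()  # assumes a Gemini API key is configured in the environment
db = firestore.Client()

def ingest_document(pdf_bytes: bytes) -> list[EventVariable]:
    """Ask a multimodal model to convert an uploaded document into structured
    EventVariable records, then persist each record to Firestore."""
    response = client.models.generate_content(
        model="gemini-3-pro",  # assumed model id; substitute any available model
        contents=[
            types.Part.from_bytes(data=pdf_bytes, mime_type="application/pdf"),
            "Extract each quantitative result and health-related event in this "
            "document as an EventVariable record.",
        ],
        config={
            "response_mime_type": "application/json",
            "response_schema": list[EventVariable],
        },
    )
    events = response.parsed or []
    for event in events:
        # "event_variables" is an assumed collection name
        db.collection("event_variables").add(event.model_dump(mode="json"))
    return events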

From the patient's perspective, their everyday mobile usage — including medication reminders, medical treatments scheduled as calendar events, symptom journaling and wearable data — naturally feeds into this data layer and provides intelligence to their personalised AI agent.

Creating a contextual AI system that evolves with the patient's needs

The key idea behind the contextual AI system is that an AI agent needs to understand the relationships between the three types of data inputs (I. treatments, II. responses, III. additional context) to provide relevant and timely assistance to the patient. I developed algorithms for large reasoning models (LRMs) to interpret responses (II) in the context of the treatments (I) that the patient is receiving, as well as additional context (III) about their specific diagnosis and medical history.

As an illustrative example of why this is important, consider a wearable device that observes that a cancer patient had a poor night's sleep. Without further context, this observation is not very useful (for example, how do we know if this is related to a cancer treatment side effect, pain, anxiety, or something else entirely?). The data model of the app provides the necessary context and temporal ordering of events for an LRM to reason about this observation. Continuing with the sleep example, suppose that the medication tracking shows that the patient recently took steroid medication and the calendar shows that prior to this they had chemotherapy treatment. The additional context retrieved from their health documents (which we refer to as the "knowledge base") using the retrieval-augmented generation (RAG) approach indicates that steroid medication is commonly prescribed post-chemotherapy and that poor sleep quality is a known side effect of steroids.

With this context, an AI agent is more likely to provide useful information to the patient about these observations recorded in the data, connecting what happened to why it happened (i.e., as a form of causal inference).
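As a minimal sketch of how this temporal context might be assembled for the reasoning model (the function name, time window and formatting are illustrative assumptions):

from datetime import datetime, timedelta

def build_timeline_context(events: list[EventVariable], window_days: int = 7) -> str:
    """Serialise recent events in chronological order so an LRM can interpret
    responses (e.g., poor sleep) in light of preceding treatments
    (e.g., chemotherapy, steroids)."""
    cutoff = datetime.now() - timedelta(days=window_days)
    recent = sorted(
        (e for e in events if e.EventDatetime >= cutoff),
        key=lambda e: e.EventDatetime,
    )
    lines = []
    for e in recent:
        line = f"{e.EventDatetime:%Y-%m-%d %H:%M} [{e.EventType.value}] {e.EventName}"
        if e.EventQuantity is not None:
            line += f": {e.EventQuantity} {e.EventQuantityUnit or ''}".rstrip()
        lines.append(line)
    return "\n".join(lines)

The chronological serialisation matters here: it gives the model the temporal ordering (chemotherapy, then steroids, then poor sleep) that the reasoning described above depends on.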

Large reasoning models (LRMs) tailored to longitudinal health events:

[Diagram: a causal sketch of the sleep example. Treatment nodes ("Days 1–3 post chemo" leading to "Steroid medication") point to the response node ("Sleep quality"), while faded "other potential factor" nodes indicate competing explanations the model must weigh.]

The contextual AI system was used as the basis for a range of functionalities in the app. This is best illustrated through the case study of the breast cancer patient using the app.

Over a 12-month period, the app recorded 173 cancer treatment events, 1,820 medication records (tracking 37 different medications), 848 symptom reports, 388 days of wearable data, and a total of 12,943 quantitative data points extracted across all data sources.

The animation below shows how symptom reports changed over the 12-month period, illustrating how a cancer patient's needs change over time. The most commonly reported symptom was abdominal cramps, which you might not expect for a breast cancer patient, but this is a reminder that every patient experiences different side effects from cancer treatment.

So how does an AI system meet these complex, unique and changing patient needs?

Firstly, the data layer allows quantitative variables to be tracked over time, creating new variables as needed. I implemented an algorithm that detects anomalous changes in this multivariate time series data, compared against the patient's own historical baselines.

This proved useful for early detection of concerning changes in the patient's condition. For example, while the patient was undergoing treatment with anthracycline chemotherapy drugs (potent chemotherapy agents with a significant risk of cardiotoxicity), this algorithm could detect relevant irregularities (e.g., from heart rate wearable data), prompting the patient to discuss this with their oncologist.

The patient also found this feature useful for managing their self-care. For example, the app could detect when their energy levels were unusually low compared to their typical patterns, and suggest how to adjust their daily activities (e.g., reduce exercise intensity).
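A minimal sketch of this kind of personal-baseline anomaly check, using a rolling median and median absolute deviation (MAD); the window size and threshold are illustrative assumptions rather than the app's tuned values:

import pandas as pd

def flag_anomalies(series: pd.Series, window: int = 28, threshold: float = 3.5) -> pd.Series:
    """Flag days that deviate strongly from the patient's own rolling baseline.

    `series` is a daily measurement (e.g., resting heart rate) indexed by date.
    """
    baseline = series.rolling(window, min_periods=7).median()
    mad = (series - baseline).abs().rolling(window, min_periods=7).median()
    # Robust z-score; the 0.6745 factor makes MAD comparable to a standard deviation
    robust_z = 0.6745 * (series - baseline) / mad.where(mad > 0)
    return robust_z.abs() > threshold

For multivariate data, a check like this can be run per variable and the flags combined, which is one plausible way to surface changes such as the heart rate irregularities described above.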

Secondly, I created a feature that produces summaries of the patient's recent symptoms, vitals and medical treatments. On a technical level, this was implemented by combining the previous algorithm for analysing longitudinal data with a large reasoning model (LRM) and retrieval-augmented generation (RAG) to fetch relevant and credible medical information from the knowledge base.

On a practical level, this produced summaries that provided clear guidance to the patient on what symptoms or other health issues they should prioritise (e.g., what warrants a trip to the emergency department). This is crucial in cancer treatment for effectively managing treatment-related toxicities before they escalate into more serious problems.
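On the implementation side, here is a simplified sketch of how the retrieval and generation steps might fit together; the embedding model id, prompt wording and helper names are assumptions, and `client` and `build_timeline_context` continue from the earlier sketches:

import numpy as np

def retrieve_context(query: str, chunks: list[str], chunk_vecs: np.ndarray, k: int = 5) -> list[str]:
    """RAG step: return the k knowledge-base chunks most similar to the query."""
    result = client.models.embed_content(model="gemini-embedding-001", contents=query)
    q = np.array(result.embeddings[0].values)
    scores = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

def summarise_recent_events(events: list[EventVariable], chunks: list[str], chunk_vecs: np.ndarray) -> str:
    """Combine the longitudinal analysis with retrieved medical context."""
    timeline = build_timeline_context(events, window_days=14)
    context = "\n---\n".join(retrieve_context(timeline, chunks, chunk_vecs))
    prompt = (
        "Summarise the patient's recent symptoms, vitals and treatments below, and "
        "highlight anything that warrants prompt medical attention. Ground your "
        "answer only in the supplied medical context.\n\n"
        f"Recent events:\n{timeline}\n\nMedical context:\n{context}"
    )
    return client.models.generate_content(model="gemini-3-pro", contents=prompt).text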

Thirdly, I developed a feature for organising the patient's upcoming medical appointments. The technical implementation is similar to the summaries feature, acting as an AI agent that automatically fetches upcoming appointments from the patient's calendar and creates tailored summaries of relevant symptoms, treatments and suggested questions.

This proved most useful for the patient when there were multiple treatments, medical specialists and hospitals involved in their care (which was a frequent occurrence). The AI agent helps break down this complexity by organising what symptoms, medications and other health information are most relevant for each appointment, helping the patient communicate effectively with their healthcare team.
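In the same hedged spirit, a sketch of how a per-appointment brief might be assembled; the `specialist` detail key and prompt are hypothetical, and `client` and `build_timeline_context` continue from the earlier sketches:

def prepare_appointment_brief(appointment: EventVariable, events: list[EventVariable], context: str) -> str:
    """Build a tailored brief for one upcoming appointment fetched from the calendar."""
    specialist = (appointment.EventDetails or {}).get("specialist", "the treating clinician")
    timeline = build_timeline_context(events, window_days=14)
    prompt = (
        f"The patient is seeing {specialist} on {appointment.EventDatetime:%d %b %Y}. "
        "From the events and medical context below, list the symptoms, medications "
        "and results most relevant to this appointment, and suggest questions to ask.\n\n"
        f"Recent events:\n{timeline}\n\nMedical context:\n{context}"
    )
    return client.models.generate_content(model="gemini-3-pro", contents=prompt).text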

The videos below show how these three features look in the app in practice, from the patient's perspective.

Case study: managing immunotherapy-related toxicities

The app proved especially useful when difficult complications occurred during the patient's cancer treatment. I believe this is important to highlight as it was a significant result of this project and demonstrates an important use case for AI in cancer care.

Pembrolizumab (Keytruda) is an immunotherapy drug that works by blocking a mechanism that cancer cells exploit to evade the immune system: the programmed cell death protein 1 (PD-1) receptor on T cells. Blocking PD-1 allows the immune system to better detect and attack cancer cells.

For this patient, the immunotherapy triggered an overreactive response: their immune system started attacking healthy cells, similar to an autoimmune disease.

Immune-related adverse events (irAEs) are autoimmune side effects of immunotherapies that can affect any organ and range in severity from mild to life-threatening.

It is critical to identify and manage irAEs early, as they can escalate quickly. The treatment for irAEs typically involves immunosuppressive medications (e.g., corticosteroids), which themselves have side effects. More serious irAEs (which this patient experienced) require hospitalisation and close monitoring when first-line immunosuppressive treatments prove insufficient.

This was a complex situation for the patient, as they also needed to continue their other cancer treatment while managing these serious side effects, as well as dealing with practical matters of daily life (e.g., financial stress, work commitments). The app proved crucial during this period, for helping the patient monitor and manage their symptoms, communicate effectively with their healthcare team, and balance the competing demands of cancer treatment, side effect management and overall quality of life.

The interactive chart below shows a snapshot of several symptoms related to the irAE that the patient was asked to monitor during this period. It also overlays the timing of irAE-related hospital admissions and immunosuppressive treatments, alongside the other cancer treatments (chemotherapy, surgery and radiation therapy) the patient was receiving.

The visualisation shows how the dose of prednisolone (a corticosteroid medication) was adjusted over time in response to the patient's symptoms. The app's AI algorithms used this data and many other data points to create tailored summaries and appointment preparation materials for the patient, and notified them of concerning changes in their condition. This was particularly beneficial given the unpredictable nature of irAEs, and the need for the patient to recognise and respond quickly to specific symptoms indicating a potential escalation of their condition.

The patient was often contending with multiple, complex treatments and health issues at the same time: managing irAE symptoms, immunosuppressive medication side effects, breast surgery recovery, and radiation therapy side effects. The app's AI features were highly effective here because the underlying contextual AI system could distinguish these competing treatment effects, and they proved very useful to the patient during a period of added physical and emotional stress.

Final thoughts

This project demonstrates how an AI system integrating multimodal, longitudinal health data assisted an early-onset breast cancer patient in effectively managing their disease and complications from cancer treatment. This blog post focused on the foundations of the app's data layer and contextual AI system, and the key functionalities this enabled for the patient.

Precision oncology has made significant advances in personalising cancer treatment, but there is a huge untapped opportunity for purpose-built AI systems that use advanced computational and statistical modelling to fully utilise the wealth of data generated by cancer patients. High-quality data is critical for medical research, but if this data isn't being collected and used effectively then opportunities to improve cancer outcomes are being missed. This blog post presents an operational AI system validated using 12 months of data collected for a breast cancer patient, including wearable data, symptom journaling and medical documents. This approach could be extended further by integrating more biological data (e.g., genomics, metabolomics, next-generation sequencing) to create an even more impactful AI system that supports precision oncology.

Frontier AI technologies are an area of active development, and from my experience working on this project I have gained valuable insights into areas for further improvement. This includes further refining algorithms for large reasoning models (LRMs) to perform causal reasoning over longitudinal health data. Automated approaches to identify relevant, evidence-based information for the app's knowledge base are beneficial in a healthcare context, where factual accuracy is critical (and so-called hallucinations by AI systems can be catastrophic). Furthermore, stronger capabilities for an AI system to determine the confidence of its own outputs (i.e., uncertainty quantification) and to recognise when there is insufficient information to provide a reliable answer (e.g., deferring to human experts or suggesting follow-up tests) are important ongoing areas of development for this work.

The technical achievements of this project can also be generalised to other conditions beyond cancer, such as autoimmune diseases. The solution developed here is designed for a younger adult demographic who are comfortable using digital tools to manage their health and are often underserved by existing healthcare systems. During my PhD I saw the transformative impact of biobank projects such as the UK Biobank (8), and the data layer developed in this project could similarly be used to create a digital biobank of longitudinal health data for medical research, further improving health outcomes for this vulnerable group.

References

  1. Zhao, J., Xu, L., Sun, J., Song, M., Wang, L., Yuan, S., ... & Li, X. (2023). Global trends in incidence, death, burden and risk factors of early-onset cancer from 1990 to 2019. BMJ Oncology, 2(1), e000049.
  2. Chao, C., Bhatia, S., Xu, L., Cannavale, K. L., Wong, F. L., Huang, P. Y. S., ... & Armenian, S. H. (2020). Chronic comorbidities among survivors of adolescent and young adult cancer. Journal of Clinical Oncology, 38(27), 3161-3174.
  3. Lloyd, M., Bassi, D., Zomer, E., & Ademi, Z. (2025). The productivity burden of breast cancer in Australia. Cancer Epidemiology, 94, 102726.
  4. BreastScreen Australia Program. Retrieved December 2025, from https://www.health.gov.au/our-work/breastscreen-australia-program
  5. Polygenic Risk Score Task Force of the International Common Disease Alliance. (2021). Responsible use of polygenic risk scores in the clinic: potential benefits, risks and gaps. Nature Medicine, 27, 1876–1884. https://www.nature.com/articles/s41591-021-01549-6
  6. Yagoda, M. (2024). The Unique Hell of Getting Cancer as a Young Adult. Time. https://time.com/6761629/cancer-young-adult-essay/
  7. Cancer Institute NSW. eviQ. Retrieved January 12, 2025, from https://www.eviq.org.au/
  8. UK Biobank. Retrieved December 30, 2025, from https://www.ukbiobank.ac.uk/