How AI Saved My Life

When Alvina Nadeem typed her persistent symptoms into ChatGPT, she never imagined that AI would save her life. Though subtle, her cramping and fatigue continued to worsen, prompting her to use the chatbot as a thinking partner to seek out more information. That fateful decision led to Nadeem receiving an early, and lifesaving, diagnosis of high-grade ovarian cancer.
This particularly aggressive type of cancer is often referred to as a “silent killer” because of its vague symptoms and lack of a reliable screening test. At just 36 years old, Nadeem was well outside the typical age range for the disease. “I should have fallen through the cracks,” she said. “The fact that I was diagnosed as stage one feels like I was lucky with my bad luck.”
“ChatGPT prepared me mentally to even hear words like, ‘yeah this is cancer.’ It gave me time to process on my own.” ~ Alvina Nadeem
Nadeem, a member of HDRN Canada’s Public Advisory Council (PAC), shared her story at Health Data for All of Us, the network’s annual public forum. As an engineer working in the technology sector, she was encouraged to use ChatGPT at work and was already familiar with the chatbot. But as the mother of two young children, Nadeem initially brushed off her symptoms as parental burnout or perimenopause. When they began to worsen, she developed a nagging suspicion that something was wrong and instinctively turned to ChatGPT.
Nadeem was methodical in her approach. She prompted ChatGPT to take an objective stance and act as a medical student evaluating a patient of her age with her symptoms. When the model eventually suggested ovarian cancer as a possible cause, Nadeem maintained a neutral lens: “It’s important how you prompt it because if you are looking for something, you can corner it,” she noted. “But, if you come at it with more of a curious approach with open-ended questions, it can lead to a better outcome.”
When discussing the potential risks associated with AI hallucinations, Nadeem pointed out that making up information isn’t unique to AI. “Either you use the model and you’re aware that it can hallucinate, and so then you keep that in mind as you’re using it. But when you’re Googling, you need to have a lot of self-awareness and self-control to not get into that mode of hallucinating your own symptoms and hallucinating your own condition.”
Recalling a conversation with her cousin, who had related Nadeem’s symptoms to her own during menopause, she explained, “We all have biases based on our lived experiences. Each one of us and those experiences cast shadows onto all the interactions we have with anyone. AI doesn’t have its own specific shadow. It has a more collective bias, but it doesn’t have that individual bias.”
ChatGPT also helped her prepare emotionally for her eventual diagnosis. “It prepared me mentally to even hear words like, ‘yeah, this is cancer.’ So when I spoke to my oncologist, I wasn’t like, ‘oh my God, it’s cancer, I had no idea.’ It gave me time to process on my own.” Nadeem used the chatbot as a researcher as well as an interactive journal that could “talk back.” It also helped her navigate the difficult conversation with her young children about the hair loss caused by chemotherapy.
Nadeem described using AI as part of “the first step towards self-advocacy.” The tool helped her identify which symptoms to raise with her doctors and supported her between appointments by translating complex medical information into digestible insights. “It made me feel like they weren’t the only ones holding my life in their hands,” she said, noting that being part of the conversation allowed her to regain some sense of control over her health.
After her diagnosis, Nadeem got involved with advocacy organizations such as Ovarian Cancer Canada as well as HDRN Canada’s PAC, recognizing the urgent need to amplify underrepresented voices in health care policy discussions. “The PAC’s focus on using data to drive genuine, people-centered health care solutions resonates strongly with my own vision for an equitable health care system,” said Nadeem. “Our data must comprehensively represent marginalized communities, Indigenous peoples and racialized populations to ensure health care solutions are relevant and accessible for everyone.”
Watch and share Alvina Nadeem’s full conversation with Dr. Kim McGrail at Health Data for All of Us: A Public Dialogue on AI in Health.