The way people respond to women’s health issues can be improved by AI and the burgeoning ‘trillion-dollar’ FemTech industry, a panel of experts has heard.

Professor Debra Anderson, Dean at UTS Health, told a Respect at Uni Week talk at UTS on March 6 that artificial intelligence could address issues, such as gender bias, that affect the medical treatment of women.

“I want AI that works for me and is designed for me, that’s what women are saying at the moment,” Anderson said.

“There are people who are designing the technology, rightfully so, however, we have this amazing opportunity where we can use ‘Big Tech’ to promote women’s health.”

The Uncensored: Law, Health and Gender event also heard from Sonia Pinchler, a caseworker at WILMA Women’s Health Centre, and Paula Moncrieff, a health liability partner at law firm Norton Rose Fulbright.

They discussed how language in women’s health and law affects outcomes for women, giving the example of how a woman may describe a heart attack differently from a man, which sometimes results in women being given anxiety medication instead of a cardiology exam.

The talk heard that male medical professionals often respond differently to women’s symptoms, at times dismissing women as ‘hysterical’, ‘overly emotional’ or ‘demanding’.
While FemTech – a term for AI that is used to improve women’s health and wellbeing – is considered by investors to be ‘the next trillion-dollar’ industry, the panel said AI had the potential to benefit women’s health but also carried several negative side effects.

Moncrieff said large companies, such as US tech giant Apple, have access to data related to women’s health through the Health app. While this data could be used in FemTech, it is also harvested by nine out of ten health apps, according to a 2021 study published in the British Medical Journal.

It is unclear which jurisdiction Apple’s Health app data falls under, as Apple operates in multiple locations around the world. The panel heard that some data on the app can be accessed by perpetrators in domestic violence situations, for example where a spouse forces the termination of a pregnancy.

As a result, much like several other women’s health issues, FemTech has been deemed ‘risky’ by some investors, the panel said. Despite those setbacks, Anderson believes AI can still benefit women’s health.

Moncrieff added the treatment of female health professionals is another barrier in women’s health.

“The problem is barriers to medicine. The experiences of how they go through medical school and go through the training process can be quite scarring, because you can’t be tired,” Moncrieff said.

“I feel like in medicine it’s still quite difficult to talk about, I mean, I think it’s changing, it’s changing a lot, but I still think there’s those barriers.”

Main image generated in GenCraft AI