November 22, 2024:
Who hasn’t used ChatGPT? The popular large language model (LLM) has been the fastest-adopted application in the history of the internet, and hundreds of millions of people worldwide use it every day, along with AI models from competitors such as Microsoft, Google, and Anthropic.
But what do people actually use these LLMs for? One question that’s particularly relevant to us at Better2Know is whether people are using LLMs to self-diagnose STIs.
It’s a fascinating topic. Read on to find out more.
If you’re concerned about your sexual health, get a comprehensive view of it with Better2Know’s Platinum Screen.
If you’re reading this blog, you probably know what an STI is. But for the uninitiated, here’s a quick recap.
Sexually transmitted infections (STIs) are infections that you get from sex: most often from unprotected vaginal, anal, and oral sex. You’ve probably heard of the most well-known culprits, like Chlamydia, Gonorrhoea, HIV, Herpes, and Syphilis. But there are loads of other, less well-known STIs, like Mycoplasma, Trichomonas, Ureaplasma, and Hepatitis B, all of which can cause serious damage to your health if left untreated.
You don’t have to engage in full sexual intercourse to get an STI, though intercourse is the most common mode of transmission. STIs like HPV and Herpes can be transmitted through skin-to-skin contact during foreplay and through kissing, respectively.
Blood-borne viruses like Hepatitis B and HIV can also be transmitted through blood-to-blood contact, such as sharing needles or getting tattoos with unsterilised equipment.
Getting an STI doesn’t have to be a cause for panic. After all, most STIs can be cured with medication, and the rest can be managed with the right treatment and care. The trouble starts when an STI is left untreated.
Untreated STIs can lead to a range of negative health outcomes like pelvic inflammatory disease, complications with fertility, complications during childbirth, and damage to major organ systems like the immune system, nerves, liver, eyes, brain, and bones.
“Artificial intelligence” has become a bit of a fuzzy term in the last few years. When most people think of AI, they think of things like Terminator’s Skynet or 2001: A Space Odyssey’s HAL 9000: mechanical or digital consciousnesses that overpower humanity through their ability to think and act faster, better, and more thoroughly than we can.
However, the things we call “artificial intelligence” these days, such as applications like ChatGPT, Gemini, and Claude, probably aren’t “intelligent” at all. Rather, they’re sophisticated software programs that generate text, images, code, and video after being trained on vast amounts of data. They use probability and statistics to produce the most likely response to the prompts given to them, based on patterns in that training data.
People use AI for all sorts of things, such as sourcing meal recipes, creating online content, and seeking professional and life advice.
But many people also use AI for something else really important: search.
Rather than turning to traditional search engines for their search queries, lots of people are using AI to generate answers to simple questions.
When you enter a question into your average LLM, it will generate a response based on patterns learned from its training data, much of which comes from publicly available internet content.
So, for our purposes, what might an app like ChatGPT say to someone looking for information about STIs?
This question was partly answered by a recently published study in the British Medical Journal. Researchers studied how adolescents with limited formal access to healthcare might use ChatGPT to address concerns about their sexual health and other common questions about STIs.
They used ChatGPT (GPT-3.5) to answer common patient questions across several categories.
All responses were checked against CDC STI treatment guidelines for 2021.
The results of this study were interesting. According to the researchers, ChatGPT’s responses were fairly concise and accurate in content. The app also recommended safe sex practices and HPV vaccination as STI prevention measures.
However, there were issues with other responses the app gave:
“[ChatGPT] failed to recommend HIV pre-exposure prophylaxis. When an individual expressed a symptom that could potentially represent STI (eg, dyspareunia) ChatGPT appropriately provided reassurance that other possibilities exist, but advocated for testing. In terms of treatment, ChatGPT consistently communicated the importance of partner testing and follow-up testing, but at times, failed to highlight the importance of testing for other STIs. Overall, the advice given was not tailored to the specific individual’s circumstances.”
So, what can we learn from this study?
There are certainly benefits to using LLMs to learn more about STIs and sexual health.
LLMs are trained on vast amounts of internet content, so they can draw answers from a wide range of sources and present them to the user as a response to a query. Some LLMs, such as Microsoft’s Copilot, can also cite sources for their answers so that users can read further.
Most well-known LLMs are free to access and easy to use. Anyone with a computer or smartphone and an internet connection can use them. Providing access to more information for more people is certainly a good thing. Most LLMs also present information in clear, concise language that most people can understand.
While providing better access to more information is crucial for many people, using LLMs to address concerns about your sexual health can be problematic.
Most LLMs can’t guarantee the accuracy of the information they present, which can cause serious problems. They may generate incorrect information, draw on unreliable sources, or simply misunderstand the intent behind a prompt.
Most LLMs encourage users to verify important information to make fully informed choices about any information they are presented with.
They are also only trained on certain information. They may lack access to paywalled or licensed content, and each model has a training cutoff date, so it can’t report anything that became available afterwards (for GPT-3.5, that cutoff falls in 2021).
So much of health is personal. When a doctor diagnoses someone, it’s usually after asking many questions about the patient’s life and health. The answers to these questions will determine how and why to treat a patient who comes to see them.
While LLMs can deliver vast amounts of information in a digestible way, they can’t tailor their responses to individual situations.
That’s why, for most healthcare-related queries, most LLMs will encourage users to seek medical diagnoses, testing, and treatment.
The study concluded that:
“ChatGPT can provide helpful information regarding STIs, but the advice lacks specificity and requires a human physician to fine-tune. Its ubiquity may make it a useful adjunct to sexual health clinics, to improve knowledge and access to care.”
While this is only one study, and more research needs to be conducted to figure out how people will use LLMs to guide sexual health decisions, the input of a medical professional, along with rigorous testing, will always be essential.
If you’re worried about your sexual health, you can get tested for STIs with Better2Know today. Find a sexual health clinic near you by clicking the button below.
You can get a comprehensive STI screen at a sexual health clinic near you. Book your place today.