
ChatGPT struggles with detailed answers to specialised queries around smoking and oral health, according to a new study.
The research team asked ChatGPT 500 frequently searched questions built around keywords such as ‘smoking and dentistry’ and ‘smoking and dental health’. The questions were categorised into five subjects: periodontal conditions, teeth and health, oral hygiene and breath, oral soft tissues, and oral surgery.
Around one in five of ChatGPT’s answers were considered ‘not useful’ or ‘partially useful’. This was particularly common when the question related to specialised areas, such as the effect of smoking on oral surgeries or soft tissue topics.
The researchers said this suggests that while it ‘excels in addressing general queries’, ChatGPT ‘struggles with niche or more specific subjects’.
‘Balancing answer quality with simplicity’
The study also found that the answers generated by the chatbot scored low for readability. Readability levels varied between different topics, with responses relating to oral surgery proving particularly difficult. The study’s authors said this highlights ‘the difficulty of balancing answer quality with simplicity’ and ‘the need for more accessible language to enhance understanding’.
Despite low readability scores, the answers generally scored highly for understandability. This indicates that while the information provided may be too complex, the content remained clear to those reading it.
One acknowledged limitation of the study was that understandability was measured by health professionals, who are likely to have a baseline of knowledge in the areas discussed. The study says: ‘It might have been more appropriate to involve laypeople in evaluating understandability.’
Can you rely on ChatGPT for oral health advice?
Another metric used to evaluate ChatGPT’s responses was actionability. Responses on periodontal conditions scored most highly here, ‘offering useful advice like oral hygiene tips for smokers’. However, this also varied greatly depending on the topic, suggesting that ‘some responses lack sufficient practical value’.
Overall, the researchers concluded that ChatGPT ‘can effectively supplement healthcare education’. However, the text-based format was seen as a drawback as images and video were deemed to ‘better support patient education and health communication’.
Combined with its inconsistent performance on specialised knowledge, readability and actionability, the chatbot therefore has significant limitations as a source of oral health information. The authors emphasised that it ‘should not replace professional dental advice’.
The study was published in the BMC Oral Health journal on 19 July.