
Letters to the Editor

Comment: Enhancing Complex Wound Care by Leveraging Artificial Intelligence

October 2023
Wounds. 2023;35(10):E329. doi:10.25270/wnds/23125

Dear Editor:

We would like to share our thoughts on an article recently published in Wounds, titled “Enhancing Complex Wound Care by Leveraging Artificial Intelligence: An Artificial Intelligence Chatbot Software Study.”1 The purpose of that study was to assess the use of an artificial intelligence (AI) chatbot (ChatGPT, OpenAI) in the care of challenging wounds. The study comprised 80 patients, each evaluated by a wound care provider who made a diagnosis and treatment plan based on expert knowledge. The AI chatbot software was then used as an additional tool for individualized treatment and lifestyle suggestions. The study’s findings showed that the AI chatbot correctly selected the best treatment strategy for 91% of the patients in the sample, and the initial assessment by the wound care provider and the advice offered by the AI chatbot had a correlation of more than 90%.

The small sample size is one potential weakness of this study. With only 80 patients included, the generalizability of the findings may be limited; a larger sample size would provide more robust evidence of the AI chatbot’s effectiveness in challenging wound cases. In addition, no specifics are provided regarding the precise AI chatbot software employed, its capabilities, or the underlying algorithms, which makes it difficult to assess the reliability and accuracy of the chatbot’s recommendations. Furthermore, the report makes no mention of the potential constraints or difficulties associated with the use of AI chatbots in wound care, including concerns about data protection, patient acceptance, and the need for continual updates and maintenance of the AI system.

Sensitive content should not be created, altered, or accepted by AI when human review is possible.2 Much can be learned about problems and their solutions through ChatGPT; however, the findings from the AI software imply that some of the underlying datasets may contain erroneous assumptions or concepts. As a result, patients may receive false or misleading information. Before proceeding, we must consider the ethical dilemmas that the use of chatbots and AI in academic settings presents. One of the most important requirements is that AI systems be developed and controlled by experts. To prevent flaws, biases, and potential hazards, AI systems must be continuously refined, tested, and monitored.

Sincerely,

Amnuay Kleebayoon, PhD1; Viroj Wiwanitkit, MD2

1Private Academic Consultant, Samraong, Cambodia; 2Adjunct Professor, Chandigarh University, Punjab, India

Correspondence: Amnuay Kleebayoon, PhD; Private Academic Consultant, Samraong, Cambodia; amnuaykleebai@gmail.com

Authors’ Contributions: A.K.: 50% of ideas, writing, analysis, approval for submission; V.W.: 50% of ideas, supervision, approval for submission.

1. Gupta S, Gupta SS, McMath K, Sugandh S. Enhancing complex wound care by leveraging artificial intelligence: an artificial intelligence chatbot software study. Wounds. 2023;35(8):E265-E267. doi:10.25270/wnds/23073

2. Kleebayoon A, Wiwanitkit V. Artificial intelligence, chatbots, plagiarism and basic honesty: comment. Cell Mol Bioeng. 2023;16(2):173-174. doi:10.1007/s12195-023-00759-x
