There’s a promising solution on the horizon for accurately and quickly determining burn depth, a primary prognostic indicator of mortality following thermal injury and one of the main determinants of a patient’s long-term appearance and function.
Researchers in Sweden have found that artificial intelligence can assess burn depth with 92% to 97% accuracy, according to a paper recently published in Burns. That’s encouraging news, as visual and physical examination of burn depth and area by experienced clinicians may only be accurate 70% to 80% of the time. When the assessment is conducted by a non-burn specialist, that accuracy can drop to as low as 51%.
The study, “Improving burn-depth assessment for pediatric scalds by AI based on semantic segmentation of polarized light photography images,” used artificial intelligence (AI), specifically a convolutional neural network based on the U-Net, to assess burn depth using semantic segmentation of polarized high-performance light camera images of burn wounds.
The researchers examined 100 pediatric scald injuries at the Linköping Burn Centre in Sweden, where children under the age of 4 scalded by hot water account for upwards of 40% of patients. The study sought to differentiate 4 burn depths:
- Superficial partial-thickness (healing within 7 days)
- Superficial to intermediate partial-thickness (healing in 8–13 days)
- Intermediate to deep partial-thickness (healing in 14–20 days)
- Deep partial-thickness and full thickness burns (healing time is 21 days+)
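Conceptually, the network’s semantic segmentation assigns one of these four depth classes to every pixel of the wound photograph. The following is a minimal sketch of that per-pixel labeling step; the class indices, helper names, and the Dice overlap metric shown here are illustrative conventions of mine, not code or notation from the paper:

```python
import numpy as np

# Hypothetical class indices for the study's four burn depths.
CLASSES = {
    0: "superficial partial-thickness (heals within 7 days)",
    1: "superficial to intermediate partial-thickness (8-13 days)",
    2: "intermediate to deep partial-thickness (14-20 days)",
    3: "deep partial-thickness / full-thickness (21+ days)",
}

def segment(logits):
    """Turn a (num_classes, H, W) score map into a per-pixel class map
    by taking the highest-scoring class at each pixel."""
    return logits.argmax(axis=0)

def dice(pred, truth, cls):
    """Dice overlap for one class: a common accuracy measure for
    segmentation masks (1.0 = perfect agreement)."""
    p, t = pred == cls, truth == cls
    denom = p.sum() + t.sum()
    return 2 * (p & t).sum() / denom if denom else 1.0
```

In practice the score map would come from the trained U-Net; the argmax and per-class overlap steps are the standard way such pixel-wise predictions are read out and evaluated.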
The Linköping Burn Centre study began with 100 photographs of pediatric scald injuries. Of those, a subset of 17 photographs showing all 4 burn depths was chosen to train the AI technology.
Researchers then expanded the training images from 16 to 3,936 using a data-augmentation technique known as elastic deformation to improve prediction accuracy. The remaining scald-injury photos were evaluated by the AI for burn depth. The AI assessments, which were validated by a clinical specialist 20 days after the burn, were accurate in 92% to 97% of cases.
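Elastic deformation is a standard augmentation method: each photo is warped by a smoothed random displacement field, producing many plausible variants of a single annotated image. A rough, numpy-only sketch of the idea (the parameter names and the nearest-neighbour sampling are simplifications of my own, not the study’s implementation):

```python
import numpy as np

def elastic_deform(image, alpha=8.0, sigma=4.0, seed=None):
    """Warp a 2-D image with a smoothed random displacement field.

    alpha: displacement magnitude in pixels.
    sigma: Gaussian smoothing of the random field, so neighbouring
           pixels move together instead of producing pure noise.
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]

    # Random per-pixel displacements in [-1, 1] for x and y.
    dx = rng.uniform(-1, 1, (h, w))
    dy = rng.uniform(-1, 1, (h, w))

    # Smooth each field with a separable Gaussian kernel.
    k = int(4 * sigma) | 1                      # odd kernel width
    ax = np.arange(k) - k // 2
    kernel = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    kernel /= kernel.sum()
    for field in (dx, dy):
        field[:] = np.apply_along_axis(
            lambda r: np.convolve(r, kernel, "same"), 1, field)
        field[:] = np.apply_along_axis(
            lambda c: np.convolve(c, kernel, "same"), 0, field)

    # Sample the image at the displaced coordinates (nearest neighbour).
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    ys = np.clip(np.rint(ys + alpha * dy), 0, h - 1).astype(int)
    xs = np.clip(np.rint(xs + alpha * dx), 0, w - 1).astype(int)
    return image[ys, xs]
```

Calling such a function repeatedly with different random seeds yields distinct warped copies of each photo, which is how a handful of annotated images can be multiplied into thousands of training examples.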
Reliable and accurate burn area and wound depth assessment is critical to a patient’s treatment success. It can indicate how much IV fluid is necessary, the appropriate surgical response as well as healing estimates.
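Fluid requirements, for instance, are commonly estimated directly from the assessed burn area using the Parkland formula, a standard resuscitation rule (not something introduced by this study):

```python
def parkland_fluid_ml(weight_kg, tbsa_percent):
    """Parkland formula: 4 mL x body weight (kg) x % total body surface
    area (TBSA) burned, given over the first 24 hours, with half of the
    total administered in the first 8 hours."""
    total_24h = 4 * weight_kg * tbsa_percent
    first_8h = total_24h / 2
    return total_24h, first_8h
```

Because the fluid volume scales linearly with the assessed burn area, an over- or underestimate of the wound propagates directly into the resuscitation plan, which is why assessment accuracy matters so much.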
When burn wounds are underestimated, surgery and healing may be delayed, according to a 2017 study by Burmeister et al published in the Journal of Burn Care & Research. Meanwhile, overestimating burn wounds could lead to unnecessary surgery. Both situations are linked with poor clinical outcomes and significant hypertrophic scarring.
Other methods of determining burn depth, such as thermography, vital dyes, video angiography, computer vision, video microscopy, and laser Doppler imaging, differ in ease of use and produce varying results.
Computer vision, for example, shows an average accuracy of only 83% for third-degree burns. Laser Doppler imaging, which has been approved by the FDA for measuring wound depth, and laser speckle contrast imaging both create perfusion images of the injured skin. However, both methods require upfront training, and accurately classifying burn depth with them can be challenging and time-consuming.
AI-based machine learning offers precision and requires only minimal training. More than a dozen start-up companies are currently dedicated to developing AI solutions in healthcare. “Between artificial intelligence and the evolution of imaging technologies,” Jonathan Kanevsky, a plastic surgeon at the McGill University Health Center, recently told The Atlantic, “the marriage of those two forces is going to be just out of this world.”