Appropriate identification of burn depth and size is paramount. Despite the development of burn depth assessment aids [e.g., laser Doppler imaging (LDI)], clinical assessment, which identifies partial thickness burn depth with 67% accuracy, currently remains the most consistent standard of practice. We sought to develop an image-based artificial intelligence system that predicts burn severity and wound margins for use as a triaging tool in thermal injury management. A modified EfficientNet architecture, previously trained on 1684 mobile-device-captured images of different burn depths, was utilized to create a convolutional neural network (CNN). The CNN was extended with a novel Boundary-Attention Mapping (BAM) algorithm using elements of saliency mapping, which was utilized to recognize the boundaries of burns. For validation, 144 patient charts that included clinical assessment, burn location, total body surface area, and LDI assessment were retrieved for a retrospective study. The clinical images underwent CNN-BAM assessment and were directly compared with the LDI assessment. The CNN, using a four-level burn severity classification, achieved an accuracy of 85% (micro/macro-averaged ROC scores). The CNN-BAM system can successfully delineate burns from surrounding tissue with high confidence. CNN-BAM burn area segmentations attained 91.6% accuracy, 78.2% sensitivity, and 93.4% specificity when compared with the LDI methodology. Results comparing the CNN-BAM outputs to clinical and LDI assessments showed a high degree of correlation between the CNN-BAM burn severity predictions and those extrapolated from LDI healing potential (66% agreement). The CNN-BAM algorithm provides burn-depth detection accuracy equivalent to that of LDI, with a more economical and accessible application when embedded in a mobile device.
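As an illustrative aside (not drawn from the study's code), the reported segmentation metrics can be computed pixel-wise by comparing a predicted binary burn mask against an LDI-derived reference mask. A minimal sketch in NumPy, with hypothetical toy masks:

```python
import numpy as np

def segmentation_metrics(pred, ref):
    """Pixel-wise accuracy, sensitivity, and specificity of a binary
    predicted segmentation mask against a binary reference mask."""
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    tp = np.sum(pred & ref)    # burn pixels correctly flagged
    tn = np.sum(~pred & ~ref)  # background correctly excluded
    fp = np.sum(pred & ~ref)   # background wrongly flagged as burn
    fn = np.sum(~pred & ref)   # burn pixels missed
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Toy 4x4 masks purely for illustration: one false positive at (1, 1)
pred = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
ref  = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
acc, sens, spec = segmentation_metrics(pred, ref)
```

In the study, the reference mask would come from the LDI healing-potential assessment and the predicted mask from the CNN-BAM output; the three metrics reported above (91.6%, 78.2%, 93.4%) are of exactly this form.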