Show simple item record

dc.date.accessioned: 2025-12-15T20:59:44Z
dc.date.available: 2025-12-15T20:59:44Z
dc.date.issued: 2025-09-13 [es_MX]
dc.identifier.uri: https://cathi.uacj.mx/20.500.11961/32420
dc.description.abstract: The potential to classify sex from hand data is a valuable tool in both forensic and anthropological sciences. This work presents possibly the most comprehensive study to date of sex classification from hand X-ray images. The research methodology involves a systematic evaluation of zero-shot Segment Anything Model (SAM) in X-ray image segmentation, a novel hand mask detection algorithm based on geometric criteria leveraging human knowledge (avoiding costly retraining and prompt engineering), the comparison of multiple X-ray image representations including hand bone structure and hand silhouette, a rigorous application of deep learning models and ensemble strategies, visual explainability of decisions by aggregating attribution maps from multiple models, and the transfer of models trained from hand silhouettes to sex prediction of prehistoric handprints. Training and evaluation of deep learning models were performed using the RSNA Pediatric Bone Age dataset, a collection of hand X-ray images from pediatric patients. Results showed very high effectiveness of zero-shot SAM in segmenting X-ray images, the contribution of segmenting before classifying X-ray images, hand sex classification accuracy above 95% on test data, and predictions from ancient handprints highly consistent with previous hypotheses based on sexually dimorphic features. Attention maps highlighted the carpometacarpal joints in the female class and the radiocarpal joint in the male class as sex discriminant traits. These findings are anatomically very close to previous evidence reported under different databases, classification models and visualization techniques. [es_MX]
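The abstract describes selecting a hand mask from SAM's candidate segments using geometric criteria based on human knowledge, rather than retraining or prompt engineering. As an illustration only, here is a minimal sketch of that idea; the scoring criteria below (moderate area fraction, centroid near the image center) are hypothetical assumptions, not the paper's actual algorithm:

```python
import numpy as np

def pick_hand_mask(masks, min_area_frac=0.1, max_area_frac=0.9):
    """Pick the most plausible hand mask from candidate binary masks.

    Hypothetical geometric criteria (for illustration only): the hand
    should cover a moderate fraction of the image, and its centroid
    should lie near the image center.
    """
    h, w = masks[0].shape
    img_center = np.array([h / 2, w / 2])
    best, best_score = None, -np.inf
    for m in masks:
        area_frac = m.mean()  # fraction of pixels the mask covers
        if not (min_area_frac <= area_frac <= max_area_frac):
            continue  # too small (noise/bone fragment) or too large (background)
        ys, xs = np.nonzero(m)
        centroid = np.array([ys.mean(), xs.mean()])
        # Closer to the image center -> higher (less negative) score.
        score = -np.linalg.norm(centroid - img_center)
        if score > best_score:
            best, best_score = m, score
    return best
```

In practice the candidate masks would come from SAM's automatic mask generator run zero-shot on the X-ray; the selection step above is where domain knowledge replaces costly retraining.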
dc.description.uri: https://www.sciencedirect.com/science/article/pii/S001048252501412X [es_MX]
dc.language.iso: en_US [es_MX]
dc.relation.ispartof: Producto de investigación IIT [es_MX]
dc.relation.ispartof: Instituto de Ingeniería y Tecnología [es_MX]
dc.rights: Atribución-NoComercial-SinDerivadas 2.5 México
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/2.5/mx/
dc.subject: Sex classification, Hand X-ray images, Segment Anything Model, X-ray image segmentation, X-ray image classification, Prehistoric handprints [es_MX]
dc.subject.other: info:eu-repo/classification/cti/3 [es_MX]
dc.title: Sex classification from hand X-ray images in pediatric patients: How zero-shot Segment Anything Model (SAM) can improve medical image analysis [es_MX]
dc.type: Artículo [es_MX]
dcterms.thumbnail: http://ri.uacj.mx/vufind/thumbnails/rupiiit.png [es_MX]
dcrupi.instituto: Instituto de Ingeniería y Tecnología [es_MX]
dcrupi.cosechable: Sí [es_MX]
dcrupi.volumen: 197 [es_MX]
dcrupi.nopagina: 111060 [es_MX]
dc.identifier.doi: https://doi.org/10.1016/j.compbiomed.2025.111060 [es_MX]
dc.contributor.coauthor: Mederos, Boris
dc.journal.title: Computers in Biology and Medicine [es_MX]
dc.contributor.authorexterno: Mollineda Cardenas, Ramon Alberto
dc.contributor.coauthorexterno: Becerra, Karel
dcrupi.colaboracionext: Institute of New Imaging Technologies, Universitat Jaume I, Castelló de la Plana, Spain [es_MX]
dcrupi.impactosocial: Tiene impacto en el sector de la salud [es_MX]
dcrupi.vinculadoproyext: No [es_MX]
dcrupi.pronaces: Salud [es_MX]
dcrupi.vinculadoproyint: No [es_MX]


Files in this item

Thumbnail

This item appears in the following collection(s)


Except where otherwise noted, this item's license is described as Atribución-NoComercial-SinDerivadas 2.5 México
