TY - JOUR
T1 - Artificial Neural Network for Laparoscopic Skills Classification Using Motion Signals from Apple Watch
AU - Laverde, Rubbermaid
AU - Rueda, Claudia
AU - Amado, Lusvin
AU - Rojas, David
AU - Altuve, Miguel
PY - 2018/7/1
Y1 - 2018/7/1
N2 - The acquisition of laparoscopic technical skills is constrained by limited training opportunities and the necessity of having staff physicians on site to provide feedback to trainees. In addition, the assessment tools used to measure trainees' performance are not always sensitive enough to detect different levels of expertise. To address this problem, two Apple Watches worn by subjects inexperienced in laparoscopy were used to record their motion signals (attitude, rotation rate and acceleration) during multiple practices of the peg transfer task in a Fundamentals of Laparoscopic Surgery (FLS) trainer box. This training process was carried out through a massed practice methodology (two hours of training), in which subjects were assessed following the guidelines of the FLS program. Subsequently, a series of metrics was estimated from the acquired motion signals and Spearman's rank correlation coefficient was used to select the most statistically significant attributes. Then, a classification model based on artificial neural networks was trained, using these attributes as model inputs, to classify trainees according to their level of expertise into three classes: low, intermediate and high. Using this approach, an average classification performance of F1 = 86.11% was achieved on a test subset. This suggests that new technologies, such as smartwatches, can be used to complement surgical training by including motion-based metrics to improve current clinical education and by offering a new source of feedback through objective assessment.
AB - The acquisition of laparoscopic technical skills is constrained by limited training opportunities and the necessity of having staff physicians on site to provide feedback to trainees. In addition, the assessment tools used to measure trainees' performance are not always sensitive enough to detect different levels of expertise. To address this problem, two Apple Watches worn by subjects inexperienced in laparoscopy were used to record their motion signals (attitude, rotation rate and acceleration) during multiple practices of the peg transfer task in a Fundamentals of Laparoscopic Surgery (FLS) trainer box. This training process was carried out through a massed practice methodology (two hours of training), in which subjects were assessed following the guidelines of the FLS program. Subsequently, a series of metrics was estimated from the acquired motion signals and Spearman's rank correlation coefficient was used to select the most statistically significant attributes. Then, a classification model based on artificial neural networks was trained, using these attributes as model inputs, to classify trainees according to their level of expertise into three classes: low, intermediate and high. Using this approach, an average classification performance of F1 = 86.11% was achieved on a test subset. This suggests that new technologies, such as smartwatches, can be used to complement surgical training by including motion-based metrics to improve current clinical education and by offering a new source of feedback through objective assessment.
UR - http://www.scopus.com/inward/record.url?scp=85056669444&partnerID=8YFLogxK
U2 - 10.1109/EMBC.2018.8513561
DO - 10.1109/EMBC.2018.8513561
M3 - Article in an indexed scientific journal
C2 - 30441566
AN - SCOPUS:85056669444
SN - 1557-170X
VL - 2018
SP - 5434
EP - 5437
JO - Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings
JF - Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings
ER -