The University at Buffalo Data Augmented Research Technology in Surgery (DARTS) laboratory is a unique collaboration between surgeon innovators and machine learning scientists. We aspire to democratize surgery by building a scalable and accessible surgical data science paradigm. We develop and deliver novel applications of computer vision, advanced intraoperative imaging, and artificial intelligence, including multimodal large language models, in surgery to improve patient outcomes, to educate and train the next generation of surgeons, and to enhance surgical expertise and excellence.
Surgical Intelligence
Creating AI models with expert-level understanding of surgery
Using machine learning, deep neural networks, and multimodal large language models, our group trains AI models to understand surgical video at an expert level. Such models are used to investigate perception of surgical video information, to supplement surgical education, and to assist with prospective surgical action decision making (Skinner et al., 2024).
Indocyanine Green Fluorescence
Investigating novel applications and quantification of ICG
Our group investigates the application of indocyanine green fluorescence for perfusion assessment and biliary structure identification (Skinner et al., 2024). We investigate objective quantification of indocyanine green, which has shown higher correlation with important clinical outcomes, such as anastomotic leak, than subjective interpretation.
Laser Speckle Contrast Imaging (LSCI)
Investigating clinically novel LSCI technology
Our group investigates the application of clinically novel laser speckle contrast imaging. This technology helps surgeons assess tissue perfusion without the contrast dye required for indocyanine green fluorescence (Nwaiwu et al., 2023).
We have shown that subjective interpretation of LSCI is equivalent to ICG in left-sided colorectal resections (Skinner et al., 2024), and that objective quantification of LSCI is equivalent to quantification of ICG in a preclinical porcine model (Skinner et al., 2024).
Surgical Video Digital Commons
Bringing blockchain technology to surgical video to enhance privacy and security
Our group has recently received seed funding from the competitive, interdisciplinary UB AI and Health fund to develop software that more safely collects and stores surgical videos while simultaneously making these videos accessible and analyzable for researchers in the field.
Training Surgical Skills
To enhance surgical training, new virtual skills trainers are being developed for tasks ranging from the fundamentals of laparoscopic surgery to camera navigation and advanced suturing. We have been collaborating on the development and validation of these devices. In particular, in collaboration with Kitware, Inc., we are investigating the use of simulators to help surgeons practice responding to adverse and rare events, improving skill proficiency and resilience.
Surgical Skill Assessment
Systematic and objective evaluation and feedback are crucial for effective training of psychomotor skills involved in point-of-injury care and hospital-based medicine (HBM). Our team has been conducting two large-scale projects, funded by the U.S. Army and in collaboration with RPI and FSU-FAMU, on the use of brain-based and video-based metrics for evaluating skill on laparoscopic suturing and on prolonged field care tasks (endotracheal intubation and cricothyrotomy). Our current studies will show the benefits and limitations of these metrics relative to other measurement modalities for classifying experts versus novices, and how brain activation changes as individuals train and gain skill on the task. These outcomes will inform training programs, and surgeon and Army medic credentialing standards.
Mental Imagery
Non-invasive functional neuroimaging during surgical tasks has been explored as an objective assessment tool. Our group was the first to monitor brain activity during Fundamentals of Laparoscopic Surgery (FLS) tasks and demonstrated that neuroimaging data can reliably discriminate between novice and expert performance. However, brain activity during cognitive surgical tasks has not yet been examined. Our overall goal is to develop a neuroimaging methodology for the objective evaluation of cognitive surgical tasks.
FACULTY
Peter C. Kim, MD PhD
Vice Chair - Research & Innovation, Department of Surgery
Professor of Surgery
Dr. Kim has developed, validated, and de-risked complex technologies and FDA-cleared advanced visualization devices, including surgical robots. Dr. Kim's team was the first to demonstrate autonomous robotic soft tissue surgical tasks.
Steven D. Schwaitzberg, MD, FACS
SUNY Distinguished Service Professor
Chairman - Department of Surgery
Dr. Schwaitzberg has developed, validated, and de-risked several complex technologies, and chaired the FDA human factors committee for medical devices. Dr. Schwaitzberg will co-lead preclinical/clinical validation and data accrual, segmentation, and validation.
Jinjun Xiong, PhD
SUNY Empire Innovation Professor
Professor
Department of Computer Science and Engineering
School of Engineering and Applied Sciences
Dr. Xiong is an expert in the application of cognitive computing to industrial solutions and is the director of the University at Buffalo Institute for Artificial Intelligence and Data Science. Dr. Xiong serves as technical supervisor for multiple UB DARTS AI and computer vision projects.
Gene Yang, MD, MS
Clinical Assistant Professor
Department of Surgery
Dr. Yang is an MIS surgeon and former Medtronic Innovation fellow with deep interests in data accrual, transparency, and traceability. He has been exploring advanced local hashing techniques, providing a framework that respects the confidentiality and integrity of patient information while facilitating educational and research opportunities in the surgical field.
P. Ben Ham III, MD, MS
Assistant Professor
Department of Surgery
Dr. Ham is a pediatric surgeon and Director of IT for Kaleida Health, affiliated with the University at Buffalo. As a clinician champion, he leads and manages IT infrastructure, data security, and usability of the Cerner EHR at the Oishei Children's Hospital.
TRAINEES
Brian Quaranto, MD
Surgical Innovation Fellow
Dr. Quaranto is a recent graduate of the UB General Surgery Residency program and will be working to support all functions of the UB DARTS team. Dr. Quaranto has developed hardware and software solutions to capture and analyze the surgeon's unique visual perspective in the operating room and is developing pathways to commercialization for surgical computer vision tools.
Garrett Skinner, MD
3rd Year General Surgery Resident
Dr. Skinner was Surgical Innovator-in-Residence at Activ Surgical from 2022-2024, where he helped investigate clinically novel near-infrared imaging quantification and surgical artificial intelligence. Now returning to complete his residency in General Surgery, he will continue breaking new ground integrating multimodal large language models with surgical computer vision models.
Jiajie Li
3rd Year Computer Vision PhD Student
Department of Computer Science and Engineering
School of Engineering and Applied Sciences
Emily Hannah
4th Year Medical Student
Jacobs School of Medicine and Biomedical Sciences
Ascharya Balaji
4th Year Medical Student
Jacobs School of Medicine and Biomedical Sciences
Joshua Marek
2nd Year Medical Student
Jacobs School of Medicine and Biomedical Sciences
Gabriela Miletsky
2nd Year Medical Student
Jacobs School of Medicine and Biomedical Sciences
2024
Real-time near infrared artificial intelligence using scalable non-expert crowdsourcing in colorectal surgery
G. Skinner, T. Chen, G. Jentis, Y. Liu, C. McCulloh, A. Harzman, E. Huang, M. Kalady, and P. Kim
npj Digital Medicine, 2024
Clinical Utility of Laser Speckle Contrast Imaging and Real-Time Quantification of Bowel Perfusion in Minimally Invasive Left-Sided Colorectal Resections
G. Skinner, Y. Liu, A. Harzman, S. Husain, A. Gasior, L. Cunningham, A. Traugott, C. McCulloh, M. Kalady, P. Kim, and E. Huang
Diseases of the Colon & Rectum, 2024
Dye-less Quantification of Tissue Perfusion by Laser Speckle Contrast Imaging is Equivalent to Quantified Indocyanine Green in a Porcine Model
G. Skinner, M. Marois, J. Oberlin, C. McCulloh, S. Schwaitzberg, and P. Kim
Surgical Endoscopy (Accepted), 2024
2023
Real-time First-In-Human Comparison of Laser Speckle Contrast Imaging and ICG in Minimally Invasive Colorectal & Bariatric Surgery
Chibueze A. Nwaiwu, Christopher J. McCulloh, Garrett Skinner, Shinil K. Shah, Peter C.W. Kim, Steven D. Schwaitzberg, and Erik B. Wilson
Journal of Gastrointestinal Surgery, 2023