

2017


A Deep Learning Based 6 Degree-of-Freedom Localization Method for Endoscopic Capsule Robots

Turan, M., Almalioglu, Y., Konukoglu, E., Sitti, M.

arXiv preprint arXiv:1705.05435, 2017 (article)

Abstract
We present a robust deep learning based 6 degrees-of-freedom (DoF) localization system for endoscopic capsule robots. Our system focuses on localizing endoscopic capsule robots inside the GI tract using only visual information captured by a monocular camera integrated into the robot. The proposed system is a 23-layer deep convolutional neural network (CNN) capable of estimating the pose of the robot in real time on a standard CPU. The dataset for the evaluation of the system was recorded inside a surgical human stomach model with realistic surface texture, softness, and surface liquid properties, so that the pre-trained CNN architecture can be transferred confidently to a real endoscopic scenario. Average errors of 7.1% and 3.4% were obtained for translation and rotation, respectively. The experimental results demonstrate that a CNN pre-trained with raw 2D endoscopic images performs accurately inside the GI tract and is robust to the various challenges posed by reflection distortions, lens imperfections, vignetting, noise, motion blur, low resolution, and a lack of unique landmarks to track.
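The paper itself does not include code; purely as a rough illustration of the approach the abstract describes, the following minimal PyTorch sketch shows a CNN regressing a 6-DoF pose (translation plus unit-quaternion rotation) from a single frame. It is not the authors' 23-layer network; all layer sizes and the quaternion parameterization are illustrative assumptions.

# Minimal sketch (not the authors' exact 23-layer network): a CNN that
# regresses a 6-DoF pose from a single endoscopic frame. Layer sizes and
# the quaternion parameterization are illustrative assumptions.
import torch
import torch.nn as nn

class PoseRegressionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional feature extractor over raw RGB frames.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two regression heads: translation (x, y, z) and rotation (unit quaternion).
        self.fc_trans = nn.Linear(128, 3)
        self.fc_rot = nn.Linear(128, 4)

    def forward(self, frame):
        h = self.features(frame).flatten(1)
        t = self.fc_trans(h)                          # translation estimate
        q = nn.functional.normalize(self.fc_rot(h))   # unit-quaternion rotation estimate
        return t, q

# Example: one 256x256 endoscopic frame -> predicted pose.
model = PoseRegressionCNN()
t, q = model(torch.randn(1, 3, 256, 256))
print(t.shape, q.shape)  # torch.Size([1, 3]) torch.Size([1, 4])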

link (url) Project Page [BibTex]


Deep EndoVO: A Recurrent Convolutional Neural Network (RCNN) based Visual Odometry Approach for Endoscopic Capsule Robots

Turan, M., Almalioglu, Y., Araujo, H., Konukoglu, E., Sitti, M.

ArXiv e-prints, 2017 (article)

Abstract
Ingestible wireless capsule endoscopy is an emerging minimally invasive diagnostic technology for inspection of the GI tract and diagnosis of a wide range of diseases and pathologies. Medical device companies and many research groups have recently made substantial progress in converting passive capsule endoscopes into active capsule robots, enabling more accurate, precise, and intuitive detection of the location and size of diseased areas. Since reliable real-time pose estimation is crucial for actively controlled endoscopic capsule robots, in this study we propose a monocular visual odometry (VO) method for endoscopic capsule robot operations. Our method relies on deep Recurrent Convolutional Neural Networks (RCNNs) for the visual odometry task, where Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) are used for feature extraction and for inference of dynamics across the frames, respectively. Detailed analyses and evaluations on a real pig stomach dataset prove that our system achieves high translational and rotational accuracy for different types of endoscopic capsule robot trajectories.
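As a rough, hedged illustration of the CNN + RNN pattern the abstract describes (not the paper's actual Deep EndoVO architecture), the following PyTorch sketch encodes stacked consecutive frames with a small CNN, models dynamics across the sequence with an LSTM, and outputs a relative 6-DoF pose per step. All layer sizes and the pose parameterization are assumptions.

# Minimal sketch of the CNN + RNN pattern described above (not the paper's
# exact Deep EndoVO architecture). All layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class RecurrentVO(nn.Module):
    def __init__(self, feat_dim=256, hidden_dim=512):
        super().__init__()
        # Per-step convolutional encoder (consecutive RGB frames stacked channel-wise).
        self.encoder = nn.Sequential(
            nn.Conv2d(6, 32, 7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # LSTM infers motion dynamics across the frame sequence.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, num_layers=2, batch_first=True)
        # 6-DoF output per step: 3 translation + 3 rotation (e.g. Euler angles).
        self.pose_head = nn.Linear(hidden_dim, 6)

    def forward(self, frame_pairs):
        # frame_pairs: (batch, seq_len, 6, H, W) stacked consecutive RGB frames.
        b, s = frame_pairs.shape[:2]
        feats = self.encoder(frame_pairs.flatten(0, 1)).flatten(1).view(b, s, -1)
        out, _ = self.lstm(feats)
        return self.pose_head(out)  # (batch, seq_len, 6) relative poses

# Example: a batch of 1 sequence with 5 frame pairs at 128x128 resolution.
model = RecurrentVO()
poses = model(torch.randn(1, 5, 6, 128, 128))
print(poses.shape)  # torch.Size([1, 5, 6])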

link (url) Project Page [BibTex]


Localized Single-Cell Lysis and Manipulation Using Optothermally-Induced Bubbles

Fan, Q., Hu, W., Ohta, A. T.

Micromachines, 8(4):121, Multidisciplinary Digital Publishing Institute, 2017 (article)

[BibTex]


2005


Adhesive microstructure and method of forming same

Fearing, R. S., Sitti, M.

March 2005, US Patent 6,872,439 (misc)

[BibTex]



Modeling and testing of a biomimetic flagellar propulsion method for microscale biomedical swimming robots

Behkam, B., Sitti, M.

In Proceedings of Advanced Intelligent Mechatronics Conference, pages: 37-42, 2005 (inproceedings)

Project Page [BibTex]



Biologically inspired adhesion based surface climbing robots

Menon, C., Sitti, M.

In Proceedings of the 2005 IEEE International Conference on Robotics and Automation (ICRA 2005), pages: 2715-2720, 2005 (inproceedings)

[BibTex]



Claytronics: highly scalable communications, sensing, and actuation networks

Aksak, B., Bhat, P. S., Campbell, J., DeRosa, M., Funiak, S., Gibbons, P. B., Goldstein, S. C., Guestrin, C., Gupta, A., Helfrich, C., others

In Proceedings of the 3rd International Conference on Embedded Networked Sensor Systems, pages: 299, 2005 (inproceedings)

[BibTex]



Biologically Inspired Miniature Water Strider Robot

Suhr, S. H., Song, Y. S., Lee, S. J., Sitti, M.

In Robotics: Science and Systems, pages: 319-326, 2005 (inproceedings)

[BibTex]



Polymer micro/nanofiber fabrication using micro/nanopipettes

Nain, A. S., Amon, C., Sitti, M.

In Proceedings of the 5th IEEE Conference on Nanotechnology, pages: 366-369, 2005 (inproceedings)

[BibTex]



Geckobot and waalbot: Small-scale wall climbing robots

Unver, O., Murphy, M., Sitti, M.

In Infotech@Aerospace, pages: 6940, 2005 (incollection)

[BibTex]



Fusion of biomedical microcapsule endoscope and microsystem technology

Kim, T. S., Kim, B., Cho, D. D., Song, S. Y., Dario, P., Sitti, M.

In Digest of Technical Papers, The 13th International Conference on Solid-State Sensors, Actuators and Microsystems (TRANSDUCERS '05), volume 1, pages: 9-14, 2005 (inproceedings)

[BibTex]



Atomic force microscope based two-dimensional assembly of micro/nanoparticles

Tafazzoli, A., Pawashe, C., Sitti, M.

In Proceedings of the 6th IEEE International Symposium on Assembly and Task Planning: From Nano to Macro Assembly and Manufacturing (ISATP 2005), pages: 230-235, 2005 (inproceedings)

[BibTex]



A new endoscopic microcapsule robot using beetle inspired microfibrillar adhesives

Cheung, E., Karagozler, M. E., Park, S., Kim, B., Sitti, M.

In Proceedings of the 2005 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pages: 551-557, 2005 (inproceedings)

Project Page [BibTex]
