Pioneering medical diagnostics on Google Glass, behind the scenes at UCLA Ozcan Lab
April 21, 2014
We recently covered the study, published in ACS Nano, in iMedicalApps: Researchers develop Google Glass app that delivers instant analysis of point-of-care diagnostic tests.
The researchers stated that the platform has the potential to improve laboratory testing in rural and low-resource settings such as disaster-relief areas and quarantine zones.
It can simplify the processing of rapid diagnostic tests (RDTs), lower training costs, and increase the accessibility and availability of such tests.
We interviewed Aydogan Ozcan, Ph.D., principal investigator and the Chancellor’s Professor of Electrical Engineering and Bioengineering at UCLA and associate director of UCLA’s California NanoSystems Institute, along with Steve Feng, MS, first author of the published paper and research lab manager.
Where did the Google Glass idea come from?
Steve Feng, MS: We previously published research on diagnosis using smartphones (Editor’s note: see ACS Nano journal publication “Detection and Spatial Mapping of Mercury Contamination in Water Samples Using a Smart-Phone”), and we’ve commercialized that through Holomic LLC.
After that, we were exploring different ways to make things easier to use. The various [rapid diagnostic test (RDT)] readers out there are very sensitive, but they require a lot of training and occupy both of your hands, so you have less of an intuitive feel. We thought that wearable computers and hands-free modalities like Google Glass or other hands-free devices would provide a great way to do this kind of work, [which is] especially [useful in] any work where you can free up your hands and provide an interactive interface.