The eyes have been called the “windows to the soul”—for good reasons. Changes in the composition of the tear film can provide a wealth of information about a person’s ocular and overall health (Jalbert, 2013; Tiffany, 2003; von Thun Und Hohenstein-Blaul et al, 2013). The presence of certain biological markers, often referred to as “biomarkers,” can signal the onset of a disease, and tear film biomarkers have already been identified in a wide range of systemic diseases including cancer, Alzheimer’s disease, Parkinson’s disease, multiple sclerosis, thyroid disease, and diabetes (Jones et al, 2021).
The major challenge is accurately detecting these biomarkers, which are typically present at extremely low concentrations in the tear film. A “smart” contact lens can take advantage of its constant exposure to the tear film to continuously sense various tear film biomarkers and to provide feedback on their presence and on how their levels change over time.
Trials and Tribulations with Diabetes Detection
Most of the research to date on developing a diagnostic contact lens for systemic disease has focused on diabetes, which affects almost 400 million people globally (Danaei et al, 2011). Constant monitoring and management of blood glucose have been the hallmark of improving the quality of life for people who have diabetes (McQueen et al, 2011). The current gold-standard approach relies on a finger prick, which is painful, inconvenient, and susceptible to infection. Research has shown that glucose is also present in tears and is found at elevated levels in people who have diabetes (Sen and Sarin, 1980). Thus, a smart contact lens that could monitor tear glucose would provide a pain-free, non-invasive, continuous monitoring option.
Researchers have explored a range of detection methods to quantify changes in glucose levels in tears. Optical methods work by changing color or by fluorescing when glucose binds to certain molecules and proteins; these optical changes can then be measured by patients using a handheld mobile device. However, optical changes can be difficult to measure accurately due to differences in ambient light intensities or to variable distances between the contact lens and the measurement device.
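One approach often used to make such optical readouts more robust, at least in the broader biosensing literature, is a ratiometric design: a glucose-sensitive signal is divided by a glucose-insensitive reference signal measured from the same lens, so factors that scale both channels equally (ambient light, lens-to-camera distance) largely cancel out. The short Python sketch below illustrates only the arithmetic of that idea; the two-channel readout, function name, and calibration constants are hypothetical and are not taken from any of the studies cited here.

```python
# Illustrative only: a ratiometric optical readout divides a glucose-sensitive
# channel by a glucose-insensitive reference channel from the same lens, so
# factors that scale both equally (ambient light, camera distance) mostly
# cancel. The linear calibration constants below are hypothetical.

def glucose_from_ratio(signal_counts: float, reference_counts: float,
                       slope: float = 0.85, intercept: float = 0.10) -> float:
    """Estimate tear glucose (mmol/L) from a two-channel optical reading.

    Assumes a linear calibration, ratio = intercept + slope * glucose,
    which would have to be established for a given lens design.
    """
    ratio = signal_counts / reference_counts
    return max((ratio - intercept) / slope, 0.0)

# The same lens imaged under dim and bright room light: the raw counts
# double, but the ratio (and therefore the glucose estimate) is unchanged.
dim_estimate = glucose_from_ratio(signal_counts=420.0, reference_counts=900.0)
bright_estimate = glucose_from_ratio(signal_counts=840.0, reference_counts=1800.0)
print(f"dim light:    {dim_estimate:.2f} mmol/L")
print(f"bright light: {bright_estimate:.2f} mmol/L")
```

In this toy calculation, both readings yield the same estimate (about 0.43 mmol/L, a plausible tear-glucose value) even though the absolute intensities differ by a factor of two, which is exactly the property a ratiometric design is meant to provide; it does not, of course, remove every source of optical error.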
In 2014, Google and Novartis began developing a glucose-sensing smart contact lens using an enzyme-based electrochemical mechanism that produced an electric current proportional to the amount of glucose present in the tear fluid. The sensors were then coupled to a wireless powering device and antenna, all of which were miniaturized to fit within a contact lens. It was envisioned that the final version of the device would continuously measure tear glucose and relay these data in real time to a smartphone, which would record the data and determine the appropriate response, whether telling patients to inject their insulin or alerting their physician (Liao et al, 2012). Non-enzyme-based electrochemical sensors consisting of various metals have also been developed, and some researchers have taken the development one step further by combining glucose sensing with the release of therapeutics from drug reservoirs integrated on the lens (Keum et al, 2020).
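To make the idea of a current that is proportional to glucose concrete, the sketch below shows how the readout side of such an amperometric sensor might convert a raw current into a tear-glucose estimate and an alert category on the paired smartphone. This is a hedged illustration, not the actual Google/Novartis implementation; the sensitivity, baseline current, and alert thresholds are hypothetical values chosen only to show the calculation, and real tear-glucose alert levels would have to be calibrated against blood-glucose targets.

```python
# Illustrative sketch of the readout side of an enzyme-based amperometric
# glucose sensor: the measured current is assumed to be proportional to tear
# glucose, so a simple linear calibration maps nanoamps to concentration.
# All constants below are hypothetical, chosen only to demonstrate the math.

SENSITIVITY_NA_PER_MM = 12.0   # nA of current per mmol/L of tear glucose (assumed)
BASELINE_NA = 1.5              # background current with no glucose present (assumed)
LOW_ALERT_MM = 0.15            # tear-glucose alert thresholds (assumed);
HIGH_ALERT_MM = 0.45           # these are not blood-glucose thresholds

def tear_glucose_mm(current_na: float) -> float:
    """Convert a raw sensor current (nA) to estimated tear glucose (mmol/L)."""
    return max((current_na - BASELINE_NA) / SENSITIVITY_NA_PER_MM, 0.0)

def classify(current_na: float) -> str:
    """Map one wirelessly relayed reading to an alert category for the app."""
    glucose = tear_glucose_mm(current_na)
    if glucose < LOW_ALERT_MM:
        return f"{glucose:.2f} mmol/L - LOW, notify patient/physician"
    if glucose > HIGH_ALERT_MM:
        return f"{glucose:.2f} mmol/L - HIGH, notify patient/physician"
    return f"{glucose:.2f} mmol/L - in range"

# A few raw readings as they might arrive over the wireless link:
for reading_na in (2.8, 5.1, 8.3):
    print(classify(reading_na))
```

A sketch like this only classifies individual readings; a deployed system would also have to account for sensor drift and for the lag between blood and tear glucose discussed below.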
However, developing a smart contact lens for detecting tear glucose has been anything but easy, as researchers continue to struggle with the repeatability, stability, biocompatibility, and sensitivity of glucose sensors. The most difficult problem may not even be addressable: there is a physiological lag of approximately 15 minutes between changes in blood glucose and the corresponding changes in tear glucose (Senior, 2014; Zhang et al, 2011; Badugu et al, 2005; Vaddiraju et al, 2010). As such, this technology may not be suitable for patients who rely on accurate, real-time measurements of blood glucose to administer their insulin shots. Nonetheless, a smart contact lens could still serve as an excellent diabetes management tool, providing a profile of glucose levels throughout the day and enabling patients to make informed decisions regarding their disease management. CLS
For references, please visit www.clspectrum.com/references and click on document #315.