Abstract
For a transparent well with a known volume capacity, changes in fluid level result in predictable changes in the magnification of an overhead light source. For a given well size and fluid, the relationship between volume and magnification can be calculated analytically if the fluid’s index of refraction is known, or determined naively through a calibration procedure. Light source magnification can be measured with a camera and processed using computer vision contour analysis with OpenCV. This principle was applied in the design of a 3D printable sensing device using a Raspberry Pi Zero and a camera.
Citation: Baudin PV, Teodorescu M (2023) A computer vision based optical method for measuring fluid level in cell culture plates. PLoS ONE 18(9): e0290951. https://doi.org/10.1371/journal.pone.0290951
Editor: Gerrit Hilgen, Northumbria University, UNITED KINGDOM
Received: January 23, 2023; Accepted: August 21, 2023; Published: September 8, 2023
Copyright: © 2023 Baudin, Teodorescu. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: The code that implements the system presented in this paper is included in the supplementary materials and the github repo (https://github.com/pvbaudin/Optical-fluid-level-measurement-for-culture-plates). The dataset gathered and presented as plots is also included in the supplemental material.
Funding: P.V.B and M.T were supported in this work by the following grants: Schmidt Futures SF 857, the National Human Genome Research Institute under Award number 1RM1HG011543, the National Science Foundation under award numbers NSF 2034037 and NSF 2134955. There was no additional external funding received for this study.
Competing interests: P.V.B has filed a provisional patent application for the described method. The authors have no other competing interests. This does not alter our adherence to PLOS ONE policies on sharing data and materials.
Introduction
The automation of cell culture procedures has the potential to greatly increase the throughput and consistency of cell culture based experiments [1]. Manipulation of fluids is a key feature of any automated culture platform: systems must be able to deliver or move precise quantities of fluid. To automate fluid movement, some systems use a “lab-on-a-chip” approach [2], utilizing microfluidic channels to transport fluids. Other systems use robot manipulators to deliver measured quantities of fluid with micro-pipettes [1, 3, 4]. Regardless of the mechanism behind fluid delivery, the ability to detect the presence and volume of fluid within the culture vessel provides error detection and correction functionality that open-loop methods lack.
Surface level measurement within a container of known volume is a simple way to determine fluid volume. This can be done with electrically active probes dipped into the fluid being measured [5]. These probes can be effective, but direct contact with the fluid can be problematic for cell culture applications, where any potential contamination vector increases the risk of failure [6]. Non-contact sensing is preferable for sensitive applications. Optical fluid sensing methods take advantage of how fluids interact with light, and several optical approaches for measuring fluids in culture plates exist. One method uses computer vision with light sources and sensors placed at specific angles to measure ray deflection [7]. This method relies on reflectance at the fluid surface, which makes it usable regardless of the solid contents of the well. However, the angular requirements of the sensor electronics make tight packing of independent sensing elements infeasible, so parallel measurements within a plate are complex or impossible. Another published method images the distortions of a printed grid to visualize changes in refraction related to fluid level [8]. While this approach can be used on many wells in parallel, it is strongly affected by occlusion from material in the plate and can be affected by ambient lighting conditions.
Here we propose a method for non-contact optical detection of fluid level using a CMOS camera sensor and an LED. We detail the principles at work and show a 3D printed device that can be used to perform measurements in 24 well culture plates. Similar to the approach in [8], this method relies on the lens-like properties of fluids and the resulting changes in an image viewed through them. Instead of viewing a grid, we detect the apparent size of an overhead light source; changes in the apparent distance of the light source correlate with changes in fluid level. A bright overhead light source can shine through some samples, making this approach usable in scenarios with solid occlusion that would disrupt the viewing of a grid. Additionally, the required electronics sit directly inline with the sample wells, allowing them to be packed into a grid for parallel sensing of many wells at once. Since the only electronics necessary for this approach are a camera and an LED, it could be added to systems that already perform longitudinal cell-plate imaging to provide fluid sensing with no additional hardware.
Methods
Operating principle
The measurement principle behind this system relies on the lensing effects of fluids with different indices of refraction. The diagrams in Fig 1 represent how the apparent distance of the light source changes based on fluid depth. In the case of a fluid with an index of refraction greater than that of air (like water), the light source will appear closer as the fluid level increases.
Refraction angles can be determined with Snell’s law given the indices of refraction of the two media (n1 sin(θ1) = n2 sin(θ2)). For the second transition back into the original medium, the relationship θ3 = θ1 can be derived, hence the labeling of the third angle as θ1 in this diagram. L = fluid depth, I = apparent distance of object, h0 = distance between object and top of fluid, h1 = distance between bottom of fluid and measurement plane.
To derive a function relating the fluid depth to the apparent distance of the light source, we define a set of constants. These can be treated as constants because our light source sends rays in all directions below it, so any angle θ1 can be chosen so long as the resulting ray intersects the fluid, and from this chosen angle θ2 can be derived via Snell’s law. In terms of these constants, the resulting transfer function relates the apparent distance of the image I to the fluid level L and is a first-degree polynomial. The full derivation can be found in S1 Appendix of the supplemental materials. This function is a useful approximation that ignores several factors, including refraction caused by the plate material and refraction from the meniscus geometry of the fluid surface; our data show that accurate results can be obtained with this approximation in most scenarios. The meniscus has a substantial effect on measurement at two extremes: an almost empty well and an almost full well. These meniscus effects are explored further in the Results.
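As a minimal paraxial sketch of why this relationship is linear (our own restatement of the standard apparent-depth argument, not the paper’s exact formula; it assumes a flat fluid surface and ignores the plate material): a ray leaving the source at angle θ1 refracts to θ2 at the fluid surface, crosses the depth L, and exits at θ1 again. Traced back along the exit direction, the fluid layer appears compressed by the factor tan θ2 / tan θ1, which in the small-angle limit equals n1/n2:

$$I \approx h_0 + \frac{n_1}{n_2}\,L + h_1 .$$

Because the LED and the well bottom are fixed, h0 + L is a constant H, so

$$I \approx (H + h_1) - \left(1 - \frac{n_1}{n_2}\right) L ,$$

a first-degree polynomial in L whose slope is negative whenever n2 > n1, i.e. the source appears closer as the fluid level rises, consistent with Fig 1.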
The polynomial can be characterized analytically so long as we can determine how the image distance relates to spot size on the camera sensor. This can be determined by taking pictures of a ruled measurement calibration slide at several known distances. It can also be characterized by a simple calibration step: since the relationship is linear, a 2-point calibration would theoretically be sufficient, but for greater accuracy a 5-point calibration is more prudent. Once this polynomial is characterized, image data is sufficient for capturing fluid level.
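As one way to relate distance to on-sensor size, the sketch below assumes an ideal pinhole-camera relationship (spot size in pixels inversely proportional to distance) and uses hypothetical calibration numbers; the constant k and the helper function are illustrative, not taken from the published code.

```python
import numpy as np

# Hypothetical calibration data: a ruled slide of fixed physical size imaged
# at several known distances (mm), with the measured size in pixels.
distances_mm = np.array([40.0, 50.0, 60.0, 70.0, 80.0])
sizes_px = np.array([410.0, 328.0, 273.0, 234.0, 205.0])

# Pinhole assumption: size_px ~= k / distance. Fit k by least squares
# on size_px versus 1/distance.
k = np.linalg.lstsq(1.0 / distances_mm[:, None], sizes_px, rcond=None)[0][0]

def apparent_distance_mm(size_px):
    """Invert the pinhole relation to estimate apparent distance from spot size."""
    return k / size_px

print(apparent_distance_mm(300.0))  # ~55 mm for this made-up calibration
```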
Hardware
To test this approach, we designed a 3D printable rig that holds a raspberry pi camera and a white LED. Fig 2 shows 3D renders of the parts comprising the measurement device and Fig 3 shows the experimental setup.
A: a 3W white LED, B: 3D printed enclosure with a plug to hold the light in place, C: Raspberry Pi Spy Camera, D: Raspberry Pi Zero W, E: rigid plate holder, F: experimental setup. Rendered using Fusion360, real setup shown in Fig 3.
Analysis pipeline
Using OpenCV [9], a free and open-source computer vision toolkit, we capture and measure the size of the spotlight image. This can be done on static images or on video, including live feeds. We do this by detecting the central contour in the image and fitting an ellipse to it. The result is fast to compute and robust to occlusion and noise. On the Raspberry Pi Zero we run a video stream using the open-source RPi Cam Web Interface [10]. The Python code for this image analysis is lightweight and can run on the Raspberry Pi itself or on an external system; for external use, the code grabs frames from the MJPEG stream generated by the camera web interface.
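A minimal sketch of this contour-and-ellipse step, assuming OpenCV 4 and a simple brightness threshold; the largest-contour heuristic, the default threshold, and the function name are illustrative and not necessarily identical to the supplementary notebook.

```python
import cv2
import numpy as np
from typing import Optional

def spot_major_axis(frame_bgr: np.ndarray, threshold: int = 200) -> Optional[float]:
    """Fit an ellipse to the bright LED spot and return its major axis in pixels."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    spot = max(contours, key=cv2.contourArea)  # take the dominant bright blob
    if len(spot) < 5:                          # cv2.fitEllipse needs >= 5 points
        return None
    _, axes, _ = cv2.fitEllipse(spot)
    return float(max(axes))

# Frames may come from a file, a local camera, or the MJPEG stream served by
# RPi Cam Web Interface (the URL is deployment-specific):
# cap = cv2.VideoCapture("http://<pi-address>/<mjpeg-stream-path>")
```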
Fig 4 details the process through which a data point is obtained. Once an ellipse is fit to the central contour of the image, we filter the output by accepting the average value of a rolling buffer when the standard deviation of that buffer falls below a threshold value. This compensates for disturbance of the fluid surface when new fluid is added or the plate is shaken. By accepting reads only once the reading has stabilized, we avoid wildly fluctuating, erroneous readings whenever the fluid is disturbed.
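One way such a stability filter might look (the class name, window length, and threshold below are placeholders rather than the values used in the paper):

```python
from collections import deque
import numpy as np

class StableReadFilter:
    """Accept a reading only when recent measurements have settled."""

    def __init__(self, window: int = 30, max_std: float = 1.5):
        self.buffer = deque(maxlen=window)  # rolling buffer of spot sizes (px)
        self.max_std = max_std              # stability threshold (px)

    def update(self, spot_size_px: float):
        self.buffer.append(spot_size_px)
        if len(self.buffer) < self.buffer.maxlen:
            return None                     # buffer not yet full
        if np.std(self.buffer) > self.max_std:
            return None                     # surface still disturbed
        return float(np.mean(self.buffer))  # stable: accept the averaged value
```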
A Jupyter notebook that implements the described pipeline can be found in the supplemental material, and any future updates will be accessible through the GitHub repository linked in the supplemental materials. Through the use of Jupyter widgets, various parameters can be adjusted live to optimize the detection process for any given setup.
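As an example of that interactivity, a threshold slider could be wired up with ipywidgets roughly as follows; spot_major_axis is the detection helper sketched above and the camera source is a placeholder, so this is a sketch rather than the notebook’s exact code.

```python
import cv2
import ipywidgets as widgets
from ipywidgets import interact

cap = cv2.VideoCapture(0)      # placeholder: local camera or MJPEG stream URL
ok, latest_frame = cap.read()  # assumes a frame was successfully captured

@interact(threshold=widgets.IntSlider(min=0, max=255, step=1, value=200))
def preview(threshold):
    # Re-run detection on the most recent frame with the chosen threshold.
    print("spot major axis (px):", spot_major_axis(latest_frame, threshold=threshold))
```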
Results
The plot in Fig 5 shows the output of our sensing system from the first possible reading to the overflowing of the well. Fig 10 shows why the first possible reading from a previously dry well occurs around 0.4 ml. When the fluid level in a well is very low, the geometry of the meniscus cannot fully form. The center of the meniscus rises more slowly until the full meniscus geometry has formed, because fluid volume is held in the meniscus edges. Therefore, before the full formation of the meniscus, the overhead light image viewed through the center of the meniscus grows more slowly, up until point A shown in Fig 5. As the well starts to get full, a new distortion of the meniscus geometry occurs due to the surface tension of the fluid. The image grows substantially faster due to the magnifying properties of the convex meniscus; this phenomenon begins at point B. Finally, when the well overflows, the meniscus flattens, resulting in a final region of unchanging magnification beginning at point C.
Region A shows behavior before the meniscus has reached a stable geometry. Region B has a stable meniscus and shows the linear response predicted in the Operating principle section. Region C begins when the well starts to get full and surface tension causes meniscus inversion. Region D is where fluid has spilled over the edge of the well, flattening the meniscus.
The region between points A and B is where this sensing process will be most generalizable, as the relationship between the spotlight image contour size and the fluid level is highly linear there.
Calibration
Deriving an effective transfer function for this system can be done by fitting a curve to a set of calibration points. Deciding on the best fit curve is a question of trade-offs. The phenomenon we are measuring is linear within the range of volumes that have a stable meniscus geometry. A simple 2-point linear fit using points from the linear region generates outputs with average error below 100 μl within the linear range. Outside this range, the estimation error increases substantially.
To represent the entire curve, we need a higher-order polynomial fit. Using a least squares approximation method, we can fit polynomials to any subset of points we choose. Considering the phenomena shown in Fig 5, the curve we are attempting to characterize has 3 distinct regions, and calibration points should be selected from each of them. Precision measurements of well volume inside the meniscus inversion region are unlikely to be very accurate, as disruptions in surface tension can cause overflow to occur at different points. Therefore, for calibration it should be sufficient to model the linear region occurring at the start of meniscus inversion; this is adequate to detect that the fluid level is greater than the capacity of the well. Since the other 2 regions are linear, we can adequately fit a curve with 2 data points selected from each region. Comparisons of different order curve fits can be seen in Fig 6.
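A sketch of how such fits could be computed with NumPy; the six calibration pairs below are hypothetical values chosen only to mimic the three regions of Fig 5, not measurements from the paper.

```python
import numpy as np
from numpy.polynomial import Polynomial

# Hypothetical calibration points: (spot size in px, dispensed volume in ml),
# two from each region of the response curve.
spot_px = np.array([148.0, 163.0, 170.0, 233.0, 252.0, 281.0])
volume_ml = np.array([0.40, 0.60, 0.80, 2.20, 2.60, 2.90])

# Simple 2-point linear fit using only the central (stable-meniscus) region.
linear_fit = Polynomial.fit(spot_px[2:4], volume_ml[2:4], deg=1)

# 5th-order least-squares fit through all six calibration points.
poly5_fit = Polynomial.fit(spot_px, volume_ml, deg=5)

print(linear_fit(200.0), poly5_fit(200.0))  # estimated volume (ml) at a 200 px spot
```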
Generalizability of transfer function.
To evaluate the consistency of this measurement principle, data were captured in 2 different wells during 2 different runs. The measurement error along these runs is shown in Fig 7.
The percentage error values of each fit are examined in Fig 6 and in Table 1. The error from the 2-point linear fit is substantial outside the central linear range, meaning the fit obtained by the simplest calibration process is not usable if the desired measurement range spans the entire well. However, when we constrain our test points to the linear range (as shown in Table 2), the measurement error is more reasonable. Having the option of a simple 2-point calibration for certain uses is an advantage. The other polynomial fits use the 6 points shown in the curve in Fig 6. Unsurprisingly, we observe that a higher order polynomial fit results in lower error, but beyond 4th order we experience diminishing returns.
In summary, we find that a 5th-order least squares polynomial fit applied to 6 points (2 points in the low region, 2 points in the mid region, and 2 points in the end region) results in a transfer function that gives a mean error of 5.51% and a standard deviation of 6.21%. This is a reasonable error for a low cost, low complexity system. The error is also consistent across multiple runs, indicating that the transfer function generalizes to different wells and different runs. When constrained to the central linear region, this transfer function has a mean error of 4.62% and a standard deviation of 3.10%.
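For reference, summary statistics of this kind can be computed from per-point absolute percentage errors; a minimal helper (names and signature are ours) might look like this:

```python
import numpy as np

def percent_error_stats(estimated_ml, dispensed_ml):
    """Mean and standard deviation of absolute percentage error."""
    est = np.asarray(estimated_ml, dtype=float)
    ref = np.asarray(dispensed_ml, dtype=float)
    err = 100.0 * np.abs(est - ref) / ref
    return float(err.mean()), float(err.std())
```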
Alternate testing conditions
Oil.
The data presented so far were captured under ideal conditions, using distilled water on a level surface as the fluid of interest. To demonstrate the generalizability of the measurement principle, we also tested food grade canola oil. The results are shown in Fig 8. Due to the cloudiness of the oil, the initial threshold step used in our analysis pipeline (Fig 4) needed to have its value adjusted; the code included in the supplementary material provides a simple slider for this threshold. Once an appropriate value was found, the measurement principle held and the results were more linear than with water. This is likely because the oil produces a less pronounced meniscus than water. This is encouraging, as it demonstrates that certain fluids can yield very accurate measurements with simple calibration methods.
5° tilted surface.
The theoretical analysis shown in Fig 1 indicates that we should expect a linear response absent the previously discussed meniscus effects. However, this analysis assumes a level surface. If the well is tilted, the relationship between fluid depth and volume becomes skewed. At low volumes, the area of the fluid surface is smaller and less fluid is required to increase depth; as depth increases, the fluid surface area grows and more fluid is required to increase depth. Once the fluid surface reaches the edge of the well, the relationship between depth and volume should become linear again until the well overflows at a volume lower than it would on a flat surface. We tested the response on a 5 degree tilt, and the result compared to a flat surface is shown in Fig 9. The primary visible impact is a positive bias: if we were to analyze the tilted plate with a transfer function derived from a flat calibration, we would overestimate the volume. However, the fundamental principle of the measurement is still valid, and a transfer function derived from a tilted calibration would be accurate.
Dry well.
When a well is completely dry, droplets hold their shape rather than forming a layer along the bottom. Since this approach relies on the lens-like properties of a fluid layer, it does not work when the fluid in question has not formed a complete layer across the base of the well. The optical consequences of an incomplete fluid layer are shown in Fig 10. Once 0.5 ml has been placed in the dry well, a layer adheres to the bottom and the measurement starts to work; this is shown in Fig 10. The well can also be moistened beforehand with a small amount of water, which disrupts the formation of droplets and allows measurements to work at smaller fluid volumes.
Discussion
Comparison to other methods
One of the most common ways to sense the height of a fluid’s surface is to use optical or ultrasonic time of flight sensors. These sensors interpret the time delay between signal emission and the return of the reflected signal to determine the distance to the surface. This method is highly effective. What is advantageous about the method presented here is that it uses hardware that may already be present in the system. For instance, some lab-on-a-chip microfluidic systems contain imaging modules as well as the materials to move fluids [13]. In systems like this, an overhead LED and a camera may already be present for other imaging needs, and the approach presented here can then be implemented with no additional hardware. By contrast, an array of time of flight sensors would represent a considerable increase in cost and system complexity.
Modeling of the fluid
While the results presented here rely on calibration to numerically determine a suitable transfer function, system characterization could theoretically be done through detailed modeling. Compared to the calibration approach, this would require detailed information about system parameters such as camera lens optics, fluid properties, well geometry, and the optical properties of the cell plate material. However, if such a model were developed with sufficient accuracy, it could offer additional insights and error checking. For instance, if the response of a fluid in the well deviates from the model’s prediction, that may indicate mix-ups or mistakes in the fluid handling process. The additional complexity of the modeling approach versus pure calibration-based fitting would make it feasible only for large-scale applications using the same system and fluids. The calibration-based approach is more generalizable and can be applied to any system with a camera, an LED, and a well plate.
Conclusion
By leveraging the availability of low cost camera hardware and open source machine vision tools, devices that use computer vision for sensing tasks are likely to become more ubiquitous. The proliferation of cheap mass-produced camera sensors is an opportunity for designers to consider many new methods of measurement.
The device shown here serves as a simple proof of concept for this fluid level measurement principle. For practical purposes, it should be deployed with many cameras in parallel, each monitoring a single well. Previous work has laid out the design and use of a 24 well parallel microscope system, the “Picroscope,” which uses similar camera hardware and LEDs [11, 12]. The Picroscope is also built to be compatible with a microfluidic cell culture feeding platform [13]. Applying this principle to that system could enable feedback control over the fluid contents of the wells in this and other microfluidic “lab-on-a-chip” type systems.
It would also be possible to use this principle without the overhead LED being permanently affixed above the culture plate. In applications using pipetting robots, an LED could be attached to the end effector; if the arm positions the LED at a known location, spot measurements can be taken and used to detect potential dosing errors or losses due to evaporation.
This proposed fluid level sensing method is a simple, reliable, and accurate approach that has been validated for cell culture plates and has potential for uses in other applications.
Supporting information
S1 Appendix. Distance function derivation.
Detailed derivation of image distance transfer function.
https://doi.org/10.1371/journal.pone.0290951.s001
(PDF)
S1 File. Sensor data.
Data from sensing pipeline used to generate plots shown in the paper.
https://doi.org/10.1371/journal.pone.0290951.s002
(ZIP)
S2 File. Jupyter notebook.
Notebook used for live image capture and processing. Allows for calibration and testing of the sensing pipeline. Current version can be found at https://github.com/pvbaudin/Optical-fluid-level-measurement-for-culture-plates.
https://doi.org/10.1371/journal.pone.0290951.s003
(IPYNB)
Acknowledgments
P.V. Baudin thanks David Haussler and Sofie Salama for the support and guidance they provide to the many projects in our research group.
References
- 1. Doulgkeroglou MN, Di Nubila A, Niessing B, König N, Schmitt RH, Damen J, et al. Automation, Monitoring, and Standardization of Cell Product Manufacturing. Frontiers in Bioengineering and Biotechnology. 2020;8. pmid:32766229
- 2. Figeys D, Pinto D. Lab-on-a-Chip: A Revolution in Biological and Medical Sciences. Analytical Chemistry. 2000;72(9):330 A–335 A. pmid:10815945
- 3. Fleischer H, Baumann D, Joshi S, Chu X, Roddelkopf T, Klos M, et al. Analytical Measurements and Efficient Process Generation Using a Dual–Arm Robot Equipped with Electronic Pipettes. Energies. 2018;11(10):2567.
- 4. Steffens S, Nüßer L, Seiler TB, Ruchter N, Schumann M, Döring R, et al. A versatile and low-cost open source pipetting robot for automation of toxicological and ecotoxicological bioassays. PLOS ONE. 2017;12(6):e0179636. pmid:28622373
- 5. Singh Y, Raghuwanshi SK, Kumar S. Review on Liquid-level Measurement and Level Transmitter Using Conventional and Optical Techniques. IETE Technical Review. 2019;36(4):329–340.
- 6. Lincoln CK, Gabridge MG. Chapter 4 Cell Culture Contamination: Sources, Consequences, Prevention, and Elimination. In: Mather JP, Barnes D, editors. Methods in Cell Biology. vol. 57 of Animal Cell Culture Methods. Academic Press; 1998. p. 49–65. Available from: https://www.sciencedirect.com/science/article/pii/S0091679X0861571X.
- 7. Jain U, Gauthier A, van der Meer D. Total-internal-reflection deflectometry for measuring small deflections of a fluid surface. Experiments in Fluids. 2021;62(11):235.
- 8. Litt GJ. Visualization device; 1989. Available from: https://patents.google.com/patent/US4824230A/en.
- 9. Bradski G. The OpenCV Library. Dr Dobb’s Journal of Software Tools. 2000.
- 10. RPi-Cam-Web-Interface—eLinux.org. Available from: https://elinux.org/RPi-Cam-Web-Interface.
- 11. Ly VT, Baudin PV, Pansodtee P, Jung EA, Voitiuk K, Rosen YM, et al. Picroscope: low-cost system for simultaneous longitudinal biological imaging. Communications Biology. 2021;4(1):1–11. pmid:34737378
- 12. Baudin PV, Ly VT, Pansodtee P, Jung EA, Currie R, Hoffman R, et al. Low cost cloud based remote microscopy for biological sciences. Internet of Things. 2021; p. 100454.
- 13. Seiler ST, Mantalas GL, Selberg J, Cordero S, Torres-Montoya S, Baudin PV, et al. Modular automated microfluidic cell culture platform reduces glycolytic stress in cerebral cortex organoids; 2022. Available from: https://www.nature.com/articles/s41598-022-20096-9.