
Field performance of the GaugeCam image-based water level measurement system

  • François Birgand ,

    Contributed equally to this work with: François Birgand, Ken Chapman, Arnab Hazra, Ana-Maria Staicu

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    birgand@ncsu.edu

    Affiliation Department of Biological & Agricultural Engineering, NC State University, Raleigh, NC, United States of America

  • Ken Chapman ,

    Contributed equally to this work with: François Birgand, Ken Chapman, Arnab Hazra, Ana-Maria Staicu

    Roles Conceptualization, Data curation, Methodology, Software, Validation, Writing – review & editing

    Affiliation Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE, United States of America

  • Arnab Hazra ,

    Contributed equally to this work with: François Birgand, Ken Chapman, Arnab Hazra, Ana-Maria Staicu

    Roles Data curation, Investigation, Methodology, Software, Validation, Writing – original draft

    Affiliation Department of Mathematics and Statistics, Indian Institute of Technology Kanpur, Kanpur, India

  • Troy Gilmore ,

    Roles Conceptualization, Investigation, Methodology, Validation, Writing – review & editing

    ‡ TG and RE also contributed equally to this work.

    Affiliation Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE, United States of America

  • Randall Etheridge ,

    Roles Data curation, Funding acquisition, Investigation, Resources, Visualization, Writing – review & editing

    ‡ TG and RE also contributed equally to this work.

    Affiliation Department of Engineering, East Carolina University, Greenville, NC, United States of America

  • Ana-Maria Staicu

    Contributed equally to this work with: François Birgand, Ken Chapman, Arnab Hazra, Ana-Maria Staicu

    Roles Data curation, Formal analysis, Methodology, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Department of Statistics, NC State University, Raleigh, NC, United States of America

Abstract

Image-based stage and discharge measuring systems are among the most promising new non-contact technologies available for long-term hydrological monitoring. This article evaluates and reports the long-term performance of the GaugeCam (www.gaugecam.org) image-based stage measuring system in situ. For this we installed and evaluated the system over several months in a tidal marsh to obtain a good stratification of the measured stages. Our evaluation shows that the GaugeCam system was able to measure within about ±5 mm for a 90% confidence interval over a range of about 1 m in a tidal creek in a remote location of North Carolina, USA. Our results show that the GaugeCam system nearly performed to the desired design accuracy of ±3 mm around 70% of the time. The system uses a dedicated target background for calibration and geometrical perspective correction of images, as well as auto-correction to compensate for camera movement. The correction systems performed well overall, although our results show a ‘croissant-shaped’ mean error (-1 to +4 mm) varying with water stage. We attribute this to the small, yet present, ‘fish-eye’ effect embedded in images, for which our system did not entirely correct in the tested version, and which might affect all image-based water level measurement systems.

Introduction

Much of what we know in hydrology today takes root, at one point or another, in the ability to observe and record the rapid changes in water flow or discharge. To this day, most flow rates recorded at the thousands of hydrologic stations around the world are still calculated from water level or stage measurements combined with stage-discharge relationships established at each monitoring station. As such, the measurement of stages will always remain a central part of hydrological monitoring.

Image-based measurements of discharge and water level have recently been given particular attention, although the idea is not new [1–11]. The main advantages compared to other techniques include access to verifiable data, as the human eye can read or verify each data point, non-contact sensing, as the camera is safely placed away from the water, and, in the case of short videos, access to the velocity field at the surface of the water. The latter is particularly important in cases or areas where establishing a stage-discharge rating curve is difficult, if possible at all. Research reported on image-based hydrological monitoring tends to focus either on Large Scale Particle Image Velocimetry (LS-PIV) to compute discharge [12–28], on discharge measurements using machine learning [29], on flood detection from surveillance cameras [30–33], or on the stage measurement itself [2, 34–43], from which discharge is often calculated.

Most articles on image-based stage measurements report a variety of techniques used to monitor stages but provide relatively few details on the systems’ actual performance and uncertainties. This article reports the performance of the GaugeCam (www.gaugecam.org) image-based stage measuring system in the field over many weeks. This system was originally designed to match the performance of other water level measurement technologies commonly used in relatively small streams and rivers (< 10 m wide). This article expands on a previous report focusing on the sources and uncertainties of this system in the lab [36], and evaluates the long-term performance and calculates the uncertainties of the GaugeCam system in a remote location in the field.

Method

GaugeCam design objectives, philosophy, and system constraints

The GaugeCam system was designed to 1) measure water stages with a precision of ±3 mm in relatively small streams and rivers (<10 m wide), wetlands and lakes, using images obtained from time lapse cameras; 2) have the system be fully solar powered using a 50W solar panel to be installed virtually anywhere in the world; 3) have images automatically streamed via cellular phone network to a server; and 4) have images analyzed on the server with calculated stage and corresponding images stored and available on any Internet client.

Traditional (non image-based) water level measurement techniques (e.g., mechanical pulley and float based systems, pressure transducers immersed in water, pressure transducers in air in equilibrium with the water column or bubblers, radar systems, ultrasonic systems placed above or under the water, reviewed by [44, 45]), involve the interpretation of the raw signal (e.g., voltage, time, etc.) in the field. Real world stage level values are thus calculated from the raw signal internally by the instruments via a calibration or rating system. However, all instruments and signals in the field tend to drift over time, rendering necessary frequent calibration of the instruments in situ by qualified personnel. A major advantage of image-based systems is that the raw signal in an image (the pixels) does not drift relative to the calibration system (i.e., machine vision algorithm; details below). As a result, maintenance does not require highly qualified personnel because there is no need for on-site calibration. The timing for and necessity of maintenance can be detected by the user (i.e., seen on the images as a large camera move, obstructed view, etc.), and every single measurement is visually verifiable. The translation of the raw signal into real world stages is not impacted by field conditions and can theoretically be done in the field (edge computing) or on a server (cloud computing). We chose the latter for the GaugeCam system.

The design constraints imposed finding a time lapse camera that would have minimal power consumption, would be in sleeping mode most of the time, and at each given time interval would wake up to take a picture, day and night, save it locally, and transfer it remotely. Another constraint on the camera was the ability to take images of a size small enough to capture the desired Region of Interest (ROI) and the contextual scene while minimizing the monthly cellular bandwidth consumption, yet with high enough resolution and precision for the calculated stage values.

All reported image-based water level measurement systems detect the waterline on an image thanks to the sharp contrast in the distribution of the pixel grey values above and below the water line. In many systems, the ROI where the waterline is sought corresponds to the gauge staff itself [2, 37, 38, 46–49]. This makes a lot of sense as gauge staffs are always present at hydrological stations. However, the graduations on the gauge staffs tend to yield a rather heterogeneous distribution of the pixel grey scale above the water line, which can make the image analysis challenging, although measurement errors have been reported to be around ±1 cm in good conditions [49]. In a few reports, the goal is to obtain a rough estimate of flow/stage, and/or images do not have a specific target for water level detection [34, 50–53]. To give the highest potential for optimal accuracy, others use a dedicated target [35–37, 54; this article], i.e., a vertical white plane installed in the field at which the camera aims.

GaugeCam measurement principles

The details of the principles of the GaugeCam measurement and software (GRIME2) are available in Chapman et al. [55]. In short, on a white or light grey colored vertical target plane installed in a water body, water forms a very crisp water line. This is particularly visible in an image taken in the general horizontal direction perpendicular to the target. Because of the light absorption of water (even if it is transparent), the pixels corresponding to water have a dark shade, usually much darker than those above corresponding to the white background of the target. The sharp contrast in the pixel grey scale above and below the water line is used for automatic detection of the water line, as sketched below. The second principle of the GaugeCam system is the ability to automatically translate the location of the water line expressed in pixel coordinates into real world coordinates. For this, the GaugeCam system uses a set of eight bow-tie-shaped fiducials embedded on the target in two columns of four rows, leaving a blank column between the two fiducial columns (Fig 1). The GaugeCam software automatically detects the centers of the fiducials, for which the real world coordinates are known. The fiducials are also used for automatic correction of image movements (details herein).
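
The following minimal sketch illustrates this contrast principle (in Python for brevity; GRIME2 itself is C++, and the function name and kernel size here are illustrative, not the GRIME2 implementation):

```python
import numpy as np

def find_waterline_row(gray_roi: np.ndarray, column: int, kernel: int = 3) -> int:
    """Row of the strongest light-to-dark vertical transition in one pixel
    column of a greyscale ROI (row 0 is the top of the image): white target
    above the waterline, darker water below it."""
    col = gray_roi[:, column].astype(np.float32)
    # Brightness drop over a small vertical window; the maximum marks the
    # target-to-water transition.
    drop = col[:-kernel] - col[kernel:]
    return int(np.argmax(drop)) + kernel // 2
```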

Fig 1. Overall installation setup of the camera and the dedicated GaugeCam target background.

A- Inside of the Lookout V camera; B- installation in the field of the camera with the illuminator installed on the side; C- both aiming at the dedicated background; D- installed on the opposite bank of the tidal creek.

https://doi.org/10.1371/journal.pwat.0000032.g001

Field site, camera setup and maintenance

The site chosen to evaluate the performance of the GaugeCam system in the field was a tidal marsh located in Carteret County, North Carolina (34°49’03.7“N 76°36’16.5”W), where the diurnal tidal amplitude was generally 0.7 m but reached up to 1.2 m. Thanks to the tides, all stages were relatively equally represented. The evaluation period spanned from 9 February 2012 to 24 July 2012, with a gap during the first three weeks of May 2012 because of a power failure.

The camera was a Lookout V wireless internet camera from Colorado Video (http://www.colorado-video.com) equipped with a CMOS (complementary metal oxide semiconductor) image sensor and a color filter to block the infrared (IR) wavelengths during the daytime (Fig 1A). The filter was mechanically removed at night to allow the IR to pass to the sensor and to change the focus, since IR and visible light have slightly different wavelengths. The camera was mounted on a heavy metal pole and fitted with an LED illuminator installed about 70 cm to the side to avoid direct glare on pictures at night (Fig 1B & 1C). The camera was installed on the right bank of the tidal creek, 1.6 m above ground and about 5.2 m from the GaugeCam target installed on the opposite left bank, facing at 240° to the southwest (Fig 1D). The installation height was chosen to minimize chances of interference with the vegetation and wildlife, and to keep the camera safely above the highest (wind driven) tides.

A commercial company printed the GaugeCam target patterns on an exterior grade white plastic material and laminated it onto a 0.95 cm (3/8") thick, 1.25 × 1.1 m Plexiglas sheet. This was done after assurance that the black patterns would be visible to the camera under infrared lighting at night. The GaugeCam target was carefully installed against 10 × 10 cm wooden posts driven into the stream bank and kept vertical thanks to wooden stakes driven at about a 45° angle into the bank and secured with lag bolts against the posts (Fig 1D). The GaugeCam target was installed such that it intercepted the vast majority of the lowest and highest stages. The water line served as the leveling system during installation, and it was visually determined that there was less than 1 mm difference in elevation between mirror bow-ties.

The camera faced downward, yielding view angles from a horizontal plane varying from 8.1° at the top of the target (real world coordinate 137.6 cm) to 21.0° at its bottom (real world coordinate 11 cm). A gauge staff was installed on one of the posts holding the GaugeCam target such that the real-world coordinates could be directly read on images.

The images taken were 800×600 pixels and varied between 35 and 70 kB in size (night images took less memory because of the many black pixels). The camera was fitted with a 12 mm lens to flatten the images and limit the ‘fish-eye’ effect, which yielded a resolution of about 2.4 mm per pixel. The whole system was powered by a 12 V 8 Ah ‘brick’ battery recharged by a 50 W solar panel. The camera was able to send images wirelessly via GPRS using 2G technology (the camera was discontinued in 2016 as a result of the 2G phase-out), and to physically store images on an SD card for backup purposes. The target was cleaned every other week (during other normal maintenance activities) using a brush to remove biological fouling that may have formed over the two-week periods. The camera protective glass was wiped at the same time, and the SD card was collected and replaced to keep a physical backup of the images.

Calibration image and pixel to world coordinate calibration

One of the weak points of image-based stage measurements is the inevitable (small) movements of the camera in the field associated with wind, temperature changes, camera maintenance, optical filter movement, etc. To correct for these, the GaugeCam system uses a ‘calibration image’ to which all other ‘working images’ are referred back after the water line has been detected and measured. It is on the calibration image that the translation between pixel and real world coordinates is performed. A calibration image is normally chosen as an image with the best contrast (Fig 2A), but calibration can be performed in less than ideal conditions if necessary (Fig 2B). On the calibration image, a template search is performed to find the pixel coordinate positions of the centers of the bow-ties in the image (green circles in Fig 2A & 2B). The real world coordinates of these bow-tie centers are known and used to calculate the perspective transform coefficients [56] and/or the linear interpolation association points [57], from which pixel to world coordinate transformations are performed, as illustrated by the yellow lines and labels in Fig 2A & 2B, and stored in a calibration file. The position of a water level search area (blue ROI in Fig 2A & 2B) is calculated and written to the calibration file. Finally, two ROIs around the two top bow-ties (framed in red in Fig 2A & 2B) are recorded in the calibration file. A template search for the bow-ties in these ROIs is performed at runtime on working images to determine how much the target has moved compared to the calibration image. An adjustment is made to the stage measurements to accommodate any such movement (further details in [55]), as sketched below.
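
The sketch below illustrates the pixel-to-world step with a homography computed from the eight detected bow-tie centers (Python for brevity; all coordinates are made up for illustration, with the world grid anchored to the 11–137.6 cm target extent mentioned above):

```python
import numpy as np
import cv2

# Detected bow-tie centers in pixel coordinates (hypothetical values).
pixel_pts = np.array([[102, 71], [412, 69], [104, 233], [409, 231],
                      [107, 396], [406, 395], [110, 558], [403, 557]],
                     dtype=np.float32)
# Known real world coordinates of the same centers, in cm (illustrative grid).
world_pts = np.array([[0, 137.6], [80, 137.6], [0, 95.4], [80, 95.4],
                      [0, 53.2], [80, 53.2], [0, 11.0], [80, 11.0]],
                     dtype=np.float32)

H, _ = cv2.findHomography(pixel_pts, world_pts)  # pixel -> world transform

def pixel_to_world(pt_xy):
    """Apply the calibration homography to one pixel coordinate."""
    p = np.array([[pt_xy]], dtype=np.float32)    # shape (1, 1, 2)
    return cv2.perspectiveTransform(p, H)[0, 0]  # (x_cm, y_cm)

# The vertical world coordinate of a detected waterline point is the stage.
print(pixel_to_world((256.0, 300.0)))
```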

Fig 2. Pattern detection on calibration images (A&B) and water level measurement on working image (C).

On a calibration image, the coordinates of the centers of the bow-ties are automatically detected (green circles; A&B), from which the real-world coordinates calibration grid can be calculated (yellow lines and labels; A&B). The blue ROI defines the area where the water line is normally searched, and the top two bow-ties framed in red are used to quantify the movement of ’working images’ compared to the calibration one. The calibration image is normally chosen as an image with the best contrast (A), but calibration can be performed in less than ideal conditions if necessary (B). C- points found to correspond to the water line (yellow circles); the line best fitting through these points (blue line); the point on the blue line whose coordinates are used to calculate the initial stage value, reported as level in the top left part of the image; the thick red and the thin green segments representing the locations of the top fiducials of the calibration and the working images, respectively; the difference in stage between these segments is calculated, reported as adjust, and used to calculate the final and recorded stage reported as Level (adj).

https://doi.org/10.1371/journal.pwat.0000032.g002

Water level search

The line finder algorithm operates within the blue ROI illustrated in Fig 2. The ROI is divided into vertical search lines, one for every column of pixels in the ROI. Five arguments are passed to the algorithm to control the way it performs the line search (further details in [55]). The five arguments and what they control are 1) the direction of search (whether the search should start in the water and proceed upward, or the opposite); 2) the edge direction (dark to light edge transition, or the opposite); 3) the edge magnitude (the minimum water line pixel brightness); 4) the edge kernel size (the size in vertical pixels of the edge search); and 5) the edge to select (which edge to keep among all the edges detected).

After a set of edge points is found in each of the vertical columns of the search ROI, the OpenCV fitLine [58] method is called to fit a line through the points. If the line is at a reasonable angle relative to a typical water level line, then the find is set as successful. If not enough points are found to perform a line fit, or the calculated line is not at a correct angle, the line search fails and is reported as ‘no line detected’. The point (in pixel coordinates) on the line at the half-way point between the left and right edges is passed to the pixel to world transform model. The vertical position of the chosen point is reported in world coordinates as the water level. Because of the fitLine approach in the pixel coordinates, the coordinates of the chosen point do not have to be integers but can be real numbers, i.e., expressed in fractions of pixels. The resolution on the final number can thus theoretically be sub-pixel, or smaller than the pixel resolution (2.4 mm in this case; Fig 2C), as sketched below.
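
A sketch of this fit-and-evaluate step (Python for brevity; it uses the same OpenCV fitLine call, but the angle threshold and names are illustrative):

```python
import numpy as np
import cv2

def waterline_at_midpoint(edge_pts, roi_left, roi_right, max_angle_deg=3.0):
    """Fit a line through (x, y) edge points found on the waterline and
    return the sub-pixel point on it at mid-ROI, or None when the slope is
    implausible for a water line (the 'no line detected' case)."""
    vx, vy, x0, y0 = cv2.fitLine(np.asarray(edge_pts, dtype=np.float32),
                                 cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    angle = abs(np.degrees(np.arctan2(vy, vx)))
    if min(angle, 180.0 - angle) > max_angle_deg:
        return None                       # reject lines too far from horizontal
    x_mid = 0.5 * (roi_left + roi_right)  # half-way between the ROI edges
    y_mid = y0 + (x_mid - x0) * vy / vx   # evaluate the fitted line there
    return x_mid, float(y_mid)            # pixel coords, possibly fractional
```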

Stage calculation illustration

Following the processes described above, on a working image, points corresponding to the water line along the given number of vertical pixel columns in the search ROI are first detected (ten yellow circles at the water line in Fig 2C). A line is then fit through these points (blue line in Fig 2C), having an equation with slope and intercept values expressed in pixel coordinates. The stage value in pixel coordinates is calculated as the ordinate value corresponding to the abscissa of the pixel at the half-way point between the left and right edges of the search ROI (red crossed square in Fig 2C). This pixel ordinate is then transformed into a stage value in real world coordinates, using the perspective transform coefficients established from the calibration image, and reported as Level: in the top left of an analyzed image (in cm in Fig 2C).

This analysis is thus done as if the fiducials in the working image were perfectly superimposed over those of the calibration image. In reality, this is rarely the case as images always end up ‘moving’, or more appropriately, the camera’s small movements cause the images to ‘move’. The location of the top two fiducials on the calibration image is illustrated by the extremities of the thick red line at the top of the image in Fig 2C. The location of the top fiducials on the working image corresponds to the extremities of the green segment. It turns out that in Fig 2C, both lines appear superimposed. The difference in stage between these two segments is calculated as Adjust: in the top left corner of an image. The final and recorded stage is calculated as the stage value in Level minus the Adjust value and appears as Level (adj): in the top left corner of an image (Fig 2C).

Reference visual measurements

To evaluate the performance of the GaugeCam system, stages from 1,086 images staggered in time were visually read on the gauge staff visible on all pictures. This included pictures obtained at all times of the day and night, during early morning haziness, fog, winter conditions, and rain. However, this procedure is subject to the same potential errors as the automatic stage detection system, i.e., the variable interaction between the incident light and the meniscus on the appearance of the water edge in an image (details below). The resolution of 2.4 mm per pixel also limited the overall reading accuracy to about this resolution, although the brain finds ways to extrapolate. The 1 cm black ticks of the staff gauge also tend to appear a little smaller than the white intervals on a low resolution jpeg image. As a result, the visual readings are not flawless and have their own embedded uncertainty. Nonetheless, these readings were done keeping all these potential errors in mind and were used as references to evaluate the GaugeCam system.

Comparison with other sensors

Stages and discharge in the tidal creek were actually measured over a longer period of 13 months (March 2011 to November 2012), within which the visual readings were done for performance evaluation (February to July 2012). Over the 13 months, the stages in the tidal creek were monitored with several other sensors, including a vented pressure transducer referred to as ‘ISCO’ herein (ISCO 750 module associated with an ISCO 6712, Lincoln, NE, USA; details in [59]), which was eventually replaced by a dual ultrasonic/pressure sensor referred to as ‘Sontek’ herein (Sontek IQ, Xylem Inc., Rye Brook, NY, USA). For the latter, stage was primarily calculated from the transit time of ultrasound between the sensor immersed at the bottom and the water surface, and was internally corroborated with the stage measured by the embedded pressure transducer. A third, non-vented pressure transducer referred to as ‘HOBO’ herein (HOBO U20 Water Level Logger, Onset Computer Corporation, Bourne, MA, USA) was also used. A total of 35,849, 7,099, and 14,068 data points were used for the comparisons between the GaugeCam system and, respectively, the ISCO, Sontek, and HOBO sensors. The GaugeCam system was originally used as a backup system and to verify data.

Sources of and hypotheses for measurement uncertainties

There are two main sources of uncertainties: the first ‘natural’ source associated with the impact of the variable interaction between the incident light and the meniscus on the appearance of the water edge in an image, and the second associated with the GaugeCam series of actions from image taking to deriving stage values.

At the interface between water and the vertical target, a horizontal meniscus is formed because the surface tension forces raise the contact of the water with the target 2–3 mm above the actual stage [36]. Because of the curved shape of the meniscus, the incident light (direct sun and diffuse light, and IR illuminator) reflected back to the camera may vary depending on the angle of the incident light itself, but also on the view angle, itself linked to the stage. During daytime, when diffuse and direct sunlight tend to bring light from a steep vertical angle, the meniscus may reflect bright light to the camera and exhibit pixels of very light grey, possibly compensating for the increased height due to the meniscus. Conversely, at night, when the light is provided by the IR illuminator at a generally much flatter angle, the meniscus may tend to appear darker, possibly making the water edge appear higher on an image than it really is.

Consequently, we hypothesized that because of the change in the incident light angle, the errors on nighttime and daytime pictures should be significantly different. The corollary hypotheses are that the nighttime measurements should generally have a positive mean error, i.e., overestimate the stage, and that the daytime measurements should generally have a mean error near zero. For this analysis, daytime images were those taken between 8:30 and 17:45 (~local solar time), and nighttime images those taken between 19:45 and 4:30 the next day. Images from other times were also evaluated and are referred to as dawn/dusk.

Uncertainties inherent to the GaugeCam system are numerous and include uncertainties due to the finding of the fiducial centers, the corrections for perspective, the ‘fish-eye’ effect and the pixel to real world coordinate matrix, the stage correction for camera movement compared to the calibration image, the water edge detection, and the coordinates of the central point of the line best fitting the water edge points. It would be futile, if possible at all, to study each separately to calculate an overall uncertainty. Instead, it might be possible to detect the effect of uncertainties due to the GaugeCam system among the overall observed uncertainties. In particular, it became apparent that there was significant movement of the images at night. This was attributed to the mechanical removal of the anti-IR filter in front of the CMOS sensor, which created enough vibrations for the images to move significantly and somewhat randomly at night. Moreover, the night picture sizes tended to be smaller because of lower contrast. The errors on the coordinates of the top two bow-tie centers were thus likely higher than those for daylight pictures, possibly inducing greater errors during stage correction on night pictures.

As a consequence of these observations, we hypothesized that because the nighttime pictures had to be corrected more often than daytime pictures, due to the movement associated with the mechanical filter removal, the standard deviation of their errors should be larger than that of the daytime ones.

Also, the correction for perspective is naturally higher at a steeper angle of view, i.e., for lower stage values in the study case [36]. We thus also hypothesized that, because of image distortion, the distribution of errors (mean and standard deviation) for measurements at low stages (less than 30 cm) should be different from that at high stages (greater than 70 cm).

Statistical analysis

All analyses were performed using the R software [60]. The objectives of the statistical analyses were to identify potential sources of uncertainties and to quantify them. Practical results sought included whether and how errors would differ between daylight and nighttime pictures, and between high and low stages, as a result of the perceived importance of the meniscus described above. Additionally, the analysis sought to quantify the errors expressed as ±X mm of the true stage, and whether and how these errors might be distributed.

First, measurement errors were calculated as the difference between stage values measured with the GaugeCam system and those measured visually. As such, 1,086 error values were obtained. It was assumed that errors outside ±2 cm (due to the detection of the line on fouling lines rather than on water, blurry pictures due to dew on the lens, or fog) were not representative of the capabilities of the GaugeCam system and were removed for the statistical analysis. In the end, 1,033 images were used for the statistical analysis (see the sketch below). The time dependence among errors was checked and considered negligible (see S1 File).
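
A minimal sketch of this pre-processing step (Python for brevity, with hypothetical file and column names):

```python
import pandas as pd

# Hypothetical file with one row per visually read image.
df = pd.read_csv("readings.csv")                      # columns: gaugecam_cm, visual_cm
df["error_cm"] = df["gaugecam_cm"] - df["visual_cm"]  # 1,086 paired readings
kept = df[df["error_cm"].abs() <= 2.0]                # 1,033 rows remain
```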

Data description and pre-processing

Because of a camera malfunction for three weeks in May 2012, no images were available over that period and the camera was brought in for repair. A 21-day gap thus appeared in the time series between the 529th and the 530th image. The camera maintenance required removal and re-installation, which inevitably caused the images to have ‘moved’ compared to the reference image before and after the 530th image. To measure the impact of this action on the performance of the system, an indicator of the ‘repair’ point was kept, taking the value 1 after the repair and 0 up to the 529th observation.

Statistical model

A test of the normality of the data revealed that the empirical distribution was ‘heavy-tailed’. We posited a Student’s t-distribution to describe the distribution of errors (S2 Fig). Using xt to denote the (known) true water level at the t-th time point, we modeled the GaugeCam error Yt, given the indicators and xt, for the t-th time point as Yt ~ Tν(μt, σt), where Tν(μt, σt) denotes the Student’s t distribution with location parameter μt (the equivalent of the mean for a normal distribution), scale parameter σt (the equivalent of the standard deviation for a normal distribution), and ν degrees of freedom (when ν→∞, Tν(μ, σ)→N(μ, σ)).
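
As a quick illustration of the heavy-tail diagnostic, a pooled location-scale t fit can be obtained as below (Python for brevity, whereas the published analyses were done in R [60]; the full model described below lets μt and σt vary with covariates):

```python
import numpy as np
from scipy import stats

# errors: the 1,033 GaugeCam-minus-visual errors (cm), taken here from the
# pre-processing sketch above.
errors = kept["error_cm"].to_numpy()
nu, loc, scale = stats.t.fit(errors)  # MLE of (degrees of freedom, mu, sigma)
# A fitted nu well below 30 indicates tails heavier than a Gaussian's.
print(f"nu = {nu:.1f}, location = {loc:.3f} cm, scale = {scale:.3f} cm")
```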

We first plotted the errors as a function of the reference (visual) water stage (blue = daylight, dark grey = nighttime, and cinereous = dusk/dawn times; Fig 3A). This revealed that the errors heavily depended upon the stage. We then fitted a regression model allowing the mean GaugeCam error to be a smooth function of stage, using a linear combination of a set of B-splines. We considered 8 knots, which approximately allowed one knot per 10 cm range of the true water level. The fitted mean profile along with point-wise 95% confidence intervals (based on a normal distribution assumption for the error components of the regression model) are represented by the red line and ribbon, respectively, in Fig 3A. The differences between the errors and the smooth function of the mean GaugeCam errors (referred to as residuals in Fig 3) were then plotted before and after the repair (Fig 3B), and as a function of the time of the day (Fig 3C).

Fig 3. Initial analysis of the GaugeCam errors and residuals.

A- GaugeCam errors (Y-axis) vs. the reference water level (X-axis), along with a fitted smooth regression curve and pointwise 95% confidence intervals (red); the blue, dark grey, and cinereous dots represent the daytime, nighttime, and dawn/dusk observations, respectively. (B) Boxplots of the residuals (based on the regression of GaugeCam errors on the true water level) before and after the changepoint, and (C) boxplots of the residuals for daytime, nighttime, and dawn and dusk times of the day.

https://doi.org/10.1371/journal.pwat.0000032.g003

From these results, it appeared that 1) the mean of the errors heavily depended on the water stage, 2) it might depend on the before- and after-repair periods, but 3) the standard deviation might not, and 4) the mean of the errors might not depend much on the time of the day. Also, it appeared that 5) the equivalent of the standard deviation might be a bit larger for nighttime than for daytime pictures.

This supported our initial hypotheses. We defined the indicators 1(t>repairpoint), 1(t∈daylight), 1(t∈nighttime) and 1(t∉daylight, t∉nighttime) for the repair point, daytime, nighttime and other (dawn/dusk) times: 1(t>repairpoint) = 1 for time points after the repair point and zero before it; 1(t∈daylight) = 1 if the t-th image was observed during daylight times (between 8:30 AM and 5:45 PM) and zero otherwise; 1(t∈nighttime) = 1 if the t-th image was observed during nighttime (between 7:45 PM and 4:30 AM the next day) and zero otherwise; and 1(t∉daylight, t∉nighttime) = 1 if the t-th image was observed at some other time of the day, i.e., during dawn and dusk, and zero otherwise. Mathematically, this can be translated into the equation below:

μt = α1 1(t>repairpoint) + α2 1(t∈daylight) + α3 1(t∈nighttime) + f(xt),

where f(⋅) is a smooth function that accounts for a non-linear dependence on the reference level x. We thus assumed that the location (the mean for t statistics) of the distribution of errors (μt) was dependent upon whether pictures were taken before or after the repair (α1 1(t>repairpoint)), during daytime (α2 1(t∈daylight)), nighttime (α3 1(t∈nighttime)) or dawn/dusk (no term, because implied by the previous two indicators), and upon a function of the water stage (f(xt)). We modeled f as a linear combination of K known cubic B-spline basis functions {Bk(⋅), k = 1,…,K}, i.e., the smooth function f(⋅) can be written as f(x) = β1B1(x) + … + βKBK(x). We chose K = 8 as discussed above. We treated the regression coefficients {βk, k = 1,…,K} as unknown parameters and estimated them from the data. Because B1(x) + … + BK(x) = 1 for any x, we did not consider any further intercept term in the model. A sketch of this model fit is given below.
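
A hedged sketch of the full fit (Python for brevity, whereas the analyses were done in R [60]; the knot layout, parameterization, and optimizer choices here are illustrative, not the authors' exact code):

```python
import numpy as np
from scipy import stats
from scipy.interpolate import BSpline
from scipy.optimize import minimize

def bspline_design(x, n_basis=8, degree=3):
    """K cubic B-spline basis functions evaluated at x (each row sums to 1)."""
    lo, hi = x.min(), x.max()
    inner = np.linspace(lo, hi, n_basis - degree + 1)
    knots = np.concatenate([[lo] * degree, inner, [hi] * degree])
    cols = [BSpline.basis_element(knots[k:k + degree + 2], extrapolate=False)(x)
            for k in range(n_basis)]
    return np.nan_to_num(np.column_stack(cols))  # zero outside each support

def negloglik(theta, B, rep, day, night, y):
    """Negative log-likelihood of the t model with covariate-dependent
    location (spline in stage plus indicators) and time-of-day scale."""
    K = B.shape[1]
    beta, (a1, a2, a3) = theta[:K], theta[K:K + 3]
    log_s_day, log_s_night, log_s_other, log_nu = theta[K + 3:]
    mu = B @ beta + a1 * rep + a2 * day + a3 * night
    sigma = np.where(day == 1, np.exp(log_s_day),
                     np.where(night == 1, np.exp(log_s_night),
                              np.exp(log_s_other)))
    return -np.sum(stats.t.logpdf(y, df=np.exp(log_nu), loc=mu, scale=sigma))

# x, y: stage and error arrays; rep, day, night: 0/1 indicator arrays.
# B = bspline_design(x)
# fit = minimize(negloglik, np.zeros(B.shape[1] + 7), args=(B, rep, day, night, y))
```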

We assumed that the scale (the standard deviation for t statistics) of the distribution of errors (σt) was dependent upon the time at which the pictures were taken. The model parameters were derived using the maximum likelihood estimation (MLE) approach [61]. Additional details on the MLE approach and how the computational matrices were defined are provided in the supplementary information.

Hypothesis testing

Four main hypotheses were tested. The first one (H.1) tested the effect of lighting on daytime and nighttime pictures. This hypothesis could be sub-divided into two sub-hypotheses. In the first one (H.1a), we tested whether the effect of daytime on the mean GaugeCam errors was significant or not. In statistical terms, after adjusting for the other effects (repair point, nighttime and reference water level) on the GaugeCam errors (both in mean and scale), we tested whether the mean effect of daytime was significant or not by testing H0: α2 = 0 versus HA: α2 ≠ 0. In the second one (H.1b), we tested whether the effect of nighttime on the mean GaugeCam errors was significant or not. In statistical terms, after adjusting for the other effects (repair point, daytime and reference water level) on the GaugeCam errors (both in mean and scale), we tested whether the mean effect of nighttime was significant or not by testing H0: α3 = 0 versus HA: α3 ≠ 0. The second hypothesis (H.2) tested whether the effects of daytime and nighttime on the mean GaugeCam error were significantly different or not. In statistical terms, after adjusting for the effect of the reference water level on the mean GaugeCam errors, as well as the effects of daytime and nighttime on the scale, we tested whether the mean effects of daytime and nighttime were significantly different or not by testing H0: α2 = α3 versus HA: α2 ≠ α3. The third hypothesis (H.3) tested whether, on nighttime pictures, the mechanical filter coming on and off and creating very small movements of the camera added a larger scale, or standard deviation, to the errors. In statistical terms, after adjusting for the (additive) effects of daytime, nighttime and true water level on the mean of the GaugeCam errors, this resulted in testing H0: σ1 = σ2 versus HA: σ1 ≠ σ2. The fourth hypothesis (H.4) tested whether the errors were dependent upon the stage. In statistical terms, after adjusting for the effects of the other covariates on the mean and scale, one can test whether the mean GaugeCam error varies with the true water level or not, i.e., for some constant f0, testing H0: f(x) = f0 for all x versus HA: f(x) ≠ f0 for some x.
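
One standard way to carry out such nested tests (an assumption on our part; the exact test statistics used are detailed in S1 File) is a likelihood-ratio test, refitting the model with each constraint imposed:

```python
from scipy import stats

def lr_pvalue(nll_full, nll_restricted, df=1):
    """p-value of a likelihood-ratio test for a nested constraint.
    Arguments are *negative* log-likelihoods, e.g. minimize(...).fun values."""
    lam = 2.0 * (nll_restricted - nll_full)  # LR statistic, ~ chi-square(df)
    return float(stats.chi2.sf(lam, df))

# e.g. H.1a (alpha_2 = 0): refit with the daytime term dropped, then
# p_h1a = lr_pvalue(fit_full.fun, fit_no_daytime.fun, df=1)
```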

Results

Illustrations of the capabilities of the GaugeCam system

A total of 13,905 images were collected by GaugeCam between February 09, 2012 08:30 AM and July 24, 2012 07:15 PM at a 15 minute time interval. Among these, only 189, or 1.3%, could not be read by the GaugeCam system, either because a water edge could not be detected (because of shadows, or blurry pictures due to dew on the lens), or because the angle of the water line found was beyond the threshold deemed acceptable. Among the read images, 111, or 0.8%, received an indicator of suspicious reading, as the slope of the water line was within the accepted values but still more than 3° from the horizontal. So overall, a stage was calculated for 97.9% of the images acquired over the given monitoring period.

Below are visual examples of times when the system did or did not provide acceptable readings (the details of the accuracy evaluation are provided below). Fig 4 illustrates the capabilities of the GaugeCam system to obtain successful readings in most cases (Fig 4A–4D), but also unsuccessful ones when the images were too blurry because of rain drops or dew on the camera lens (images not shown), or because the machine vision algorithm was tricked by fouling on the target background that blurred the sharp change in the pixel grey scale at the water edge (Fig 4E), or by interfering shadows in the images (Fig 4F).

Fig 4. Illustration of typically successful and rarer unsuccessful water level measurements.

A,B—Typical results from daylight and nighttime images; C—successful reading despite shadows on target and blurry picture associated with dew on the lens, D—successful reading with waves due to wind on the pictures; E—unsuccessful reading because of fouling on the target (slope threshold removed in the example here); F—unsuccessful reading because of several sharp changes in the grey scale in the search ROI.

https://doi.org/10.1371/journal.pwat.0000032.g004

GaugeCam system performance and statistical analysis results

Testing the performance of the GaugeCam system in a tidal marsh gave a great opportunity to obtain a relatively stratified population of stage values along the stage gradient. It would have been a lot more difficult to obtain high stage values in a stream, as high flow/stage values are by nature rare in occurrence. Although the lower stage values (20 to 40 cm) were represented 1.6 times more than the mid to upper stages (40 to 80 cm; S1 Fig), the stage values were relatively well stratified overall, inducing no obvious bias for the statistical analysis.

There was no evidence that the Student’s t distribution could not be used for the GaugeCam error data (S2 Fig), and analyses were performed using this model. The estimate of α1 was significantly different from zero, indicating that the GaugeCam errors before and after the repair point had significantly different means after adjusting for the other covariates. The estimate of α2 was close to zero and lower than the estimate of α3. The estimate of the standard deviation parameter σ2 for nighttime was larger than σ1 and σ3 (Table 1). The estimate of ν confirmed the heavy-tailed distribution of the GaugeCam errors (ν would have to be >30 otherwise).

Table 1. Results for the coefficients of the statistical model used.

https://doi.org/10.1371/journal.pwat.0000032.t001

The estimated smooth function f(⋅) (Fig 5), representing the mean error as a function of stage, shows a variable mean error depending on the stage. For stages between 25 and 65 cm, the GaugeCam system tended to yield a slightly negative mean error (<1 mm in magnitude). Below 25 cm, the mean error became positive and increased linearly, overestimating stages by about 2 mm at a stage of 20 cm. For stages above 65 cm, the GaugeCam system tended to overestimate stages (by nearly 3 mm on average at 90 cm), and even more so for higher stages (Fig 5). The pattern of the mean error in Fig 5 resembles that of the exploratory analysis presented in Fig 3. The confidence intervals for the smooth function f(x) are wider near the ends due to the fewer observations available when the true water levels were lowest and highest.

Fig 5. Pointwise means and 95% confidence intervals of the smooth function f(⋅) in the statistical model of the mean water level measurement as a function of stage.

https://doi.org/10.1371/journal.pwat.0000032.g005

The decisions and the p-values for the hypotheses tested are provided in Table A in S1 File. For hypotheses H.1a, H.1b and H.2, we did not have enough evidence to reject the null hypotheses. There was no evidence of a difference in the mean error between daylight images and those taken at other times (H.1a; p = 0.776), between nighttime images and those taken at other times (H.1b; p = 0.134), or between the daylight and the nighttime images (H.2; p = 0.172). In other words, contrary to our original hypothesis, there was no evidence that the interaction of the light with the meniscus against the background introduced significant differences between the average errors calculated on daylight and nighttime images.

We rejected the null hypotheses for tests H.3 and H.4. There was evidence (H.3; p = 0.002) that the standard deviation of the errors around the mean was different between daylight and nighttime images. This confirms our hypothesis that nighttime readings would be less precise. It is unclear whether this was due to the mechanical movement of the filter for nighttime images and/or to the image alignment correction system, which could have been less efficient at night because of lower image contrast. Rejecting the H.4 hypothesis with p<0.001 confirmed that the function f(xt), i.e., the mean of the error, was not constant as a function of stage. The 70%, 80%, 90% and 95% point-wise confidence intervals of the mean GaugeCam error, as a function of the true water level, corresponding to daylight, nighttime, and dawn/dusk, are illustrated in S3 Fig and show little difference among them.

Precision of the GaugeCam system from the statistical analysis.

From the previous analysis, it was possible to calculate and express the overall uncertainty of the GaugeCam system in familiar ±X mm units. Because we established that the mean error varied with stage, and that the standard deviation depended on day versus night, we calculated and reported uncertainties for several confidence intervals (70%, 80%, 90%, and 95%), and for daytime vs. nighttime pictures (Table 2), as sketched below. We then combined the results by elevation ranges (15–30, 30–45, 45–60, 60–75, >75 cm). Results from the dawn and dusk pictures fell within those of the daytime pictures and are not reported.
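
The sketch below shows how such ±X mm bounds follow from the fitted t model as pointwise error quantiles at a given stage and time of day (the numbers used are illustrative, not the fitted estimates):

```python
from scipy import stats

def error_bounds(mu_cm, sigma_cm, nu, level=0.90):
    """Lower/upper bounds (cm) containing `level` of the GaugeCam errors."""
    q = stats.t.ppf(0.5 + level / 2.0, df=nu)  # two-sided t quantile
    return mu_cm - q * sigma_cm, mu_cm + q * sigma_cm

print(error_bounds(0.05, 0.25, 5, level=0.90))  # ~(-0.45, +0.55) cm
```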

Table 2. Uncertainties (in cm) of the GaugeCam system to measure stages compared to the reference visual data for 70%, 80%, 90%, and 95% confidence intervals (Lo = lower boundary, Up = upper boundary), for several ranges of reference stages, and for daytime and nighttime pictures.

https://doi.org/10.1371/journal.pwat.0000032.t002

The results in Table 2 show that in 70% of the cases, the GaugeCam system just about performed within the design tolerance of ±3 mm, although uncertainties were higher for elevations greater than 75 cm because of the large mean error observed (Fig 5). Generally, stages were within ±4 mm and ±5 mm 80% and 90% of the time, respectively, although they were also overestimated above 75 cm.

The uncertainties were larger for nighttime pictures by 1 to 2 mm compared to those of daytime pictures. Several reasons may explain this. First, the contrast in the grey pixels is not as stark as for daytime pictures, rendering the line detection and the reference top fiducials less ‘crisp’. Depending on the temperature and the battery voltage, the illuminators did not provide constant illumination and contrast. Second, the mechanical filter movement mentioned above, combined with the less precise reference fiducial detection, likely induced higher uncertainty.

Results of the comparison with other sensors.

The differences between the stages measured by the GaugeCam system and by each of the three other sensors are illustrated in Fig 6 below as a function of the stages recorded by the GaugeCam system. It became obvious early on during the monitoring period that there were major errors associated with the ISCO sensor. The vented pressure transducer stages followed the GaugeCam stages, but with an apparent lag in time leading to extremely large discrepancies of up to 25 cm or more (ISCO_1 in Fig 6A). Errors were highest when stages were changing the fastest (middle range elevations), creating an ‘eye shape’ of the error cloud (Fig 6A). It appeared as if something was slowing the equilibration between the pressure at the membrane in the water and that of the atmosphere. After careful replacement of the desiccant in the vented tube (ISCO_2 in Fig 6), the extremely large discrepancies between the ISCO stages and the GaugeCam ones ceased. The only explanation for this was the possible formation of droplets in the vented tube. In the very humid air conditions of the marsh, and before desiccant was diligently changed at each field visit, stages from the ISCO sensor were just totally unreliable.

Fig 6. Differences in stages between three other sensors and the GaugeCam system expressed in cm.

(B) same data as in (A) without the large errors from the ISCO sensor (ISCO_1) and adjusted for scale.

https://doi.org/10.1371/journal.pwat.0000032.g006

However, after this problem was identified and proper maintenance performed, smaller but still large discrepancies (up to ±7 cm differences; ISCO_2 in Fig 6B) were still observed, with the same time lag effect leading to the general eye shape of the cloud of points. Similar observations were made for the unvented HOBO pressure transducer (whose values were corrected with a nearby unvented transducer placed above water to measure atmospheric pressure; HOBO in Fig 6B). Much of the discrepancy may, however, be due to salinity, which was measured to vary between 0 and 35 g/L. Pressure transducers may thus not be usable in variable salinity systems unless a method for salinity correction is available and applied. In all cases, the observed lags in measured stages may have major consequences for computed flow volumes in bidirectional flow systems such as tidal marshes. Indeed, in these cases, the ebb and flow discharges would tend to be under- and overestimated (e.g., [59]), respectively, creating an increasing discrepancy in the computed overall water balance.

The GaugeCam and Sontek data were a lot more in agreement (Fig 6B), within ±3 mm of each other 65% of the time. The cloud of green points and the black line representing the mean difference (mean_Sontek) in Fig 6B do exhibit an ‘upward bow’ in line with the statistical and performance results described above. Indeed, the upper stages tended to be overestimated by the GaugeCam system, and the ‘mean_Sontek’, calculated as the GaugeCam stages minus the Sontek stages, became positive, reaching +5 mm in the 75–85 cm range and +7 mm for higher stages (Fig 6B). Mid stage values (35–45 cm) tended to be a bit underestimated, hence the ‘mean_Sontek’ values at about -4 mm. The discrepancy between the Sontek and the GaugeCam data mirrors the findings obtained from the visually read images, although the swing in mean values seems a bit larger, suggesting that the Sontek data were very reliable, and possibly the most robust among all systems tested in the tidal marsh conditions.

Discussion

Since the first automatically registering gauge [62] was installed at Sheerness on the River Thames in 1832 [63], there have been many devices installed to record water stages (listed in the introduction; reviewed in [44]). Image-based stage and discharge measurement systems, compared to all other systems, provide the unique advantage of giving access to verifiable (by the human eye) and reinterpretable raw data. They thus provide a true revolution in that sense. They also provide the unique opportunity to fully assess the system performance along the measurement range, and to decide when maintenance (even by untrained personnel) is required.

Our evaluation shows that the GaugeCam system was able to obtain a measurement in 98% of the cases, corresponding to all weather and time conditions, and to measure within about ±5 mm for a 90% confidence interval over a range of about 1 m in a tidal creek in a remote location of North Carolina, USA. Our results show that the GaugeCam system nearly performed to the desired design accuracy of ±3 mm around 70% of the time. However, the GaugeCam system did induce a variable mean error that was not linear with stage. Our initial hypothesis that the water meniscus would play a preponderant role in the overall results was not verified. The lack of a significant difference between the means from the nighttime versus daytime pictures is the first reason for this refutation. The second reason is that, according to our initial hypothesis, the mean should have had a linear response with stage, and it did not, with positive values for stages below 25 cm and above 65 cm, and neutral or negative values in between. These variable means are unrelated to the reinstallation of the camera that occurred during the monitoring period, otherwise there would have been an observable breakpoint. The exact reasons for the ‘croissant-shaped’ mean error remain unclear. Although not visible to the naked eye, it is possible that the ‘fish-eye’ effect of the pictures might have induced this non-linear mean error. Our perspective transform did correct for the majority of the image deformation (Fig 2), but not entirely. The variable mean corresponds to one to two pixels out of 800, i.e., a 0.125 to 0.25% error. One way to eventually correct for it would be to use the function describing the distance between the camera and the stage as an additional input. A future version of the GRIME2 software embedded in the GaugeCam system will address this (www.gaugecam.org). Our results thus suggest that the camera should be placed as perpendicular as possible to the target to minimize the geometrical corrections. A camera placed further away with a larger lens would minimize the variation in the angle formed by the variable stages and might minimize the bow-shaped effect of the measurement means. However, the advantages would probably be outweighed by an increased variance in errors due to more corrections associated with more frequent ‘image movement’. Evaluation at additional sites would enhance the robustness of the uncertainty analysis of the GaugeCam system.

The GaugeCam system does require relatively tranquil waters and a near horizontal waterline, although the system was able to measure stages even under windy conditions when the waterline displayed sinusoidal patterns (e.g., Fig 4D). The performance reported in this article might not apply in very windy/wavy conditions or when the waterline is far from horizontal and moves a lot up and down in fast moving streams. But there are many conditions for which the performance should apply, including when installed above a weir or structure in a stream or small (~<10 m wide) river, in wetlands, ponds, lakes and reservoirs, protected marinas, etc. The GaugeCam system does require the installation of a dedicated target, which can catch debris. The images used for the GaugeCam system purposely had a relatively low resolution of 2.4 mm per pixel, which kept the image sizes small enough to keep the data use below 2 GB per month for 15 min pictures and minimized costs. Our experience with instrument maintenance in the field led us to use ‘cloud computing’ rather than ‘edge computing’, where all images would be analyzed on site by a computer. Because the raw signal does not degrade, it would actually be possible to have image analyses done on site using much higher resolution images as raw data. It is possible that measurement uncertainty would decrease. In the current configuration, the GaugeCam system does require a relatively large ROI, compared to other systems that have a ROI focused on the gauge staff at the water level interface. It is unclear whether increasing the pixel resolution would automatically increase the performance. It is probable, however, that the measurement variance would diminish and that the visual readings would be more accurate. Having a large ROI also gave access to the visual context of the images. Indeed, it was possible to interpret some errors due to shadows, birds, debris, etc. because the angle of view was relatively large, which would not have been the case if the ROI had been too narrow. The height of the camera matters, as the higher the installation, the more corrections for movement and fish-eye effect are needed. The actual height and the resulting angle used in this study were probably at the low range of what would be used in other conditions where cameras would be installed high above the ground to remain out of reach and minimize vandalism.

To our knowledge, this article is the first one that details the sources of uncertainties and quantifies with such a level of detail the performance of an image-based stage measurement instrument deployed for a long period of time in the field. The GaugeCam system also appears to perform the best. Nguyen et al. [34] measured water levels in sewage systems and were able to detect the water line against the sewer concrete background with a camera looking down at a relatively acute angle. They reported Root Mean Square Errors (RMSE) of 1.33 cm calculated by comparison with visual measurements. Kim et al. [46] were among the first to provide a fully implemented image-based stage measurement system. They focused their ROI on a gauge staff for a resolution of about 3.3 mm per pixel but did not provide quantified values of their measurement errors, although they appeared to be about ±7 mm. Royem et al. [47] provided a proof of concept of their very cheap (~US$200 in 2012) system, but did not provide a quantified estimation of its performance. Hies et al. [35] proposed a system very similar to GaugeCam’s, with a wide ‘white’ target installed against a wall, and round white fiducials around it for transforming pixel to real world coordinates. Their image resolution was 1 cm per pixel and they reported errors of about 1.1% of the 1.2 m range, or about ±1.2 cm. Lin et al. [48], targeting a gauge staff placed nearly 20 m from the camera with a 2 mm per pixel resolution, reported sub-centimeter errors over a 150 cm stage range. However, the 51 visual reference measurements were reported at the cm resolution and the errors were calculated from measurements made within only 24 hours. Pan et al. [37] used images from surveillance cameras dedicated to hydrological monitoring, with the ROI focused on a dedicated gauge target staff. They reported an error of about 1.5 cm over 6 sites. Zhang et al. [49] reported RMSE between 6 and 12 mm, and between 3 and 40 mm [38], under complex illumination in the field from automatic readings of a gauge staff, although the RMSE were calculated over several days only. In their situation, the image resolution was about 1 mm per pixel, for a camera located at about 6 m horizontal distance and looking down at a relatively acute angle of more than 30°. SEBA Hydrometrie [54] advertises an accuracy of ±1 cm for their GaugeKeeper product for rivers narrower than 10 m.

The Lookout V camera was in many ways ideal for hydrological monitoring. It became obsolete in the USA after the 2G transmission technology was phased out in 2016. An ideal camera for image-based stage and discharge monitoring systems would be in sleeping mode by default to save power, would wake up at given intervals to take pictures and/or short videos, would physically save them on an SD card on site, would take remotely sent instructions to update the clock and other operation modes, and would be able to send pictures and videos remotely, either right after being taken or as bundles at given times of day, to optimize power and data consumption. Additionally, the possibility to perform some machine vision operations and compute the stage on site would be a plus, as the raw signal does not degrade with time. With such a camera, it would be possible to measure stages and velocities using LS-PIV approaches. The GaugeCam GRIME2 software for reading images is available as open source on www.gaugecam.org.

Conclusion

Image-based stage and discharge measuring systems are among the most promising new non-contact technologies available for long-term hydrological monitoring. Many proofs of concept have been reported and fully operational systems already exist. This article is the first to evaluate and report the long-term performance of an image-based stage measuring system in situ. Our evaluation shows that the GaugeCam system was able to measure within about ±5 mm for a 90% confidence interval over a range of about 1 m in a tidal creek in a remote location of North Carolina, USA. Our results show that the GaugeCam system nearly performed to the desired design accuracy of ±3 mm around 70% of the time. The GaugeCam system seems to be the one that performs the best among the systems reported thus far. The auto-correction system for camera movement and the dedicated target background may be the main reasons for this level of accuracy. The ‘fish-eye’ effect of images may be an Achilles’ heel of image-based stage measuring instruments, as it might be the reason for the observed variable mean error. For the study site, millimeter-size variable means in the measurements appeared to vary with stage values as a result, and this should be kept in mind for future applications.

Supporting information

S1 Fig. Histogram of the stage values measured visually, illustrating a relatively stratified distribution of stages for error calculations and statistical analysis.

https://doi.org/10.1371/journal.pwat.0000032.s001

(TIF)

S2 Fig. Verification that the Student’s t distribution fits the data.

https://doi.org/10.1371/journal.pwat.0000032.s002

(TIF)

S3 Fig. 70%, 80%, 90%, and 95% pointwise confidence intervals (inner band through outer band, respectively) of the mean GaugeCam errors during daytime (left), nighttime (middle) and other time (right).

https://doi.org/10.1371/journal.pwat.0000032.s003

(TIF)

S1 File. Further details of the statistical methods used.

https://doi.org/10.1371/journal.pwat.0000032.s004

(DOCX)

Acknowledgments

The authors acknowledge the help from Dr. Nicole Dobbs for visually reading the stage values from the images, and from Andrew Brown for handling the image transfer and storage on servers.

References

  1. Fujita I, Muste M, Kruger A. Large-scale particle image velocimetry for flow analysis in hydraulic engineering applications. J Hydraul Res. 1998;36(3):397–414.
  2. Takagi Y, Tsujikawa A, Takato M, Saito T, Kaida M. Development of a noncontact liquid level measuring system using image processing. Water Sci Technol. 1998 Jan;37(12):381–7.
  3. Bradley AA, Kruger A, Meselhe EA, Muste MVI. Flow measurement in streams using video imagery. Water Resour Res. 2002;38(12).
  4. Chakravarthy S, Sharma R, Kasturi R. Noncontact level sensing technique using computer vision. IEEE Trans Instrum Meas. 2002 Apr;51(2):353–61.
  5. Creutin JD, Muste M, Bradley AA, Kim SC, Kruger A. River gauging using PIV techniques: A proof of concept experiment on the Iowa River. J Hydrol. 2003;277(3–4):182–94.
  6. Fujita I, Watanabe H, Tsubaki R. Development of a non-intrusive and efficient flow monitoring technique: The space-time image velocimetry (STIV). International Journal of River Basin Management. 2007;5(2):105–14.
  7. Iwahashi M, Udomsiri S. Water level detection from video with FIR filtering. In: Proceedings - 16th International Conference on Computer Communications and Networks, vols 1–3. New York, NY: IEEE; 2007. pp. 826–31. Available from: https://apps.webofknowledge.com//CitedFullRecord.do?product=WOS&colName=WOS&SID=6EIg4SEq59kIruq5RAr&search_mode=CitedFullRecord&isickref=WOS:000257636700133
  8. Iwahashi M, Udomsiri S, Imai Y, Muramatsu S. Water level detection for functionally layered video coding. In: 2007 IEEE International Conference on Image Processing, vols 1–7. New York, NY: IEEE; 2007. pp. 885+. Available from: https://apps.webofknowledge.com//CitedFullRecord.do?product=WOS&colName=WOS&SID=6EIg4SEq59kIruq5RAr&search_mode=CitedFullRecord&isickref=WOS:000253487200222
  9. Jodeau M, Hauet A, Paquier A, Le Coz J, Dramais G. Application and evaluation of LS-PIV technique for the monitoring of river surface velocities in high flow conditions. Flow Meas Instrum. 2008;19(2):117–27.
  10. Hauet A, Kruger A, Krajewski WF, Bradley A, Muste M, Creutin J-D, et al. Experimental system for real-time discharge estimation using an image-based method. J Hydrol Eng. 2008;13(2):105–10.
  11. Kim Y, Muste M, Hauet A, Krajewski WF, Kruger A, Bradley A. Stream discharge using mobile large-scale particle image velocimetry: A proof of concept. Water Resour Res. 2008;44(9).
  12. Tauro F, Petroselli A, Grimaldi S. Optical sensing for stream flow observations: A review. J Agric Eng Res. 2018;49(4):199–206.
  13. Jeanbourquin D, Sage D, Nguyen L, Schaeli B, Kayal S, Barry DA, et al. Flow measurements in sewers based on image analysis: Automatic flow velocity algorithm. Water Sci Technol. 2011;64(5):1108–14. pmid:22214058
  14. Dramais G, Le Coz J, Camenen B, Hauet A. Advantages of a mobile LSPIV method for measuring flood discharges and improving stage-discharge curves. Journal of Hydro-Environment Research. 2011;5(4):301–12.
  15. Tauro F, Porfiri M, Grimaldi S. Orienting the camera and firing lasers to enhance large scale particle image velocimetry for streamflow monitoring. Water Resour Res. 2014 Sep;50(9):7470–83.
  16. Stumpf A, Augereau E, Delacourt C, Bonnier J. Photogrammetric discharge monitoring of small tropical mountain rivers: A case study at Riviere des Pluies, Reunion Island. Water Resour Res. 2016 Jun;52(6):4550–70.
  17. Tauro F, Petroselli A, Porfiri M, Giandomenico L, Bernardi G, Mele F, et al. A novel permanent gauge-cam station for surface-flow observations on the Tiber River. Geoscientific Instrumentation Methods and Data Systems. 2016;5(1):241–51.
  18. Tauro F, Olivieri G, Petroselli A, Porfiri M, Grimaldi S. Flow monitoring with a camera: A case study on a flood event in the Tiber River. Environ Monit Assess. 2016 Feb;188(2):118. pmid:26812952
  19. Tauro F, Grimaldi S. Ice dices for monitoring stream surface velocity. Journal of Hydro-Environment Research. 2017 Mar;14:143–9.
  20. Tauro F, Piscopia R, Grimaldi S. Streamflow observations from cameras: Large-scale particle image velocimetry or particle tracking velocimetry? Water Resour Res. 2017 Dec;53(12):10374–94.
  21. Tauro F, Piscopia R, Grimaldi S. PTV-Stream: A simplified particle tracking velocimetry framework for stream surface flow monitoring. Catena. 2019 Jan;172:378–86.
  22. Hansen I, Warriar R, Satzger C, Sattler M, Luethi B, Peña-Haro S, et al. An innovative image processing method for flow measurement in open channels and rivers. In: Global Conference & Exhibition 2017 “Innovative solutions in flow measurement and control - oil, water and gas”, Palakkad, Kerala, India. 2017. Available from: https://www.seba-hydrometrie.com/fileadmin/user_upload/Referenzen/Hansen_et-al_Image-Processing-Method-for-Flow-Measurement.pdf
  23. Sirazitdinova E, Pesic I, Schwehn P, Song H, Satzger M, Sattler M, et al. Sewer discharge estimation by stereoscopic imaging and synchronized frame processing. Computer-Aided Civil and Infrastructure Engineering. 2018 Jul;33(7):602–13.
  24. Stumpf A, Augereau E, Bonnier J, Delacourt C, Delcher E. Photogrammetric discharge monitoring of torrential rivers. Houille Blanche-Revue Internationale De L Eau. 2018 Dec;(5–6):66–74.
  25. Lewis QW, Rhoads BL. LSPIV measurements of two-dimensional flow structure in streams using small unmanned aerial systems: 1. Accuracy assessment based on comparison with stationary camera platforms and in-stream velocity measurements. Water Resour Res. 2018 Oct;54(10):8000–18.
  26. Khalid M, Penard L, Memin E. Optical flow for image-based river velocity estimation. Flow Meas Instrum. 2019 Mar;65:110–21.
  27. Pearce S, Ljubicic R, Pena-Haro S, Perks M, Tauro F, Pizarro A, et al. An evaluation of image velocimetry techniques under low flow conditions and high seeding densities using unmanned aerial systems. Remote Sensing. 2020 Jan;12(2).
  28. Meier R, Tscheikner-Gratl F, Steffelbauer DB, Makropoulos C. Flow measurements derived from camera footage using an open-source ecosystem. Water. 2022 Jan;14(3):424.
  29. Chapman KW, Gilmore TE, Chapman CD, Mehrubeoglu M, Mittelstet AR. Camera-based water stage and discharge prediction with machine learning. Hydrol Earth Syst Sci. 2020. pp. 1–28.
  30. Lo S-W, Wu J-H, Lin F-P, Hsu C-H. Visual sensing for urban flood monitoring. Sensors. 2015 Aug;15(8):20006–29. pmid:26287201
  31. Griesbaum L, Marx S, Höfle B. Direct local building inundation depth determination in 3-D point clouds generated from user-generated flood images. Nat Hazards Earth Syst Sci. 2017 Jul;17(7):1191–201.
  32. Van Ackere S, Verbeurgt J, De Sloover L, Gautama S, De Wulf A, De Maeyer P. A review of the internet of floods: Near real-time detection of a flood event and its impact. Water. 2019 Oct;11(11):2275.
  33. Moy de Vitry M, Kramer S, Wegner JD, Leitão JP. Scalable flood level trend monitoring with surveillance cameras using a deep convolutional neural network. Hydrol Earth Syst Sci. 2019 Nov;23(11):4621–34.
  34. Nguyen LS, Schaeli B, Sage D, Kayal S, Jeanbourquin D, Barry DA, et al. Vision-based system for the control and measurement of wastewater flow rate in sewer systems. Water Sci Technol. 2009;60(9):2281–9. pmid:19901459
  35. Hies TB, Parasuraman S, Wang Y, Duester R, Eikaas H, Tan KM. Enhanced water-level detection by image processing. In: 10th International Conference on Hydroinformatics. 2012. Available from: https://www.researchgate.net/profile/Hans_Eikaas/publication/262337135_Enhanced_water-level_detection_by_image_processing/links/5625a76108aed3d3f137184a.pdf
  36. Gilmore TE, Birgand F, Chapman KW. Source and magnitude of error in an inexpensive image-based water level measurement system. J Hydrol. 2013 Jul;496:178–86.
  37. Pan J, Yin Y, Xiong J, Luo W, Gui G, Sari H. Deep learning-based unmanned surveillance systems for observing water levels. IEEE Access. 2018;6:73561–71.
  38. Zhang Z, Zhou Y, Liu H, Gao H. In-situ water level measurement using NIR-imaging video camera. Flow Meas Instrum. 2019 Jun;67:95–106.
  39. Elias M, Kehl C, Schneider D. Photogrammetric water level determination using smartphone technology. Photogramm Rec. 2019 Jun;34(166):198–223.
  40. Azevedo JA, Brás JA. Measurement of water level in urban streams under bad weather conditions. Sensors. 2021 Oct;21(21). pmid:34770466
  41. Isidoro JMGP, Martins R, Carvalho RF, Lima JLMP de. A high-frequency low-cost technique for measuring small-scale water level fluctuations using computer vision. Measurement. 2021 Aug;180:109477.
  42. Kuswidiyanto LW, Nugroho AP, Jati AW, Wismoyo GW, Murtiningrum, Arif SS. Automatic water level monitoring system based on computer vision technology for supporting the irrigation modernization. IOP Conf Ser: Earth Environ Sci. 2021 Mar;686(1):012055.
  43. Kuo L-C, Tai C-C. Robust image-based water-level estimation using single-camera monitoring. IEEE Trans Instrum Meas. 2022;71:1–11.
  44. Herschy RW. Streamflow measurement. CRC Press; 2008.
  45. Bertrand-Krajewski J-L, Laplace D, Joannis C, Chebbo G. Mesures en hydrologie urbaine et assainissement [Measurements in urban hydrology and sewerage]. Éditions Technique & Documentation; 2008.
  46. Kim J, Han Y, Hahn H. Embedded implementation of image-based water-level measurement system. IET Comput Vision. 2011 Mar;5(2):125–33.
  47. Royem AA, Mui CK, Fuka DR, Walter MT. Proposing a low-tech, affordable, accurate stream stage monitoring system. Transactions of the ASABE. 2012;55(6):1–6.
  48. Lin Y-T, Lin Y-C, Han J-Y. Automatic water-level detection using single-camera images with varied poses. Measurement. 2018 Oct;127:167–74.
  49. Zhang Z, Zhou Y, Liu H, Zhang L, Wang H. Visual measurement of water level under complex illumination conditions. Sensors. 2019 Sep;19(19). pmid:31554301
  50. Eltner A, Elias M, Sardemann H, Spieler D. Automatic image-based water stage measurement for long-term observations in ungauged catchments. Water Resour Res. 2018 Dec;54(12):10362–71.
  51. Ridolfi E, Manciola P. Water level measurements from drones: A pilot case study at a dam site. Water. 2018 Mar;10(3).
  52. Schoener G. Time-lapse photography: Low-cost, low-tech alternative for monitoring flow depth. J Hydrol Eng. 2018 Feb;23(2).
  53. Leduc P, Ashmore P, Sjogren D. Technical note: Stage and water width measurement of a mountain stream using a simple time-lapse camera. Hydrol Earth Syst Sci. 2018 Jan;22(1).
  54. SEBA-Hydrometrie. GaugeKeeper.
  55. Chapman KW, Gilmore TE, Chapman CD, Birgand F, Mittelstet AR, Harner MJ, et al. Technical note: Open-source software for water-level measurement in images with a calibration target. Water Resour Res. Accepted.
  56. OpenCV: Camera calibration with OpenCV. Available from: https://docs.opencv.org/master/d4/d94/tutorial_camera_calibration.html
  57. Rhody H, et al. Lecture 2: Geometric image transformations. RIT presentation. 2005. Available from: https://www.cis.rit.edu/class/simg782/lectures/lecture_02/lec782_05_02.pdf
  58. OpenCV: Structural analysis and shape descriptors. Available from: https://docs.opencv.org/master/d3/dc0/group__imgproc__shape.html
  59. Etheridge JR, Birgand F, Burchell MR II. Quantifying nutrient and suspended solids fluxes in a constructed tidal marsh following rainfall: The value of capturing the rapid changes in flow and concentrations. Ecol Eng. 2015;78:41–52.
  60. R Core Team. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2017. Available from: http://www.R-project.org/
  61. Scheffler C. A derivation of the EM updates for finding the maximum likelihood parameter estimates of the Student’s t distribution. Available from: http://www.inference.org.uk/cs482/publications/scheffler2008derivation.pdf
  62. Palmer HR. Description of a graphical register of tides and winds. Philosophical Transactions of the Royal Society of London. 1831;121:209–13. Available from: https://www.jstor.org/stable/pdf/107930.pdf
  63. Caesperlein A. Historical development of hydrometry. In: Three centuries of scientific hydrology 1674–1974, background reports presented on the occasion of the celebration of the Tercentenary of Scientific Hydrology, Paris, 9–12 September 1974. UNESCO-WMO/WMO-IAHS/AISH; 1974. pp. 54–63.