Agricultural Water Management 98 (2011) 697–704
Landscape irrigation scheduling efficiency and adequacy by various control technologies

M.S. McCready, M.D. Dukes ∗

Agricultural and Biological Engineering Department, University of Florida, P.O. Box 110570, Gainesville, FL 32611-0570, United States
Article info

Article history: received 13 May 2010; accepted 15 November 2010; available online 10 December 2010.

Keywords: soil moisture sensor controller; rain sensor; evapotranspiration controller; soil water balance; Smart Water Application Technology
Abstract

Automated residential irrigation systems tend to result in higher water use than non-automated systems. Increasing the scheduling efficiency of an automated irrigation system provides the opportunity to conserve water resources while maintaining good landscape quality. Control technologies available for reducing over-irrigation include evapotranspiration (ET) based controllers, soil moisture sensor (SMS) controllers, and rain sensors (RS). The purpose of this research was to evaluate the capability of these control technologies to schedule irrigation compared to a soil water balance model based on the Irrigation Association (IA) Smart Water Application Technologies (SWAT) testing protocol. Irrigation adequacy and scheduling efficiency were calculated in 30-day running totals to determine the amount of over- or under-irrigation for each control technology based on the IA SWAT testing protocol. A time-based treatment with irrigation 2 days/week and no rain sensor (NRS) was established as a comparison. In general, the irrigation adequacy ratings (a measure of under-irrigation) for the treatments were higher during the fall months of testing than the spring months due to lower ET resulting in lower irrigation demand. Scheduling efficiency values (a measure of over-irrigation) decreased for all treatments when rainfall increased. During the rainy period of this testing, total rainfall was almost double reference evapotranspiration (ETo), while in the remaining three testing periods the opposite was true. The 30-day irrigation adequacy values, considering all treatments, varied during the testing periods by 0–68 percentile points. Looking at only one 30-day testing period, as is done in the IA SWAT testing protocol, will not fully capture the performance of an irrigation controller. Scheduling efficiency alone was not a good indicator of controller performance. The amount of water applied and the timing of application were both important to maintaining acceptable turfgrass quality and receiving good irrigation adequacy and scheduling efficiency scores.

© 2010 Elsevier B.V. All rights reserved.
1. Introduction

All plants, including turfgrass, require water and nutrients to support growth and maintenance (Connellan, 1999). Irrigation of landscapes in Florida is necessary to ensure good plant quality due to the sporadic nature of rain events and the low water holding capacity of the soils. Residential irrigation systems that are automated have been shown to use 47% more water on average than sprinkler systems that are not automated (Mayer et al., 1999). High water use with automated irrigation systems can be attributed to several factors, including the tendency to improperly program irrigation timers for changing weather conditions. In an attempt to reduce the amount of water used outdoors, landscape irrigation
∗ Corresponding author. Tel.: +1 352 392 1864x205; fax: +1 352 392 4092. E-mail address: mddukes@ufl.edu (M.D. Dukes).
doi:10.1016/j.agwat.2010.11.007
has been regulated in most, if not all, areas of Florida. For example, in the St. Johns River Water Management District (SJRWMD), irrigation is restricted to a maximum of 2 days/week from March to November and a maximum of 1 day/week from November through March (SJRWMD, 2010). Various irrigation control technologies are commercially available to supply irrigation matched to measured or estimated plant demand, thus reducing outdoor water use. These technologies include soil moisture sensor (SMS) controllers, evapotranspiration (ET) controllers and rain sensors (RS). This study compared the irrigation applied using the three types of control devices with the calculated irrigation requirement developed using a daily soil water balance model based on the Irrigation Association's (IA) Smart Water Application Technologies (SWAT) testing protocol. The protocol measures adequacy and efficiency of the irrigation schedule based on the soil water balance. This protocol attempts to track, with a soil moisture balance model, the ability of the controller to
apply water accounting for plant needs without causing runoff or drainage beyond the root zone. This testing protocol is being considered for adoption by the U.S. Environmental Protection Agency WaterSense program to label controllers as water-efficient devices (US EPA, 2010).

A soil moisture sensor (SMS) irrigation controller in bypass configuration uses data from the sensor to allow or bypass timed irrigation events. This type of controller has an adjustable threshold setting, and if the soil water content exceeds the threshold, the scheduled event is bypassed (Dukes, 2005). Residential soil moisture sensors do not adjust the length of irrigation events, only whether or not an irrigation event occurs. Cardenas-Lailhacar et al. (2008) found water savings for three commercially available SMS controllers ranging from 69% to 92% without adversely affecting turf quality in bermudagrass (Cynodon dactylon L.) during normal rainfall frequencies. In a companion to the present work, McCready et al. (2009) reported water savings for SMS controllers ranging from 11% to 53% during mostly dry conditions compared to an irrigation schedule based on historical monthly evapotranspiration and rainfall.

Evapotranspiration-based irrigation controllers ideally irrigate according to the calculated ET needs of the plant. Reference ET (ETo) is the transpiration and evaporation from a uniform surface of actively growing grass vegetation that is well watered and maintained at a specified growing height. Crop evapotranspiration (ETc) is estimated by multiplying ETo by the crop coefficient (Kc) (Allen et al., 2005). ET controllers can generally be programmed for various site-specific conditions such as soil type, plant type, root depth, sun exposure, etc. Depending on manufacturer and model, some of these controllers determine both the frequency and length of irrigation events based on a soil water balance calculation using weather data. A study conducted in Florida by Davis et al.
(2007) from 1 July 2006 to 30 November 2006 found that two of three brands of ET controllers tested, compared to a 2 days/week irrigation schedule with no irrigation control devices, were capable of reducing water applied by 20–60% while maintaining acceptable turf quality. McCready et al. (2009) examined irrigation water applied using two brands of ET controllers compared to the irrigation applied based on the recommended irrigation rates and found that water savings were between 25% and 62%. The testing was performed under dry to normal Florida rain conditions.

Rain sensors (RS) are designed to interrupt time-clock-scheduled irrigation events after a certain depth of rain occurs (Dukes and Haman, 2002a). Testing on rain sensors has shown that, under normal rainfall conditions in Central Florida, they can produce up to 34% water savings while maintaining good turfgrass quality (Cardenas-Lailhacar et al., 2008). In a study conducted in Pinellas County, Florida from June 2006 to March 2007, homes with rain sensors applied 19% less irrigation water than homes without a rain sensor (Haley and Dukes, 2007). McCready et al. (2009) found that RS systems, compared to recommended irrigation rates, could produce water savings of 7–30% under mostly below-normal rainfall conditions, depending on rain sensor set point and irrigation frequency.

The objective of this experiment was to evaluate the irrigation water applied using SMS controllers, ET controllers and rain sensors compared to the calculated irrigation requirement from a daily soil water balance model based on the IA SWAT testing protocol. Another objective was to examine the use of irrigation adequacy and scheduling efficiency measures, as described in the SWAT testing protocol, in evaluating the effectiveness of the technologies tested.
2. Materials and methods

2.1. Site description

This study was performed at the Plant Science Research and Education Unit in Citra, Florida. As described in the companion study by McCready et al. (2009), the experimental area consisted of 72 plots (18.2 m2 each) of 'Floratam' St. Augustinegrass (Stenotaphrum secundatum). The present study focused on 18 plots. The plots were irrigated using four Toro 570 Series (The Toro Company, Bloomington, MN) quarter-circle pop-up spray heads with a measured application rate of 50 mm/h. Nine plots were considered for the present study due to the location of soil moisture sensors controlling irrigation on select plots. Irrigation time clocks were used for scheduling all of the treatments except where a time-based controller was not necessary. Plot maintenance followed local recommendations to maintain good quality during the growing season. Shedd (2008) and McCready et al. (2009) give full details of the site layout and experimental procedures not detailed in this analysis.

Soil cores were collected on 11 August 2006 for soil physical properties analysis, which is described in detail in McCready et al. (2009). The soil present at the field site consisted of 1.7% clay, 1.1% silt and 97.3% sand. Field capacity and permanent wilting point values for the field site were measured as 12% and 4%, respectively (all soil water contents in this paper are expressed as a percentage on a volumetric basis). Measurements of field capacity (FC) and permanent wilting point (PWP) for this study were based on the 18 plots in one particular area of the larger study that had similar soil water holding capacities (McCready et al., 2009).
2.2. Calculated irrigation requirement

A daily soil water balance model was performed to estimate the irrigation that was required during the 2006 and 2007 testing. This approach sums inputs and outputs of water to the soil profile based on a specified time step. Irrigation was simulated in the soil water balance when rainfall was insufficient for supplying plant water needs due to losses from ET. The equation for the daily soil water balance according to Allen et al. (1998) is as follows:

Dr,i = Dr,i−1 − (P − RO)i − Inet,i − CRi + ETc,i + DPi    (1a)
where Dr (mm) is the depletion of water from the root zone, i is the current day and i − 1 is the previous day. P (mm) is the daily precipitation, RO (mm) is runoff, Inet (mm) is the net irrigation depth, CR (mm) is the capillary rise from the groundwater table, ETc (mm) is the crop evapotranspiration and DP (mm) is the deep percolation (Allen et al., 1998).

A root zone depth of 30 cm was assumed for the soil water balance. According to Doss et al. (1960), 76% of the roots of five warm-season forage species were in the upper 30 cm of soil. Peacock and Dudeck (1985) found that St. Augustinegrass (Stenotaphrum secundatum) had 81% of its root mass in the upper 30 cm. Under the conditions of this study, RO was observed to be negligible on the flat experiment site. Capillary rise (CR) was also considered negligible due to the depth of the water table (more than 5 m deep) and the coarse soils. In the coarse soils inherent at the experiment site, FC levels are obtained in less than one day and often in a matter of hours (data not shown). Any water that is applied in excess of FC is considered to contribute to DP. This simplifies Eq. (1a) to:

Dr,i = Dr,i−1 − Pe,i − Inet,i + ETc,i + DPi    (1b)
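Under these definitions, one day of the simplified balance in Eq. (1b) amounts to a few lines of code. The following Python sketch is our illustration, not code from the study: it tracks root-zone depletion Dr measured down from field capacity (0 = at FC) and books any water in excess of FC as deep percolation, matching the assumption that excess water drains within a day in these coarse sands.

```python
def swb_daily_step(d_prev, p_eff, i_net, etc):
    """One day of the soil water balance of Eq. (1b).

    d_prev: root-zone depletion at end of previous day (mm, 0 = field capacity)
    p_eff:  effective rainfall Pe (mm)
    i_net:  net irrigation Inet (mm)
    etc:    crop evapotranspiration ETc (mm)
    Returns (new depletion, deep percolation), both in mm.
    """
    d = d_prev - p_eff - i_net + etc
    dp = 0.0
    if d < 0:       # water in excess of field capacity cannot be stored;
        dp = -d     # it is counted as the DP term of Eq. (1b)
        d = 0.0
    return d, dp

# Example: 5 mm of depletion, a 20 mm rain, no irrigation and 4 mm of ETc
# leave the profile at field capacity with 11 mm of deep percolation.
print(swb_daily_step(5.0, 20.0, 0.0, 4.0))
```

Iterating this function over a season, with depletion compared against RAW to decide when irrigation is simulated, reproduces the daily accounting described above.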
The amount of water stored in the root zone that a plant can utilize is the available water (AW, mm), calculated as follows:

AW = (FC − PWP)/100 × RZ    (2)

where FC is field capacity (volume %), PWP is permanent wilting point (volume %) and RZ is the root zone depth (mm). The calculated depth of available water at the test site, based on an available water fraction of 8% (12% FC minus 4% PWP) and a 30 cm root zone, was (12 − 4)/100 × 300 mm = 24 mm. Irrigation should be applied before the soil water content is equal to the PWP in order to prevent permanent physiological damage. The portion of AW (mm) that is allowed to be depleted before irrigation occurs is known as the readily available water (RAW, mm). The depth of RAW is calculated based on the management allowable depletion (MAD, %) as follows (Irrigation Association, 2005):

RAW = MAD × AW    (3)

A MAD factor of 50% was selected for warm-season turfgrass to match procedures outlined in the SWAT testing protocol, giving RAW = 0.50 × 24 mm = 12 mm at this site. Typically a MAD value between 40% and 60% is used for turfgrass irrigation purposes (Allen et al., 1998). In dry soil, the water is more strongly bound to soil particles through capillary and absorptive forces, making plant water uptake more difficult. When soil water is no longer readily available to the plant, the plant will undergo stress due to the increased difficulty of water uptake. Water stress in the plant leads to a decrease in crop transpiration (Allen et al., 1998). To account for the decrease in crop transpiration, an adjusted ET value (ETc,adj) was calculated when the soil water was below the MAD level, using an empirical water stress coefficient (Ks) given by Allen et al. (1998):

ETc,adj = Ks × ETc    (4)

When the soil water level is above the MAD level, the value of Ks is 1. As soil water depletion increases beyond MAD, the value of Ks decreases linearly as described by Allen et al. (1998):

Ks = (AW − Dr)/(AW − RAW)    (5)

The irrigation depth required to replace the water lost through ETc, which has not been replaced by rainfall, is the net irrigation requirement (Inet). The actual application of water by the irrigation system is not perfectly efficient, so the net irrigation requirement has to be increased to account for the inefficiency of the irrigation system (Irrigation Association, 2005). The gross irrigation requirement was calculated by adjusting the net irrigation requirement based on the irrigation system efficiency. An efficiency factor of 60% was assumed in the development of the historical irrigation requirement and therefore was used in the soil water balance model to determine gross irrigation (Dukes and Haman, 2002b).

2.3. Irrigation adequacy and scheduling efficiency

Actual irrigation applied was compared to the calculated depth required in order to determine the irrigation adequacy and scheduling efficiency of water application based on the SWAT testing protocol. The actual water applied was input into a daily soil water balance along with ETc,adj and rainfall to determine the daily soil water level (mm). Irrigation adequacy (%) refers to whether or not the irrigation applied is sufficient for plant needs (Irrigation Association, 2008) and was calculated over each treatment period and as a running 30-day total throughout the testing periods as follows:

Irrigation Adequacy = (ETc,adj − Deficit)/ETc,adj × 100    (6)

where Deficit (mm) represents the total depth of water that was not available for plant needs during the time period (sum of the depth of water below AW). Scheduling efficiency is the ability of the controller to schedule irrigation without applying excess water that would lead to runoff and deep percolation. It was also calculated over each treatment period and as a running 30-day total as follows (Irrigation Association, 2008):

Scheduling Efficiency = (Net Irrigation − Scheduled Losses)/Net Irrigation × 100    (7)

where Scheduled Losses (mm) is the amount of water applied that exceeded the FC of the soil, leading to deep percolation (runoff was assumed to be zero due to the highly permeable soil). Both the irrigation adequacy and scheduling efficiency were calculated as an average over the whole treatment period and for running totals of 30-day periods. Values were only calculated for a 30-day period if cumulative rainfall totaled at least 10.2 mm and cumulative ETo was at least 63.5 mm, as specified in the SWAT testing protocol (Irrigation Association, 2008).

2.4. Data collection

Irrigation was monitored using flow meters on each plot as described by McCready et al. (2009). Soil water content was monitored using time domain reflectometry (TDR) probes installed in each plot. The TDR sensors were buried in the center of every plot with the top of the sensor at a depth of 8 cm and the bottom of the sensor at a depth of 18 cm. Probes were used to verify the calculated soil water balance. Weather data were collected using an automated weather station (Campbell Scientific, Logan, UT) within 900 m of the experimental site. Data collected in 15 min intervals included rainfall, solar radiation, relative humidity, air temperature, and wind speed. Daily ETo was calculated using the ASCE standardized Penman-Monteith equation (Allen et al., 2005). Monthly values of locally determined Kc were specified for a warm-season turfgrass (Jia et al., 2009). Turfgrass quality evaluations were made using the National Turfgrass Evaluation Program (NTEP) procedures (Shearman and Morris, 1998). Ratings of turfgrass quality were based on density and color and were on a 1 (dead) to 9 (perfect) scale, with 5 representing minimally acceptable turf quality. Statistical analysis of the irrigation water applied was performed using means separation by Duncan's Multiple Range Test in SAS (SAS Institute, Inc., Cary, NC).

2.5. Experimental design

Experimental treatments are described in Table 1.
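The performance measures of Section 2.3 can be sketched in a few lines of Python. This is a minimal illustration under the definitions given there (the function names are ours), not the IA SWAT reference implementation:

```python
def adequacy(etc_adj_total, deficit_total):
    """Irrigation adequacy (%), Eq. (6): share of adjusted crop ET
    that was met over the scoring period."""
    return (etc_adj_total - deficit_total) / etc_adj_total * 100.0

def scheduling_efficiency(net_irrigation, scheduled_losses):
    """Scheduling efficiency (%): share of applied water that did not
    drain past the root zone as Scheduled Losses."""
    return (net_irrigation - scheduled_losses) / net_irrigation * 100.0

def window_qualifies(rain_30d, eto_30d):
    """A 30-day window is scored only if it saw at least 10.2 mm of
    rain and 63.5 mm of ETo (Irrigation Association, 2008)."""
    return rain_30d >= 10.2 and eto_30d >= 63.5

# Example: 60 mm of ETc,adj with a 6 mm deficit gives 90% adequacy;
# 50 mm applied with 10 mm of deep percolation gives 80% efficiency.
print(adequacy(60.0, 6.0), scheduling_efficiency(50.0, 10.0))
```

In the analysis that follows, these two ratios are computed both over whole treatment periods and over every qualifying 30-day running window.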
A commercially available soil moisture sensor, Acclima Digital TDT RS500 (Acclima Inc., Meridian, ID), was tested at two different volumetric water content (VWC) thresholds, 7% (AC 7) and 10% (AC 10; Table 1). The SMS treatments had a sensor buried at a depth of 8 cm in the experimental plots to control irrigation. Soil water content was monitored independently in all plots using TDR probes. The ET controller tested was the Toro Intelli-Sense (The Toro Company, Bloomington, MN). The Toro Intelli-Sense controller (TORO) calculates irrigation runtime and frequency. Inputs to the ET controller were based on field site properties (McCready et al.,
Table 1
Summary of experimental treatment codes and descriptions.

Treatment code   Irrigation frequency (days/week)   Treatment description
Soil moisture sensor controllers
  AC 7           2                                  Acclima set at 7% VWCa
  AC 10          2                                  Acclima set at 10% VWC
ET controller
  TORO           2                                  Toro Intelli-Sense
Rain sensors and time-based irrigation
  RS1-6 mm      1                                  Rain sensor set at 6 mm rainfall threshold
  RS2-6 mm      2                                  Rain sensor set at 6 mm rainfall threshold
  RS7-3 mm      7                                  Rain sensor set at 3 mm rainfall threshold
  DWRS           2                                  Reduced irrigation schedule (60% of RS2-6 mm)
  NRS            2                                  No rain sensor
  NON            0                                  Non-irrigated

a VWC, volumetric water content.
2009). The irrigation frequency for both the SMS and ET controller treatments was 2 days/week.

Rain sensor (RS) treatments, using the Mini-Clik rain sensor (Hunter Industries Inc., San Marcos, CA), were set at three irrigation frequencies and two rainfall threshold depths: 1 day/week, 6 mm threshold (RS1-6 mm); 2 days/week, 6 mm threshold (RS2-6 mm); and 7 days/week, 3 mm threshold (RS7-3 mm). There were two comparison treatments: a time-based treatment without a rain sensor (NRS) and a time-based treatment with a rain sensor set at 6 mm and an irrigation depth equal to 60% of the possible depth scheduled for NRS and the other RS treatments (DWRS). The same total application depth per week was divided over the possible number of irrigation days per week. The monthly irrigation schedule was based on local recommendations (Dukes and Haman, 2002b). Every treatment except for the TORO, DWRS and NON treatments had the same possible total depth of irrigation application.

The irrigation schedule used for the RS and SMS treatments was based on a system efficiency of 60%. An efficiency factor of 95% was used as an input to the TORO controller. The depth of water applied for the TORO treatment was adjusted based on an efficiency factor of 60% to account for the difference in irrigation scheduling between the ET and the SMS and RS treatments (TOROadj). This was done by multiplying the actual depth of water applied for the TORO treatment by an adjustment factor (95/60); for example, a measured application of 12 mm was recorded as 12 × 95/60 = 19 mm.

The research data collection began 22 April 2006 and ended on 30 November 2007. There were four treatment periods as follows: 22 April 2006 to 30 June 2006 (S06), 23 September 2006 to 15 December 2006 (F06), 1 May 2007 to 31 August 2007 (S07) and 1 September 2007 to 30 November 2007 (F07).

3. Results and discussion

3.1. Weather conditions

Three of the four treatment periods were relatively dry compared to historical rainfall for the research area (Table 2). During S06, F06 and S07, total rainfall depths were 50% less than the historical average, while in F07 rainfall (347 mm) was 34% greater than the historical average (258 mm). The number of days per month with a rainfall event deeper than 2.5 mm was fewer than the monthly historical average for every time period except F07 (Table 2). Average weekly ETc for the F06 and F07 testing periods (12 mm/week and 15 mm/week, respectively) was lower than for the S06 and S07 testing periods (30 mm/week and 26 mm/week, respectively). Cumulative ETc was greater than rainfall for all of the testing periods except F07.

The non-irrigated plots only maintained adequate turf quality ratings (5.0) during F07 (McCready et al., 2009). In S06, the non-irrigated plots received an average turf quality of 1.4 and were dead by the end of the testing period. During F06 and S07, irrigation was applied to the non-irrigated plots after several weeks of testing due to poor turf quality, to prevent turfgrass death.

Fig. 1. Example of soil volumetric water content calculated daily using a soil water balance (SWB) with a 30 cm root depth for (A) RS1-6 mm (RMSE 1.4% VWC) and (B) RS2-6 mm (RMSE 2.0% VWC) treatments during S07 compared to soil water measured using the TDR sensors installed in the treatment plots.

3.2. Soil water balance (SWB) verification

The calculated soil volumetric water content (VWC, %) was compared to the measured soil VWC data collected with TDR probes installed in the experimental plots (Fig. 1). The calculated soil VWC at the end of each day was compared to the TDR data at the end of the corresponding day. The calculated VWC was similar to the measured VWC. The largest differences in measured and calculated soil
Table 2
Summary of weather conditions for the four testing periods including cumulative rainfall and the number of rainfall events with depths >2.5 mm compared to historical averages (NOAA, 2006).

Treatment   Calculated   Cumulative rainfall (mm)    Number of days with rainfall >2.5 mm
period      ETc (mm)     Measured    Historical      Measured    Historical
S06         302          138         298             14          19
F06         149          92          188             15          21
S07         411          287         636             20          38
F07         152          347         258             19          16
Table 3
Summary of average weekly water applied along with irrigation adequacy and scheduling efficiency for irrigation treatments and a daily soil water balance (SWB) model. Water applied data from McCready et al. (2009).

            Average water applied        Average irrigation         Average scheduling
            (mm/week)                    adequacy (%)               efficiency (%)
Treatment   S06   F06   S07   F07        S06   F06   S07   F07      S06   F06   S07   F07
AC 7        18e   –a    20e   9f         47    –a    53    81       85    –a    100   83
AC 10       27bc  16b   32cd  14e        59    99    80    97       89    100   93    88
TOROadj     –b    14b   46a   19d        –b    84    93    100      –b    95    75    79
RS1-6 mm    24c   –a    35cd  22bc       56    –a    78    97       74    –a    78    69
RS2-6 mm    24c   22a   35c   23b        64    100   88    100      85    85    92    70
RS7-3 mm    23cd  –a    31d   20cd       89    –a    97    100      97    –a    98    81
DWRS        19de  14b   21e   14e        54    72    59    100      91    97    100   97
NRS         30b   24a   39b   30a        66    100   90    100      82    80    83    55
SWB         39a   16b   34cd  12e        100   100   100   100      100   100   100   100
CVc (%)     31    30    24    34

Note: Different letters in columns indicate differences by Duncan's multiple range test at the 95% confidence level. The CV is the coefficient of variation determined by the ANOVA model.
a Treatments where the irrigation system was malfunctioning.
b The TORO treatment did not begin until F06.
c Coefficient of variation.
water content occurred when measured VWC was above FC. In the SWB, the soil water is prevented from increasing above FC; therefore, no increase in VWC above FC and prior to saturation occurs in the SWB as it does in soil. As soon as the measured soil water content dropped to FC, the calculated rate of water loss from the soil followed the measured trend in the TDR data. The root mean square error (RMSE) represents the mean distance between the simulated and the measured values of soil water content (Willmott, 1982). The RMSE between the calculated and the measured soil water content values averaged 2.1% VWC across all treatments.

3.3. Soil moisture sensors

During all testing periods the AC 7 treatment applied significantly less water than the calculated irrigation requirement (SWB), except in F06 when this treatment was not operational. The AC 10 treatment applied water similarly to the SWB during all testing periods with the exception of S06, where it applied significantly less water (Table 3). During S06, all of the treatments applied significantly less water per week than the calculated SWB depth (39 mm/week), including the NRS treatment (30 mm/week; P = 0.001). The AC 7 treatment bypassed irrigation more often than the AC 10 treatment due to its lower water content threshold. Seasonal average irrigation adequacy ratings for the AC 10 treatment ranged from 59% (S06) to 99% (F06), compared to the AC 7 treatment, which had adequacy ratings from 47% (S06) to 81% (F07). Both produced at least minimally acceptable average turf quality ratings during all testing periods, but AC 10 had significantly better turf quality ratings during the two dry testing periods (S06 and S07). By the end of S06 the turf quality rating of the AC 7 treatment was below minimally acceptable (4.8) (McCready et al., 2009).
The adequacy ratings for the AC 7 (81%) and the AC 10 (97%) treatments increased in F07 compared to the ratings in S07 due to an increase in cumulative rainfall depth and frequency and a decrease in ETc (Table 2). Seasonal scheduling efficiency ratings were similar for the two treatments, ranging from 83% to 100%. These two treatments were scheduled to apply the same depth of water per irrigation event at the same frequency; the only difference in the amount of water applied was due to how often irrigation was bypassed by the sensor.

3.4. Rain sensors

All of the RS-based treatments, with the exception of DWRS, had similar depths of weekly water applied during the four testing
periods in which they were operational. In S06, these treatments applied significantly less water per week than the calculated depth required, similar to the SMS and NRS treatments. In F06 and F07 the RS-based treatments applied significantly more water than the calculated depth, and in S07 the water applied was similar to the calculated depth. The DWRS treatment applied significantly less water than the calculated depth in all testing periods except F07. In F07, the DWRS treatment applied water similarly to the calculated depth (Table 3).

Overall adequacy ratings ranged from 54% to 100% for the four RS-based treatments during the four testing periods. Similar to the findings for the SMS-based treatments, the RS-based treatments received their lowest adequacy ratings during S06, ranging from 54% to 89% (Table 3). The irrigation adequacy rating for DWRS (54%) was the lowest of the RS treatments for S06; however, the DWRS treatment received the highest turf quality rating (6.5) compared to the other RS-based treatments (Table 3) (McCready et al., 2009). In this case a low adequacy rating was not a predictor of negative plant response. Some level of deficit irrigation (soil water below a MAD level of 50%) based on the soil water balance was acceptable for maintaining adequate turf quality.

The water content of the soil for the RS7-3 mm treatment dropped below the MAD level for an extended period of time in S06 according to the soil water balance (Fig. 2A). The average net irrigation depth applied per event for this treatment during S06 (2.6 mm) was lower than the average depth of ETc,adj (3.1 mm). Irrigation was not deep enough to increase soil water content, but periodic rainfall events kept the soil water content from an increasing deficit (Fig. 2A).
Turf quality for the RS7-3 mm treatment in S06 was acceptable (6.2) even though the soil water content was below the MAD level for 38 continuous days (Fig. 2A). The adequacy rating for RS7-3 mm during this testing period was 89%, which was the highest rating of any treatment in S06 (Table 3). Frequent and small irrigation applications optimized the controller performance measures of irrigation scheduling efficiency and adequacy. Substantial water deficits occurred in the soil water profile between all of the irrigation events of the RS1-6 mm treatment during the S06 testing period based on the soil water balance model (Fig. 2B). This resulted in an irrigation adequacy of 56% and a scheduling efficiency of 74%. In contrast, treatment RS7-3 mm received an irrigation adequacy rating of 89% and a scheduling efficiency score of 97%; however, the weekly depth applied for the RS1-6 mm treatment was similar to the RS7-3 mm treatment (23 mm/week; P = 0.659). Decreased root depth was not observed on the RS7-3 mm treatment due to the small and frequent irrigation events (Shedd et al., 2008).

Fig. 2. Calculated soil water content for (A) RS7-3 mm treatment and (B) RS1-6 mm treatment in S06 plotted along with field capacity (FC), permanent wilting point (PWP) and management allowable depletion (MAD). The bar graph shows the depth of effective irrigation (net irrigation applied that did not lead to drainage), effective rainfall, and deep percolation (DP). Adjusted ETc (ETc,adj) based on the stress coefficient Ks is shown as an output to the soil water balance.

Fig. 3. Irrigation adequacy and scheduling efficiency ratings in running 30-day totals and cumulative rainfall for SMS treatments: (A) AC 7 and (B) AC 10 in S06.

3.5. ET controllers

In F06, the TOROadj treatment applied water similarly to the calculated depth required (2 mm/week less water than the SWB depth; P = 0.176), while in the remaining testing periods TOROadj applied significantly more water than the calculated depth (no TORO controller was in use during S06; Table 3). Seasonal adequacy ratings for the TOROadj treatment ranged from 84% (F06) to 100% (F07) and scheduling efficiency ranged from 75% (S07) to 95% (F06). The TORO treatment maintained acceptable turf quality ratings for all three testing periods (McCready et al., 2009).

3.6. Fluctuations in irrigation adequacy and scheduling efficiency through time

Values of irrigation adequacy and scheduling efficiency were calculated throughout the testing periods for rolling 30-day increments (Fig. 3). Currently, the IA SWAT testing protocol uses only one 30-day value that is deemed acceptable by the controller manufacturer and meets criteria established by the testing protocol. These criteria are a 30-day cumulative rainfall depth of at least 10.2 mm and a cumulative ETo depth of at least 63.5 mm (Irrigation Association, 2008).
During S06, 30-day values of scheduling efficiency decreased as irrigation adequacy increased, as seen in the values of irrigation adequacy and scheduling efficiency for the AC treatments (Fig. 3). Both the S06 and S07 testing periods were relatively dry compared to typical Florida conditions. Most of the treatments had large deficits during S06 due to little rainfall (138 mm) and high values of ETc, leading to low irrigation adequacy values. The total depth of rainfall for May 2006 was only 16 mm (meeting the IA SWAT test protocol's cumulative rainfall requirement), much lower than the historical May average of 102 mm (NOAA, 2006). There was an increase in 30-day irrigation adequacy ratings after May 2006 due to an increase in rainfall (Fig. 3). The 30-day irrigation adequacy ratings for treatments during S06 varied by as little as 11 percentile points (i.e., the difference between two values expressed as percentages; RS7-3 mm) and by as much as 68 percentile points (RS2-6 mm). Rainfall played a large role in the 30-day adequacy ratings. All treatments had fluctuations in 30-day irrigation adequacy ratings during the four seasonal study periods (Fig. 4). The lowest irrigation adequacy rating for the RS2-6 mm treatment was 26% and the highest was 100% (Fig. 4), a difference of 74 percentile points. The treatment with the smallest fluctuation in 30-day irrigation adequacy was RS7-3 mm (Fig. 4). The TOROadj treatment in F06 had an irrigation adequacy rating of 84% for the entire testing period, but during that period its 30-day irrigation adequacy values ranged from 64% to 100%. The RS7-3 mm treatment had consistently high irrigation adequacy and scheduling efficiency ratings when looking at the treatment period averages (Table 3).
This treatment irrigated more frequently with shallower depths than the 1 and 2 days/week treatments, so irrigation rarely caused deep percolation, leading to higher scheduling efficiency scores. In S06, the RS7-3 mm treatment had only 4 mm of irrigation water lost to deep percolation, compared to 36 mm for RS1-6 mm and 22 mm for RS2-6 mm, based on the soil water balance. During all testing periods, the 30-day scheduling efficiency for the RS1-6 mm treatment ranged from 40% to 100%, a range of 60 percentile points (Fig. 4). The ranges in 30-day scheduling efficiency ratings for the 2 and 7 days/week treatments (44 and 37 percentile points) were smaller than that of the 1 day/week treatment.

Fig. 4. The maximum and minimum 30-day (A) scheduling efficiency and (B) irrigation adequacy values by treatment. Data include all 30-day periods during S06, F06, S07 and F07.

The treatment with the largest difference among 30-day scheduling efficiency ratings was AC 7 (Fig. 4). The lowest 30-day scheduling efficiency rating for this treatment was 1% during F07 and the highest was 100% during S06, S07 and F07 (the treatment was not operational in F06). During the 30-day period in which the scheduling efficiency for the AC 7 treatment was 1% (15 September 2007 to 14 October 2007), the treatment irrigated only once (net irrigation depth of 9.9 mm). The irrigation event occurred prior to, but on the same day as, a 20 mm rainfall event. Within the daily time step of the soil water balance, rainfall is used to refill the root zone first, followed by irrigation; therefore the 20 mm rainfall event filled the root zone, and 9.7 mm of the irrigation applied was lost to deep percolation. During the same 30-day period, irrigation adequacy for the AC 7 treatment was 79%. This means that rainfall during the 30-day period was sufficient to maintain an adequacy of 79%, since 99% of the irrigation water applied was lost to deep percolation according to the soil water balance.

Irrigation adequacy increased during periods with greater rainfall. While some treatments maintained adequate irrigation during 30-day periods with low rainfall, no treatments produced low irrigation adequacy ratings (<75%) during periods with large depths of rainfall (>145 mm). As rainfall increased, such as in F07 compared to the other testing periods, the minimum adequacy ratings (considering all treatments) increased.

3.7. Irrigation adequacy and turfgrass quality

Low 30-day irrigation adequacy ratings did not indicate poor turf quality for all treatments. For example, the minimum 30-day irrigation adequacy for the DWRS treatment in S06 was 24%, but the average turf quality for the testing period was 6.5 (McCready et al., 2009).
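The daily time-step ordering described above, in which rainfall refills the root zone before irrigation and any excess above field capacity drains as deep percolation, can be sketched as follows. This is a simplified illustration only: the function, its parameter names, and the linear Ks stress form (after Allen et al., 1998) are our own assumptions, not the exact model of this study.

```python
# Sketch of one daily time step of a soil water balance (SWB) of the kind
# used in this study. Rainfall refills the root zone first, then irrigation;
# water above field capacity (FC) is counted as deep percolation (DP).
# All quantities are depths in mm; names and the Ks form are illustrative.

def swb_day(theta, rain, irrig, etc, fc, pwp, mad=0.5):
    """Advance root-zone soil water `theta` by one day; return (theta, DP)."""
    dp = 0.0
    # Rainfall is applied before irrigation, so a same-day storm can leave
    # no room for the irrigation event (the AC 7 case in the text).
    for depth in (rain, irrig):
        theta += depth
        if theta > fc:
            dp += theta - fc
            theta = fc
    # ETc is reduced by a stress coefficient Ks once soil water drops below
    # the readily-available threshold (common linear form, Allen et al., 1998).
    raw = fc - mad * (fc - pwp)          # threshold at MAD depletion
    ks = 1.0 if theta >= raw else max(0.0, (theta - pwp) / (raw - pwp))
    theta = max(pwp, theta - ks * etc)
    return theta, dp

# Hypothetical root zone (FC = 60 mm, PWP = 20 mm) near field capacity:
# a 20 mm rain plus a same-day 10 mm irrigation loses most water to DP.
print(swb_day(theta=55.0, rain=20.0, irrig=10.0, etc=5.0, fc=60.0, pwp=20.0))
```

With these placeholder numbers, 25 of the 30 mm applied becomes deep percolation, mirroring why the AC 7 treatment's 30-day scheduling efficiency collapsed to 1% despite an adequacy of 79%.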
The average irrigation adequacy rating for the DWRS treatment in S06 was 54%, similar to the 56% of the RS1-6 mm treatment, but turf quality was significantly lower for the RS1-6 mm treatment (5.3) than for DWRS (P = 0.008; McCready et al., 2009). The difference in turf quality between the two treatments was most likely due to the difference in irrigation frequency. The DWRS treatment had the lowest average irrigation adequacy rating for S06 (54%), F06 (72%), and S07 (59%). Turf quality for the DWRS treatment during these three testing periods was between 6.5 and 7.1 (McCready et al., 2009). An irrigation adequacy rating of 100% proved unnecessary for maintaining good turf quality during all testing periods. Most treatments had acceptable turf quality in the S06 testing period, even treatments with adequacy values as low as 54% (DWRS). This may be due to the assumed MAD value of 50% in the soil water balance. Based on Shedd et al. (2008), turfgrass at the testing site began to show increased signs of stress (large areas of wilted turfgrass) when the soil water content reached 7% VWC, which corresponds to a MAD value of 60%. This means that under actual testing, more soil water depletion was acceptable for plant quality than the MAD assumption allowed. Using a MAD value of 60% in the soil water balance model increased adequacy ratings by 0–7 percentile points for all treatments in S06; across all treatment periods, adequacy ratings increased by 0–9 percentile points. Taken further, with a MAD value of 70%, average adequacy ratings increased by 0–17 percentile points over the four testing periods. In S07, the DWRS treatment received an adequacy rating of 59% (MAD value of 50%) and had a turf quality rating of 6.9 (McCready et al., 2009). With a MAD value of 60% the adequacy rating was 65%, and with a MAD value of 70% it increased to 72%.
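The MAD-to-soil-water conversion underlying this sensitivity analysis is a simple linear interpolation between field capacity and permanent wilting point. In the sketch below, the FC and PWP volumetric water contents (VWC) are hypothetical placeholders chosen only so that 7% VWC falls near a 60% depletion fraction; they are not the study's measured soil properties.

```python
# Sketch: converting between a management allowable depletion (MAD)
# fraction and a volumetric-water-content (VWC) trigger. FC and PWP
# values below are hypothetical, not the study's soils data.

def vwc_at_mad(fc_vwc, pwp_vwc, mad):
    """VWC (%) at which a given depletion fraction `mad` is reached."""
    return fc_vwc - mad * (fc_vwc - pwp_vwc)

def mad_at_vwc(fc_vwc, pwp_vwc, vwc):
    """Depletion fraction corresponding to an observed VWC (%)."""
    return (fc_vwc - vwc) / (fc_vwc - pwp_vwc)

fc, pwp = 12.0, 3.7                    # hypothetical VWC (%) at FC and PWP
print(vwc_at_mad(fc, pwp, 0.50))       # VWC trigger for the assumed MAD of 50%
print(mad_at_vwc(fc, pwp, 7.0))        # depletion fraction at 7% VWC (~0.60)
```

Raising MAD lowers the VWC trigger, so fewer days fall below the allowable level and 30-day adequacy ratings rise, as seen in the 0–17 percentile-point increases reported above.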
4. Conclusions

In general, irrigation adequacy ratings for all treatments improved in the fall months compared to the spring months, due to a decrease in ETc and, in the case of F07, an increase in rainfall. Also, low irrigation adequacy ratings based on the SWB were not always an indicator of lower turfgrass quality. This was seen in the DWRS treatment, which received 54% and 59% irrigation adequacy ratings during S06 and S07, respectively; turf quality during these testing periods was nevertheless always similar to that of a time-based treatment with no control device, which was scheduled to apply 40% more water per irrigation event than the DWRS treatment. The DWRS, RS7-6 mm and TOROadj treatments generally applied less water per irrigation event than the SMS or other RS-based treatments, and all three had good turf quality ratings throughout testing. Based on the soil water balance, applying less water per irrigation event can increase irrigation scheduling efficiency while providing good turfgrass quality; however, horticultural impacts such as reduced health due to stunted root development are unknown.

Thirty-day irrigation adequacy and scheduling efficiency ratings varied substantially for most treatments during all testing periods. During S06, for example, there were large fluctuations in 30-day irrigation adequacy ratings due to an increase in rainfall after May 2006, with the 30-day ratings of the irrigation treatments fluctuating by between 11 and 68 percentile points. These fluctuations show that an irrigation adequacy rating for a single 30-day period, or one averaged over the entire study, may not accurately represent the change in irrigation adequacy over time. The reported irrigation adequacy of the irrigation scheduled by a controller would depend heavily on which 30-day period was used when testing occurred.
Currently, in the evaluation of irrigation controllers under the Smart Water Application Technology effort of the Irrigation Association, a single irrigation adequacy and scheduling efficiency score is reported for a controller over one 30-day period. Our results make clear that these values tracked over time, and in particular over a period where environmental demands are changing, such as across seasons, are a better representation of controller performance.
There were some limitations in comparing the experimental treatments to the SWB. The SWB was allowed to irrigate whenever soil water dropped below RAW, and it irrigated only to FC. The experimental treatments were scheduled for irrigation on specific days of the week (1, 2 or 7 days/week) and, with the exception of the TOROadj treatment, irrigation depth per event was programmed into the timer. Even with these differences, the soil water balance was a useful tool for scheduling irrigation and for comparison. Some treatments had weekly irrigation depths similar to the SWB, while others applied significantly less water than the SWB suggested and still maintained adequate turf quality. This result suggests that the SWB implemented in this project may not fully capture the physiological response of the turfgrass to soil water deficits. Nevertheless, comparing controller performance against the same soil water balance gives a useful baseline for a controller's ability to properly schedule irrigation while maintaining plant quality.

References

Allen, R.G., Pereira, L.S., Raes, D., Smith, M., 1998. Crop evapotranspiration: guidelines for computing crop water requirements. Irrigation and Drainage Paper No. 56, FAO, Rome, Italy.

Allen, R.G., Walter, I.A., Elliot, R., Howell, T., Itenfisu, D., Jensen, M. (Eds.), 2005. The ASCE Standardized Reference Evapotranspiration Equation. American Society of Civil Engineers Environmental and Water Resources Institute (ASCE-EWRI), 59 pp.

Cardenas-Lailhacar, B., Dukes, M.D., Miller, G.L., 2008. Sensor-based automation of irrigation on bermudagrass during wet weather conditions. Journal of Irrigation and Drainage Engineering 134, 120–128.

Connellan, G.J., 1999. Turfgrass irrigation. In: Aldous, D.E. (Ed.), International Turf Management Handbook. CRC Press, Boca Raton, London, New York, Washington, D.C., pp. 119–138.

Davis, S., Dukes, M.D., Vyapari, S., Miller, G.L., 2007. Evaluation and demonstration of evapotranspiration-based irrigation controllers. In: Proceedings of the ASCE EWRI World Environmental & Water Resources Congress, 15–19 May 2007, Tampa, FL.

Doss, B.D., Ashley, D.A., Bennett, O.L., 1960. Effect of soil moisture regime on root distribution of warm season forage species. Agronomy Journal 52, 569–572.

Dukes, M.D., Haman, D.Z., 2002a. Residential irrigation system rainfall shutoff devices. ABE325, Institute of Food and Agricultural Sciences, University of Florida, Gainesville, FL. Available from: http://edis.ifas.ufl.edu/AE221 (accessed 17.02.09).

Dukes, M.D., Haman, D.Z., 2002b. Operation of residential irrigation controllers. CIR1421, Institute of Food and Agricultural Sciences, University of Florida, Gainesville, FL. Available from: http://edis.ifas.ufl.edu/AE220 (accessed 17.02.09).

Dukes, M.D., 2005. Residential irrigation water use and control. In: Encyclopedia of Water Science. Marcel Dekker, New York, NY, doi:10.1081/E-EWS-120041736.

Haley, M.B., Dukes, M.D., 2007. Evaluation of sensor based residential irrigation water application. ASABE Paper No. 072251, ASABE, St. Joseph, MI.

Irrigation Association, 2005. Landscape Irrigation Scheduling and Water Management. Irrigation Association Water Management Committee, Falls Church, VA.

Irrigation Association, 2008. Smart Water Application Technology (SWAT) turf and landscape irrigation equipment testing protocol for climatologically based controllers, 8th draft testing protocol. The Irrigation Association, Falls Church, VA. Available from: http://irrigation.org/SWAT/swat.aspx?id=302&terms=swat+protocol.

Jia, X., Dukes, M.D., Jacobs, J.M., 2009. Bahiagrass crop coefficients from eddy correlation measurements in Florida. Irrigation Science 28, 5–15.

Mayer, P.W., DeOreo, W.B., Opitz, E.M., Kiefer, J.C., Davis, W.Y., Dziegielewski, B., Nelson, J.O., 1999. Residential End Uses of Water. American Water Works Association Research Foundation, Denver, CO.

McCready, M.S., Dukes, M.D., Miller, G.L., 2009. Water conservation potential of smart irrigation controllers on St. Augustinegrass. Agricultural Water Management 96, 1623–1632.

NOAA (National Oceanic and Atmospheric Administration), 2006. National Climatic Data Center, surface data. Available from: http://www.ncdc.noaa.gov/oa/ncdc.html (accessed 14.12.06).

Peacock, C.H., Dudeck, A.E., 1985. Effect of irrigation interval on St. Augustinegrass rooting. Agronomy Journal 77, 813–815.

Shearman, R.C., Morris, K.N., 1998. NTEP Turfgrass Evaluation Workbook. NTEP Turfgrass Evaluation Workshop, October 17, 1998, Beltsville, MD.

Shedd, M.L., 2008. Irrigation of St. Augustinegrass with soil moisture and evapotranspiration controllers. MS Thesis, Agricultural and Biological Engineering Department, University of Florida, Gainesville, FL.

Shedd, M.L., Dukes, M.D., Miller, G.L., 2008. Effect of irrigation control on St. Augustinegrass quality and root growth. In: Proceedings of the Florida State Horticultural Society Annual Meeting, June 1–4, 2008, Ft. Lauderdale, FL.

SJRWMD (St. Johns River Water Management District), 2010. Watering restrictions. Palatka, FL. Available from: http://sjr.state.fl.us/wateringrestrictions/index.html (accessed 04.04.10).

US EPA (U.S. Environmental Protection Agency), 2010. Draft specification for weather-based irrigation controllers. Available from: http://www.epa.gov/WaterSense/partners/controltech.html (accessed 21.10.10).

Willmott, C.J., 1982. Some comments on the evaluation of model performance. Bulletin of the American Meteorological Society 63, 1309–1313.