The objective of irrigation management is to use water profitably, while maintaining yield potential at sustainable production levels. Irrigation needs to be applied only when soil moisture measurements and crop growth stage warrant water inputs, and with a method that limits any waste or excess.
Evapotranspiration (ET) soil water estimates should be periodically validated using soil moisture sensors. Comparing predicted soil moisture with actual soil moisture measurements every couple of weeks is a good strategy to confirm the ET-based scheduling method is accurate.
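The biweekly check described above can be sketched as a simple drift test: if the ET-based water balance disagrees with the sensor reading by more than some tolerance, reset the running balance to the measured value. The numbers and the 0.3-inch tolerance below are hypothetical, not recommendations.

```python
# Sketch: periodic check of an ET-based soil water balance against sensor
# readings. All values are hypothetical; units are inches of plant-available
# water remaining in the root zone.

def et_balance_drift(predicted, measured, tolerance=0.3):
    """Return True if the ET estimate has drifted beyond the tolerance
    and should be reset to the sensor-measured value."""
    return abs(predicted - measured) > tolerance

# Biweekly check: the ET model predicts 2.1 in. remaining; sensors read 1.6 in.
predicted_aw = 2.1
measured_aw = 1.6
if et_balance_drift(predicted_aw, measured_aw):
    predicted_aw = measured_aw  # reset the running balance to the field value
```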
Research indicates that soil moisture sensors could be used as an irrigation tool for saving water.1 However, procedures for proper installation, calibration, and maintenance of these devices are critical for the success of soil moisture sensor-based irrigation scheduling.
Soil moisture sensors can be placed at differing depths in the crop root zone to provide a direct measure of changes in soil water content. This can help determine when irrigation might be needed.
Examples of common soil moisture sensing instruments are listed in Table 1 (page 2).
Sensors must be in direct contact with undisturbed soil to provide accurate readings. During installation, damage to roots and soil structure should be minimized, and air voids, large roots, rocks, and other obstructions should be avoided. All soil moisture sensors should be calibrated in the field for the specific soil type, even if the manufacturer suggests otherwise. Laboratory calibrations are often performed on re-packed soils, where tight soil-to-access-tube contact is ensured and variability in the soil is minimized. These laboratory measurements may not be transferable to your field, particularly for capacitance or other frequency domain reflectometry (FDR) sensors. Field calibration can provide more accurate readings because the sensor is calibrated in the actual soil to be measured.
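One common field-calibration approach is to fit a linear relation between the sensor's raw output and volumetric water content (VWC) determined from co-located gravimetric soil samples. The sketch below uses a closed-form least-squares fit; the paired raw-count and VWC values are hypothetical, and a real calibration would use samples spanning the full wet-to-dry range.

```python
# Sketch of a field calibration for a capacitance-type sensor: fit a linear
# relation between raw sensor counts and volumetric water content (VWC)
# measured on co-located gravimetric samples. Paired values are hypothetical.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

raw_counts = [410, 520, 640, 760]        # sensor output at sampling times
vwc_samples = [0.12, 0.19, 0.27, 0.34]   # gravimetric VWC (cm3/cm3)

slope, intercept = fit_linear(raw_counts, vwc_samples)

def counts_to_vwc(counts):
    """Convert a raw sensor reading to VWC using the field calibration."""
    return slope * counts + intercept
```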
For representative readings, measurements should be taken from representative soil types, within the active crop root zone, and away from high spots, depressions where water may collect, and slope changes. Soil water content may vary greatly within a field, so multiple measurements throughout the field and at appropriate depths will be required to reduce errors. A greater number of readings will be needed with soil moisture sensors that measure smaller volumes of soil, such as with capacitance probes.2
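One way to judge whether the number of sampling sites is adequate is to compare the spread of readings across the field: the sketch below averages several hypothetical VWC readings and flags high relative variability (coefficient of variation). The 15% threshold is an illustrative assumption, not a published guideline.

```python
# Sketch: averaging multiple sensor readings across a field and using the
# coefficient of variation (CV) to judge whether more sampling sites are
# needed. Readings (VWC, cm3/cm3) are hypothetical.
from statistics import mean, stdev

readings = [0.21, 0.24, 0.19, 0.23, 0.22]  # one reading per sampling site

avg = mean(readings)
cv = stdev(readings) / avg  # relative variability among sites

# Hypothetical threshold; variability tends to grow as the soil dries,
# so drier fields will generally need more sites to stay under it.
needs_more_sites = cv > 0.15
```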
When choosing locations in the field to measure soil water, it may be more cost effective and efficient to take measurements in areas where soil and plant properties are most representative of the field. The depth to which a particular method can measure soil moisture, and its resolution at increasing depths, should also be considered. Ideally for irrigation scheduling, soil water should be measured well below the maximum depth of root water extraction, though this is not attainable with most soil water measurement methods. Conventional time domain reflectometry (TDR) is one method that allows the flexibility of deeper measurements.
Typical installations include one or more sensors for each foot of active rooting depth. The variability among soil samples increases as soils dry, indicating that more measurements will be needed for accuracy as soil water content reaches the Management Allowable Depletion (MAD).3 The latter refers to the amount of water allowed to be depleted from the root zone before irrigation is scheduled. The MAD is usually given as a percentage of maximum water-holding capacity of the soil. At the time of irrigation, the soil water deficit should be less than or equal to MAD.
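The MAD trigger described above can be put into numbers. The sketch below assumes a hypothetical soil holding 2.0 inches of plant-available water per foot, a 3-foot active root zone, and a 50% MAD; none of these figures come from the text.

```python
# Worked sketch of the MAD irrigation trigger. All numbers are hypothetical.

capacity_per_ft = 2.0   # in. of plant-available water per foot of soil
root_depth_ft = 3.0     # active root zone depth
mad_fraction = 0.50     # Management Allowable Depletion (50%)

total_capacity = capacity_per_ft * root_depth_ft  # 6.0 in. in the root zone
mad_depletion = mad_fraction * total_capacity     # trigger at 3.0 in. depleted

current_deficit = 3.2   # in. depleted from the root zone (e.g., from sensors)

# Schedule irrigation once the deficit reaches the MAD threshold, so that
# at the time of irrigation the deficit does not greatly exceed MAD.
irrigate_now = current_deficit >= mad_depletion
```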
For further information on overall irrigation management and soil moisture sensors, consider reviewing the Irrigation Handbook for the Great Plains at https://monsanto.com/app/uploads/2017/06/irrigation-handbook-for-the-great-plains.pdf