Unnecessary irrigation wastes huge amounts of water – some crops are watered twice as much as they need – and contributes to the pollution of aquifers, lakes and oceans. With this in mind, new Cornell research shows that a predictive model combining information about plant physiology, real-time soil conditions and weather forecasts can support more informed decisions about when and how much to irrigate, saving 40% of the water consumed by more traditional methods.
The researchers’ method uses historical weather data and machine learning (one of the main digital transformation trends in agriculture) to assess the uncertainty of the real-time weather forecast, as well as the uncertainty in how much water will be lost to the atmosphere from leaves and soil. These uncertainty estimates are then combined with a physical model that describes variations in soil moisture.
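The article does not detail the researchers' actual model, but the idea it describes – propagating forecast uncertainty through a physical soil-moisture model to drive irrigation decisions – can be sketched with a toy Monte Carlo example. The bucket model, the thresholds and the forecast numbers below are illustrative assumptions, not the Cornell implementation:

```python
import random

def simulate_moisture(current, rain_mean, rain_sd, et_mean, et_sd,
                      days=3, n=1000, seed=0):
    """Monte Carlo sketch: sample uncertain rain and evapotranspiration
    (ET) forecasts and propagate them through a simple bucket-style
    soil water balance. Moisture is a fraction of field capacity."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        m = current
        for d in range(days):
            rain = max(0.0, rng.gauss(rain_mean[d], rain_sd[d]))
            et = max(0.0, rng.gauss(et_mean[d], et_sd[d]))
            m = min(1.0, m + rain - et)  # bucket cannot exceed capacity
        outcomes.append(m)
    return outcomes

def should_irrigate(outcomes, stress_level=0.2, acceptable_risk=0.1):
    """Irrigate only when the simulated chance of soil moisture
    dropping below the stress threshold exceeds the acceptable risk."""
    p_stress = sum(m < stress_level for m in outcomes) / len(outcomes)
    return p_stress > acceptable_risk

# Hypothetical 3-day forecast: almost no rain expected, high ET.
outcomes = simulate_moisture(
    current=0.35,
    rain_mean=[0.00, 0.02, 0.00], rain_sd=[0.01, 0.02, 0.01],
    et_mean=[0.06, 0.07, 0.06], et_sd=[0.01, 0.01, 0.01],
)
print(should_irrigate(outcomes))
```

The design point this illustrates is that the decision is probabilistic: instead of irrigating on a fixed schedule, water is applied only when the forecast-driven risk of crop stress is too high, which is where the potential water savings come from.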
Controlling plant moisture precisely could be a catalyst for improving the quality of sensitive specialty crops such as wine grapes.
According to the research, integrating these approaches makes watering decisions much more precise.
The main challenge of this research is identifying the best tailored method for each crop and determining the costs and benefits of switching from a human-operated system to an automated one.