I would conclude that there is no definite correlation between the three series of data, within the limits of measurement accuracy.
I speculate that the apparently consistent drop in the water pressure sensor readings might be due to some gradual 'settling-in' process, e.g. a permanent slight deformation of some part of the sensor. Repeating the measurements over a longer time period might help prove or disprove that theory.
I did a quick search and came up with these: **broken link removed**
They seem quite impressive. You don't have to measure the total depth, just the depth at the sensor. The quoted resolution is 0.000012% FS (23 bits, unipolar). That seems phenomenal to me.
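As a quick sanity check on that quoted figure (a sketch; the 23-bit unipolar interpretation is taken from the spec line above):

```python
# A 23-bit unipolar converter divides full scale into 2**23 counts,
# so one count corresponds to 1/2**23 of full scale.
bits = 23
step_fraction = 1 / 2**bits          # fraction of full scale per count
step_percent = step_fraction * 100   # expressed as % of full scale

print(f"{step_percent:.6f}% FS")     # prints 0.000012% FS
```

That works out to about 0.0000119% FS, which rounds to the 0.000012% figure in the datasheet.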
A reminder to take quantization error into account, which basically means: pick a sensor whose full-scale range is close to the maximum value you expect to measure.
One way to illustrate quantization error: suppose the thing you're measuring spans 100 units and the sensor resolves it in steps of 1 unit. A reading of 1 unit carries an error of ±100%, a reading of 2 units ±50%, and so on; by 99 units the error is down to about 1%.
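The worked numbers above can be sketched like this (a hypothetical helper; it takes the worst-case error as one full step, as the example does, rather than the ±1/2-step convention):

```python
def quantization_error_pct(reading: float, step: float = 1.0) -> float:
    """Worst-case relative error (%) for a value quantized in steps of `step`.

    The absolute error is at most one step, so the relative error
    shrinks as the reading grows toward full scale.
    """
    return 100.0 * step / reading

print(quantization_error_pct(1))    # 100.0  -> +-100% at 1 unit
print(quantization_error_pct(2))    # 50.0   -> +-50% at 2 units
print(quantization_error_pct(99))   # ~1.01  -> roughly 1% at 99 units
```

This is why a full-scale range far larger than the signal wastes most of the sensor's resolution.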
It can log every 15 minutes for 1 year on a single battery.
It seems the total available water depth (water surface to dam base or bottom) could be determined by subtracting the measured surface height from that known total depth. That way all the sensing happens above (and out of) the water, rather than in it.
If you can determine an arbitrary "normal dam full" (NDF) water height, any surface level variation detected by the sensor can be added to or subtracted from that NDF value if using an "Ultrasonic, Non-contact water level sensor". Many, many examples on the web. It might take two of them: one for 0 to 10 m and another for 10 to 60 m.
Nothing in the water to fail, barometric pressure irrelevant, vastly increased accuracy.
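The NDF bookkeeping above amounts to simple arithmetic. A minimal sketch, assuming the ultrasonic sensor is mounted at a fixed, known height above the dam base and reports the air gap down to the water surface (all names and the example numbers are illustrative, not from any particular sensor):

```python
def water_level_m(sensor_height_m: float, air_gap_m: float) -> float:
    """Water depth above the dam base: mounting height minus measured air gap."""
    return sensor_height_m - air_gap_m

def deviation_from_ndf_m(level_m: float, ndf_m: float) -> float:
    """Signed offset from the 'normal dam full' (NDF) reference height."""
    return level_m - ndf_m

# Example: sensor mounted 62 m above the base, 5 m of air to the
# surface, NDF reference set at 55 m.
level = water_level_m(62.0, 5.0)           # 57.0 m of water
print(deviation_from_ndf_m(level, 55.0))   # prints 2.0 (2 m above NDF)
```

One design note: the sensor's mounting height and the NDF value are fixed calibration constants, so only the air-gap measurement contributes ongoing error.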