Hi again,
I think I remember the solution now, but you'll need Z transforms for this
to make any sense. I hope you've had at least an introduction to them; if
not, it will have to wait for a later time.
Here is the rationale:
In a sampled system, multiplying by z^-1 can be interpreted as delaying the
signal by one sample, so f(t)*z^-1 stands for f(t-1). Since in the problem we
have to subtract from f(t) its own version delayed by exactly 1, we can write
the sampled-system expression like this:
f(t)*1-f(t)*z^-1
or simply:
f(t)*(1-z^-1)
(we also need to integrate BTW).
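As a quick sanity check (a sketch of my own in Python, not part of the original problem), here is the delay-and-subtract operation on a sampled signal: multiplying by z^-1 is just shifting the samples by one step, so applying (1-z^-1) gives the first difference.

```python
# Sketch: (1 - z^-1) applied to a sampled signal is the first difference.
# The test function f(t) = t^2 and the sample times are my own choices.

def first_difference(samples):
    """Apply (1 - z^-1): subtract each sample's one-step-delayed version."""
    # samples[n] - samples[n-1]; the first sample has no delayed partner,
    # so the output starts at n = 1.
    return [samples[n] - samples[n - 1] for n in range(1, len(samples))]

f = [t * t for t in range(6)]      # f(t) = t^2 sampled at t = 0..5
print(first_difference(f))         # [1, 3, 5, 7, 9]
```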
Now if we find the continuous time equivalent of this, I bet you can already
guess what it is (he he):
Intg f(t)*(1-z^-1) dt = Intg[t-1 to t] f(t) dt
In other words, in the sampled time domain, when we subtract a function
delayed by 1 from the original function we get f(t)*(1-z^-1), which (after an
integration) is equivalent to Intg[t-1 to t] f(t) dt in the continuous time
domain.
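One way to see why the integral form matches: differentiating G(t) = Intg[t-1 to t] f(τ) dτ gives exactly f(t) - f(t-1), by the fundamental theorem of calculus applied at both endpoints — so the sliding integral is an antiderivative of the delayed-subtraction expression. A quick numerical check in Python (the test function f(t) = t^2 and the point t = 2.5 are my own choices):

```python
# Check numerically that d/dt of G(t) = integral of f over [t-1, t]
# equals f(t) - f(t-1).  f here is an arbitrary test function.

def f(t):
    return t * t

def G(t, steps=10000):
    """Trapezoidal approximation of the integral of f over [t-1, t]."""
    h = 1.0 / steps
    total = 0.5 * (f(t - 1) + f(t))
    for k in range(1, steps):
        total += f(t - 1 + k * h)
    return total * h

t = 2.5
eps = 1e-5
dG = (G(t + eps) - G(t - eps)) / (2 * eps)   # central-difference derivative
print(dG, f(t) - f(t - 1))                   # both close to 4.0
```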
So this should prove that the integral shown in the problem is correct.
It's been a while since I looked at these theoretical topics, so there may be
limits on the range or type of functions this works for, but I would bet that
linear functions work with this idea.
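For a concrete linear case (my own example, not from the problem), take f(t) = a*t + b. Working the antiderivative by hand, Intg[t-1 to t] (a*τ + b) dτ = a*t - a/2 + b, and its derivative is the constant a, which is exactly f(t) - f(t-1):

```python
# Verify the linear case f(t) = a*t + b:
# integral over [t-1, t] of (a*tau + b) dtau = a*t - a/2 + b,
# whose derivative is the constant a = f(t) - f(t-1).
# The coefficients a = 2, b = 3 are arbitrary illustration values.

a, b = 2.0, 3.0

def f(t):
    return a * t + b

def sliding_integral(t):
    # Exact value via the antiderivative F(tau) = a*tau^2/2 + b*tau.
    F = lambda tau: a * tau**2 / 2 + b * tau
    return F(t) - F(t - 1)

for t in (0.0, 1.0, 2.5, 10.0):
    # The sliding integral equals a*t - a/2 + b ...
    assert abs(sliding_integral(t) - (a * t - a / 2 + b)) < 1e-9
    # ... and the delayed subtraction f(t) - f(t-1) equals its slope a.
    assert abs((f(t) - f(t - 1)) - a) < 1e-9
print("linear case checks out")
```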