Hi,
Basic integration in time is done by accumulating the input times the time increment.
So if you take samples every 0.1 second, the time increment is 0.1 second: multiply each sample by 0.1 and add the product to the running total:
sum = sum + dt * v[k]
where
dt is the time increment,
v[k] is the sample measurement at step k.
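For example, here is a minimal Python sketch of that accumulation (the dt and v values are made-up placeholders, not anything from your data):

# Accumulate sample * time increment (rectangle rule).
dt = 0.1                       # time increment in seconds (placeholder)
v = [0.0, 0.5, 1.0, 1.5, 2.0]  # sample measurements (placeholder)
total = 0.0
for k in range(len(v)):
    total = total + dt * v[k]  # sum = sum + dt * v[k]
print(total)                   # approximate integral over the sampled interval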
The basic first derivative in time can be calculated by taking the difference of two successive samples and dividing by the time increment:
dv/dt[k] = (v[k+1] - v[k]) / dt
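A quick Python sketch of that forward difference (same placeholder data):

# Forward difference: needs sample k+1, so it stops one short of the end.
dt = 0.1                       # time increment in seconds (placeholder)
v = [0.0, 0.5, 1.0, 1.5, 2.0]  # sample measurements (placeholder)
dvdt = [(v[k+1] - v[k]) / dt for k in range(len(v) - 1)]
print(dvdt)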
A slightly more accurate way is to take the samples at k-1 and at k+1:
dv/dt[k] = (v[k+1] - v[k-1]) / (2 * dt)
where you might note that we get the derivative for sample k by using samples k-1 and k+1 rather than sample k itself. It's sometimes referred to as the "central means" derivative, though the more common name is the central difference.
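In Python that looks like this (placeholder data again); note it needs a neighbour on each side, so it can't produce values for the first and last samples:

# Central difference: uses samples k-1 and k+1, so k runs over the interior points.
dt = 0.1                       # time increment in seconds (placeholder)
v = [0.0, 0.5, 1.0, 1.5, 2.0]  # sample measurements (placeholder)
dvdt = [(v[k+1] - v[k-1]) / (2 * dt) for k in range(1, len(v) - 1)]
print(dvdt)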
There are a lot of different numerical methods for different tasks, so you might look into this more on the web or ask more questions here.