Rohit Chatterjee
New Member
The Fourier transform of a function exists (in the classical sense) only if it satisfies the Dirichlet conditions, one of which states that the function must be absolutely integrable over one time period. Now consider the unit step function
u(t) = 1, t >= 0
       0, t < 0
u(t) is an aperiodic function, so T tends to infinity, which implies that u(t) is not absolutely integrable over one time period. But the Fourier transform of u(t) exists nonetheless! Please explain this apparent anomaly.
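(To make the puzzle concrete, here is a small numerical sketch. It assumes one standard way of assigning u(t) a transform, namely treating it as the limit of the absolutely integrable function e^(-a t) u(t) as a -> 0+; the function name and parameter values below are just illustrative, not from any particular textbook.)

```python
# The unit step fails the absolute-integrability Dirichlet condition,
# yet a transform can still be assigned to it as a limit.
# Sketch: the decaying step e^(-a t) u(t) IS absolutely integrable for
# a > 0, and its Fourier transform is 1/(a + j*omega).  As a -> 0+,
# this tends to 1/(j*omega) for omega != 0 (the full distributional
# answer also picks up a pi*delta(omega) term at omega = 0).

def ft_decaying_step(a, omega):
    """Analytic Fourier transform of e^(-a t) u(t): 1/(a + j*omega)."""
    return 1.0 / (a + 1j * omega)

omega = 2.0
for a in (1.0, 0.1, 0.001):
    # Watch the values converge toward 1/(j*omega) = -0.5j as a shrinks.
    print(a, ft_decaying_step(a, omega))
```

So the "anomaly" in the question comes down to which sense of Fourier transform is meant: the classical integral does not converge for u(t), but the limiting (distributional) transform is well defined.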