Fourier Series

The Fourier transform of a function exists only if it satisfies the Dirichlet conditions, one of which states that the function must be absolutely integrable over one time period. Now consider the unit step function
u(t) = 1, t >= 0
u(t) = 0, t < 0
u(t) is an aperiodic function, so T tends to infinity, which implies that u(t) is not absolutely integrable over one time period. But the Fourier transform of u(t) exists nonetheless! Please explain this apparent anomaly.
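
For reference, here is a minimal sketch of the transform in question, assuming the usual limiting route of treating u(t) as the limit of the absolutely integrable function e^(-at)u(t) as a tends to 0 from above; the result holds in the distributional sense rather than as an ordinary convergent integral:

% Transform of the damped step, which is absolutely integrable for a > 0
\[
  \mathcal{F}\{e^{-at}u(t)\}
    = \int_{0}^{\infty} e^{-(a + j\omega)t}\,dt
    = \frac{1}{a + j\omega}, \qquad a > 0.
\]
% Letting a -> 0+: the real part a/(a^2 + \omega^2) tends to \pi\delta(\omega)
% as a distribution, and the imaginary part tends to the principal value of 1/(j\omega).
\[
  \mathcal{F}\{u(t)\} = \pi\,\delta(\omega) + \frac{1}{j\omega}.
\]

Note that the result contains a Dirac delta, i.e. the transform exists only as a generalized function, not as an ordinary integral that the Dirichlet conditions guarantee.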
 