Do ears alias?

Status
Not open for further replies.
I don't think so, because the ear acts like a filter that simply cannot pass frequencies outside its bandwidth. It can hear roughly 20 Hz–20 kHz fine (right?), so no aliasing, I would assume. If I fed your ear 40 kHz, sure it would alias... if your ear could detect it in the first place, which it can't.
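As a side note on where that 40 kHz tone *would* land if something did sample it: the folded frequency is easy to compute. A quick sketch (the 48 kHz sample rate is just an illustrative number, not anything about the ear):

```python
def alias_frequency(f_in, fs):
    """Fold an input tone into the 0..fs/2 band it would appear in
    after sampling at fs (the classic frequency-folding formula)."""
    f = f_in % fs
    return min(f, fs - f)

print(alias_frequency(40_000, 48_000))  # 8000: a 40 kHz tone sampled
                                        # at 48 kHz shows up at 8 kHz
print(alias_frequency(20_000, 48_000))  # 20000: in-band tones pass unchanged
```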

Ears only deal with things that have a single frequency characteristic, a single "dimension", so to speak (they can be graphed properly in 2D, more so than light anyway). I'm not sure how to word it. Eyes, for example, have both the frequency of the light itself and the frequency at which the light's magnitude is changing (flashing, say). It might be flashing at 10 Hz, while the frequency of the light itself is 600 THz. With ears you just have the frequency of the sound waves.
 
The assumption here is that there is some sort of sampling going on. I doubt the ears sample; they act like a filter, as stated earlier. If anything samples, it's likely the brain, but if there's a prefilter ahead of whatever does the sampling, then aliasing is unlikely to occur.

I don't understand the anatomy, so this is just a best guess. But for aliasing to occur there has to be sampling. I am curious, however, how the eyes alias. What is it they alias? What is their sampling rate, and what do you define as aliasing?

Are they aliasing in spatial frequency? Are they aliasing spatial objects? Certainly they don't have very fine resolution, otherwise they'd act like a microscope, but coarse resolution doesn't imply aliasing.
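For what it's worth, here's what aliasing looks like when something genuinely does sample. A NumPy sketch (the rates are arbitrary, just picked so the numbers come out clean):

```python
import numpy as np

fs = 48_000                          # sample rate (Hz)
t = np.arange(fs) / fs               # one second of sample instants
x = np.sin(2 * np.pi * 40_000 * t)   # 40 kHz tone, well above Nyquist (24 kHz)

# with exactly 1 s of data the rfft bins are spaced 1 Hz apart,
# so the peak bin index is the apparent frequency in Hz
spectrum = np.abs(np.fft.rfft(x))
print(np.argmax(spectrum))           # 8000: the tone masquerades as 8 kHz
```

Without a sampler in the chain there's no mechanism for this folding to happen, which is the crux of the question.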
 
Eyes don't really sample either. Your eyes don't really see discrete images. It's a strange realm of analog blurred with the discrete. Biology works in strange ways that don't truly match the words we use to describe it.

I was about to use this analogy to show that the eye doesn't really sample: shine alternating black and white into the eye, and if I synced it just right, your eye would only see one colour, either black or white (such that the other colour only appeared in between the samples the eye took). Well, we know this isn't true (right?), but we also know you'd always see white and not black if we flashed these colours, because light overrides dark for the eye. If I flashed a lot of black and a little white, you'd still see the white. But if I flashed a lot of white and a little black, you wouldn't see the black.
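The "synced just right" thought experiment is exactly stroboscopic sampling. If the eye really were an ideal sampler locked to the flash rate, every sample would land at the same point in the flash cycle. A toy sketch (the rates are made up; integer arithmetic avoids floating-point edge cases at the flash boundaries):

```python
# Hypothetical model: the eye as an ideal sampler locked to the flash rate.
fs = 10        # "samples" per second (made-up number)
f_flash = 10   # black/white alternations per second, synced to the sampler

# phase of the flash cycle at each sample instant n/fs,
# expressed as the numerator of the fraction (n * f_flash / fs) mod 1
phases = {(n * f_flash) % fs for n in range(100)}
print(phases)  # {0}: every sample lands at the same point in the cycle,
               # so a true sampler really would see only one colour
```

The fact that real eyes don't behave this way is the point of the analogy.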

This reminded me of an article I read a while ago which you might find interesting. I think it also shows, to some extent, that the eye doesn't really sample digitally.

https://www.100fps.com/how_many_frames_can_humans_see.htm
 
I didn't think they sampled, which makes aliasing as we understand it from DSP impossible. But I don't know much about anatomy, so I wasn't sure.
 
Plus there's also the problem of human perception, which is hard to sift out when trying to differentiate what the eye detects from what we see.
 
The inner ear acts as a low-pass filter; if it's bypassed and the sound is fed directly to the auditory nerve, sounds up to 200 kHz can be detected - 10 times the bandwidth of a young ear!
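That matches the anti-aliasing picture from DSP: if a low-pass stage sits in front of whatever samples, out-of-band tones never reach it, so there is nothing left to fold down. A quick NumPy sketch with an idealised brick-wall filter (the FFT-zeroing trick and the specific rates are just for illustration, not a real filter design):

```python
import numpy as np

fs_in = 96_000                       # high-rate "analog" stand-in
t = np.arange(fs_in) / fs_in         # one second of signal
x = np.sin(2 * np.pi * 40_000 * t)   # 40 kHz tone

# idealised brick-wall low-pass at 20 kHz (bins are 1 Hz apart for 1 s of data)
X = np.fft.rfft(x)
X[20_000:] = 0
x_filtered = np.fft.irfft(X)

y = x_filtered[::2]                  # "sample" at 48 kHz
print(np.abs(np.fft.rfft(y)).max() < 1e-6)  # True: the tone was removed
                                            # before sampling, so no alias
```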

The eyes are different: the frequency response, resolution and colour depth all depend on where the image is in the field of vision. There are more cone cells in the middle of the retina, so the colour depth and resolution are higher in the middle, and there are virtually no cones and all rods at the edge of the retina, so colour perception and resolution are poor there. Rods have a faster response time than cones, which is why it's often possible to see monitors and fluorescent tubes flicker out of the corner of one's eye, which isn't possible whilst looking directly at them.
 
Photodetectors in the eye can actually detect single photons. However, the brain automatically filters these out, so at least 5 photons or thereabouts are needed to perceive light. Otherwise, our eyesight would be a lot noisier.
 
