Before the advent of mobile phones, I had the pleasure of working on a state-of-the-art digital network with ISDN full digital services and 1.544 Mbps full duplex to the home in '79.
I learnt that such filters were necessary so that global phone calls could sound like the person was just down the street, with >75 dB SNR audio quality using 8-bit log ADC/DAC. No hiss, hum, pops or distortion.
These days cell phones use DSP firmware for filtering to optimise bandwidth efficiency, but they can sound terrible at times.
But in those days, these chips were cheaper and necessary. However, if the OP is just sampling DC, who cares?
For those who do care, I have another example of a digital anti-aliasing filter from the late '70s, on another project (what we would classify as SCADA today).
I just wanted to sample "DC" from a DC motor current sensor and send it to a remote location in a continuous synchronous digital stream. But all the ADC channel words were used up except for a message word, so I allocated 1 bit for the DC motor current. I then used a voltage-controlled timer to generate pulses from 0 to 1 kHz, spanning zero to full scale of the current-shunt signal. At the receiving end, the logic-1 tach pulses drove an analog edge meter so the operator could see the servo-motor current, in case of a jam.
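A minimal sketch of that encoding, assuming a linear 0 to 1 kHz mapping (FULL_SCALE_AMPS and the function names are just placeholders here, not the original values):

```python
# Sketch of the 1-bit telemetry encoding: shunt current is mapped linearly
# onto a pulse rate of 0 to 1 kHz, and the far-end edge (rate) meter maps
# it back. FULL_SCALE_AMPS and the names are placeholders, not the original design.

FULL_SCALE_AMPS = 10.0      # assumed full-scale motor current
MAX_PULSE_HZ = 1000.0       # 0 to 1 kHz, as in the scheme above

def current_to_pulse_hz(amps: float) -> float:
    """Voltage-controlled timer: 0 A -> 0 Hz, full scale -> 1 kHz."""
    amps = max(0.0, min(amps, FULL_SCALE_AMPS))
    return MAX_PULSE_HZ * amps / FULL_SCALE_AMPS

def pulse_hz_to_current(hz: float) -> float:
    """Receiving-end edge meter: pulse rate back to an analog current reading."""
    return FULL_SCALE_AMPS * hz / MAX_PULSE_HZ

if __name__ == "__main__":
    for amps in (0.0, 2.5, 7.0, 10.0):
        hz = current_to_pulse_hz(amps)
        print(f"{amps:4.1f} A -> {hz:6.1f} Hz -> {pulse_hz_to_current(hz):4.1f} A at the meter")
```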
The only problem was that it aliased on the edge meter, with a frame rate of something like 100k "frames" of data per second. So then I thought: but this is only DC? Then I remembered my class on Shannon's Law. So instead of sampling the tach signal to send down the stream, I used it to clock (edge-trigger) a flip-flop, which was only reset once the bit had been sent, which was a safe bet. Voila! No more aliasing.
With this method your frame rate can be as low as the maximum tach frequency, but no lower, rather than 2x the pulse rate.
That was my digital anti-aliasing solution.
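For anyone who wants to see the idea in action, here is a rough simulation of the edge-latched bit, assuming one spare bit per frame (the frame rates, tach rates and names are illustrative only):

```python
# Rough simulation of the edge-latched bit scheme. Each tach edge sets a
# flip-flop; every frame the FF state goes out in the spare bit, and the FF
# is reset only after the bit has been sent, so no edge is reported twice
# or lost, provided frame_rate >= pulse_rate.

def ones_per_run(pulse_hz: float, frame_hz: float, seconds: float = 1.0) -> int:
    """Count the 1-bits received over the run; this is what drives the edge meter."""
    next_edge = 0.0 if pulse_hz > 0 else float("inf")
    ff = False                      # the latching flip-flop
    ones = 0
    for k in range(int(frame_hz * seconds)):
        frame_time = (k + 1) / frame_hz
        # Latch any tach edge(s) that arrived before this frame goes out.
        while next_edge <= frame_time:
            ff = True
            next_edge += 1.0 / pulse_hz
        if ff:
            ones += 1               # transmit a 1 in the spare bit
            ff = False              # reset only after the bit is sent
    return ones

if __name__ == "__main__":
    # At 100k frames/s, the received 1-bit rate tracks the tach rate directly.
    for pulse_hz in (100.0, 500.0, 1000.0):
        print(f"{pulse_hz:6.0f} Hz tach -> {ones_per_run(pulse_hz, 100_000.0)} ones/s")
    # If the frame rate drops below the tach rate, edges merge and it under-reads.
    print(f"  1000 Hz tach at 800 frames/s -> {ones_per_run(1000.0, 800.0)} ones/s")
```

The last line of the demo shows why the frame rate can go down to the maximum tach frequency but no lower: once two edges can land between consecutive frames, they merge into a single 1-bit and the meter reads low.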