Jon, I already provided the figures. I assumed a best-case scenario and asked what the true timing-resolution requirements are; that's what we're waiting on. If the required resolution is truly in the nanosecond range, meaning actual single nanoseconds, the 1 GHz statement stands as the clock rate for a stand-alone counter; if the resolution is as coarse as the 10 ns range, it's 100 MHz. For practical timing on a chip that uses a 20 MHz instruction clock, which is common, nanosecond accuracy is not possible, except perhaps for very coarse determinations: in theory, under optimal conditions, it could capture start/stop within 4-8 instruction cycles (200-400 ns at 20 MHz), enough to tell whether an interval in the 1-999 ns range is greater or less than about 500 ns, but not to resolve anything finer below 1 µs.
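To make the arithmetic above concrete, here's a quick sketch of the resolution-vs-clock-rate relationship. It assumes the best case I described: resolution equals one counter clock period, ignoring jitter and synchronization error.

```python
def resolution_ns(clock_hz: float) -> float:
    """Best-case timing resolution in ns: one period of the counter clock."""
    return 1e9 / clock_hz

print(resolution_ns(1e9))    # 1 GHz counter -> 1.0 ns resolution
print(resolution_ns(100e6))  # 100 MHz counter -> 10.0 ns resolution

# A 20 MHz instruction clock is 50 ns per cycle; if capturing start/stop
# takes 4-8 instruction cycles, the uncertainty is 200-400 ns.
print(resolution_ns(20e6) * 4)  # 200.0 ns
print(resolution_ns(20e6) * 8)  # 400.0 ns
```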
It would have to be interrupt-driven on an efficient chip; the interrupt/instruction delays are consistent enough that they can be measured once and subtracted out.
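As a sketch of what "accounted for" means here, assuming the interrupt-entry delay is a fixed, calibrated number of cycles (the constants are hypothetical, not from any particular chip):

```python
CLOCK_HZ = 20e6       # hypothetical 20 MHz instruction clock
LATENCY_CYCLES = 6    # assumed fixed interrupt-entry delay, found by calibration

def corrected_ns(raw_cycles: int) -> float:
    """Convert a raw captured cycle count to ns, removing the fixed latency."""
    return (raw_cycles - LATENCY_CYCLES) * 1e9 / CLOCK_HZ

# A raw capture of 26 cycles: 26 - 6 = 20 real cycles, 20 * 50 ns = 1000 ns.
print(corrected_ns(26))  # 1000.0
```

The key assumption is that the latency is constant; on a chip with variable-length instructions or nested interrupts, the correction would only be accurate to within that variation.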
Once again we are at a point in this thread where there is NOT enough information to proceed.