Relaxed phylogenetics and dating with
The benchmarks for determining the mutation rate are often fossil or archaeological dates.
The molecular clock was first tested in 1962 on the hemoglobin protein variants of various animals, and is commonly used in molecular evolution to estimate times of speciation or radiation.
There are a number of methods for deriving the maximum clade age using birth-death models, fossil stratigraphic distribution analyses, or taphonomic controls.
Alternatively, instead of a maximum and a minimum, a prior probability of the divergence time can be established and used to calibrate the clock.
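One common way to express such a prior is as a lognormal distribution offset by the fossil's minimum age, so that divergence times younger than the oldest fossil get zero probability while older times are plausible but increasingly unlikely. The sketch below is a minimal, standard-library-only illustration; the function name and all the numbers (a 10 Ma fossil minimum, the log-scale parameters) are hypothetical choices, not values from any particular analysis.

```python
import math

def lognormal_calibration_pdf(t, offset, mu, sigma):
    """Density of an offset-lognormal calibration prior on divergence time t (Ma).

    The fossil age `offset` acts as a hard minimum: the excess t - offset
    is lognormally distributed with parameters mu and sigma on the log scale.
    """
    if t <= offset:
        return 0.0  # the divergence cannot postdate the oldest fossil in the clade
    x = t - offset
    return (1.0 / (x * sigma * math.sqrt(2.0 * math.pi))
            * math.exp(-((math.log(x) - mu) ** 2) / (2.0 * sigma ** 2)))

# Hypothetical calibration: fossil minimum at 10 Ma, mode a few Myr older.
for t in (9.0, 11.0, 15.0, 30.0):
    print(t, lognormal_calibration_pdf(t, offset=10.0, mu=1.5, sigma=0.8))
```

In a Bayesian dating analysis this density would be multiplied into the posterior for the calibrated node, letting the sequence data and the fossil evidence jointly inform the estimated divergence time.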
The molecular clock is a figurative term for a technique that uses the mutation rate of biomolecules to deduce the time in prehistory when two or more life forms diverged.
The biomolecular data used for such calculations are usually nucleotide sequences for DNA or amino acid sequences for proteins.
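The basic arithmetic behind such a calculation can be sketched directly: under a strict clock, an observed pairwise distance d (substitutions per site) accumulates along both diverging lineages, so the split happened roughly d / (2r) years ago for a substitution rate r per site per year. The sequences and the rate below are entirely hypothetical, and the uncorrected p-distance ignores multiple hits at the same site, which real analyses correct for.

```python
def p_distance(seq_a, seq_b):
    """Proportion of differing sites between two aligned sequences."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

def divergence_time(seq_a, seq_b, rate_per_site_per_year):
    """Strict-clock estimate: distance accumulates along both lineages,
    so the divergence occurred d / (2 * rate) years ago."""
    return p_distance(seq_a, seq_b) / (2.0 * rate_per_site_per_year)

# Hypothetical aligned sequences and a made-up rate of 1e-9 subs/site/year.
a = "ACGTACGTACGTACGTACGT"
b = "ACGTACGAACGTACTTACGT"
print(divergence_time(a, b, 1e-9))  # 2 differences over 20 sites
```

With 2 differing sites out of 20 (d = 0.1) and the assumed rate, this toy example places the divergence at about 50 million years ago.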
In order to account for this in node calibration analyses, a maximum clade age must be estimated.
When calibrated against the few well-documented fossil branch points (for example, no primate fossils of modern aspect are found before the K-T boundary), this led Sarich and Wilson to argue that the human-chimpanzee divergence probably occurred only ~4-6 million years ago.
Together with the work of Emile Zuckerkandl and Linus Pauling, the genetic equidistance result directly led to the formal postulation of the molecular clock hypothesis in the early 1960s.
Similarly, in 1967 Vincent Sarich and Allan Wilson demonstrated that molecular differences in albumin proteins among modern primates were consistent with approximately constant rates of change across all the lineages they assessed.
If this is correct, the cytochrome c of all mammals should be equally different from the cytochrome c of all birds.
Since fish diverged from the main stem of vertebrate evolution earlier than either birds or mammals, the cytochrome c of both mammals and birds should be equally different from the cytochrome c of fish.
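The equidistance prediction can be checked mechanically: compute the distance from each later-branching lineage to the earlier-branching one and compare. The fragments below are entirely invented stand-ins for cytochrome c, used only to show the shape of the comparison.

```python
def p_distance(seq_a, seq_b):
    """Proportion of differing sites between two aligned sequences."""
    return sum(a != b for a, b in zip(seq_a, seq_b)) / len(seq_a)

# Entirely hypothetical aligned fragments standing in for cytochrome c.
sequences = {
    "mammal": "MGDVEKGKKIFIMK",
    "bird":   "MGDIEKGKKIFVQK",
    "fish":   "MGDVAKGKKTFVQK",
}

# Under a clock, distances from fish (the earlier-branching lineage)
# to mammal and to bird should come out approximately equal.
d_fish_mammal = p_distance(sequences["fish"], sequences["mammal"])
d_fish_bird = p_distance(sequences["fish"], sequences["bird"])
print(d_fish_mammal, d_fish_bird)
```

In real tests the same comparison is made over full protein alignments, and rough equality of the outgroup distances is what supports the clock-like behavior of the molecule.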