Frequency offset (specified in ppm, or parts-per-million) and jitter (specified in seconds) seem very similar. What exactly is the difference between these?
Frequency offset (ppm) refers to the difference between a clock's actual frequency and its nominal (or ideal) frequency, expressed as a fraction of the nominal frequency in parts-per-million.
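As a quick sketch of that arithmetic (the `frequency_offset_ppm` helper below is hypothetical, purely for illustration):

```python
def frequency_offset_ppm(f_actual_hz: float, f_nominal_hz: float) -> float:
    """Offset of the actual frequency from nominal, in parts-per-million."""
    return (f_actual_hz - f_nominal_hz) / f_nominal_hz * 1e6

# A 156.328125 MHz clock measured against a 156.25 MHz nominal is +500 ppm fast.
print(frequency_offset_ppm(156.328125e6, 156.25e6))  # 500.0
```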
Jitter refers to the variation in time between when an ideal clock's edge should occur and when the actual edge does occur.
For example, a clock could have high offset and low jitter, as in the case of a 156.328125 MHz clock with perfect edge placement: it is +500 ppm from a nominal frequency of 156.25 MHz (156.25 MHz × (1 + 500 × 10⁻⁶) = 156.328125 MHz), yet it has no jitter. Conversely, a clock could have low offset and high jitter, as in the case of a clock whose average frequency is exactly 156.25 MHz but whose individual edges deviate from their ideal positions in time by +/-200 ps.
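To make the distinction concrete, here is a minimal Python sketch of the second case; the uniform noise model and the parameter names are assumptions for illustration only. The per-edge deviations are bounded by 200 ps, yet the long-term average frequency, and with it the ppm offset, stays essentially at nominal:

```python
import random

# Sketch of the second case: a clock with no long-term frequency offset
# but +/-200 ps of edge jitter (modeled as uniform noise for illustration).
F_NOMINAL_HZ = 156.25e6
PERIOD_S = 1.0 / F_NOMINAL_HZ      # 6.4 ns nominal period
JITTER_BOUND_S = 200e-12           # +/-200 ps edge deviation

ideal_edges = [n * PERIOD_S for n in range(100_000)]
actual_edges = [t + random.uniform(-JITTER_BOUND_S, JITTER_BOUND_S)
                for t in ideal_edges]

# Jitter: per-edge deviation from the ideal edge position (<= 200 ps here).
max_jitter_s = max(abs(a - i) for a, i in zip(actual_edges, ideal_edges))

# Frequency offset: derived from the long-term average period, which the
# bounded jitter barely disturbs, so the offset stays near 0 ppm.
avg_period_s = (actual_edges[-1] - actual_edges[0]) / (len(actual_edges) - 1)
offset_ppm = (1.0 / avg_period_s - F_NOMINAL_HZ) / F_NOMINAL_HZ * 1e6

print(f"max edge deviation: {max_jitter_s * 1e12:.0f} ps")
print(f"frequency offset:   {offset_ppm:+.2f} ppm")
```

Running this shows edge deviations near the full 200 ps bound while the computed offset stays well under 1 ppm: exactly the "low offset, high jitter" combination described above.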