Basic GPS clock analysis with an HP-53310A modulation domain analyzer

Time-nuts mode activated!

I use my home-made GPS not only for hiking and mapping but also to calibrate my lab master clock, and I've been wondering for a while what the specs of the square wave returned by the GPS actually are. The GPS receiver is a NEO-F9P from uBlox and it can output pretty much any frequency you want on its TIME_PULSE output. Since lab reference clocks are typically 10MHz it would make sense to use that output frequency, but is it the right choice for calibration? I personally don't like it because near 10MHz the change in the number of digits between e.g. 9.999999999MHz and 10.00000000MHz means that you can randomly lose one digit of resolution. There is another interesting reason too, but we'll have to explore a bit to make it appear...
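To illustrate the digit-loss annoyance, here's a quick sketch (pure display formatting, nothing GPS-specific): with a fixed number of significant digits, crossing the 10MHz boundary silently costs one decimal place.

```python
# With 10 significant digits, frequencies just below 10 MHz show the mHz
# digit, while frequencies just above lose it.
for freq_hz in (9_999_999.999, 10_000_000.001):
    print(f"{freq_hz:#.10g} Hz")  # '#' keeps trailing zeros with 'g'
# -> 9999999.999 Hz
# -> 10000000.00 Hz
```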

First let's connect the GPS to the MDA and set the GPS clock to 10MHz. The GPS is not locked yet, so the frequency is quite a bit off and drifting as the circuit warms up. In the following three screenshots you can see the frequency histogram moving towards the left, away from the 10MHz line in the center. We can also see from the histogram that the frequency is slowly stabilizing, since we have taller peaks (and thus more samples) on the left.

Early GPS clock drifting before lock.
The clock keeps drifting to the left, slowly settling down.
Somewhat stable clock before GPS lock.

The moment the GPS locks, the frequency quickly moves much closer to 10MHz and becomes much more stable, leading to a rapidly growing narrow peak on the histogram:

We have lock! Small peak appearing near the 10MHz line.
The peak grows fast...
... dwarfing the previous unlocked samples.

Note that I haven't recalibrated my MDA oscillator since I moved house a few months ago, so the MDA's 10MHz is not tip-top. Which is part of the reason I'm exploring the GPS reference signal. With a span of 2Hz the histogram is quite wide, so let's zoom in on the center and take a look at the GPS TIME_PULSE frequency.

Zooming on the peak, 10MHz still in the center.

Now that's better! We can see an offset of 42.16mHz between the GPS clock and the MDA clock. That's what will have to be calibrated later. For now we can already notice that the peak is not that slim anymore, indicating jitter in the GPS clock. That is expected: the GPS outputs a digital signal from a microprocessor, and the only way it can sync its TIME_PULSE output is to make pulses a little longer or a little shorter. And this can only be done in increments of one clock cycle, so the clock jitter will depend on the GPS's CPU clock frequency. I've heard that the NEO-F9P has a 128MHz clock; let's see if we can spot that in the data.
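In other words, the smallest timing correction available to the GPS is one CPU clock cycle. Assuming the rumored 128MHz clock (hearsay at this point, to be verified below), that step would be:

```python
cpu_clock_hz = 128e6            # assumed NEO-F9P CPU clock (unverified)
step_ns = 1e9 / cpu_clock_hz    # smallest possible period adjustment
print(f"minimum adjustment step: {step_ns} ns")  # 7.8125 ns
```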

But first we can have an even closer look at the histogram. After centering and adjusting the span we get this:

Phatt zoom on the '10MHz' clock. Not so pretty...

Yes, it's still the same peak, but the 5x horizontal zoom shows how noisy the frequency is. Many little peaks, no smooth section, just messy. This is expected since we can only have discrete frequency changes from the GPS CPU, but how are we supposed to get clear information about the GPS clock from this? One thing to remember is that frequency measurements are done over a long period to get more accuracy. The fact that so many 'spikes' are present shows that over that long measurement period the GPS can tune its frequency many times, resulting in many possible outputs over this longer period. And this gives me an idea...

The MDA can not only measure frequency but also periods, a.k.a. time intervals. In that mode the measurements are much faster because the instrument measures individual periods, and there are 10 million of those every second. More importantly, we may get a better view of how the GPS adjusts its clock, since after all it's a small period adjustment that is being made there. Switching the instrument to single-channel time interval mode immediately gives us this beautiful graph:

Now that is a much clearer way to show frequency info.

Period may be the dual of frequency, but oh boy how different the graphs are thanks to the different measurement techniques! Now we can clearly see that only two period values exist, represented by a 'main' peak and a shorter secondary one. The peaks have the same width, so their heights can be compared to evaluate their relative occurrence rates. We can find those numbers by zooming a bit more and using markers:

Adding markers and zooming in.

We have 4.2% and 20.8%, so the small 'adjustment' peak happens almost exactly 20% of the time. We can also note the period difference between the two peaks: 101.040 - 93.167 = 7.873ns. This is the smallest period adjustment the GPS CPU can make, and should thus correspond to the GPS CPU clock. Is it? 1/7.873ns = 127MHz, very close to the 128MHz we were expecting. Great!
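A quick sanity check of that arithmetic, using the numbers read off the markers above:

```python
p_long_ns = 101.040    # main peak period from the marker
p_short_ns = 93.167    # secondary 'adjustment' peak period
step_ns = p_long_ns - p_short_ns
print(f"step: {step_ns:.3f} ns")                      # 7.873 ns
print(f"implied CPU clock: {1e3 / step_ns:.1f} MHz")  # ~127.0 MHz
print(f"peak height ratio: {4.2 / 20.8:.1%}")         # ~20.2%
```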

And what do the relative occurrences of both peaks mean? The TIME_PULSE output is set to 10MHz, and the CPU clock is 128MHz. Thus each period on the GPS output will consist of 12.8 CPU cycles. But we can't have fractional cycles, of course. What happens is that the CPU outputs four periods of 13 cycles each, then one slightly shorter period of 12 cycles. Overall that is 13x4+12=64 cycles for five output periods, which divided by 5 gives us our average of 12.8 cycles per period. And that is exactly what we have discovered from the histogram above: one in five periods is shorter, and the four others are longer. Bingo! Now let's try to get more accuracy from the histograms. To do this we can use the markers and ask the MDA to calculate stats only for the section between the markers. This lets us isolate each peak individually:

Left peak stats   Right peak stats
Individual peak statistics
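The divide-by-12.8 scheme described above can be simulated in a few lines. This is just a toy model of the idea (snapping each ideal output edge to the nearest whole CPU cycle), not the actual u-blox firmware:

```python
# Generate a 10 MHz output from a 128 MHz clock: place each ideal edge at
# i * 12.8 cycles, round to whole CPU cycles, then count period lengths.
cycles_per_period = 128 / 10                     # 12.8 CPU cycles per output period
edges = [round(i * cycles_per_period) for i in range(1001)]
periods = [b - a for a, b in zip(edges, edges[1:])]
print(periods[:10])                              # [13, 13, 12, 13, 13, 13, 13, 12, 13, 13]
print(periods.count(13), periods.count(12))      # 800 200 -> one period in five is short
```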

We now have better period estimates: 93.1655ns and 101.0082ns, which gives us a GPS CPU frequency of 1/(101.0082-93.1655) = 127.5MHz, a bit better than the previous estimate. We can also try to get more accuracy for the relative occurrence rate of the two peaks. For that we need to isolate one peak, as this will let us see the number of samples that are out of range. Here we can see the main peak at max zoom:

Big zoom on main peak, showing the samples in and out of the window.

For 280,000,000 samples we see 70,000,629 samples that are out of range and thus belong to the smaller 'adjustment' peak. Almost exactly the four-to-one ratio we're expecting! There is a tiny difference though, with 629 extra samples in the adjustment peak. But this is to be expected: the GPS CPU clock is not perfect, certainly not as good as the atomic clocks in the GPS satellites, and these 629 extra adjustments correspond to the GPS fine-tuning its output frequency to exactly 10MHz. While we're at it we can also see that the histogram is now fully discrete: we have reached the limit of the MDA instrument and we can even 'read' the minimum time difference it can measure:

Time resolution limit reached
The MDA time resolution limit: 75ps and 1ns span.

... which is 75ps. This is quite short: during that time light travels only about 2cm! As we have seen, the vast majority of the adjustments are due to our 128MHz GPS CPU clock not being an integer multiple of the requested 10MHz output. So what happens if we use another frequency? 128MHz is an exact multiple of 8MHz, so we can expect the smaller 'adjustment' peak to be much smaller if that output frequency is selected:

10MHz   8MHz
10MHz versus 8MHz clocks. Using a divisor of 128MHz reduces the secondary peak to near zero.

And what a difference it is indeed! The left 'adjustment' peak has now almost completely vanished and is only visible as a thin continuous line at the bottom of the histogram. Zooming in on the main peak, we can see that we only had 115 adjustments over an 80,000,000-sample run:

8MHz at resolution limit
Big zoom on 8MHz peak, showing the number of 'adjusted' samples (115)
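The divisibility argument is easy to check. Assuming the 128MHz CPU clock, any output frequency that divides it exactly needs no systematic per-period adjustment at all:

```python
# Which TIME_PULSE frequencies fit a whole number of CPU cycles per period?
cpu_hz = 128_000_000
for out_mhz in (1, 2, 8, 10, 16):
    cycles = cpu_hz / (out_mhz * 1_000_000)
    tag = "exact" if cycles.is_integer() else "needs periodic adjustment"
    print(f"{out_mhz:>2} MHz -> {cycles:g} cycles/period ({tag})")
# Only 10 MHz in this list gives a fractional 12.8 cycles/period.
```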

And what if we change the frequency again? For the same 80M samples, doubling the TIME_PULSE frequency to 16MHz halves the number of adjustments, and lowering it to 1MHz multiplies it by eight, to 936; both scale almost exactly as expected:

16MHz at resolution limit   1MHz at resolution limit
16MHz and 1MHz at the resolution limit
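A quick check of the "almost exactly" arithmetic, taking the 8MHz run (115 adjustments over 80M samples) as the baseline:

```python
base_adj = 115                       # adjustments measured at 8 MHz
print("16 MHz:", base_adj * 8 / 16)  # 57.5, i.e. half as many
print(" 1 MHz:", base_adj * 8 / 1)   # 920, vs the 936 measured
```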

Why is that? When the frequency is lower we spend more time measuring each period, so more time elapses and more micro-adjustments are made by the GPS to stay in sync with the satellites. Even though all these little discoveries are individually simple, there is beauty in all these measurements being explained so perfectly. Science beauty! Now let's have a guess... Would the 'hairy' graph above look better if we used 8MHz instead of 10MHz? There will be far fewer micro-adjustments by the GPS, so it should be noticeably cleaner. On the other hand, the remaining 'random' adjustments may start to stand out more (less overall averaging). It turns out the second interpretation may be the correct one:


Still as noisy as before, more investigation needed...

The graph is a little noisier, with a standard deviation of 4.64mHz, up from 2.99mHz before. I'm not completely satisfied with that answer though. Another reason could be reduced GPS accuracy: with a proper number of satellites the timing accuracy should be about 20ns, but during all these tests I had no clear view of the sky and only 4-5 satellites at poor angles, giving just 80ns of timing accuracy. I'll have to probe that in the future.