Asking partly out of curiosity: I have been toying with future pet project ideas around portable atomic clocks, just to skip some of the headaches of distributed time sync altogether. Curious how folks who've worked on GPS or timing networks think about this.
For test and measurement, it's used for more boring synchronization of processes/whatever. For high security, with minimal-length/tight cable runs, you can detect changes in cable length and the latency added by MITM equipment, and sync all the security stuff in your network.
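For a rough sense of scale (back-of-the-envelope numbers of mine, not from the comment above): signals in copper or fibre travel at roughly two-thirds of c, i.e. about 5 ns per metre, so spotting inserted equipment or a spliced-in extra metre of cable takes nanosecond-class timing.

    # Rough scale check: one-way delay added by extra cable or an inline device.
    # The velocity factor is an assumption (~0.67c is typical for cat6/fibre).
    C = 299_792_458          # speed of light in vacuum, m/s
    VELOCITY_FACTOR = 0.67   # assumed signal speed as a fraction of c

    def added_delay_ns(extra_metres: float) -> float:
        """One-way propagation delay added by extra cable length, in ns."""
        return extra_metres / (C * VELOCITY_FACTOR) * 1e9

    print(added_delay_ns(1.0))   # ~5 ns per extra metre of cable
    print(added_delay_ns(0.1))   # ~0.5 ns for 10 cm -- needs sub-ns resolution to see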
[1] https://en.wikipedia.org/wiki/Precision_Time_Protocol
[2] https://www.arista.com/assets/data/pdf/Whitepapers/Absolute-...
PTP and careful hardware configuration keep things synced to within nanoseconds.
Including, of course, information - often defined by the presence or absence of some alterable state within a specific time window.
We invent new uses for things once we have them.
A fun thought experiment would be what the world would look like if all clocks were perfectly in sync. I think I'll spend the rest of the day coming up with imaginary applications.
They couldn't stay synced. There's a measurable frequency shift from a few cm of height difference after all. Making a pair of clocks that are always perfectly in sync with each other is a major step towards Le Guin's ansible!
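For scale, a quick back-of-the-envelope (my numbers, not the parent's): the fractional frequency shift between two clocks at a height difference h near Earth's surface is roughly g*h/c^2, so a few centimetres is already a few parts in 10^18, which the best optical clocks can resolve.

    # Weak-field gravitational redshift between clocks at height difference h:
    # delta_f / f ~= g * h / c^2
    G_ACCEL = 9.81            # m/s^2
    C = 299_792_458           # m/s

    def gravitational_shift(height_m: float) -> float:
        return G_ACCEL * height_m / C**2

    print(gravitational_shift(0.03))   # ~3.3e-18 for a 3 cm height difference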
For other readers' info, clock stability is crucial for long-term precision measurements, with a "goodness" measured by a system's Allan variance: https://en.wikipedia.org/wiki/Allan_variance
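As a minimal sketch of what that statistic computes (assuming you already have evenly spaced fractional-frequency samples; this is the simple non-overlapping estimator, not what a production analysis tool would use):

    import numpy as np

    def allan_deviation(y, m=1):
        """Non-overlapping Allan deviation of fractional-frequency samples y,
        averaged over groups of m samples (averaging time = m * tau0)."""
        y = np.asarray(y, dtype=float)
        n = len(y) // m
        # average the data into n bins of m samples each
        ybar = y[: n * m].reshape(n, m).mean(axis=1)
        diffs = np.diff(ybar)
        avar = 0.5 * np.mean(diffs**2)   # Allan variance
        return np.sqrt(avar)             # Allan deviation

    # White frequency noise: ADEV should fall off as 1/sqrt(tau).
    rng = np.random.default_rng(0)
    y = rng.normal(scale=1e-11, size=100_000)
    for m in (1, 10, 100):
        print(m, allan_deviation(y, m))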
In such systems, NTP is inexpensive and sufficient. On networks where ntpd's assumptions hold (symmetric and consistent delays), sync within a millisecond is achievable without much work.
If you need better, PTP can get much better results. A local NTP server following GPS with a PPS signal can get slightly better results (but without PPS it might well be worse).
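The symmetric-delay assumption shows up directly in the standard four-timestamp arithmetic (a simplified sketch, not ntpd's actual code): any asymmetry between the outbound and return paths lands straight in the offset estimate as error.

    def ntp_offset_delay(t1, t2, t3, t4):
        """Classic four-timestamp NTP calculation.
        t1: client sends request      (client clock)
        t2: server receives request   (server clock)
        t3: server sends reply        (server clock)
        t4: client receives reply     (client clock)
        Assumes outbound and return network delays are equal; any
        asymmetry shows up as an error in `offset`."""
        offset = ((t2 - t1) + (t3 - t4)) / 2
        delay = (t4 - t1) - (t3 - t2)
        return offset, delay

    # Example: server 5 ms ahead, 10 ms symmetric round trip, 1 ms processing
    print(ntp_offset_delay(0.000, 0.010, 0.011, 0.011))  # offset ~0.005, delay ~0.010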
This past week I tried synchronizing the time of an embedded Linux board with a GPS PPS signal via GPIO. Turns out the kernel interrupt handler already delays the edge by 20 µs compared to busy polling the state of the pin. Stuff then gets hard to measure at sub-microsecond scales.
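For anyone curious what "busy polling the pin" means here, a rough sketch (the sysfs path and pin number are made up, and Python's own overhead is far larger than 20 µs, so this only illustrates the approach - the real measurement wants C or the kernel PPS subsystem):

    import time

    # Busy-poll a GPIO for a PPS rising edge and timestamp it as close to
    # the transition as the loop allows. Path/pin are hypothetical.
    GPIO_VALUE = "/sys/class/gpio/gpio17/value"

    def wait_for_rising_edge():
        with open(GPIO_VALUE, "rb", buffering=0) as f:
            prev = f.read(1)
            while True:
                f.seek(0)
                cur = f.read(1)
                if prev == b"0" and cur == b"1":
                    return time.clock_gettime_ns(time.CLOCK_MONOTONIC_RAW)
                prev = cur

    # t_edge = wait_for_rising_edge()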
> "Re: ntpd-rs and higher-resolution network time protocols {WhiteRabbit (CERN), SPTP (Meta)} and NTP NTS : https://news.ycombinator.com/item?id=40785484 :
>> "RFC 8915: Network Time Security for the Network Time Protocol" (2020)
Even just by triggering a GPT (general-purpose timer) from a GPS PPS input and counting cycles of an internal clock, you could work out the error in the clock, and you only need to query it once a second.
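The arithmetic is just this (the nominal frequency below is an assumption for illustration): capture the free-running count on each PPS edge, and the deviation from the nominal count over that GPS-disciplined second is the oscillator error.

    NOMINAL_HZ = 25_000_000   # assumed local oscillator frequency, e.g. 25 MHz

    def clock_error_ppm(count_between_pps_edges: int) -> float:
        """Fractional error of the local oscillator, in parts per million,
        from the number of timer ticks captured between two PPS edges
        (i.e. over one GPS-disciplined second)."""
        return (count_between_pps_edges - NOMINAL_HZ) / NOMINAL_HZ * 1e6

    print(clock_error_ppm(25_000_300))   # oscillator running ~12 ppm fast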
What do you imagine the clock in your computer is made out of?
For your precise question, it may already be there.
The consistent ordering of events is important when you're working with more than one system. An un-synchronized clock can handle this fine within a single system; it only matters when you're trying to reconcile events with another system.
This is also a scale problem: when you receive one event per second, a granularity of 1 second may very well be sufficient. If you need to deterministically order 10^9 events per second across systems, you'll want better than nanosecond-level precision if you're relying on timestamps for that ordering.
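One common way to keep the ordering deterministic when timestamps alone can tie (a generic sketch, not any particular system's scheme): sort on the timestamp first, then break ties with a stable node id and per-node sequence number.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Event:
        ts_ns: int      # nanosecond timestamp from the local clock
        node_id: str    # stable identifier of the emitting system
        seq: int        # per-node monotonic sequence number
        payload: str

    def total_order_key(e: Event):
        # Timestamp first; ties broken deterministically by node and sequence.
        # This is only *consistent*, not necessarily the true causal order:
        # clock error between nodes still reorders events spaced more closely
        # than the sync error.
        return (e.ts_ns, e.node_id, e.seq)

    events = [
        Event(1_000_000_001, "b", 7, "x"),
        Event(1_000_000_001, "a", 3, "y"),
        Event(1_000_000_000, "b", 6, "z"),
    ]
    for e in sorted(events, key=total_order_key):
        print(e.node_id, e.seq, e.payload)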
ah yes - that would be the Planck time, which can be derived from Planck's constant, the gravitational constant, and the speed of light
What is this ultimate precision? I imagine that at some point, even the most modest relative motion at ordinary velocities would introduce measurable time dilation at fine enough clock precision.
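A quick back-of-the-envelope on that (my numbers): for v much less than c the fractional slow-down is about v^2/(2c^2), so walking pace is already around a part in 10^17, which state-of-the-art optical clocks can in principle resolve.

    C = 299_792_458   # m/s

    def time_dilation_fraction(v_mps: float) -> float:
        """Fractional slow-down of a moving clock for v << c: v^2 / (2 c^2)."""
        return v_mps**2 / (2 * C**2)

    print(time_dilation_fraction(1.5))     # walking, ~1.3e-17
    print(time_dilation_fraction(30.0))    # driving, ~5e-15
    print(time_dilation_fraction(250.0))   # airliner, ~3.5e-13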
[1] https://journals.aps.org/pra/abstract/10.1103/PhysRevA.47.35...
[1] "Quantum-amplified global-phase spectroscopy on an optical clock transition" (2025) https://www.nature.com/articles/s41586-025-09578-8
[2] "Quantum watch and its intrinsic proof of accuracy" (2022) https://journals.aps.org/prresearch/abstract/10.1103/PhysRev...
The very lengthy discussion around the concept was fascinating to me as a 23-year-old college student who only knew it from one perspective.
Japan had a whole fancy temporal hour system before Western contact. It was more complicated than our modern framework, as it was based on the time between sunrise and sunset and so the length of the hours had to be adjusted about every two weeks. But they certainly thought quite a bit about it, so I'm not sure how it could be claimed to not be a concept there at the time.
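To put numbers on how much those hours stretched (the daylight figures are rough assumptions, not from the comment): daytime was divided into six temporal units, so the length of one unit tracks the season.

    # Daytime in the pre-modern Japanese system was split into six equal
    # "temporal hours" between (roughly) sunrise and sunset, so their length
    # drifted with the season. Daylight durations below are rough assumptions.
    def temporal_hour_minutes(daylight_hours: float) -> float:
        return daylight_hours * 60 / 6

    print(temporal_hour_minutes(14.5))   # midsummer: each unit ~145 modern minutes
    print(temporal_hour_minutes(9.7))    # midwinter: each unit ~97 modern minutes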
All you need to create a clock is to realize that your oil lamp consumes its fuel over a somewhat consistent interval, or to make a similar observation about the time it takes drips of water to fill a cup. People figured it out.
That really doesn't seem to make sense as written. Even if for "Western" you count all the way to the Middle East (where much of our chronometry originates), there's still a lot found in China and the New World. (From what I can tell, India does not seem to have a strong independent record here? Though they certainly borrowed from the inventors, just like Europe did.)
> we can demonstrate quantum-amplified time-reversal spectroscopy on an optical clock transition that achieves directly measured 2.4(7) dB metrological gain and 4.0(8) dB improvement in laser noise sensitivity beyond the standard quantum limit.
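To unpack those decibel figures (just the standard dB-to-linear conversion; how they are referenced to the standard quantum limit is per the paper's abstract): 2.4 dB and 4.0 dB correspond to factors of roughly 1.7 and 2.5.

    def db_to_linear(db: float) -> float:
        """Convert a gain quoted in decibels to a linear ratio."""
        return 10 ** (db / 10)

    print(db_to_linear(2.4))   # ~1.74x metrological gain beyond the SQL
    print(db_to_linear(4.0))   # ~2.51x improvement in laser-noise sensitivity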