How accurate does a clock need to be? That depends entirely on what you need it for. A glance at the sun is enough to tell you when to head home from a hike, whereas you might need a wristwatch to know whether you'll be late to a morning meeting. Certain applications in science and technology, however, demand far more precise timekeeping than either. For those, scientists use atomic clocks.

Atomic clocks rely on the fact that if you hit an atom with radiation of just the right frequency, its electrons will "jump" back and forth, or oscillate, between two energy states. All atoms of a given element respond to the same frequency. For cesium, the element most commonly used in atomic clocks, that frequency is 9,192,631,770 cycles per second, which falls in the microwave range. Flip that relationship around and you get a clock: if a microwave signal is making cesium atoms oscillate, we know its frequency is exactly 9,192,631,770 cycles per second, so counting off that many cycles tells us exactly how long a second is. In fact, since 1967, the official definition of the second has been based on the oscillation of the cesium atom.

The first atomic clocks were made in the 1950s, and they have only gotten more accurate since. Today, the NIST-F1 cesium atomic clock is so precise that its time error is about 0.03 nanoseconds per day, or roughly one second every 100 million years. We've collected some awesome videos on this topic. Watch them now to learn more.
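Those two accuracy figures, 0.03 nanoseconds per day and one second per 100 million years, are really the same number expressed two ways. Here's a quick back-of-the-envelope check in Python; the constants come from the article, and the 365.25-day year is an assumption used for the conversion:

```python
# Sanity-check the article's accuracy figures for NIST-F1.
CESIUM_HZ = 9_192_631_770          # cycles per second; the SI definition of the second
ERROR_NS_PER_DAY = 0.03            # time error quoted in the article
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY  # assumes a mean Julian year

# Fractional error implied by 0.03 ns of drift per day
fractional_error = (ERROR_NS_PER_DAY * 1e-9) / SECONDS_PER_DAY

# How many years until that drift accumulates to a full second
years_per_second_of_drift = 1 / (fractional_error * SECONDS_PER_YEAR)

print(f"fractional error ~ {fractional_error:.1e}")                      # about 3.5e-16
print(f"one second of drift every ~ {years_per_second_of_drift:,.0f} years")
```

The result comes out to roughly 90 million years, which is why the article rounds to "one second every 100 million years."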
Key Facts In This Video
Atomic clocks measure the oscillations of atoms. All atoms of a given element vibrate, or tick, the same number of times per second. 01:16
Today, the international standard for a second is based on 9,192,631,770 vibrations of a cesium atom. 01:50
Atomic clocks are accurate to one second in 3 million years. 02:18