From Sundials to Atomic Clocks — A Brief History of Measuring Time
Time is the most measured quantity in human existence, yet for most of history no two clocks on Earth agreed. The journey from rough solar observations to timekeeping accurate to a single second per 300 million years is one of humanity’s great engineering achievements, and it’s far from over.
Egypt and the First 24-Hour Day
The division of a day into 24 hours traces back to ancient Egypt around 1500 BCE. Egyptian shadow clocks — essentially calibrated sticks planted in the ground — divided daylight into ten equal parts, with an additional hour added to each end for twilight. At night, star charts called “star clocks” tracked the rising of specific stars to mark the nocturnal hours. The result was 12 daytime and 12 nighttime units, making 24. The actual length of these “hours” varied with the seasons: summer daytime hours were longer than winter daytime hours. Fixed equal hours wouldn’t arrive for another thousand years.
Water Clocks and the Problem of Night
The fundamental weakness of the sundial is obvious: it stops working the moment the sun disappears. Water clocks — clepsydrae — solved this by measuring time through the rate of water draining from a vessel. Babylon, ancient Egypt, China, India, and Greece all developed versions of the device independently. The Greeks refined them considerably, using float mechanisms to drive indicator arms that traced the current hour on a dial. By the first century BCE, elaborate astronomical water clocks existed in Athens and Rome. But water clocks remained imprecise: water viscosity changes with temperature, affecting flow rate. On a cold night, a water clock ran slow.
Medieval Monasteries and the Mechanical Revolution
For over a millennium, monasteries across Europe were the primary timekeepers of Western civilization. Religious rules required monks to gather for prayer at fixed points throughout the day and night — a schedule that demanded reliable timekeeping in darkness. The mechanical escapement clock, invented in Europe during the late 13th century (around 1275–1300), was likely developed specifically to automate the ringing of church bells at the correct prayer hours.
The escapement mechanism, a toothed crown wheel released tooth by tooth by a swinging verge and foliot, allowed a falling weight to give up its energy in controlled, regular pulses rather than all at once. Early mechanical clocks were accurate to perhaps 15 minutes per day. That was a revolution. They had no minute hand, only an hour hand; the minute hand didn’t become common until the 17th century, when accuracy improved enough that tracking minutes became worthwhile.
Huygens, the Pendulum, and the Age of Precision
The Dutch scientist Christiaan Huygens changed everything in 1656 by applying the pendulum to clockwork. Galileo had understood decades earlier that a pendulum of a given length swings at a fixed rate regardless of how wide its arc — but he never built a functioning clock around the principle. Huygens did. His pendulum clock reduced daily error from 15 minutes to around 15 seconds. A further refinement — the anchor escapement — brought that to 10 seconds per day.
For the first time, the minute hand became genuinely useful. Scientists could measure short intervals with real precision. Astronomy advanced dramatically, because astronomers could now time celestial events accurately. The art of navigation, which depends on precise time to determine longitude at sea, began its slow improvement toward the marine chronometers that would come a century later.
The 1884 Meridian Conference and Greenwich Mean Time
By the 1880s, industrialization had created a practical crisis: railroads crossing continents needed synchronized schedules, but every town kept its own local solar time. In the United States alone, over 300 local times were in use by various railroads in the 1870s. American railroads standardized themselves into four zones in 1883.
The following year, representatives from 25 nations gathered in Washington, D.C. for the International Meridian Conference. They agreed to establish the Prime Meridian at the Royal Observatory in Greenwich, England; the now-familiar system of 24 time zones, each spanning 15 degrees of longitude, grew out of that decision rather than being enacted by the conference itself. Greenwich Mean Time, the mean solar time at the Greenwich meridian, became the world’s reference clock.
The adoption was gradual. France, whose scientists had advocated for a Paris-based meridian, didn’t officially adopt GMT until 1911. But the conference set the framework that still governs time zones today.
Quartz Clocks and the Age of Electronic Precision
The 1920s brought a new principle: electricity. In 1927, Warren Marrison at Bell Telephone Laboratories built the first quartz crystal clock. Quartz flexes when a voltage is applied to it, a phenomenon called the piezoelectric effect, and a properly cut crystal driven by an oscillating voltage vibrates at a remarkably constant frequency. By counting those vibrations electronically, a clock could maintain time with unprecedented stability.
Early quartz clocks were refrigerator-sized instruments confined to laboratories and radio stations. By the 1970s, miniaturization had placed quartz movements into wristwatches. A good quartz watch loses or gains only a few seconds per month. Cheap quartz movements became so ubiquitous that by the 1980s, accurate timekeeping was something anyone could buy for a few dollars.
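A minimal sketch of that counting principle, assuming the standard 32,768 Hz watch crystal and a crystal mistuned by a few parts per million (both typical figures rather than specifics of any particular watch):

```python
# A sketch of quartz timekeeping, assuming the standard 32,768 Hz watch crystal.
CRYSTAL_HZ = 32_768      # nominal vibrations per second (2**15, chosen for easy binary division)
DIVIDER_STAGES = 15      # fifteen divide-by-two stages turn 32,768 Hz into one pulse per second

ticks_per_second = CRYSTAL_HZ / 2 ** DIVIDER_STAGES
print(ticks_per_second)  # 1.0 -> one tick of the second hand

# What "a few seconds per month" means as a frequency error:
error_ppm = 5                         # assume a crystal mistuned by 5 parts per million
seconds_per_month = 30 * 24 * 3600
drift = error_ppm * 1e-6 * seconds_per_month
print(f"{drift:.0f} seconds of drift per month")   # ~13 seconds
```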
The First Atomic Clock, 1955
Quartz was accurate, but still subject to drift caused by temperature, aging of the crystal, and manufacturing variations. Atomic physics offered something better: the quantum-mechanical vibrations of atoms, which are identical for every atom of a given element regardless of conditions.
The first practical cesium atomic clock was built by Louis Essen and Jack Parry at the National Physical Laboratory in Britain in 1955. Cesium-133 atoms transition between two energy states at a frequency of exactly 9,192,631,770 cycles per second — a number that now defines the second itself. Modern cesium clocks lose or gain approximately one second every 300 million years.
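As a back-of-the-envelope check, here is what “one second per 300 million years” means as a fractional error (a sketch of the arithmetic, not a metrological specification):

```python
# The SI definition: one second is 9,192,631,770 periods of the radiation from
# the cesium-133 hyperfine transition.
CESIUM_HZ = 9_192_631_770

# "One second of error per 300 million years" expressed as a fractional error
seconds_per_year = 365.25 * 86_400
fractional_error = 1 / (300e6 * seconds_per_year)
print(f"{fractional_error:.1e}")  # ~1.1e-16, about one part in ten million billion
```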
Unix Epoch, 1970
Computer engineers needed a practical starting point for counting time in software, and the designers of the Unix operating system chose January 1, 1970 at 00:00:00 UTC, a moment retroactively designated “the Unix epoch.” Unix represented all times as the number of seconds elapsed since that instant. This convention spread throughout computing and remains the underlying timekeeping standard for most software in the world today, including the server delivering this page to your browser. You can see the current Unix timestamp update in real time on HourZone.io.
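A quick illustration using Python’s standard time and datetime modules; any language with a time library will report the same count:

```python
import time
from datetime import datetime, timezone

# The current Unix timestamp: seconds elapsed since 1970-01-01T00:00:00 UTC
now = time.time()
print(int(now))

# The same count converted back into a human-readable UTC moment
print(datetime.fromtimestamp(now, tz=timezone.utc).isoformat())

# The epoch itself is, by definition, timestamp zero
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch.timestamp())  # 0.0
```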
Leap Seconds, 1972
Atomic time and astronomical time diverge slightly, because the Earth’s rotation is irregular — it’s gradually slowing, and subject to unpredictable wobbles. International standards bodies introduced leap seconds in 1972: occasional extra seconds added to UTC to keep it within 0.9 seconds of astronomical time. As of 2024, 27 leap seconds have been inserted. The practice remains controversial — some technology companies argue that unpredictable insertions cause software bugs — and international standards bodies have voted to eliminate leap seconds by 2035.
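Two consequences of this bookkeeping are easy to show in code: atomic time (TAI) now runs 37 seconds ahead of UTC (the 10-second offset chosen in 1972 plus the 27 leap seconds since), and Unix time simply skips over leap seconds, so the leap second of December 31, 2016 shares a timestamp with the midnight that followed it. A minimal sketch:

```python
from datetime import datetime, timezone

# TAI minus UTC: the 1972 starting offset plus every leap second inserted since
INITIAL_OFFSET = 10
LEAP_SECONDS = 27
print(f"TAI - UTC = {INITIAL_OFFSET + LEAP_SECONDS} seconds")   # 37

# Unix time pretends leap seconds never happened: the most recent leap second,
# 2016-12-31T23:59:60 UTC, is given the same count as the moment after it.
new_year_2017 = datetime(2017, 1, 1, tzinfo=timezone.utc)
print(int(new_year_2017.timestamp()))   # 1483228800
```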
Today: Cesium and Beyond
Modern timekeeping doesn’t stop at cesium. Optical lattice clocks using strontium or ytterbium atoms have reached fractional accuracies of around one part in 10^18, equivalent to less than a second of error over 15 billion years, longer than the current age of the universe. These clocks are precise enough to measure gravitational time dilation: two identical clocks placed 30 centimeters apart vertically will tick at measurably different rates because of gravity.
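The size of that effect follows from the standard weak-field formula: a clock raised by a height h runs fast by the fraction g·h/c². A short worked example using the 30-centimeter figure from the text:

```python
# Weak-field gravitational time dilation: fractional rate difference = g * h / c^2
g = 9.81              # m/s^2, Earth's surface gravity
c = 299_792_458       # m/s, speed of light
h = 0.30              # m, the 30-centimeter height difference

fractional_shift = g * h / c ** 2
print(f"{fractional_shift:.2e}")   # ~3.3e-17, comfortably resolvable by a one-part-in-10^18 clock

# Accumulated difference over one year
seconds_per_year = 365.25 * 86_400
print(f"{fractional_shift * seconds_per_year * 1e9:.2f} ns per year")   # ~1 ns
```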
For most practical purposes, your phone’s clock is synchronized to atomic time via GPS satellites and network time protocols, accurate to milliseconds. The gap between the sundial and the optical lattice clock spans five millennia of human ingenuity. To experience the result, check the live clock on our stopwatch page or explore the Unix timestamp tool.
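For the curious, the network half of that synchronization can be reproduced in a few lines. The sketch below sends a minimal SNTP request to a public NTP pool (pool.ntp.org is an assumed, commonly used host) and compares the server’s clock with the local one; a real client would also correct for the round-trip delay:

```python
import socket
import struct
import time

NTP_SERVER = "pool.ntp.org"   # assumed public NTP pool; any NTP server on port 123 works
NTP_TO_UNIX = 2_208_988_800   # seconds between the NTP epoch (1900) and the Unix epoch (1970)

def sntp_time(server: str = NTP_SERVER, timeout: float = 2.0) -> float:
    """Return the server's idea of the current Unix time (simplified SNTP, no delay correction)."""
    packet = b"\x1b" + 47 * b"\x00"   # first byte: LI=0, version 3, mode 3 (client request)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(packet, (server, 123))
        reply, _ = sock.recvfrom(48)
    # The transmit timestamp lives in bytes 40-47: 32-bit seconds + 32-bit fraction since 1900
    seconds, fraction = struct.unpack("!II", reply[40:48])
    return seconds - NTP_TO_UNIX + fraction / 2 ** 32

if __name__ == "__main__":
    server_now = sntp_time()
    local_now = time.time()
    print(f"NTP server : {server_now:.3f}")
    print(f"this device: {local_now:.3f}")
    print(f"offset     : {(server_now - local_now) * 1000:+.1f} ms")
```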