Y2K and the New Millennium

A Computer Glitch to End the World and the 20th Century

[Image: A keyboard with keys reading "Y2K" and "Help!" (Jon Riley / Getty Images)]

The year 2000 (Y2K) problem scared the world. Although some were ready to "party like it's 1999," others predicted catastrophe at the end of the year because of a programming assumption made in the early days of computing. Y2K entered the cultural conversation over concerns that technology and automated systems would fail when their clocks rolled over from Dec. 31, 1999, to Jan. 1, 2000.

Age of Technological Fear

Many assumed that electronics would not be able to handle dates that did not begin with "19" because they ran on outdated, short-sighted programming. The fear was that computer systems would become so confused that they would shut down completely, leading to chaos and wide-scale disruption.

Considering how much of everyday life was run by computers in 1999, the New Year was expected to bring serious computerized consequences. People worried about banks, traffic lights, the power grid, airports, microwaves, and televisions, all of which depended on computers.

Doomsayers even predicted that mechanical processes like flushing toilets would be affected by the Y2K bug. Some thought that Y2K would end civilization as we knew it. As programmers raced to update computer systems, many in the public prepared themselves by storing extra cash and food supplies.

Preparations for the Bug

By 1997, a few years ahead of the widespread panic over the millennium problem, computer scientists were already working toward a solution. The British Standards Institution (BSI) developed a new standard defining conformity requirements for the year 2000. Known as DISC PD2000-1, the standard outlined four rules:

  1. No value for current date will cause any interruption in operation.
  2. Date-based functionality must behave consistently for dates prior to, during, and after 2000.
  3. In all interfaces and data storage, the century in any date must be specified either explicitly or by unambiguous inferencing rules and algorithms.
  4. 2000 must be recognized as a leap year. 
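Rule 3 allows the century to be inferred rather than stored explicitly, as long as the inference is unambiguous. One common remediation technique of this kind was date "windowing," in which a fixed pivot year decides which century a two-digit year belongs to. The following Python sketch is purely illustrative; the function name and the pivot value of 50 are assumptions, not part of the standard:

    def expand_two_digit_year(yy, pivot=50):
        """Infer a four-digit year from a two-digit one using a pivot.

        Two-digit years below the pivot are assumed to fall in the 2000s;
        years at or above it fall in the 1900s. The pivot of 50 is an
        assumption chosen for illustration only.
        """
        return 2000 + yy if yy < pivot else 1900 + yy

    print(expand_two_digit_year(99))  # 1999
    print(expand_two_digit_year(0))   # 2000
    print(expand_two_digit_year(3))   # 2003

Windowing only postpones the ambiguity (a pivot of 50 breaks again for two-digit years from 2050 onward), which is why the standard also permits stating the century explicitly.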

Essentially, the standard identified two key issues underlying the bug:

  1. The existing two-digit representation of years was ambiguous in date processing.
  2. A misunderstanding of the Gregorian calendar's leap-year rules meant that some programs did not treat the year 2000 as a leap year.

The first problem was solved by reprogramming systems to store and process years as four-digit numbers (1997, 1998, 1999, and so on) where they had previously used only two digits (97, 98, 99). The second was solved by correcting the leap-year algorithm: a year divisible by 100 is not a leap year unless it is also divisible by 400, which makes 2000 a leap year.
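Both fixes are easy to illustrate. The Python sketch below is a hypothetical illustration, not code from any actual remediation project: it shows how arithmetic on two-digit years breaks across the century boundary while four-digit years behave correctly, and it implements the corrected Gregorian leap-year rule under which 2000 is a leap year.

    # Two-digit years: an elapsed-time calculation that worked in 1999 fails in 2000.
    print(99 - 95)      # 4   -- years since 1995, computed in 1999
    print(0 - 95)       # -95 -- the same calculation, computed in 2000
    print(2000 - 1995)  # 5   -- four-digit years give the right answer

    def is_leap_year(year):
        """Gregorian rule: divisible by 4, except that century years
        (divisible by 100) are leap years only if divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    print(is_leap_year(2000))  # True  -- divisible by 400
    print(is_leap_year(1900))  # False -- divisible by 100 but not 400
    print(is_leap_year(1996))  # True  -- divisible by 4, not a century year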

What Happened on January 1?

With so much preparation and reprogramming completed before the date change, catastrophe was largely averted. When the prophesied date arrived and computer clocks around the world rolled over to Jan. 1, 2000, very little out of the ordinary happened. Only a few relatively minor millennium bug problems occurred, and even fewer were reported.
