Y2K was, at bottom, a two-byte problem: storing the year as two digits instead of four saved two bytes per date. From the beginning of software coding to the age of mass storage devices and voluminous RAM capacity, roughly the mid-1940s to the late 1970s, software engineers always had to keep an eye on the space needed to run and store their code and data, not to mention the ever-present runtime concerns and the cost of the computers themselves. Consequently, only large organizations with deep pockets and few time constraints purchased computers and developed programs: banks, securities exchanges, communications carriers, insurance companies, and government agencies, which generally relegated computer operations to the financial side of their respective industries.
It was the banking industry that first noticed the Y2K problem, in the early 1970s, when its 30-year mortgages were being calculated. No problem: a little fix here and there and everything was back to normal. By 1985 the same problem had reappeared, this time a little more widespread. By 1990 it was front-page news.
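The mortgage failure above can be sketched in a few lines. This is a hypothetical illustration, not actual banking code: with only two digits available, year arithmetic silently wraps at 100, so a 30-year loan opened in 1970 appears to mature in "year 00", which sorts before the loan even began.

```python
def maturity_year_two_digit(start_yy: int, term_years: int) -> int:
    """Add a loan term to a two-digit year, as a 1970s program
    with two bytes of storage per year effectively did."""
    # Only two digits fit, so the sum wraps around at 100.
    return (start_yy + term_years) % 100

# A 30-year mortgage opened in 1970 ("70"):
start = 70
maturity = maturity_year_two_digit(start, 30)

print(maturity)           # 0, i.e. "year 00"
print(maturity < start)   # True: the loan "matures" before it begins
```

Any comparison, sort, or interest calculation built on that wrapped value goes wrong, which is exactly what the banks' 30-year projections hit in the early 1970s.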
Now, in 2014, we are faced with the question of eliminating not only historical data but also the “old” programs.
Point of View
Two main philosophies compete with each other. One is to eliminate nothing unless it is absolutely necessary; the other is to eliminate everything unless it is absolutely necessary to keep it.
Periodic “cleansing” of old data and programs from active operational systems is a must. However, you must also maintain some type of offline site that stores the data, the programs, and the technology platforms needed to restore that data in an emergency.
Converting old data formats to the latest technology platforms would reduce storage and maintenance costs, and it satisfies both philosophies mentioned above. It is critical, of course, to verify that old data can actually be restored and reloaded in an emergency. Can yours? Call Guardian for assistance and an audit to make sure your old data can be reconstructed on the latest technology platforms.