Common sense has always suggested that information is important, whether it belongs to the military, the government or private companies. The only things that change are the potential damage an undesirable disclosure would cause and, with it, the zeal with which the information needs to be protected. There are other nuances too, such as how "sensitive" information is identified and the kind of protection mechanisms employed. Most organizations follow tried and tested variations of the World War II-era classification scheme (sensitive, confidential, top secret…) or something very similar to it.

The funny thing about common sense is that common practice never seems to agree with it! Military classification schemes have been in use since the World War II era, seals and ciphers have existed forever, and the same can be said of lockable document carriers and safes. Yet it took the IT industry a good part of this decade to rediscover these concepts and invent products that bring the same security properties to electronic information.

One might argue that today's complex environment has created a need for atomic information protection systems that simply did not exist back in the 90s. But if one goes back to the early 1990s, one will find that Lycos (a search engine) and Novell NetWare 3.x (a network OS with atomic rights management) were both doing well on this front!

So, why did common sense take such a long sabbatical? Well, unlike our military cousins, who were protecting the data (secrets) and plans of their client (the parent country) and employees (defence personnel), we were not thinking along those lines at all, because data was not considered important. It took privacy rights activists, identity thieves and hackers to force governments, regulators and eventually companies to find ways to protect information.

Now that everyone has collectively woken up to data protection needs, most of us are wondering why the operating system companies did not simply build an audit-capable file system. Such a system would maintain tamper-proof records of the creator, editors, edits, copies made (by whom and from where), dates, times, even enforced classification, all conveniently attached to the document itself as metadata, like a library book with its check-in/check-out card.
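To make the idea concrete, here is a minimal sketch in Python of such a tamper-evident audit trail. It assumes nothing about any real operating system; the record fields, the hash-chaining scheme and the names (AuditEntry, AuditTrail) are illustrative inventions, not an actual file system API.

    import hashlib
    import json
    import time
    from dataclasses import dataclass, field

    @dataclass
    class AuditEntry:
        """One tamper-evident record: who did what to the document, and when."""
        actor: str        # creator, editor, copier...
        action: str       # e.g. "created", "edited", "copied_from", "classified"
        detail: str       # e.g. the source path of a copy, or a classification label
        timestamp: float = field(default_factory=time.time)
        prev_hash: str = ""   # digest of the previous entry, chaining the log

        def digest(self) -> str:
            payload = json.dumps(
                [self.actor, self.action, self.detail, self.timestamp, self.prev_hash]
            )
            return hashlib.sha256(payload.encode()).hexdigest()

    class AuditTrail:
        """Append-only log that rides with the document, like a check-out card."""
        def __init__(self):
            self.entries: list[AuditEntry] = []

        def record(self, actor: str, action: str, detail: str = "") -> None:
            prev = self.entries[-1].digest() if self.entries else ""
            self.entries.append(AuditEntry(actor, action, detail, prev_hash=prev))

        def verify(self) -> bool:
            """Any change to an earlier entry breaks every later prev_hash link."""
            prev = ""
            for entry in self.entries:
                if entry.prev_hash != prev:
                    return False
                prev = entry.digest()
            return True

    # Usage: the trail travels with the file as metadata.
    trail = AuditTrail()
    trail.record("alice", "created")
    trail.record("alice", "classified", "confidential")
    trail.record("bob", "edited", "section 2 rewritten")
    assert trail.verify()

The hash chain is what gives the check-out card its tamper-proof quality: each entry commits to the one before it, so quietly rewriting history invalidates every link that follows.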

One may also wonder whether I am biased against the OS makers. Why not let the current data protection technology companies solve the problem? If we think about it in terms of a traditional library, it is the librarian who controls the check-in and check-out of the books. One may also remember the library stamps on page borders that acted as identifiers in case a book was photocopied. Well, all of this is far more easily done at the OS level, including the history of accesses and changes; the library could, in theory, outsource the more esoteric activities, like the storage, protection and retrieval of its more sensitive tomes.

Most of us have accepted that an RDBMS maintains a log of every cough and sneeze, but we do not expect the same of a file system; we worry:

  • Will the log be too large? Will it eat up space?
  • Will it make the system too slow?
  • Will we have to archive all of that on yet more tapes?

But these questions of what, how much and where are like deciding on a cuisine once you know there is a choice of restaurants around: once the capability exists, all that remains is deciding how to use it.

George Santayana, the Spanish-born American philosopher, famously said, “Those who cannot remember the past are condemned to repeat it.” Technology companies haven’t looked at the past. If only they had studied the military, or even the local library, they might have built data protection functionality into the operating system or the messaging system. Maybe then we could have protected our critical information without buying new and fancy data protection technology.

Author:

Jayesh Kamath
Practice Head - IRAS, Aujas