
Privacy and Duty

The last few months have been a tough time for internet security.

We’ve learned that a major computer manufacturer was found last week to be shipping laptops with malware and fake root certificates, compromising the secure communications of its customers.

We learned that hackers stole hundreds of millions of dollars from Russian banks.
And we learned that intelligence agencies may have hacked into a major SIM card manufacturer, putting the privacy of millions of people at risk.
Those of us in the IT world have a duty to respond to these incidents.
And I use the word duty very intentionally.  Most system administrators have, by nature of their work, a moral, ethical, contractual and legal obligation to protect client and company data.

For example, if they work for a law firm, then the Canadian Bar Association Code of Professional Conduct includes this section:

Maintaining Information in Confidence
1. The lawyer has a duty to hold in strict confidence all information concerning the business and affairs of the client acquired in the course of the professional relationship, and shall not divulge any such information except as expressly or impliedly authorized by the client, required by law or otherwise required by this Code.


The duty to ‘hold information in strict confidence’ must apply every bit as much to electronic records and communications as to any other type of information.

If you work for a company with a presence in Europe, you are bound by EU data legislation, which includes:

“Everyone has the right to the protection of personal data.”
Under EU law, personal data can only be gathered legally under strict conditions, for a legitimate purpose. Furthermore, persons or organisations which collect and manage your personal information must protect it from misuse and must respect certain rights of the data owners which are guaranteed by EU law.


In my career, I’ve often found myself working with health care data, and have thus come under the jurisdiction of Ontario’s Personal Health Information Protection Act, which among other things states:

12.  (1)  A health information custodian shall take steps that are reasonable in the circumstances to ensure that personal health information in the custodian’s custody or control is protected against theft, loss and unauthorized use or disclosure and to ensure that the records containing the information are protected against unauthorized copying, modification or disposal.

And anyone working in the financial industry is likely to find themselves subject to a Code of Ethics such as this one from TD Bank:

“A. Protecting Customer Information
Customer information must be kept private and confidential.

C. Protecting TD Information
We must carefully protect the confidential and proprietary information to which we have access, and not disclose it to anyone outside of TD or use it without proper authorization, and then only for the proper performance of our duties.”

Nothing to Hide?

Occasionally, I’ve heard the suggestion that ‘those with nothing to hide have nothing to fear.’
In the light of these duties and obligations, this claim is, of course, absurd.  Not only do we in the IT industry have access to large amounts of confidential information, we have a moral, ethical, contractual and legal obligation to keep it secure – to ‘hide’ it.
Because we can’t divine intent when our systems come under attack.  Whether it’s a criminal gang, a careless vendor, or a foreign intelligence agency, the attack vectors are the same, and our response must be the same: robustly and diligently protecting the systems and data that have been placed in our care.

A Rough Week for Security

2014 was a tough year for anyone responsible for systems security.  Heartbleed was uncovered in April, which led to some seriously panicky moments as we realised that some secure web servers had been accidentally leaking private information.  And then again later in the year we discovered the Shellshock vulnerability in many Unix systems, leading to yet more sleepless nights as I and countless other systems administrators rushed to patch our systems.
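Part of that patching scramble was simply checking which machines were affected. For Shellshock, the widely circulated probe for the original CVE-2014-6271 was a one-liner: export a function-style environment variable with trailing commands and see whether bash executes them on startup.

```shell
# Classic Shellshock (CVE-2014-6271) probe: a vulnerable bash executes
# the trailing 'echo vulnerable' while importing the function definition
# from the environment; a patched bash treats x as an ordinary variable.
env x='() { :;}; echo vulnerable' bash -c 'echo shellshock test'
```

On a vulnerable system this prints ‘vulnerable’ before ‘shellshock test’; a patched bash prints only ‘shellshock test’.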

I did find a couple of silver linings in these events, though. Firstly, both of the vulnerabilities, although severe, were the result of genuine mistakes on the part of well-meaning, under-resourced developers, who didn’t anticipate the consequences of some of their design decisions.  And secondly, I was intensely proud of how quickly the open source community rallied to provide diagnostic tools, patches, tests, and guides.  With a speed and efficiency that I’ve never seen in a large company, a bunch of unpaid volunteers provided the tools we needed to dig ourselves out of the mess.

2015, however, is so far going worse.  This week’s security flaws, specifically the ‘Superfish’ scandal (in which Lenovo deliberately sold laptops with a compromised root certificate purely so that third-party software could inject ads into supposedly secure websites, exposing millions of users to potential man-in-the-middle attacks), and the now-brewing ‘Privdog’ scandal (trust me, you’ll hear about this soon if you follow security blogs…), are the direct result of vendors choosing to violate the trust of consumers in pursuit of tiny increases in their profit margins.
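Spotting a rogue root like Superfish’s comes down to fingerprinting: every X.509 certificate has a stable hash of its DER encoding, so a known-bad certificate can be identified no matter what friendly name it hides behind in the trust store. Here’s a minimal Python sketch of the idea; the PEM decoding is standard, but the blocklisted fingerprint shown is a placeholder, not the real Superfish hash.

```python
import base64
import hashlib

# Hypothetical blocklist of SHA-256 certificate fingerprints.
# (Placeholder value, NOT the actual Superfish fingerprint.)
BLOCKLISTED_FINGERPRINTS = {
    "deadbeef" * 8,
}

def cert_fingerprint(pem_text: str) -> str:
    """Return the SHA-256 fingerprint of a PEM-encoded certificate.

    The fingerprint is the hash of the DER bytes, i.e. the base64
    payload between the BEGIN/END CERTIFICATE markers.
    """
    payload_lines = [
        line.strip()
        for line in pem_text.strip().splitlines()
        if "CERTIFICATE" not in line  # drop the BEGIN/END marker lines
    ]
    der_bytes = base64.b64decode("".join(payload_lines))
    return hashlib.sha256(der_bytes).hexdigest()

def is_blocklisted(pem_text: str) -> bool:
    """True if the certificate's fingerprint is on the blocklist."""
    return cert_fingerprint(pem_text) in BLOCKLISTED_FINGERPRINTS
```

On a real system you would run this over every certificate in the OS trust store and compare against the fingerprints of known-bad roots as researchers publish them, which is essentially what the Superfish detection tools released that week did.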

I’m processing a number of emotions as I get up to speed on the implications of these security flaws.  Firstly, frustration – any new security weakness causes more work for me as I test our systems, evaluate our vulnerabilities, apply necessary patches, and communicate with clients and colleagues.

Secondly, anger.  I’m angry that vendors do not feel that they are bound by any particular obligation to provide their clients with the most secure systems possible, and that in both these cases they have deliberately violated protocols that have been developed over many years specifically to protect personal data from hackers, thieves, spies, corporate saboteurs, and other malicious actors.  I don’t know whether their underlying motivation was greed, malice, or simply stupidity, but whatever the cause, I’m deeply, deeply disappointed.  Not just with the companies, but with the specific individuals who chose to create flawed certificates, who chose to install them, who chose to bypass the very systems that we trust to keep us safe, and who chose to lie to consumers about it, telling them that this was ‘value added’ software, designed to ‘enhance their browsing experience’.

Thirdly, though, I’m grateful.  We wouldn’t have even known about these flaws without the sterling work of security researchers such as Filippo Valsorda.  Watching his Twitter stream as the Superfish scandal unfolded was a surreal experience.  As far as I can tell, the man neither eats nor sleeps; he just effortlessly creates software, documentation, vulnerability testing code, and informative tweets, with a speed that leaves me not so much envious as awestruck.

And finally, I’m left with a sense of determination.  The whole world is connected now, and the Internet is every bit as critical to our global infrastructure as roads, shipping lanes, corporations, and governments. And it is a vital shared resource.  If it is to continue to flourish, continue to allow us to communicate, learn, conduct business, share and collaborate, then it must remain a robust, trustworthy system.  And although we have been sadly let down this week by systems vendors, the Internet is bigger than any one company.  And our collective need and motivation for it to be a trustworthy system is greater than the shortsighted greed of any number of individuals.

So I’ll go back to work tomorrow, and I’ll do my best to keep my clients’ data secure, their systems running, their information flowing, and I’ll do so grateful for all the work of millions of other hard-working developers, systems administrators, hardware designers, and other assorted geeks.


Here’s to the crazy ones.