Our Data, Our Rules: Rethinking Protections for Institutional Data
By Vince Sheehan, CIO & Associate Dean for Clinical Affairs IT Services, Indiana University
The headlines scream of data breaches involving universities.
• Maryland: 300,000 university records containing Social Security numbers hacked
• North Dakota: 291,465 student and 784 employee records taken
• Virginia: 144,000 job applications illegally accessed
• Delaware: 74,000 Social Security numbers stolen
Intrusions by hackers also strike close to home: Thieves stole information on 163,000 students, faculty, staff, alumni, and applicants at an Indianapolis university.
At my own university, a Student Services staffer moved data from a secure site to a public site for easier access. Records of 146,000 students were exposed for 11 months; thankfully, we have detected no misuse of their data to date.
As we welcome our students back to Indiana University, our discussions increasingly turn towards security. This is crucial for those of us in the health sciences community. While protecting all data for the university is important, it is paramount for the health sciences schools. We manage the most sensitive data of all: personal medical history.
Responding to the Indianapolis breach, Fred Cate, Distinguished Professor and C. Ben Dutton Professor of Law at the Indiana University Maurer School of Law, said, “You might think your Social Security number is a secret, but it's the worst-kept secret in the world.” You willingly give your Social Security number to doctors, financial companies, your employer, and many others. While the release of your Social Security number might result in identity theft and be terribly inconvenient, there are steps you can take to undo the damage.
However, the exposure of our most personal information – your history of sexually transmitted diseases for example – can never be undone.
While we all take steps to protect data centers, servers, databases, and desktops, security lapses are frequently due to human error. In the last year alone:
• A breast cancer treatment center in Indianapolis mailed 63,000 letters containing information on upcoming appointments to the wrong people.
• Thousands of paper records containing personal medical information, doctors' notes, Social Security numbers, and insurance information were dumped at a public incineration site in York, Pennsylvania.
• A break-in at the offices of a billing firm for county health services in Torrance, California, yielded eight laptops with medical information on almost 169,000 people.
That last event is all too common. Mobile devices are stolen or lost every day – and, increasingly, these are personal devices. In the ancient past (five years ago), we could make a reasonable attempt to control the security around devices used at work because they were purchased by the university: our devices, our rules.
With the explosion of personal devices, the world has changed. A 2013 Sophos survey revealed that the average worker carries 2.9 mobile devices. This seems accurate as I watch a man leaving this restaurant, slinging a laptop bag across his shoulder while carrying an iPad and wearing a smartphone on his belt. He's no IT geek – he's my lawyer.
So we are shifting our thinking from protecting devices to protecting data. Our approach is to strengthen security by setting the policy rules around the university data, not just the device. Want access to the university’s health records via your personal laptop/tablet/smartphone? We’ll first verify that your device is password protected and encrypted. It’s not perfect, but it’s a start.
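The device-compliance gate described above can be sketched in a few lines. This is a minimal illustration, not our production system; the `Device` attributes and function names are hypothetical stand-ins for what a mobile-device-management agent might report before access to health records is granted.

```python
from dataclasses import dataclass


@dataclass
class Device:
    """Attributes a hypothetical MDM agent might report for a personal device."""
    owner: str
    passcode_enabled: bool
    disk_encrypted: bool


def may_access_health_records(device: Device) -> bool:
    """Grant access only if the device clears the minimum policy bar:
    a passcode set and full-disk encryption enabled."""
    return device.passcode_enabled and device.disk_encrypted


# An encrypted, passcode-protected tablet passes the check;
# an unencrypted personal laptop is denied.
tablet = Device(owner="jsmith", passcode_enabled=True, disk_encrypted=True)
laptop = Device(owner="jsmith", passcode_enabled=True, disk_encrypted=False)
print(may_access_health_records(tablet))  # True
print(may_access_health_records(laptop))  # False
```

The point of the sketch is that the policy attaches to the data request, not to who bought the hardware: any device, personal or university-owned, faces the same gate.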
This approach deviates from the culture of the academy. Universities are notorious for unit-independence-on-steroids. Deploying solid data security protocols involves more than just deploying the right technologies and creating written policies; it means changing the culture as well. Security protocols for even personal devices are no longer solely in the hands of their owners. Our data – our rules.
Too often, strong security practices slow the delivery of patient care. At a minimum, they are an inconvenience: they require additional steps or keystrokes, with no reinforcement that something positive has happened. So we try to change people's behavior by focusing on avoiding penalties – and those penalties grow increasingly severe.
The combined $4.8 million fine shared by a prominent hospital and research university in New York for a HIPAA violation helps to bring everything into focus. HIPAA/HITECH Act violations can also bring personal penalties. Although going to jail for violations is still rare, the first individual convicted of a HIPAA violation received a four-month prison sentence in 2010, and several others have since been prosecuted as individuals under HIPAA regulations. Recent incidents portend an increase in personal penalties. In the IU data exposure case, the Indiana Attorney General (AG) considered civil suits against the individuals involved, including the IT employees who administered the SharePoint system where the data resided. Although all proper security protocols were in place, the user still had control over content and moved the data from a protected area to a public area; yet the IT support staff was still under consideration for legal recourse. The AG eventually settled on official letters of reprimand, but next time we might not be so lucky.
We all know that no security practices are going to be 100 percent effective. Hackers are smart, they’re motivated, and they'll likely stay one step ahead of us. Sometimes users do things without thinking about the consequences. We have to be ever-vigilant and do everything we can to protect university data, including data accessed by personal devices. Until someone devises a method to truly protect the data no matter where it goes, we will have to continue to protect the spaces where it lands – no matter who owns those spaces.