What’s the first thing that comes to mind when you hear the words “insider threat”? If you’re like me, negative and malicious images pop into your head. That picture rarely fits higher education, though. Instead, we think of a collaborative, sharing environment where faculty, students, and staff are empowered and encouraged to use collaboration technology, from Gmail and Slack to project management tools and data repositories. Yet even where there is no deliberate insider threat, there is still insider risk: significant amounts of data can be exfiltrated, both intentionally and unintentionally, within educational institutions.
Exfiltration or misuse of institutional data can be the result of malicious intent, but mistakes also happen when people are simply trying to get their jobs done quickly and efficiently. A recent Code42 study found that more than a third of employees use unsanctioned apps every day to share data with their colleagues for exactly this reason. Security teams understand that collaboration tools are essential to institutions; however, those same tools can also pose some of the greatest risks for data exfiltration.
Historically, institutions have relied on security tools that require faculty and staff to be connected to the campus network. As off-network work increases, security teams may lack the visibility they need to monitor data movement. Let’s go back to the term “insider threat” for a moment. The term conjures an image of a specific person, an image reinforced by traditional security solutions that take a user-centric approach. This big-brother approach to security runs counter to the strategy of most higher education institutions. Code42 takes a different point of view: instead of focusing on specific users, Code42 focuses on the data first, because faculty, students, and staff are typically trusted members of the institution. Still, the responsibility to protect the university against data exposure events such as loss, leak, theft, sabotage, and espionage falls to the security team. They need to be able to trust users while retaining the capability to detect, investigate, and respond to suspicious data movement should something go awry.
Why is this important in a collaborative, sharing culture like higher education? Imagine a researcher who has been working on a groundbreaking project, only to find that someone else published her findings first. This could be detrimental not only to her publishing career but to the institution as well, in terms of current and future funding and harm to the university’s reputation. This scenario could result from an accidental cloud sharing permission or, more nefariously, from someone intentionally exfiltrating the data.
Now imagine a different scenario, in which someone accidentally shares FERPA-protected records or PII and the data ends up in the wrong hands. In addition to potential legal and financial ramifications, this could cause reputational harm and make donors think twice about giving to the institution. Students might also decide to attend a different institution where they feel their PII will be better protected.
Keep in mind that security is not compliance. Compliance does, however, provide a baseline of security control coverage, and good security builds on compliance with additional controls that address the institution’s specific risks. The two working hand in hand can yield powerful results. It is important that security teams get ahead of potential problems and protect university data from exfiltration or misuse, helping to prevent litigation and reputational damage.
Security’s role is not to determine whether someone was acting morally; its role is to keep institutional data out of the wrong hands. Having a plan in place to engage and collaborate is key to success.
To ensure that everyone within the institution understands the reason for endpoint data protection, transparency with faculty and staff is paramount. Each college, academic unit, or department should be an integral part of insider risk planning, as they know what is “normal” from a data movement standpoint. Code42 Incydr delivers the risk signal needed to distinguish between everyday collaboration and the events that put institutional data at risk.
Code42 Incydr allows an institution to manage and mitigate insider risk by providing the ability to quickly detect, investigate, and respond to data exposure across a variety of vectors, including cloud file shares, browser uploads, removable media, and email. Incydr can also identify file-sharing activity in communication platforms (e.g., Slack), cloud collaboration platforms (e.g., OneDrive, Google Drive, Box, Dropbox), and more, to verify that acceptable use policies are being followed. It also helps security teams determine what additional security awareness training is needed and target that training to the people or departments that need it most. Incydr is quick and easy to deploy and doesn’t take a massive team to manage, delivering time to value very quickly.
Here are some truths: faculty and staff will eventually move on to other institutions or departments, people make well-intentioned mistakes, and institutions will restructure. So it is important to make sure you have the ability to detect, investigate, and respond to harmful data exfiltration.