This is the first in a series of Insights articles that detail challenges associated with proper Host/Endpoint Data Management.
There are myriad solutions that aim to protect sensitive desktop information, e.g. word processor documents, spreadsheets, presentation materials, PDFs, email content, and other data files (including proprietary formats) that applications read and write as you carry out daily tasks. Some focus on protecting data after it leaves your host, and others aim to detect/prevent unwanted host application execution.
Fewer endpoint solutions use file encryption, enhancing aspects of its application to obfuscate content from intruders who steal data files. Many deliver real value, though few combine ease of use with in-depth capabilities.
This series examines the complexity of host-based data management. We begin with file encryption, describe its pitfalls, then offer insight into additional challenges. We finish with what you can expect to gain from following along, together with a preview of the services we plan to detail.
Today, data management and data protection are tightly coupled: one cannot claim to manage content effectively unless it is managed securely. The security of a system is of course a relative claim; our measure of effective data protection is one that a) maintains high availability, b) provides assurances that content retains its intended form, and c) limits exposure to unauthorized resources.
Thus, in a perfect world, all your documents would be protected all the time, limiting access to you, your teammates, and third parties with whom you're doing business - customers, partners, vendors, and so forth. These documents would also be available to you even if your host computer were corrupted, subjected to ransomware, or worse yet, if information were manipulated in subtle ways (a growing threat in certain sectors). You want to know that content is as expected, securely stored and made available in its original, intended form (Integrity Protection), at any time - all the time. Perhaps most importantly, you want assurances that content isn't available to unauthorized users - and if possible, knowledge concerning possible data leaks.
On top of that, we all want to continue working as we do now, without being limited in the applications we choose or in how we use them. Nothing should change - we want to continue using sync and sharing solutions like Dropbox, Box, Egnyte, and others - and we also need to be able to use proprietary software developed in-house. We of course need to maintain compatibility with existing and future IT systems, which might include shared server storage, SharePoint and related technologies, virtual desktop environments, and/or a set of integrated in-house systems that manage internal proceedings. We also want to attach protected items to email, and protect the content of those emails, limiting the need for specialized input and thus minimizing the impact of unintended operations or human error.
After many years of effort, we've achieved something remarkably close to this reality. And though you may find aspects of these desires in some systems, the benefits are not often found in cohesive solutions that can be deployed, maintained, and used with little effort. Let's consider file encryption to start.
Traditional File Encryption
Encryption alone is hardly sufficient to protect against today's threats - but it is a fundamental element in achieving this result. As a starting point, let's look at the "most manual" instantiation of the basic form of a widely used desktop encryption solution:
- First, you provision keys for your "identity", often associated with an email address (sometimes using aliases for convenience)
- You maintain these keys for ongoing use, which often requires that you publish a public component and secure a private component
- Before encrypting content, you must first decide who will have access to the encrypted result
- Once you know who you will share a file with, you must import each recipient's public key (because of how this works under the covers)
- The process of acquiring the proper public key for each recipient involves some form of verification, best carried out in person
- You are now ready to encrypt the file with the public keys of those with whom you wish to share the result
- You can share the resulting ciphertext using whatever means available to you
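The steps above can be sketched as a structural model in Python. This is a toy illustration of the workflow's shape - who holds which key, and who can recover plaintext - not real cryptography; the names, the ENC[...] wrapper, and the key-ID scheme are all invented for the example:

```python
import secrets
from dataclasses import dataclass

@dataclass(frozen=True)
class KeyPair:
    """Toy stand-in for an asymmetric keypair -- no real cryptography here."""
    owner: str
    key_id: str

def provision_identity(email: str) -> KeyPair:
    # Steps 1-2: generate a keypair bound to an identity and "publish" its
    # public component (here, just a random key ID)
    return KeyPair(owner=email, key_id=secrets.token_hex(8))

@dataclass
class Ciphertext:
    recipient_key_ids: set  # models one wrapped session key per recipient
    payload: str            # stands in for the encrypted bytes

def encrypt(plaintext: str, recipients: list) -> Ciphertext:
    # Steps 3-6: decide who gets access, then encrypt to each public key
    return Ciphertext({k.key_id for k in recipients}, f"ENC[{plaintext}]")

def decrypt(ct: Ciphertext, my_key: KeyPair) -> str:
    # Only a private key matching one of the encryption keys recovers plaintext
    if my_key.key_id not in ct.recipient_key_ids:
        raise PermissionError(f"{my_key.owner} holds no matching private key")
    return ct.payload[4:-1]

alice = provision_identity("alice@example.org")
bob = provision_identity("bob@example.org")
eve = provision_identity("eve@example.org")

ct = encrypt("Q3 forecast", [alice, bob])   # step 7: share ct freely
print(decrypt(ct, bob))                     # prints "Q3 forecast"
try:
    decrypt(ct, eve)                        # Eve was never a recipient
except PermissionError as err:
    print(err)
```

The model makes the core dependency explicit: access is determined entirely by the set of public keys chosen at encryption time, which is exactly where the mistakes discussed below take root.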
Though first available in practical form in the early 90s, software that allows you to manually execute these steps can still be acquired as open source to this day. This of course facilitates specialized automation for many of these tasks, and there are well-known variations that add value by doing so in flexible and widely accepted ways. These solutions are, however, for relatively accomplished users with insight into supporting details.
Using Traditionally Encrypted Content
In this baseline encryption procedure, plaintext access is available only to those who hold the private key component matching one of the public keys used during encryption. The procedure for accessing plaintext content is straightforward - with ciphertext, execute a decrypt operation with your private key, consume/modify the original content, then re-encrypt with the public keys of those with whom you intend to share changes.
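This decrypt-modify-re-encrypt cycle, and the plaintext exposure it creates, can be sketched in the same toy style (again, not real cryptography - the ENC[...] wrapper stands in for actual ciphertext):

```python
import os
import tempfile

def decrypt_to_disk(ciphertext: str) -> str:
    """Toy decrypt: strips the ENC[...] wrapper and writes plaintext to disk."""
    plaintext = ciphertext[4:-1]
    fd, path = tempfile.mkstemp(suffix=".txt")
    with os.fdopen(fd, "w") as f:
        f.write(plaintext)
    return path  # plaintext now sits unprotected on the filesystem

def reencrypt_from_disk(path: str) -> str:
    """Re-wrap the (possibly edited) plaintext and delete the working copy."""
    with open(path) as f:
        updated = f.read()
    os.remove(path)  # best effort -- deleted bytes may persist on the medium
    return f"ENC[{updated}]"

path = decrypt_to_disk("ENC[Q3 forecast]")
# During this window, any process with file access -- including malware on
# the host -- can read or copy the plaintext.
with open(path, "a") as f:
    f.write(" (revised)")
print(reencrypt_from_disk(path))  # prints "ENC[Q3 forecast (revised)]"
```

The window between the two calls is the point to keep in mind: the protection scheme is only as good as whatever happens to the plaintext while it sits in the clear.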
But wait: How do you know who was slated to receive the original ciphertext? Do you use an email recipient list, and is there any way of knowing whether or not this matches the set of public keys used during encryption? Or, better yet, is there a way to see the public keys used in the encryption operation? What happens if, in response, someone re-encrypts content and accidentally uses the wrong public key for one of the intended recipients?
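On the recipient-mismatch question, one partial safeguard is to compare the recipient key IDs recorded in the original ciphertext against those used for the reply. The sketch below assumes a toy header that lists those IDs; real formats such as OpenPGP carry comparable per-recipient metadata, though a sender can deliberately hide it:

```python
def recipient_ids(header: dict) -> set:
    # Toy ciphertext header listing the key IDs content was encrypted to
    return set(header["recipient_key_ids"])

original = {"recipient_key_ids": {"a1b2", "c3d4", "e5f6"}}
# A teammate re-encrypts the reply but accidentally swaps in a wrong key:
reply = {"recipient_key_ids": {"a1b2", "c3d4", "ffff"}}

dropped = recipient_ids(original) - recipient_ids(reply)
added = recipient_ids(reply) - recipient_ids(original)
if dropped or added:
    print(f"recipient mismatch: dropped={sorted(dropped)}, added={sorted(added)}")
```

Of course, this only catches the mistake if someone actually performs the comparison - tooling that never surfaces the key set makes the error invisible.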
But worse yet, what if one of the recipients is unaware of the danger of exposing a private key, or blatantly ignores its sensitivity and, as a matter of convenience, makes his/her private key readily available on his/her host computer to simplify operation? This of course exposes the private key to theft by host intruders. Once it's taken, the integrity of the system is compromised - and over time, the team continues to work under the notion that content is protected despite the fact that an intruder can access all plaintext destined for our lazy friend.
And though it's true that our intruder can wait for our lazy friend to decrypt content and then take the plaintext file - which we'll revisit shortly - the attacker can actually refrain from ongoing operations on the host computer if he/she can find a way to intercept content in transit, acquire it from, say, unprotected email transmission, and access data over time.
In that case, how does the team learn that content has been compromised? Unfortunately, not until it's far too late - and this does happen quite a bit more frequently than you hear about in the news, specifically because companies aren't required to disclose data breaches except in certain cases.
Fundamental Coverage Gaps in a Connected World
With supporting software in mind, this doesn't seem too bad, does it? After all, this solution in its generally stated form has been around for a long time, and many of the user management and access control problems were once effectively controlled with supporting software and systems.
But in the given file encryption process, there are two fundamental flaws that were once not significant concerns:
- Use of enciphered content requires decryption to plaintext before consumption and use, exposing sensitive materials
- Private keys stored on a compromised host, in any form, will eventually be exposed, acquired, and used by attackers
The impact today - not nearly as significant 20 years ago - is largely the result of the more interconnected world we now live in. This includes the emerging prevalence of IoT devices, the widespread use of BYOD corporate computing, and other dynamics that provide greater opportunity for attackers to infiltrate corporate networks and set up long-term surveillance campaigns. These are designed to avoid detection and quietly offload sensitive information over time, and some have persisted for many years before being detected and removed.
As a result, when we look at today's instantiation of related solutions, we find yesterday's answer, no longer reasonable in the new dynamic. Such systems suffer from poor long-term maintainability and an inflexibility that requires more than simple changes to overcome. Dominant solutions were rooted in technologies that required significant commitments to infrastructure not designed to work with today's dynamic cloud-based flexibility. For example, one from only a few years ago came with an almost 30-page prep guide and two weeks of on-site training, "highly recommended" before deploying for...trial consideration.
The result? File encryption as a means for protecting sensitive host application data became too expensive, too difficult, and not nearly effective enough to justify ongoing use. Investment in related technologies all but stopped somewhere around 2010, and vendors refocused teams on the emerging use of cloud technologies for data storage and collaboration.
As you investigate host data protection solutions, you may be surprised to discover how many new offerings fail to address these coverage gaps - though this isn't going to be easy to see offhand. But with time and diligence, you will find that much of this results from an architectural foundation that isn't suited for the task at hand. This very foundation most often prohibits delivery of additional core services necessary to support a complete solution effective against high-end threats encountered with motivated attackers, hackers for hire, and well-financed nation-states.
Much of this is simple logic: if and when you expose critical and fundamental cryptographic primitives on a host computer - or other details necessary for maintaining insight into access and use of protected/managed content - you are tempting fate: an attacker will eventually gain access and thus compromise the system (and/or alter access records to cover their tracks, complicating security event investigation).
That isn't to say such solutions don't provide value - they are capable of inhibiting low-level, high-frequency attacks that can and do impose considerable damage on corporate systems. Without viable alternatives, it's the best one can achieve while layering defenses with different types of systems at different touch points in the lifecycle of a sensitive data item. But without a solid architectural foundation and/or platform, the transition from stopping less-advanced threats to managing nation-state attacks becomes nearly insurmountable.
This is the general purpose and scope of this series of articles - the enumeration, description, and explanation of a service set foundation capable of delivering an effective response to threats against sensitive host application data. Judging by recent headlines, this seems a losing proposition for all but the most well-financed IT security teams in large corporations. We will describe the components of a suitable solution, available to most any organization, big or small, that requires neither in-house expertise nor considerable administrative attention.
Integrating Islands of Capability
So, cynics may say, big deal - software is written for all kinds of things, and this seems another set of challenges for the smart folks out in Silicon Valley. True in some respects, but these technologies require specialized talent, and assembling that talent in one place to tackle a broad value proposition requires startup capital that today isn't as readily available as it has been even in the recent past.
Take for example security technologies on the market today - many deliver technique-based products that focus on one or two aspects of a specific domain - Detection and Prevention, Vulnerability Assessment, Patch Management, Data Encryption, Event Correlation, and so forth. This is smart business - focus on core competency, keep from going astray, maintain constancy of purpose. Though suitable for individual companies, it results in a market of capability islands that do not naturally fit together. This in turn requires specialized IT service integration that is costly and complex, requires ongoing maintenance, and sometimes provides more opportunity for attackers than it prohibits.
This pushes smaller organizations with less budget flexibility into decisions that trade off risk to maintain a competitive pace. Many do little to nothing, or at best the minimum required for regulatory compliance. This squanders the very real potential to build an effective in-house security program on top of required regulatory structure. The resulting risk proposition sometimes pays out - in catastrophic and unrecoverable failure.
This is part of why we see a parade of massive data breaches, and when you look deeper, a long list of smaller companies that have been run out of business as a result of data compromise. Is any of this accidental? It's quite a tool for nation states to maintain their advantage when small organizations create real innovation with strong potential. Why build it when you can steal it then crush the innovators? It's not a Hollywood movie script anymore.
Data Protection Series
This series will continue by discussing essential requirements for effectively managing host application data. By the end, we hope to arm you with new tools to help separate the noise from the real solutions, and as a result improve your ability to identify qualified prospects.
In all forward pursuits, it's important to remember an immutable truth: you are going to be compromised, and information is going to be stolen. You can, however, minimize and manage the situation to reduce the cost to your business. That is the most reasonable and practical goal one can strive to achieve, and when you encounter those with aligned perspectives, we strongly believe you will quickly recognize the real value their technologies offer. It is then for you to decide how well each aligns with your needs. Contact our team by email at email@example.com if you think we can be of assistance.
And with that, we will leave you with a summary set of components and techniques we've brought to bear in the creation of our solution - SSProtect - which will provide the foundation for further review.
Technologies and Purpose
- Adaptive Filtering - Intercepts access to managed content, isolates content and begins the process that secures further activities
- :Access 2FA - Two-factor Authentication that manages identification and authentication as input to authorization
- Secure Communications - Optimized, secure networking designed for efficient cloud-offloading of sensitive cryptographic operations
- KODiAC Cloud Services - The patented methods that coordinate host execution, cloud operations, and secure transfer/use of results
- :Assess Auditing/Reporting - Cloud-managed audit information that records fine-grained details of each transaction, for reporting and analysis
- :Recover Storage - An optional service that stores version-based content in the cloud, with Zero Trust, for on-demand Restore
- :Collaborate Sharing - Policy-based collaboration and sharing permissions, managed by authorized personnel, to simplify end-use
- :Confidential In-Place Encryption - Host services that isolate decrypted content for application-independent - and exclusive - protected use
- :Respond Remediation/Analysis - Specialized, optional services designed to inhibit sabotage and provide objective disclosure risk insight
This article was published January 17th, 2019