Securing Enterprise Data in the Remote Work Era With DSaaS

Enterprises should consider implementing query-level data consumption controls through data security-as-a-service (DSaaS) solutions: cloud data warehouses lack data consumption controls, and because they are entirely virtualized, data-loss prevention (DLP) and endpoint protection offer inadequate coverage.

Insecure personal devices, according to Deloitte, connect daily to networks in more than 30 percent of U.S., U.K., and German companies. The potential increased tempo of insider threats, given that most breach incidents originate from employees themselves, has brought IT interventions into sharper focus. As IT departments scramble to enable remote work productivity while avoiding excessive risk, a slate of data security issues has emerged.

Companies have limited control, and in some cases no control, over the remote computing environment of employees, partners, and customers who have access to their networks. Consider the reported spike in brute-force attacks against remote desktop protocol (RDP) amid the Covid-19 crisis, and "Zoom-bombing" as workers began relying to a much greater extent on video conferencing and collaboration tools. Cloud-based unified communications and collaboration platforms such as Microsoft Teams and Cisco Webex are targets for large-scale attempts to gain access to credentials using stolen or compromised identity details.

As enterprises look to modulate and adapt their IT environments in the post-Covid era, where enablement of secure remote access will be a standard and integral aspect of a "new normal," security teams will need to institutionalize many of the policies and processes that they hastily rolled out in the early weeks of the pandemic.

This increased attack surface means legacy data protection systems are particularly problematic because they were not designed for the cloud in the first place. Stay-at-home orders have accelerated the migration to the cloud or a hybrid cloud environment as companies fast-forward digital transformation. A secure network that supports a remote workforce, from streaming platforms to collaboration tools to data access, is essential. This often includes building a data warehouse in the cloud to unite data sources and maintain the semantic richness of mission-critical information.

Companies like Amazon, Google, and Snowflake offer frictionless, inexpensive storage of vast quantities of data, but the anatomy of a cloud data warehouse (CDW) is no less vulnerable to exploitation than on-premises systems. Firewalls and "border security" protect the perimeter and make it easy to authenticate remote users before they are allowed inside the gate of a CDW; however, the lack of data access controls and tracking is a proverbial "chink in the armor" that exposes key business information to malicious actors.

Lack of Controls Around Data Consumption Widens the Threat Window

Enterprise companies have already invested heavily in security. In fact, worldwide security spending has surpassed $100 billion annually and is expected to grow to $170 billion by 2022. Security teams are adapting their data protection posture toward a zero-trust stance with stronger consumption governance and observability models that can readily detect and respond to anomalous behavior.

However, these organizations still lack basic visibility into and control over the sensitive data they collect and consume. This prevents them from understanding how their organization uses data, and hampers their ability to leverage consumption patterns to reshuffle priorities and rebalance resources toward a more resilient corporate posture.

Meanwhile, a lack of control around data consumption means that while companies may have implemented controls over who is able to access data and what data they are allowed to access, they have not closed a critical gap: how much data a credentialed request is allowed to consume.

These two factors, an inability to understand enterprise data consumption and a lack of control over how much data can be consumed, create a perfect storm for today's enterprise. The fact is, credentialed requests for data can often consume without limits, opening up a level of risk that puts entire companies and computing infrastructures at stake. With the rapid changes demanded by remote access and workflows, the urgency of closing this gap has only grown.
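
To make the gap concrete, the sketch below shows one hypothetical way a per-request consumption cap could be enforced on a credentialed query. The role names, limits, `ConsumptionLimitExceeded` error, and `execute_on_warehouse` callable are illustrative assumptions, not any particular vendor's API.

```python
# Hypothetical sketch: cap how much data a single credentialed request may consume.
# Role names, limits, and the execute_on_warehouse callable are illustrative assumptions.

MAX_ROWS_PER_ROLE = {
    "analyst": 50_000,          # routine reporting queries
    "data_scientist": 500_000,  # heavier exploratory workloads
    "service_account": 10_000,  # narrow, predictable integrations
}

class ConsumptionLimitExceeded(Exception):
    """Raised when a credentialed request asks for more data than its role allows."""

def governed_query(role: str, sql: str, execute_on_warehouse):
    """Run a query, but stop once the role's per-request consumption cap is exceeded."""
    limit = MAX_ROWS_PER_ROLE.get(role, 1_000)   # unknown roles get a conservative default
    rows = []
    for row in execute_on_warehouse(sql):        # assumed to yield result rows lazily
        if len(rows) >= limit:
            raise ConsumptionLimitExceeded(
                f"role '{role}' exceeded its cap of {limit} rows in a single request"
            )
        rows.append(row)
    return rows
```

The point of the sketch is that the cap applies to the request itself, regardless of whose valid credential issued it.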

Limits on the consumption of data are critical, and organizations that fail to manage this aspect of security are at risk. Evidence suggests, in fact, that credentialed access is the most common way data is exploited. A recent Verizon report indicates that insiders play a role in nearly one out of every three breaches of cloud-based data, and more than 80 percent of attacks in general.

While the financial and reputational impacts are difficult to precisely quantify, they are significant and wide-ranging. From an IT perspective alone, costly staff and other key resources must be diverted away from other strategic imperatives. Projects can be delayed, morale can wane, and recovery can consume considerable resources, time and effort.

Mitigating Risk With Data Consumption Governance

Data access requests usually let credentialed users consume data without limits. But more and more, that type of open access based on identity alone is problematic because it leaves an organization vulnerable to virtually any type of failure, including insider threats and SQL injection attacks.
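
A brief, illustrative sketch of that exposure: the table name and report query below are hypothetical, but they show how a fully credentialed session with no consumption cap can be widened, via SQL injection, to return an entire table.

```python
# Illustrative only: how a credentialed session can be abused through SQL injection
# when nothing limits how much data a single request may return.
# The orders table and report query are hypothetical.

def build_report_query(customer_name: str) -> str:
    # Vulnerable pattern: user input concatenated directly into SQL.
    return f"SELECT * FROM orders WHERE customer = '{customer_name}'"

injected = build_report_query("x' OR '1'='1")
print(injected)
# SELECT * FROM orders WHERE customer = 'x' OR '1'='1'
# A valid credential now pulls every row in the table; a consumption cap
# would at least bound how much data the injected query can return.
```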

To mitigate these risks, observability and control over data consumption are prudent measures because they employ the security principles of "need to know" and "least privilege." Ideally, visibility and control would treat data the same way credit card companies treat access to money. Companies are increasingly turning to technology that recognizes abnormal data consumption and thereby limits data loss when unauthorized access inevitably occurs. Research firm Omdia, formerly IHS Markit, describes the steady trend toward SaaS-based data-access control systems as a consistent technological development that avoids the need to invest in costly IT infrastructure.
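
As a rough sketch of that credit card analogy, the snippet below compares a query's consumption against a user's historical baseline and flags outliers. The ten-query minimum history, three-sigma threshold, and fallback cap are illustrative assumptions, not a prescribed policy.

```python
# Minimal sketch of "credit card style" anomaly detection on data consumption.
# Baselines, thresholds, and the usage history format are assumptions for illustration.
from statistics import mean, stdev

def is_consumption_anomalous(user_history_rows, rows_this_query, sigma=3.0):
    """Flag a query whose row consumption is far outside the user's normal pattern."""
    if len(user_history_rows) < 10:           # not enough history: fall back to a fixed cap
        return rows_this_query > 100_000
    baseline = mean(user_history_rows)
    spread = stdev(user_history_rows) or 1.0  # avoid zero spread for very regular users
    return rows_this_query > baseline + sigma * spread

# Example: a user who normally reads a few thousand rows suddenly pulls two million.
history = [2_400, 3_100, 2_800, 2_950, 3_300, 2_700, 3_050, 2_600, 2_900, 3_150]
print(is_consumption_anomalous(history, 2_000_000))  # True -> restrict or alert
```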

Data solutions deployed via SaaS are well aligned with cloud and CDW initiatives, because they facilitate interoperability and “new normal” collaboration. They also have the potential to address the current gap between the consumer and commercial sectors in terms of IoT sensors and other data sources, as well as analytics and business intelligence tools, which require responsive governance and access controls to drive change.  

Data security-as-a-service (DSaaS) is catalyzing this shift. Similar to how credit card providers monitor activity and protect cardholders from fraud, this approach to data protection observes data flows, recognizes aberrant patterns when data consumption exceeds preset thresholds or falls outside a user's normal parameters, and restricts or curtails abnormal consumption in real time.

Cloud data warehouses, such as Snowflake and Amazon Redshift, provide for user permissions and protect at-rest data, and access control can be managed with single sign-on (SSO) solutions like Okta. However, data access governance is problematic at best. CDWs lack data consumption controls, and because they are entirely virtualized, data-loss prevention (DLP) and endpoint protection around the data stack are inadequate to protect data from credentialed users. Older technologies such as proxies, while tempting at first glance, hinder performance and still leave data exposed.
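
To illustrate the gap, the sketch below uses Snowflake-style grants (database, schema, table, and role names are placeholders): grants decide who may read a table, but once granted, the warehouse treats a routine report and a bulk pull from the same credential identically.

```python
# Snowflake-style grants (illustrative; database, schema, table, and role names
# are placeholders) control WHO may read data, not HOW MUCH a session may read.

GRANTS = """
GRANT USAGE ON DATABASE analytics TO ROLE analyst;
GRANT USAGE ON SCHEMA analytics.sales TO ROLE analyst;
GRANT SELECT ON TABLE analytics.sales.orders TO ROLE analyst;
"""

# Once those grants exist, the warehouse sees no difference between these two requests
# issued by the same credential -- one routine, one effectively a bulk exfiltration:
ROUTINE_REPORT = "SELECT order_id, total FROM analytics.sales.orders WHERE order_date = CURRENT_DATE;"
BULK_PULL = "SELECT * FROM analytics.sales.orders;"
```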

SaaS and cloud-based solutions that offer observability at the query layer between applications and the CDW, and provide some level of control over data consumption, represent a crucial step forward. Working in parallel with the CDW itself, this approach is abstracted from the underlying infrastructure, meaning it places data access governance and observability into the context of data consumption without inhibiting the performance and scalability of the CDW.
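
As a minimal sketch of what observability at that query layer could look like, the snippet below records who asked, what ran, and how much data came back for every request. The record fields and the `execute` and `emit` callables are assumptions for illustration, not a specific product's interface.

```python
# Sketch of query-level observability alongside a CDW: every request is logged
# with who asked, what ran, and how much data came back, without proxying traffic.
# The emit() destination and field names are assumptions for illustration.
import json
import time

def observe_query(user: str, sql: str, execute, emit=print):
    """Execute a query and emit a consumption record that governance tooling can act on."""
    started = time.time()
    rows = list(execute(sql))                 # assumed callable returning an iterable of rows
    emit(json.dumps({
        "user": user,
        "query": sql,
        "rows_returned": len(rows),
        "duration_s": round(time.time() - started, 3),
        "timestamp": int(started),
    }))
    return rows
```

Records like these are what allow consumption baselines and caps to be applied per user and per query rather than per network perimeter.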

Observability is a key attribute of a model that places contextual filters around consumption patterns across all sensitive data, whether PII, PHI, or PCI.

By implementing data consumption governance, based on observability of data usage patterns at the query level that underpins every cloud store, enterprises can understand how their organization uses sensitive data. That understanding is the predicate for protecting data, solidifying their competitive position, and confidently advancing a future-proof digital stance no matter what "new normal" permutations lie ahead.
