“A consistent global file lock and concurrent burst-syncing to both the cloud and each local filer offer real-time collaboration while addressing issues around file bloat, versioning, and overwrites,” says Edward M.L. Peters, Chief Innovation Officer, Panzura, in an exclusive interview with ITSecurityWire.
ITSW Bureau: What were the vulnerabilities in the cloud file systems that the pandemic has presented to enterprises across the globe?
Edward M.L. Peters: Vulnerabilities in global file systems have not increased with the pandemic; rather, the overall attack surface has changed, increasing the need for cloud services that are engineered to prevent failure and ensure replication.
Enterprise companies in particular will need to rethink how they consume data itself. In fact, this thinking is already underway: data is increasingly processed closer to the edge, near the physical location of people and devices, as both become ever more geographically dispersed.
The pandemic has taught enterprises that workforce models are changing. Collaboration is driving the enterprise to adopt new cloud services, and bad actors naturally are attracted to the opportunities this provides.
Data protection, retention, and security are more important than ever, and IT departments are now expected to securely support remote work across any cloud configuration.
Global file systems are a good example. Multi-cloud services platforms need to allow people to collaborate in real time, unencumbered by notions of place or time.
While the weakest security links remain the entry points into a system, the frequency and success of attacks suggest a completely impenetrable first line of defense may not be possible.
Enterprises need look no further than ransomware and other malicious code attacks. Cloud-native file sharing and other hybrid-cloud services have to be engineered to avoid failure in the first place.
ITSW Bureau: Are there any initiatives that enterprises can take to maximize their data protection? If so, could you elaborate on them?
Edward M.L. Peters: With legacy NAS storage, enterprise data is protected through often complex and costly processes that combine replication and offsite data backup solutions for file restoration and disaster recovery. Enterprise hybrid cloud NAS can replace that system with one that is simple and cost-effective.
Rather than managing multiple copies of the data replicated in different locations, a single, authoritative version of the data is protected by the high availability of the cloud, which itself maintains replica copies both in region and across regions. That data is then incrementally protected through snapshots, which provide point-in-time recovery for anything from an individual file up to a complete file system.
Depending on snapshot frequency, this can provide near-instant recovery from file loss, human error, or a ransomware attack.
Since the cloud effectively has unlimited scalability, there is no practical limit to the frequency or number of snapshots that can be taken. Additionally, because new data and changes to existing data are written as net-new blocks in the cloud, protected by snapshots, existing data is not subject to corruption or infection by viruses or ransomware, and the file system can easily be reverted to a point prior to an attack.
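To make that concrete, here is a minimal Python sketch of the write-as-new-blocks model (a simplification with assumed names, not Panzura’s implementation): every write lands as a net-new immutable block, and recovery is simply a revert of the file-system view to an earlier snapshot.

```python
import hashlib
import time

class SnapshotStore:
    """Sketch of append-only block storage with point-in-time snapshots."""

    def __init__(self):
        self.blocks = {}      # content hash -> immutable block bytes
        self.snapshots = []   # (timestamp, {path: content hash}) pairs
        self.live = {}        # current view of the file system

    def write(self, path, data):
        # New data always lands as a net-new block; nothing is overwritten,
        # so ransomware that "encrypts" a file merely adds new blocks.
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)
        self.live[path] = digest

    def snapshot(self):
        self.snapshots.append((time.time(), dict(self.live)))

    def restore(self, before):
        """Revert the live view to the newest snapshot taken at or before `before`."""
        for taken_at, view in reversed(self.snapshots):
            if taken_at <= before:
                self.live = dict(view)
                return taken_at
        raise LookupError("no snapshot prior to the requested time")
```

Because earlier blocks are never mutated, a restore is just a pointer change back to a prior view, which is why recovery can be near-instant regardless of data volume.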
Enterprises should be wary of systems that rely on cloud encryption and security measures alone when connected to the cloud. Data bloat is a common side effect of improper application of encryption, with the end result being significant increases in storage costs: encrypted data loses the redundancy that deduplication and compression exploit, so those steps must come first. Deduplication and compression of data before sending it to the cloud is an effective strategy for eliminating cumbersome backup and DR processes.
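As a rough sketch of that ordering, the following deduplicates and compresses chunks at the source before anything is uploaded (the function and its inputs are illustrative assumptions, not a vendor API):

```python
import hashlib
import zlib

def prepare_upload(chunks, already_stored):
    """Deduplicate and compress raw chunks before they leave the edge.

    `already_stored` is the set of content hashes the cloud already holds.
    Dedup and compression must run on plaintext: once data is encrypted,
    its redundancy is gone and neither step saves anything.
    """
    to_send = {}
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in already_stored or digest in to_send:
            continue  # duplicate chunk: reference the existing block instead
        to_send[digest] = zlib.compress(chunk, 6)
    return to_send
```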
ITSW Bureau: How can enterprises leverage cloud-based collaboration and file transfer as a foundation for a competitive advantage?
Edward M.L. Peters: Cloud-based collaboration requires a solution that enforces how and when files are available across the network independent of geography, regardless of the size of the files or their origin.
Failure to do this results in team members overwriting each other’s work or creating file versions that must be merged; this is a common fault of systems that lack the infrastructure to ensure thoughtful access to shared files.
While most providers offer global file locking, it is important that each filer is no more than 60 milliseconds from the cloud locking platform. Beyond that latency, advanced locking and collaboration features are not consistently enforced, resulting in file versioning problems.
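A simple pre-deployment check in that spirit might measure the round trip from a filer to the locking endpoint (the hostname below is hypothetical, and a plain TCP probe stands in for the real locking protocol):

```python
import socket
import time

MAX_LATENCY_MS = 60  # beyond this, consistent lock enforcement degrades

def lock_service_latency_ms(host, port, samples=5):
    """Median TCP round-trip time to the cloud locking platform, in ms."""
    rtts = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        rtts.append((time.perf_counter() - start) * 1000)
    rtts.sort()
    return rtts[len(rtts) // 2]

# "locks.example.net" stands in for whatever lock endpoint a deployment uses.
latency = lock_service_latency_ms("locks.example.net", 443)
if latency > MAX_LATENCY_MS:
    print(f"WARNING: {latency:.0f} ms to the lock service; "
          f"file locks may not be enforced consistently")
```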
Organizations must satisfy the need of every team member to open and work with files while simultaneously seeing changes made by others, regardless of physical location. And the cloud file solution must remain fast and effective as more sites are added to the network. Deduplication and compression of all data at the edge, before transferring it to the cloud, is essential: it improves the pace of synchronization and is less taxing on bandwidth.
Lastly, determining value should be a clear-cut decision based on product capabilities and features that directly impact productivity gains throughout the organization.
The time to open files is paramount, especially as your organization becomes more spread out or moves to permanent remote work. Ten or twenty minutes per file is an untenable amount of time when you calculate the compounded cost of an engineer’s time.
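A back-of-the-envelope calculation, with every figure assumed purely for illustration, shows how quickly those minutes compound:

```python
# All inputs are assumed figures for illustration only.
engineers = 50          # team size
opens_per_day = 6       # large-file opens per engineer per day
minutes_per_open = 10   # time to open one large file over a slow link

hours_lost_per_day = engineers * opens_per_day * minutes_per_open / 60
print(f"{hours_lost_per_day:.0f} engineer-hours lost per day")  # -> 50
```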
Changes to cloud files must also be synchronized with other locations for efficient workflows and collaboration.
The real return on investment for customers is the operational efficiencies gained by utilizing a cloud-based global file system. Even the largest files should take seconds to open, regardless of user location. A consistent global file lock and concurrent burst-syncing to both the cloud and each local filer offer real-time collaboration while addressing issues around file bloat, versioning, and overwrites.
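A minimal sketch of that concurrent fan-out, assuming each sync target exposes a hypothetical put() method (illustrative only, not Panzura’s API):

```python
from concurrent.futures import ThreadPoolExecutor

def burst_sync(changed_blocks, cloud, filers):
    """Push changed blocks to the cloud and every local filer concurrently,
    so all sites converge on the same file version at roughly the same time."""
    targets = [cloud, *filers]
    with ThreadPoolExecutor(max_workers=len(targets)) as pool:
        futures = [pool.submit(target.put, changed_blocks) for target in targets]
        for future in futures:
            future.result()  # surface any transfer failure immediately
```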
ITSW Bureau: What trends do you see that will shape the cloud security landscape?
Edward M.L. Peters: It’s hard to quantify the impact of the last year on IT infrastructure and the light it has shone on cloud-based technologies and overall resilience as a business imperative. A Gartner analysis of 2021 cloud priorities named “distributed cloud” as a primary focus, and the security implications of this move cannot be downplayed.
Otherwise known as the hybrid cloud, the distributed cloud allows an organization to migrate processes to a combination of public and private clouds. This model requires a security posture that compels companies to adopt a shared-responsibility security model. This is not new in 2021, but it bears repetition as organizations become more location-independent for the long term.
Additionally, data management will become more demand-driven in the year ahead. This will require a change in mindset, away from thinking about data as a monolith and toward what types of data teams need. What are they using most, and how can IT leaders provision the transfer of that data?
Understanding the demand for data, not just how many users are supported or what capacity is needed, will be the hallmark of a new demand-driven model that will begin to drive both services and pricing. More data means tools are needed that provide a more granular view of data usage, down to individual files.
Edward M.L. Peters, Ph.D. is Chief Innovation Officer for Panzura, where he leads teams that conceptualize, plan, build and deliver the most innovative data management products on the planet. Formerly CEO of OpenConnect Systems and DataDirect Technologies, Inc., Dr. Peters is an expert in leveraging the combination of data and predictive analytics to digitally transform businesses and has been published in peer-reviewed scientific journals as well as the popular press (Forbes, The Financial Times, The Hill).