Data Resiliency: The Strategic Foundation for AI After Commvault SHIFT

The Commvault SHIFT London event highlighted that data resiliency is essential for effective AI and BI strategies. We need to consider what we do with the data that AI produces, not just the data sources. Without a strong data foundation, businesses risk poor decisions and operational failures. The event advocated for integrating security and data management, deploying tools like Commvault Cloud Unity, and continuously testing recovery plans to ensure organisational resilience.

The Commvault SHIFT London event on 25 February 2026 presented a clear message to the technology industry and business leaders. Data resiliency is the core requirement for any functional Artificial Intelligence (AI) or Business Intelligence (BI) strategy. Without a resilient data foundation, these technologies are unreliable.

The current focus on AI deployment often ignores the underlying infrastructure, because few people want to think about the data. Organisations invest in AI, whether Copilots, large language models (LLMs), or predictive analytics, while their data management remains fragmented. This is a mistake. If you take one thing away from this post, let it be this:

Data is the prerequisite for AI, Business Intelligence, and good decisions.

It provides the context and the factual basis for every output. When data is compromised, the AI's outputs become symptoms of a deeper problem, leading to bad decisions. For business teams, it is often unclear where to identify and cauterise the liability before it spreads.

Businesses often see AI as the goal or endpoint of a project or programme. However, this view is not strategic enough. What do you do with the data that your AI models produce? The process is not end-to-end; it is start-to-end-and-start-again. It is cyclical and iterative rather than linear, and data resiliency must cover that cycle, treating data as a product as well as a source.

The Navigation Analogy: Flying Without GPS

Relying on AI without data resiliency is like flying a plane without a GPS. The plane has powerful engines and sophisticated aerodynamics. These represent the AI models. However, without a GPS, the pilot does not know the current location or the correct heading.

In a business context, data resiliency is the GPS. It ensures that the information feeding the AI is accurate, available, and untainted. If a cyberattack occurs or a system fails, the GPS stops working. The "plane" continues to fly, but the decisions made by the pilot are guesses. Strategic and operational decisions are impossible without a reliable data stream. A business that loses its data foundation loses its ability to navigate the market strategically, or even serve its customers well from an operational standpoint.

Circular data resilience loop showing data sources feeding AI processing and analysis, producing outcomes that generate new data for the next cycle.

Data Resiliency is a Strategic Asset

Strategic decisions rely on long-term trends and historical accuracy. Operational decisions rely on real-time availability. Both require data resiliency. At the Commvault SHIFT event, the discussion centred on how data protection is now a strategic function. It is no longer just a task for the IT department in the basement.

The integrity of data impacts the entire organisation. For example, if a financial institution suffers a ransomware attack that encrypts its historical records, its AI-driven risk assessment tools become useless. The model cannot learn from data it cannot access. This forces the organisation to revert to manual processes, which are slower and more prone to error.

Data resiliency prevents this regression, which usually reverts to Excel Hell. It ensures that the business remains in an AI-ready state even during a crisis.

Identity Resilience and Active Directory

A major technical highlight at Commvault SHIFT was the focus on identity resilience. Most cyberattacks target identity systems to gain elevated privileges. Active Directory (AD) is a primary target. If a company loses its Active Directory, it loses the ability to manage users and permissions, blocking data access for both humans and AI.

Commvault emphasises that protecting the data is useless if the identity layer is gone. Recovery must include the identity infrastructure. The event introduced specific enhancements for Active Directory protection. This involves more than just backing up the database. It requires the ability to recover the entire AD forest in a clean environment. This is a requirement for modern business continuity. Without identity, there is no access. Without access, there is no AI.

ResOps: A New Operational Paradigm

The event introduced the concept of Resilience Operations, or "ResOps." This is a shift in how companies organise their technical teams. Historically, security teams and data management teams worked in silos. Security teams focused on blocking threats. Data teams focused on storage and availability.

These silos cause delays during a recovery. ResOps brings these teams together under a single operational umbrella. This collaboration is necessary because a cyber incident is both a security breach and a data loss event. By integrating these functions, companies improve their response times.

A critical metric in this area is MTCR, or Mean Time to Clean Recovery. While MTTR (Mean Time to Repair) is a common IT metric, MTCR focuses specifically on how long it takes to recover clean data after a cyberattack. ResOps aims to lower this metric by automating the handoffs between security alerts and data recovery workflows.
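To make the distinction concrete, here is a minimal sketch of the two clocks. MTTR stops when the service responds again; MTCR stops only when verified clean data is back in production. All timestamps below are illustrative, not figures from the event:

```python
from datetime import datetime

# Hypothetical incident timeline (timestamps are illustrative only).
detected = datetime(2026, 2, 25, 2, 0)           # security alert raised
service_restored = datetime(2026, 2, 25, 6, 30)  # systems responding again
clean_restored = datetime(2026, 2, 25, 10, 0)    # verified clean data back in production

# MTTR stops when the service is back; MTCR stops when the data is proven clean.
mttr_hours = (service_restored - detected).total_seconds() / 3600
mtcr_hours = (clean_restored - detected).total_seconds() / 3600

print(f"MTTR: {mttr_hours:.1f}h, MTCR: {mtcr_hours:.1f}h")
```

The gap between the two numbers is the window in which a business might be running on tainted data, which is exactly what ResOps tries to shrink.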

Security and data teams collaborating on a unified ResOps strategy using a shared interactive data interface.

Commvault Cloud Unity and Data Rooms

The Commvault Cloud Unity platform is the central hub for this new approach. It provides a single view of data across hybrid environments. It works with HyperScale X for on-premises data and HyperScale Flex for public cloud. The consistency of this platform is its main advantage. It treats training data, models, and embeddings as critical assets.

One specific feature discussed was the use of "Data Rooms." These are isolated environments used for safe data activation. In a typical recovery scenario, there is a risk of re-infecting systems with dormant malware. Data Rooms allow organisations to restore data into a secure space first. Here, they can scan, test, and verify the data before it enters the production environment. This is a practical application of the "cleanroom" concept. It allows a business to resume operations with confidence that the data is not compromised.
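A minimal sketch of that verification gate, with toy scan results standing in for real anti-malware and integrity tooling (the names and structure are hypothetical, not a Commvault API):

```python
# Toy verification results for assets restored into an isolated Data Room.
SCAN_RESULTS = {
    "orders.db":      {"malware": False, "checksum_ok": True},
    "models.bin":     {"malware": True,  "checksum_ok": True},  # dormant malware caught in isolation
    "embeddings.idx": {"malware": False, "checksum_ok": True},
}

def promotable(results):
    """Assets allowed to leave the Data Room: clean scan AND intact checksum."""
    return [name for name, r in results.items()
            if not r["malware"] and r["checksum_ok"]]

def blocked(results):
    """Assets that stay quarantined for remediation."""
    return [name for name, r in results.items()
            if r["malware"] or not r["checksum_ok"]]

print("promote:", promotable(SCAN_RESULTS))
print("quarantine:", blocked(SCAN_RESULTS))
```

The point of the pattern is that nothing reaches production by default; each asset must pass every check inside the isolated environment first.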

Planning and Testing are Mandatory

A backup is just a copy of data. A recovery is a successful business outcome.

Resiliency is a state you maintain through planning and testing. The SHIFT event highlighted that many companies have backup plans but few have verified recovery plans. 

Regular testing is the only way to ensure a recovery plan works. This includes testing the "Recovery Runbooks" which act as blueprints for the application architecture. If an AI application depends on three different databases and an identity provider, the runbook must account for all of them. Testing these dependencies reveals gaps that are often invisible during normal operations.
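To make the dependency point concrete, a runbook can be treated as data and queried for gaps. The sketch below flags dependencies that have never had a verified test restore; the application name, fields, and dates are hypothetical:

```python
# Hypothetical Recovery Runbook for an AI application (illustrative, not a real artefact).
runbook = {
    "application": "ai-risk-scoring",
    "dependencies": [
        {"name": "customers-db",    "type": "database", "last_verified_restore": "2026-02-20"},
        {"name": "transactions-db", "type": "database", "last_verified_restore": None},
        {"name": "features-db",     "type": "database", "last_verified_restore": "2026-02-18"},
        {"name": "corp-ad",         "type": "identity", "last_verified_restore": "2026-02-22"},
    ],
}

def untested_dependencies(rb):
    """Dependencies with no verified restore: the gaps invisible in normal operations."""
    return [d["name"] for d in rb["dependencies"] if d["last_verified_restore"] is None]

print(untested_dependencies(runbook))
```

Even a toy check like this surfaces the failure mode described above: the application looks protected, yet one database it depends on has never been restored in anger.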

I often see organisations confuse recovery measures with having a backup. There is a technology bias which assumes the technology will work when needed. However, data resiliency requires constant validation. For more on moving beyond basic setups, see my earlier article on escaping the AI pilot trap.

Isometric blueprint of a secure data cleanroom for verifying and testing resilient cloud infrastructure.

The Impact on People and Culture

The move toward data resiliency requires a change in company culture, from a reactive mindset to a proactive one. This is often the hardest part of the transition. People must change how they view data risk and understand that a backup alone is not enough.

When security and data teams work together, they share responsibility for the outcome. This breaks down the "blame culture" that often follows a data breach. Instead of the security team blaming the data team for poor backups, or the data team blaming security for a breach, they collaborate on a unified ResOps strategy.

This cultural shift impacts leadership, because executives need to understand that data resiliency is a prerequisite for their AI ambitions. If they want the benefits of Business Intelligence, they must support the underlying resiliency through executive sponsorship, viewing it as a business insurance policy for the AI age. You can read more about the risks of unmanaged data in my previous post about Excel Hell and data automation.

Concrete Facts from Commvault SHIFT

The event provided several specific data points and product updates:

  1. Commvault Cloud Unity now integrates identity activity and threat signals into a single dashboard.
  2. Cleanroom Recovery now supports automated recovery of Active Directory forests.
  3. HyperScale Edge extends data protection to remote locations, ensuring AI data at the edge is as resilient as data in the data center.
  4. Synthetic AI Recovery uses telemetry to identify the cleanest backup snapshots, reducing the time spent searching for uncorrupted data.
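The snapshot-selection idea in point 4 can be sketched as picking the newest backup taken before the first anomaly signal in the telemetry. The data, timestamps, and layout below are illustrative assumptions, not Commvault's actual implementation:

```python
# Snapshots as (ISO-8601 timestamp, id) pairs, oldest first; values are illustrative.
snapshots = [
    ("2026-02-21T00:00", "snap-101"),
    ("2026-02-22T00:00", "snap-102"),
    ("2026-02-23T00:00", "snap-103"),
    ("2026-02-24T00:00", "snap-104"),
]
first_anomaly = "2026-02-23T14:00"  # e.g. an entropy spike hinting that encryption began

def cleanest_snapshot(snaps, anomaly_ts):
    """Newest snapshot strictly before the anomaly (ISO-8601 strings sort lexically)."""
    candidates = [s for s in snaps if s[0] < anomaly_ts]
    return max(candidates)[1] if candidates else None

print(cleanest_snapshot(snapshots, first_anomaly))
```

Automating this choice removes the slow manual hunt through backup history for the last uncorrupted copy.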

These updates show a move toward automation. Manual recovery is too slow for the modern threat environment. AI itself is being used to protect the data that AI requires.

Conclusion

Data matters, and data resiliency is the foundation of the modern enterprise. The Commvault SHIFT event in London was music to my ears: the industry is moving toward a unified approach that follows data from start to finish and then around to the next start, not just start to finish. This approach goes beyond data protection and identity resilience towards integrated operations.

AI and BI are powerful tools, but they are only as good as the data that supports them. Without a plan for resiliency, these tools are fragile. Organisations must prioritise their data foundation. They must break down silos between teams and implement platforms like Commvault Cloud Unity. They must focus on metrics like MTCR and commit to regular testing.

The analogy remains true. You can have the most advanced plane in the world, but if you cannot navigate, you will never reach your destination. Data resiliency is the navigation system for the AI-driven future.

For further insights into how data strategy impacts your visibility and decision-making, see our discussion on Power BI as a spotlight or a filter.
