Written by Danny Allan, VP of Product Strategy at Veeam
When it comes to artificial intelligence, most people are well aware of the tropes from popular entertainment: the malevolent computer, the android gone rogue. In reality, however, AI is already transforming numerous industries for the better.
Fields as varied as health care, education and public safety are leveraging artificial intelligence to automate some processes and optimize others, allowing humans to spend more time on their most meaningful work and less time managing rote tasks. And with global spending on AI technology expected to reach $46 billion by 2020, it’s likely to become not only prevalent but also crucial to modern business.
AI is especially poised to transform data management. In this space, true innovation isn’t just about capturing more data—it’s about having it always available, responding to it and separating signal from noise in order to improve processes and drive better business decisions. In the data center alone, AI and automation are giving companies new ways to meet the challenges of our age of information hyper-sprawl, the result of increasing adoption of the cloud and policies like BYOD.
AI and automation bring consistency to standards and compliance issues. They improve server efficiency and cut costs across the IT infrastructure. And crucially, they put businesses in a better position to understand and act on their data, however disparate or voluminous.
The Path to an AI Solution
Becoming a hyper-available, AI-assisted enterprise doesn’t happen overnight. Arriving at this point is a gradual process, although each step comes with substantial benefits. Artificial intelligence never starts with artificial intelligence. There are always algorithms up front, then machine learning, and eventually you arrive at AI. In some corners, it’s already extremely well-entrenched.
We use network edge detection systems to determine whether malware is attacking, taking a snapshot of the data to make a definitive determination without a human becoming involved. Contrary to how it’s often presented, investing in AI doesn’t mean removing humans from the equation. Though the technology is rapidly becoming more advanced, it would be a mistake to fully trust machines to handle complex processes, particularly those impacting availability.
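The kind of snapshot-based determination described above can be illustrated with a toy sketch. This is not Veeam’s implementation; it is a minimal, hypothetical example assuming snapshot metadata with `changed_bytes` and `total_bytes` fields, where a change rate far above the historical baseline is treated as a possible sign of malware encrypting data en masse.

```python
# Hypothetical sketch: flag a backup snapshot whose change rate is an
# outlier versus history. Real detection systems use far richer signals.
from statistics import mean, stdev

def change_rates(snapshots):
    """Fraction of data changed in each snapshot."""
    return [s["changed_bytes"] / s["total_bytes"] for s in snapshots]

def looks_suspicious(history, latest, threshold=3.0):
    """Flag the latest snapshot if its change rate exceeds the
    historical mean by more than `threshold` standard deviations."""
    rates = change_rates(history)
    baseline, spread = mean(rates), stdev(rates)
    latest_rate = latest["changed_bytes"] / latest["total_bytes"]
    return latest_rate > baseline + threshold * spread

# Five routine snapshots (2-3% daily change), then two new ones to test.
history = [{"changed_bytes": c, "total_bytes": 1000} for c in (20, 25, 22, 30, 27)]
normal = {"changed_bytes": 28, "total_bytes": 1000}
spike = {"changed_bytes": 900, "total_bytes": 1000}  # mass-encryption pattern

print(looks_suspicious(history, normal))  # routine change rate -> False
print(looks_suspicious(history, spike))   # anomalous spike -> True
```

The point of the sketch is that the check is purely statistical: no human has to look at the data for the system to raise a definitive flag, which is exactly where automation earns its keep.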
We think of AI in the province of data replication the same way you might think about a self-driving car. You wouldn’t want a network disruption, or a lost network data management protocol packet, to mean that all of a sudden the entire workload moves to the cloud. There still needs to be a person saying, ‘Yes, allow that to take place,’ with automation working up until that point.
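That division of labor, automation up to the decision point and a human saying yes, can be sketched in a few lines. The function and field names here are hypothetical stand-ins, not a real product API; the `approve` callable represents whatever form the operator’s confirmation takes, such as a console prompt or a ticketing-system check.

```python
# Hypothetical sketch: automation prepares a workload migration, but a
# human approver must explicitly confirm before anything executes.

def prepare_migration(workload):
    """Automation handles everything up to the decision point."""
    return {"workload": workload, "target": "cloud", "ready": True}

def migrate_with_approval(workload, approve):
    """Run the migration only if the human approver says yes.

    `approve` is a callable taking the prepared plan and returning
    True or False; in practice it would prompt an operator.
    """
    plan = prepare_migration(workload)
    if not plan["ready"]:
        return "not ready"
    if approve(plan):
        return f"migrated {plan['workload']} to {plan['target']}"
    return "held for operator review"

# A real system would ask a person; here we stub both answers.
print(migrate_with_approval("erp-db", approve=lambda plan: True))
print(migrate_with_approval("erp-db", approve=lambda plan: False))
```

The design choice worth noting is that the approval gate sits between preparation and execution, so everything before it can be fully automated without ever surrendering the final, availability-impacting decision.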
As automation evolves into artificial intelligence, it’s up to businesses to build data availability into their tools and processes from the beginning, rather than bolting it on after the fact. It’s a conversation that runs parallel to the world of cybersecurity, where defending a company’s perimeter from cyber attackers all too often becomes an afterthought.
But now, with philosophies like DevSecOps encouraging businesses to make security a baked-in part of software development, we’re beginning to see a shift in thinking. In data management, the same principle applies—particularly for businesses leveraging AI and machine learning as part of their availability strategy. As an industry, we’re at an inflection point where we’re beginning to build resiliency and dependability into our systems.
The first step is to recognize that it’s a journey. There are steps to go through and practices to learn along the way in order to achieve that resiliency. And it’s not just about recovering faster; it’s also about lowering costs for the organization and helping it deliver a better ROI.