How can your IT resilience plan incorporate big data analytics?
Big data analytics can help businesses spot past mistakes, forecast conditions and improve future strategies. So, why should resilience planning be any different?
Most organizations have yet to bring big data analytics into their risk management and disaster recovery planning. Incorporated into an IT resilience plan, however, the technology holds enormous potential for improving risk management and ensuring that companies are better prepared when disaster strikes.
Although big data analytics has undoubtedly proven its usefulness in the business world, it is easy to make the mistake of assuming that these types of analytics are only useful for spotting hidden business trends. Like financial applications, though, IT systems generate massive amounts of data, usually in the form of event logs. An organization can analyze its logging data much as it analyzes its business data.
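To make that parallel concrete, the short Python sketch below treats event logs the way a business analyst might treat sales records, tallying events by source and severity. The CSV layout, column names and file name are assumptions for illustration only, not a standard log format.

    # A minimal sketch of treating IT event logs as analyzable data.
    # The log format, file name and field layout below are hypothetical --
    # adapt them to whatever your systems actually emit.
    import csv
    from collections import Counter

    def summarize_events(log_path):
        """Count logged events by (source, severity), much like
        summarizing sales records by (region, product)."""
        counts = Counter()
        with open(log_path, newline="") as f:
            # assumed columns: timestamp, source, severity, message
            for row in csv.DictReader(f):
                counts[(row["source"], row["severity"])] += 1
        return counts

    if __name__ == "__main__":
        for (source, severity), n in summarize_events("events.csv").most_common(10):
            print(f"{source:20} {severity:8} {n}")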
This raises the question of how an organization might improve its ability to avoid potential risks or recover from a disaster based on the analysis of IT data. The key to seamlessly incorporating big data into an IT resilience plan is to take a cue from the business world.
Big data analytics has many different uses in business, but most of them fall into two basic categories. The first is spotting missed opportunities. An analysis of an organization's sales data might reveal, for example, that the organization can expect to sell more widgets if it cross-promotes those widgets with some other item.
The second is forecasting. An analytical engine might, for example, determine that current market conditions mimic those of the third quarter of 2016 and generate a sales forecast based on those similarities.
Applied to an IT resilience plan, the same concept offers clear value: using historical data to assess the current situation and to make predictions about the future. An analysis of historical logging data might reveal, for example, that a particular workload is more prone to failure than the organization's other workloads. If IT knows which workloads fail most often, it can prioritize its resources accordingly.
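As a rough illustration of that kind of analysis, the sketch below ranks workloads by their historical failure rate so that recovery resources can be directed at the most fragile ones first. The field names and sample records are hypothetical, not a prescribed schema.

    # A hypothetical sketch: rank workloads by failure rate computed from
    # historical log records, so DR resources can be prioritized.
    from collections import defaultdict

    def failure_rates(events):
        """events: iterable of dicts with 'workload' and 'event_type' keys."""
        totals = defaultdict(int)
        failures = defaultdict(int)
        for e in events:
            totals[e["workload"]] += 1
            if e["event_type"] == "failure":
                failures[e["workload"]] += 1
        # Highest failure rate first.
        return sorted(
            ((w, failures[w] / totals[w]) for w in totals),
            key=lambda item: item[1],
            reverse=True,
        )

    sample = [
        {"workload": "billing", "event_type": "failure"},
        {"workload": "billing", "event_type": "ok"},
        {"workload": "web", "event_type": "ok"},
        {"workload": "web", "event_type": "ok"},
    ]
    print(failure_rates(sample))  # billing fails more often, so it gets priority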
Risk intelligence provider NC4 currently uses this type of data for disaster recovery planning purposes. Recently acquired by Everbridge, NC4 aggregates big data to provide customers with actionable information to improve an IT resilience plan.
Logging data may also help predict an upcoming failure. If certain errors occurred in the weeks leading up to a past failure, then a recurrence of those errors -- which might otherwise go unnoticed -- could indicate that the failure is about to happen again. Armed with that warning, an organization can take corrective action and make sure that its backups are ready if a failure does occur.
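One simplified way to express that precursor idea in code is sketched below: it assumes a set of error codes that appeared before a past failure and raises a warning when all of them reappear within a recent window. The error codes, the 14-day window and the event format are illustrative assumptions rather than values any particular product uses.

    # A simplified sketch of the "recurring precursor" idea: if certain error
    # codes appeared in the weeks before a past failure, flag the system when
    # those same codes start showing up again.
    from datetime import datetime, timedelta

    PRECURSOR_CODES = {"DISK_RETRY", "CTRL_TIMEOUT"}  # assumed precursor errors
    WINDOW = timedelta(days=14)                       # assumed lookback window

    def failure_warning(recent_events, now=None):
        """recent_events: list of (timestamp, error_code) tuples."""
        now = now or datetime.utcnow()
        seen = {code for ts, code in recent_events
                if now - ts <= WINDOW and code in PRECURSOR_CODES}
        # If every known precursor has reappeared recently, raise a warning so
        # the team can verify backups and take corrective action ahead of time.
        return seen == PRECURSOR_CODES

    events = [(datetime.utcnow() - timedelta(days=3), "DISK_RETRY"),
              (datetime.utcnow() - timedelta(days=1), "CTRL_TIMEOUT")]
    print(failure_warning(events))  # True -> the precursor pattern is recurring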