Australian business embrace of AI drives home the need for resilience

By George Dragatsis (pictured), ANZ Chief Technology Officer at Hitachi Vantara

We know data is a valuable asset: it’s foundational to business intelligence and insight, and its value to organisations is increasingly judged by its utility as training data for Artificial Intelligence (AI) models. That value is also reinforced by the variety and volume of attempts by bad actors to gain access to it.

This has led some observers to try to put a price tag on data. The form of that “pricing” varies: it could be expressed in terms of the cost of working with less structured or “unclean” data; of the relative ease of access to different sources or repositories; or, in the case of security, of the cost of losing that data in a breach.

All of which is to say that value is intrinsic to any conversation about data today – and that preserving that value is paramount.

Behind the scenes, there are real and present concerns about how AI’s computational intensity might impact data infrastructure. It’s an issue that’s top of mind for some of the largest data centre operators in Australia and the region, as well as for enterprise and government users of AI and similar advanced data analysis techniques.

That’s driving a conversation about an ‘unbreakable data infrastructure’ – unbreakable in the sense that it’s resilient to the load AI puts on it, resilient to bad-faith attempts at data access, and resilient over time.

Approaching unbreakable data infrastructure design

Building an unbreakable data infrastructure is a complex challenge involving many internal and external variables. While it is impossible to control all of them, it is possible to reach a state that is “secure” within the context of a landscape that is changing continually and rapidly.

When designing data protection policies and an unbreakable data infrastructure, the first question to answer is ‘what is good enough?’ There is no shortage of techniques and technologies that can be deployed to store and protect data, but the costs of doing so may not be appropriate for every data asset. So, a logical first step in designing an unbreakable data infrastructure is to identify data assets, the risk profile of each asset, and who has access to them.
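As a minimal sketch of what that first step might look like in practice – assuming a simple three-tier risk model, with asset names and tiers that are purely illustrative – a data-asset register could map each asset’s risk profile to the minimum protections it warrants:

```python
from dataclasses import dataclass, field

# A minimal data-asset register. The three-tier risk model, asset
# names and owners below are illustrative assumptions, not a standard.
@dataclass
class DataAsset:
    name: str
    risk_tier: str                      # "high", "medium" or "low"
    owners: list[str] = field(default_factory=list)

def controls_for(asset: DataAsset) -> list[str]:
    """Map a risk tier to the minimum set of protections it warrants."""
    baseline = ["regular backups", "network security"]
    if asset.risk_tier == "high":
        return baseline + ["immutable copy", "encryption", "granular access control"]
    if asset.risk_tier == "medium":
        return baseline + ["encryption"]
    return baseline

register = [
    DataAsset("customer_pii", "high", owners=["crm_team"]),
    DataAsset("marketing_assets", "low", owners=["marketing"]),
]
for asset in register:
    print(f"{asset.name}: {', '.join(controls_for(asset))}")
```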

From there, organisations should look to build a multi-layered defence that addresses both technical and strategic considerations.

The first line of defence is strong technical safeguards. These may include maintaining synchronised copies of data in geographically dispersed locations; having a reliable safety net in place in the form of regular backups, including a tamper-proof copy; encrypting data at rest and in transit; enforcing granular controls for data access; and using network security to protect against unauthorised access attempts and malicious activities.
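To make one of those safeguards concrete, here is a hedged sketch of encrypting data at rest using the Fernet recipe from the Python cryptography library; in a real deployment the key would live in a key-management service rather than alongside the data, and key management is deliberately out of scope here:

```python
from cryptography.fernet import Fernet

# Minimal sketch: authenticated symmetric encryption of a data asset
# at rest. The payload is a placeholder; real keys belong in a KMS.
key = Fernet.generate_key()
fernet = Fernet(key)

ciphertext = fernet.encrypt(b"quarterly customer records")
assert fernet.decrypt(ciphertext) == b"quarterly customer records"
```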

Technical safeguards are a critical foundation for data security, but they are not enough on their own. To build a truly secure data environment, organisations need to establish a strong security culture with well-defined policies and procedures.

Executive leadership must champion data security by fostering a culture of awareness and accountability within the organisation. In doing so, they can drive cross-functional collaboration between IT, security teams, and business units to ensure alignment and effective implementation of data protection strategies. In addition, a culture of continuous learning, improvement, and knowledge sharing should be encouraged, so the organisation can stay abreast of evolving threats and best practices.

AI might help; a hybrid cloud infrastructure definitely will

While AI is contributing greatly to data growth and to the requirement for more resilient data infrastructure, it can also play a crucial role in data protection. AI-powered systems can analyse vast amounts of data to identify anomalies and suspicious activities in real time, enabling faster and more effective threat detection and response. AI can also be used to anonymise sensitive data while preserving its utility for analytics, helping ensure compliance with data privacy regulations.
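As a hedged illustration of that anomaly-detection idea, the sketch below uses scikit-learn’s IsolationForest on fabricated access-log features (requests per minute and megabytes transferred); the features, values, and contamination rate are assumptions for demonstration, not a production design:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic access-log features: [requests_per_minute, mb_transferred].
# All values are fabricated purely to illustrate the technique.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[20, 5], scale=[5, 2], size=(500, 2))
suspicious = np.array([[400, 900]])        # e.g. a bulk-exfiltration pattern
events = np.vstack([normal, suspicious])

# Train on presumed-normal traffic, then flag outliers (-1 = anomaly).
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
flags = model.predict(events)
print("anomalous rows:", np.where(flags == -1)[0])
```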

However, a bigger force multiplier for an ‘unbreakable data infrastructure’ is the use of hybrid cloud.

A hybrid cloud infrastructure offers several advantages for building a secure and proactive approach to data, starting with flexibility and scalability. Organisations can leverage the scalability and cost-effectiveness of public cloud for non-critical data, while keeping sensitive information in secure private cloud environments. They can also enhance data protection by implementing robust security controls across both public and private cloud environments, providing a layered defence against cyber threats. In addition, hybrid clouds can equip leaders to ramp up data discovery and recovery when it matters most: by replicating data across different cloud environments, organisations can ensure faster recovery times in case of disruptions or outages.
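As a minimal sketch of such a placement policy – with site names and the sensitivity flag as purely hypothetical examples – data could be routed and replicated according to its classification:

```python
# Hedged sketch of a hybrid cloud placement policy: sensitive assets
# stay in private environments, replicated across sites; non-critical
# data uses public cloud. All location names are hypothetical.
def placement(asset: str, sensitive: bool) -> dict:
    if sensitive:
        return {"asset": asset,
                "primary": "private-sydney",
                "replica": "private-melbourne"}
    return {"asset": asset,
            "primary": "public-ap-southeast-2",
            "replica": "public-ap-southeast-4"}

print(placement("customer_pii", sensitive=True))
print(placement("marketing_assets", sensitive=False))
```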

Building an unbreakable data infrastructure is not a one-time endeavour but an ongoing journey of continuous improvement. True resilience lies not in achieving a static state of “unbreakability”, but in the constant pursuit of vigilance, adaptation, and proactive defence in an ever-evolving technology landscape.

By adopting a holistic and proactive approach that integrates technology, strategy, human elements, and the power of AI and hybrid cloud infrastructure, organisations can navigate a shifting threat landscape and safeguard their valuable data assets.