

How to defy gravity and survive disruption

Alex Galbraith, CTO, Cloud Services

Will your business still be viable in 10 years? Many won’t be.

The pace of change is such that even Moore’s Law is being left behind. Hyperbole or not, Nvidia’s AI hardware supremo Jensen Huang recently stated that AI capability has increased a million-fold in the last decade, and indicated it may continue at this rate.

To say established businesses face disruption would be an understatement. But is it hopeless? Absolutely not. While we don’t know exactly what the future has in store, we do know the direction things are heading. It behoves technology leaders to look ahead, think big and start iterating toward that vision, today.

Today’s data decisions, tomorrow’s market relevance

One thing we know is that many businesses will live and die based on their effective use of data. Customer insights, service quality, time to market and many companies’ core propositions will increasingly depend on analytics, automation and generative AI that are fuelled by data.

If your business isn’t moving in this direction, the competition most definitely will be. SoftwareOne’s own research shows that in the next two years, 79% of businesses plan to make significant progress transforming processes and customer interactions with AI.

With this in mind, it’s vital to think about the future infrastructure, architecture, culture and governance your organisation will need to fully leverage your data assets.

No concept illustrates the value of forward thinking more than ‘data gravity’ – a term coined by David McCrory over a decade ago, but which is increasingly relevant. The upshot? Today’s decisions about data create path dependency that can hinder the adaptation businesses need to remain competitive.

To illustrate the concept, let’s look at a hypothetical example from the last wave of digital disruption. 

The cautionary tale of Heritage Retail

Imagine a company that for many years has been a household name in retail. Despite its longstanding success, more recently ‘Heritage Retail’ had a problem. It was rapidly losing market share to cloud-native online retailers with lower costs, greater agility and better customer insights. With revenue shrinking, its executives knew they had to act.

The vision was to transition from a monolithic, on-premises infrastructure to a nimble, cloud-native ecosystem to enhance customer experience and leverage advanced analytics. The initial stages of the project seemed promising. However, as the project moved towards core operations, the crippling effects of data gravity and data inertia became painfully apparent.

Their decades-old, on-premises CRM and sales data warehouse contained petabytes of data. Because this database was an operational fulcrum, business critical apps like the company's e-commerce platform and supply chain management system were built around it. Years of ad-hoc integrations and data feeds had created a complex and poorly documented web of dependencies.

In data gravity terms, the mass of the data meant that moving one piece of the puzzle was impossible without moving all the interconnected pieces simultaneously. The sheer volume of data meant that the egress fees for migration would be eye-watering.

The team was faced with a stark choice: either embark on a ‘big bang’ migration of the entire, tangled ecosystem – an expensive and risky undertaking – or keep the applications on-premises, effectively halting the modernisation of this critical business function.

For Heritage Retail, failure to iteratively modernise towards a long-term vision meant they had painted themselves into a corner. Data gravity, rather than strategic planning, had been the driving force behind their IT evolution. Compounding over time, its effects had created a mass of interdependent systems and data too cumbersome to feasibly modernise. By the time they needed to respond, their most valuable data assets were locked in a state of inertia, thwarting their goal of adapting and competing in the new market paradigm.

Overcoming data inertia

So, what steps can today’s established businesses take to avoid a fate like Heritage Retail?

The first step is to understand the mass of critical data your business depends on today. The second is to have a modernisation strategy that considers future requirements and enables agility. And the third is to make iterative progress towards the long-term vision right away.

  1. Know what you have

    The data gravity effect is likely to be present in most organisations. The first step to an effective modernisation plan that sidesteps inertia is to know your key technology and data liabilities and assets. What is currently mission critical, what is it costing, and what are the dependencies? The sooner you have a clear picture, the sooner you can move forward with a plan. Architecture and licensing reviews with a specialist partner can accelerate the discovery process and provide assurance that your considerations are valid.

  2. Know where you’re going

    Next you need a vision, along with a modern data strategy that informs how your business uses, stores, connects, classifies and governs its data assets. This is a cross-functional effort, and technology strategy is one part of a trifecta that also includes people and process. Line-of-business experts who work directly with the data will have the clearest understanding of its value, while technology experts will have the clearest picture of what’s technically possible and when. Work together on a vision and roadmap that’s geared towards both short and long-term business outcomes.

  3. Start now and don’t try to boil the ocean

    Many organisations have a vast and complex data landscape. Attempting to identify, catalogue, cleanse, and govern every piece of data before any cloud initiative can lead to analysis paralysis and significantly delay the benefits of modernisation.

    Instead, prioritise effort based on benefits, risks, and dependencies, addressing data issues in phases. Focus on security, compliance, and quality for data that is sensitive or critical to business operations. For less critical data, issues can be addressed iteratively as data is migrated or as new cloud-native applications are developed.

    The effort required to improve the quality of data varies depending on its application, whether it's for operational tasks, analytics, or AI training. Rather than a broad focus on improving data quality, focus on making it fit for purpose, so it supports its intended use. Standardising metadata is a practical step that will help users and systems use the data appropriately.
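To make the metadata point above concrete, here is a minimal sketch in Python of what a standardised metadata record might look like. The field names and sensitivity levels are illustrative assumptions, not an established standard or a SoftwareOne tool; the idea is simply that enforcing a small set of required, normalised fields lets users and systems judge whether a dataset is fit for their intended purpose.

```python
from dataclasses import dataclass, field

# Hypothetical minimal metadata schema - the field names are
# illustrative assumptions, not an established standard.
@dataclass
class DatasetMetadata:
    name: str
    owner: str            # accountable line-of-business contact
    sensitivity: str      # normalised to one of ALLOWED_SENSITIVITY
    intended_use: str     # e.g. "operational", "analytics", "ai-training"
    source_system: str
    tags: list = field(default_factory=list)

ALLOWED_SENSITIVITY = {"public", "internal", "restricted"}

def standardise(raw: dict) -> DatasetMetadata:
    """Normalise an ad-hoc metadata dict into the common schema,
    failing fast when required context is missing."""
    required = ["name", "owner", "sensitivity", "intended_use", "source_system"]
    missing = [k for k in required if not raw.get(k)]
    if missing:
        raise ValueError(f"missing required metadata fields: {missing}")
    sensitivity = raw["sensitivity"].strip().lower()
    if sensitivity not in ALLOWED_SENSITIVITY:
        raise ValueError(f"unknown sensitivity level: {sensitivity!r}")
    return DatasetMetadata(
        name=raw["name"].strip(),
        owner=raw["owner"].strip(),
        sensitivity=sensitivity,
        intended_use=raw["intended_use"].strip().lower(),
        source_system=raw["source_system"].strip(),
        tags=[t.strip().lower() for t in raw.get("tags", [])],
    )

# Example: an ad-hoc record from a legacy system, normalised on ingest.
record = standardise({
    "name": "crm_customer_profiles",
    "owner": "sales-ops@example.com",
    "sensitivity": "Restricted",
    "intended_use": "Analytics",
    "source_system": "on-prem CRM",
    "tags": ["PII", "Customer"],
})
```

Applied iteratively as data is migrated, a gate like this keeps the catalogue consistent without requiring every dataset to be cleansed up front.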

Move ahead with confidence

Data and technology strategy are crucial facets of being ready for the rapidly emerging AI future. Making the right strategic decisions today can have a huge bearing on your organisation’s future.

SoftwareOne helps organisations of all kinds overcome challenges, embrace opportunities and achieve lasting value through technology initiatives – including those centered on data and AI.

To learn more about how to set up for success in today’s fast-changing landscape, check out our new AI Ready Blueprint guide, which covers critical themes on this topic, or reach out to our expert team for an initial consultation.


Contact us today

With expert practices in cloud, data, AI and application modernisation, SoftwareOne has helped many clients discover and follow their optimal strategic technology path. Talk to us to find out how we can help.

Author


Alex Galbraith
CTO, Cloud Services