Connecting Innovation

Cloudy, with a chance for innovation!

By Anthony Bell, in Disruption and Innovation, March 7, 2019

A storm can begin with just one cloud. As it gains watery momentum, the cloud expands or combines with peers into a horizon-spanning thunderhead. Inside its brilliant exterior, trillions of individual droplets meld, ready to fall and soak into the soil. Like so much of the natural world, it’s a vastly intricate process with a simple, and necessary, point.

If only data clouds were so gentle to us as we try to make sense of them. Hybrid clouds are tempests of any number of different file-types, owned by any number of individuals, and needing to be accessed by a varied population of users. Their interiors are segmented by invisible barriers which regulate data even before it flows down through the reeds of fibre-optics and into one’s hard drive.

Understanding their nature, and effectively mastering their management, is the key to success in modern IT. In the latest entry in our Cloud Transformation webinar series, Managing Data In A Hybrid Environment, we sat down with three experienced data-cloud watchers and gleaned from them the challenges faced by those who must navigate these digital phenomena.

Dan Gooden is a Data Lead with Airtasker, an organisation that connects companies looking to outsource certain kinds of work with people looking to get involved. With a background steeped in physically managing data centres, he has seen first-hand how the storm of change has blown across the land of data management.

“I think data warehousing was only accessible to organisations that had a budget that allowed for what is required to develop and run the software, as well as the physical machines and the licensing. But over the past five or so years, what I’ve seen is data management software as a service. Companies coming in, with an Amazon-like model, where it’s low cost and scales when and where you grow.”

This certainly gels with the notion that stellar data management has evolved beyond simple archival services on server racks. It requires software that gives its users freedom of movement within the pools of data out there. Dan’s experience pointed towards this evolution sparking from customer needs.

“As I moved through my career, I noticed that there was a real focus on delivering value. And what I realised customers wanted wasn’t so much accuracy, but a best-effort attempt that provided them with what they needed as fast as possible.”

Nick Smith, Founder and Strategist for Informed, agreed, and asserted that the biggest challenge facing data management originates from the ground-level changes we’ve all witnessed.

“The maturing of the public cloud as a viable option for enterprises has driven changes across the board – from key infrastructure to the physical data being managed.”

This maturing, he explains, has led to a diversification in requests from clients. And “with those choices come the new opportunities companies have capitalised on.”

Dan and Nick commiserated on the great demands this has placed on organisations, let alone individual engineers. The need for more predictive and agile software is the core engine that powers innovation in the data centre industry. But Nick adds an important observation to this.

“Another difficulty is handling bi-modal operations in IT. It’s the reality of the industry that you have to balance innovation whilst continuing to provide your services. Today we have more of the tools to enable that – software development tools and statistical analysis software, for example.”

Though everyone at the webinar assented to the need for freedom of movement in order to survive managing a Cloud, the spectre of one topic refused to back down.

Accuracy. Pin-point precision in data. Truth.

“It’s hard for any company to have a complete overview of their data,” Dan said. “The single source of “truth” is something we will always chase. Knowing it, whatever it may be, is a powerfully strategic position to operate from… But it takes a huge amount of work to get there.”

Dan is quick to confirm a potential bias towards using a Data Lake – a traditional system focused on reading data, with a preference for unstructured forms of raw data that are of more interest to data scientists than to business professionals. But he notes that he is warming to hybrid systems like modern Cloud storage for one big reason.

“The simplicity of off-the-shelf Cloud solutions, that enable businesses to access very complicated information without having to expend too much effort, can’t be overlooked.”

Working alongside CIOs, Nick concurred with the idea that Cloud has become so popular because it has evolved to enable agility through vast sheets of data, which can then be analysed by other departments or outsourced services for accuracy.

“Data derives information, so it derives value. That’s why it’s so important to have flexibility first and foremost in your data management, and then accuracy.”

All this is just the edge of the storm. Risk management, cybersecurity, and so many more factors combine to motivate businesses to adopt or abandon digital transformation through Cloud-based data management. One could find themselves blown off their feet by the choices presented.

Enter Chris Wahl, Chief Technologist for Rubrik. Acknowledging the explosion and fragmentation of operational standards and requirements, he offered several pointers on where to begin with a data transformation strategy.

“We’ve had to change the technology we use and how we wield it, but we need to change our thinking too. We need to change how we organise our thinking, how we structure our teams in our companies, because doing that immediately changes how you approach a challenge.”

For Chris, some of the important changes in thinking revolve around how assets – from apps to data to raw infrastructure – are ordered. They need to be separated where appropriate, and combined where necessary, so that they can interface with a Cloud where needed. Using Rubrik’s methodologies as a model, he explained how this thinking translates into clearer strategies for regulatory compliance and security licensing, which in turn evolve into an efficient cloud that can be accessed easily by on-premises systems.

Intricate, but it has a simple effect.

“Something I harp on a lot about is that IT, historically, has been about carrying heavy buckets up a steep hill in order to provide service. But why do that when you can be a faucet? If all your data is protected under license and service agreements, and it’s declared as code – infrastructure defined as code – it doesn’t matter where the data is housed, necessarily. We can be a faucet and turn on a service for customers when they need it.”
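
Chris’s “faucet” idea is easier to picture with a small sketch. The snippet below is purely illustrative – the policy and dataset names are invented, and it is not Rubrik’s API – but it shows how protection terms declared as code let a consumer ask for a dataset without caring where it physically lives.

```python
# A minimal sketch of the "declared as code" idea: protection and service terms
# live in version-controlled code, so the physical location of the data stops
# mattering to whoever turns on the "faucet". All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class ProtectionPolicy:
    name: str
    backup_frequency_hours: int
    retention_days: int

@dataclass
class Dataset:
    name: str
    location: str          # "on-prem", "aws", "azure" – irrelevant to the caller
    policy: ProtectionPolicy

GOLD = ProtectionPolicy("gold", backup_frequency_hours=4, retention_days=365)

CATALOGUE = [
    Dataset("payments-db", "on-prem", GOLD),
    Dataset("clickstream", "aws", GOLD),
]

def serve(dataset_name: str) -> Dataset:
    """Turn on the faucet: hand back a dataset by name, wherever it lives."""
    return next(d for d in CATALOGUE if d.name == dataset_name)

if __name__ == "__main__":
    ds = serve("clickstream")
    print(f"{ds.name} served from {ds.location} under the {ds.policy.name} policy")
```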

The question of strategising cloud transformation tied into this.

“From my experience, there are five things to consider.

Firstly, there’s software convergence. You need a single software fabric for all data management functions. I don’t think I’m going to get a lot of disagreement on that.

The second thing, which is something that often blows minds, is that as you work towards building a hybrid environment, it must be infinitely scalable. I’m talking systems that can handle petabytes and exabytes of data, without requiring a retrofit to the architecture. That’s not easy, but that’s why I want them, because then I won’t have to spend time and energy and money in future having to retool and rearchitect them.

Step three involves APIs. Everything in your data centre has to have an API that works with it, otherwise how are you going to plug in your components to all the corresponding workloads?

The other thing is that everything needs to be available immediately. This isn’t a world that tolerates downtime. “If my phone can do it, why can’t my multi-million dollar data-centre do it?”

And finally, there’s the actioning of the “Cloud First” model, where your service is aimed at providing easy and quick access to customers, and helps them dodge all the muck that comes from the behind-the-scenes needs of data management.”
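
Chris’s third consideration – that everything should expose an API – is the most hands-on of the five. As a rough sketch, assuming a hypothetical data-management endpoint (the URL, token, and fields below are placeholders, not a real product’s API), plugging a component into a workload can be a few lines of code:

```python
# Illustrative only: query a hypothetical data-management API for workloads
# that have no protection policy attached. Endpoint and fields are invented.
import requests

BASE_URL = "https://datamgmt.example.com/api/v1"   # hypothetical endpoint
TOKEN = "changeme"                                  # supplied by your platform

def list_unprotected_workloads() -> list[dict]:
    """Ask the (hypothetical) service which workloads lack a policy."""
    resp = requests.get(
        f"{BASE_URL}/workloads",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"protected": "false"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for workload in list_unprotected_workloads():
        print(workload.get("name"), "has no protection policy")
```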

He finishes by explaining how Rubrik stitches together its systems, using the above model, to deliver flexible services to its users.

“What we do is try and fit all the data into “use-cases” that you will inevitably find yourself defaulting to. We then stitch these together into something that can be searched and queried, which helps someone understand things like: where your mission-critical applications lie in your organisation, what they do, and how they impact your data and business “eco-system”.

Rubrik is built to see all that information, and have on-demand access to it.”
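
As a loose illustration of that stitching – with use-case names and assets invented for the example rather than drawn from Rubrik – grouping assets by use-case yields something that can be searched and queried:

```python
# Sketch of stitching assets into searchable "use-cases". Data is made up.
from collections import defaultdict

# Each asset is tagged with the use-cases it defaults to.
ASSETS = [
    {"name": "orders-db",      "tier": "mission-critical", "use_cases": ["backup", "compliance"]},
    {"name": "marketing-lake", "tier": "standard",         "use_cases": ["analytics"]},
    {"name": "crm-app",        "tier": "mission-critical", "use_cases": ["backup", "analytics"]},
]

def index_by_use_case(assets):
    """Build a searchable index: use-case -> assets that participate in it."""
    index = defaultdict(list)
    for asset in assets:
        for use_case in asset["use_cases"]:
            index[use_case].append(asset)
    return index

def mission_critical(index, use_case):
    """Query: which mission-critical assets sit behind a given use-case?"""
    return [a["name"] for a in index.get(use_case, []) if a["tier"] == "mission-critical"]

if __name__ == "__main__":
    idx = index_by_use_case(ASSETS)
    print(mission_critical(idx, "backup"))   # -> ['orders-db', 'crm-app']
```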

For an audio run-through of this webinar, check it out here if you haven’t already!

And, if you fancy yourself a tech insider, or you know you’re a proven powerhouse in IT, and want to pitch projects and ideas for us to discuss, reach out at: [email protected]
