Cloud Cover: Delivering Data-Driven Success


Navigating a complex cloudscape

Mastering an increasingly complex cloudscape to achieve a solid data management strategy will enable critical insights and drive business forward

On-premise databases are no longer big or agile enough to cope with the amount of data businesses must process. It’s a situation exacerbated by the increase in connected devices, including the machines, sensors and cameras that make up the internet of things.

As a result, cloud platforms, often positioned as the “solution” to this problem, have been growing in popularity for a decade.

One of the primary benefits of the cloud is its almost limitless ability to scale services and storage on demand to meet fast-changing customer needs and support an exponential growth in data volumes and processing requirements, says Antony Heljula, technical director at Peak Indicators, a consultancy supporting customers with data science and machine-learning.

“Chief information officers (CIOs) typically struggle to get the same scalability and flexibility with their on-premise infrastructure,” he says.

However, the modern cloud environment is more complicated than ever: multiple datasets spread across multiple clouds, the security implications of public cloud, the growing adoption of flexible, high-performance NoSQL databases and the impact of open-source technology. The possibilities are endless and can be overwhelming.

According to IDC, by 2022 more than 90 per cent of enterprises worldwide will be relying on a mix of on-premise or dedicated private clouds, multiple public clouds and legacy platforms to meet their infrastructure needs.

But at the same time, McKinsey & Company notes that 28 per cent of CIOs cite the complexity of their current environment as a major challenge to modernising infrastructure.

Fragmented data

There are other factors that further complicate the current cloudscape. Data may be fragmented across the organisation, siloed in legacy systems or stored in cloud applications purchased and used by individual departments, often without the knowledge of the CIO.


The ensuing data sprawl is a major roadblock for any organisation attempting digital transformation. In fact, 43 per cent of IT professionals rate application sprawl as a top-three barrier to realising the benefits of the cloud, according to a survey by Accenture.

“Cloud providers will typically target the business side of an organisation ahead of IT and each part of the business could choose a different provider, especially with SaaS [software-as-a-service] applications. While this is undoubtedly beneficial in delivering fast value to the business community, companies can then suffer when it comes to delivering a common data integration and analytics strategy,” says Heljula.

“SaaS applications limit the way organisations can access and share data, and every cloud provider comes with a different set of APIs [application programming interfaces] to integrate with. These APIs can be complex to work with and it is not always possible to access all your data. The CIO is often left with the challenge of how to integrate data from various cloud providers and make it centrally available for enterprise analytics.”
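
To make that challenge concrete, here is a minimal sketch, in Python, of pulling customer records from two hypothetical SaaS providers whose APIs return different shapes, then normalising them into one central table for analytics. The endpoints, field names and payload structures are all invented for illustration; no real vendor's API is being described.

```python
# Minimal sketch: normalising customer data from two hypothetical
# SaaS providers into one central store for analytics. Endpoints,
# field names and payload shapes are invented for illustration.
import json
import sqlite3
from urllib.request import Request, urlopen

def fetch_json(url: str, token: str) -> dict:
    """Shape of a live API call: GET JSON with a bearer token
    (real integrations also need paging, retries and rate limits)."""
    req = Request(url, headers={"Authorization": f"Bearer {token}"})
    with urlopen(req) as resp:
        return json.load(resp)

# Each provider exposes the same entity in a different shape, so each
# needs its own mapping into a common (source_id, name, email) schema.
def normalise_crm(payload: dict) -> list:
    return [(c["id"], c["fullName"], c["email"]) for c in payload["records"]]

def normalise_billing(payload: dict) -> list:
    return [(c["customer_id"], c["name"], c["contact"]["email"])
            for c in payload["data"]]

def load_central(rows: list) -> None:
    """Upsert normalised rows into a central analytics table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE IF NOT EXISTS customers "
               "(source_id TEXT PRIMARY KEY, name TEXT, email TEXT)")
    db.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", rows)
    db.commit()
    print(db.execute("SELECT COUNT(*) FROM customers").fetchone()[0],
          "customers centralised")

if __name__ == "__main__":
    # Sample payloads stand in for live fetch_json(...) calls.
    crm = {"records": [{"id": "c1", "fullName": "Ana Ruiz",
                        "email": "ana@example.com"}]}
    billing = {"data": [{"customer_id": "b7", "name": "Ben Ito",
                         "contact": {"email": "ben@example.com"}}]}
    load_central(normalise_crm(crm) + normalise_billing(billing))
```

In practice, every additional provider adds another bespoke mapping like these, which is exactly the integration burden Heljula describes.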

Enabling data

Crucially too, CIOs and their teams are no longer required to just keep data in a database, but to understand and translate it to key business stakeholders. This is where the CIO role evolves from being a guardian of data to an enabler of that data, joining up data from different sources and presenting it in a way which delivers real value to the business.

“While data has always been critical to guide strategy and decision-making, the events of the last few months have brought into sharp focus the benefit of real-time visibility over logistics, inventory and supply chain, alongside flexible and scalable digital infrastructure,” says Robin Gardner, strategic services director at cloud consulting business Xtravirt.

“Those organisations that have invested in digital transformation have been able to pivot and adapt to new working and trading conditions more quickly and with greater success.”

So how can CIOs and their teams spend less time managing the cloud environment and more time building a data-driven enterprise?

“Understanding and translating data for key business stakeholders is now a core CIO responsibility,” says Heljula. “With the fast adoption of cloud applications, the CIO must have a clear understanding and definition of all the data within the business and the flow of information between applications. A solid master data management strategy is also essential to maintain a single version of the truth.

“Furthermore, with the recent General Data Protection Regulation, the CIO should ideally also be able to explain every flow of data. This involves detailing its purpose, the data content, its sensitivity, the owner or controller, and the processing involved.”
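
One lightweight way to act on that advice is to keep the register of data flows machine-readable, so it can be versioned, reviewed and audited. The sketch below shows one assumed structure, with a field for each element Heljula lists; it is illustrative only, not a compliance tool.

```python
# Sketch of a machine-readable data-flow register covering the
# elements listed above: purpose, content, sensitivity, owner or
# controller, and processing. Field choices are illustrative only.
from dataclasses import dataclass, asdict
import json

@dataclass
class DataFlow:
    name: str            # human-readable identifier for the flow
    purpose: str         # why the data is processed
    content: list        # categories of data carried
    sensitivity: str     # e.g. "public", "internal", "special category"
    controller: str      # accountable owner of the flow
    processing: str      # what is done to the data in transit

register = [
    DataFlow(
        name="CRM -> analytics warehouse",
        purpose="Customer churn reporting",
        content=["name", "email", "order history"],
        sensitivity="internal",
        controller="Head of Sales Operations",
        processing="Nightly batch extract, pseudonymised before load",
    ),
]

# Export so the register can be versioned, reviewed and audited.
print(json.dumps([asdict(f) for f in register], indent=2))
```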

CIOs must think strategically if they are to navigate the complex cloudscape. But if they can do this, and use cloud in the right way for their business, CIOs can help key stakeholders derive critical insights that will drive their enterprise forward.

Silver lining: unexpected positive of remote working

The coronavirus pandemic has seen organisations enable and accelerate remote working on a grand scale. One of the unexpected outcomes of this shift has been how quickly IT teams have been able to innovate, playing a crucial role in ensuring the new distributed workforce is supported and the business continues to function.

For example, research shows the mandate for home working has prompted more than half of companies to accelerate their digitalisation, with 62 per cent expanding their cloud adoption. “This abrupt shift to remote work brought on by the pandemic has provided a compulsory testing ground for a more agile and connected workforce, with digital transformation at its heart,” the report notes.

There are many other examples of this agility in action. Cloud and DevOps consultancy Steamhaus is using remote working as an opportunity to open up its recruitment to an international talent pool.

“Being cloud architects, our area of technology is still fairly niche and finding people with the skills required isn’t always easy. But by establishing a genuine remote-working model, which isn’t restricted by location or time zones, we’re opening up an international talent pool that will help us to grow into a global company,” says Rob Greenwood, technical director at Steamhaus.

COVID-19 is driving technological innovation and agility in alternative ways, too. For example, critical shortages of personal protective equipment (PPE) brought on by the pandemic spurred change at NHS Wales, which has moved from tracking PPE stocks with outdated and often incorrect Excel spreadsheets to a near real-time data dashboard powered by artificial intelligence.

“Usage of PPE increased significantly overnight and it’s been essential that we do what we can to protect our frontline staff, so they can continue safely working to save lives,” says Mark Roscrow, programme director at NHS Wales. “We’re now able to see exactly how much stock we have available within the pilot site and identify usage to ensure the right people get the right products at the right time.”

Similarly, James Maunder, chief information officer at The London Clinic, says its IT team have adopted “the principles of lean innovation: test, learn, pivot”.

He explains: “In many ways, the need to adapt suddenly has meant we have confronted much of our legacy, which can sometimes prevent innovation. The learnings we have made mean, when it comes to tech and innovation, we are set to be even stronger in the long run.”

Controlling the cost of cloud

Cloud offers potential efficiencies and cost-savings, but visibility is crucial to control spending

Worldwide spending on information and communications technology will hit $4.3 trillion in 2020, according to IDC. That is an increase of 3.6 per cent over 2019; IT represents a major and growing outgoing, and it is imperative that companies make the best use of their technology investment, both by maximising impact and minimising wastage.

Migration to the cloud is an important part of cost control strategies. Businesses are looking to transform what used to be a fixed-cost, multi-year outlay on hardware, network kit and software licences, or capital expenditure (capex), into much more fluid, changeable, on-the-move IT services spending, or operational expenditure (opex).

Flexera’s 2020 State of the Cloud report confirms that commitment to this style of IT procurement is rising, as organisations adopt multi-cloud strategies and move more workload and data into both public and private clouds.

The cloud is particularly attractive for database administrators, who like the idea of cheaper storage and increased administrative efficiency when they migrate large legacy database applications. In addition, 73 per cent of organisations polled by Flexera plan to optimise their use of cloud to make cost savings.

But here’s the challenge. Cloud should be a failsafe way to lower IT costs while scaling up storage and finding easy-win efficiencies, but it isn’t. Despite its success in converting capex to opex, the latter still needs to be managed. The Flexera study indicates that enterprises struggle to estimate the cost implications of cloud migration accurately.

Visibility controls cost

Managing cloud costs requires holistic visibility of cloud spend. The bottom line is, if businesses adopt a new cloud-based way of working without proper planning, it's very easy to end up with larger monthly subscription bills than expected.

An overlooked, but critical, aspect is the importance of "tiering" in cloud provision: choosing the appropriate cost structure, without paying for more capacity than is needed.

The reality is most cloud users end up paying between 30 and 35 per cent more than they need to because they're not using the full cloud capacity in their tier.
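
A quick worked example shows how that gap between tier capacity and actual usage becomes overspend; the figures below are illustrative, not any provider's real price list.

```python
# Worked example: paying for a capacity tier while using only part
# of it. All numbers are illustrative, not real provider pricing.
tier_capacity_tb = 10          # capacity the current tier provides
tier_price_per_month = 2_000   # flat monthly price for the tier (USD)
actual_usage_tb = 6.5          # storage actually consumed

effective_cost_per_tb = tier_price_per_month / actual_usage_tb
full_use_cost_per_tb = tier_price_per_month / tier_capacity_tb
unused_share = 1 - actual_usage_tb / tier_capacity_tb

print(f"Paying for {tier_capacity_tb} TB, using {actual_usage_tb} TB")
print(f"Effective cost ${effective_cost_per_tb:,.0f}/TB "
      f"vs ${full_use_cost_per_tb:,.0f}/TB at full utilisation")
# Here 35% of the monthly bill buys capacity that sits idle,
# in line with the 30-35 per cent overpayment figure cited above.
print(f"Capacity paid for but unused: {unused_share:.0%}")
```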

David Schmidt, director of marketing at BMC Software, agrees. “Managing cloud cost requires having visibility into the company’s cloud spend, across all cloud platforms, and in a meaningful way,” he says.

Optimising usage is a central cost-control measure, along with a strong grasp on the cloud resources needed for database workloads.

Initial scoping is crucial to ensure organisations not only have the right cloud and cloud partner, but also the correct service level for their needs.

Migrating to the cloud without specific cost-control strategies in place is likely to sting the unprepared. Do this work at the outset and the chances of cloud assisting in long-term cost control improve dramatically.

Maturing into multi-cloud: cloud computing for the 2020s

Providers, applications and support tools must be carefully selected if businesses are to develop mature, market-leading cloud strategies for long-term success

Even before the onslaught of COVID-19, the public cloud market looked set to continue its growth into the 2020s, and the pandemic looks set to accelerate the trend. [Infographic: public cloud service revenue, $bn]

Nearly all large enterprises have developed a multi-cloud strategy, and the majority of these are hybrids combining public and private clouds. Despite the hype, integrating information across multiple clouds remains a challenge for implementing a hybrid strategy, and businesses must take advantage of under-utilised tools to tackle it.

Commercial feature

Enterprise database: bottleneck to achieving business agility?

As we emerge from lockdown, digital transformation ideas that had existed mainly in chief information officers’ presentations have been swiftly moved into full business-case mode

Around the world, C-suites know they will have to get up and running very quickly in a “new normal” that will be very different from the way business worked even at the start of 2020. Key to making recovery work will be increased enterprise use of automation. As IT trend watcher Forrester notes: “Automation has been a major force reshaping work since long before the pandemic. Now, it’s taking on a new urgency [as] COVID-19 just made automation a boardroom imperative.”

The good news is that much IT is already in great shape when it comes to automation. Cloud-delivered infrastructure and software as a service (SaaS), a culture of collaboration and an increased use of agile as a quick-fire development methodology combine to make technology delivery far more productive and of higher quality. The combination, often referred to as DevOps for short, requires a very efficient continuous integration/continuous delivery (CI/CD) pipeline of application creation, versioning, build, testing, quality assurance (QA) and deployment.
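
As a minimal illustration of such a pipeline, the sketch below runs build, test, QA and deploy stages in order and stops at the first failure, so nothing broken ever reaches deployment. The stage commands are placeholders, not any team's real tooling.

```python
# Minimal sketch of a fail-fast CI/CD pipeline: each stage runs a
# shell command and the pipeline stops at the first failure.
# The commands are placeholders, not a real project's tooling.
import subprocess
import sys

STAGES = [
    ("build",  "echo building application"),
    ("test",   "echo running unit tests"),
    ("qa",     "echo running QA checks"),
    ("deploy", "echo deploying to staging"),
]

def run_pipeline() -> None:
    for name, command in STAGES:
        print(f"--- stage: {name} ---")
        result = subprocess.run(command, shell=True)
        if result.returncode != 0:
            # Fail fast: later stages never run against a broken build.
            sys.exit(f"Pipeline failed at stage '{name}'")
    print("Pipeline complete: all stages passed")

if __name__ == "__main__":
    run_pipeline()
```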

Predictable and productive

Moving to DevOps has required a mindset shift, led somewhat surprisingly by local and regional government departments, which adopted the idea early. One notable challenge is working with the business database. Why? In almost every other part of modern enterprise tech, IT has implemented robust CI/CD processes, making service delivery much more predictable and enabling straightforward upgrades.


Businesses love this as it enables them to increase their ability to respond to changing market dynamics and customer sentiment and to innovate faster to be more competitive. The sticking point, regrettably, remains the vital work on the database side. CI/CD is not just about building databases and applications faster and more reliably, it also entails managing end-point risk.

When an enterprise has spent time and resources building a new, competitive application, any unplanned bugs or usability issues mean IT teams must return to the beginning of the process to work out where the faults lie. This backtracking is not only tedious and expensive, it can actually prevent the enterprise from responding to, or benefiting from, market change. After all, the most expensive place to fix a problem is once it is in full production.

Instead, creating the new application in chunks that can be tested, fixed and shipped quickly is a much safer and more efficient way of working. Being able to work in an automated CI/CD mode with SQL (structured query language) and database management systems as well would boost business IT significantly.
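
One widely used technique for bringing the database into that automated mode is versioned migrations: numbered SQL steps applied in order, with the applied version recorded in the database itself so every environment can be upgraded repeatably. The sketch below is a minimal illustration using SQLite and invented migration steps; it does not describe any particular vendor's tooling.

```python
# Sketch of versioned database migrations: numbered SQL steps are
# applied in order and recorded in a schema_version table, so the
# same automated process upgrades every environment repeatably.
# SQLite and the migration steps are used purely for illustration.
import sqlite3

MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
    (3, "CREATE INDEX idx_customers_email ON customers(email)"),
]

def migrate(db: sqlite3.Connection) -> None:
    db.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = db.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:
            db.execute(sql)  # apply the pending step
            db.execute("INSERT INTO schema_version VALUES (?)", (version,))
            db.commit()
            print(f"Applied migration {version}")

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    migrate(conn)   # first run applies all three steps
    migrate(conn)   # second run is a no-op: already up to date
```

Because the database records its own version, re-running the pipeline is safe: steps already applied are simply skipped.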

Manual for too long

It is undeniably easier to implement CI/CD on the application side. Applications don’t hold data; they are essentially codified business rules. Databases are different and more complicated because data is continually changing. When a new system is being implemented, the data changes that occurred during the implementation process must be accounted for, which means there is a higher chance of the new system being flawed. There is also a business emphasis on big data: an imperative to process masses of information on customer behaviour, preferably captured in real time, so the insights derived are as fresh, deep and accurate as possible.

The DevOps software market is expected to triple in value from 2018 to 2023

Until businesses align their database work with their other software delivery processes, they are less efficient than they could be and need to be. The good news is that proven techniques are starting to emerge that make database work look steadily more like CI/CD.

Our customers have found that automation is the first place to start: taking a lot of traditionally manual database build and deployment processes and automating them. This means IT professionals can begin to converge what happens on the application side and what happens on the database side into a single software delivery pipeline. We've had companies that have reduced their delivery cycles from eight weeks down to two just by following this automated process.

In addition, this approach boosts the overall quality, integrity and performance of database code, thus reducing production defects that would otherwise introduce unplanned business costs and disruption.

If senior IT leaders start by assessing all the database code testing and QA work they need, and implement standardised testing routines that can be automated, as application development teams have been doing for some time, the benefits are immediate: speed of delivery goes up and risk of failure falls, with an increasing ability to pivot plans at short notice.
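
What might one of those standardised, automatable tests for database code look like? A minimal sketch, using Python's unittest and an in-memory SQLite database purely so the example is self-contained; the schema and query are invented:

```python
# Illustrative automated test for database code: load a fixture,
# run the SQL under test, assert on the result. unittest and SQLite
# are used only so the example is self-contained and runnable.
import sqlite3
import unittest

ORDER_TOTALS_SQL = """
    SELECT customer, SUM(amount) AS total
    FROM orders GROUP BY customer ORDER BY customer
"""

class OrderTotalsTest(unittest.TestCase):
    def setUp(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
        self.db.executemany(
            "INSERT INTO orders VALUES (?, ?)",
            [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
        )

    def test_totals_are_summed_per_customer(self):
        rows = self.db.execute(ORDER_TOTALS_SQL).fetchall()
        self.assertEqual(rows, [("acme", 150.0), ("globex", 75.0)])

if __name__ == "__main__":
    unittest.main()
```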

And as chief information officers and their teams draw up plans for the new normal, slashing production defect rates, increasing transparency and making data as agile as other parts of the business will really ease their load.

The author is senior market strategist at Quest Software, which offers tools to help organisations spend less time on IT admin and more time on business innovation.

Governing data in the cloud

Data governance, risk and compliance should be the responsibility of the IT leader nearest the database

The world of COVID-19 seems surreal, even dystopian, in 2020, but as May 2018 approached, a different kind of craziness gripped organisations: the frenzy of last-minute preparations for the European Union’s General Data Protection Regulation (GDPR).

Two years on and things have settled down, at least with GDPR. Most organisations now have appropriately qualified data protection officers and are following guidelines from the UK’s Information Commissioner’s Office (ICO). At the same time, many have successfully transitioned to working in the cloud.

Although the UK is transitioning out of the EU, businesses wanting to work with their European counterparts or third parties must prove they are meeting Brussels' strict rules around data hygiene.

And enterprises may face challenges in being fully GDPR-compliant in the cloud: wrestling with retention and deletion rules, running clouds across different countries with different data jurisdictions, keeping on top of their suppliers’ IT controls and so on.
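
Retention and deletion rules, at least, lend themselves to automation. The routine below is a rough sketch that deletes records past an assumed two-year cutoff; the schema and retention period are invented for illustration, and real retention periods come from legal and compliance policy.

```python
# Rough sketch of an automated retention sweep: delete personal
# records older than an assumed retention period. The schema and
# the two-year period are invented; real periods are set by policy.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 730  # assumed two-year retention policy

def sweep(db: sqlite3.Connection) -> int:
    """Remove rows whose creation date is past the retention cutoff."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    cur = db.execute("DELETE FROM contacts WHERE created_at < ?",
                     (cutoff.isoformat(),))
    db.commit()
    return cur.rowcount  # number of expired records removed

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE contacts (email TEXT, created_at TEXT)")
    db.executemany("INSERT INTO contacts VALUES (?, ?)", [
        ("old@example.com", "2016-01-01T00:00:00+00:00"),  # expired
        ("new@example.com", datetime.now(timezone.utc).isoformat()),
    ])
    print(f"Deleted {sweep(db)} expired record(s)")
```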

Such challenges are bad enough, but as business moved almost entirely into cyberspace during lockdown, even more data is being generated and organisations must continue to collect, organise and store it as the rules demand.

Organisations are increasingly aware that this job does not fall to the cloud provider. Providers do accept some responsibility for sensitive business and client information, but ultimately the company or organisation has responsibility for its own data protection.

Who is the stakeholder?

What’s making things even more opaque is there are multiple stakeholders worrying about data and compliance, raising the risk of confusion or inadvertent non-compliance.

Many companies may have a centralised security function, a separate chief data officer and a range of other individuals who have been trained in GDPR compliance. But, reasonably obviously, the database administrator often has the most detailed understanding of the company's data portfolio, and is therefore best placed to spot vulnerabilities or potential compliance issues as they arise.


As Alex Hollis, vice president of governance, risk and compliance services at SureCloud, says: “Organisations find themselves in the position of asking who is responsible for data governance and what they need to do. There are arguments for whether this task belongs to compliance, legal, IT or even finance; however, the oversight isn’t as important as the implementation.

“The identification of data owners, or custodians, who understand the nature of the data and the processes that surround it, is key. Companies should look to nominate the person with the best skill and position to oversee data governance, but ensure this is supported and pushed down onto those in the business closer to the problem.”

Discovery and location of data remain a major challenge to effective data governance. Many solutions rely on metadata and, with a wealth of third-party applications at even the biggest enterprise software suppliers, this metadata can be difficult to locate, organise or understand, let alone protect.

Database teams urgently need to get a grip on the metadata problem. It must be collected, scanned, understood and organised before it can be encrypted. The next step, which should be built into processes immediately afterwards, is to align the company's GDPR strategy with its cloud work.
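
As a rough illustration of that discovery step, the sketch below samples each column of a table and flags values that look like personal data, using email addresses as the only pattern for brevity. Real discovery tooling covers far more patterns and data sources.

```python
# Rough sketch of the discovery step: sample each column of a table
# and flag values that look like personal data. Email addresses are
# the only pattern checked here; real tools cover far more.
import re
import sqlite3

EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def scan_for_pii(db: sqlite3.Connection, table: str) -> dict:
    """Return {column: True/False} based on a small value sample."""
    cols = [row[1] for row in db.execute(f"PRAGMA table_info({table})")]
    findings = {}
    for col in cols:
        sample = db.execute(f"SELECT {col} FROM {table} LIMIT 100").fetchall()
        findings[col] = any(isinstance(v[0], str) and EMAIL_RE.search(v[0])
                            for v in sample)
    return findings

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE contacts (name TEXT, details TEXT)")
    db.execute("INSERT INTO contacts VALUES ('Ana', 'ana@example.com')")
    print(scan_for_pii(db, "contacts"))  # {'name': False, 'details': True}
```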

Ultimately, siloed processes owned by a variety of stakeholders are not the route to effective data governance. In a world where data is a valuable currency and its protection is mandated and highly scrutinised, the senior IT decision-maker must take ownership of data identification, protection and compliance across the whole enterprise.

Bridging the gap between process and innovation

Technology can liberate time-poor IT leaders from laborious repetitive tasks to concentrate on driving innovation and adding business value

The role of the chief information officer (CIO) often straddles the line between driving innovation and maintaining everyday business operations.

On the one hand, more CIOs are now being measured on their contribution to revenue growth. Indeed, IDC forecasts that by 2023, 65 per cent of CIOs will be “entrepreneurial leaders who evolve their organisations into centres of excellence that engineer enterprise-wide collaboration and innovation”.

Yet despite spending less time on the day-to-day management of IT than they did in previous years, CIOs are still tasked with “keeping the lights on” within their organisation.

So, how can CIOs keep things running, while building in time for innovation and creativity? One way is to lean into emerging technology to ease the pressure on their time.

“Digital transformation, creative strategy and disruptive initiatives all need head space, time away from the day-to-day operations of the IT function. With teams so time poor, this space is a rare commodity, so I will lean into any tool that is going to help free up time for my team to drive our digital transformation strategy,” says Steve O’Connor, director of IT at luxury sports car manufacturer Aston Martin Lagonda.

“For me, this means automation and machine-learning. Repetitive, low-value, day-to-day tasks are a real opportunity to unlock this time with automation.”

Mundane tasks stifle creativity

Similarly, marketing agency Kaizen recently undertook a review of its processes that identified how mundane tasks take up time and stifle creativity. It then set about addressing the problem.

Kaizen managing director Jeremy McDonald says: “Normally, it involves automating or streamlining a piece of work the team regularly delivers and improving the process with technology, which means we can free up more time to focus on more important things, such as client communications or planning for future campaigns.”

The CIO may also look to others within the organisation to share the load, particularly given the evolution of more specialist IT roles within the enterprise.

“I always think of a CIO as someone who’s juggling a lot of spinning plates. They know there are certain plates they cannot let drop, the ones that make sure the lights are kept on, but equally there are a few they don’t need to worry about as much. That’s where there’s space for creativity and innovation,” says Caroline Carruthers, chief executive of data literacy consultancy Carruthers and Jackson.


“Newer roles, such as the chief data officer, can further help CIOs by taking some of those ‘plates’ away from them and letting them focus on innovating in the right areas, while safeguarding critical legacy systems. Delegating areas of responsibility and fostering a sort of symbiotic relationship between CIOs and the newer IT roles are actually the key to ensuring innovation can continue to drive an organisation forward.”

Now more than ever, innovation is necessary to secure an advantage in a disruptive business landscape. Technology will enable CIOs and their time-poor teams to spend less time on the day-to-day processes that keep their organisations running and more time developing pioneering initiatives and tech-driven business strategies for the future.