Features

THE NEXT PHASE OF VIRTUALISATION

Updated: August 24, 2014, 04:45 pm
By Editor

The next leg of the journey is to port mission-critical workloads to virtualised infrastructure.

Customers are seeking to consolidate and automate their IT infrastructures to make their business processes and services more effective, and virtualisation has been a means to that end. To take the process further, virtualisation must extend to the domain of mission-critical business applications.

Historically, the main challenges in moving mission-critical applications to virtual infrastructures have been high availability, predictable performance and security. There are, however, solutions that help customers overcome these challenges by testing and validating infrastructure performance, as well as troubleshooting issues that arise in the test phase.

While virtualisation deployments in the region are no longer confined to commodity workloads, virtualisation of mission-critical workloads is still in its early stages, though on the rise. Most deployments continue to be limited to servers and systems.

Feras Abu Aladous, Manager, Systems Engineering, Services & Support at Brocade says, “Many enterprises in the region are targeting virtualization of up to 90% of their servers and systems to increase server utilization efficiency, mobility and orchestration agility. A reduced number of physical servers requires less space and cooling inside the datacenter, which leads to cost savings as well.”

Companies in the region are definitely starting to virtualise business-critical applications at a rapid rate, opines Gregg Petersen, Regional Director Middle East & SAARC at Veeam Software. He adds, “In some instances we are seeing banks deploying on a virtual workload; however, the Middle East region is in its infancy here compared to the USA and Europe. But there is definitely a huge move towards virtualisation of these core applications.”

Operational agility is a key reason driving businesses into the next phase of virtualisation. The next leg of the journey is to port the more difficult workloads to virtualised infrastructure.

“We believe most of the technical barriers to virtualization of these more difficult workloads have already been resolved, and it is already becoming mainstream to virtualize across the board. We are seeing different factors driving the virtualization of these database and “big iron” workloads though. In the first phase of virtualization customers were focused on consolidation of “easy workloads.” For this next phase, they are much more focused on operational agility and speed of time-to-market. So IT departments that virtualize all of their workloads tend to do so because they want to offer a cloud utility experience to their lines of business – this is no longer just about asset cost avoidance,” says Aaron White, General Manager, Middle East, North Africa and Turkey, Hitachi Data Systems.

Over the past few years, other hypervisors have gained ground, though VMware still dominates x86 server virtualisation infrastructure by far, with Microsoft a distant second. Hyper-V has steadily improved and added features to compete with VMware and gain market share, while other competitors, including Citrix’s XenServer, which carries no licensing charge, and the Red Hat Enterprise Virtualization Hypervisor (RHEV-H), hold smaller market shares.

The focus, however, has shifted from the hypervisor itself to innovations in the management layers above it.

Aaron says, “We have started to see them, but this tends to be within System Integrators and Service Providers’ landscape rather than in traditional internal enterprise IT. The reason for this is a “leveling-up” of the capabilities of the hypervisor. So whereas in the past there was clear white space between ESX, the leading hypervisor, and Hyper-V / OpenStack, we have now seen that the hypervisor layer is becoming commoditized over time, and customers are starting to focus more on the management layers above it. We tend to see customers introducing a “second hypervisor” because they are looking at leveraging the merits of a specific management ecosystem, rather than because the specific hypervisor brings differentiation.”

Enterprises have mostly stuck with one virtualisation vendor to keep operational expenses down, but that is beginning to change. Customers are looking for the best functionality, as some workloads perform better on one hypervisor than another.

Gregg says, “We are witnessing many cases where customers order Veeam Software for both Hyper-V and VMware in large quantities. This tells us that customers are utilising the best of both worlds. Generally, we see a lot of production environments on VMware and Hyper-V, and this is where Veeam plays a crucial role because we have the ability to provide the best of both worlds. In addition, we have also seen a large amount of deployment of business-critical applications on Hyper-V and VMware, once again proving that multi-hypervisor environments are becoming more common.”

Multi-hypervisor environments bring their own set of challenges, and some customers may therefore stick with a single hypervisor. For those willing to look at the best of the options, however, cross-platform management will be a reality to be taken in stride.

Aaron argues, “The downside of that is that managing two hypervisors increases your operational overhead – you need to have two sets of operational and integration processes.  This is why it mainly occurs within the System Integrator space – they have the organizational economies of scale that can make it cost effective.  We have also seen this drive an interest in converged platforms such as Hitachi’s Unified Compute Platform.  This platform takes care of all the workflow required to integrate and manage everything from the hardware elements up to the hypervisor.  As it supports both VMware and Hyper-V, it frees customers up to focus on business workflow “above the hypervisor” and is a key enabler for this new trend in Infrastructure as a Service.”

The datacenter

In the datacenter, server virtualisation has seen widespread adoption and continues to be the primary focus of virtualisation efforts. With the growing adoption of virtualised computing infrastructure, enterprises can run multiple virtual servers on the same equipment, reducing the demand for additional hardware. As a result, datacenters are becoming smarter and supporting more users than ever before while requiring less hardware to do so.
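The consolidation arithmetic behind that claim is straightforward. A minimal sketch, where the utilisation figures are illustrative assumptions rather than numbers from the article:

```python
import math

# Back-of-envelope server consolidation estimate. All figures here are
# illustrative assumptions, not numbers from the article.

def hosts_needed(n_workloads: int, avg_util: float, target_util: float) -> int:
    """Hosts required for n_workloads, each averaging avg_util of one
    host's capacity, while keeping every host at or below target_util."""
    per_host = int(target_util // avg_util)  # workloads that fit on one host
    return math.ceil(n_workloads / per_host)

# 100 lightly loaded physical servers (about 12% average utilisation),
# consolidated onto virtualisation hosts kept below 70% utilisation:
before = 100
after = hosts_needed(before, 0.12, 0.70)
print(f"{before} physical servers -> {after} virtualisation hosts")
```

Raising the acceptable per-host utilisation, or consolidating lighter workloads, pushes the ratio higher; in practice some headroom is also reserved for failover.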

Feras says, “Virtualization is the major trend inside the data centers; and it’s expected that by the end of 2014 around 70% of Server workload will be virtualized, but Storage and Network virtualization is still very low.”

Virtualisation in other domains is still growing at a modest pace and will be the focus of the second phase. Virtualised infrastructure offers the potential for higher productivity, scalability and manageability than traditional computing, which is what customers are pursuing as they allow virtualisation to reach deeper into their networks.

Aaron says, “We would consider server virtualization to be pretty well advanced – we’re about 90% there, and are just about tying up loose ends with it. People are now looking to technical innovations in network and storage virtualization, as well as the management ecosystem to provide the next level of agility and efficiency.”

Automation will be a key objective going forward in the datacenter. An effective virtualisation strategy lays the framework for automating more routine tasks, enabling greater efficiency with the available computing resources.

Gregg says, “There is a lot of buzz around automation in the datacentre at the moment. Over the last three years virtualisation has moved from 30 – 40% to 60 – 70%. Most of our customers are about 60% virtualised. As organisations become mature in virtualisation, automation will continue to become an important aspect for companies.”

There are bottlenecks along the way to wider adoption of virtualisation. For instance, there are some very mature technology offerings in the storage virtualisation space, but significant differences between individual vendor strategies are slowing down adoption, believes Aaron.

He comments, “Most vendors still sell separate storage virtualization appliances, whereas Hitachi has integrated storage virtualization capabilities into every enterprise storage array that we have sold for the last ten years.  And our customers have now leveraged this during several transition events to virtualize legacy assets and accelerate transformation.  So we believe pretty strongly that Storage Virtualization just needs to be an integrated feature of every storage controller in order to gain broader adoption.”

He also believes that applications are still too tightly coupled to physical storage assets and physical locations.

Aaron adds, “This is the next wave of value-add from Storage Virtualization, and it is why we have invested in Storage Virtualization OS, which will enable us to run a common storage virtualization platform across all of our storage hardware containers.  We also believe that customers will move to active/active architectures and have introduced distributed Virtual Storage Machines which can live forever, delivering zero downtime during site disaster recovery and non-disruptive technology refresh.”

On the other hand, SDN (software-defined networking) has so far seen very limited adoption, and there are many competing standards at the transport virtualisation and management control layers. There are also strong requirements for a more integrated approach between the management frameworks for SDN and network function virtualisation (NFV).

“We believe that the benefits in terms of total cost to provision new services are so substantial that we will see very active investment for these use cases in the short term within the SI and Service Provider space. Once this technology is proven in this arena it will become more common in traditional enterprise IT,” he adds.

SMB adoption

Virtualisation technologies suit growing SMBs as well as already large customers. In fact, virtualisation helps dynamically scale up the capabilities of a business’s IT infrastructure, a trend seen among companies looking to add more virtual servers.

Gregg says, “SMB clients are definitely adopting virtualisation and we can see this with our customers. We are seeing companies with up to 10 servers recognising the need to virtualise rather than deploying physical machines. It’s a trend that is rapidly increasing, and more and more customers are realising the value in adopting virtualisation.”

There is a better understanding among SMBs that virtualisation will help them not only manage the cost of upgrading IT infrastructure but also improve critical tasks such as data backup and high availability.

Feras says, “The SMB market is considering virtualization more than ever. SMBs are trying to get more performance and flexibility out of their existing server resources, given the rising cost of managing physical data centers. Server virtualization improves disaster recovery and high availability by allowing administrators to take VM snapshots and recover from existing snapshots, and it reduces the number of servers required to implement solutions.”

However, SMBs have limited exposure to virtualisation benefits within their networks, largely confined to server consolidation. Enterprises, on the other hand, are more focused on optimising the benefits across the stack and therefore adopt innovations faster.

Aaron says, “We’ve seen large enterprises that are much more virtualized than SMBs. In fact, the most virtual organizations are the ones who are process-driven, focused, and committed to achieving efficiencies as a primary goal. Those who lag behind in the virtualization landscape the most are those who have taken some individual areas of the business, and run the IT departments for those sectors almost as pet projects.  In addition, the SMB space tends to focus on Server Virtualization whereas SI’s and large enterprises have clear additional use cases that they can enable by deploying network and storage virtualization layers as well.”

Server virtualisation will remain the key driver of the virtualisation market, but customers who have already been through a first phase of virtualisation will have the opportunity to tap into other innovations from the industry.

Aaron says, “Servers, and to a certain extent storage virtualization, have seen growth. For networking it’s a bit of a different story – it’s not necessarily about consolidation but about programmability. Networking efficiency comes from bringing complex systems into a virtual, manageable space, and alleviating IT departments’ dependency on hardware segmentation and re-config.”

With virtualisation, the utilisation rates of installed hardware jump significantly, and the ROI benefits are seen in a short time.

Gregg says, “With server virtualisation the benefits are seen instantly. Storage and network virtualisation are yet to gain traction in this region compared to other regions. But it’s just a matter of time before network and storage virtualisation take off, because they are cost-effective and far more efficient.”

While most virtualised workloads are either server or desktop workloads, there is increased interest in virtualising network and application delivery as well.

Feras says, “Server virtualization is the major trend in data center virtualization and very mature; on the other hand, network and storage virtualization is still under evaluation by enterprises, and not widely implemented.”

In summary, some applications remain ring-fenced from virtualisation, and while that percentage may come down each year, the scope for virtualisation remains high even outside of them. So while there is huge variance among customers, the number that have virtualised the entire stack of server, storage and network is increasing rapidly. Further, while virtualisation may not be seen as essential before adopting cloud computing, cloud deployments in virtual environments deliver optimised results, and are hence a key consideration for businesses looking to adopt cloud services.

Establishing data sovereignty in a ‘datafied’ world

By: Omar Akar, Regional Vice President for Middle East & Emerging Africa, Pure Storage

Data is the currency of the digital domain, and with every passing day, the world is getting increasingly ‘datafied’. Billions of gigabytes of digital data pertaining to citizens, businesses, governments, and institutions are generated, collected, and processed every day. Understandably, there are concerns about how we can protect personal data, business data, as well as sensitive data that has implications for national security.

Challenges associated with data sovereignty

It is possible that a company based in one country uses cloud infrastructure from a provider abroad, and that the cloud provider also has customers in other countries and regions. If data collection, data storage, and data processing happen in different countries, the data will be subject to the data sovereignty rules of all of them. Many of the concerns surrounding data sovereignty pertain to ensuring data privacy and ensuring that data stored abroad does not fall foul of the laws of the country where it resides. Many countries have therefore introduced new laws, or modified existing ones, so that data is kept within the boundaries of the country where the individual or entity is based. However, verifying that data indeed exists only at permitted locations can be very difficult.

On the other hand, storing huge amounts of data at only a few locations can increase the risk of data loss and data theft through cyberattacks, which can have huge ramifications on the financial health and reputation of businesses.

Moreover, data sovereignty makes it complex to share data across international borders. This can increase costs and inefficiencies for any business that operates across multiple countries and requires data to flow between its offices. Such businesses must now establish infrastructure in local data centers to comply with each country’s data protection regulations, and must keep in view each country’s data sovereignty requirements and international data-sharing agreements when sharing data, which can impact business operations.

Ways to ensure data sovereignty and elevate data performance

Although establishing data sovereignty is undoubtedly challenging, there are some best practices and approaches that can help in achieving it and elevating data performance. Organizations should conduct a comprehensive audit of their data, including where it is stored, processed, and shared. This is the first step in identifying potential data sovereignty risks and ensuring compliance with the relevant laws and regulations of the concerned countries. It is also necessary to adopt data protection measures — such as encryption, access controls, and monitoring — to prevent unauthorized access and use of data, whether it is in transit or at rest.
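The audit step above lends itself to automation. A minimal sketch of a residency check, in which the record fields, data classes and region names are all hypothetical:

```python
# Minimal data-residency audit: flag records stored outside the regions
# permitted for their data class. Field names, classes and the policy
# table are illustrative assumptions, not a real regulatory mapping.

ALLOWED_REGIONS = {
    "personal": {"uae", "ksa"},        # sensitive data: must stay in-region
    "telemetry": {"uae", "ksa", "eu"}, # less sensitive: wider residency
}

def residency_violations(inventory):
    """Return (dataset, region) pairs stored outside their allowed regions."""
    return [(d["name"], d["region"])
            for d in inventory
            if d["region"] not in ALLOWED_REGIONS[d["class"]]]

inventory = [
    {"name": "crm_contacts", "class": "personal", "region": "uae"},
    {"name": "app_metrics", "class": "telemetry", "region": "eu"},
    {"name": "kyc_scans", "class": "personal", "region": "us-east"},
]

for name, region in residency_violations(inventory):
    print(f"VIOLATION: {name} stored in {region}")
```

In practice the inventory would come from discovery tooling and the policy table from legal review; the point is that “where may this class of data live?” can be expressed as data and checked continuously rather than audited once.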

The company’s data protection policy should define protocols for handling and storing data, as well as measures for protecting it. This policy should be regularly reviewed and updated to keep up with changes in data protection laws and regulations. If an organization has a footprint spanning multiple regions, it is a good idea to take the strongest data sovereignty requirements among them and implement those across all regions. Cloud providers can be of assistance in this regard.

Benefits of working with cloud service providers

Most cloud providers have data centers in multiple countries. Organizations should go for a provider whose data residency provisions are aligned with their own data sovereignty requirements. Today, leading cloud providers also offer other features, including data encryption, that can help in achieving data sovereignty. To take it one step further, companies must introduce strict data governance processes in the cloud. This will ensure regulatory compliance, risk assessment, and risk mitigation at all times.

Data sovereignty laws apply not only to data but also to data backups. It is therefore important to understand how your organization backs up information — whether it is done on-premises or using dedicated cloud services or public cloud services. Adopting cloud-ready solutions and leveraging the benefits of all-flash storage is one of the ways to future-proof your organization’s data storage infrastructure. Uncomplicating storage will help in reimagining data experiences and powering the digital future of the business.

Finally, it is important to view data sovereignty holistically, and not as the exclusive responsibility of any one individual or team. The need to comply with data regulations extends across the board, from businesses to suppliers to the end-users. From a business perspective, ensuring data sovereignty calls for robust governance, holistic risk management, and concerted efforts on the part of the IT security, legal department, procurement, risk managers, and auditors — under the guidance and supervision of the company’s Chief Information Officer. It is a good way to build digital trust in today’s business environment.

HOW FSI INCUMBENTS CAN STAY RELEVANT THROUGH THE GCC’S PAYMENTS EVOLUTION

By Luka Celic, Head of Payments Architecture – MENA, Endava

Banks and payment services providers (PSPs) have been the region’s engines of economic growth for as long as anyone can remember. It is therefore jarring to imagine that this dominance is now under threat. After all, venerable banks and credit card companies have elegantly embraced the Internet, mobile banking, and the cloud to deliver self-service banking to millions of customers. But consumers, especially digital natives, have never been known for congratulating an industry for a job well done. Instead, with each convenience, their expectations only grow. The siege reality of the pandemic accelerated a shift in consumer behaviour, and Middle East banks and PSPs now face challenges on three fronts.

The first is FinTechs. From Saudi Arabia’s BNPL (buy now, pay later) pioneer Tamara and Qatar’s unbanked-oriented platform cwallet to online financial services such as Klarna, tech startups have been able to tap into rapidly changing consumer markets. New companies find it easier to pivot. And like speedboats racing against aircraft carriers, they have weaved effortlessly to fulfil a range of desires amid high smartphone connectivity rates and a range of other favourable market conditions. By one estimate from 2022, BNPL alone accounted for US$1.5 billion (or 4%) of the Middle East and Africa’s online retail market.
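That estimate also implies the size of the overall market, which is worth backing out as a sanity check:

```python
# Implied size of the MEA online retail market, if BNPL's US$1.5bn
# really is 4% of it (figures as quoted in the 2022 estimate above).
bnpl_volume = 1.5e9   # US$1.5 billion of BNPL volume
bnpl_share = 0.04     # 4% of the total market
total_market = bnpl_volume / bnpl_share
print(f"Implied MEA online retail market: US${total_market / 1e9:.1f}bn")
# -> Implied MEA online retail market: US$37.5bn
```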

The second threat is open banking, which comes in many forms, but one example is the instant-payments platforms being introduced by central banks such as those in Saudi Arabia and the United Arab Emirates. To get a sense of how this could play out, we need only look to Europe, where players who once relied on payments through card schemes are now pivoting towards open banking enabled payments. Closer to home, Al Ansari Exchange recently announced its customers can now transfer money and settle bills via the recipient’s mobile number, enabled by the UAE’s Aani IPP.

And finally, comes big tech. To augment its e-wallet service, Apple has signed up to an open banking service in the UK. The open banking framework which banks enabled through their investments is being exploited by a Big Tech firm that has access to 34% of UK smartphone users. Unsurprisingly, this sparked a fierce antitrust complaint by UK’s banks. Other big names will surely follow as they continue to craft ways of offering the digital experiences that garnered them user loyalty in the first place.

THE BALANCE

Apple Wallet is aimed at blending payment methods, loyalty cards, and other services into a single experience. But such moves have raised regulators’ eyebrows regarding a lack of interoperability and the preservation of competitive markets. Hence, Apple’s open banking foray — a gesture to calm the nerves of a finance market that fears having to compete with a company armed with countless millions of user transactions from which to draw insights. The massive user bases of tech giants will give any FSI CEO goosebumps. How does a traditional bank lure an Apple user? Open banking initiatives open the door to greater competition and innovation, both of which are good for consumers. But the only way to ensure both is by building an ecosystem that balances innovation with regulatory oversight.

FROM INCUMBENT TO INNOVATOR

Yes, smaller businesses have freedom of movement that larger incumbents do not. But that does not mean that there are no paths for banks and PSPs. There are, in fact, several strategies that larger FSI companies can employ to capitalise on the open banking revolution.

The first of these is collaborating to create ecosystems that provide users with frictionless experiences. Established FSIs already have access to a wealth of information about their customers and must now consider how to integrate data sources to create highly streamlined workflows. A customer applying for a loan could then see their details auto-populated and their credit history already accounted for, all without the hassle of lengthy phone calls, application forms, or submission requests. In an age when instant is everything, it’s easy to see why the streamlined approach could foster loyalty, while the slower one would only drive customers towards more capable competitors.

Card companies and issuer banks could also work with acquirers to smooth out the rough landscape that has arisen from the advent of digital payments. Acquirers traditionally acted on behalf of the merchants that accepted payment methods to recoup funds from the PSP through the issuing bank. This system has served the industry well, but with more payment methods emerging, acquirers have branched out into mobile wallets, QR codes, and gateway services. Gradually, the relevance of established players has dwindled as they have lost representation at this critical checkpoint. Incumbents must work to turn back the tide by recognising that acceptance, and acceptance ownership, are becoming increasingly important for maintaining market relevance.

Another strategy is diversification. Veteran FSIs may feel like they’ve lost ground to nimble start-ups and neobanks, but history shows the value of patience: established FSI players can now benefit from the investments of early innovators and double down on the payments innovations that have already shown the most promise. Moreover, if they diversify their portfolios through acquisitions, innovations, and partnerships, they can secure their future. Mastercard presents an excellent example with its US$200m investment in MTN’s fintech business. This single move has given the company access to MTN’s 290-million-strong subscriber base, allowing these customers to become familiar with Mastercard products before getting entrenched with mobile wallet alternatives.

WHO’S ON TOP?

If we look at the rise of BNPL services, we see an origin story with — at least — major supporting roles for large card providers. But open banking has sidelined them in just a few years. BlackBerry was a stock market darling just five years before it sought a buyer. Traditional FSI players must innovate; they must collaborate with emerging disruptors; they must diversify. They can survive and thrive if they do these things — after all, they already have much of the infrastructure, and experience required for success. Middle East banks and PSPs have the existing user bases, so they have the scale to get out in front in the era of open banking. All they lack is the kind of compelling use cases that will entice the banking public. PSPs and their issuers could offer embedded payments, for example. The right services at the right time will be warmly received by consumers, no matter the scale of the offering institution, so there is every reason to believe that incumbents will come out on top against FinTech and Big Tech.

SEC paves way to approve spot ethereum ETFs

By Simon Peters, Crypto Analyst at eToro

Ethereum spot ETFs took a significant step toward being available to US investors last week with the approval of their 19b-4 applications, allowing US exchanges (namely Cboe BZX, NYSE Arca and Nasdaq) to list and trade ethereum spot ETFs.

On the back of this, ethereum has been one of the best performing cryptoassets this week, gaining 19%.

According to a recent eToro survey of retail investors in the UAE, over 74% of respondents agreed that the prospect of an ethereum ETF will significantly influence their decision to increase, decrease or maintain their current ethereum allocation.

Focus now turns to the S-1 registration statements from the ETF issuers, as these still need to be approved by the SEC before the ethereum spot ETFs can actually launch and investors can buy them.

As to when the S-1s will be approved, we have to wait and see. It could be weeks or months, unfortunately.

Nevertheless, with the 19b-4s out of the way, it could be an opportunity now for savvy crypto investors to buy ethereum in anticipation of the S-1s being approved, frontrunning the ETFs going live and the billions of dollars potentially flowing into these.

We’ve seen what happened when the bitcoin spot ETFs went live, with the bitcoin price going to a new all-time high in the months after. Could the same happen with ethereum? The all-time high for ethereum is $4,870, set back in 2021. We’re currently at $3,650, roughly a 33% gain away.
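A quick check of that distance, using the prices quoted above. “X% away” can mean either the drawdown from the high or the gain needed to reclaim it, and the two figures differ:

```python
# How far is ether from its all-time high? Two readings of "X% away",
# using the prices quoted above ($3,650 now vs the $4,870 ATH of 2021).
ath, now = 4870.0, 3650.0
drawdown = (ath - now) / ath     # how far below the high the price sits
gain_needed = (ath - now) / now  # the rally required to reclaim the high
print(f"{drawdown:.1%} below the ATH; a {gain_needed:.1%} gain to reclaim it")
# -> 25.1% below the ATH; a 33.4% gain to reclaim it
```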

We’re also going into a macroeconomic climate with potentially looser financial conditions, i.e. interest rate cuts and a slowdown of quantitative tightening, conditions where risk assets such as crypto tend to perform well price-wise.

Copyright © 2023 | The Integrator