
Tech Interviews

 


Vertiv Outlines Data Center Evolution and AI Infrastructure Strategy

Exclusive Interview with Peter Lambrecht, Vice President Sales, EMEA, Vertiv

What specific challenges do clients face in powering and cooling AI infrastructure, and how did you address these at GITEX this year?

At GITEX this year, the focus on artificial intelligence (AI) was unmistakable. Every booth showcased AI-driven solutions running on GPU-powered servers from leading companies like NVIDIA. However, for AI applications to function effectively, the right infrastructure is essential to power and cool these GPUs, ensuring smooth and efficient performance. For us, this highlights our expertise in AI infrastructure, designed to support these platforms optimally.

One of our key focus areas is liquid cooling, as traditional air cooling in data centers is no longer sufficient. With rack densities now reaching 50, 60, and even up to 132 kilowatts per rack, air cooling alone cannot handle the thermal load. Liquid cooling has become critical, efficiently drawing heat away and directing it elsewhere. This core technology, developed with our partner NVIDIA, supports the deployment of GPUs worldwide, and together we provide, market, and deliver these advanced solutions to our clients.

Cooling, however, is only one part of the equation. The shift to AI also requires a comprehensive approach to power management, as AI workloads significantly alter electricity load patterns. Our solutions are designed to meet these power demands, and we have showcased these capabilities at our booth. We’ve been actively engaging with partners and clients to address these challenges as they implement their AI solutions, ensuring that both cooling and power needs are effectively met.

Can you elaborate on your partnership with NVIDIA and how it has evolved over the years?

Our partnership with NVIDIA has grown significantly over the years, and over the past year, it has reached an unprecedented level of collaboration. Both of our CEOs, Jensen Huang of NVIDIA and Giordano Albertazzi of Vertiv, communicate regularly, aligning closely on joint development initiatives. Together, we create environments optimized for NVIDIA’s cutting-edge chips, ensuring they have the necessary infrastructure to operate at peak performance.

In a recent joint announcement, both CEOs unveiled new solutions that integrate NVIDIA GPUs with Vertiv’s advanced infrastructure, designed to maximize efficiency and reliability. This collaboration represents the core strength of our partnership, where we have refined reference designs that allow NVIDIA to deploy their GPUs seamlessly and effectively on a global scale.

What current trends do you see in the data center and critical infrastructure market? With many hyperscalers entering the market, what is your perspective on this development?

The data center market is dominated by hyperscalers, whether through their direct deployments or co-location facilities—two primary models for quickly scaling data center capacity. These hyperscalers are making substantial investments in infrastructure, fueling competition in this fast-evolving landscape.

The AI race is fully underway, with industry giants all striving toward the same goal. As their infrastructure partner, we are advancing with them, providing the essential support they need to drive this innovation. At the same time, the growth of the cloud sector remains foundational and continues to expand robustly. What we are seeing now is a dual growth trajectory: traditional cloud business growth compounded by the accelerated demand for AI infrastructure.

Trends in data centers reveal a marked increase in power consumption. A few years ago, a five-megawatt data center was considered significant; soon, however, a five-megawatt capacity will fit within a 10×10-meter room as rack density skyrockets, reducing the need for extensive white space but requiring expanded infrastructure areas. Data centers are scaling up to unprecedented sizes, with discussions now involving capacities of 300-400 megawatts, or even gigawatts. Visualizing a gigawatt-sized facility is challenging, yet that is the direction the industry is moving—toward ultra-dense, compacted facilities where every element is intensified, driving an enormous need for power.

As data centers continue to grow, how do you view the sustainability aspect associated with these large facilities?

Today, nearly everyone has a mobile phone in hand, yet data centers—the backbone of our digital lives—often face criticism despite being indispensable to modern society. Data centers are not disappearing; on the contrary, they are set to expand as the pace of digitalization accelerates globally. Power generation remains, and will continue to be, a critical challenge, particularly in regions where resources are limited.

Currently, we see significant advancements in AI infrastructure across Northern Europe. Countries like Sweden, Finland, and Norway benefit from ample hydropower and renewable energy sources, making them well-suited for sustainable AI development. Meanwhile, the Middle East is experiencing a technology boom, backed by its rich energy resources and favorable conditions for large-scale investment in data centers.

There’s also a rising trend toward on-site power generation, with organizations increasingly considering dedicated power stations and micro-grids that tap into renewable or alternative energy sources. In the U.S., for instance, discussions are underway about small, mobile nuclear reactors to support local power needs. Finding sustainable power solutions has become imperative. A sudden surge in electric vehicle usage, for instance, could stress current power supplies dramatically, underscoring the need for substantial changes in our energy landscape.

What support and services does Vertiv offer to its customers? What types of infrastructure investments is Vertiv making in various parts of the world?

At Vertiv, service is arguably the most critical aspect of what we do. Selling a product is only one phase of its lifecycle; the real value lies in our ability to maintain and support that product for the next 10 to 15 years. The availability of highly skilled labor and expertise is essential, as we know that issues will inevitably arise. This is why resilience is a cornerstone of data centers. A strong service infrastructure is vital for addressing challenges promptly when they occur. Just as a car will eventually break down, data center systems too will face difficulties over time. At Vertiv, we have developed an exceptional service framework to ensure that we are always prepared to support our clients.

My philosophy is simple: we don’t sell a product unless we can guarantee the service to support it. When you sell a solution, you’re essentially selling a potential future problem, so ensuring your service capabilities are in place is vital for sustainable growth. This commitment to service is one of our key differentiators in the market.

We are continuously enhancing our core production facilities and making ongoing investments in engineering, research, and development. We are also evaluating our global footprint to optimize production capabilities and meet the growing demand. We are not just focused on the immediate needs of today; we are preparing for the demands that will arise in the next six months, a year, or even two to three years. In this race for capacity, the winner will be the one best positioned with the most scalable and resilient capacity.

What is the future of cooling systems in relation to AI chips, and where do you see this race heading?

We are not moving away from air cooling entirely; it will always play a role in the equation, and fully eliminating it would be prohibitively costly. Nevertheless, the transition to liquid cooling is a critical step forward. We are already seeing advancements in the liquids used to cool servers, enabling them to absorb higher levels of heat. However, the primary challenge will be addressing the overall densification of data center systems, as more powerful and compact solutions require innovative approaches to heat management.

Our partnerships with industry leaders like NVIDIA and Intel are essential, as they provide us with invaluable, first-hand insights into the development of cutting-edge chips and GPUs. While cooling and power systems might seem straightforward, AI introduces a new layer of complexity that demands forward-thinking solutions. To meet these challenges, we are making significant investments in research and development to support the AI-driven data centers of today and tomorrow. Our commitment to continuous innovation ensures that we remain at the forefront of these critical advancements.

Tech Interviews

Why TWO99 is Rethinking Cloud Marketing with Compliance, Data, and Agility


How does TWO99 ensure its cloud security solutions remain compliant with evolving international data privacy laws like GDPR and HIPAA?

TWO99’s cloud-native security solutions—including CNAPP, CWPP, and CSPM—are engineered to align with evolving international data privacy regulations such as GDPR and HIPAA through a proactive, multi-layered compliance framework. Our platforms integrate automated policy enforcement and real-time security posture monitoring to enable rapid detection and remediation of compliance deviations.

As an ISO 27001:2022 certified organization, we maintain a robust Information Security Management System (ISMS) that embeds data protection, risk assessments, and regular audits into our core operations. In parallel, our ISO 9001:2015 certification underscores our commitment to rigorous quality management processes, allowing agile adaptation to global regulatory shifts.

Our compliance team continuously monitors international privacy standards, translating insights into operational controls and product enhancements. We enforce stringent data storage protocols, including encryption of data at rest and in transit using advanced cryptographic methods, along with secure key management and policy-driven data retention and deletion.

Regular internal evaluations and third-party audits further validate our security posture, ensuring that TWO99’s offerings not only meet but consistently exceed global data protection standards.

Given your background with WPP and GroupM, how has your approach to digital transformation changed since founding TWO99?

My experience with WPP and GroupM provided invaluable insights into how large-scale organizations operate—especially in terms of process, structure, and scalability. However, founding TWO99 marked a deliberate shift toward a more agile, innovation-driven approach to digital transformation. At TWO99, we focus on vertical-agnostic scalability, bringing together technology, creativity, and performance under a unified, adaptable framework.

Unlike traditional holding companies that often operate within rigid silos, our model emphasizes speed, flexibility, and integration. We’ve built an ecosystem that allows us to pivot quickly, test rapidly, and deploy solutions that are customized to the dynamic needs of each client. This is especially critical in emerging markets like India, where consumer behaviors and platform trends evolve at breakneck speed.

Our approach moves away from isolated service offerings and instead delivers end-to-end growth strategies—from brand storytelling to performance marketing—under one roof. This integrated engine not only accelerates ROI but also empowers clients to scale more efficiently across diverse industries and geographies.

Ultimately, digital transformation at TWO99 is not about adopting new tools; it’s about building a mindset of experimentation, collaboration, and continual evolution—something that’s only possible when tech, creative, and media are not just coexisting, but co-creating.

You speak a lot about growth marketing—what’s one underused strategy or tool you believe more startups should adopt?

One of the most underutilized yet high-impact strategies in growth marketing today is predictive audience modeling—specifically using first-party data to anticipate user behavior before a customer even shows active intent. In the rush to acquire users, many startups focus heavily on performance spend and surface-level targeting, often missing the opportunity to build smarter, more efficient pipelines through data-driven foresight.

By leveraging tools like AI-powered lookalike modeling or Google’s AutoML, companies can identify emerging patterns and preemptively segment high-intent audiences. These platforms analyze behavioral signals—ranging from product interactions and website heatmaps to backend signals like GitHub commits or CRM workflows—to spot trends that traditional analytics would miss.

At TWO99, we’ve seen transformative results with this approach. For instance, by layering multiple intent signals (e.g., developer activity, trial-to-paid movement, sales pipeline stages) and combining them with Dynamic Creative Optimization (DCO), we helped a SaaS client reduce their Customer Acquisition Cost (CAC) by over 30%. This wasn’t just about targeting more people—it was about targeting the right people, at the right time, with the right message.
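As a rough illustration of the layered-signal idea described above (this is not TWO99’s actual pipeline — all signal names, weights, and thresholds here are hypothetical), scoring prospects by blending several intent signals into one ranked audience might look like this sketch:

```python
# Hypothetical sketch of layered intent scoring for audience segmentation.
# Signal names, weights, and the threshold are illustrative only.
from dataclasses import dataclass

@dataclass
class Prospect:
    name: str
    developer_activity: float   # e.g. normalized product/API usage, 0..1
    trial_engagement: float     # e.g. trial-to-paid movement signal, 0..1
    pipeline_stage: int         # e.g. 0 = raw lead ... 3 = negotiation

# Illustrative weights for blending signals into one intent score.
WEIGHTS = {"developer_activity": 0.4, "trial_engagement": 0.4, "pipeline_stage": 0.2}
MAX_STAGE = 3

def intent_score(p: Prospect) -> float:
    """Weighted blend of behavioral signals, scaled to 0..1."""
    return (WEIGHTS["developer_activity"] * p.developer_activity
            + WEIGHTS["trial_engagement"] * p.trial_engagement
            + WEIGHTS["pipeline_stage"] * p.pipeline_stage / MAX_STAGE)

def high_intent(prospects, threshold=0.6):
    """Return prospects worth prioritized spend, highest score first."""
    scored = sorted(((intent_score(p), p) for p in prospects),
                    key=lambda sp: sp[0], reverse=True)
    return [p for s, p in scored if s >= threshold]

prospects = [
    Prospect("acme", developer_activity=0.9, trial_engagement=0.8, pipeline_stage=2),
    Prospect("globex", developer_activity=0.1, trial_engagement=0.2, pipeline_stage=0),
]
print([p.name for p in high_intent(prospects)])  # only the high-scoring prospect remains
```

In practice the weights would be learned from historical conversion data rather than set by hand, and the resulting segment would feed ad platforms or DCO tooling rather than a print statement.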

Startups that embrace predictive modeling early in their growth journey can shift from reactive marketing to proactive growth engineering, ultimately driving better ROI, faster time to conversion, and more sustainable customer relationships.

At TWO99, how do you balance creative innovation with data-driven performance when leading campaigns for tech and cloud-based clients?

At TWO99, we treat data as the creative brief—a philosophy that helps us seamlessly bridge creative storytelling with performance marketing, especially for tech and cloud-based clients. Rather than starting with assumptions or generic messaging, we begin with behavioral analytics and first-party data to uncover real pain points, usage patterns, and moments of friction within the user journey.

This insight-driven approach allows us to craft narratives that aren’t just imaginative, but deeply relevant and conversion-focused. For example, if product analytics show a drop-off at the integration stage, our creative strategy might revolve around simplifying technical complexity or highlighting seamless onboarding. In this way, the campaign’s message is directly informed by what users are experiencing, not just what the brand wants to say.

We also continuously A/B test creative iterations—from copy to visual formats—to fine-tune performance in real time. For tech and cloud clients, where the buyer journey is often complex and multi-touch, this balance of data and creativity ensures that each piece of content not only captures attention but drives measurable outcomes like engagement, sign-ups, or qualified leads.
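To illustrate the kind of statistics behind A/B testing creative iterations (a generic sketch, not TWO99’s tooling — the conversion counts below are made up), a two-proportion z-test is a common way to check whether one variant genuinely outperforms another:

```python
# Minimal two-proportion z-test for comparing conversion rates of two
# creative variants. Numbers are illustrative, not real campaign data.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing variant conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant A: 120 conversions from 4000 impressions; variant B: 165 from 4000.
z, p = two_proportion_z(120, 4000, 165, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the difference between variants is unlikely to be noise, which is what justifies shifting spend to the winner in real time.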

In short, we don’t see data and creativity as separate tracks. At TWO99, one fuels the other—creating high-performance campaigns that are not only intelligent but emotionally resonant.


Cover Story

From Insight to Impact: Qlik’s Vision for the Future of Data and AI


As data becomes the lifeblood of modern organizations, Qlik’s agnostic platform—spanning data integration, quality, analytics, and AI/ML—enables enterprises to make smarter, faster decisions, no matter where their data resides. We spoke with James Fisher, Chief Strategy Officer at Qlik, about how the company is helping businesses unlock AI’s full potential globally, while deepening its commitment to the UAE with the launch of Qlik Cloud on AWS, bringing speed, sovereignty, and strategic growth to the region.

You’ve been in the industry for over 30 years. What is it about this moment in data and AI that really excites you, especially here in the UAE?

I’ve been in the industry for over 30 years, started out at PwC, spent nearly a decade at SAP, and then joined Qlik. Honestly, I’ve never seen a more exciting time for data, analytics, and AI. The pace of innovation is incredible, and what really stands out is how committed governments and organizations—especially in the UAE—are to tapping into AI’s full potential. From big public initiatives to a thriving startup scene, there’s real momentum, and Qlik is right at the center of it all.

You wear many hats at Qlik—what does a typical day look like for you?

My role is about making sure our customers get real value from their data—using analytics and AI to drive meaningful outcomes. As Chief Strategy Officer, I help shape Qlik’s vision, build the right partnerships, and steer our growth strategy across areas like products, M&A, and even broader priorities like sustainability and diversity. I also spend a lot of time keeping up with market trends. It’s a busy role, but what I enjoy most is how cross-functional it is—it lets me connect innovation with real business needs.

Qlik’s mission is to help customers ‘do data differently.’ Can you unpack what that means in practice—and how your strategy role helps turn that vision into impact?

Our mission is to help customers get the most out of their data to tackle big, real-world challenges. When we say, ‘do data differently,’ we mean rethinking how data—structured or unstructured—is accessed and used, wherever it lives. On the strategy side, I work closely with our product teams to stay ahead of market trends, and with our services and customer success teams to ensure we’re truly partnering with clients. And that collaboration mindset is something I truly value—it makes Qlik feel like a trusted advisor, not just a vendor.

How do you ensure that your global capabilities align with the specific needs of markets like the UAE?

For Qlik, adaptability is key. We really focus on listening—whether it’s to our customers, partners, or the broader market. In the UAE, we’ve been active for years and have a solid customer base, which gives us valuable insights into regional needs. We also set up a global AI Council, including a member based here, who works closely with local companies and government entities. This helps us adapt our global strengths to deliver real, local value. It’s not just about the product—it’s about how we approach the market, offer training, and engage with the community to reflect local nuances.

With major shifts happening in analytics and data integration, particularly the surge in AI investments in regions like the UAE, how has Qlik evolved its technology to stay ahead of the curve?

Qlik has been in this space for over 30 years. From the beginning, it was about helping customers turn questions into insights by pairing human curiosity with machine intelligence. That is essentially an AI problem—and it has been core to our platform since day one. Over time, we’ve evolved our technology to be cloud-native on AWS, expanded our data integration capabilities, and made it easier for everyone in an organization, not just analysts, to access insights and take action. In markets like the UAE, where AI investment is booming, we’re in a great position to support that momentum. A key part of our strategy has been democratizing data access so that teams across all departments—sales, marketing, HR, finance, and more—can leverage it.

How does Qlik ensure it supports each organization’s unique data and AI journey without disrupting what’s already in place?

Our approach is simple—we’re here to enable, not disrupt. Every organization is on a different path with different needs, whether that’s industry-specific, company size, or their own pace of digital adoption. That’s why flexibility and openness are a big part of how we operate. We don’t ask customers to start over—we work with what they already have and help them get more out of it. A great example is in highly regulated sectors, where security and hybrid environments are a must. We meet them where they are, and build from there. That’s how we earn trust and become long-term partners.

Organizations are juggling data across cloud, on-prem, and even the edge—how does Qlik stay flexible enough to handle all that?

Data today is vast and varied. For years, most analytics solutions focused only on structured data, but about 80% of an organization’s data is unstructured. At Qlik, we do both. Our platform allows us to reach data wherever it is—in cloud apps, in on-prem systems, or even at the edge. That openness allows us to solve real-world problems, not just run analytics in silos. And we are not limited by any single cloud provider. Customers can choose what works best for them—AWS, Azure, Google Cloud—and we will be right there with them.

As someone steering strategy at a data and AI company, how do you personally approach using data to guide big decisions—and what advice would you give to leaders trying to cut through the AI hype?

As Chief Strategy Officer, I see data as the lifeblood of any organization; it’s behind every smart decision we make. At Qlik, we use our own analytics and AI technology to shape our strategies. We build a team with the capability to understand what problems we want to address, the right use cases we want to drive, and what data we need to get there—and then make sure we can actually access that data. My biggest piece of advice? Don’t get swept up in the AI hype. Start with a real business problem. Then ask: what data will help me solve this, and how do I get it? Strategy isn’t something you set and forget. The best strategy leaders are the ones who stay humble, curious, and ready to pivot.

What recent launch or initiative is Qlik most excited about in the UAE?

We are announcing the availability of Qlik Cloud on AWS in the UAE. This is a big milestone. It enables innovation and AI-driven value while ensuring robust data sovereignty—customer data stays within the country’s borders. It also improves performance and reduces latency for applications running locally. This is a major step in helping our clients generate real-time insights and act on them with confidence.

What long-term plans does Qlik have for the region?

We are here for the long run. We are often asked, “Where will Qlik be in five years?” And the answer is simple, we will be right here, growing with our customers. We are expanding our partner network, building up our regional team, and investing in training and enablement. We have only just scratched the surface of what AI can offer, and we are committed to helping customers realize that potential through long-term partnerships. As the region continues its digital transformation, we want to be a steady, trusted enabler—helping organizations turn complexity into clarity.


Tech Interviews

Red Hat Summit Connect 2025: A discussion with Ed Hoppitt and Adrian Pickering


Exclusive Interview with Ed Hoppitt, EMEA Director – Value Strategy, App and Cloud Platforms at Red Hat & Adrian Pickering, Regional General Manager, MENA & Enterprise Segment Lead for CEMEA at Red Hat  

Having worked in global telecom and advised some of the world’s largest enterprises, how have these experiences shaped your approach to developing IT solutions?
Ed: I believe that designing and running operational IT over many years gives one a deep understanding of what truly matters to a customer, especially those partnering with Red Hat. This background enables me to connect with our customers on a level where they feel understood regarding their pain points. Today’s biggest challenge for enterprise IT is building systems that are predictable, replicable, and standardized—yet able to scale effectively.

When you look at what Red Hat offers and how we help enterprises build these solutions, our focus is rooted in leveraging the open source community. We invest in projects that we know will create tremendous value for our enterprise customers, taking those projects upstream, incorporating them into the Red Hat portfolio, and industrializing them into platforms such as OpenShift. Customers choose platforms like OpenShift because they represent best-of-breed choices, delivering stability, reliability, predictability, and scalability. With my operational IT background, I appreciate just how crucial these outcomes are for every customer I speak with.

This year’s Red Hat Summit focuses on curiosity and transforming acquired knowledge into practical application. Through this, what key message are you hoping to leave with the audience this year?
Adrian: I view curiosity as the foundation for working with our customers to truly understand their vision—where they want to be 18, 24, 30, or even 36 months down the line. It’s about gaining a clear grasp of the business challenges they face or the new markets they wish to serve in the future. We then align our best capabilities to support them along that journey, keeping cost efficiency in mind. This might involve modernizing infrastructure, existing applications, or even building new applications that open doors to entirely new customer segments or solutions.

A great example of this is our work with the Dubai Health Authority, who were on stage at Summit Connect Dubai. When we engaged with them, we took the time to deeply understand the challenges they were trying to address for the citizens and then brought not only our technical products but also our expertise in project management, implementation, training, and knowledge transfer. I’m very proud of our achievements over the years, and I believe that in doing so, we add significant value for our customers.

Ed: To add another perspective, the most compelling conversations I have with customers often begin with discussions that don’t initially center on technology. They start with, “I want to imagine a world where things are different—where you can help me achieve something extraordinary.” For instance, with Red Hat OpenShift AI, we collaborated with the US Department of Veterans Affairs to build a platform that effectively reduced self-harm and suicide rates. By harnessing a platform that could analyze how people called in for assistance—assessing tone and how they described their situations—we helped the teams prioritize who needed immediate care versus who could wait a while for some support.

It’s when someone presents you with such a profound challenge that you really see the immense opportunity we have as an organization. These technology platforms do more than enable business; they help vulnerable people receive the care they need and, ultimately, save lives.
 
You mentioned that for Red Hat it’s relatively easy to work on new technologies because of the robust support provided by partners and customers alike. Can you elaborate on just how important those relationships are for your team?

Adrian: The point is that, while we are proud of the solutions we deliver through Red Hat, many integrated solutions require components from multiple software vendors. Our partners and integrators are essential because they bring together the various components needed to deliver, implement, and support these complex solutions. In many regions, especially where we serve multiple countries, these partners offer additional scale and reach, often accessing markets where Red Hat might not have a direct footprint. This collaboration is a critical part of why we work so closely with our partners.

Ed: Another significant benefit of having partners is that it allows Red Hat to concentrate on what we do best. We aren’t trying to solve every aspect of the IT enterprise supply chain. Instead, we work with best-of-breed partners who focus on their own areas of expertise. This means that Adrian’s teams and others in our region can focus on delivering core value to our customers. As we saw on stage, one of the Middle East’s largest banking groups was well ahead of the curve in its approach to virtualisation and modernisation. These partners enable us to help customers execute at scale and with credibility. My background in operational IT tells me that although the journey is rarely smooth, having a trusted team and partners makes all the difference.

In today’s enterprise technology landscape, where hybrid and multi-cloud environments are the norm, how is Red Hat helping customers unlock the potential of open source technologies?
Ed: For me, the hybrid and multi-cloud narrative is essentially about providing customers with standardization. Some customers might say that they’re on a path toward data center consolidation, or are committed to a single hypervisor, or even a multi-cloud strategy. But once they embrace a hybrid approach, the underlying message is that they require a globally consistent management and operational platform—one that spans multiple cloud providers, private data centers, or even edge environments.

How do we achieve this consistency in an open source manner? When you’re a proprietary company, control is tight. With our strategy, we offer customers open choice—where to run their platform and which workloads to deploy on top of it. In essence, our approach empowers customers by eliminating the risks of siloed, locked-in solutions. This freedom enables businesses to continuously ask, “What should I run, and where and how should I run it?” They consider the portfolio of applications, evaluate whether low-latency edge deployment is needed—as is common for a supermarket loyalty system—or whether a core data center or public cloud deployment makes sense. The operational “how” is addressed by determining whether to run on a container platform, a virtual machine platform, or an alternative setup. Finally, the “why” ties back to ensuring the overall solution aligns with the customer’s cost and business objectives.

Ultimately, our focus is on answering one simple question for the customer: “What should I run, and where, how, and why should I run it?” This encapsulates our commitment to providing both choice and clarity in today’s complex IT environment.


Adrian: I find it quite interesting how that perspective plays out regionally. While we enable customers to run applications on our platforms, major players like Google are also part of the ecosystem. Particularly in Europe, where there is current uncertainty, many governments are questioning whether their sovereign data should reside on a cloud service originating from the U.S. Without diving too deeply into politics, this debate is prompting customers to consider alternative cloud options. For example, when running OpenShift on-premises or on a cloud provided by a specific country, it becomes easier to migrate to a new provider if necessary. This is an evolving discussion, especially in Europe, and it’s something that might expand beyond political cycles in the future.

Ed: Exactly. In Europe, the focus remains on providing choice. With open source technology, we sidestep many political concerns because of the transparency it offers. Customers can inspect the code to see that there are no hidden backdoors or data issues. Consequently, building a sovereign solution using Red Hat technology has gained significant traction. Both governments and organizations are increasingly interested in retaining full control over their data.
 
It seems that customers also desire a degree of freedom with their platforms; they want to ensure that no external party completely controls their systems. How does Red Hat provide this assurance of complete control?

Ed: Customers can deploy our platform in one of two ways. The first is to run it in their own data center, on-premises. In this case, they obtain full access—they have the code, the platform, and all the necessary certificates, and they manage it themselves. In contrast, if they decide to run the platform on one of the hyperscalers, while the underlying compute infrastructure is provided by the hyperscaler, the platform—the layer where the data sits and the applications operate—remains in the open source domain. Therefore, even in these cases, customers retain the ability to influence, control, and understand what happens with their data and applications. And when it comes down to it, every country and organization will make its own decisions, but our consistent message remains: our focus is on choice. Whether a company decides to run its workloads privately, on the public cloud, or at the edge, we ensure that they have the consistent tools and platforms to do so efficiently.

Adrian: That’s exactly right. We have long maintained a commitment to enabling customers to choose the open hybrid cloud. Whether a customer opts for a sovereign cloud, a hyperscaler, or their own private cloud, our core mission is to grant them the freedom to choose and to operate in a simple, consistent, and controlled manner.


Where do you see the enterprise technology landscape heading in the next three to five years?
Ed: I believe that over the next three to five years, we will witness an increasingly consolidated effort to eliminate complexity within IT organizations. Over the last decade, IT has excelled in building silos—if anything, it’s been very effective at doing so. However, with the advent of AI, these separate silos of infrastructure and data are becoming even more problematic. When your data resides in multiple unconnected silos, it becomes extremely challenging to aggregate and leverage it for AI-based insights.

In a recent discussion with a financial services industry leader, the focus was increasingly on ensuring access to all their data, democratizing it internally, and enabling AI-driven querying. This represents a paradigm shift, as data today typically lives within isolated applications. In an AI-integrated world, breaking down these silos is critical. I foresee that one of the most significant developments in the near future—driven by AI—will be the democratization of data access across organizations.

Adrian: I concur. From a regional perspective, we might be a couple of years behind more developed markets like Europe or the U.S. For instance, we are still in the earlier stages of transitioning to the cloud. In the UAE and other regions, sovereign cloud providers are just beginning to expand their offerings. Financial institutions, aviation companies, and others are now starting to embrace the cloud more aggressively than they have in the past four or five years.

Ed: Another nuance here involves what we’re exploring with Granite and small language models. Often, to help Adrian’s customers manage support tickets, you don’t need a language model that knows Shakespeare by heart. Large language models typically contain vast amounts of data, much of which isn’t directly relevant to a given enterprise. Our focus has thus shifted to a choice: do we help organizations harness AI by asking questions of data they couldn’t access before, or do we tailor solutions with smaller language models designed to address specific enterprise challenges?

One notable example was how we applied a tailored small language model within Red Hat to support our own teams in resolving support tickets. This initiative not only saved millions of dollars but also significantly enhanced customer experience and sped up response times. Over time, while large language models have captured much of the buzz, I suspect we will see rapid adoption of small, specialized language models tailored for specific functions.


Copyright © 2023 | The Integrator