
Tech Interviews

How AI Can Turn Facilities Management from Reactive to Proactive


Exclusive Interview with Sangeetha B, CEO of Amantra Facilities Management

  1. AI and predictive maintenance are gaining traction in facilities management. Why should facilities managers consider transitioning from traditional maintenance models to AI-driven systems?

AI’s impact can be seen most clearly in its ability to preemptively address issues before they escalate. Instead of waiting for things to break, AI systems predict when equipment is likely to fail by analyzing real-time data from across the facility. This typically leads to significant improvements in operations, reductions in emergency repairs, and minimal downtime. It shifts the focus from managing isolated assets to improving the performance of the entire facility, helping managers move from an asset-centric approach to managing an interconnected ecosystem.
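
As a rough illustration of the kind of analysis involved (a minimal sketch, not a description of Amantra's or any vendor's actual system), the example below trains an anomaly detector on historical sensor readings and flags live readings that drift away from normal operating behavior; the feature names, values, and thresholds are assumptions chosen for the example.

```python
# Minimal sketch: flag equipment drifting away from its normal operating envelope.
# Feature names, values, and thresholds are illustrative assumptions, not a real deployment.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Historical "healthy" readings: temperature (C), vibration (mm/s), power draw (kW)
healthy = np.column_stack([
    rng.normal(45, 2, 5000),     # temperature
    rng.normal(1.2, 0.1, 5000),  # vibration
    rng.normal(7.5, 0.5, 5000),  # power draw
])

model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

# Simulated live readings: the last one shows rising temperature and vibration.
live = np.array([
    [45.5, 1.25, 7.6],
    [46.0, 1.30, 7.7],
    [52.0, 2.10, 9.1],
])

scores = model.decision_function(live)   # lower score = more anomalous
flags = model.predict(live)              # -1 = anomaly, 1 = normal

for reading, score, flag in zip(live, scores, flags):
    status = "INSPECT" if flag == -1 else "ok"
    print(f"temp={reading[0]:.1f}C vib={reading[1]:.2f}mm/s score={score:+.3f} -> {status}")
```

In a real deployment the same logic would run continuously against streamed telemetry and raise work orders, rather than printing to a console.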

  2. Where is the return on investment (ROI) most visible when adopting AI-driven predictive maintenance? Is the impact immediate or more long-term?

Energy savings and reduced maintenance costs become evident almost immediately, particularly when AI uncovers inefficiencies that were previously unnoticed. However, the true return on investment is more apparent over time. The long-term impact on the bottom line will be significant—lower repair costs, optimized energy use, and extended equipment lifespan. It’s a comprehensive savings solution at the facility level.

  3. Does data quality come into the picture with AI-driven maintenance?

AI thrives on real-time data, and it’s not just any data—it needs to be high quality. This includes sensor data monitoring temperature, humidity, vibrations, air quality, and energy usage. However, in older buildings, outdated or inconsistent sensors can compromise data quality. To ensure accurate predictions, facilities must standardize their sensor systems and implement robust data governance practices, including regular calibration and sensor maintenance. Reliable, well-governed data is what allows AI to deliver accurate monitoring, analysis, and predictions.
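
To make the governance point concrete, here is a minimal sketch of the kind of quality gate a facility might apply to readings before they reach a predictive model; the plausible ranges and staleness limit are assumed values, and out-of-range or stale readings would typically trigger recalibration or sensor maintenance.

```python
# Minimal sketch: basic quality gates for sensor data before it reaches a model.
# The acceptable ranges and staleness limit below are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Reading:
    sensor_id: str
    metric: str          # e.g. "temperature", "humidity", "vibration"
    value: float
    timestamp: datetime

PLAUSIBLE_RANGES = {
    "temperature": (-20.0, 80.0),   # Celsius
    "humidity": (0.0, 100.0),       # percent
    "vibration": (0.0, 50.0),       # mm/s
}
MAX_AGE = timedelta(minutes=10)

def validate(reading: Reading, now: datetime) -> list[str]:
    """Return a list of data-quality issues; an empty list means the reading passes."""
    issues = []
    low, high = PLAUSIBLE_RANGES.get(reading.metric, (float("-inf"), float("inf")))
    if not (low <= reading.value <= high):
        issues.append(f"{reading.metric} out of plausible range [{low}, {high}]")
    if now - reading.timestamp > MAX_AGE:
        issues.append("reading is stale; sensor may be offline or need recalibration")
    return issues

now = datetime.now(timezone.utc)
sample = Reading("AHU-3-temp", "temperature", 132.0, now - timedelta(minutes=25))
print(validate(sample, now))  # both checks fail for this simulated reading
```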

  4. Despite the advantages, what barriers are preventing more widespread adoption of AI in facilities management?

Many decision-makers don’t fully grasp how it can seamlessly integrate into their existing operations or, more importantly, how it can be customized to suit their needs. This is the primary roadblock at the moment. Switching to AI can feel like stepping into uncharted territory, and it’s a big mental shift. Then there’s the cost factor: installing AI systems isn’t cheap, which can make it feel like a hefty commitment up front. Still, we are barely past the early-adoption phase, and facilities that act early to integrate AI will have a significant advantage in the long run. AI isn’t just another tool; it’s a long-term strategy that will holistically transform how facilities are run.

  5. How does AI contribute to sustainability efforts in facilities management?

I think it will be a game-changer. AI plays a critical role in enhancing sustainability in facilities management. The immediate effect is reduced waste and lower carbon footprints. Facilities are huge energy consumers, and by continuously monitoring and adjusting systems like HVAC, lighting, and water usage, AI ensures that energy isn’t being wasted. This leads to more efficient resource use and directly supports sustainability goals. As sustainability regulations tighten, AI will play an even bigger role in helping facilities meet their environmental targets while staying compliant with new standards.

  6. Is AI-driven predictive maintenance a viable solution for smaller facilities with limited budgets?

AI is becoming increasingly accessible. Facilities don’t need to implement it across the board from the start. Instead, they can focus on key areas like HVAC or lighting systems, where immediate benefits can be realized. Smaller facilities can tap into AI’s potential without straining their budgets, and the long-term cost savings generated will help fund further investments as they scale AI adoption.

  7. How do you see AI evolving in facilities management over the next 5 to 10 years?

AI’s role in facilities management will expand significantly in the coming years. Currently, its primary applications are in predictive maintenance and operational optimization. In the future, AI will touch every aspect of facilities management – from security to energy grids to tenant comfort. We are moving towards autonomous facilities where AI not only predicts failures but actively manages entire systems to optimize performance in real-time. The full potential of AI in facilities management is still being realized, and it will continue to drive significant advancements in operational efficiency and sustainability.

Tech Interviews

Why TWO99 is Rethinking Cloud Marketing with Compliance, Data, and Agility



How does TWO99 ensure its cloud security solutions remain compliant with evolving international data privacy laws like GDPR and HIPAA?

Two99’s cloud-native security solutions—including CNAPP, CWPP, and CSPM—are engineered to align with evolving international data privacy regulations such as GDPR and HIPAA through a proactive, multi-layered compliance framework. Our platforms integrate automated policy enforcement and real-time security posture monitoring to enable rapid detection and remediation of compliance deviations.

As an ISO 27001:2022 certified organization, we maintain a robust Information Security Management System (ISMS) that embeds data protection, risk assessments, and regular audits into our core operations. In parallel, our ISO 9001:2015 certification underscores our commitment to rigorous quality management processes, allowing agile adaptation to global regulatory shifts.

Our compliance team continuously monitors international privacy standards, translating insights into operational controls and product enhancements. We enforce stringent data storage protocols, including encryption of data at rest and in transit using advanced cryptographic methods, along with secure key management and policy-driven data retention and deletion.
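
The at-rest encryption principle can be shown with a short sketch using the Python cryptography library; this illustrates the general technique rather than Two99's implementation, and a production system would fetch keys from a managed KMS or HSM instead of generating them inline.

```python
# Minimal sketch: encrypting a record at rest with an application-level key.
# In production the key would come from a managed key service (KMS/HSM),
# not be generated inline; this illustrates the principle only.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice: fetched from a key management service
cipher = Fernet(key)

record = b'{"tenant": "acme", "email": "user@example.com"}'
token = cipher.encrypt(record)     # ciphertext is what gets persisted at rest
print(token[:40], b"...")

restored = cipher.decrypt(token)   # decryption requires the same managed key
assert restored == record
```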

Regular internal evaluations and third-party audits further validate our security posture, ensuring that Two99’s offerings not only meet but consistently exceed global data protection standards.

Given your background with WPP and GroupM, how has your approach to digital transformation changed since founding TWO99?

My experience with WPP and GroupM provided invaluable insights into how large-scale organizations operate—especially in terms of process, structure, and scalability. However, founding TWO99 marked a deliberate shift toward a more agile, innovation-driven approach to digital transformation. At TWO99, we focus on vertical-agnostic scalability, bringing together technology, creativity, and performance under a unified, adaptable framework.

Unlike traditional holding companies that often operate within rigid silos, our model emphasizes speed, flexibility, and integration. We’ve built an ecosystem that allows us to pivot quickly, test rapidly, and deploy solutions that are customized to the dynamic needs of each client. This is especially critical in emerging markets like India, where consumer behaviors and platform trends evolve at breakneck speed.

Our approach moves away from isolated service offerings and instead delivers end-to-end growth strategies—from brand storytelling to performance marketing—under one roof. This integrated engine not only accelerates ROI but also empowers clients to scale more efficiently across diverse industries and geographies.

Ultimately, digital transformation at TWO99 is not about adopting new tools; it’s about building a mindset of experimentation, collaboration, and continual evolution—something that’s only possible when tech, creative, and media are not just coexisting, but co-creating.

You speak a lot about growth marketing—what’s one underused strategy or tool you believe more startups should adopt?

One of the most underutilized yet high-impact strategies in growth marketing today is predictive audience modeling—specifically using first-party data to anticipate user behavior before a customer even shows active intent. In the rush to acquire users, many startups focus heavily on performance spend and surface-level targeting, often missing the opportunity to build smarter, more efficient pipelines through data-driven foresight.

By leveraging tools like AI-powered lookalike modeling or Google’s AutoML, companies can identify emerging patterns and preemptively segment high-intent audiences. These platforms analyze behavioral signals—ranging from product interactions and website heatmaps to backend signals like GitHub commits or CRM workflows—to spot trends that traditional analytics would miss.
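
As a simplified sketch of what predictive audience modeling can look like in practice (the behavioral features, synthetic data, and model choice are assumptions for illustration, not a description of any particular stack), a propensity model scores users on first-party signals so the highest-intent segment can be engaged first.

```python
# Minimal sketch: score users on first-party behavioral signals to surface
# a high-intent segment before they show explicit purchase intent.
# Feature names and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000

# Behavioral features: sessions (30d), docs pages viewed, trial events, CRM touches
X = np.column_stack([
    rng.poisson(3, n),
    rng.poisson(5, n),
    rng.poisson(1, n),
    rng.poisson(2, n),
])
# Synthetic label: "converted within 30 days", correlated with engagement
logits = 0.4 * X[:, 0] + 0.3 * X[:, 2] + 0.2 * X[:, 3] - 3.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]
high_intent = np.argsort(scores)[::-1][:100]   # top 100 users to engage first
print("conversion rate in top-scored segment:", y_test[high_intent].mean())
print("overall conversion rate:", y_test.mean())
```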

At TWO99, we’ve seen transformative results with this approach. For instance, by layering multiple intent signals (e.g., developer activity, trial-to-paid movement, sales pipeline stages) and combining them with Dynamic Creative Optimization (DCO), we helped a SaaS client reduce their Customer Acquisition Cost (CAC) by over 30%. This wasn’t just about targeting more people—it was about targeting the right people, at the right time, with the right message.

Startups that embrace predictive modeling early in their growth journey can shift from reactive marketing to proactive growth engineering, ultimately driving better ROI, faster time to conversion, and more sustainable customer relationships.

At TWO99, how do you balance creative innovation with data-driven performance when leading campaigns for tech and cloud-based clients?

At TWO99, we treat data as the creative brief—a philosophy that helps us seamlessly bridge creative storytelling with performance marketing, especially for tech and cloud-based clients. Rather than starting with assumptions or generic messaging, we begin with behavioral analytics and first-party data to uncover real pain points, usage patterns, and moments of friction within the user journey.

This insight-driven approach allows us to craft narratives that aren’t just imaginative, but deeply relevant and conversion-focused. For example, if product analytics show a drop-off at the integration stage, our creative strategy might revolve around simplifying technical complexity or highlighting seamless onboarding. In this way, the campaign’s message is directly informed by what users are experiencing, not just what the brand wants to say.

We also continuously A/B test creative iterations—from copy to visual formats—to fine-tune performance in real time. For tech and cloud clients, where the buyer journey is often complex and multi-touch, this balance of data and creativity ensures that each piece of content not only captures attention but drives measurable outcomes like engagement, sign-ups, or qualified leads.
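
On the A/B testing point, a quick significance check is usually what separates a genuine lift from noise; the sketch below, with made-up click counts, runs a standard two-proportion z-test between two creative variants.

```python
# Minimal sketch: two-proportion z-test for comparing two creative variants.
# The conversion counts below are made-up numbers for illustration only.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=5000, conv_b=155, n_b=5000)
print(f"z={z:.2f}, p={p:.4f}")   # p < 0.05 suggests variant B's lift is not just noise
```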

In short, we don’t see data and creativity as separate tracks. At TWO99, one fuels the other—creating high-performance campaigns that are not only intelligent but emotionally resonant.


Cover Story

From Insight to Impact: Qlik’s Vision for the Future of Data and AI



As data becomes the lifeblood of modern organizations, Qlik’s agnostic platform, spanning data integration, quality, analytics, and AI/ML, enables enterprises to make smarter, faster decisions no matter where their data resides. We spoke with James Fisher, Chief Strategy Officer at Qlik, about how the company is helping businesses unlock AI’s full potential globally, while deepening its commitment to the UAE with the launch of Qlik Cloud on AWS, bringing speed, sovereignty, and strategic growth to the region.

You’ve been in the industry for over 30 years. What is it about this moment in data and AI that really excites you, especially here in the UAE?

I’ve been in the industry for over 30 years, started out at PwC, spent nearly a decade at SAP, and then joined Qlik. Honestly, I’ve never seen a more exciting time for data, analytics, and AI. The pace of innovation is incredible, and what really stands out is how committed governments and organizations—especially in the UAE—are to tapping into AI’s full potential. From big public initiatives to a thriving startup scene, there’s real momentum, and Qlik is right at the center of it all.

You wear many hats at Qlik—what does a typical day look like for you?

My role is about making sure our customers get real value from their data—using analytics and AI to drive meaningful outcomes. As Chief Strategy Officer, I help shape Qlik’s vision, build the right partnerships, and steer our growth strategy across areas like products, M&A, and even broader priorities like sustainability and diversity. I also spend a lot of time keeping up with market trends. It’s a busy role, but what I enjoy most is how cross-functional it is—it lets me connect innovation with real business needs.

Qlik’s mission is to help customers ‘do data differently.’ Can you unpack what that means in practice—and how your strategy role helps turn that vision into impact?

Our mission is to help customers get the most out of their data to tackle big, real-world challenges. When we say, ‘do data differently,’ we mean rethinking how data—structured or unstructured—is accessed and used, wherever it lives. On the strategy side, I work closely with our product teams to stay ahead of market trends, and with our services and customer success teams to ensure we’re truly partnering with clients. And that collaboration mindset is something I truly value—it makes Qlik feel like a trusted advisor, not just a vendor.

How do you ensure that your global capabilities align with the specific needs of markets like the UAE?

For Qlik, adaptability is key. We really focus on listening—whether it’s to our customers, partners, or the broader market. In the UAE, we’ve been active for years and have a solid customer base, which gives us valuable insights into regional needs. We also set up a global AI Council, including a member based here, who works closely with local companies and government entities. This helps us adapt our global strengths to deliver real, local value. It’s not just about the product—it’s about how we approach the market, offer training, and engage with the community to reflect local nuances.

With major shifts happening in analytics and data integration, particularly the surge in AI investments in regions like the UAE, how has Qlik evolved its technology to stay ahead of the curve?

Qlik has been in this space for over 30 years. From the beginning, it was about helping customers turn questions into insights by pairing human curiosity with machine intelligence. That is essentially an AI problem—and it has been core to our platform since day one. Over time, we’ve evolved our technology to be cloud-native on AWS, expanded our data integration capabilities, and made it easier for everyone in an organization, not just analysts, to access insights and take action. In markets like the UAE, where AI investment is booming, we’re in a great position to support that momentum. A key part of our strategy has been democratizing data access so that teams across all departments—sales, marketing, HR, finance, and more—can leverage it.

How does Qlik ensure it supports each organization’s unique data and AI journey without disrupting what’s already in place?

Our approach is simple—we’re here to enable, not disrupt. Every organization is on a different path with different needs, whether that’s industry-specific, company size, or their own pace of digital adoption. That’s why flexibility and openness are a big part of how we operate. We don’t ask customers to start over—we work with what they already have and help them get more out of it. A great example is in highly regulated sectors, where security and hybrid environments are a must. We meet them where they are, and build from there. That’s how we earn trust and become long-term partners.

Organizations are juggling data across cloud, on-prem, and even the edge—how does Qlik stay flexible enough to handle all that?

Data today is vast and varied. For years, most analytics solutions focused only on structured data, yet about 80% of an organization’s data is unstructured. At Qlik, we do both. Our platform allows us to reach data wherever it is—in cloud apps, in on-prem systems, or even at the edge. That openness allows us to solve real-world problems, not just run analytics in silos. And we are not limited by any single cloud provider. Customers can choose what works best for them—AWS, Azure, Google Cloud—and we will be right there with them.

As someone steering strategy at a data and AI company, how do you personally approach using data to guide big decisions—and what advice would you give to leaders trying to cut through the AI hype?

As Chief Strategy Officer, I see data as the lifeblood of any organization; it’s behind every smart decision we make. At Qlik, we use our own analytics and AI technology to shape our strategies. We build a team with the capability to understand what problems we want to address, which use cases we want to drive, and what data we need to get there—and then we make sure we can actually access that data. My biggest piece of advice? Don’t get swept up in the AI hype. Start with a real business problem. Then ask: what data will help me solve this, and how do I get it? Strategy isn’t something you set and forget. The best strategy leaders are the ones who stay humble, curious, and ready to pivot.

What recent launch or initiative is Qlik most excited about in the UAE?

We are announcing the availability of Qlik Cloud on AWS in the UAE. This is a big milestone. It enables innovation and AI-driven value while ensuring robust data sovereignty—customer data stays within the country’s borders. It also improves performance and reduces latency for applications running locally. This is a major step in helping our clients generate real-time insights and act on them with confidence.

What long-term plans does Qlik have for the region?

We are here for the long run. We are often asked, “Where will Qlik be in five years?” The answer is simple: we will be right here, growing with our customers. We are expanding our partner network, building up our regional team, and investing in training and enablement. We have only just scratched the surface of what AI can offer, and we are committed to helping customers realize that potential through long-term partnerships. As the region continues its digital transformation, we want to be a steady, trusted enabler—helping organizations turn complexity into clarity.


Tech Interviews

Red Hat Summit Connect 2025: A Discussion with Ed Hoppitt and Adrian Pickering



Exclusive Interview with Ed Hoppitt, EMEA Director – Value Strategy, App and Cloud Platforms at Red Hat & Adrian Pickering, Regional General Manager, MENA & Enterprise Segment Lead for CEMEA at Red Hat  

Having worked in global telecom and advised some of the world’s largest enterprises, how have these experiences shaped your approach to developing IT solutions?
Ed: I believe that designing and running operational IT over many years gives one a deep understanding of what truly matters to a customer, especially those partnering with Red Hat. This background enables me to connect with our customers on a level where they feel understood regarding their pain points. Today’s biggest challenge for enterprise IT is building systems that are predictable, replicable, and standardized—yet able to scale effectively.

When you look at what Red Hat offers and how we help enterprises build these solutions, our focus is rooted in leveraging the open source community. We invest in projects that we know will create tremendous value for our enterprise customers, taking those projects upstream, incorporating them into the Red Hat portfolio, and industrializing them into platforms such as OpenShift. Customers choose platforms like OpenShift because they represent best-of-breed choices, delivering stability, reliability, predictability, and scalability. With my operational IT background, I appreciate just how crucial these outcomes are for every customer I speak with.

This year’s Red Hat Summit focuses on curiosity and turning acquired knowledge into practical application. Through this, what key message are you hoping to leave with the audience this year?
Adrian: I view curiosity as the foundation for working with our customers to truly understand their vision—where they want to be 18, 24, 30, or even 36 months down the line. It’s about gaining a clear grasp of the business challenges they face or the new markets they wish to serve in the future. We then align our best capabilities to support them along that journey, keeping cost efficiency in mind. This might involve modernizing infrastructure, existing applications, or even building new applications that open doors to entirely new customer segments or solutions.

A great example of this is our work with the Dubai Health Authority, who were on stage at Summit Connect Dubai. When we engaged with them, we took the time to deeply understand the challenges they were trying to address for the citizens and then brought not only our technical products but also our expertise in project management, implementation, training, and knowledge transfer. I’m very proud of our achievements over the years, and I believe that in doing so, we add significant value for our customers.

Ed: To add another perspective, the most compelling conversations I have with customers often begin with discussions that don’t initially center on technology. They start with, “I want to imagine a world where things are different—where you can help me achieve something extraordinary.” For instance, with Red Hat OpenShift AI, we collaborated with the US Department of Veterans Affairs to build a platform that effectively reduced self-harm and suicide rates. By harnessing a platform that could analyze how people called in for assistance—assessing tone and how they described their situations—we helped the teams prioritize who needed immediate care versus who could wait a while for some support.

It’s when someone presents you with such a profound challenge that you really see the immense opportunity we have as an organization. These technology platforms do more than enable business; they help vulnerable people receive the care they need and, ultimately, save lives.
 
You mentioned that for Red Hat it’s relatively easy to work on new technologies because of the robust support provided by partners and customers alike. Can you elaborate on just how important those relationships are for your team?

Adrian: The point is that, while we are proud of the solutions we deliver through Red Hat, many integrated solutions require components from multiple software vendors. Our partners and integrators are essential because they bring together the various components needed to deliver, implement, and support these complex solutions. In many regions, especially where we serve multiple countries, these partners offer additional scale and reach, often accessing markets where Red Hat might not have a direct footprint. This collaboration is a critical part of why we work so closely with our partners.

Ed: Another significant benefit of having partners is that it allows Red Hat to concentrate on what we do best. We aren’t trying to solve every aspect of the IT enterprise supply chain. Instead, we work with best-of-breed partners who focus on their own areas of expertise. This means that Adrian’s teams and others in our region can focus on delivering core value to our customers. As we saw on stage, one of the Middle East’s largest banking groups was well ahead of the curve in its approach to virtualisation and modernisation. These partners enable us to help customers execute at scale and with credibility. My background in operational IT tells me that although the journey is rarely smooth, having a trusted team and partners makes all the difference.

In today’s enterprise technology landscape, where hybrid and multi-cloud environments are the norm, how is Red Hat helping customers unlock the potential of open source technologies?
Ed: For me, the hybrid and multi-cloud narrative is essentially about providing customers with standardization. Some customers might say that they’re on a path toward data center consolidation, or are committed to a single hypervisor, or even a multi-cloud strategy. But once they embrace a hybrid approach, the underlying message is that they require a globally consistent management and operational platform—one that spans multiple cloud providers, private data centers, or even edge environments.

How do we achieve this consistency in an open source manner? When you’re a proprietary company, control is tight. With our strategy, we offer customers open choice—where to run their platform and which workloads to deploy on top of it. In essence, our approach empowers customers by eliminating the risks of siloed, locked-in solutions. This freedom enables businesses to continuously ask, “What should I run, and where and how should I run it?” They consider the portfolio of applications, evaluate whether low-latency edge deployment is needed—as is common for a supermarket loyalty system—or whether a core data center or public cloud deployment makes sense. The operational “how” is addressed by determining whether to run on a container platform, a virtual machine platform, or an alternative setup. Finally, the “why” ties back to ensuring the overall solution aligns with the customer’s cost and business objectives.

Ultimately, our focus is on answering one simple question for the customer: “What should I run, and where, how, and why should I run it?” This encapsulates our commitment to providing both choice and clarity in today’s complex IT environment.


Adrian: I find it quite interesting how that perspective plays out regionally. While we enable customers to run applications on our platforms, major players like Google are also part of the ecosystem. Particularly in Europe, where there is current uncertainty, many governments are questioning whether their sovereign data should reside on a cloud service originating from the U.S. Without diving too deeply into politics, this debate is prompting customers to consider alternative cloud options. For example, when running OpenShift on-premises or on a cloud provided by a specific country, it becomes easier to migrate to a new provider if necessary. This is an evolving discussion, especially in Europe, and it’s something that might expand beyond political cycles in the future.

Ed: Exactly. In Europe, the focus remains on providing choice. With open source technology, we sidestep many political concerns because of the transparency it offers. Customers can inspect the code to see that there are no hidden backdoors or data issues. Consequently, building a sovereign solution using Red Hat technology has gained significant traction. Both governments and organizations are increasingly interested in retaining full control over their data.
 
It seems that customers also desire a degree of freedom with their platforms; they want to ensure that no external party completely controls their systems. How does Red Hat provide this assurance of complete control?

Ed: Customers can deploy our platform in one of two ways. If they run it in their own data center, on-premises, they obtain full access—they have the code, the platform, all the necessary certificates, and they manage it themselves. In contrast, if they decide to run the platform on one of the hyperscalers, while the underlying compute infrastructure is provided by the hyperscaler, the platform—the layer where the data sits and the applications operate—remains in the open source domain. Therefore, even in these cases, customers retain the ability to influence, control, and understand what happens with their data and applications. And when it comes down to it, every country and organization will make its own decisions, but our consistent message remains: our focus is on choice. Whether a company decides to run its workloads privately, on the public cloud, or at the edge, we ensure that they have the consistent tools and platforms to do so efficiently.

Adrian: That’s exactly right. We have long maintained a commitment to enabling customers to choose the open hybrid cloud. Whether a customer opts for a sovereign cloud, a hyperscaler, or their own private cloud, our core mission is to grant them the freedom to choose and to operate in a simple, consistent, and controlled manner.


Where do you see the enterprise technology landscape heading in the next three to five years?
Ed: I believe that over the next three to five years, we will witness an increasingly consolidated effort to eliminate complexity within IT organizations. Over the last decade, IT has excelled in building silos—if anything, it’s been very effective at doing so. However, with the advent of AI, these separate silos of infrastructure and data are becoming even more problematic. When your data resides in multiple unconnected silos, it becomes extremely challenging to aggregate and leverage it for AI-based insights.

In a recent discussion with a financial services industry leader, the focus was increasingly on ensuring access to all their data, democratizing it internally, and enabling AI-driven querying. This represents a paradigm shift, as data today typically lives within isolated applications. In an AI-integrated world, breaking down these silos is critical. I foresee that one of the most significant developments in the near future—driven by AI—will be the democratization of data access across organizations.

Adrian: I concur. From a regional perspective, we might be a couple of years behind more developed markets like Europe or the U.S. For instance, we are still in the earlier stages of transitioning to the cloud. In the UAE and other regions, sovereign cloud providers are just beginning to expand their offerings. Financial institutions, aviation companies, and others are now starting to embrace the cloud more aggressively than they have in the past four or five years.

Ed: Another nuance here involves what we’re exploring with Granite and small language models. Often, to help Adrian’s customers manage support tickets, you don’t need a language model that knows Shakespeare by heart. Large language models typically contain vast amounts of data, much of which isn’t directly relevant to a given enterprise. Our focus has thus shifted to a choice: do we help organizations harness AI by asking questions of data they couldn’t access before, or do we tailor solutions with smaller language models designed to address specific enterprise challenges?

One notable example was how we applied a tailored small language model within Red Hat to support our own teams in resolving support tickets. This initiative not only saved millions of dollars but also significantly enhanced customer experience and sped up response times. Over time, while large language models have captured much of the buzz, I suspect we will see rapid adoption of small, specialized language models tailored for specific functions.

