Tech Interviews
MBUZZ Unveils Somod Brand and AI Breakthroughs at GITEX 2024
Exclusive Interview with Mr. Sabir Saleem, CEO, MBUZZ
Tell us about the brand. How do you see the ongoing plans for it?
We deal with many components, such as GeForce graphics cards, high-performance SSDs, DRAM, and various workstations from Supermicro and ASUS. We noticed a gap in the market for customized systems for AI, content creation, gaming, and similar workloads, so we decided to launch a brand focused on these domains. After a year of internal discussions, we did a soft launch of the Somod brand, which means resilience in Arabic. We’ve seen significant traction for the brand at GITEX.
What distinguishes MBUZZ in the current AI compute ecosystem? What makes you different from other players in the market?
If you look around GITEX or any global event, everything is about AI. Everyone is talking about AI. MBUZZ’s positioning is within the AI compute ecosystem. Our expertise is in building the full AI compute stack, covering compute, storage, networking, and the orchestration layers on top. We are an NVIDIA Elite Partner in the region, with competencies across NVIDIA domains such as DGX, cloud, visualization, and networking. We work with enterprise customers, individual consumers, and large cloud providers, from single workstations to clusters of thousands of GPUs. Our team is trained across all of these domains, from solution architects and pre-sales to post-sales and support.
Can you tell us about your history of partnership with PNY and NVIDIA, and how you see the future with them?
It’s been an interesting journey. We started with PNY in early 2019, followed by a partnership with NVIDIA, and we work across all of their domains. PNY offers consumer-grade products such as SSDs, DRAM, and GeForce graphics cards, and it also distributes NVIDIA’s DGX solutions and InfiniBand networking. We are likely the only customer covering their entire product line, from consumer to enterprise. Today, we announced the launch under our Somod brand of a consumer AI-ready PC powered by PNY components and the GeForce RTX 4060. This offering is unique in the region and will help students and content creators with small AI use cases.
What segments do you cater to? Is it only the consumer segment, or do you serve other segments as well?
No, we cater to all segments. The AI-ready PC powered by PNY, the Somod system, is for consumers. We also have systems designed for enterprise customers. These start with a single 5880 GPU and can scale up to seven GPUs in the same workstation, featuring a fully liquid-cooled design and up to 128 terabytes of storage. This is a workstation, not a server, and it is quiet enough to keep on your desk, so individual researchers and university students can use it without needing a data centre environment.
Bitdefender Shapes Modern Cloud Security Solutions
Exclusive Interview with Mr. Raphaël Peyret, Director of Product Management, Cloud Security, Bitdefender
What is Bitdefender’s main objective at GITEX this year?
GITEX is a great opportunity to meet not only our end customers but also our partners. We are a channel company and sell exclusively through partners, so it’s great to have partners from across the region and around the world come to us so that we can connect again in person. We are also presenting some of our newer products and offerings. For example, this year we launched a new cloud security offering, and on the risk-management side we recently announced a capability called PHASR. Those are the big things we hope to spread the word about.
As a cybersecurity company, how does Bitdefender keep itself ahead of the competition?
Every company in the space has its own angle on this, and for us, that angle is technology. We’re a technology company at heart: our founder, a university professor, founded Bitdefender 23 years ago, and 50% of our teams work on R&D. We are not a marketing company. We are a technology company, and that comes across in our products, which are technically very strong. We have people doing reverse engineering of malware and working with the FBI and INTERPOL, as well as building the B2B products you see here. Under the hood, there’s a ton of work on core technology.
Right now, a lot of companies are moving toward cloud security. What is Bitdefender doing in this market?
At Bitdefender, we recognize that while the cloud requires a tailored approach, it cannot be treated as entirely separate from other environments. Attackers won’t stop at the cloud’s edge, so isolated approaches only create exploitable silos. That’s why our cloud security platform and cloud-native application protection are integrated into our broader security framework. Our capabilities correlate data and responses across all environments—cloud, endpoints, email, mobile, or IoT. This unified approach ensures comprehensive threat prevention, detection, and response, following attackers wherever they move.
How are you utilizing AI and machine learning in enhancing your threat detection capabilities?
We often get asked about AI and machine learning, especially now, but for our technical teams, it’s nothing new—it’s like discussing computing itself. Bitdefender has been leveraging machine learning on endpoints for over a decade. Instead of just claiming “we have AI,” we deploy a vast range of models fine-tuned for different security needs, including machine learning, heuristics-based, and generative models. These aren’t just buzzwords; each model serves a specific security purpose, optimized through rigorous R&D. In 2017, a new ransomware was blocked by a 2014 model—demonstrating the effectiveness of our approach. Moreover, every device with our protection uses its own customized AI model, meaning we effectively deploy millions of unique models tailored to individual devices.
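To make that layered approach more concrete, here is a minimal, purely illustrative sketch (our own, not Bitdefender’s engine or code) of how a simple heuristic rule can sit alongside a per-device anomaly model trained on that device’s own baseline behaviour:

```python
# Illustrative sketch only: a heuristic rule combined with a per-device
# anomaly model, in the spirit of the "heuristics + machine learning"
# layering described above. Features and thresholds are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-process features: [files written/min, network conns/min,
# entropy of written data]. Benign baseline collected on this device.
baseline = np.random.default_rng(0).normal(
    loc=[5, 2, 0.4], scale=[2, 1, 0.1], size=(500, 3)
)
model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

def is_suspicious(sample: np.ndarray) -> bool:
    # Heuristic: mass file writes of high-entropy data is a classic ransomware tell.
    heuristic_hit = sample[0] > 100 and sample[2] > 0.9
    # ML: does this behaviour deviate from the device's own learned baseline?
    ml_hit = model.predict(sample.reshape(1, -1))[0] == -1
    return heuristic_hit or ml_hit

print(is_suspicious(np.array([150.0, 3.0, 0.95])))  # likely True
print(is_suspicious(np.array([4.0, 2.0, 0.4])))     # likely False
```

A real product would use far richer features and proprietary models; the point is simply that rules and learned models catch different things, which is why they are deployed together.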
Considering that hackers are aware of cybersecurity measures from companies like Bitdefender, are they constantly innovating new tactics to bypass these defenses?
Hackers driven by financial gain often target more vulnerable systems because they offer easier access with lower investment. If defenses are strong enough to make hacking unprofitable, attackers tend to move on. Despite this, hackers continue to innovate, particularly in malware and ransomware. For instance, they increasingly use harder-to-reverse-engineer languages like Rust to slow down security teams’ efforts to create decryptors. They also explore advanced cryptography, anticipating quantum computing’s potential to break traditional security methods. This constant evolution creates a cat-and-mouse dynamic; the goal remains to make attacks so costly that cybercriminals are deterred.
Do you also consider the perspective and strategies of hackers when developing your security measures?
Of course. In fact, the services we offer include penetration testing and red teaming, and red teaming is the one closest to the hacker mindset. Organizations will come and say, “Here are our crown jewels. Here’s the safe. Here are the rules of the game. You’re not allowed to beat up my CEO; everything else is on the table. Break in and tell me how you did it.” For that, we have white-hat hackers, the good hackers. They hack you so that you know how another hacker would do it and can block it. And it’s important not to separate yourself from the hacker mindset if you want to do cybersecurity properly, because otherwise you’re blind to where the risks actually are.
Vertiv Outlines Data Center Evolution and AI Infrastructure Strategy
Exclusive Interview with Peter Lambrecht, Vice President Sales, EMEA, Vertiv
What specific challenges do clients face in powering and cooling AI infrastructure, and how did you address these at GITEX this year?
At GITEX this year, the focus on artificial intelligence (AI) was unmistakable. Every booth showcased AI-driven solutions running on GPU-powered servers from leading companies like NVIDIA. However, for AI applications to function effectively, the right infrastructure is essential to power and cool these GPUs, ensuring smooth and efficient performance. For us, this highlights our expertise in AI infrastructure, designed to support these platforms optimally.
One of our key focus areas is liquid cooling, as traditional air cooling in data centers is no longer sufficient. With rack densities now reaching 50, 60, and even up to 132 kilowatts per rack, air cooling alone cannot handle the thermal load. Liquid cooling has become critical, efficiently drawing heat away and directing it elsewhere. This core technology, developed with our partner NVIDIA, supports the deployment of GPUs worldwide, and together we provide, market, and deliver these advanced solutions to our clients.
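To see why air alone struggles at those densities, here is a rough back-of-the-envelope estimate (our own illustration, not Vertiv’s figures) using the standard heat-transfer relation and assuming a 132-kilowatt rack with a 10 K coolant temperature rise:

```latex
% Mass flow needed to remove Q = 132 kW with a 10 K temperature rise (illustrative assumptions)
\dot{m} = \frac{Q}{c_p\,\Delta T}:\qquad
\text{water: } \frac{132\ \mathrm{kW}}{4.18\ \mathrm{kJ/(kg\,K)} \times 10\ \mathrm{K}} \approx 3.2\ \mathrm{kg/s}\ (\approx 3.2\ \mathrm{L/s}),\qquad
\text{air: } \frac{132\ \mathrm{kW}}{1.0\ \mathrm{kJ/(kg\,K)} \times 10\ \mathrm{K}} \approx 13\ \mathrm{kg/s}\ (\approx 11\ \mathrm{m^3/s})
```

Moving the same heat with air would take roughly four times the mass flow, on the order of eleven cubic metres of air per second for a single rack, which is why direct liquid cooling becomes the practical option at these densities.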
Cooling, however, is only one part of the equation. The shift to AI also requires a comprehensive approach to power management, as AI workloads significantly alter electricity load patterns. Our solutions are designed to meet these power demands, and we have showcased these capabilities at our booth. We’ve been actively engaging with partners and clients to address these challenges as they implement their AI solutions, ensuring that both cooling and power needs are effectively met.
Can you elaborate on your partnership with NVIDIA and how it has evolved over the years?
Our partnership with NVIDIA has grown significantly over the years, and over the past year it has reached an unprecedented level of collaboration. Both of our CEOs, Jensen Huang of NVIDIA and Giordano Albertazzi of Vertiv, communicate regularly, aligning closely on joint development initiatives. Together, we create environments optimized for NVIDIA’s cutting-edge chips, ensuring they have the necessary infrastructure to operate at peak performance.
In a recent joint announcement, both CEOs unveiled new solutions that integrate NVIDIA GPUs with Vertiv’s advanced infrastructure, designed to maximize efficiency and reliability. This collaboration represents the core strength of our partnership, where we have refined reference designs that allow NVIDIA to deploy their GPUs seamlessly and effectively on a global scale.
What current trends do you see in the data center and critical infrastructure market? With many hyperscalers entering the market, what is your perspective on this development?
The data center market is dominated by hyperscalers, whether through their direct deployments or through colocation facilities, the two primary models for quickly scaling data center capacity. These hyperscalers are making substantial investments in infrastructure, fueling competition in this fast-evolving landscape.
The AI race is fully underway, with industry giants all striving toward the same goal. As their infrastructure partner, we are advancing with them, providing the essential support they need to drive this innovation. At the same time, the growth of the cloud sector remains foundational and continues to expand robustly. What we are seeing now is a dual growth trajectory: traditional cloud business growth compounded by the accelerated demand for AI infrastructure.
Trends in data centers reveal a marked increase in power consumption. A few years ago, a five-megawatt data center was considered significant; soon, however, a five-megawatt capacity will fit within a 10×10-meter room as rack density skyrockets, reducing the need for extensive white space but requiring expanded infrastructure areas. Data centers are scaling up to unprecedented sizes, with discussions now involving capacities of 300-400 megawatts, or even gigawatts. Visualizing a gigawatt-sized facility is challenging, yet that is the direction the industry is moving—toward ultra-dense, compacted facilities where every element is intensified, driving an enormous need for power.
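As a rough sanity check on those figures (our own arithmetic, assuming rack densities in the range quoted above):

```latex
% 5 MW in a 10 x 10 m room, assuming roughly 130 kW racks
\frac{5000\ \mathrm{kW}}{100\ \mathrm{m^2}} = 50\ \mathrm{kW/m^2},
\qquad
\frac{5000\ \mathrm{kW}}{\approx 130\ \mathrm{kW/rack}} \approx 38\ \text{racks}
```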
As data centers continue to grow, how do you view the sustainability aspect associated with these large facilities?
Today, nearly everyone has a mobile phone in hand, yet data centers—the backbone of our digital lives—often face criticism despite being indispensable to modern society. Data centers are not disappearing; on the contrary, they are set to expand as the pace of digitalization accelerates globally. Power generation remains, and will continue to be, a critical challenge, particularly in regions where resources are limited.
Currently, we see significant advancements in AI infrastructure across Northern Europe. Countries like Sweden, Finland, and Norway benefit from ample hydropower and renewable energy sources, making them well-suited for sustainable AI development. Meanwhile, the Middle East is experiencing a technology boom, backed by its rich energy resources and favorable conditions for large-scale investment in data centers.
There’s also a rising trend toward on-site power generation, with organizations increasingly considering dedicated power stations and micro-grids that tap into renewable or alternative energy sources. In the U.S., for instance, discussions are underway about small, mobile nuclear reactors to support local power needs. Finding sustainable power solutions has become imperative. A sudden surge in electric vehicle usage, for instance, could stress current power supplies dramatically, underscoring the need for substantial changes in our energy landscape.
What support and services does Vertiv offer to its customers? What types of infrastructure investments is Vertiv making in various parts of the world?
At Vertiv, service is arguably the most critical aspect of what we do. Selling a product is only one phase of its lifecycle; the real value lies in our ability to maintain and support that product for the next 10 to 15 years. The availability of highly skilled labor and expertise is essential, as we know that issues will inevitably arise. This is why resilience is a cornerstone of data centers. A strong service infrastructure is vital for addressing challenges promptly when they occur. Just as a car will eventually break down, data center systems too will face difficulties over time. At Vertiv, we have developed an exceptional service framework to ensure that we are always prepared to support our clients.
My philosophy is simple: we don’t sell a product unless we can guarantee the service to support it. When you sell a solution, you’re essentially selling a potential future problem, so ensuring your service capabilities are in place is vital for sustainable growth. This commitment to service is one of our key differentiators in the market.
We are continuously enhancing our core production facilities and making ongoing investments in engineering, research, and development. We are also evaluating our global footprint to optimize production capabilities and meet the growing demand. We are not just focused on the immediate needs of today; we are preparing for the demands that will arise in the next six months, a year, or even two to three years. In this race for capacity, the winner will be the one best positioned with the most scalable and resilient capacity.
What is the future of cooling systems in relation to AI chips, and where do you see this race heading?
We are not completely moving away from air cooling; it will always play a role in the equation, and eliminating it entirely would be prohibitively costly. Still, the transition to liquid cooling is a critical step forward. We are already seeing advancements in the liquids used to cool servers, enabling them to absorb higher levels of heat. The primary challenge, however, will be addressing the overall densification of data center systems, as more powerful and compact solutions require innovative approaches to heat management.
Our partnerships with industry leaders like NVIDIA and Intel are essential, as they provide us with invaluable, first-hand insights into the development of cutting-edge chips and GPUs. While cooling and power systems might seem straightforward, AI introduces a new layer of complexity that demands forward-thinking solutions. To meet these challenges, we are making significant investments in research and development to support the AI-driven data centers of today and tomorrow. Our commitment to continuous innovation ensures that we remain at the forefront of these critical advancements.
The Role of Snowflake’s AI Data Cloud in Transforming Enterprise Data Management
Exclusive Interview with Mohammad Zouri, General Manager, META at Snowflake.
What specific innovations is Snowflake highlighting this year at GITEX?
The main highlights this year are certainly centered around AI and GenAI, particularly how we have integrated these capabilities into the Snowflake data platform; we have even rebranded it as the AI Data Cloud. We are harnessing the power of GenAI and presenting these capabilities to our customers so they can bring new workloads to their data while maintaining strong foundations of security and governance.
How does Snowflake’s AI Data Cloud differentiate itself from other platforms?
We are a fully managed service. When we began our journey 12 years ago, we developed a new cloud-native architecture built from the ground up to address market challenges related to scalability and performance. This architecture enables us to handle large volumes of data and support new use cases in advanced and predictive analytics. As we continued on this journey, we expanded our focus to disrupt additional domains within data, including data collaboration and data science, while increasingly delivering applications.
How do we differentiate ourselves? We offer a fully managed service with a very fast time to market and unlimited scalability in both storage and compute. Additionally, we are building a network where each customer can develop their own data strategy and share seamlessly with other customers in their ecosystem. This collaboration drives efficiencies, enhances productivity, and fosters the creation of innovative advanced use cases in the market.
Looking ahead, what are some of the most exciting AI and data-driven trends that Snowflake is focusing on?
In terms of AI and data, everyone today is striving to do more with their data, engaging with it in natural language to improve efficiency and productivity. To support this, on top of our fully managed data service we provide additional fully managed services in the GenAI space, which we call Cortex AI, and we place these capabilities in the hands of new personas.
Previously, platforms were primarily designed for technical teams, data teams, and data scientists, who would build internal services to deliver to their end users, including internal employees and customers. The disruption today lies in the new capabilities we offer, enabling a broader range of personas, including legal, finance, HR, and analysts, to harness the power of GenAI. Any employee can now utilize tools for summarization, translation, sentiment analysis, and interaction with chatbots, all in natural language. This allows users to communicate with their documents and unstructured data, asking questions in everyday language. As a result, we are creating entirely new use cases aimed at enhancing customer satisfaction, improving productivity, and increasing revenue. This acceleration is driven by these advanced capabilities.
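As an illustration of what this looks like in practice, here is a minimal sketch of calling Snowflake Cortex LLM functions from Python over plain SQL. The connection details and the SUPPORT_TICKETS table with its TICKET_TEXT column are hypothetical, and it assumes an account with Cortex enabled:

```python
# Minimal sketch: invoking Snowflake Cortex LLM functions via SQL from Python.
# Assumes the snowflake-connector-python package, placeholder credentials,
# and a hypothetical SUPPORT_TICKETS table with a TICKET_TEXT column.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)

try:
    cur = conn.cursor()
    # Sentiment score and a short summary for each ticket, expressed as SQL.
    cur.execute("""
        SELECT
            TICKET_TEXT,
            SNOWFLAKE.CORTEX.SENTIMENT(TICKET_TEXT)  AS sentiment,
            SNOWFLAKE.CORTEX.SUMMARIZE(TICKET_TEXT)  AS summary
        FROM SUPPORT_TICKETS
        LIMIT 10
    """)
    for text, sentiment, summary in cur:
        print(f"{float(sentiment):+.2f}  {summary}")
finally:
    conn.close()
```

Because the functions are exposed as ordinary SQL, the same query can be run by an analyst directly in a worksheet, with no Python required at all.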