Tech Interviews
Protecting Each Device is Difficult
Vulnerable urban futurism technologies are fair game not only for mischievous hackers but also for crime groups and hostile nation states, warns Steve Hicks, Head of Global Sales, BullGuard, in a discussion with VAR.
Q1. Smart homes are central to Internet of Things (IoT) which makes them susceptible to cyber-attacks. How can smart homes also be made safe homes?
A1. It’s difficult to protect each individual device. This places too much onus on the owner if they, for instance, have six or seven smart devices in the home. Even well-known brand devices have proved to be vulnerable to hacks while many mass-produced devices have really poor protection such as default passwords and admin details.
Protecting all devices in a home requires time and technical insights that most people don’t have. These include looking for firmware updates and keeping an eye on network traffic to see if something is leaving or entering the network that shouldn’t. What is required is an overarching protection that starts at the network router, monitors all devices on a network simultaneously and also monitors network traffic. Fused with artificial intelligence, machine learning and cloud-based security, this level of protection acts like a security blanket thrown over the home. It can identify and stop attacks in real time and assess the security status of new devices when they are attached to the smart home network.
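The router-level screening described above can be sketched in a few lines. This is a minimal illustration, not any product's actual engine: the MAC addresses and the approved-device table are hypothetical examples.

```python
# Minimal sketch of network-level device screening: flag any device that
# appears on the home network but is not on an approved list. The MAC
# addresses and device names below are hypothetical examples.

APPROVED_DEVICES = {
    "aa:bb:cc:00:00:01": "smart thermostat",
    "aa:bb:cc:00:00:02": "door camera",
}

def screen_devices(seen_macs):
    """Return the MAC addresses seen on the network that are not approved."""
    return sorted(mac for mac in seen_macs if mac not in APPROVED_DEVICES)

unknown = screen_devices(["aa:bb:cc:00:00:01", "de:ad:be:ef:00:99"])
print(unknown)  # any unrecognised device would trigger an alert
```

A real smart-home security layer adds behavioural analysis on top of this allow-list check, but the principle is the same: the router, not the individual device owner, decides what is trusted.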
Q2. Malware is one of the most common methods of stealing personally identifiable customer information. What can be done to protect unsuspecting customers from falling victim to such attacks?
A2. It’s a fundamental step, but people need to use good antivirus software that applies both signature-based and behaviour-based detection. Signature-based detection takes care of the millions of known malware variants, while behaviour-based detection identifies zero-day malware and stops it in its tracks as it tries to penetrate computers. Another important point is awareness of email-based phishing attacks. This is a popular route for hackers, and people need to be aware that emails requesting sensitive personal information such as payment card numbers, or providing links to web pages that also request personal information, are scams designed to extract important financial details.
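The two detection layers mentioned above can be illustrated with a toy example. This is not any vendor's engine: the payload hashes and action names are invented, and real behavioural engines use far richer signals than a simple action set.

```python
import hashlib

# Toy illustration of the two detection layers: a signature check that
# hashes a payload against known malware, and a behaviour check that flags
# a suspicious combination of actions. Hashes and action names are invented.

KNOWN_BAD_HASHES = {hashlib.sha256(b"known malware payload").hexdigest()}
RANSOMWARE_BEHAVIOUR = {"encrypts_user_files", "deletes_backups"}

def signature_match(payload: bytes) -> bool:
    """Catches known variants: a hash lookup against a signature database."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_BAD_HASHES

def behaviour_match(observed_actions: set) -> bool:
    """Catches zero-days: no known hash, but the behaviour betrays it."""
    return RANSOMWARE_BEHAVIOUR.issubset(observed_actions)

print(signature_match(b"known malware payload"))  # True: known variant caught
print(signature_match(b"never-seen payload"))     # False: signature misses it
print(behaviour_match({"encrypts_user_files", "deletes_backups", "phones_home"}))  # True
```

The point of the second layer is exactly the gap the second print exposes: a zero-day has no signature, so only its behaviour gives it away.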
Q3. Data breaches are at an all-time high. What must businesses do to ensure protection of sensitive customer data?
A3. In addition to having fundamental security technologies in place businesses must adopt a zero-trust model across the entire enterprise, which in turn informs how they protect sensitive customer data. This is a ‘never trust, always verify’ approach. This then protects against a wide range of existing and evolving threats. For instance, under this model those responsible ask questions such as ‘is this third-party script on our website secure or can it be exploited by hackers?’ This detailed questioning approach requires investment but it makes all the difference as to whether sensitive data is stolen or kept safe.
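The third-party script question above has at least one concrete, standardised answer: Subresource Integrity (SRI), where the site pins a hash of the script in the `<script>` tag so the browser refuses to run any copy whose contents have changed. The sketch below computes the integrity value for a made-up script body.

```python
import base64
import hashlib

# Subresource Integrity (SRI) sketch: compute the integrity value a site
# would pin for a third-party script. The script body here is a made-up
# example; in practice you hash the exact bytes the CDN serves.

def sri_sha384(script_body: bytes) -> str:
    digest = hashlib.sha384(script_body).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

script = b"console.log('analytics');"
print(sri_sha384(script))
# The value goes into the tag, e.g.:
# <script src="https://cdn.example/analytics.js"
#         integrity="sha384-..." crossorigin="anonymous"></script>
```

If the hosted script is ever tampered with, its hash no longer matches the pinned value and the browser blocks it, which is 'never trust, always verify' applied to one specific attack surface.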
Q4. With internet becoming ubiquitous, more and more kids are getting online. What precautions can parents take to ensure their kids’ safety online?
A4. Education, combined with parental tools that enable discreet monitoring of what children are doing online. Children receive cyber security training on stranger danger at school, and it does no harm to reinforce this at home. One of the more disturbing perils is children stumbling across inappropriate content, which can have an adverse impact on developing minds. Unhealthy peer pressure, social media bullying and posting risqué images are also important things to look out for. This is where parental controls are very helpful.
Q5. In the wake of urban futurism, what is the basic security hygiene that businesses and customers alike must maintain?
A5. Urban futurism is a short phrase for a very big topic, covering everything from renewable energies, all kinds of web technology, blockchain, tactical urbanism and decentralised networks to autonomous cars and a lot more. People are getting excited about the potential of new technologies, whether they are town planners, civic designers, power companies and so on. However, to ensure basic security hygiene, after asking the question ‘what can this technology do?’ a follow-up question is required: ‘is it secure or is it vulnerable to hacking? And if so, how?’ This is less a requirement and more of an attitude. Without this questioning stance we are going to see an awful lot of new tech implementations that are ripe for hacking. And it’s not just singular devices we are talking about; it’s also the networks they sit on and how these networks can be exploited.
Bitdefender Shapes Modern Cloud Security Solutions
Exclusive interview with Mr. Raphaël Peyret, Director of Product Management — Cloud Security, Bitdefender
What is Bitdefender’s main objective at GITEX this year?
GITEX is a great opportunity to meet not only our end customers but also our partners. We are a channel company and we sell exclusively through partners, so it’s great to have partners from all over the region and the world come to us so that we can connect again in person. We are also presenting some of our newer products and offerings. For example, this year we launched a new cloud security offering, and on the risk management side we recently announced something called PHASR. Those are some of the big things we hope to spread the word about.
As a cybersecurity company, how does Bitdefender keep itself ahead of the competition?
Every company in the space has its own angle on this, and for us that angle is technology. We’re a technology company at heart; our founder, a university professor, founded Bitdefender 23 years ago, and 50% of our teams work on R&D. We are not a marketing company. We are a technology company, and that comes across in our products, where technically speaking we are really, really good. We have people doing reverse engineering of malware, working with the FBI and INTERPOL, as well as building the B2B products that you will see here. Under the hood, there’s a ton of work on core technology.
Right now, a lot of companies are moving toward having cloud security. What is Bitdefender doing in this sector/market?
At Bitdefender, we recognize that while the cloud requires a tailored approach, it cannot be treated as entirely separate from other environments. Attackers won’t stop at the cloud’s edge, so isolated approaches only create exploitable silos. That’s why our cloud security platform and cloud-native application protection are integrated into our broader security framework. Our capabilities correlate data and responses across all environments—cloud, endpoints, email, mobile, or IoT. This unified approach ensures comprehensive threat prevention, detection, and response, following attackers wherever they move.
How are you utilizing AI and machine learning in enhancing your threat detection capabilities?
We often get asked about AI and machine learning, especially now, but for our technical teams, it’s nothing new—it’s like discussing computing itself. Bitdefender has been leveraging machine learning on endpoints for over a decade. Instead of just claiming “we have AI,” we deploy a vast range of models fine-tuned for different security needs, including machine learning, heuristics-based, and generative models. These aren’t just buzzwords; each model serves a specific security purpose, optimized through rigorous R&D. In 2017, a new ransomware was blocked by a 2014 model—demonstrating the effectiveness of our approach. Moreover, every device with our protection uses its own customized AI model, meaning we effectively deploy millions of unique models tailored to individual devices.
Considering that hackers are aware of cybersecurity measures from companies like Bitdefender, are they constantly innovating new tactics to bypass these defenses?
Hackers driven by financial gain often target more vulnerable systems because they offer easier access with lower investment. If defenses are strong enough to make hacking unprofitable, attackers tend to move on. Despite this, hackers continue to innovate, particularly in malware and ransomware. For instance, they increasingly use harder-to-reverse-engineer languages like Rust to slow down security teams’ efforts to create decryptors. They also explore advanced cryptography, anticipating quantum computing’s potential to break traditional security methods. This constant evolution creates a cat-and-mouse dynamic; the goal remains to make attacks so costly that cybercriminals are deterred.
Do you also consider the perspective and strategies of hackers when developing your security measures?
Of course, and in fact the services that we offer include penetration testing and red teaming. Red teaming is the one that’s really closest to the hacker mindset, because organizations will come and say, “Here are our crown jewels. Here’s the safe. Here are the rules of the game. You’re not allowed to beat my CEO up; everything else is on the table. Break in and tell me how you did it.” For that, we basically have white-hat hackers, the good hackers. They hack you so that you know how another hacker would do it and so that you can block it. We do that. And of course, it’s important not to separate the hacker mindset from doing cybersecurity properly, because otherwise you’re blind to where the risks actually are.
Vertiv Outlines Data Center Evolution and AI Infrastructure Strategy
Exclusive Interview with Peter Lambrecht, Vice President Sales, EMEA, Vertiv
What specific challenges do clients face in powering and cooling AI infrastructure, and how did you address these at GITEX this year?
At GITEX this year, the focus on artificial intelligence (AI) was unmistakable. Every booth showcased AI-driven solutions running on GPU-powered servers from leading companies like NVIDIA. However, for AI applications to function effectively, the right infrastructure is essential to power and cool these GPUs, ensuring smooth and efficient performance. For us, this highlights our expertise in AI infrastructure, designed to support these platforms optimally.
One of our key focus areas is liquid cooling, as traditional air cooling in data centers is no longer sufficient. With rack densities now reaching 50, 60, and even up to 132 kilowatts per rack, air cooling alone cannot handle the thermal load. Liquid cooling has become critical, efficiently drawing heat away and directing it elsewhere. This core technology, developed with our partner NVIDIA, supports the deployment of GPUs worldwide, and together we provide, market, and deliver these advanced solutions to our clients.
Cooling, however, is only one part of the equation. The shift to AI also requires a comprehensive approach to power management, as AI workloads significantly alter electricity load patterns. Our solutions are designed to meet these power demands, and we have showcased these capabilities at our booth. We’ve been actively engaging with partners and clients to address these challenges as they implement their AI solutions, ensuring that both cooling and power needs are effectively met.
Can you elaborate on your partnership with NVIDIA and how it has evolved over the years?
Our partnership with NVIDIA has grown significantly over the years, and over the past year it has reached an unprecedented level of collaboration. Both of our CEOs, Jensen Huang of NVIDIA and Giordano Albertazzi of Vertiv, communicate regularly, aligning closely on joint development initiatives. Together, we create environments optimized for NVIDIA’s cutting-edge chips, ensuring they have the necessary infrastructure to operate at peak performance.
In a recent joint announcement, both CEOs unveiled new solutions that integrate NVIDIA GPUs with Vertiv’s advanced infrastructure, designed to maximize efficiency and reliability. This collaboration represents the core strength of our partnership, where we have refined reference designs that allow NVIDIA to deploy their GPUs seamlessly and effectively on a global scale.
What current trends do you see in the data center and critical infrastructure market? With many hyperscalers entering the market, what is your perspective on this development?
The data center market is dominated by hyperscalers, whether through their direct deployments or co-location facilities—two primary models for quickly scaling data center capacity. These hyperscalers are making substantial investments in infrastructure, fueling competition in this fast-evolving landscape.
The AI race is fully underway, with industry giants all striving toward the same goal. As their infrastructure partner, we are advancing with them, providing the essential support they need to drive this innovation. At the same time, the growth of the cloud sector remains foundational and continues to expand robustly. What we are seeing now is a dual growth trajectory: traditional cloud business growth compounded by the accelerated demand for AI infrastructure.
Trends in data centers reveal a marked increase in power consumption. A few years ago, a five-megawatt data center was considered significant; soon, however, a five-megawatt capacity will fit within a 10×10-meter room as rack density skyrockets, reducing the need for extensive white space but requiring expanded infrastructure areas. Data centers are scaling up to unprecedented sizes, with discussions now involving capacities of 300-400 megawatts, or even gigawatts. Visualizing a gigawatt-sized facility is challenging, yet that is the direction the industry is moving—toward ultra-dense, compacted facilities where every element is intensified, driving an enormous need for power.
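The density claim above survives a back-of-envelope check. Using the top-end 132 kW-per-rack figure quoted earlier, and an assumed (illustrative, not industry-standard) footprint per rack including aisle space:

```python
# Back-of-envelope check: at the top-end rack density of 132 kW quoted
# earlier, how much floor space does a 5 MW IT load actually need?
# The per-rack footprint including aisles is an illustrative assumption.

capacity_kw = 5_000    # the 'significant' data center of a few years ago
kw_per_rack = 132      # top-end rack density cited above
m2_per_rack = 2.5      # assumed footprint per rack incl. aisles (hypothetical)

racks = capacity_kw / kw_per_rack      # about 38 racks
floor_m2 = racks * m2_per_rack         # about 95 square metres
print(f"{racks:.0f} racks, about {floor_m2:.0f} square metres")
```

Roughly 38 racks on roughly 95 square metres: under these assumptions, 5 MW does indeed fit inside a 10×10-meter room, which is the compaction the interview describes.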
As data centers continue to grow, how do you view the sustainability aspect associated with these large facilities?
Today, nearly everyone has a mobile phone in hand, yet data centers—the backbone of our digital lives—often face criticism despite being indispensable to modern society. Data centers are not disappearing; on the contrary, they are set to expand as the pace of digitalization accelerates globally. Power generation remains, and will continue to be, a critical challenge, particularly in regions where resources are limited.
Currently, we see significant advancements in AI infrastructure across Northern Europe. Countries like Sweden, Finland, and Norway benefit from ample hydropower and renewable energy sources, making them well-suited for sustainable AI development. Meanwhile, the Middle East is experiencing a technology boom, backed by its rich energy resources and favorable conditions for large-scale investment in data centers.
There’s also a rising trend toward on-site power generation, with organizations increasingly considering dedicated power stations and micro-grids that tap into renewable or alternative energy sources. In the U.S., for instance, discussions are underway about small, mobile nuclear reactors to support local power needs. Finding sustainable power solutions has become imperative. A sudden surge in electric vehicle usage, for instance, could stress current power supplies dramatically, underscoring the need for substantial changes in our energy landscape.
What support and services does Vertiv offer to its customers? What types of infrastructure investments is Vertiv making in various parts of the world?
At Vertiv, service is arguably the most critical aspect of what we do. Selling a product is only one phase of its lifecycle; the real value lies in our ability to maintain and support that product for the next 10 to 15 years. The availability of highly skilled labor and expertise is essential, as we know that issues will inevitably arise. This is why resilience is a cornerstone of data centers. A strong service infrastructure is vital for addressing challenges promptly when they occur. Just as a car will eventually break down, data center systems too will face difficulties over time. At Vertiv, we have developed an exceptional service framework to ensure that we are always prepared to support our clients.
My philosophy is simple: we don’t sell a product unless we can guarantee the service to support it. When you sell a solution, you’re essentially selling a potential future problem, so ensuring your service capabilities are in place is vital for sustainable growth. This commitment to service is one of our key differentiators in the market.
We are continuously enhancing our core production facilities and making ongoing investments in engineering, research, and development. We are also evaluating our global footprint to optimize production capabilities and meet the growing demand. We are not just focused on the immediate needs of today; we are preparing for the demands that will arise in the next six months, a year, or even two to three years. In this race for capacity, the winner will be the one best positioned with the most scalable and resilient capacity.
What is the future of cooling systems in relation to AI chips, and where do you see this race heading?
We are not completely moving away from air cooling; it will always play a role in the equation, and fully eliminating it would be prohibitively costly. The transition to liquid cooling, however, is a critical step forward. We are already seeing advancements in the liquids used to cool servers, enabling them to absorb higher levels of heat. The primary challenge will be addressing the overall densification of data center systems, as more powerful and compact solutions require innovative approaches to heat management.
Our partnerships with industry leaders like NVIDIA and Intel are essential, as they provide us with invaluable, first-hand insights into the development of cutting-edge chips and GPUs. While cooling and power systems might seem straightforward, AI introduces a new layer of complexity that demands forward-thinking solutions. To meet these challenges, we are making significant investments in research and development to support the AI-driven data centers of today and tomorrow. Our commitment to continuous innovation ensures that we remain at the forefront of these critical advancements.
The Role of Snowflake’s AI Data Cloud in Transforming Enterprise Data Management
Exclusive Interview with Mohammad Zouri, General Manager, META at Snowflake.
What specific innovations is Snowflake highlighting this year at GITEX?
The main highlights this year are certainly centered around AI and GenAI, particularly how we have integrated these capabilities into our Snowflake data platform. We have even rebranded to AI Data Cloud. We are harnessing the power of GenAI and presenting all these capabilities to our customers to bring new workloads to their data while ensuring they maintain strong foundations within the boundaries of security and governance.
How does Snowflake’s AI Data Cloud differentiate itself from other platforms?
We are a fully managed service. When we began our journey 12 years ago, we developed a new cloud-native architecture built from the ground up to address market challenges related to scalability and performance. This architecture enables us to handle large volumes of data and support new use cases in advanced and predictive analytics. As we continued on this journey, we expanded our focus to disrupt additional domains within data, including data collaboration and data science, while increasingly delivering applications.
How do we differentiate ourselves? We offer a fully managed service with a very fast time to market and unlimited scalability in both storage and compute. Additionally, we are building a network where each customer can develop their own data strategy and share seamlessly with other customers in their ecosystem. This collaboration drives efficiencies, enhances productivity, and fosters the creation of innovative advanced use cases in the market.
Looking ahead, what are some of the most exciting AI and data-driven trends that Snowflake is focusing on?
In terms of AI and data, everyone today is striving to do more with their data, engaging with it in natural language to improve efficiency and productivity. To support this, through our fully managed data service, we provide additional fully managed services in the GenAI space, which we call Cortex AI. We place it in the hands of new personas.
Previously, platforms were primarily designed for technical teams, data teams, and data scientists, who would build internal services to deliver to their end users, including internal employees and customers. The disruption today lies in the new capabilities we offer, enabling a broader range of personas, including legal, finance, HR, and analysts, to harness the power of GenAI. Any employee can now utilize tools for summarization, translation, sentiment analysis, and interaction with chatbots, all in natural language. This allows users to communicate with their documents and unstructured data, asking questions in everyday language. As a result, we are creating entirely new use cases aimed at enhancing customer satisfaction, improving productivity, and increasing revenue. This acceleration is driven by these advanced capabilities.