
ON THE FAST LANE

Updated: September 9, 2013, Dubai
By Editor

With internet speeds rising fast, many organizations are seeing their security appliances left behind by the high-speed bandwagon. Ajay Nawani, AVP Global Presales Operations at Cyberoam, explains how the company can help.

Why is it crucial for Cyberoam to launch the Next-Gen series of UTMs at this time?

The NG Series enables SOHO/SMB networks to become future-ready as they move towards gigabit network infrastructure and high Internet speeds. Today, however, their network security is not ready for these trends, as few security vendors offer the security throughputs or gigabit ports such networks require. The result is that security becomes a bottleneck in the network.

The need for a Next-Gen UTM like the NG Series is being driven by several factors, including the rise of superfast internet. We are seeing data moving at 10 times current internet speeds. Examples include Google, which launched a 1,000 Mbps ‘Google Fiber’ Internet service in the US, and IDNet, which launched a 1,000 Mbps leased-line business Ethernet service in the UK. The MEA region will not be far behind. There has also been a device explosion, with multiple internet devices per user on average. In the enterprise, employee-owned smart devices are expected to reach 350 million by 2014, up from 150 million today. Cloud is also driving some of this need for fast internet: a lot of critical business applications and services are moving to the Web, and cloud computing is expected to generate more than $45.4 billion in revenue by 2015.

Why is it important then for security to move with the same speed as the rest of the network?

In many organizations, network infrastructure is moving to gigabit, and anything less is nearly obsolete. Security must keep pace. With gigabit internet speeds soon to be available from ISPs and telcos, the question is whether your security appliance can match these speeds and offer gigabit throughputs. With more devices per employee connecting to the IT network, the approach is changing from ‘Users’ to ‘Devices’, and CIOs need to consider whether their security appliance can support an ever-growing number of devices. The risk is that security becomes a bottleneck in the network. The most viable solution is for organizations to move to future-ready security appliances. The Cyberoam NG series offers ‘Future-Ready’ security with the fastest UTM appliances for the SOHO and SMB segments, while offering Next-Generation firewall protection for large enterprises.

How, then, is the Cyberoam NG series able to offer future-ready security?

The NG Series combines powerful hardware with superfast processing, a must for gigabit internet speeds, and gigabit ports to match. Users also get nanosecond security processing with multicore gigahertz processors. The new NG series features a complete overhaul of the appliance design outside, while inside we have included faster next-gen memory and higher-capacity storage. This ensures the build quality needed for high performance and robustness: the design and components support high-speed I/O throughput while withstanding extreme environments.

A lot of regional Telcos are busy deploying 4G/LTE networks. What does this development portend for UTMs?

Our Next-Gen UTM products are 4G high-speed ready. We offer very easy deployment for high-speed internet connections, including VPN over 4G/LTE, through a simple USB modem plugged into a Cyberoam UTM appliance. Our devices auto-detect modem plug-in and plug-out, making deployment simple. The plug-and-play attributes of our Next-Gen UTMs continue with automatic failover to a 4G/LTE WWAN connection as well as gateway failover over VPN. We’ve also extended this secure connectivity to support BYOD in organizations, offering Layer 8 identity-based security for mobile devices. Users can thus securely connect their iPhones, iPads and Android devices to their corporate network through VPN.
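To make the failover behavior concrete, here is a minimal sketch of a WAN health-check loop that switches the default route to a 4G/LTE WWAN interface when the primary gateway stops responding, and switches back when it recovers. The interface names, probe target, and shell commands are assumptions for illustration, not Cyberoam's implementation.

```python
import subprocess
import time

PRIMARY_WAN = "eth0"    # hypothetical primary gateway interface
BACKUP_WWAN = "wwan0"   # hypothetical 4G/LTE USB modem interface
PROBE_HOST = "8.8.8.8"  # reachability probe target (assumption)

def link_is_up(interface: str) -> bool:
    """Probe reachability through a given interface with a single ping."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", "-I", interface, PROBE_HOST],
        capture_output=True,
    )
    return result.returncode == 0

def set_default_route(interface: str) -> None:
    """Point the default route at the chosen interface (illustrative only)."""
    subprocess.run(["ip", "route", "replace", "default", "dev", interface])

while True:
    # Fail over to the WWAN link when the primary stops responding,
    # and fail back automatically once it recovers.
    set_default_route(PRIMARY_WAN if link_is_up(PRIMARY_WAN) else BACKUP_WWAN)
    time.sleep(10)
```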

With the new UTM, you have also included your own operating system, the Cyberoam OS. Why is this software different?

NG Series appliances come pre-loaded with Cyberoam OS, the most intelligent and powerful Cyberoam firmware to date. The new firmware integrates tightly with the hardware for network and crypto acceleration, delivering very high performance. Cyberoam OS also extracts full performance from a multi-core platform, offering minimal latency and improved processing speed through optimized interrupt rates and FastPath technology.
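To illustrate the general idea behind a firewall fast path (a sketch of the concept, not Cyberoam's actual FastPath code): the first packet of a flow pays for full rule evaluation, and the verdict is cached so later packets of the same connection take a cheap lookup path. All names and the allow-list rule below are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FlowKey:
    """5-tuple identifying a connection."""
    src_ip: str
    dst_ip: str
    src_port: int
    dst_port: int
    proto: str

# Verdict cache: once a flow has been fully inspected, later packets
# take the "fast path" and reuse the cached decision.
flow_table: dict[FlowKey, bool] = {}

def full_inspection(key: FlowKey) -> bool:
    """Slow path: evaluate the complete rule base (placeholder rule)."""
    return key.dst_port in (80, 443)  # hypothetical allow-list

def handle_packet(key: FlowKey) -> bool:
    if key in flow_table:            # fast path: O(1) lookup
        return flow_table[key]
    verdict = full_inspection(key)   # slow path: first packet only
    flow_table[key] = verdict
    return verdict

# The first packet pays the inspection cost; the rest of the flow does not.
k = FlowKey("10.0.0.5", "93.184.216.34", 51514, 443, "tcp")
print(handle_packet(k), handle_packet(k))  # True True
```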

Please discuss a specific security application available with the new UTM.

YouTube for Schools, an educational service from the video-sharing giant, offers a huge repository of relevant and engaging content for students. In most instances, however, teachers cannot access these YouTube educational videos due to institutional security policies. Cyberoam’s Layer-8 controls for “YouTube for Schools” allow access to selected educational content, offer user-level (teacher/student) and group-level (school/class) control, and generate user-based access reports. The application also blocks inappropriate and peripheral content such as ads, comments and links.
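For context on the mechanism, YouTube for Schools was enforced at the network edge by tagging outbound YouTube requests with a school-specific HTTP header. The sketch below shows that general idea with a hypothetical school ID; it illustrates the mechanism rather than Cyberoam's code.

```python
# Minimal sketch of gateway-side enforcement for "YouTube for Schools":
# outbound requests to YouTube are tagged with the school's EDU filter
# header so YouTube serves only the approved educational catalogue.
import urllib.request

SCHOOL_ID = "ABCD1234"  # hypothetical ID issued to the school by YouTube

def fetch_youtube(url: str) -> bytes:
    req = urllib.request.Request(url)
    # The X-YouTube-Edu-Filter header restricts results to the
    # school's approved educational content.
    req.add_header("X-YouTube-Edu-Filter", SCHOOL_ID)
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```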

As virtualization grows in the region, what security challenges are organizations facing in this area?

Existing UTMs cannot scan inter-VM traffic with external security hardware, despite the fact that a single compromised virtual machine can infect the entire data center. Virtual machines also face a higher risk of attacks on the hypervisor management console, and virtualized applications remain exposed to the same vulnerabilities hackers exploit elsewhere. We have the right solutions for protecting virtualized servers from intrusion attacks: the Cyberoam Virtual Appliance resides on the virtual layer and scans all traffic inside the virtual environment.


Bitdefender Shapes Modern Cloud Security Solutions


Exclusive interview with Mr. Raphaël Peyret, Director of Product Management, Cloud Security, Bitdefender

What is Bitdefender’s main objective at GITEX this year?  

GITEX is a great opportunity to meet not only our end customers but also our partners. We are a channel company and sell exclusively through partners, so it’s great to have partners from all over the region and the world come to us so that we can connect again in person. We are also presenting some of our newer products and offerings. For example, this year we launched a new cloud security offering, and on the risk-management side we recently announced something called PHASR. Those are some of the big things we hope to spread the word about.

As a cybersecurity company, how does Bitdefender keep itself ahead of the competition?

Every company in the space has its own angle on this, and for us, that angle is technology. We’re a technology company at heart. Our founder, a university professor, founded Bitdefender 23 years ago, and 50% of our teams work on R&D. We are not a marketing company. We are a technology company, and that comes across in our products, where technically speaking we are really, really good. We have people doing reverse engineering of malware, working with the FBI and INTERPOL, as well as building the B2B products that you will see here. Under the hood, there’s a ton of work on core technology.

Right now, a lot of companies are moving toward having cloud security. What is Bitdefender doing in this sector/market?

At Bitdefender, we recognize that while the cloud requires a tailored approach, it cannot be treated as entirely separate from other environments. Attackers won’t stop at the cloud’s edge, so isolated approaches only create exploitable silos. That’s why our cloud security platform and cloud-native application protection are integrated into our broader security framework. Our capabilities correlate data and responses across all environments—cloud, endpoints, email, mobile, or IoT. This unified approach ensures comprehensive threat prevention, detection, and response, following attackers wherever they move.

How are you utilizing AI and machine learning in enhancing your threat detection capabilities?

We often get asked about AI and machine learning, especially now, but for our technical teams it’s nothing new; it’s like discussing computing itself. Bitdefender has been leveraging machine learning on endpoints for over a decade. Instead of just claiming “we have AI,” we deploy a vast range of models fine-tuned for different security needs, including machine-learning, heuristics-based, and generative models. These aren’t just buzzwords; each model serves a specific security purpose, optimized through rigorous R&D. In 2017, a new ransomware family was blocked by a model from 2014, demonstrating the effectiveness of our approach. Moreover, every device with our protection uses its own customized AI model, meaning we effectively deploy millions of unique models tailored to individual devices.
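As a toy illustration of layering several detectors rather than relying on a single model, the sketch below combines a hand-written heuristic with a stubbed ML score into one verdict. The features, thresholds, and scores are invented for the example and do not describe Bitdefender's models.

```python
from dataclasses import dataclass

@dataclass
class FileSample:
    name: str
    entropy: float        # high entropy often indicates packing/encryption
    imports_crypto: bool  # e.g. links against crypto APIs

def heuristic_score(s: FileSample) -> float:
    """Hand-written rule: packed binaries that import crypto look risky."""
    return 0.8 if (s.entropy > 7.0 and s.imports_crypto) else 0.1

def ml_score(s: FileSample) -> float:
    """Stub standing in for a trained classifier's probability output."""
    return min(1.0, s.entropy / 8.0)  # placeholder, not a real model

def verdict(s: FileSample, threshold: float = 0.7) -> bool:
    # Take the maximum across detectors so any one strong signal blocks.
    return max(heuristic_score(s), ml_score(s)) >= threshold

sample = FileSample("invoice.exe", entropy=7.6, imports_crypto=True)
print(verdict(sample))  # True -> block
```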

Considering that hackers are aware of cybersecurity measures from companies like Bitdefender, are they constantly innovating new tactics to bypass these defenses?

Hackers driven by financial gain often target more vulnerable systems because they offer easier access with lower investment. If defenses are strong enough to make hacking unprofitable, attackers tend to move on. Despite this, hackers continue to innovate, particularly in malware and ransomware. For instance, they increasingly use harder-to-reverse-engineer languages like Rust to slow down security teams’ efforts to create decryptors. They also explore advanced cryptography, anticipating quantum computing’s potential to break traditional security methods. This constant evolution creates a cat-and-mouse dynamic; the goal remains to make attacks so costly that cybercriminals are deterred.

Do you also consider the perspective and strategies of hackers when developing your security measures?

Of course, and in fact the services we offer include penetration testing and red teaming. Red teaming is the one closest to the hacker mindset, because organizations will come and say, “Here are our crown jewels. Here’s the safe. Here are the rules of the game. You’re not allowed to beat my CEO up; everything else is on the table. Break in and tell me how you did it.” For that, we basically have white-hat hackers, the good hackers. They hack you so that you know how another hacker would do it and can block it. We do that. And of course, it’s important not to separate yourself from the hacker mindset if you want to do cybersecurity properly, because otherwise you’re blind to where the risks actually are.

Vertiv Outlines Data Center Evolution and AI Infrastructure Strategy

Exclusive Interview with Peter Lambrecht, Vice President Sales, EMEA, Vertiv

What specific challenges do clients face in powering and cooling AI infrastructure, and how did you address these at GITEX this year?

At GITEX this year, the focus on artificial intelligence (AI) was unmistakable. Every booth showcased AI-driven solutions running on GPU-powered servers from leading companies like NVIDIA. However, for AI applications to function effectively, the right infrastructure is essential to power and cool these GPUs, ensuring smooth and efficient performance. For us, this highlights our expertise in AI infrastructure, designed to support these platforms optimally.

One of our key focus areas is liquid cooling, as traditional air cooling in data centers is no longer sufficient. With rack densities now reaching 50, 60, and even up to 132 kilowatts per rack, air cooling alone cannot handle the thermal load. Liquid cooling has become critical, efficiently drawing heat away and directing it elsewhere. This core technology, developed with our partner NVIDIA, supports the deployment of GPUs worldwide, and together we provide, market, and deliver these advanced solutions to our clients.
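To put those densities in perspective, a back-of-the-envelope calculation using the standard relation Q = ṁ·c_p·ΔT shows why liquid cooling becomes necessary: removing 132 kW with water at an assumed 10 °C temperature rise takes roughly 3 kg of coolant per second. (The temperature rise is an assumption for illustration.)

```python
# Back-of-the-envelope coolant flow for a 132 kW rack: Q = m_dot * c_p * dT.
Q = 132_000          # heat load in watts (rack density quoted above)
c_p = 4186           # specific heat of water, J/(kg*K)
dT = 10              # assumed coolant temperature rise across the rack, K

m_dot = Q / (c_p * dT)              # required mass flow, kg/s
litres_per_min = m_dot * 60         # ~1 kg of water is ~1 litre
print(f"{m_dot:.2f} kg/s  (~{litres_per_min:.0f} L/min)")
# -> roughly 3.15 kg/s, about 189 L/min of water for a single rack
```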

Cooling, however, is only one part of the equation. The shift to AI also requires a comprehensive approach to power management, as AI workloads significantly alter electricity load patterns. Our solutions are designed to meet these power demands, and we have showcased these capabilities at our booth. We’ve been actively engaging with partners and clients to address these challenges as they implement their AI solutions, ensuring that both cooling and power needs are effectively met.

Can you elaborate on your partnership with NVIDIA and how it has evolved over the years?

Our partnership with NVIDIA has grown significantly over the years, and over the past year it has reached an unprecedented level of collaboration. Both of our CEOs, Jensen Huang of NVIDIA and Giordano Albertazzi of Vertiv, communicate regularly, aligning closely on joint development initiatives. Together, we create environments optimized for NVIDIA’s cutting-edge chips, ensuring they have the necessary infrastructure to operate at peak performance.

In a recent joint announcement, both CEOs unveiled new solutions that integrate NVIDIA GPUs with Vertiv’s advanced infrastructure, designed to maximize efficiency and reliability. This collaboration represents the core strength of our partnership, where we have refined reference designs that allow NVIDIA to deploy their GPUs seamlessly and effectively on a global scale.

What current trends do you see in the data center and critical infrastructure market? With many hyperscalers entering the market, what is your perspective on this development?

The data center market is dominated by hyperscalers, whether through their direct deployments or co-location facilities—two primary models for quickly scaling data center capacity. These hyperscalers are making substantial investments in infrastructure, fueling competition in this fast-evolving landscape.

The AI race is fully underway, with industry giants all striving toward the same goal. As their infrastructure partner, we are advancing with them, providing the essential support they need to drive this innovation. At the same time, the growth of the cloud sector remains foundational and continues to expand robustly. What we are seeing now is a dual growth trajectory: traditional cloud business growth compounded by the accelerated demand for AI infrastructure.

Trends in data centers reveal a marked increase in power consumption. A few years ago, a five-megawatt data center was considered significant; soon, however, a five-megawatt capacity will fit within a 10×10-meter room as rack density skyrockets, reducing the need for extensive white space but requiring expanded infrastructure areas. Data centers are scaling up to unprecedented sizes, with discussions now involving capacities of 300-400 megawatts, or even gigawatts. Visualizing a gigawatt-sized facility is challenging, yet that is the direction the industry is moving—toward ultra-dense, compacted facilities where every element is intensified, driving an enormous need for power.
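That compaction claim is easy to sanity-check with the figures already quoted: five megawatts over a 10×10-meter footprint is 50 kW per square meter, or roughly 38 racks at the 132 kW-per-rack density mentioned earlier.

```python
# Sanity-checking the "5 MW in a 10x10 m room" claim with the figures above.
capacity_w = 5_000_000       # five-megawatt data hall
floor_m2 = 10 * 10           # 10 x 10 meter room

density = capacity_w / floor_m2
print(f"{density / 1000:.0f} kW per square meter")   # 50 kW/m^2

racks = capacity_w / 132_000  # at the 132 kW/rack density quoted earlier
print(f"~{racks:.0f} racks")  # ~38 racks
```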

As data centers continue to grow, how do you view the sustainability aspect associated with these large facilities?

Today, nearly everyone has a mobile phone in hand, yet data centers—the backbone of our digital lives—often face criticism despite being indispensable to modern society. Data centers are not disappearing; on the contrary, they are set to expand as the pace of digitalization accelerates globally. Power generation remains, and will continue to be, a critical challenge, particularly in regions where resources are limited.

Currently, we see significant advancements in AI infrastructure across Northern Europe. Countries like Sweden, Finland, and Norway benefit from ample hydropower and renewable energy sources, making them well-suited for sustainable AI development. Meanwhile, the Middle East is experiencing a technology boom, backed by its rich energy resources and favorable conditions for large-scale investment in data centers.

There’s also a rising trend toward on-site power generation, with organizations increasingly considering dedicated power stations and micro-grids that tap into renewable or alternative energy sources. In the U.S., for instance, discussions are underway about small, mobile nuclear reactors to support local power needs. Finding sustainable power solutions has become imperative. A sudden surge in electric vehicle usage, for instance, could stress current power supplies dramatically, underscoring the need for substantial changes in our energy landscape.

What support and services does Vertiv offer to its customers? What types of infrastructure investments is Vertiv making in various parts of the world?

At Vertiv, service is arguably the most critical aspect of what we do. Selling a product is only one phase of its lifecycle; the real value lies in our ability to maintain and support that product for the next 10 to 15 years. The availability of highly skilled labor and expertise is essential, as we know that issues will inevitably arise. This is why resilience is a cornerstone of data centers. A strong service infrastructure is vital for addressing challenges promptly when they occur. Just as a car will eventually break down, data center systems too will face difficulties over time. At Vertiv, we have developed an exceptional service framework to ensure that we are always prepared to support our clients.

My philosophy is simple: we don’t sell a product unless we can guarantee the service to support it. When you sell a solution, you’re essentially selling a potential future problem, so ensuring your service capabilities are in place is vital for sustainable growth. This commitment to service is one of our key differentiators in the market.

We are continuously enhancing our core production facilities and making ongoing investments in engineering, research, and development. We are also evaluating our global footprint to optimize production capabilities and meet the growing demand. We are not just focused on the immediate needs of today; we are preparing for the demands that will arise in the next six months, a year, or even two to three years. In this race for capacity, the winner will be the one best positioned with the most scalable and resilient capacity.

What is the future of cooling systems in relation to AI chips, and where do you see this race heading?

We are not completely moving away from air cooling; it will always play a role in the equation, and fully eliminating it would be prohibitively costly. The transition to liquid cooling, however, is a critical step forward. We are already seeing advancements in the liquids used to cool servers, enabling them to absorb higher levels of heat. But the primary challenge will be addressing the overall densification of data center systems, as more powerful and compact solutions require innovative approaches to heat management.

Our partnerships with industry leaders like NVIDIA and Intel are essential, as they provide us with invaluable, first-hand insights into the development of cutting-edge chips and GPUs. While cooling and power systems might seem straightforward, AI introduces a new layer of complexity that demands forward-thinking solutions. To meet these challenges, we are making significant investments in research and development to support the AI-driven data centers of today and tomorrow. Our commitment to continuous innovation ensures that we remain at the forefront of these critical advancements.


The Role of Snowflake’s AI Data Cloud in Transforming Enterprise Data Management


Exclusive Interview with Mohammad Zouri, General Manager, META at Snowflake.

What specific innovations is Snowflake highlighting this year at GITEX?

The main highlights this year are certainly centered around AI and GenAI, particularly how we have integrated these capabilities into the Snowflake data platform; we have even rebranded it as the AI Data Cloud. We are harnessing the power of GenAI and presenting all these capabilities to our customers, bringing new workloads to their data while ensuring they maintain strong foundations within the boundaries of security and governance.

How does Snowflake’s AI Data Cloud differentiate itself from other platforms?

We are a fully managed service. When we began our journey 12 years ago, we developed a new cloud-native architecture built from the ground up to address market challenges related to scalability and performance. This architecture enables us to handle large volumes of data and support new use cases in advanced and predictive analytics. As we continued on this journey, we expanded our focus to disrupt additional domains within data, including data collaboration and data science, while increasingly delivering applications.

How do we differentiate ourselves? We offer a fully managed service with a very fast time to market and unlimited scalability in both storage and compute. Additionally, we are building a network where each customer can develop their own data strategy and share seamlessly with other customers in their ecosystem. This collaboration drives efficiencies, enhances productivity, and fosters the creation of innovative advanced use cases in the market.

Looking ahead, what are some of the most exciting AI and data-driven trends that Snowflake is focusing on?

In terms of AI and data, everyone today is striving to do more with their data, engaging with it in natural language to improve efficiency and productivity. To support this, through our fully managed data service, we provide additional fully managed services in the GenAI space, which we call Cortex AI. We place it in the hands of new personas.

Previously, platforms were primarily designed for technical teams, data teams, and data scientists, who would build internal services to deliver to their end users, including internal employees and customers. The disruption today lies in the new capabilities we offer, enabling a broader range of personas, including legal, finance, HR, and analysts, to harness the power of GenAI. Any employee can now utilize tools for summarization, translation, sentiment analysis, and interaction with chatbots, all in natural language. This allows users to communicate with their documents and unstructured data, asking questions in everyday language. As a result, we are creating entirely new use cases aimed at enhancing customer satisfaction, improving productivity, and increasing revenue. This acceleration is driven by these advanced capabilities.
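As a concrete illustration of those natural-language capabilities, Snowflake exposes Cortex functions directly in SQL, which can be called from Python through the official connector. In the sketch below, the credentials, table, and column names are placeholders, and function availability varies by account and region, so treat it as an illustrative sketch rather than a verified recipe.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials -- substitute your own account details.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="my_wh", database="my_db", schema="public",
)
cur = conn.cursor()

# Cortex functions are called like ordinary SQL functions, so analysts
# can use them without standing up any ML infrastructure themselves.
cur.execute("""
    SELECT
        SNOWFLAKE.CORTEX.SENTIMENT(review_text)              AS sentiment,
        SNOWFLAKE.CORTEX.SUMMARIZE(review_text)              AS summary,
        SNOWFLAKE.CORTEX.TRANSLATE(review_text, 'en', 'ar')  AS arabic
    FROM customer_reviews  -- hypothetical table and column
    LIMIT 10
""")
for sentiment, summary, arabic in cur:
    print(sentiment, summary[:60], arabic[:60])
```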
