
Edge Computing vs Fog Computing: What’s the Difference?

Edge computing and fog computing both allow data to be processed within a local network rather than sent to the cloud, which decreases latency and improves security. The main difference between the two is where that processing happens. With edge computing, data processing typically occurs directly on the sensor-equipped product that collects the information or on a gateway device physically close to those sensors. Fog computing moves those activities onto local area network (LAN) hardware or processors connected to it, which may sit physically farther from the data-capturing sensors than edge devices do.
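To make the placement difference concrete, here is a minimal Python sketch. The class names, sensor values and thresholds are hypothetical, not from the article: an edge gateway acts on readings right beside the sensors, while a fog node one hop away on the LAN aggregates data from several gateways before anything reaches the cloud.

```python
from dataclasses import dataclass
from statistics import mean
from typing import List, Optional


@dataclass
class SensorReading:
    sensor_id: str
    temperature_c: float


class EdgeGateway:
    """Edge computing: decisions are made on (or next to) the device that senses."""

    def __init__(self, threshold_c: float = 80.0):
        self.threshold_c = threshold_c

    def process(self, reading: SensorReading) -> Optional[str]:
        # No LAN or WAN round trip: the gateway itself decides whether to alert.
        if reading.temperature_c > self.threshold_c:
            return f"ALERT {reading.sensor_id}: {reading.temperature_c} C"
        return None


class FogNode:
    """Fog computing: processing moves to LAN hardware, one hop from the sensors."""

    def __init__(self) -> None:
        self.buffer: List[SensorReading] = []

    def ingest(self, reading: SensorReading) -> None:
        self.buffer.append(reading)

    def lan_average(self) -> float:
        # Slightly farther from the sensors than the edge gateway, but still
        # inside the local network, so it can correlate data across gateways.
        return mean(r.temperature_c for r in self.buffer)


if __name__ == "__main__":
    gateway = EdgeGateway()
    fog = FogNode()
    for reading in (SensorReading("s1", 72.4), SensorReading("s2", 85.1)):
        alert = gateway.process(reading)  # handled at the edge
        if alert:
            print(alert)
        fog.ingest(reading)               # forwarded one hop to the fog layer
    print(f"LAN-wide average: {fog.lan_average():.1f} C")
```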

The Collision of Cloud and the Edge

As 5G deployments continue toward the goal of ubiquitous coverage, two seemingly conflicting trends are developing: C-RAN (centralized radio access network) and edge computing. To understand why these apparently opposed trends are happening, we need to look at the drivers of each; when we do, we'll see there isn't necessarily an inevitable "collision." But service providers need to plan for both with a network that is expandable, flexible and accessible.

From Cloud To Cloudlets

Cloudlets, or mini-clouds, are starting to roll out closer to the sources of data in an effort to reduce latency and improve overall processing performance. But as this approach gains steam, it is also creating new challenges involving data distribution, storage and security.

New undersea web cable to connect major Asian information hubs

The Asia Direct Cable (ADC) Consortium has chosen NEC to build its new high-performance submarine cable. The finished cable will feature multiple pairs of high-capacity optical fibers designed to carry more than 140 Tbps of traffic across East and Southeast Asia. That capacity will let the new submarine cable support bandwidth-intensive applications driven by advances in 5G, the cloud, IoT and AI.

Cloud Versus Edge — Is There a Winner? Complementary or Competitive?

At the risk of giving away the conclusion too early, there's a clear place — not to mention a need — for both application and infrastructure deployments in the cloud and on the edge. Centralizing data and processing it in the cloud can be efficient and effective, but where latency can't be tolerated, some amount of processing needs to happen at the edge. In fact, it's often easier and more efficient to bring the processing to the data than to bring the data to the processing engine.
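As a rough illustration of bringing the processing to the data, the short sketch below compares streaming every raw reading to the cloud with aggregating at the edge and sending only a per-second summary. All payload sizes and rates are assumptions for the example, not figures from the article.

```python
# Illustrative only: payload sizes and rates below are assumptions.
RAW_READING_BYTES = 200       # assumed size of one JSON-encoded sensor reading
READINGS_PER_SECOND = 1_000   # assumed output of a local sensor fleet
SUMMARY_BYTES = 500           # assumed size of one per-second edge aggregate

# Cloud-centric: every raw reading crosses the WAN before any processing happens.
cloud_bytes_per_s = RAW_READING_BYTES * READINGS_PER_SECOND

# Edge-centric: readings are reduced locally; only the summary crosses the WAN.
edge_bytes_per_s = SUMMARY_BYTES

print(f"Raw stream to cloud:      {cloud_bytes_per_s / 1_000:.0f} kB/s")
print(f"Edge-aggregated to cloud: {edge_bytes_per_s / 1_000:.1f} kB/s")
print(f"WAN traffic reduction:    {cloud_bytes_per_s / edge_bytes_per_s:.0f}x")
```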

The Competition Surrounding 800 Gigabit Ethernet

High demand for faster fiber-optic (FO) data transmission in hyperscale data centers has triggered a whole range of developments. Manufacturer consortia, known as MSAs (Multi-Source Agreements), are working intensively on new specifications focused on the roadmap from 400 to 800 Gigabit Ethernet (800G). The cloud industry is waiting for new, faster optical connectivity: cloud companies are expected to need usable 800G modules by 2023-2024 to increase transmission performance in their data centers.