Category: Cloud

From Cloud To Cloudlets

Cloudlets, or mini-clouds, are starting to roll out closer to the sources of data in an effort to reduce latency and improve overall processing performance. But as this approach gains steam, it is also creating new challenges around data distribution, storage, and security.

New undersea web cable to connect major Asian information hubs

The Asia Direct Cable Consortium has chosen NEC to build its new high-performance submarine cable. The finished cable will feature multiple pairs of high-capacity optical fibers designed to carry more than 140 Tbps of traffic across East and Southeast Asia. With that capacity, the new submarine cable will be able to support bandwidth-intensive applications driven by advances in 5G, the cloud, IoT, and AI.
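
As a rough sense of scale, here is a back-of-the-envelope split of that headline figure; the eight-fiber-pair count below is an assumption for illustration, since the announcement quoted here only says "multiple pairs":

    # Back-of-the-envelope split of the ADC headline capacity.
    # The pair count is an ASSUMPTION; the text only says "multiple pairs".
    TOTAL_CAPACITY_TBPS = 140
    ASSUMED_FIBER_PAIRS = 8

    per_pair_tbps = TOTAL_CAPACITY_TBPS / ASSUMED_FIBER_PAIRS
    print(f"~{per_pair_tbps:.1f} Tbps per fiber pair")  # ~17.5 Tbps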

Cloud Versus Edge — Is There a Winner? Complementary or Competitive?

At the risk of giving away the conclusion too early, there's a clear place, and indeed a need, for both application and infrastructure deployments in the cloud and on the edge. Centralizing data and processing it in the cloud can be efficient and effective, but where latency can't be tolerated, some amount of processing needs to happen at the edge. In fact, it's often easier and more efficient to bring the processing to the data than to bring the data to the processing engine.
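
As a minimal illustration of "bringing the processing to the data", consider an edge node that reduces a raw sensor stream to a small summary locally, so only a few numbers (rather than every sample) cross the network. The function and data below are illustrative sketches, not any particular product's API:

    # A minimal sketch of edge-side reduction: summarize locally,
    # ship only the summary to the cloud. Names are illustrative.
    from statistics import mean

    def edge_summarize(readings: list[float]) -> dict:
        """Runs on the edge device; only the summary travels."""
        return {
            "count": len(readings),
            "mean": mean(readings),
            "max": max(readings),
        }

    raw_stream = [21.4, 21.7, 22.1, 35.9, 21.6]  # e.g. one second of samples
    summary = edge_summarize(raw_stream)
    # Instead of shipping every sample to the cloud, send three numbers:
    print(summary)  # {'count': 5, 'mean': 24.54, 'max': 35.9}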

The Competition Surrounding 800 Gigabit Ethernet

High demand for faster fiber-optic data transmission in hyperscale data centers has triggered a whole range of developments. Manufacturer consortia, known as MSAs (Multi-Source Agreements), are working intensively on new specifications focused on the roadmap from 400 to 800 Gigabit Ethernet (800G). The cloud industry is waiting for new, faster optical connectivity: cloud companies are expected to need usable 800G modules by 2023-2024 to increase transmission performance in their data centers.
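
For orientation, the jump from 400G to 800G is largely a matter of lane composition. The lane counts and rates below reflect common options in the MSA work of the period and should be read as a sketch, not a definitive list of specified variants:

    # Typical lane compositions per Ethernet generation (a sketch;
    # the MSAs define more variants than shown here).
    generations = {
        "400G (8 lanes)": (8, 50),    # 8 x 50G PAM4
        "400G (4 lanes)": (4, 100),   # 4 x 100G PAM4
        "800G":           (8, 100),   # direction of the 800G MSA work
    }
    for name, (lanes, gbps) in generations.items():
        print(f"{name}: {lanes} x {gbps}G = {lanes * gbps}G")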

Coronavirus: Will offices be safe for a return to work?

As lockdowns start to ease in many countries, the tentative return to work begins, leaving people understandably concerned about how safe an office will be in the middle of a global pandemic. There is likely to be an increase in the amount of technology used to monitor employees: from thermal cameras taking your temperature when you enter the building to apps or wearables that alert you if you get too close to colleagues, work could soon have the feel of the movie Minority Report.
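
Proximity-alert wearables of this kind commonly estimate distance from Bluetooth signal strength. The sketch below uses a standard log-distance path-loss model; the calibration constants are illustrative assumptions, not any vendor's values:

    # Hedged sketch of RSSI-based proximity alerting.
    # Constants are ASSUMPTIONS for illustration only.
    MEASURED_POWER_DBM = -59   # assumed RSSI at 1 m for this radio
    PATH_LOSS_EXPONENT = 2.0   # ~2 in free space, higher indoors

    def estimate_distance_m(rssi_dbm: float) -> float:
        """Log-distance path-loss model: d = 10^((P1m - RSSI) / (10 * n))."""
        return 10 ** ((MEASURED_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

    def too_close(rssi_dbm: float, threshold_m: float = 2.0) -> bool:
        return estimate_distance_m(rssi_dbm) < threshold_m

    print(too_close(-55))  # strong signal -> likely under 2 m -> True
    print(too_close(-75))  # weak signal  -> likely farther away -> False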

Challenges for Hyperscale Networks

Arguably the biggest challenge for hyperscalers is continuity and, by association, reliability. A survey from the Uptime Institute found that over 10% of respondents said their most recent reportable outage cost them more than $1m in direct and indirect costs. On March 13, 2019, Facebook suffered its worst-ever outage, affecting an estimated 2.7 billion users across its core social network, Instagram, and its messaging applications Facebook Messenger and WhatsApp. Extrapolating from the company's 2018 revenue figures, CCN estimated that the blackout could have cost Facebook up to $90 million in lost revenue, based on an income of $106,700 per minute. With so many businesses relying on hyperscale data centers as the IT backbone of their operations, any downtime can have a substantial impact and sometimes catastrophic ramifications.
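
CCN's figures are easy to sanity-check: Facebook's reported 2018 revenue (roughly $55.8 billion) works out to about the quoted per-minute income, and the $90 million ceiling corresponds to roughly 14 hours of downtime:

    # Sanity check of CCN's estimate from reported figures.
    annual_revenue = 55.8e9                       # Facebook FY2018 revenue, USD
    minutes_per_year = 365 * 24 * 60              # 525,600
    per_minute = annual_revenue / minutes_per_year
    print(f"${per_minute:,.0f} per minute")       # ~$106,200, close to CCN's $106,700

    implied_hours = 90e6 / 106_700 / 60
    print(f"{implied_hours:.1f} hours")           # ~14.1 hours of downtime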

IoT offers a way to track COVID-19 via connected thermometers

A company called Kinsa is leveraging IoT tech to create a network of connected thermometers, collecting a huge amount of anonymized health data that could offer insight into the current pandemic and future ones. The ability to track fever levels across the U.S. in close to real time could be a crucial piece of information both for the public at large and for decision-makers in healthcare and government.
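
The aggregation behind such a network can be simple in outline: group anonymized readings by region and flag regions whose share of febrile readings exceeds an expected baseline. The sketch below is illustrative; the data, threshold, and baseline are assumptions, not Kinsa's actual method:

    # Hedged sketch of regional fever-rate aggregation.
    # Data, threshold, and baseline are ASSUMPTIONS for illustration.
    from collections import defaultdict

    FEVER_C = 38.0          # common fever threshold, degrees Celsius
    BASELINE_RATE = 0.05    # assumed "normal" share of febrile readings

    readings = [            # (region, temperature), anonymized
        ("NY", 37.1), ("NY", 38.6), ("NY", 38.9),
        ("TX", 36.8), ("TX", 37.0), ("TX", 38.2),
    ]

    by_region = defaultdict(list)
    for region, temp in readings:
        by_region[region].append(temp)

    for region, temps in by_region.items():
        fever_rate = sum(t >= FEVER_C for t in temps) / len(temps)
        if fever_rate > BASELINE_RATE:
            print(f"{region}: atypical fever rate {fever_rate:.0%}")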