The sixth installment in Bob Hult’s Technology Trends series examines the rise of data centers and the shift to cloud and edge computing.
Edge computing reduces the latency between a device and a service, where latency is the round-trip time between two systems over a network. While reduced latency is one of its main benefits, edge computing offers other advantages that will become apparent as 5G use cases are discussed.
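The round-trip definition of latency above can be made concrete with a short sketch. The snippet below stands up a throwaway TCP echo server on localhost and times one full request/response cycle; the hostnames, port, and helper names are illustrative, not from the source.

```python
import socket
import threading
import time

def run_echo_server(port):
    # Minimal TCP echo server: accepts one connection and echoes one message.
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    conn.sendall(conn.recv(64))
    conn.close()
    srv.close()

def roundtrip_ms(host, port):
    # Latency = wall-clock time for one complete request/response round trip.
    start = time.perf_counter()
    with socket.create_connection((host, port)) as s:
        s.sendall(b"ping")
        s.recv(64)
    return (time.perf_counter() - start) * 1000

PORT = 50007  # arbitrary port for the demo
server = threading.Thread(target=run_echo_server, args=(PORT,))
server.start()
time.sleep(0.2)  # give the server a moment to start listening

rtt = roundtrip_ms("127.0.0.1", PORT)
print(f"round-trip latency: {rtt:.3f} ms")
server.join()
```

On localhost the round trip is a fraction of a millisecond; against a distant cloud region the same measurement typically runs tens of milliseconds, which is the gap edge computing aims to close.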
This webinar gives data center managers some key cabling strategies to manage multiple waves of 5G — helping ensure next-generation coverage for billions of devices. Topics include the three phases of 5G deployment, data processing at the edge to reduce latency, and centralized data centers to support the workloads generated by 5G.
Cloudlets, or mini-clouds, are starting to roll out closer to the sources of data in an effort to reduce latency and improve overall processing performance. But as this approach gains steam, it also is creating some new challenges involving data distribution, storage and security.
For IoT deployments, going to the edge may be the best choice when it comes to helping businesses deploy IoT technology across their network infrastructures. Panduit’s white paper, “Edge Computing: Behind the Scenes of IoT,” explains the difference between the cloud and edge computing and three ways the edge can help IoT technology deployments. It also discusses the following key areas for consideration when deploying edge computing: real-time requirements, environmental conditions, space limitations, and security.
The findings discussed in this report reveal what operators around the world are thinking, doing, and planning in the areas of efficiency, resiliency, workload placement, staffing, and new technology adoption.
Edge computing environments, including edge data centers, reset goals and customer expectations for the housing, protection and management of equipment, including cabling and network gear. In these environments, one-size-fits-all isn’t even a consideration. The concepts, approaches, and best practices described in this document position network administrators for success in establishing the technologies and techniques needed for managing computing at the edge.
The move to the edge cloud is resulting in a huge proliferation of local data centers. By moving processing power and services closer to the edge of the network, a wealth of new cloud-based applications that depend on low latency and highly reliable connections can emerge. Like their centralized counterparts, edge data centers need the high capacity of long-haul transport links, but the networks being built around them are fundamentally different. Instead of connecting a few distant central data centers, cloud providers are connecting dozens of distributed data centers within a single city in order to meet the fast response times and low latencies required of new edge computing services.
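Distance alone explains much of this architecture. A back-of-the-envelope sketch, using the commonly cited figure of roughly 200 km per millisecond for light in optical fiber, shows why a metro-distance site can meet latency budgets that a distant central cloud cannot; the distances below are illustrative examples, and real latency adds switching and queuing delays on top of propagation.

```python
FIBER_KM_PER_MS = 200.0  # light in fiber travels ~200 km per millisecond

def propagation_rtt_ms(distance_km):
    # Round-trip propagation delay over fiber, ignoring switching/queuing.
    return 2 * distance_km / FIBER_KM_PER_MS

for label, km in [("metro edge site, 20 km", 20),
                  ("regional data center, 400 km", 400),
                  ("distant central cloud, 2000 km", 2000)]:
    print(f"{label}: {propagation_rtt_ms(km):.2f} ms round trip")
```

Even before any processing, a 2,000 km path consumes about 20 ms per round trip, while a 20 km metro path consumes about 0.2 ms, which is why latency-sensitive services push providers toward many in-city sites.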
UC Berkeley and NTT announced a connected campus pilot project that will leverage technology to “smartly” transform the UC Berkeley Parking and Transportation Department by analyzing use patterns, easing traffic congestion, and increasing pedestrian safety in the Bancroft Way area of campus. The pilot will incorporate NTT’s Accelerate Smart data platform and Dell Technologies’ modular data center infrastructure for edge deployments of high-definition optical sensors and IoT devices that monitor traffic-related issues.
Just a few years ago, many expected the entire Internet of Things (IoT) to move to the cloud, and much of the consumer-connected IoT indeed lives there. But one of the key principles of designing and building enterprise-scale IoT solutions is making balanced use of edge and cloud computing. Most IoT solutions now require a mix of the two, which can reduce latency, increase scalability, and improve access to information so that better, faster decisions can be made and enterprises can become more agile as a result.