
Edge Computing: What Is It And Why Does It Matter?

Posted 2 months ago

With the emergence of artificial intelligence (AI), smart devices and the Internet of Things (IoT), businesses increasingly need to instantly process large amounts of data, especially for real-time applications. To reduce the latency and availability problems associated with the cloud, more emphasis is being placed on edge computing, where computation is performed closer to the point where data is collected.
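One way to see the benefit is to compare what gets sent upstream. The sketch below is a hypothetical illustration (the sensor values, payload shapes and function names are invented for this example, not taken from any particular platform): an edge node aggregates raw sensor readings locally and transmits only a compact summary, rather than shipping every reading to the cloud.

```python
import json
import random

def collect_readings(n=1000, seed=42):
    """Simulate raw sensor readings (hypothetical temperature values)."""
    rng = random.Random(seed)
    return [round(rng.uniform(15.0, 30.0), 2) for _ in range(n)]

def cloud_payload(readings):
    """Cloud-first approach: ship every raw reading upstream."""
    return json.dumps({"readings": readings})

def edge_payload(readings):
    """Edge approach: aggregate locally, send only the summary."""
    summary = {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(sum(readings) / len(readings), 2),
    }
    return json.dumps(summary)

readings = collect_readings()
raw = cloud_payload(readings)
summary = edge_payload(readings)
# The locally computed summary is far smaller than the raw payload,
# which is the bandwidth/latency trade edge computing exploits.
print(len(raw), len(summary))
```

The same idea scales to real deployments: filtering, aggregating or inferring at the edge means only decisions and summaries traverse the network, which cuts both round-trip latency and dependence on cloud availability.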

What is edge computing?

Edge computing represents another step in the evolution of the internet. It is developing in conjunction with advances in intelligent applications that require continuous monitoring and rapid response to dynamic conditions or high-volume data streams.

The internet is truly global, connecting devices of all kinds numbering in the tens of billions (and growing). At the core of the internet are multiple tiers of internet service providers (ISPs), forming an interconnected backbone that manages the immense data traffic between end users and devices.

The "last mile" of the internet is the infrastructure between the most local tier of ISPs and their users, including business, municipal and home networks. The devices connected to this last mile, which can include servers, workstations, mobile devices and IoT devices, represent the "edge" of the internet.


