
What's the difference between edge computing and decentralized computing?


Edge computing is a decentralized, distributed computing infrastructure that has evolved with the growth of the Internet of Things (IoT).

Because of their similar names, and because these computing models are not widely understood, some people assume that decentralized computing and edge computing are the same thing.

In fact, the two are distinct but complementary. Combined as decentralized edge computing, they can accomplish tasks that neither can achieve on its own.

What's Edge Computing?

Edge computing is the deployment of computing and storage resources at the location where data is produced. According to Gartner, edge computing is part of a distributed computing topology in which information processing is located close to the edge—where things and people produce or consume that information. Edge computing is transforming the way data is handled, processed, and delivered from millions of devices around the world.
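To make the idea concrete, here is a minimal sketch (with made-up sensor readings and function names, not any particular platform's API) of what "processing close to where data is produced" can mean: an edge node summarizes a batch of raw readings locally and only the small summary would travel to the cloud.

```python
# Minimal sketch of edge-style processing: instead of streaming every raw
# reading to a central server, an edge node reduces the data where it is
# produced and forwards only a compact summary. All names and values here
# are illustrative.

def summarize_readings(readings):
    """Reduce a batch of raw sensor readings to a compact summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": sum(readings) / len(readings),
    }

# Raw data produced at the edge (e.g., a temperature sensor).
raw = [21.4, 21.6, 22.0, 21.9, 35.2, 21.7]

summary = summarize_readings(raw)
print(summary)  # only this small payload would be sent upstream
```

The benefit is bandwidth and latency: six readings shrink to one four-field summary, and any local decision (say, an over-temperature alert) can be made without a round trip to a data center.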

What's Decentralized Computing?

Decentralized computing is the allocation of resources, both hardware and software, to each individual workstation or office location.



