By Samuel Msiska
We are living at a time when everything seems to change in the blink of an eye. A queen bumblebee that went into hibernation last year might be surprised by how much has changed in just nine months. We may brag that we are the most technologically advanced generation ever to have lived, but that does not change the fact that most of the innovations we enjoy today are makeovers of what others did in the past. For instance, Linux, one of the most popular operating systems, grew out of UNIX, whose origins trace back to the 1960s. There is often only a thin line between a parent invention and its child, since the two share common traits. It is no surprise, then, that people still confuse Linux with UNIX, and the same goes for cloud computing and edge computing. Let us look at the cloud computing and edge computing models, when each should be deployed, and whether one will replace the other, as some expect.
Cloud Computing
This is a form of distributed computing where files are stored in a remote data center (managed by a cloud service provider) and can be accessed anytime from any device. Cloud computing goes beyond just keeping files online, though: it also hosts other computing resources, such as applications and development tools. This model eliminates the cost of purchasing, installing, and maintaining physical data centers and servers. Instead, the end user accesses computing services over the internet on a pay-as-you-go plan from a cloud service provider.
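To make the "files in a remote data center" idea concrete, here is a minimal sketch in Python using boto3, the client library for Amazon S3 object storage. The bucket name, object key, and local file names are hypothetical placeholders, and running it for real requires an account and credentials with the provider.

```python
# A minimal sketch of cloud object storage: upload a file to a remote
# data center, then fetch it back later from any internet-connected device.
# "example-bucket" and the file names are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

# Upload a local file to the provider's remote storage (pay-as-you-go).
s3.upload_file("report.pdf", "example-bucket", "backups/report.pdf")

# Later, from another machine with the right credentials, retrieve it.
s3.download_file("example-bucket", "backups/report.pdf", "report.pdf")
```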
Pros
Cons
Edge Computing
This is a form of computing where some data lives in central data centers while the rest is stored and processed on local network servers, at the edge. Instead of transferring every piece of data from the endpoint to data centers that are usually far away, processing happens at the edge, near the endpoint, which reduces response time. From the edge, devices such as phones and industrial robots can access the rest of the network. What distinguishes edge computing from cloud computing is that in cloud computing all the data sits in remote data centers (on the cloud), whereas with edge computing data is kept both on the cloud and on local storage. Edge computing is mainly applied in situations that require real-time data processing, such as self-driving cars, healthcare software, and IoT devices. It is also useful where bandwidth is too limited to transfer huge volumes of data to the cloud, or where files must remain accessible offline because of unreliable internet connectivity.
Pros
Cons
Conclusion
Is edge computing a competitor of cloud computing? I don't believe it is. As discussed in this article, edge computing has its own applications, so it is not a drop-in substitute for cloud computing. In the end it may come down to preference, but it is essential to understand the nature of your business and to weigh the pros and cons of each technology before making a decision. Edge computing is ideal where data must be processed in a split second, while cloud computing suits applications that are not time-sensitive. The two serve different functions, so saying edge computing will replace cloud computing is like saying eggs will replace meat.