Once upon a time, computing happened on PCs and servers. Bigger organizations like banks set up server farms in data centers for their computing needs. Along came big companies like Amazon, AT&T, and Microsoft, which built their own server farms and started renting out the computing power. And one day Big Data showed up. Small companies that could not afford to maintain their own data centers, but had a great idea, started out in the Cloud. That was the dawn of the Cloud as we know it. And the rest is history. So computing moved from the devices at the Edge of the Internet, the original "cloud", into the "Cloud".
Fast forward a few years. Smartphones and autonomous devices ushered in a new age. Mobile phones became far more powerful than high-end PCs from 15 years earlier, packing a good amount of processing power. Autonomous systems need an answer right then and there, based on live events. There is no margin for error or delay. If a car's autopilot detects an oncoming car in its path, it has to decide the next course of action right away. There is no time to send a message to the Cloud and wait. Or, god forbid, there is no signal. So the computing has to be done in situ. That is Edge computing.
Now this does not mean that the Cloud is on its way out. Both have their own niches. An autonomous device has to react to real-life events in the wild. Think of nature: critters react to their environment. They do not need a central hub to guide them or help them navigate. They look around, find their way to food, and avoid colliding with other objects or insects. But that does not mean the queen or alpha counterpart is useless. She guides the colony in case of disaster, or toward new abodes and prospects. That is the Cloud, which sees everything and decides the next best course of action.
So for a quick decision, Edge computing is a must. But on a grand scale you need to bring the data to central computing, the Cloud, to analyze it, find patterns, and decide the best course of action.
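Here is a minimal sketch of that split, assuming a hypothetical autopilot scenario: the obstacle decision is made locally and instantly, while telemetry is queued and drained by a stubbed "cloud uploader" for later, big-picture analysis. The names handle_obstacle and upload_to_cloud are illustrative only, not a real API.

```python
import queue
import threading
import time

# Telemetry destined for the Cloud is buffered locally so the Edge decision
# never waits on the network.
telemetry = queue.Queue()

def handle_obstacle(distance_m: float) -> str:
    """Decide in situ on the Edge -- no round trip to the Cloud."""
    action = "brake" if distance_m < 30 else "continue"
    telemetry.put({"distance_m": distance_m, "action": action, "ts": time.time()})
    return action

def upload_to_cloud() -> None:
    """Drain the queue in the background; the Cloud does the pattern-finding."""
    while True:
        event = telemetry.get()
        if event is None:  # sentinel: stop the worker
            break
        # In a real system this would be an HTTP/MQTT call to a cloud backend.
        print("uploading", event)

worker = threading.Thread(target=upload_to_cloud, daemon=True)
worker.start()

print(handle_obstacle(12.0))   # "brake", decided locally with no delay
print(handle_obstacle(80.0))   # "continue"

telemetry.put(None)            # shut down the background uploader
worker.join()
```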
And then they both lived happily ever after!