Edge computing
Edge computing is a distributed computing paradigm that brings computation and data storage closer to the location where it is needed, to improve response times and save bandwidth.[1]
The origins of edge computing lie in content delivery networks that were created in the late 1990s to serve web and video content from edge servers that were deployed close to users.[2] In the early 2000s, these networks evolved to host applications and application components at the edge servers,[3] resulting in the first commercial edge computing services[4] that hosted applications such as dealer locators, shopping carts, real-time data aggregators, and ad insertion engines.[3]
Modern edge computing significantly extends this approach through virtualization technology that makes it easier to deploy and run a wider range of applications on the edge servers.
Definition
One definition of edge computing is any computer program that delivers low latency by running nearer to the requests. Karim Arabi, in an IEEE DAC 2014 keynote[5] and subsequently in an invited talk at MIT's MTL Seminar in 2015,[6] defined edge computing broadly as all computing outside the cloud happening at the edge of the network, and more specifically in applications where real-time processing of data is required. In his definition, cloud computing operates on big data while edge computing operates on "instant data", that is, real-time data generated by sensors or users.
According to The State of the Edge report, edge computing concentrates on servers "in close proximity to the last mile network." Alex Reznik, Chair of the ETSI MEC ISG standards committee, loosely defines the term: "anything that's not a traditional data center could be the 'edge' to somebody."[7]
Edge nodes used for game streaming are known as gamelets,[8] and are usually one or two hops away from the client.[9] Anand and Edwin state that "the edge node is mostly one or two hops away from the mobile client to meet the response time constraints for real-time games" in the cloud gaming context.[9]
Concept
The increase in the number of IoT devices at the edge of the network is producing a massive amount of data to be computed at data centers, pushing network bandwidth requirements to the limit.[10] Despite improvements in network technology, data centers cannot guarantee acceptable transfer rates and response times, which can be a critical requirement for many applications.[11] Furthermore, devices at the edge constantly consume data coming from the cloud, forcing companies to build content delivery networks to decentralize data and service provisioning, leveraging physical proximity to the end user.
In a similar way, the aim of edge computing is to move the computation away from data centers towards the edge of the network, exploiting smart objects, mobile phones, or network gateways to perform tasks and provide services on behalf of the cloud.[12] By moving services to the edge, it is possible to provide content caching, service delivery, storage and IoT management, resulting in better response times and transfer rates. At the same time, distributing the logic across different network nodes introduces new issues and challenges.
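The proximity-based routing implied by this concept can be illustrated with a minimal sketch; the node names and latency figures below are hypothetical and not taken from the cited sources.

```python
# Minimal illustration of proximity-based service placement: a client
# directs its request to whichever node currently reports the lowest
# round-trip latency. All node names and latencies are hypothetical.

def pick_node(nodes):
    """Return the (name, latency_ms) pair with the lowest measured latency."""
    return min(nodes.items(), key=lambda item: item[1])

measured_latency_ms = {
    "edge-gateway-local": 8,     # on-premises gateway, one hop away
    "edge-pop-city": 25,         # metropolitan point of presence
    "cloud-region-central": 90,  # distant central data center
}

name, latency = pick_node(measured_latency_ms)
print(f"Serving request from {name} ({latency} ms round trip)")
```

In practice such a choice would also weigh node load, capabilities and data locality, not measured latency alone.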
Privacy and security
The distributed nature of this paradigm introduces a shift in the security schemes used in cloud computing. In edge computing, data may travel between different distributed nodes connected through the Internet and thus requires special encryption mechanisms independent of the cloud. Edge nodes may also be resource-constrained devices, limiting the choice of security methods. Moreover, a shift from a centralized top-down infrastructure to a decentralized trust model is required.[13] On the other hand, by keeping data at the edge it is possible to shift ownership of collected data from service providers to end users.
Scalability
Scalability in a distributed network faces several issues. First, it must account for the heterogeneity of the devices, which have differing performance and energy constraints, as well as for the highly dynamic conditions and less reliable connections compared to the more robust infrastructure of cloud data centers. Moreover, security requirements may introduce further latency in the communication between nodes, which may slow down the scaling process.[11]
Reliability
Management of failovers is crucial in order to keep a service alive. If a single node goes down and is unreachable, users should still be able to access the service without interruption. Moreover, edge computing systems must provide mechanisms to recover from a failure and to alert the user about the incident. To this end, each device must maintain the network topology of the entire distributed system, so that errors can be detected and recovered from easily. Other factors that may influence this aspect are the connection technology in use, which may provide different levels of reliability, and the accuracy of the data produced at the edge, which could be unreliable due to particular environmental conditions.[11]
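A minimal failover sketch illustrating this idea, assuming a hypothetical list of edge-node URLs and helper functions that are not part of any cited system:

```python
# Illustrative failover loop: try each known edge node in order and
# report the incident when one is unreachable. The node URLs and the
# report_incident() helper are hypothetical placeholders.
import urllib.request

EDGE_NODES = ["https://edge-a.example.net", "https://edge-b.example.net"]

def report_incident(node, error):
    # Stand-in for alerting the user or operator about the failure.
    print(f"node {node} unreachable: {error}")

def fetch(path):
    last_error = None
    for node in EDGE_NODES:
        try:
            with urllib.request.urlopen(node + path, timeout=2) as response:
                return response.read()
        except OSError as error:  # urllib.error.URLError is a subclass of OSError
            report_incident(node, error)
            last_error = error
    raise RuntimeError("no edge node reachable") from last_error

# Example usage (the hosts above are placeholders, so this call would
# report both nodes as unreachable and then raise):
# data = fetch("/status")
```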
Speed
Edge computing brings analytical computational resources close to the end users and therefore reduces communication latency. A well-designed edge platform would significantly outperform a traditional cloud-based system. Some applications rely on short response times, making edge computing a significantly more feasible option than cloud computing. Examples include applications involving human perception, such as facial recognition, which typically takes a human between 370 and 620 ms to perform.[14] Edge computing is more likely to be able to mimic the same perception speed as humans, which is useful in applications such as augmented reality where the headset should preferably recognize who a person is at the same time as the wearer does.
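As a rough illustration of this latency budget, the comparison below uses the 370–620 ms human recognition window cited above; the network and inference figures are purely illustrative assumptions.

```python
# Back-of-the-envelope latency budget for a recognition task.
# Only the 370-620 ms human perception window comes from the text above;
# the round-trip and inference times are illustrative assumptions.
HUMAN_RECOGNITION_MS = (370, 620)

def total_latency(round_trip_ms, inference_ms):
    return round_trip_ms + inference_ms

edge_total = total_latency(round_trip_ms=20, inference_ms=300)    # nearby edge node
cloud_total = total_latency(round_trip_ms=150, inference_ms=300)  # distant data center

for label, total in (("edge", edge_total), ("cloud", cloud_total)):
    within = total <= HUMAN_RECOGNITION_MS[0]
    print(f"{label}: {total} ms total, within the human lower bound: {within}")
```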
Efficiency
Due to the proximity of the analytical resources to the end users, sophisticated analytical tools and artificial intelligence tools can run at the edge of the system. This placement at the edge helps to increase operational efficiency and brings further advantages to the system as a whole.
Additionally, the use of edge computing as an intermediate stage between client devices and the wider internet results in efficiency savings that can be demonstrated in the following example: a client device requires computationally intensive processing of video files to be performed on external servers. By using servers located on a local edge network to perform those computations, the video files only need to be transmitted within the local network. Avoiding transmission over the internet results in significant bandwidth savings and therefore increases efficiency.[14]
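The saving in this example can be made concrete with a small, illustrative calculation; the file and result sizes below are assumed figures, not measurements from the cited source.

```python
# Illustrative bandwidth accounting for the video-processing example.
# All sizes are assumptions chosen only to show the shape of the saving.
video_size_gb = 4.0      # raw video uploaded by the client for processing
result_size_gb = 0.01    # compact processing result returned to the client

# Without an edge node: the full video crosses the wide-area internet.
wan_traffic_without_edge_gb = video_size_gb + result_size_gb

# With an edge node: the video stays on the local network and only the
# small result (at most) needs to cross the wide-area internet.
wan_traffic_with_edge_gb = result_size_gb

saving = 1 - wan_traffic_with_edge_gb / wan_traffic_without_edge_gb
print(f"Wide-area traffic avoided: {saving:.1%}")
```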
Applications
Edge application services reduce the volumes of data that must be moved, the consequent traffic, and the distance that data must travel. That provides lower latency and reduces transmission costs. Computation offloading for real-time applications, such as facial recognition algorithms, showed considerable improvements in response times, as demonstrated in early research.[15] Further research showed that using resource-rich machines called cloudlets near mobile users, which offer services typically found in the cloud, provided improvements in execution time when some of the tasks are offloaded to the edge node.[16] On the other hand, offloading every task may result in a slowdown due to transfer times between the device and the nodes, so an optimal configuration can be defined depending on the workload.
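Whether offloading a particular task pays off can be framed as a simple timing comparison; the sketch below is a common rule of thumb with hypothetical figures, not a model taken from the cited studies.

```python
# Offload only when the end-to-end remote time (upload + remote compute
# + download) beats local execution. All timings are illustrative.
def should_offload(local_ms, upload_ms, remote_ms, download_ms):
    return upload_ms + remote_ms + download_ms < local_ms

# Heavy task on a constrained device: offloading wins.
print(should_offload(local_ms=900, upload_ms=60, remote_ms=120, download_ms=10))  # True
# Lightweight task where transfer overhead dominates: run locally.
print(should_offload(local_ms=40, upload_ms=60, remote_ms=5, download_ms=10))     # False
```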
Another use of the architecture is cloud gaming, where some aspects of a game could run in the cloud, while the rendered video is transferred to lightweight clients running on devices such as mobile phones, VR glasses, etc. This type of streaming is also known as pixel streaming.[8]
Other notable applications include connected cars, autonomous cars,[17] smart cities,[18] Industry 4.0 (smart industry) and home automation systems.[19]
References
- Hamilton, Eric (27 December 2018). "What is Edge Computing: The Network Edge Explained". cloudwards.net. Retrieved 2019-05-14.
- "Globally Distributed Content Delivery, by J. Dilley, B. Maggs, J. Parikh, H. Prokop, R. Sitaraman and B. Weihl, IEEE Internet Computing, Volume 6, Issue 5, November 2002" (PDF). Archived (PDF) from the original on 2017-08-09. Retrieved 2019-10-25.
- Nygren, E.; Sitaraman, R. K.; Sun, J. (2010). "The Akamai Network: A Platform for High-Performance Internet Applications" (PDF). ACM SIGOPS Operating Systems Review. 44 (3): 2–19. doi:10.1145/1842733.1842736. S2CID 207181702. Archived (PDF) from the original on September 13, 2012. Retrieved November 19, 2012.
See Section 6.2: Distributing Applications to the Edge
- Davis, A.; Parikh, J.; Weihl, W. (2004). "EdgeComputing: Extending Enterprise Applications to the Edge of the Internet". 13th International World Wide Web Conference. doi:10.1145/1013367.1013397. S2CID 578337.
- IEEE DAC 2014 Keynote: Mobile Computing Opportunities, Challenges and Technology Drivers
- MIT MTL Seminar: Trends, Opportunities and Challenges Driving Architecture and Design of Next Generation Mobile Computing and IoT Devices
- "ETSI - ETSI Blog - What is Edge?". etsi.org. Retrieved 2019-02-19.
- "CloudHide: Towards Latency Hiding Techniques for Thin-client Cloud Gaming". ResearchGate. Retrieved 2019-04-12.
- Anand, B.; Edwin, A. J. Hao (January 2014). "Gamelets — Multiplayer mobile games with distributed micro-clouds". 2014 Seventh International Conference on Mobile Computing and Ubiquitous Networking (ICMU): 14–20. doi:10.1109/ICMU.2014.6799051. ISBN 978-1-4799-2231-4. S2CID 10374389.
- Ivkovic, Jovan (2016-07-11). "[Serbian] The Methods and Procedures for Accelerating Operations and Queries in Large Database Systems and Data Warehouse (Big Data Systems)". Hgpu.org.
- Shi, Weisong; Cao, Jie; Zhang, Quan; Li, Youhuizi; Xu, Lanyu (October 2016). "Edge Computing: Vision and Challenges". IEEE Internet of Things Journal. 3 (5): 637–646. doi:10.1109/JIOT.2016.2579198. S2CID 4237186.
- Merenda, Massimo; Porcaro, Carlo; Iero, Demetrio (29 April 2020). "Edge Machine Learning for AI-Enabled IoT Devices: A Review". Sensors. 20 (9): 2533. doi:10.3390/s20092533. PMC 7273223. PMID 32365645.
- Garcia Lopez, Pedro; Montresor, Alberto; Epema, Dick; Datta, Anwitaman; Higashino, Teruo; Iamnitchi, Adriana; Barcellos, Marinho; Felber, Pascal; Riviere, Etienne (30 September 2015). "Edge-centric Computing". ACM SIGCOMM Computer Communication Review. 45 (5): 37–42. doi:10.1145/2831347.2831354.
- Satyanarayanan, Mahadev (January 2017). "The Emergence of Edge Computing". Computer. 50 (1): 30–39. doi:10.1109/MC.2017.9. ISSN 1558-0814.
- Yi, S.; Hao, Z.; Qin, Z.; Li, Q. (November 2015). "Fog Computing: Platform and Applications". 2015 Third IEEE Workshop on Hot Topics in Web Systems and Technologies (HotWeb): 73–78. doi:10.1109/HotWeb.2015.22. ISBN 978-1-4673-9688-2. S2CID 6753944.
- Verbelen, Tim; Simoens, Pieter; De Turck, Filip; Dhoedt, Bart (2012). "Cloudlets: Bringing the Cloud to the Mobile User". Proceedings of the Third ACM Workshop on Mobile Cloud Computing and Services. ACM: 29–36. doi:10.1145/2307849.2307858. hdl:1854/LU-2984272. S2CID 3249347. Retrieved 4 July 2019.
- "It's Time to Think Beyond Cloud Computing". Wired. Retrieved April 10, 2019.
- Taleb, Tarik; Dutta, Sunny; Ksentini, Adlen; Iqbal, Muddesar; Flinck, Hannu (March 2017). "Mobile Edge Computing Potential in Making Cities Smarter". IEEE Communications Magazine. 55 (3): 38–43. doi:10.1109/MCOM.2017.1600249CM. S2CID 11163718. Retrieved 5 July 2019.
- Chakraborty, T.; Datta, S. K. (November 2017). "Home automation using edge computing and Internet of Things". 2017 IEEE International Symposium on Consumer Electronics (ISCE): 47–49. doi:10.1109/ISCE.2017.8355544. ISBN 978-1-5386-2189-9. S2CID 19156163.