
    By incorporating edge devices in IoT systems along with cloud infrastructure, we can achieve…


    Edge computing technologies for Internet of Things: a primer

    Digital Communications and Networks

    Volume 4, Issue 2, April 2018, Pages 77-86

    https://doi.org/10.1016/j.dcan.2017.07.001

    Open access under a Creative Commons license

    Abstract

    With the rapid development of mobile internet and Internet of Things applications, the conventional centralized cloud computing is encountering severe challenges, such as high latency, low Spectral Efficiency (SE), and non-adaptive machine type of communication. Motivated to solve these challenges, a new technology is driving a trend that shifts the function of centralized cloud computing to edge devices of networks. Several edge computing technologies originating from different backgrounds to decrease latency, improve SE, and support the massive machine type of communication have been emerging. This paper comprehensively presents a tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing. In particular, the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared. From the viewpoint of radio access network, the differences between mobile edge computing and fog computing are highlighted, and the characteristics of fog computing-based radio access network are discussed. Finally, open issues and future research directions are identified as well.

    Keywords

    Internet of Things (IoT); Mobile edge computing; Cloudlets; Fog computing

    1. Introduction

    Over the past decades, cloud computing has developed rapidly and been widely applied owing to the high cost-efficiency and flexibility it achieves through consolidation, in which computing, storage, and network management functions operate in a centralized manner. With the rapid growth of mobile internet and Internet of Things (IoT) applications, however, the existing centralized cloud computing architecture is encountering severe challenges. Mobile devices requesting sophisticated applications from distant centralized cloud servers impose additional load on both Radio Access Networks (RANs) and backhaul networks and experience high latency [1]. In addition, with the explosive growth in access devices and end-user demands, IoT is driving a digital transformation in all aspects of modern life [2]. Cisco estimates that the number of devices connected to the IoT will reach 50 billion by 2020 [3]. The emerging IoT introduces new requirements, such as stringent latency, capacity constraints, resource-constrained devices, uninterrupted service under intermittent connectivity, and enhanced security, which cannot be adequately addressed by the centralized cloud computing architecture [4]. An advanced cloud computing paradigm that breaks through the centralized architecture and alleviates the capacity and latency constraints is therefore urgently required.

    IoT refers to the interaction and communication between billions of devices that produce and exchange data related to real-world objects (i.e., things) [5]. IoT's features, including an ultra-large-scale network of things, device- and network-level heterogeneity, and the large number of events generated by these things, make the development of diverse applications and services a very challenging task [6]. These requirements are difficult to meet in the IoT + cloud scenario. IoT sensors generate enormous amounts of data, which are subsequently analyzed to determine reactions to events or to extract analytics and statistics. However, sending all of these data to the cloud would require prohibitively high network bandwidth. Recent research efforts are therefore investigating how to exploit capabilities at the edge of networks to support the IoT and its requirements [7]. In edge computing, the massive data generated by different types of IoT devices can be processed at the network edge instead of being transmitted to the centralized cloud infrastructure, owing to bandwidth and energy consumption concerns. Edge computing can thus provide services with faster response and higher quality than cloud computing. It is therefore well suited to being integrated with the IoT to provide efficient and secure services for a large number of end-users, and edge computing-based architecture can be considered for the future IoT infrastructure [8].

    Recently, nascent technologies and applications have been driving a trend in the computing and communication landscape that shifts the function of centralized cloud computing to the edge devices of networks [9]. Software Defined Networking (SDN) and the associated concept of Network Function Virtualization (NFV) have been proposed as emerging solutions for future networks [10]. In particular, NFV enables edge devices to provide computing services and operate network functions by creating multiple Virtual Machines (VMs). Moreover, ultra-low latency is identified as one of the major requirements of 5th Generation (5G) RANs [11]. To decrease latency, mobile operators tend to deploy applications and content at the edge of networks. Meanwhile, operators can open the edge devices of RANs to third-party partners, allowing them to rapidly deploy innovative applications and content for mobile subscribers, enterprises, and other vertical segments [12]. Although the computing capabilities of wearable watches, smartphones, and other IoT devices have improved significantly, they are still constrained by fundamental limitations such as memory size, battery life, and heat dissipation. Mobile devices therefore need to extend battery lifetime by offloading energy-intensive application computation to the edge of networks [13].

    Source: www.sciencedirect.com

    What Is Edge Computing? Everything You Need to Know

    Learn about edge computing, how it works and the importance of its role in the growth of 5G. Discover why edge computing matters, including benefits and use cases.

    Stephen J. Bigelow, Senior Technology Editor

    Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible.

    Data is the lifeblood of modern business, providing valuable business insight and supporting real-time control over critical business processes and operations. Today's businesses are awash in an ocean of data, and huge amounts of data can be routinely collected from sensors and IoT devices operating in real time from remote locations and inhospitable operating environments almost anywhere in the world.

    But this virtual flood of data is also changing the way businesses handle computing. The traditional computing paradigm built on a centralized data center and everyday internet isn't well suited to moving endlessly growing rivers of real-world data. Bandwidth limitations, latency issues and unpredictable network disruptions can all conspire to impair such efforts. Businesses are responding to these data challenges through the use of edge computing architecture.

    In simplest terms, edge computing moves some portion of storage and compute resources out of the central data center and closer to the source of the data itself. Rather than transmitting raw data to a central data center for processing and analysis, that work is instead performed where the data is actually generated -- whether that's a retail store, a factory floor, a sprawling utility or across a smart city. Only the result of that computing work at the edge, such as real-time business insights, equipment maintenance predictions or other actionable answers, is sent back to the main data center for review and other human interactions.
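
    To make that division of labor concrete, here is a minimal, hypothetical Python sketch of the pattern described above: an edge node summarizes a batch of raw sensor readings locally and forwards only the compact result upstream. The sensor reader, the summary fields, and the CLOUD_ENDPOINT URL are illustrative assumptions, not part of any particular product.

        import json
        import statistics
        import urllib.request

        # Hypothetical upstream endpoint; a real deployment would use its own API.
        CLOUD_ENDPOINT = "https://example.com/api/edge-results"

        def read_sensor_batch(n=60):
            """Placeholder for reading n raw samples from a local sensor."""
            return [20.0 + 0.01 * i for i in range(n)]

        def summarize(samples):
            """Reduce a batch of raw readings to a compact result."""
            return {
                "count": len(samples),
                "mean": statistics.mean(samples),
                "min": min(samples),
                "max": max(samples),
            }

        def push_result(summary):
            """Send only the summary upstream; raw samples never leave the edge node."""
            req = urllib.request.Request(
                CLOUD_ENDPOINT,
                data=json.dumps(summary).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)

        if __name__ == "__main__":
            push_result(summarize(read_sensor_batch()))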

    Thus, edge computing is reshaping IT and business computing. Take a comprehensive look at what edge computing is, how it works, the influence of the cloud, edge use cases, tradeoffs and implementation considerations.

    Edge computing brings data processing closer to the data source.

    How does edge computing work?

    Edge computing is all a matter of location. In traditional enterprise computing, data is produced at a client endpoint, such as a user's computer. That data is moved across a WAN such as the internet, through the corporate LAN, where the data is stored and worked upon by an enterprise application. Results of that work are then conveyed back to the client endpoint. This remains a proven and time-tested approach to client-server computing for most typical business applications.

    But the number of devices connected to the internet, and the volume of data being produced by those devices and used by businesses, is growing far too quickly for traditional data center infrastructures to accommodate. Gartner predicted that by 2025, 75% of enterprise-generated data will be created outside of centralized data centers. The prospect of moving so much data in situations that can often be time- or disruption-sensitive puts incredible strain on the global internet, which itself is often subject to congestion and disruption.

    So IT architects have shifted focus from the central data center to the logical edge of the infrastructure -- taking storage and computing resources from the data center and moving those resources to the point where the data is generated. The principle is straightforward: If you can't get the data closer to the data center, get the data center closer to the data. The concept of edge computing isn't new, and it is rooted in decades-old ideas of remote computing -- such as remote offices and branch offices -- where it was more reliable and efficient to place computing resources at the desired location rather than rely on a single central location.

    Although only 27% of respondents have already implemented edge computing technologies, 54% find the idea interesting.

    Edge computing puts storage and servers where the data is, often requiring little more than a partial rack of gear to operate on the remote LAN to collect and process the data locally. In many cases, the computing gear is deployed in shielded or hardened enclosures to protect the gear from extremes of temperature, moisture and other environmental conditions. Processing often involves normalizing and analyzing the data stream to look for business intelligence, and only the results of the analysis are sent back to the principal data center.
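
    As a rough illustration of that edge-side pipeline, the following hypothetical Python sketch normalizes a small stream of readings and reports only the analysis result back; the value range and the 0.9 alert threshold are made-up assumptions.

        def normalize(values, lo, hi):
            """Scale raw readings into the 0-1 range used by the analysis step."""
            span = (hi - lo) or 1.0
            return [(v - lo) / span for v in values]

        def analyze(normalized, threshold=0.9):
            """Return only the indices of readings that exceed the alert threshold."""
            return [i for i, v in enumerate(normalized) if v > threshold]

        raw = [18.2, 19.0, 74.5, 18.7]                       # raw samples collected on the local LAN
        alerts = analyze(normalize(raw, lo=0.0, hi=80.0))
        print({"alerts": alerts, "samples_seen": len(raw)})  # this small result is all that travels upstream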

    The idea of business intelligence can vary dramatically. Some examples include retail environments where video surveillance of the showroom floor might be combined with actual sales data to determine the most desirable product configuration or consumer demand. Other examples involve predictive analytics that can guide equipment maintenance and repair before actual defects or failures occur. Still other examples are often aligned with utilities, such as water treatment or electricity generation, to ensure that equipment is functioning properly and to maintain the quality of output.

    Source: www.techtarget.com

    11 Reasons Why IoT and Edge Computing Should Go Together

    This article outlines the benefits that arise when IoT and edge computing work together and the key reasons why they should.

    Here are the top eleven reasons why IoT and edge computing should go hand in hand:

    Less Waiting Time

    Because data is processed at the edge, users can detect events or receive information almost immediately. With this improved response time, businesses can act much faster than before to meet customer demands, enabling them to provide a better service or product to the end user.

    Reduced Network Congestion and Latency

    IoT devices can be numerous and unpredictably dispersed, which makes it difficult for them to form an efficient network connection without some help. Remote servers cannot handle all of their requests simultaneously, because doing so would create too much traffic on the internet, so IoT devices end up queuing while they wait for their turn. Fortunately, with edge computing there are fewer instances of server overload, because the processing takes place closer to where the data originates.
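
    The bandwidth saving is easy to see with some back-of-the-envelope arithmetic; the figures in this small Python sketch are illustrative assumptions, not measurements.

        # 500 devices each producing one 2 KB reading per second.
        devices = 500
        reading_kb = 2

        # Without an edge layer, every reading crosses the internet to a remote server.
        raw_upstream_kb_per_min = devices * reading_kb * 60

        # With an edge gateway, readings are aggregated into one 2 KB summary per device per minute.
        edge_upstream_kb_per_min = devices * reading_kb * 1

        print(raw_upstream_kb_per_min, edge_upstream_kb_per_min)  # 60000 vs. 1000
        print(f"{raw_upstream_kb_per_min / edge_upstream_kb_per_min:.0f}x less upstream traffic")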

    Improved Security

    Machines are still easier to hack than human beings, but when applications are moved away from dedicated servers, the risk of someone breaking into your network is reduced. Transmitting data to and from the cloud also leaves room for potential security breaches. Additionally, because IoT devices are so vulnerable to intrusion, edge computing could play a big role in future malware security systems.

    Improved Customer Satisfaction

    Because everything runs more smoothly on an edge computing system, your customers will feel that your product or service is more reliable. This added predictability can lead to increased customer satisfaction and loyalty. If you notice a decline in returning users, it might be time to move some of your processes away from the cloud: with applications running closer to home, end users benefit from faster, more consistent performance, which instills greater trust and confidence in your brand.

    New IoT Functionalities

    Edge computing enables real-time analytics, allowing companies to gain insights into their data that were simply not possible before. As such, businesses can work more precisely with their applications and create new functionalities for existing products. IoT devices offer a wide range of potential applications, but they still suffer from high latency and network congestion. Allowing such technologies to develop further by incorporating edge computing into the system will therefore lead to more efficient devices that can offer better service to end users.

    More Affordable IoT Solutions

    IoT devices are still relatively expensive because they require specific components that can be difficult to obtain. By decentralizing processing, edge computing promises more affordable options on the mass market. Lower costs will also allow greater participation by smaller businesses and independent developers, which means more ideas are explored across a wider range of sectors.

    Improved Battery Life

    Applying edge computing to your IoT system allows you to run more intelligent algorithms that optimize device performance and reduce power consumption. For instance, a device need only use its resources when it detects a change in the data or an event that requires attention from the controller; processes that aren't necessary for its immediate function can be deferred until it is plugged back in.
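
    A minimal sketch of that event-driven idea in Python: the device only wakes its radio when a reading changes by more than a chosen delta. The sample stream and the 0.5 threshold are illustrative assumptions.

        def should_transmit(last_sent, current, delta=0.5):
            """Wake the radio only when the reading has changed meaningfully."""
            return last_sent is None or abs(current - last_sent) >= delta

        last_sent = None
        for reading in [21.0, 21.1, 21.2, 23.4, 23.5]:  # made-up sensor stream
            if should_transmit(last_sent, reading):
                print(f"transmit {reading}")             # radio powered on only for these readings
                last_sent = reading
            # otherwise stay idle and save power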

    Improved Machine Learning & AI Capabilities

    Edge computing enables more complex algorithms to be used safely and securely on an IoT system. This allows companies to train their devices to make predictions based on current and past data, helping them forecast outcomes and anticipate events. Devices can thus become smarter and more self-sufficient over time.
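
    As a stand-in for such an on-device model, here is a hedged Python sketch that forecasts the next reading from past data with a simple exponential moving average; the history values and the smoothing factor alpha are assumptions chosen for illustration.

        def ewma_forecast(history, alpha=0.3):
            """One-step-ahead forecast from past readings via an exponential moving average."""
            estimate = history[0]
            for value in history[1:]:
                estimate = alpha * value + (1 - alpha) * estimate
            return estimate

        past_readings = [10.0, 10.4, 10.9, 11.5, 12.2]   # historical data held on the device
        print(f"next-value forecast: {ewma_forecast(past_readings):.2f}")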

    Unsupervised Learning-enabled

    It may not be the case for every business, but unsupervised learning could allow certain businesses to train their devices without human input. This could help them create more autonomous products that are less reliant on human intervention, enabling your business to focus its energy elsewhere rather than repeatedly retraining models or updating data structures.

    Low Latency

    By keeping applications closer to the source of input data, overall latency across an IoT system can be reduced. This makes it easier for devices to communicate with each other and to accomplish tasks that were previously impossible because of the time constraints of cloud processing. For instance, an autonomous drone can be instructed to drop supplies at a disaster scene without needing to return to base first.

    Better Network Management

    IoT devices are becoming increasingly popular across all major industries, meaning that more users are accessing your data on a regular basis. Edge computing allows for better network management, ensuring that the network is not swamped. As such, you can avoid costly upgrades and the risk of compromising the user experience with lagging applications. It also offers the potential for future upgrades, opening up new opportunities for growth.

    There are many other reasons why edge computing should be implemented into an IoT system. However, these are just a few of the key benefits it promises and they could lead to huge overhauls in how your business operates.

    Source: www.orientsoftware.com
