Next Big Thing

Fog Computing – Where the Cloud Meets the Ground
Mar 16, 2017

The recent trend toward cloud services has allowed small and medium-sized businesses to take advantage of everything that big data processing has to offer. The high entry cost of building an enterprise data center had limited access to the massive computing power used by industry giants like Google, Amazon or eBay. Until recently, the ability to analyze, for example, customer data for 100,000 clients per day for actionable trends, and then adjust a website accordingly, was simply out of reach for smaller firms.

Now these entry-level clients can simply rent seconds, minutes or hours in cloud computing centers like Amazon Web Services or Microsoft Azure. This processing power is available at a relatively reasonable price, and thousands of businesses are signing up to cut their IT costs while increasing their computing capacity.

One downside noted by some of these businesses, especially the ones where milliseconds count, is the added latency of uploading the data and retrieving the results. Larger jobs that would normally take days to compute, such as processing monthly payroll for a large organization, can be reduced to minutes in the cloud. On the other hand, simple calculations that used to take milliseconds can stretch out to minutes of latency as well.

Fog computing has rolled in to fill this gap and attempt to provide the best of both worlds. This decentralized computing infrastructure distributes data, computation and storage in the most logical, efficient way between the data source and the cloud. It extends the cloud to the edge of the network, so that clients can have all the advantages of the cloud closer to home; the name comes from the idea of fog as a cloud that has settled on the ground.

This reduces the amount of data that needs to be moved into the cloud to be processed, analyzed and stored, increasing efficiency. In some instances, it can also help organizations satisfy security and compliance requirements, for example when government agencies are not allowed to transport or store data outside their national jurisdictions.

Fog computing is becoming especially relevant as the Internet of Things (IoT) expands. Edge devices and sensors generate massive amounts of data that would be extremely costly to process and store locally, but which would also be inefficient to send to the conventional cloud (if we can use that term already) for advanced processing and machine learning tasks. In many applications, sending raw data from a sensor over the Internet would have serious privacy, security and legal implications. Even if these problems could be overcome, network and processing latency may render the processed data obsolete by the time it becomes actionable.

Fog computing solves these problems by keeping data closer to the ground, making smart grids, smart cities, smart buildings and vehicle-to-vehicle networks more efficient.

In 2015, to help develop an open reference architecture, the OpenFog Consortium was founded by members from Cisco, Dell, Intel, Microsoft, ARM and Princeton University. The consortium set out processes that allow decision making to take place in a data hub on a smart device, which routes traffic either internally for local processing or externally to the cloud. This process is not designed to replace cloud computing. It reduces the load placed on the cloud by performing light, short-term analytics at the edge, while leaving the more resource-intensive, longer-term analytics to the cloud.
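As a rough, hypothetical illustration of that routing idea (the function name, field names and thresholds below are invented for this sketch and are not part of the OpenFog specification), a fog hub might decide between local and cloud handling roughly like this:

```python
# Minimal sketch of a fog hub's routing decision (assumed names and thresholds).
# Latency-sensitive, short-term analytics stay at the edge; heavy, long-term
# analytics are forwarded to the cloud.

LOCAL_LATENCY_BUDGET_MS = 50  # assumed cutoff for "must be handled locally"

def route_reading(latency_budget_ms: int, needs_historical_context: bool) -> str:
    """Return 'local' or 'cloud' for a single sensor reading."""
    if needs_historical_context:
        # Long-term trend analysis relies on the cloud's storage and compute.
        return "cloud"
    if latency_budget_ms <= LOCAL_LATENCY_BUDGET_MS:
        # Millisecond-scale decisions are made on the hub itself.
        return "local"
    return "cloud"

# A voltage spike must be acted on immediately, so it is handled locally;
# a monthly usage summary is sent to the cloud for deeper analysis.
print(route_reading(latency_budget_ms=10, needs_historical_context=False))    # local
print(route_reading(latency_budget_ms=60000, needs_historical_context=True))  # cloud
```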

Even though fog computing takes place at the edge of the cloud, it is not to be confused with edge computing. Although both are designed to decentralize processing within complex systems, the primary difference is where the processing is moved. Edge computing attempts to move storage and processing as close to the edge as possible, outside the cloud. Sensors, for example, would be responsible for storing and processing their own data and communicating with the servers as little as possible. A security camera, for instance, might not broadcast a signal at all unless it identifies something in the image that is worth sending to the server.

Fog computing, on the other hand, would send that signal to a processor within the LAN, which would perform a preliminary analysis before deciding whether to pass the data up the line to cloud servers for more in-depth review.
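To make the contrast concrete, the following sketch compares the two approaches under assumed names (motion_score and the 0.8 threshold are placeholders for whatever detection logic a real device would run, not calls from an actual camera SDK):

```python
# Hypothetical comparison of how a camera frame is handled under edge vs. fog.

def motion_score(frame: bytes) -> float:
    """Toy stand-in for on-device image analysis (assumed, not a real API)."""
    return (len(frame) % 100) / 100.0

def edge_camera_handle(frame: bytes) -> str:
    # Edge computing: the camera itself decides; most frames never leave the device.
    return "send to server" if motion_score(frame) > 0.8 else "discard on camera"

def fog_node_handle(frame: bytes) -> str:
    # Fog computing: the frame goes to a LAN node, which performs a preliminary
    # analysis and only escalates interesting frames to the cloud.
    if motion_score(frame) > 0.8:
        return "forward to cloud for in-depth review"
    return "keep a summary on the LAN node, drop the raw frame"

frame = b"\x00" * 1085  # placeholder frame data
print(edge_camera_handle(frame))  # send to server
print(fog_node_handle(frame))     # forward to cloud for in-depth review
```

In both cases most traffic stays off the Internet; the difference is whether the filtering intelligence lives on the sensor itself or on a nearby fog node.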

While edge computing minimizes concentrated points of failure by allowing each device to act independently, proponents of fog computing argue that their approach lowers costs and scales better.

The applications of fog computing are almost as varied as the cloud itself. Applied to electrical grids, it may make it possible to build smarter grids that further improve operational efficiency and reduce maintenance costs by allowing each component of the grid to both talk and listen.

Smart cities are another application that would benefit from fog computing, allowing municipalities to track and share key information with the public. Environmental and safety concerns are helping to drive these initiatives, and fog computing allows large-scale machine learning to take place closer to where key decisions are being made.

Related to this trend are smart buildings and vehicle networks, which are benefiting from the reduced energy waste and improved efficiency that the IoT is providing.

Fog computing is also expected to make inroads in telemedicine and patient care environments, where every millisecond matters, as well as in smart rail, manufacturing and utility applications.

Fog computing touches on so many industries that most of its applications are still in the design phase. Noosphere Ventures is eager to support any technology that helps acquire and share new knowledge. Fog computing in particular is positioned to drive intelligent applications of knowledge to the limits, expanding our world and making life easier for everyone in it.