April 14, 2015
At a construction site in Dubai near the world’s tallest building, the Burj Khalifa, a complex ballet unfolds. Hundreds of trucks come and go as 5,000 workers labor beneath 37 cranes to raise another skyscraper in this United Arab Emirates city.
The potential for disaster is never far away. According to OSHA, one in five workers killed on the job in the United States in 2013 — nearly 800 people — met their end on construction sites. In the Arab Gulf states, construction site safety has been in the spotlight after a slew of worker deaths in the rush to build infrastructure for soccer’s 2022 World Cup in Qatar. But on this particular site, there’s more choreography than meets the eye.
That’s because the cranes are loaded with sensors that allow their real-time location to be determined instantly—part of a state-of-the-art collision-avoidance system from SK Solutions that helps keep cranes from clashing with each other and anything else. Key to the system’s effectiveness is the fact that the sensor-based data on the cranes—3-D motion control, location, load weight, equipment usage and wind speed — is processed locally instead of in remote data centers, aka the cloud. Welcome to the brave new world of “fog computing”— an emerging paradigm that extends cloud computing and services to end users and end devices at the edge of the network, improving performance, user experience, security and reliability.
“There is no way SK Solutions could rely on data streaming off the cranes to go all the way back to a cloud data center some place, then back to the crane site to ensure there is no crane collision,” says Steve Hilton, co-founder and president at Boston, Mass.-based MachNation, which offers applications and insight services for the Internet of Things (IoT) and its next phase, the Internet of Everything (IoE). “Based on our IoT research, fog computing is great for Internet of Things deployments where the enterprise needs some moderate to sophisticated computing and processing closer to the edge of a network.”
Taming the Data Tsunami at the Edge
As its name implies, fog computing is similar to cloud computing but—extending the metaphor—is closer to the ground. Translation: services are hosted where they’re used—on the things that make up the Internet of Things, or on devices that sit between those things and the Internet, known as the network edge. Because services are hosted locally, the fog paradigm results in faster processing and reduced latency—allowing things like connected cranes to dance collision-free. Other distinguishing characteristics of fog are its widespread geographical distribution and mobility.
Proponents say fog computing provides a new computing model that’s becoming necessary as the Internet of Things explodes. As things from thermometers and electric meters, to blood pressure gauges, brake assemblies and even sheep become connected to the Internet, the coming “data tsunami” will render the existing cloud model incapable of handling the load. Consider, for example, that a single commercial airplane generates about 10 terabytes of data for every 30 minutes of flight. Moving such huge volumes of data around — for example, backhauling it to a cloud-based application for analysis — is not only expensive, but in many cases unnecessary and even counter-productive.
Fog in Connected Vehicles, Smart Cities and Smart Grid
Fog computing doesn’t consist of powerful servers, but of weaker and more dispersed computers of the sort that are proliferating in vehicles, cities, power utilities and just about everywhere else. These computers don’t talk to the cloud unless they have to.
Transportation offers more Internet of Things scenarios where fog computing could take a data load off. As trains become increasingly connected, rail cars will become loaded with sensors—both in their interiors and on things like wheels, axles and the connectors between cars. The data streaming from these sensors—such as temperature and vibration data—could be constantly collected and analyzed locally, on an access point, hub or other edge device in the train.
As long as the data remain within appropriate tolerances — i.e. “I’m OK” — nothing is relayed to the cloud. But if tolerances are exceeded — i.e. “I need maintenance” — the data stream to a data center in the cloud to initiate predictive maintenance or other actions. For emergency maintenance, such as when sensors on wheel ball bearings detect excessive heat, a local application can send an automatic alert to the train operator to stop the train at the next station to avoid potential derailment.
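The three-way decision described above — keep normal readings at the edge, escalate out-of-tolerance readings to the cloud, and alert the operator locally in an emergency — can be sketched in a few lines. This is a minimal illustration only: the thresholds, sensor names and the `send_to_cloud`/`alert_operator` hooks are hypothetical assumptions, not part of any real train system.

```python
# Illustrative edge-filtering logic for a bearing-temperature sensor.
# Thresholds and hook functions are made-up examples for this sketch.

WARN_TEMP_C = 80.0       # above this, escalate for predictive maintenance
CRITICAL_TEMP_C = 120.0  # above this, act locally and immediately

def handle_reading(sensor_id, temp_c, send_to_cloud, alert_operator):
    """Decide locally, on the edge device, what to do with one sample."""
    if temp_c >= CRITICAL_TEMP_C:
        # Emergency: alert the operator without waiting on a cloud round trip.
        alert_operator(f"{sensor_id}: {temp_c} C - stop at next station")
        return "alert"
    if temp_c >= WARN_TEMP_C:
        # Out of tolerance: relay the reading for maintenance planning.
        send_to_cloud({"sensor": sensor_id, "temp_c": temp_c})
        return "cloud"
    # "I'm OK": the data never leave the train.
    return "local"
```

In this sketch the vast majority of samples return `"local"` and generate no network traffic at all, which is the point of processing at the edge: only the exceptions travel to the data center.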
Possible fog scenarios are numerous. Activated by a video camera sensing an ambulance’s flashing lights, smarter traffic lights could automatically open lanes to speed the vehicle’s passage through traffic. Vents in mines could automatically change airflow if air conditions became dangerous to miners. And energy load-balancing applications running on edge devices could automatically switch to alternative energies like solar and wind based on energy demand, availability and price.
In the next five years, Hilton says he expects to see more adoption of fog computing in sectors like industrial, manufacturing, construction, heavy machinery and petrochemical. He says it will be up to enterprises to decide when to use the fog paradigm.
“They’ll want to ensure their Internet of Things deployments get the most benefit from using fog computing for the right processes, while relying on cloud-based computing for other processes,” he says.
Laurence Cruz is a freelance writer based in Los Angeles. A U.K. transplant, he has worked as a reporter with The Associated Press in Seattle and as an environmental reporter for The Statesman Journal in Salem, Oregon. He has a BA in English from Oxford and an MA in Communications from Washington State University. Used with the permission of http://thenetwork.cisco.com/.