The oil and gas industry already lives on the edge, operating in remote and often inhospitable locations. Now it is moving its computing to the edge as well, to gain business insights that can increase operational efficiency and profitability.
For any process company, downtime is anathema. It is costly and disruptive in any industry, but for oil and gas companies it can be particularly expensive. According to an MIT Sloan study, a single day of downtime at a liquefied natural gas (LNG) facility can cost $25 million, and a typical midsize LNG facility goes down about five times a year.
It is well documented that oil facilities, both upstream and downstream, generate vast volumes of data: a report from Cisco estimates that a typical oil platform generates up to 2TB of data every day. This creates enormous challenges for communications, storage, and analysis. One solution would be to collect less data, but with the continuing growth of sensor deployments the flow of data shows no sign of shrinking, quite the opposite.
“While technologies such as cloud computing and hybrid storage have been touted as solutions, these still rely on data being transmitted, and with many offshore facilities working on satellite communications at speeds of around 2Mbps, that is still not practical,” explains Jane Ren, CEO and founder of Atomiton. “The obvious solution would be to deal with that data on site, as close as possible to where it is generated. Not just handled but analyzed and used to deliver actionable business information. That is why edge computing is rapidly becoming a crucial tool in the industrial internet of things (IIoT) toolbox.”
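The pattern Ren describes can be illustrated with a minimal sketch: instead of streaming every raw sensor reading over a constrained satellite link, an edge node reduces each window of readings to a compact summary and an alert flag, and only the summary is transmitted. The function name, field names, and threshold below are illustrative assumptions, not part of any particular vendor's product.

```python
import statistics

def summarize_window(readings, alert_threshold=90.0):
    """Reduce a window of raw sensor readings (e.g. pump temperatures)
    to a compact summary suitable for a low-bandwidth uplink.

    All names and the threshold are illustrative assumptions.
    """
    return {
        "count": len(readings),           # how many raw samples were condensed
        "mean": statistics.fmean(readings),
        "min": min(readings),
        "max": max(readings),
        # Flag the window only if it needs operator attention on shore.
        "alert": max(readings) > alert_threshold,
    }

# One minute of readings stays on the platform; only this dict is sent.
window = [72.4, 73.1, 71.9, 95.2, 72.8]
summary = summarize_window(window)
```

Sending a handful of summary fields per window instead of every sample is what makes a ~2Mbps satellite link workable; the raw data can still be retained locally for later forensic analysis.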