Cracking the Code on Predictive Maintenance Requires More Than Sensor-Generated Telemetry Data
MANUEL TERRANOVA, President and CEO of Peaxy, says:
Current talk would lead one to believe that industrial giants have cracked the code on predictive maintenance thanks to the petabytes of sensor-generated data coming off machines today and landing in the hands of engineering teams. GE, for instance, attributed $800 million in incremental revenue to new predictive maintenance capabilities in 2013. The reality is that only the surface has been scratched, because current IT infrastructures make it impossible to achieve zero-outage ambitions or deliver long-lasting, breakthrough “smart” technology innovation.
Yes, sensor data is a vital piece of delivering the true promise of the Industrial Internet of Things, and the ability to harness and analyze these data sets in real time will no doubt yield notable improvements. However, to reach the next level of predictive maintenance breakthroughs and “smart” products, engineering teams must be able to aggregate and compare telemetry data with the original geometry drawings and test simulations. Yet, keeping in mind that industrial machinery may remain in the field for 30 or more years, these files may be decades old and dark.
Consider, for example, a large passenger aircraft. The original design may have been created in the 1980s, while data from various simulations and the original test bench would have been generated over the course of multiple decades. This would pose no problem for engineers if the following were true: 1) files were aggregated over the years in a consistent and logical manner throughout the organization; and 2) data management practices over the decades kept those files in stable, predictable locations.
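To make the idea concrete, here is a minimal sketch, in Python, of one way the aggregation problem could be framed. It is not Peaxy's product or API; the class names, asset IDs, and file paths are hypothetical. The point it illustrates is simply that keying every artifact (CAD geometry, bench tests, telemetry) to a stable asset identifier, rather than to whatever storage path it currently occupies, is what lets records created decades apart be lined up and compared.

```python
# Hypothetical sketch: catalog artifacts by a stable asset ID, not by storage path,
# so decades-old design files and fresh telemetry can still be aggregated together.
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List


@dataclass
class Artifact:
    """One record tied to an asset: a CAD drawing, a simulation run, a telemetry batch."""
    kind: str          # e.g. "geometry", "simulation", "telemetry"
    created: date
    location: str      # current storage path; allowed to change over the years


@dataclass
class AssetRecord:
    """All artifacts for one physical asset, keyed by a stable ID."""
    asset_id: str
    artifacts: List[Artifact] = field(default_factory=list)

    def add(self, artifact: Artifact) -> None:
        self.artifacts.append(artifact)

    def by_kind(self, kind: str) -> List[Artifact]:
        return [a for a in self.artifacts if a.kind == kind]


# Hypothetical example: a 1980s airframe design joined with present-day telemetry.
catalog: Dict[str, AssetRecord] = {}
rec = catalog.setdefault("airframe-0042", AssetRecord("airframe-0042"))
rec.add(Artifact("geometry", date(1987, 6, 1), "/archive/cad/airframe-0042.dwg"))
rec.add(Artifact("simulation", date(1995, 3, 12), "/archive/sim/fatigue-run-17.dat"))
rec.add(Artifact("telemetry", date(2016, 9, 30), "s3://fleet-telemetry/airframe-0042/2016-09.parquet"))

# Because everything is keyed by asset_id, an engineer can line up the original
# geometry and simulations against fresh sensor data even if the files have moved.
design = rec.by_kind("geometry")
telemetry = rec.by_kind("telemetry")
print(f"{rec.asset_id}: {len(design)} design file(s), {len(telemetry)} telemetry batch(es)")
```

The design choice worth noting is that the physical location of a file is treated as mutable metadata, while the asset identifier is the permanent key; that is what the two conditions above (consistent aggregation and stable, findable locations) amount to in practice.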
Click here to read the full article about Peaxy published on #DataCenterPost!