In engineering school in the late ’80s, I was required to take a drafting class my freshman year - “rotate that pencil to make consistent lines.” By my senior year, the university had purchased something mind-blowing: CAD/CAM. This attests to my experience and, unfortunately, my age, but it was the disruption caused by the digitization of the engineering process that left an impression. Instead of taking a more traditional path as a freshly minted engineer, I was fortunate to land at an innovative computer graphics company in Alabama called Intergraph. At the time, Intergraph was a leader in the application of computer graphics and was disrupting the engineering industry. It was an interesting period and demonstrated how digital technologies could transform industries. I have many stories of trying to teach old-school draftsmen (drafters, I suppose, is the correct term) and machinists to adopt this new paradigm. This was the advent of Industry 3.0.
In 2013, after oscillating between engineering automation and IT for several years, I found myself at a hardware startup building subsea LiDAR systems for the oil and gas industry. This was the first time in my career that we merged hardware and software to develop high-value solutions for clients. I spent countless days on large deep-sea construction vessels collecting and processing data, wiring together ancillary hardware systems, and integrating their specific software tools. In today’s parlance this would be considered OT (operational technology). It gave me an appreciation that not every problem can be solved with software alone. It was very different from the IT world on the other side of the company, which relied almost entirely on software and the cloud to solve business problems.
With the proliferation of inexpensive, high-performance microcomputers and sensors for edge computing, the next industrial revolution, Industry 4.0, seems to be upon us. As IT became the acronym for back-office enterprise systems, a new term was coined to describe the tools and technologies used in the field and on the shop floor: OT (operational technology). OT existed well before IT, but the systems and data produced on this side of the business are increasingly useful and available to the entire company. Traditional closed OT systems such as SCADA are evolving as manufacturers adopt open standards and modern software techniques for edge applications. OT data has begun to creep into IT systems; these datasets can now be integrated into ERP systems, providing real-time views into the supply side of the business. Many companies are optimizing their supply chains with more open and autonomous hardware and sensors in what many publications have called the rise of the machines. The media refers to this next generation of machine data as IoT (the Internet of Things).
While Industry 3.0 centered on automation - human-programmed robots and software automating information tasks - Industry 4.0 is experiencing an explosion of data generated by sensors, with the intent that these sensors be used to make real-time decisions in support of overall systems and processes. Unfortunately, traditional IT systems were not designed for these data volumes or their real-time analytics requirements, so there is a need for next-generation data architectures to support sensor data (IoT), machine learning, and artificial intelligence.
During the brief time that I have been at HarperDB, I have met with many clients and partners who are challenged to build workflows and data pipelines around their sensor data. I see the following key challenges across these strategies:
- Data is collected at the edge, on devices, and all of it is expected to be transmitted to the cloud, where a centralized system can mine it for useful information. Although this is good for companies that sell network and compute capacity, it is neither efficient nor cost-effective for end users; sending everything to the cloud is expensive. One way to optimize compute and network costs is to move processing and data filtering to the edge (see the first sketch after this list).
- Data security is a challenge, since each of the software layers used to collect, transfer, and store data has its own embedded security scheme. IoT applications often have unique security requirements that the individual data stacks cannot support. With IoT it becomes more important to have a flexible, open model that can adopt a broad set of security tools and evolve over time.
- Integration of this data into corporate IT systems is difficult, mainly because these two organizations have traditionally never worked together. If you ask a modern IT employee to stream data from a serial port, you will be met with a blank stare. If you ask a production engineer to write an interface from their data into the ERP, you will be met with the same blank stare (and probably a few interesting phrases). The second sketch after this list shows how small that glue can be.
- Real-time is a relative and often undefined term. Most end users want data in “real-time,” but with the tools and applications needed to collect, transform, and process it through traditional cloud architectures, real-time has eroded to “near real-time” and finally to “as real-time as possible.” Moving processing to the devices and the edge reduces latency and provides alerts and events that are closer to truly real-time.
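To make the edge-filtering point concrete, here is a minimal sketch in Python. Everything in it is illustrative: the simulated temperature sensor, the thresholds, the topic names, and the `publish()` function, which stands in for whatever transport you use (MQTT, HTTP, or a local database that syncs upstream). Raw readings stay on the device; only periodic summaries and immediate out-of-band alerts leave for the cloud, which addresses both the cost and the latency concerns above.

```python
import json
import random
import statistics
import time
from collections import deque

WINDOW_SIZE = 60         # readings retained on-device for local statistics
SUMMARY_INTERVAL = 60    # seconds; one summary replaces a window of raw points
ALERT_THRESHOLD = 90.0   # illustrative limit; breaches alert immediately


def read_sensor() -> float:
    """Stand-in for a real sensor read; returns a simulated temperature."""
    return random.gauss(70.0, 5.0)


def publish(topic: str, payload: dict) -> None:
    """Hypothetical cloud transport (MQTT, HTTP, etc.); printed here."""
    print(f"{topic}: {json.dumps(payload)}")


def run() -> None:
    window = deque(maxlen=WINDOW_SIZE)
    last_summary = time.monotonic()
    while True:
        reading = read_sensor()
        window.append(reading)

        # Alerts are decided locally - no cloud round trip - so latency is
        # bounded by the device, not the network.
        if reading > ALERT_THRESHOLD:
            publish("plant/line1/alerts", {"temp": round(reading, 2), "ts": time.time()})

        # Everything else is aggregated before it ever leaves the device.
        now = time.monotonic()
        if now - last_summary >= SUMMARY_INTERVAL and window:
            publish("plant/line1/summary", {
                "mean": round(statistics.fmean(window), 2),
                "min": round(min(window), 2),
                "max": round(max(window), 2),
                "count": len(window),
                "ts": time.time(),
            })
            last_summary = now

        time.sleep(1.0)


if __name__ == "__main__":
    run()
```

The shape is the same regardless of transport: the decision about what leaves the device is made on the device, so a minute of raw readings collapses into one summary message unless something urgent happens.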
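And for the integration gap, a sketch of the glue code that falls between the two blank stares: reading line-oriented data from a serial port with pyserial and posting it to a REST endpoint with requests. The port path, the CSV message format, and the ERP endpoint URL are all hypothetical assumptions - a real ERP integration would typically go through a vendor-specific API or middleware - but the shape of the bridge is roughly this small.

```python
from typing import Optional

import requests
import serial  # pyserial: pip install pyserial

SERIAL_PORT = "/dev/ttyUSB0"                           # hypothetical device path
BAUD_RATE = 9600
ERP_ENDPOINT = "https://erp.example.com/api/readings"  # hypothetical REST API


def parse_line(raw: bytes) -> Optional[dict]:
    """Assumes the device emits CSV lines like 'line1,temp,71.3'."""
    try:
        station, metric, value = raw.decode("ascii").strip().split(",")
        return {"station": station, "metric": metric, "value": float(value)}
    except (UnicodeDecodeError, ValueError):
        return None  # skip malformed or partial lines


def main() -> None:
    with serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=1) as port:
        while True:
            raw = port.readline()  # returns b"" when the read times out
            if not raw:
                continue
            record = parse_line(raw)
            if record is None:
                continue
            try:
                resp = requests.post(ERP_ENDPOINT, json=record, timeout=5)
                resp.raise_for_status()
            except requests.RequestException as exc:
                # A production bridge would buffer and retry, not drop.
                print(f"failed to deliver {record}: {exc}")


if __name__ == "__main__":
    main()
```

Neither side of the organization would find this code exotic; the difficulty is that it sits in a no man's land that neither side has traditionally owned.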