Graduate Colloquium Talk


Automating Industrial Event Stream Analytics: Methods, Models, and Tools


Industrial event streams are an important cornerstone of Industrial Internet of Things (IIoT) applications. Such streams are produced by distributed industrial assets on the shop floor at high frequency and are typically processed automatically in close proximity to the source. Industrial event stream analytics is an essential building block for exploiting the full potential of this data and creating added value, such as predictive quality assessment or predictive maintenance.

A major challenge that hinders the full adoption of industrial event stream analytics is the distribution of the required technical and domain knowledge among several people. This makes the realization of analytics projects time-consuming and error-prone. For instance, accessing industrial data sources requires a high level of technical skill due to the large heterogeneity of protocols and formats and the long operating lifetimes of industrial assets. In addition, there is a gap between the domain knowledge required to understand complex production processes and the data science and operations knowledge needed to implement analytics applications based on high-frequency industrial event streams. Altogether, there is a lack of a holistic approach that focuses on domain experts.

To solve these challenges, a new set of methods and tools is required that covers the entire data analytics life cycle for industrial event streams. Data must be connected and analyzed (e.g., to train machine learning models). These models must then be deployed in a geographically distributed architecture. Specialized solutions exist for the individual tasks, but combining all of them can become rather complex.

In this work, we enable domain experts to exploit the full potential of data-driven decision making in industry scenarios. To this end, we present an end-to-end framework to connect, transform, and analyze industrial event streams without deep technical expertise. We leverage semantic models that are used to automatically instantiate adapters at the edge and extract preprocessing rules. Subsequently, these models are used i) to automatically adapt event streams (e.g., frequency, quality) according to processing requirements at the edge and ii) to better exploit event streams by enabling domain experts to label data in an intuitive way and train new machine learning models automatically. We provide an edge-based semantic adapter library to accelerate data connection. Connected data streams are dynamically adapted according to the requirements of the algorithms processing the events. Domain experts can use a labeling editor to label data and train machine learning models for time-series data using an automated machine learning service. We have evaluated both the performance of our solutions and their applicability and usability in a user study. The proposed methods are accompanied by extensive tool support, which is fully available as open source.
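The frequency adaptation mentioned above can be illustrated with a minimal sketch: a downsampler that forwards events only as fast as a downstream algorithm can consume them. The class, field names (e.g. `timestamp_ms`), and parameters are hypothetical and chosen for illustration; they are not the actual API of the presented framework.

```python
from typing import Optional


class FrequencyAdapter:
    """Downsamples an event stream to at most `max_hz` events per second.

    Events arriving faster than the target rate are dropped, so that
    downstream processing algorithms receive data at a rate matching
    their stated requirements. Hypothetical illustration only.
    """

    def __init__(self, max_hz: float):
        # Minimum number of milliseconds between two forwarded events.
        self.min_interval_ms = int(1000 / max_hz)
        self.last_emitted_ms: Optional[int] = None

    def process(self, event: dict) -> Optional[dict]:
        """Return the event if it should be forwarded, else None."""
        ts = event["timestamp_ms"]  # assumed: integer milliseconds
        if self.last_emitted_ms is None or ts - self.last_emitted_ms >= self.min_interval_ms:
            self.last_emitted_ms = ts
            return event
        return None  # dropped: arrived too soon after the last forwarded event


# Usage: one second of 100 Hz sensor data reduced to at most 10 Hz.
adapter = FrequencyAdapter(max_hz=10)
events = [{"timestamp_ms": t * 10, "value": t} for t in range(100)]
kept = [e for e in events if adapter.process(e) is not None]
# kept now holds every tenth event (timestamps 0, 100, ..., 900 ms)
```

In the framework described in the talk, the decision of which adaptation to apply is driven by the semantic models rather than hard-coded parameters as in this sketch.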

In this presentation, we show how our work enables domain experts to leverage machine data and reduces the effort of data analytics projects on industrial event streams.

(Philipp Zehnder)

Start: July 29, 2020, 15:45
End: July 29, 2020, 17:15

Building 05.20, Room: online event (Zoom meeting)

Add event to calendar: (iCal)

Organizer: Research group(s) Web Science
Information: Media:Zehnder 29-07-2020.pdf