This paper introduces CAAI, a novel cognitive architecture for artificial intelligence in cyber-physical production systems. The goal of the architecture is to reduce the effort of implementing artificial intelligence algorithms. The core of CAAI is a cognitive module that processes the user's declarative goals, selects suitable models and algorithms, and creates a configuration for executing a processing pipeline on a big data platform. Pipelines are continuously monitored and evaluated against performance criteria across many varying use cases and, based on these evaluations, are automatically adapted when necessary. The modular design with well-defined interfaces makes pipeline components reusable and extensible. A big data platform implements this modular design, using technologies such as Docker, Kubernetes, and Kafka to virtualize and orchestrate the individual components and their communication. The implementation of the architecture is evaluated on a real-world use case.
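The abstract describes a cognitive module that maps a declarative user goal to a pipeline configuration. The sketch below is a hypothetical, much-simplified illustration of that idea in Python; the `Goal`, `ALGORITHMS`, and `build_pipeline` names, the latency criterion, and the topic names are all illustrative assumptions, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    kind: str            # e.g. "optimization" or "prediction" (illustrative)
    max_latency_ms: int  # a performance criterion the pipeline must satisfy

# Toy registry of candidate algorithms with rough cost estimates (ms).
# In CAAI, selection would draw on a richer knowledge base; this is a stand-in.
ALGORITHMS = {
    "optimization": [("surrogate_model_opt", 500), ("evolutionary_alg", 2000)],
    "prediction":   [("linear_regression", 50), ("neural_network", 800)],
}

def build_pipeline(goal: Goal) -> dict:
    """Select an algorithm that meets the performance criterion and
    emit a configuration for the processing pipeline."""
    candidates = [(name, cost) for name, cost in ALGORITHMS.get(goal.kind, [])
                  if cost <= goal.max_latency_ms]
    if not candidates:
        raise ValueError(f"no algorithm meets the criteria for goal {goal.kind!r}")
    # Pick the cheapest feasible algorithm; the architecture would instead
    # re-evaluate and adapt this choice as performance data accumulates.
    algorithm = min(candidates, key=lambda c: c[1])[0]
    # The resulting configuration could be deployed as containers (Docker /
    # Kubernetes) communicating over a message bus (Kafka topics, assumed names).
    return {
        "steps": ["preprocessing", algorithm, "evaluation"],
        "transport": {"input_topic": "sensor-data", "output_topic": "results"},
    }

config = build_pipeline(Goal(kind="prediction", max_latency_ms=100))
print(config["steps"])  # ['preprocessing', 'linear_regression', 'evaluation']
```

The automatic adaptation described in the abstract would correspond to re-running such a selection step whenever monitored performance falls below the declared criteria.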