In this paper we present a comparison of different data-driven modeling methods. A first instance of a data-driven linear Bayesian model is compared with several linear regression models, a Kriging model and a genetic programming model. The models are built on industrial data from the development of a robust gas sensor. The data contain a limited number of samples and exhibit high variance. The mean squared error of the models on a test dataset is used as the comparison criterion. The results indicate that the standard linear regression approaches as well as Kriging and GP show good results, whereas the Bayesian approach, despite requiring additional resources, does not lead to improved results.
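A minimal sketch of this comparison protocol, assuming scikit-learn estimators as stand-ins for the models in the study and synthetic data in place of the non-public industrial gas-sensor data:

```python
# Fit several regressors on a small, noisy training set and rank them by
# mean squared error on a held-out test set (synthetic stand-in data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression, BayesianRidge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(60, 3))                            # few samples
y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(0, 0.5, 60)     # high variance

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "Bayesian linear model": BayesianRidge(),
    "Kriging (GP)": GaussianProcessRegressor(kernel=RBF() + WhiteKernel()),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:22s} test MSE = {mse:.3f}")
```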
Modelling Zero-inflated Rainfall Data through the Use of Gaussian Process and Bayesian Regression (2018)
Rainfall is a key parameter for understanding the water cycle. Accurate rainfall measurement is vital for the development of hydrological models. By means of indirect measurement, satellites can nowadays estimate rainfall around the world. However, these measurements are not always accurate. As a first approach to generating a bias-corrected rainfall estimate from satellite data, the performance of Gaussian process and Bayesian regression models is studied. The results show the Gaussian process to be the better option for this dataset, but leave room for improvement in both modelling strategies.
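A minimal sketch of the two modelling strategies compared here, using scikit-learn's GaussianProcessRegressor and BayesianRidge; the satellite and gauge values below are synthetic assumptions of the example, with zero-inflation imitated by clipping negatives to zero:

```python
# Map a satellite rainfall estimate to a bias-corrected (gauge-like) value with
# a Gaussian process and a Bayesian linear regression, then compare test MSE.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.linear_model import BayesianRidge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
satellite = rng.exponential(scale=3.0, size=(300, 1))                 # estimate (mm/h)
gauge = np.clip(0.8 * satellite[:, 0] - 0.5 + rng.normal(0, 0.4, 300), 0, None)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
bayes = BayesianRidge()

for name, model in [("Gaussian process", gp), ("Bayesian regression", bayes)]:
    model.fit(satellite[:200], gauge[:200])
    mse = mean_squared_error(gauge[200:], model.predict(satellite[200:]))
    print(f"{name}: test MSE = {mse:.3f}")
```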
Social learning enables multiple robots to share learned experiences while completing a task. The literature offers examples where robots trained with social learning achieve higher performance than their individual-learning counterparts. No explanation has been advanced for this observation. In this research, we present experimental results suggesting that a lack of parameter tuning in social learning experiments could be the cause. In other words, the better the parameter settings are tuned, the less social learning can improve system performance.
Many black-box optimization problems rely on simulations to evaluate the quality of candidate solutions. These evaluations can be computationally expensive and very time-consuming. We present an approach to mitigate this problem by taking two factors into consideration: the number of evaluations and the execution time. We aim to keep the number of evaluations low by using Bayesian optimization (BO), known to be sample efficient, and to reduce wall-clock times by executing evaluations in parallel. Four parallelization methods using BO as the optimizer are compared against the inherently parallel CMA-ES. Each method is evaluated on all 24 objective functions of the Black-Box Optimization Benchmarking (BBOB) test suite in their 20-dimensional versions. The results show that parallelized BO outperforms the state-of-the-art CMA-ES on most of the test functions, also in higher dimensions.
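For illustration, a hedged sketch of one way to parallelize BO with an ask/tell loop and a constant-liar batch strategy, using scikit-optimize and joblib. The specific parallelization methods studied in the paper are not reproduced here; this shows the general idea only, and the sphere objective is a placeholder for the expensive simulation:

```python
# Batch-parallel Bayesian optimization: the surrogate proposes a batch of
# candidates, the expensive evaluations run in parallel, results are fed back.
from joblib import Parallel, delayed
from skopt import Optimizer

def expensive_simulation(x):
    # Placeholder for a time-consuming simulation (simple sphere function).
    return sum(xi ** 2 for xi in x)

dim = 20
opt = Optimizer(dimensions=[(-5.0, 5.0)] * dim, base_estimator="GP", acq_func="EI")

batch_size = 4
for _ in range(10):                                    # 10 batches = 40 evaluations
    batch = opt.ask(n_points=batch_size, strategy="cl_min")   # constant liar
    values = Parallel(n_jobs=batch_size)(delayed(expensive_simulation)(x) for x in batch)
    opt.tell(batch, values)

print("best value found:", min(opt.yi))
```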
An important class of black-box optimization problems relies on simulations to assess the quality of a given candidate solution. Solving such problems can be computationally expensive because each simulation is very time-consuming. We present an approach to mitigate this problem by distinguishing two factors of computational cost: the number of trials and the time needed to execute the trials. Our approach tries to keep the number of trials down by using Bayesian optimization (BO), known to be sample efficient, and to reduce wall-clock times by executing trials in parallel. We compare the performance of four parallelization methods and two model-free alternatives. Each method is evaluated on all 24 objective functions of the Black-Box Optimization Benchmarking (BBOB) test suite in their five-, ten-, and 20-dimensional versions. Additionally, their performance is investigated on six test cases in robot learning. The results show that parallelized BO outperforms the state-of-the-art CMA-ES on the BBOB test functions, especially in higher dimensions. On the robot learning tasks the differences are less clear, but the data do support parallelized BO as the 'best guess', winning in some cases and never losing.
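As a counterpart to the BO sketch above, a sketch of the model-free baseline: CMA-ES via the `cma` package (pycma), whose ask/tell generations can naturally be evaluated in parallel. The objective is again a placeholder sphere function, not one of the BBOB or robot learning tasks:

```python
# CMA-ES baseline with the ask/tell interface: each generation yields a
# population whose evaluations can run in parallel.
from joblib import Parallel, delayed
import cma

def expensive_simulation(x):
    # Placeholder for a simulation-based objective (sphere function here).
    return sum(xi ** 2 for xi in x)

es = cma.CMAEvolutionStrategy(20 * [1.0], 0.5)          # 20-dimensional start point
while not es.stop() and es.countevals < 400:
    solutions = es.ask()                                 # one population per generation
    values = Parallel(n_jobs=-1)(delayed(expensive_simulation)(x) for x in solutions)
    es.tell(solutions, values)

print("best value found:", es.result.fbest)
```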
Sensor placement for contaminant detection in water distribution systems (WDS) has become a topic of great interest, with the aim of securing a population's water supply. Several approaches can be found in the literature, with differences ranging from the objective chosen for optimization to the methods used to solve the optimization problem. In this work we give an overview of current work on sensor placement, with a focus on contaminant detection in WDS. We present some of the objectives for which the sensor placement problem is defined, along with common optimization algorithms and toolkits available to help with algorithm testing and comparison.
Drinking water supply and distribution systems are critical infrastructure that must be well maintained for public safety. One important tool in the maintenance of water distribution systems (WDS) is flushing, a process carried out periodically to remove sediments and other contaminants from the water pipes. Given the differences in topography, water composition and supply demand between WDS, no single flushing strategy is suitable for all of them. In this report a non-exhaustive overview of optimization methods for flushing in WDS is given. Implementations of optimization methods for the flushing procedure and for flushing planning are presented. Suggestions are given as possible options to optimise existing flushing planning frameworks.
EventDetectR is an efficient event detection system (EDS) capable of detecting unexpected water quality conditions. The approach uses multiple algorithms to model the relationships between various multivariate water quality signals. The residuals of these models are then used to construct the event detection algorithm, which provides a continuous measure of the probability of an event at every time step. The proposed framework was tested on water contamination events with industrial data from automated water quality sensors. The results show that the framework is reliable, performs well, and is highly suitable for event detection.
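A rough illustration of the residual-based idea described above: model one water quality signal from another, then flag time steps with unusually large residuals as probable events. The sketch below is a Python assumption for illustration only and does not use or reflect the EventDetectR package's API; the signals and the injected event are synthetic:

```python
# Residual-based event detection on two coupled water quality signals.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 1000
ph = 7.0 + 0.1 * np.sin(np.linspace(0, 20, n)) + rng.normal(0, 0.02, n)
conductivity = 400 + 50 * (ph - 7.0) + rng.normal(0, 1.0, n)
conductivity[600:620] += 25                       # injected contamination event

# Model one signal as a function of the other on an event-free training window.
model = LinearRegression().fit(ph[:500].reshape(-1, 1), conductivity[:500])
residuals = conductivity - model.predict(ph.reshape(-1, 1))

# Turn residual size into a rough event indicator via a z-score threshold.
z = np.abs(residuals - residuals[:500].mean()) / residuals[:500].std()
event_flag = z > 4
print("flagged time steps:", np.flatnonzero(event_flag)[:10])
```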