The obstacle-avoidance problem of intelligent vehicles is one of the challenges faced in path planning. To tackle it, this paper proposes a real-time path planning approach based on the tentacle algorithm and B-spline curves. In this approach, a set of virtual tentacles is first built to represent the precalculated paths of the ego vehicle at its current speed. The best tentacle path is then selected among them; it provides a safe driving direction and a sampling area for generating B-spline paths. While the vehicle drives along the best tentacle path, a B-spline path is generated from the sampling area. Finally, the designed path is formed by segments of the best tentacle and a B-spline curve. Compared with other sampling-based path-set approaches, the proposed approach requires a shorter reaction time. Simulation and experimental results verify the real-time performance and effectiveness of the local path planning algorithm.
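For a concrete picture of the two ingredients this abstract combines, the following is a minimal sketch (not the authors' implementation): the "tentacles" are modeled as constant-curvature arcs at the current speed, the selection step is replaced by a crude clearance check against a single point obstacle, and a cubic B-spline is fitted through waypoints of the chosen tentacle with SciPy. The speed, curvature fan, obstacle and horizon are illustrative assumptions.

```python
# Illustrative sketch only: "tentacles" as constant-curvature arcs plus a
# B-spline fitted through waypoints of the selected arc.  Real obstacle
# handling, sampling areas, and the paper's selection logic are not modeled.
import numpy as np
from scipy.interpolate import splprep, splev

def tentacle(curvature, speed, horizon=3.0, n=20):
    """Constant-curvature arc driven for `horizon` seconds at `speed` (m/s)."""
    s = np.linspace(0.0, speed * horizon, n)          # arc-length samples
    if abs(curvature) < 1e-9:                         # straight tentacle
        return np.column_stack([s, np.zeros_like(s)])
    x = np.sin(curvature * s) / curvature
    y = (1.0 - np.cos(curvature * s)) / curvature
    return np.column_stack([x, y])

def best_tentacle(tentacles, obstacle):
    """Crude stand-in for the selection step: keep the tentacle whose
    closest approach to a point obstacle is largest."""
    clearance = [np.min(np.linalg.norm(t - obstacle, axis=1)) for t in tentacles]
    return tentacles[int(np.argmax(clearance))]

def bspline_path(waypoints, n_out=100):
    """Fit a cubic B-spline through 2-D waypoints and resample it densely."""
    tck, _ = splprep([waypoints[:, 0], waypoints[:, 1]], s=0.0, k=3)
    x, y = splev(np.linspace(0.0, 1.0, n_out), tck)
    return np.column_stack([x, y])

if __name__ == "__main__":
    speed = 5.0                                        # m/s, hypothetical
    curvatures = np.linspace(-0.2, 0.2, 9)             # tentacle fan
    fan = [tentacle(k, speed) for k in curvatures]
    chosen = best_tentacle(fan, obstacle=np.array([10.0, 1.0]))
    path = bspline_path(chosen[::4])                   # subsample as waypoints
    print("smoothed path has", len(path), "points")
```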
This paper investigates uncertainties in railway vehicle suspension components and the implementation of uncertainty quantification methods in railway vehicle dynamics. The sampling-based method represented by Latin Hypercube Sampling (LHS) and generalized polynomial chaos approaches, including the stochastic Galerkin and stochastic collocation methods (SGM and SCM), are employed to analyze the propagation of uncertainties from the input parameters of a vehicle-track mathematical model to the running-dynamics results. To illustrate the performance of SGM, SCM and LHS, a stochastic wheel model with uncertain stiffness and damping is first formulated to study the vertical displacement of the wheel. Numerical results show that SCM, which can be easily implemented on top of the existing deterministic model, has clear advantages over SGM and LHS in terms of efficiency and accuracy. Furthermore, a simplified stochastic bogie model with three random suspension parameters is established by means of SCM and LHS to analyze the critical speed, which is strongly affected by the parametric uncertainties. Finally, a stochastic vertical vehicle-track coupled model with parametric uncertainties is built on the basis of SCM, with which the dynamic response of the vehicle under track irregularity is explored in terms of the Sperling index. It is concluded that the parametric uncertainties have a significant influence on the Sperling index and hence on running quality.
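A minimal sketch of the Latin Hypercube branch only, assuming a toy one-degree-of-freedom spring-damper stands in for the vertical wheel model: uncertain stiffness and damping are sampled with SciPy's LatinHypercube engine and pushed through the steady-state amplitude formula. The bounds, nominal values and forcing are made up for illustration and are not the paper's data.

```python
# Sketch of the LHS uncertainty-propagation step on a toy 1-DOF proxy model.
import numpy as np
from scipy.stats import qmc

# Nominal values and +/-10% ranges are illustrative assumptions.
m, F0, w = 600.0, 1.0e4, 40.0                    # mass [kg], force [N], rad/s
k_lo, k_hi = 0.9 * 1.0e6, 1.1 * 1.0e6            # stiffness bounds [N/m]
c_lo, c_hi = 0.9 * 2.0e4, 1.1 * 2.0e4            # damping bounds [N s/m]

sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=512)                        # stratified unit-cube samples
kc = qmc.scale(u, [k_lo, c_lo], [k_hi, c_hi])    # map to physical ranges
k, c = kc[:, 0], kc[:, 1]

# Steady-state amplitude of a 1-DOF spring-damper under harmonic forcing,
# used here as a stand-in for the vertical wheel displacement response.
amp = F0 / np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)

print(f"mean = {amp.mean():.4e} m, std = {amp.std(ddof=1):.4e} m")
```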
One of the most challenging safety precautions for workers in dynamic, radioactive environments is avoiding radiation sources and sustaining low exposure. This paper presents a sampling-based algorithm, DL-RRT*, for minimum-dose walk-path re-planning in radioactive environments, intended to help occupational workers in nuclear facilities avoid unnecessary radiation exposure. The method combines the principles of the rapidly-exploring random tree star (RRT*) and D* Lite algorithms, and uses the expansion strength of the grid-search strategy from D* Lite to quickly find a high-quality initial path that accelerates the convergence rate of RRT*. The algorithm inherits probabilistic completeness and asymptotic optimality from RRT*, continually refining existing paths by sampling the search graph obtained from the grid-search process. It can not only be applied to continuous cost spaces, but also makes full use of the previous planning information to avoid global re-planning, thereby improving the efficiency of path planning in frequently changing environments. The effectiveness and superiority of the proposed method were verified by simulating radiation fields with varying obstacles and radioactive environments, and the results were compared with the output of the RRT* algorithm. (C) 2018 Korean Nuclear Society, Published by Elsevier Korea LLC.
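The sketch below illustrates only the RRT* core of such an approach, with an edge cost that approximates the dose accumulated along each segment of a synthetic inverse-square radiation field; the D* Lite seeding and incremental re-planning that distinguish DL-RRT* are not reproduced, and the field, bounds and parameters are assumptions.

```python
# Minimal RRT*-style sketch with a radiation-dose edge cost (illustrative only).
import math
import random

random.seed(1)
GOAL, START, STEP, RADIUS, N_ITER = (9.0, 9.0), (0.0, 0.0), 0.5, 1.5, 2000
SOURCES = [((5.0, 5.0), 50.0), ((2.0, 7.0), 20.0)]   # (position, strength)

def dose_rate(p):
    """Synthetic inverse-square dose-rate field (arbitrary units)."""
    return sum(s / (0.5 + (p[0] - x) ** 2 + (p[1] - y) ** 2) for (x, y), s in SOURCES)

def edge_cost(a, b, n_sub=5):
    """Approximate integral of the dose rate along the segment a -> b."""
    d = math.dist(a, b)
    pts = [(a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)
           for t in (i / n_sub for i in range(n_sub + 1))]
    return d * sum(dose_rate(p) for p in pts) / (n_sub + 1)

def steer(a, b):
    d = math.dist(a, b)
    if d <= STEP:
        return b
    return (a[0] + (b[0] - a[0]) * STEP / d, a[1] + (b[1] - a[1]) * STEP / d)

nodes, parent, cost = [START], {START: None}, {START: 0.0}
for _ in range(N_ITER):
    rnd = GOAL if random.random() < 0.05 else (random.uniform(0, 10), random.uniform(0, 10))
    nearest = min(nodes, key=lambda q: math.dist(q, rnd))
    new = steer(nearest, rnd)
    if new in cost:                          # skip duplicate states
        continue
    near = [q for q in nodes if math.dist(q, new) <= RADIUS]
    best = min(near, key=lambda q: cost[q] + edge_cost(q, new))
    nodes.append(new)
    parent[new], cost[new] = best, cost[best] + edge_cost(best, new)
    for q in near:                           # rewire step of RRT*
        c_through_new = cost[new] + edge_cost(new, q)
        if c_through_new < cost[q]:
            parent[q], cost[q] = new, c_through_new

best_goal = min(nodes, key=lambda q: cost[q] + edge_cost(q, GOAL))
print("approx. minimum accumulated dose to goal:",
      round(cost[best_goal] + edge_cost(best_goal, GOAL), 2))
```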
This paper assesses the impact of the homogenization level (nodal or pin-by-pin) on the uncertainties predicted by core simulators, highlighting the need for calculations at pin level for a reliable estimate of uncertainties in hot-channel factors. To perform the analysis, two methodologies for nuclear data uncertainty quantification in stand-alone neutronics calculations have been implemented in the core simulator COBAYA. The first is based on first-order perturbation theory and allows sensitivity/uncertainty analysis of the multiplication factor. The second is based on random sampling and allows uncertainty propagation to all responses computed by the code. Both methodologies are used in conjunction with the SCALE system, which provides the capability to propagate nuclear data uncertainties into the few-group constants required for the diffusion calculations. The methodologies are validated using a simplified pin cell and a fuel assembly, and then applied to a full 3D core in the context of the OECD/NEA UAM benchmark (Uncertainty Analysis in Best-Estimate Modeling for Design, Operation and Safety Analysis of LWRs).
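A hedged sketch of the random-sampling methodology in its simplest form: perturbed two-group constants are drawn from an assumed multivariate normal (a diagonal covariance is used here instead of SCALE-generated covariance data) and propagated through the standard two-group infinite-medium k-infinity formula. The nominal values and relative standard deviations are illustrative only.

```python
# Random-sampling propagation of few-group constant uncertainties (toy model).
import numpy as np

rng = np.random.default_rng(0)

# Order: nuSigf1, nuSigf2, Siga1, Siga2, Sigs12 (illustrative values, 1/cm).
nominal = np.array([0.0055, 0.1100, 0.0100, 0.0900, 0.0180])
rel_sd  = np.array([0.02,   0.02,   0.015,  0.015,  0.03 ])   # assumed

cov = np.diag((rel_sd * nominal) ** 2)                # no correlations assumed
samples = rng.multivariate_normal(nominal, cov, size=5000)

def k_inf(x):
    """Two-group infinite-medium multiplication factor (no upscattering)."""
    nusf1, nusf2, a1, a2, s12 = x
    return (nusf1 + nusf2 * s12 / a2) / (a1 + s12)

k = np.apply_along_axis(k_inf, 1, samples)
print(f"k-inf mean = {k.mean():.5f}, rel. std = {100 * k.std(ddof=1) / k.mean():.3f}%")
```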
Given a sample from a finite population, we provide a nonparametric Bayesian prediction interval for a finite population mean when a standard normal assumption may be tenuous. We will do so using a Dirichlet process (DP), a nonparametric Bayesian procedure which is currently receiving much attention. An asymptotic Bayesian prediction interval is well known but it does not incorporate all the features of the DP. We show how to compute the exact prediction interval under the full Bayesian DP model. However, under the DP, when the population size is much larger than the sample size, the computational task becomes expensive. Therefore, for simplicity one might still want to consider useful and accurate approximations to the prediction interval. For this purpose, we provide a Bayesian procedure which approximates the distribution using the exchangeability property (correlation) of the DP together with normality. We compare the exact interval and our approximate interval with three standard intervals, namely the design-based interval under simple random sampling, an empirical Bayes interval and a moment-based interval which uses the mean and variance under the DP. However, these latter three intervals do not fully utilize the posterior distribution of the finite population mean under the DP. Using several numerical examples and a simulation study we show that our approximate Bayesian interval is a good competitor to the exact Bayesian interval for different combinations of sample sizes and population sizes.
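To make concrete why the exact interval is computable but expensive when N is much larger than n, here is a Monte Carlo sketch: the N - n unsampled units are completed with Blackwell-MacQueen (Polya urn) draws conditional on the observed sample, and percentiles of the realized population mean give the interval. The concentration parameter and the normal base measure are assumptions for illustration, not the paper's choices.

```python
# DP-based prediction interval for a finite population mean (toy sketch).
import numpy as np

rng = np.random.default_rng(42)

def dp_population_means(y, N, alpha=1.0, base_draw=None, n_rep=2000):
    """Posterior draws of the finite population mean under a DP model."""
    y = np.asarray(y, dtype=float)
    n = y.size
    if base_draw is None:                               # assumed G0: fitted normal
        base_draw = lambda: rng.normal(y.mean(), y.std(ddof=1))
    means = np.empty(n_rep)
    for r in range(n_rep):
        pool = list(y)                                  # urn starts at the data
        for _ in range(N - n):
            if rng.random() < alpha / (alpha + len(pool)):
                pool.append(base_draw())                # new value from G0
            else:
                pool.append(pool[rng.integers(len(pool))])  # tie to an old value
        means[r] = np.mean(pool)                        # mean over all N units
    return means

sample = rng.normal(50.0, 8.0, size=30)                 # toy survey sample
draws = dp_population_means(sample, N=300)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"95% DP prediction interval for the population mean: ({lo:.2f}, {hi:.2f})")
```

The inner loop over the N - n unsampled units is exactly what becomes costly for large populations, which is the motivation for the approximate interval the authors propose.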
This paper proposes a methodology for sampling-based design optimization in the presence of interval variables. Assuming that an accurate surrogate model is available, the proposed method first searches the worst combination of interval variables for constraints when only interval variables are present, or for probabilistic constraints when both interval and random variables are present. Because the worst combination of interval variables for the probability of failure does not always coincide with that for a performance function, the proposed method directly uses the probability of failure to obtain the worst combination of interval variables when both interval and random variables are present. To calculate sensitivities of the constraints and probabilistic constraints with respect to interval variables by the sampling-based method, the behavior of interval variables at the worst case is defined by the Dirac delta function. Monte Carlo simulation is then applied to calculate the constraints and probabilistic constraints with the worst combination of interval variables, together with their sensitivities. A merit of using an MCS-based approach in the X-space is that it does not require gradients of performance functions or a transformation from X-space to U-space for reliability analysis; thus there is no approximation or restriction in calculating sensitivities of constraints or probabilistic constraints. Numerical results indicate that the proposed method can search for the worst-case probability of failure with both efficiency and accuracy, and that it can perform design optimization with a mixture of random and interval variables by utilizing the worst-case probability-of-failure search.
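The worst-case search at the heart of the method can be illustrated with a toy double loop (a grid over the interval variable on the outside, Monte Carlo over the random variables on the inside), keeping the interval value that maximizes the estimated probability of failure. The performance function, distributions and grid below are assumptions; the paper works with a surrogate model and a more refined search strategy.

```python
# Worst-case probability-of-failure search over an interval variable (toy sketch).
import numpy as np

rng = np.random.default_rng(7)

def g(x1, x2, d):
    """Toy performance function: failure when g < 0."""
    return x1 ** 2 * x2 / 20.0 + d - 0.6

def prob_failure(d, n_mc=200_000):
    """Monte Carlo estimate of P_F over the random variables for fixed d."""
    x1 = rng.normal(3.0, 0.3, n_mc)       # random design variables (assumed)
    x2 = rng.normal(2.0, 0.3, n_mc)
    return np.mean(g(x1, x2, d) < 0.0)

d_grid = np.linspace(-0.2, 0.2, 21)        # interval variable d in [-0.2, 0.2]
pf = np.array([prob_failure(d) for d in d_grid])
worst = int(np.argmax(pf))
print(f"worst-case interval value d = {d_grid[worst]:+.2f}, P_F = {pf[worst]:.4f}")
```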
ISBN (print): 9780791855898
This paper proposes a methodology for sampling-based design optimization in the presence of interval variables. Assuming that an accurate surrogate model is available, the proposed method first searches the worst combination of interval variables for constraints when only interval variables are present, or for probabilistic constraints when both interval and random variables are present. Because the worst combination of interval variables for the probability of failure does not always coincide with that for a performance function, the proposed method directly uses the probability of failure to obtain the worst combination of interval variables when both interval and random variables are present. To calculate sensitivities of constraints and probabilistic constraints with respect to interval variables by the sampling-based method, the behavior of interval variables at the worst case is defined by utilizing the Dirac delta function. Monte Carlo simulation is then applied to calculate constraints and probabilistic constraints with the worst combination of interval variables, together with their sensitivities. An important merit of the proposed method is that it does not require gradients of performance functions or a transformation from X-space to U-space for reliability analysis after the worst combination of interval variables is obtained; thus there is no approximation or restriction in calculating the sensitivities of constraints or probabilistic constraints. Numerical results indicate that the proposed method can search for the worst-case probability of failure with both efficiency and accuracy, and that it can perform design optimization with a mixture of random and interval variables by utilizing the worst-case probability-of-failure search.
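As a companion to the sketch after the journal version above, the following illustrates the "sensitivities from the same samples, without gradients of the performance function" idea using the standard score-function estimator of dP_F/dmu for a normal random variable, checked against the closed form for a one-dimensional toy limit state. The paper's Dirac-delta treatment of interval variables is not reproduced here.

```python
# Sampling-based sensitivity of P_F via the score function (toy 1-D example):
# dP_F/dmu = E[ 1{failure} * (x - mu) / sigma^2 ], reusing the P_F samples.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
mu, sigma, threshold, n_mc = 0.0, 1.0, 2.0, 2_000_000

x = rng.normal(mu, sigma, n_mc)
fail = x > threshold                              # toy limit state: X > t fails

pf_hat  = fail.mean()
dpf_dmu = np.mean(fail * (x - mu) / sigma**2)     # score-function estimate

pf_exact  = norm.sf((threshold - mu) / sigma)     # closed form for this toy case
dpf_exact = norm.pdf((threshold - mu) / sigma) / sigma

print(f"P_F: MC {pf_hat:.5f} vs exact {pf_exact:.5f}")
print(f"dP_F/dmu: MC {dpf_dmu:.5f} vs exact {dpf_exact:.5f}")
```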
We consider a Bayesian approach to the study of independence in a two-way contingency table which has been obtained from a two-stage cluster sampling design. If a procedure based on single-stage simple random sampling (rather than the appropriate cluster sampling) is used to test for independence, the p-value may be too small, resulting in a conclusion that the null hypothesis is false when it is, in fact, true. For many large complex surveys the Rao-Scott corrections to the standard chi-squared (or likelihood ratio) statistic provide appropriate inference. For smaller surveys, though, the Rao-Scott corrections may not be accurate, partly because the chi-squared test is inaccurate. In this paper, we use a hierarchical Bayesian model to convert the observed cluster samples to simple random samples. This provides surrogate samples which can be used to derive the distribution of the Bayes factor. We demonstrate the utility of our procedure using an example and also provide a simulation study which establishes our methodology as a viable alternative to the Rao-Scott approximations for relatively small two-stage cluster samples. We also show the additional insight gained by displaying the distribution of the Bayes factor rather than simply relying on a summary of the distribution. (C) 2013 Elsevier B.V. All rights reserved.
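The single-sample ingredient can be written in closed form: with Dirichlet priors under simple random sampling, the Bayes factor of independence against the saturated multinomial model reduces to ratios of multivariate beta functions (in the spirit of Gunel and Dickey). The sketch below computes this for made-up counts; the paper's contribution, generating surrogate simple random samples from the cluster data and examining the distribution of such a quantity, is not reproduced.

```python
# Closed-form Bayes factor for independence in a two-way table under SRS,
# with Dirichlet(1) priors (illustrative sketch; counts are hypothetical).
import numpy as np
from scipy.special import gammaln

def log_mvbeta(a):
    """log of the multivariate beta function B(a)."""
    a = np.asarray(a, dtype=float)
    return gammaln(a).sum() - gammaln(a.sum())

def log_bf_independence(counts, prior=1.0):
    """log Bayes factor of independence vs. the saturated multinomial model."""
    counts = np.asarray(counts, dtype=float)
    a_cell = np.full(counts.shape, prior)
    a_row  = np.full(counts.shape[0], prior)
    a_col  = np.full(counts.shape[1], prior)
    log_m_full  = log_mvbeta(a_cell + counts) - log_mvbeta(a_cell)
    log_m_indep = (log_mvbeta(a_row + counts.sum(axis=1)) - log_mvbeta(a_row)
                   + log_mvbeta(a_col + counts.sum(axis=0)) - log_mvbeta(a_col))
    return log_m_indep - log_m_full

table = np.array([[30, 20], [25, 35]])        # hypothetical 2x2 cell counts
print("log BF (independence vs. saturated):",
      round(float(log_bf_independence(table)), 3))
```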