In modern civilization, water distribution networks play a substantial role in preserving the desired living standard. A network comprises components such as pipes, pumps, and control valves to convey water from the supply source to the consumer withdrawal points. Among these elements, the optimal sizing of pipes is of particular importance because more than 70% of the project cost is incurred on it. Unfortunately, optimal pipe sizing falls in the category of nondeterministic polynomial-time hard (NP-hard) problems. Hence, research in this area remains active for two reasons, namely the importance and the complexity of the problem. The literature reveals that stochastic optimization algorithms are successful in exploring the combination of least-cost pipe diameters from the commercially available discrete diameter set, but at the expense of significant computational effort. The hybrid model PSO-GA presented in this paper aims to effectively utilize the local and global search capabilities of particle swarm optimization (PSO) and the genetic algorithm (GA), respectively, to reduce the computational burden. Analyses on different water distribution networks show that the proposed hybrid model is capable of exploring the optimal combination of pipe diameters with minimal computational effort. DOI: 10.1061/(ASCE)PS.1949-1204.0000113. (C) 2013 American Society of Civil Engineers.
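The abstract stays at the method level; as a rough illustration only, the sketch below shows one way a PSO-GA hybrid for discrete pipe sizing could be wired together. The cost function, diameter catalogue, and all parameter values are invented stand-ins, not the authors' implementation (the real objective would come from a hydraulic network solver with pressure-constraint penalties).

```python
# Hypothetical PSO-GA hybrid for discrete pipe sizing (illustrative only).
# Decision variables: one diameter index per pipe, chosen from a discrete set.
import random

DIAMETERS = [100, 150, 200, 250, 300, 400, 500]   # assumed catalogue, mm
N_PIPES, SWARM, ITERS = 8, 30, 200

def cost(design):
    """Stand-in for the real objective: pipe cost plus a penalty for
    hydraulic (pressure) constraint violations from a network solver."""
    return sum(d * d for d in design)              # placeholder only

def f(p):
    return cost([DIAMETERS[i] for i in p])

def clamp(i):
    return max(0, min(len(DIAMETERS) - 1, int(round(i))))

swarm = [[random.randrange(len(DIAMETERS)) for _ in range(N_PIPES)]
         for _ in range(SWARM)]
vel = [[0.0] * N_PIPES for _ in range(SWARM)]
pbest = [p[:] for p in swarm]
gbest = min(swarm, key=f)[:]

for it in range(ITERS):
    for k, p in enumerate(swarm):                  # PSO velocity/position step
        for j in range(N_PIPES):
            r1, r2 = random.random(), random.random()
            vel[k][j] = (0.7 * vel[k][j]
                         + 1.5 * r1 * (pbest[k][j] - p[j])
                         + 1.5 * r2 * (gbest[j] - p[j]))
            p[j] = clamp(p[j] + vel[k][j])
        if f(p) < f(pbest[k]):
            pbest[k] = p[:]
        if f(p) < f(gbest):
            gbest = p[:]
    if it % 10 == 0:                               # periodic GA step: crossover + mutation
        a, b = random.sample(range(SWARM), 2)
        cut = random.randrange(1, N_PIPES)
        child = swarm[a][:cut] + swarm[b][cut:]
        child[random.randrange(N_PIPES)] = random.randrange(len(DIAMETERS))
        worst = max(range(SWARM), key=lambda k: f(swarm[k]))
        if f(child) < f(swarm[worst]):
            swarm[worst] = child

print("best design (mm):", [DIAMETERS[i] for i in gbest])
```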
Compressed sensing (CS) is an important theory for sub-Nyquist sampling and recovery of compressible data. Recently, it has been extended to cope with the case where corruption of the CS data is modeled as impulsive noise. The new formulation, termed robust CS, combines robust statistics and CS into a single framework to suppress outliers in the CS recovery. To solve the newly formulated robust CS problem, a scheme was suggested that iteratively solves a number of CS problems, the solutions of which provably converge to the true robust CS solution. This scheme is, however, rather inefficient, as it has to use existing CS solvers as a proxy. To overcome the limitations of the original robust CS algorithm, we propose in this paper more computationally efficient algorithms that follow the latest advances in large-scale convex optimization for nonsmooth regularization. Furthermore, we extend the robust CS formulation to various settings, including additional affine constraints, an l1-norm loss function, mixed-norm regularization, and multitasking, so as to further improve robust CS, and we derive simple but effective algorithms to solve these extensions. We demonstrate that the new algorithms provide a substantial computational advantage over the original robust CS method on the original formulation, and effectively solve more sophisticated extensions that the original methods simply cannot handle. We demonstrate the usefulness of the extensions on several imaging tasks.
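As a minimal sketch of the robust-CS idea, the code below recovers a sparse vector from measurements with impulsive outliers by solving an l1-regularized problem with a Huber data term via proximal gradient descent (ISTA). The Huber loss stands in for the paper's robust loss, and all sizes and parameters are assumptions for the demo, not the authors' algorithm.

```python
# Illustrative robust CS recovery: l1-regularized Huber loss, solved by ISTA.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 80, 200, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
y[rng.choice(m, 5, replace=False)] += 10 * rng.standard_normal(5)  # impulsive outliers

lam, delta = 0.05, 0.5                      # sparsity weight, Huber threshold
t = 1.0 / np.linalg.norm(A, 2) ** 2         # step size = 1 / Lipschitz constant

def huber_grad(r):
    """Gradient of the Huber loss: linear for small residuals, clipped for outliers."""
    return np.clip(r, -delta, delta)

x = np.zeros(n)
for _ in range(500):
    g = A.T @ huber_grad(A @ x - y)         # gradient step on the robust data term
    z = x - t * g
    x = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)  # soft-threshold (prox of l1)

print("relative recovery error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The clipping in huber_grad is what makes the recovery robust: large residuals caused by outliers contribute only a bounded amount to the gradient, instead of dominating it as they would under a least-squares loss.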
Although HVDC transmission systems have been available since the mid-1950s, almost all installations worldwide are point-to-point systems. In the past, the lower reliability and higher cost of power electronic converters, together with complex controls and the need for fast telecommunication links, may have prevented the construction of multiterminal DC (MTDC) networks. The introduction of voltage-source converters (VSC) for transmission purposes has renewed interest in the development of supergrids for the integration of remote renewable sources, such as offshore wind. The main focus of the present work is on the control and operation of MTDC networks for the integration of offshore wind energy systems. After a brief introduction, this paper proposes a classification of MTDC networks. The most widely used control structures for VSC-HVDC, currently recognized as the best candidate for the development of supergrids, are presented, followed by a discussion of the merits and shortcomings of available DC voltage control methods. Subsequently, a novel control strategy with distributed slack nodes is proposed by means of a DC optimal power flow. The distributed voltage control (DVC) strategy is numerically illustrated by loss minimization in an MTDC network. Finally, dynamic simulations are performed to demonstrate the benefits of the DVC strategy.
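To make the "DC optimal power flow with distributed slack" idea concrete, the toy below minimizes line losses on an invented 4-node DC grid: wind and load injections are fixed, and the two slack converters share the remaining balance through the optimized node voltages. This is a generic sketch of the concept, not the paper's formulation.

```python
# Toy DC optimal power flow: choose node voltages so fixed injections are met
# while line losses are minimized across distributed slack nodes.
import numpy as np
from scipy.optimize import minimize

# 4-node grid: (i, j, conductance g) in p.u.
lines = [(0, 1, 5.0), (1, 2, 4.0), (2, 3, 5.0), (0, 3, 3.0)]
P_fixed = {1: 0.8, 2: -0.5}          # node 1: wind injection; node 2: load (p.u.)
slack = [0, 3]                       # distributed slack converters

def injections(V):
    P = np.zeros(4)
    for i, j, g in lines:
        P[i] += g * V[i] * (V[i] - V[j])
        P[j] += g * V[j] * (V[j] - V[i])
    return P

def losses(V):
    return sum(g * (V[i] - V[j]) ** 2 for i, j, g in lines)

cons = [{"type": "eq", "fun": lambda V, i=i, p=p: injections(V)[i] - p}
        for i, p in P_fixed.items()]
res = minimize(losses, np.ones(4), constraints=cons, bounds=[(0.9, 1.1)] * 4)
V = res.x
print("voltages (p.u.):", np.round(V, 4))
print("slack injections:", np.round(injections(V)[slack], 4))
print("losses:", round(losses(V), 5))
```

Because the loss objective is spread over the slack injections rather than assigned to a single slack bus, the optimizer is free to split the balancing power between converters 0 and 3, which is the essence of a distributed slack.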
Pyrolysis models used in Computational-Fluid-Dynamics-based fire models are typically semi-empirical, include a large number of unknown parameters (i.e., material properties and parameters of the chemical reactions), and require a careful calibration phase. During the calibration phase, the pyrolysis model coefficients are determined by comparison with reference experimental data, for instance data taken from thermo-gravimetric and/or bench-scale experiments. The present study examines the predictive capability of pyrolysis models developed via a calibrated semi-empirical approach. The study first introduces six different semi-empirical models developed to simulate the pyrolysis of polyvinyl chloride (PVC). All of the models are similar and use a global one-step Arrhenius-type pyrolysis reaction. They differ in the modeling assumptions made, which affect the number of unknown model parameters, and/or in the optimization technique used to determine the unknown parameters (a genetic algorithm or a stochastic hill-climbing algorithm). The six models are calibrated and, by design, provide similar results under conditions that are close to those of the reference cone calorimeter experiments. The study then evaluates the predictive capability of the six pyrolysis models through a series of numerical experiments, including several cone calorimeter tests and one vertical upward flame spread problem; these configurations feature conditions that are significantly different from the reference conditions used in the model calibration phase. It is found that predictions from the PVC pyrolysis models start to diverge for conditions that lie outside of the calibration range. Most notably, the models lead to conflicting results when applied to the flame spread problem. These results suggest that the domain of validity of semi-empirical pyrolysis models is limited to the conditions that were used during model calibration, and that extrapolation to non-calibrated conditions should be treated with caution.
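The calibration loop described here (fit the parameters of a one-step Arrhenius reaction to reference thermo-gravimetric data with a stochastic optimizer) can be sketched in a few lines. The example below uses synthetic TGA data and scipy's differential_evolution as a GA-like stand-in; the heating program, bounds, and "true" parameters are all invented for illustration.

```python
# Illustrative calibration of a one-step Arrhenius pyrolysis model to TGA data.
import numpy as np
from scipy.optimize import differential_evolution

R, beta = 8.314, 10.0 / 60.0            # gas constant; heating rate 10 K/min in K/s
t = np.linspace(0.0, 3000.0, 600)       # time grid, s
T = 300.0 + beta * t                    # linear heating program, K

def mass_fraction(logA, E):
    """Integrate dm/dt = -A * m * exp(-E / (R*T)) with explicit Euler."""
    m = np.empty_like(t)
    m[0] = 1.0
    dt = t[1] - t[0]
    A = 10.0 ** logA
    for i in range(1, len(t)):
        k = A * np.exp(-E / (R * T[i - 1]))
        m[i] = max(0.0, m[i - 1] * (1.0 - dt * k))
    return m

# Synthetic "experimental" TGA curve; its parameters are pretended unknown.
m_ref = mass_fraction(8.0, 120e3)

def misfit(p):
    return np.sum((mass_fraction(*p) - m_ref) ** 2)

res = differential_evolution(misfit, bounds=[(5.0, 11.0), (80e3, 160e3)],
                             seed=1, maxiter=100)
print("fitted log10(A), E:", res.x)     # should approach (8.0, 120e3)
```

The study's central caveat maps directly onto this sketch: a (logA, E) pair fitted at one heating rate reproduces that curve well, but nothing forces it to extrapolate correctly to thermal conditions far outside the calibration data.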
Energy consumption is one of the main concerns in mobile ad hoc networks (MANETs). The lifetime of their devices depends strongly on energy consumption, as they rely on batteries. The adaptive enhanced distance-based broadcasting algorithm (AEDB) is a message dissemination protocol for MANETs that uses cross-layer technology to greatly reduce the energy consumed by devices in the process, while still providing competitive performance in terms of coverage and time. We use two different multi-objective evolutionary algorithms to optimize the protocol on three network densities, and we evaluate the scalability of the best AEDB configurations found on larger networks and different densities.
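The core of any multi-objective optimization of a protocol configuration is Pareto dominance over the competing objectives (here: energy, coverage, time). The sketch below shows only that selection step on invented candidate configurations; the evaluate() stub and the two-parameter encoding are illustrative assumptions, not AEDB's actual parameters or the paper's simulator.

```python
# Minimal sketch of multi-objective selection: keep the non-dominated front.
import random

def evaluate(cfg):
    """Stand-in for a network simulation returning objectives to minimize:
    (energy, -coverage, dissemination time)."""
    threshold, delay = cfg
    energy = 1.0 / (threshold + 0.1) + random.uniform(0, 0.05)
    coverage = min(1.0, 0.6 + threshold * 0.3 - delay * 0.1)
    time_ = delay * 2.0 + random.uniform(0, 0.1)
    return (energy, -coverage, time_)

def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

random.seed(3)
configs = [(random.uniform(0.1, 1.0), random.uniform(0.0, 1.0)) for _ in range(50)]
scored = [(cfg, evaluate(cfg)) for cfg in configs]
front = [c for c, s in scored
         if not any(dominates(s2, s) for _, s2 in scored if s2 != s)]
print(f"{len(front)} non-dominated configurations out of {len(configs)}")
```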
Hepatitis is usually caused by a viral infection or by metabolic diseases. Hepatitis B virus (HBV) infection is among the most common causes of hepatitis and can result in serious liver disease. Several dynamic models have been developed to describe HBV infection and antiviral therapy mathematically. In addition, different control strategies have been reported in the literature to deal with the optimal antiviral therapy problem for infectious diseases. In this paper, a set of optimized closed-loop fuzzy controllers is employed for optimal treatment of basic HBV infection. To optimize the proposed scheme, five modified and modern optimization algorithms are investigated. After the controller is designed, some parameters of the HBV infection model are assumed to be unknown, and the robustness of the optimized controller is studied. Experimental results show that the closed-loop fuzzy controller optimized with the covariance matrix adaptation evolution strategy (CMA-ES) has the best performance in terms of the total cost of an objective function defined by maximization of uninfected target cells, minimization of free HBV, and minimization of drug usage. In addition, the execution time of this optimization algorithm is only 8% more than that of the imperialist competitive algorithm, the investigated algorithm with the best convergence speed.
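A rough sketch of the overall loop (simulate the infection model under a parameterized feedback law, score it with the three-term cost, and evolve the controller parameters) is given below. A simplified (mu, lambda) evolution strategy stands in for full CMA-ES, a saturated proportional law stands in for the fuzzy controller, and all model constants and cost weights are illustrative assumptions.

```python
# Sketch of evolutionary tuning of a treatment controller on a basic HBV model
# (uninfected cells x, infected cells y, free virus v; drug efficacy u in [0,1]).
import numpy as np

s_, d_, beta, a_, k_, c_ = 10.0, 0.01, 4e-4, 0.5, 6.0, 3.0  # assumed constants

def cost(gains):
    kp, vref = gains
    x, y, v = 1000.0, 10.0, 50.0
    dt, J = 0.02, 0.0
    for _ in range(5000):                        # 100 time units, explicit Euler
        u = min(1.0, max(0.0, kp * (v - vref)))  # stand-in for the fuzzy law
        dx = s_ - d_ * x - beta * x * v
        dy = beta * x * v - a_ * y
        dv = k_ * y * (1.0 - u) - c_ * v
        x, y, v = x + dt * dx, y + dt * dy, v + dt * dv
        J += dt * (-1e-3 * x + 1e-2 * v + 0.1 * u)  # reward cells, penalize virus, drug
    return J

rng = np.random.default_rng(7)
mu, lam, sigma = 4, 12, 0.2
parents = rng.uniform(0.0, 1.0, size=(mu, 2))
for gen in range(20):
    kids = np.array([parents[rng.integers(mu)] + sigma * rng.standard_normal(2)
                     for _ in range(lam)])
    kids = np.clip(kids, 0.0, 5.0)
    parents = kids[np.argsort([cost(kid) for kid in kids])][:mu]
    sigma *= 0.95                                # simple step-size decay
print("best gains (kp, vref):", parents[0])
```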
When search methods are being designed, it is very important to know which parameters have the greatest influence on the behaviour and performance of the algorithm. To this end, algorithm parameters are commonly calibrated by means of either theoretical analysis or intensive experimentation. However, given the importance of the parameters and their effect on the results, appropriate parameter values should be sought using robust tools that determine how the parameters operate and influence the results. When undertaking a detailed statistical analysis of the influence of each parameter, the designer should pay attention mostly to the parameters that are statistically significant. In this paper the ANOVA (ANalysis Of VAriance) method is used to carry out an exhaustive analysis of an evolutionary algorithm and the different parameters it requires. Following this idea, the significance and relative importance of the parameters with regard to the obtained results, as well as suitable values for each of them, were determined using ANOVA and post-hoc Tukey's Honestly Significant Difference tests on four well-known function optimization problems. Through this statistical study we have verified the adequacy of the parameter values available in the bibliography using parametric hypothesis tests.
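The described workflow (ANOVA over repeated runs per parameter level, then Tukey's HSD to see which levels differ) is straightforward to reproduce. Below is a minimal sketch with randomly generated stand-in data for repeated optimization runs; the parameter name, levels, and means are invented.

```python
# One-way ANOVA over an algorithm parameter, followed by Tukey's HSD.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(42)
# Final fitness of 20 repeated runs per population-size level (lower is better).
levels = {25: 1.00, 50: 0.80, 100: 0.78, 200: 0.79}   # assumed true means
runs = {p: rng.normal(m, 0.1, size=20) for p, m in levels.items()}

F, pval = f_oneway(*runs.values())
print(f"ANOVA: F = {F:.2f}, p = {pval:.2e}")          # is the parameter significant?

values = np.concatenate(list(runs.values()))
groups = np.repeat([str(p) for p in runs], 20)
print(pairwise_tukeyhsd(values, groups))              # which levels differ pairwise?
```

In a case like the one simulated here, ANOVA reports that population size matters overall, while Tukey's test reveals that only the smallest level differs; the remaining levels are statistically indistinguishable, which is exactly the kind of finding the paper uses to recommend parameter values.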
Motivated by various problems such as distributed computation, multiagent coordination, wireless communication, and online search algorithms, a time-varying optimal coordinated information load balancing problem is solved by means of a sequential, two-stage, optimal semistable control approach. Technically, we formulate this information load balancing problem as a linear, time-varying quadratic semistabilization problem with time-dependent iterative algorithms for information load balancing in peer-to-peer networks. To solve this problem, we propose a novel, sequential two-stage design. The first stage guarantees the convergence of the optimal policy, while the second stage derives explicit recursive formulas for the optimal strategies under a finite set of convergence-guaranteed candidate policies.
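As background for the kind of iterative update such schemes build on, the sketch below runs a standard Laplacian-based load-balancing iteration on an invented 5-peer ring: each peer repeatedly shifts load toward less-loaded neighbours, and the state converges to the balanced average. This is generic consensus-style balancing, not the paper's semistable controller.

```python
# Iterative load balancing on a peer graph via the graph Laplacian.
import numpy as np

# Ring of 5 peers: adjacency A, Laplacian L = D - A.
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

x = np.array([10.0, 0.0, 4.0, 0.0, 1.0])     # initial loads
eps = 0.2                                    # gain; needs eps * lambda_max(L) < 2
for _ in range(100):
    x = x - eps * (L @ x)                    # shift load along each edge

print("balanced loads:", np.round(x, 3))     # approaches the average, 3.0
```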
Artificial neural networks (ANNs) have become a very popular tool in hydrology, especially in rainfall-runoff modelling. However, a number of issues should be addressed to apply this technique to a particular problem in an efficient way, including selection of the network type, its architecture, a proper optimization algorithm, and a method to deal with overfitting of the data. The present paper addresses the last, rarely considered issue, namely a comparison of methods to prevent multi-layer perceptron neural networks from overfitting the training data in the case of daily catchment runoff modelling. Among the methods to avoid overfitting, early stopping, noise injection, and weight decay have been known for about two decades; however, only the first is frequently applied in practice. Recently a new methodology called the optimized approximation algorithm has been proposed in the literature. Overfitting of the training data leads to deterioration of the generalization properties of the model and results in untrustworthy performance when the model is applied to novel measurements. Hence the purpose of methods to avoid overfitting is somewhat contradictory to the goal of optimization algorithms, which aim at finding the best possible solution in parameter space according to a pre-defined objective function and the available data. Moreover, different optimization algorithms may perform better for simpler or larger ANN architectures. This suggests the importance of properly coupling optimization algorithms, ANN architectures, and methods to avoid overfitting of real-world data, an issue that is also studied in detail in the present paper. The study is performed for the Annapolis River catchment, characterized by significant seasonal changes in runoff, rapid floods during winter and spring, moderately dry summers, severe winters with snowfall, snow melting, frequent freeze and thaw, and the presence of river ice. The present paper shows that the elaborated noise injection method…
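The three classic devices the paper compares can all be shown in one tiny training loop: weight decay adds a parameter-shrinking term to the gradient, noise injection perturbs the inputs each epoch, and early stopping keeps the weights with the best validation error. The sketch below is a generic single-hidden-layer MLP on invented data, not the paper's runoff model.

```python
# Weight decay + input-noise injection + early stopping in one training loop.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 3))
y = np.sin(X.sum(axis=1)) + 0.1 * rng.standard_normal(200)
Xtr, ytr, Xval, yval = X[:150], y[:150], X[150:], y[150:]

H, lr, decay, noise = 10, 0.05, 1e-4, 0.05
W1 = rng.standard_normal((3, H)) * 0.5; b1 = np.zeros(H)
W2 = rng.standard_normal(H) * 0.5;      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

best_val, best, patience = np.inf, None, 0
for epoch in range(2000):
    Xn = Xtr + noise * rng.standard_normal(Xtr.shape)   # noise injection
    h, pred = forward(Xn)
    err = pred - ytr
    gW2 = h.T @ err / len(ytr) + decay * W2             # weight decay term
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)               # backprop through tanh
    gW1 = Xn.T @ dh / len(ytr) + decay * W1
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    val = np.mean((forward(Xval)[1] - yval) ** 2)
    if val < best_val:                                  # early stopping bookkeeping
        best_val, best, patience = val, (W1.copy(), b1.copy(), W2.copy(), b2), 0
    else:
        patience += 1
        if patience >= 100:                             # stop when validation stalls
            break

W1, b1, W2, b2 = best
print(f"stopped at epoch {epoch}, best validation MSE {best_val:.4f}")
```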
A relevant logistic issue in the organization of a fair is to determine how stands have to be placed in the exhibition space so as to satisfy all constraints on security, ease of access, services, and so on, while maximizing the revenue coming from the exhibitors. We consider in particular the problem of allocating the maximum number of stands while satisfying all the constraints required by practical implementations. We examine a number of real-world cases and show how basic mathematical programming models can be improved to handle specific requests from the organizers. We report the solutions obtained through an original decision support system that embeds a number of algorithms to solve the various cases by reduction to one or more linear programs.
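A toy version of such a stand-allocation program is easy to state as an integer linear program: binary variables assign stands to locations, each stand is placed at most once, each location has a capacity, and the objective counts placements. The data below are invented and the model is far simpler than the paper's, but it shows the reduction to a (mixed-integer) linear program.

```python
# Toy stand-allocation ILP solved with scipy's milp.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

S, L = 6, 4                                  # stands, candidate locations
idx = lambda s, l: s * L + l                 # flatten (stand, location) to 1-D
n = S * L
c = -np.ones(n)                              # maximize placements => minimize -sum(x)

A = []
for s in range(S):                           # each stand placed at most once
    a = np.zeros(n); a[[idx(s, l) for l in range(L)]] = 1; A.append(a)
for l in range(L):                           # each location holds at most 2 stands
    a = np.zeros(n); a[[idx(s, l) for s in range(S)]] = 1; A.append(a)
ub = np.array([1] * S + [2] * L)

res = milp(c, constraints=LinearConstraint(np.array(A), ub=ub),
           integrality=np.ones(n), bounds=Bounds(0, 1))
x = np.round(res.x).reshape(S, L)
print("stands placed:", int(-res.fun))
for s, l in zip(*np.nonzero(x)):
    print(f"stand {s} -> location {l}")
```

Organizer-specific requests of the kind the paper mentions (adjacency, security buffers, service access) would enter this model as additional linear constraints over the same binary variables.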