ISBN:
(Print) 9781538622131; 9781538622124
Distribution system analysis with ever-increasing numbers of distributed energy resources (DER) requires quasi-static time-series (QSTS) analysis to capture the time-varying and time-dependent aspects of the system. Previous literature has demonstrated the benefits of QSTS, but little information is available on the requirements and standards for performing QSTS simulations. This paper provides a novel analysis of the QSTS requirements for input-data time resolution, simulation time-step resolution, and simulation length. Detailed simulations quantify the specific errors introduced by not performing yearlong, high-resolution QSTS simulations.
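The error quantification this abstract describes can be illustrated with a toy sketch (my own construction, not the paper's method or data): compare a fine-resolution input profile against a zero-order hold at a coarser simulation time step, which stands in for running QSTS with a larger time step.

```python
import numpy as np

# Illustrative sketch, not the paper's method or data: quantify the error
# introduced when a fine-resolution input profile is replaced by a coarser
# zero-order hold, as a stand-in for a larger QSTS simulation time step.

def downsample_error(signal, step):
    """Max absolute error when `signal` is replaced by a zero-order hold
    of every `step`-th sample."""
    held = np.repeat(signal[::step], step)[:len(signal)]
    return float(np.max(np.abs(signal - held)))

# one synthetic day of "feeder voltage" at 1 s resolution (assumed shape)
t = np.arange(24 * 3600)
voltage = 1.0 + 0.02 * np.sin(2 * np.pi * t / 86400)

err_1min = downsample_error(voltage, 60)     # 1-minute time step
err_15min = downsample_error(voltage, 900)   # 15-minute time step
print(err_1min, err_15min)
```

For a smooth profile like this, the coarser hold produces a strictly larger worst-case error, mirroring the paper's point that low-resolution simulation hides fast variation.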
An ever-increasing need for improving accuracy entails more involved and detailed features, inevitably leading to larger-scale dynamical systems [1]. To overcome this problem, efficient numerical methods rely heavily on model reduction. One of the main approaches to model reduction of both linear and nonlinear systems is interpolation. The Loewner framework is a direct data-driven method able to identify and reduce models derived directly from measurements. For measured data in the frequency domain, the Loewner framework is well established in the linear case [2] and has already been extended to the nonlinear case [6]. In the case of time-domain data, the Loewner framework has already been applied to approximating linear models [3]. In this study, an algorithm which uses time-domain data for nonlinear (bilinear) system reduction and identification is presented.
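The frequency-domain construction the abstract refers to can be sketched in a few lines: the Loewner matrix built from "left" and "right" samples of a transfer function, whose rank reveals the order of the underlying system. The toy transfer function and sample points below are illustrative choices, not the paper's data.

```python
import numpy as np

# Basic Loewner-matrix construction from frequency-domain samples of a
# transfer function H(s); H and the sample points are illustrative only.

def loewner(mu, v, lam, w):
    """Loewner matrix L[i, j] = (v[i] - w[j]) / (mu[i] - lam[j]) built
    from left data (mu, v) and right data (lam, w)."""
    return (v[:, None] - w[None, :]) / (mu[:, None] - lam[None, :])

# toy first-order transfer function, sampled on disjoint point sets
H = lambda s: 1.0 / (s + 1.0)
mu = 1j * np.array([1.0, 3.0, 5.0])     # left interpolation points
lam = 1j * np.array([2.0, 4.0, 6.0])    # right interpolation points
L = loewner(mu, H(mu), lam, H(lam))

# For an order-1 rational H, the Loewner matrix is (numerically) rank 1,
# which is how the framework detects the reduced model order.
print(np.linalg.matrix_rank(L))
```

In the linear frequency-domain setting this rank test is the entry point for reduction; the paper's contribution is moving the machinery to time-domain data and bilinear systems.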
Businesses in various fields use online communication applications to gather data and information from local and global sources. The gathered data may be sensitive, such as financial and business-development information. Hackers and online thieves try to steal valuable data, e.g. credit card numbers. Organizations therefore look for secure online channels in order to transfer their data efficiently and avoid data theft. One of the most widely applied methods developed to secure online transferred data is cryptography, which transforms the original data or information into an encrypted form. Cryptography still has drawbacks, such as the theft and decryption of the original texts using automated decryption attacks. The main aim of this research is to improve the security level of cryptography using a supporting method, steganography. Steganography is the process of hiding data or information in media files such as video, image, and audio files. The methodology of this paper comprises four stages: (1) encrypt the original texts using the RSA algorithm, (2) hide the encrypted texts in image files, (3) extract the encrypted texts from the image files, and (4) decrypt the original texts using the RSA decryption key. This is expected to improve the security level of online transferred textual data. The performance of the final results will be evaluated by comparing the quality of the image files before and after hiding the data in them. The quality of the original and stego image files needs to be identical or nearly so in order to maximize the difficulty of detecting that data is hidden in these files.
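The four stages can be sketched end to end with toy parameters (a minimal sketch, not the paper's implementation: the tiny textbook RSA key p=61, q=53 and the flat byte array standing in for an image file are assumptions for demonstration only, and per-character RSA is insecure in practice):

```python
# Stage sketch: (1) RSA-encrypt a message, (2) hide the ciphertext bits in
# the least significant bits of "pixel" bytes, (3) extract, (4) decrypt.

p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)       # n = 3233 < 2**12
e = 17
d = pow(e, -1, phi)                     # private exponent (Python 3.8+)

def rsa_encrypt(msg):   return [pow(ord(c), e, n) for c in msg]
def rsa_decrypt(nums):  return "".join(chr(pow(m, d, n)) for m in nums)

def embed(pixels, nums, width=12):
    """Store each ciphertext number as `width` bits in the pixel LSBs."""
    bits = [(m >> i) & 1 for m in nums for i in range(width)]
    out = bytearray(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & 0xFE) | b    # overwrite only the LSB
    return bytes(out)

def extract(pixels, count, width=12):
    nums = []
    for j in range(count):
        bits = [pixels[j * width + i] & 1 for i in range(width)]
        nums.append(sum(b << i for i, b in enumerate(bits)))
    return nums

cover = bytes(range(256)) * 20          # stand-in for image pixel data
cipher = rsa_encrypt("secret")
stego = embed(cover, cipher)
recovered = rsa_decrypt(extract(stego, len(cipher)))
print(recovered)
```

Because only the least significant bit of each byte changes, every pixel value moves by at most 1, which is why the stego image quality stays close to the original, as the evaluation in the abstract requires.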
This paper develops a coarse-to-fine framework for single-image super-resolution (SR) reconstruction. The coarse-to-fine approach achieves high-quality SR recovery based on the complementary properties of both example...
ISBN:
(Print) 9781538639337
Among the different components of urban mobility, urban freight transport is usually considered the least sustainable. Limited traffic infrastructure and increasing demand in dense urban regions lead to frequent delivery runs with smaller freight vehicles. This increases traffic in urban areas and has negative impacts on the quality of life of urban populations. Data-driven optimizations are essential to better utilize existing urban transport infrastructure and to reduce the negative effects of freight deliveries on cities. However, there is limited work and data-driven research on urban delivery areas and freight transportation networks. In this paper, we collect and analyse data on urban freight deliveries and parking areas towards an optimized urban freight transportation system. Using a new check-in-based mobile parking system for freight vehicles, we aim to understand and optimize freight distribution processes. We explore the relationship between areas' availability patterns and the underlying traffic behaviour in order to understand trends in urban freight transport. By applying the detected patterns we predict the availability of loading/unloading areas, thus opening up new possibilities for delivery route planning and better management of freight transport infrastructure.
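The pattern-based prediction idea can be sketched as follows (a hedged toy sketch: the record format, the loading-area capacity, and the historical-average forecast are my assumptions, not the paper's actual data model or predictor):

```python
from collections import defaultdict

# From historical check-in/check-out records of one loading area, build an
# hourly occupancy profile and forecast availability as capacity minus the
# historical mean occupancy at that hour.

def hourly_profile(records, n_days):
    """records: (checkin_hour, checkout_hour) pairs collected over n_days.
    Returns dict: hour of day -> average number of occupying vehicles."""
    counts = defaultdict(float)
    for start, end in records:
        for h in range(start, end):
            counts[h] += 1.0
    return {h: c / n_days for h, c in counts.items()}

def predict_available(profile, capacity, hour):
    """Forecast free slots at `hour` as capacity minus mean occupancy."""
    return max(0.0, capacity - profile.get(hour, 0.0))

# two toy days of check-ins at one area with an assumed capacity of 3
records = [(8, 10), (9, 11), (8, 9),    # day 1
           (8, 11), (10, 12)]           # day 2
profile = hourly_profile(records, n_days=2)
print(predict_available(profile, capacity=3, hour=9))
```

A route planner could then rank candidate loading/unloading areas by this forecast for the planned arrival hour.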
Background: The COVID-19 pandemic highlighted gaps in health surveillance systems, disease prevention, and treatment globally. Among the many factors that might have led to these gaps is the issue of the financing of national health systems, especially in low-income and middle-income countries (LMICs), as well as a robust global system for pandemic preparedness. We aimed to provide a comparative assessment of global health spending at the onset of the pandemic; characterise the amount of development assistance for pandemic preparedness and response disbursed in the first 2 years of the COVID-19 pandemic; and examine expectations for future health spending and put into context the expected need for investment in pandemic preparedness. Methods: In this analysis of global health spending between 1990 and 2021, and prediction from 2021 to 2026, we estimated four sources of health spending: development assistance for health (DAH), government spending, out-of-pocket spending, and prepaid private spending across 204 countries and territories. We used the Organisation for Economic Co-operation and Development (OECD)'s Creditor Reporting System (CRS) and the WHO Global Health Expenditure Database (GHED) to estimate spending. We estimated development assistance for general health, COVID-19 response, and pandemic preparedness and response using a keyword search. Health spending estimates were combined with estimates of resources needed for pandemic prevention and preparedness to analyse future health spending patterns, relative to need. Findings: In 2019, at the onset of the COVID-19 pandemic, US$9·2 trillion (95% uncertainty interval [UI] 9·1–9·3) was spent on health worldwide. We found great disparities in the amount of resources devoted to health, with high-income countries spending $7·3 trillion (95% UI 7·2–7·4) in 2019; 293·7 times the $24·8 billion (95% UI 24·3–25·3) spent by low-income countries in 2019. That same year, $43·1 billion in development assistance was provided
The adaptation of sequential algorithms for High Performance Computing (HPC) systems is determined by a tradeoff between algorithmic effectiveness (software) and communication frequency (hardware) of the parallel implementation (efficiency). To gain a better understanding of this correlation, we define simple models for both software and hardware in order to dynamically find the best mapping parameters for executing the algorithm on the parallel system. To evaluate our method we consider population-based algorithms such as the Particle Swarm Optimization (PSO) algorithm for the solution of optimization problems. Different goals, such as the best solution quality for a given execution time or the best execution time to find the optimum, are defined by the user. Our method enables us to find the best mapping parameters, which then results in an efficient and effective parallel implementation that achieves the user-defined goals on a High Performance Computing Cluster (HPCC).
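The mapping tradeoff can be made concrete with a minimal sequential PSO sketch (my own construction with assumed coefficients, not the authors' parallel implementation): under a fixed budget of function evaluations, standing in for execution time on the cluster, the swarm-size/iteration split is one tunable mapping parameter, and two mappings with the same budget can be compared against a user-defined goal.

```python
import random

# Minimal PSO with standard inertia/attraction coefficients (0.7, 1.4,
# 1.4); the budget and the test function are illustrative assumptions.

def pso(f, dim, swarm, iters, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pval = [f(p) for p in pos]
    g = min(range(swarm), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.4 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pval[i]:                 # update personal best
                pval[i], pbest[i] = v, pos[i][:]
                if v < gval:                # update global best
                    gval, gbest = v, pos[i][:]
    return gval

sphere = lambda x: sum(c * c for c in x)
budget = 2000                               # fixed evaluation budget
# same budget, two different swarm-size/iteration mappings
few_big = pso(sphere, dim=4, swarm=40, iters=budget // 40)
many_small = pso(sphere, dim=4, swarm=10, iters=budget // 10)
print(few_big, many_small)
```

In a parallel setting the swarm size also sets how much work each synchronization step contains, so the same knob couples solution quality to communication frequency, which is exactly the correlation the abstract's models are meant to capture.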