For several decades, the power system protection relay has undergone many important changes, from the purely electromechanical type to mixed electronic-electromechanical designs, then to fully static and now to fully numerical relays based on microprocessors. Similar changes can be seen in the transformer protection area. Numerical technology has advanced to the point where protection systems now integrate control functions alongside the protection functions in a single device; such systems are also called multifunctional protection systems (MPS). The paper reviews the main features of the newly developed transformer protection and control systems, their positive impact on fault clearance times, and the development trends in this domain.
In this study we propose a series of test parameters for random number generators in order to analyze the uniformity and the degree of correlation of the generated numbers. The analysis was applied to a series of standard generators used in the libraries of common programming languages such as C/C++, Java and Fortran. Besides some standard tests, new parameters that provide a more adequate description of uniformity and correlation are proposed. The analysis attempts to explain the different results obtained in simulations performed with different types of generators and to outline the differing degrees of uniformity and correlation of the numbers these generators produce. The uniformity and correlation properties were analyzed both independently and jointly, through integrated parameters. The analysis revealed that a large storage and representation size, or a long period, does not by itself guarantee both high uniformity and low correlation for packets of random numbers of different sizes.
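As a minimal sketch of the kind of tests the abstract describes, the snippet below applies a chi-square uniformity statistic and a lag-1 serial correlation to Python's own Mersenne Twister generator. The bin count, sample size, and the generator under test are illustrative choices of ours, not the paper's parameters.

```python
import random
import statistics

def uniformity_chi2(samples, bins=10):
    """Chi-square statistic of the samples against a uniform [0, 1)
    distribution; values near the bin count indicate good uniformity."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

def lag1_autocorrelation(samples):
    """Lag-1 serial correlation; values near 0 suggest independence
    of consecutive outputs."""
    mean = statistics.fmean(samples)
    num = sum((samples[i] - mean) * (samples[i + 1] - mean)
              for i in range(len(samples) - 1))
    den = sum((x - mean) ** 2 for x in samples)
    return num / den

random.seed(42)
data = [random.random() for _ in range(100_000)]
print(uniformity_chi2(data))        # small relative to the sample size
print(lag1_autocorrelation(data))   # close to 0
```

Both statistics can be evaluated on number packets of different sizes, as the abstract suggests, by slicing `data` into blocks before testing.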
This study attempts to establish methods for characterizing the complexity of ordinal data through information and entropy parameters. To this end, methods for measuring the complexity of data with similar statistical characteristics were examined, and the parameters that can distinguish between such data were established. The analysis was applied to three data sets with identical overall statistical characteristics but with different orderings of their elements: incremental, random, and oscillatory-incremental. The analysis highlighted that it is necessary to include information and entropy parameters for the different variation orders of data elements. In this regard, new parameters have been proposed, based on information and entropy expressions, that adequately describe the complexity of ordinal data.
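One standard order-sensitive entropy measure that makes the abstract's point concrete is permutation entropy; the sketch below is our illustration, not necessarily the paper's proposed parameters. It compares three sequences containing exactly the same values in incremental, oscillatory-incremental, and random order, which classical value-based statistics cannot distinguish.

```python
import math
import random

def permutation_entropy(series, order=3):
    """Shannon entropy of the ordinal patterns of length `order`,
    normalized to [0, 1]. Only the relative order of elements
    matters, not their values."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / math.log2(math.factorial(order))

incremental = list(range(1000))
oscillating = [i + 1 if i % 2 == 0 else i - 1 for i in range(1000)]  # 1,0,3,2,...
shuffled = incremental[:]
random.seed(0)
random.shuffle(shuffled)  # same multiset of values, random order

print(permutation_entropy(incremental))  # 0.0: a single ordinal pattern
print(permutation_entropy(oscillating))  # intermediate complexity
print(permutation_entropy(shuffled))     # near 1: all patterns equally likely
```

All three lists share the same mean, variance, and histogram, yet the order-aware parameter separates them cleanly.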
To engineer the factory of the future, the paper argues for a reference model that is not necessarily restricted to the control component but integrates the physical and human components as well. This reflects the real need to accommodate the latest achievements in factory automation, where the human is no longer merely playing a simple and clear role inside the control loop but is becoming a composite factor in a highly automated system (“man in the mesh”). The concept is demonstrated by instantiating the anthropocentric cyber-physical reference architecture for smart factories (ACPA4SF) in a concrete case study that accommodates ongoing research from the SmartFactory KL facility (e.g., augmented reality, mobile interaction technology, virtual training of human operators).
Accurate prediction of the power loss distribution within an electrical device is highly desirable, as it allows thermal behavior to be evaluated at an early design stage. Three-dimensional (3-D) and two-dimensional (2-D) finite element analysis (FEA) is applied to calculate dc and ac copper losses in the armature winding for high-frequency sinusoidal currents. The main goal of this paper is to show the effect of the end winding on copper losses. Copper losses at high frequency are dominated by the skin and proximity effects. A time-varying current tends to concentrate near the surfaces of conductors, and at very high frequency the current is restricted to a very thin layer near the conductor surface; this phenomenon of nonuniform distribution of time-varying currents in conductors is known as the skin effect. The term proximity effect refers to the influence of alternating current in one conductor on the current distribution in another, nearby conductor. To evaluate the ac copper loss within the analyzed machine, a simplified approach using one segment of the stator core is adopted. To demonstrate the enhanced copper loss under ac operation, the dc and ac resistances are calculated. The ac-to-dc resistance ratio is strongly dependent on frequency, temperature, slot shape and slot-opening size.
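The frequency dependence described above can be sketched with the textbook skin-depth formula and a crude annulus approximation for a round conductor. This is an assumption-laden back-of-envelope model, not the paper's FEA: it ignores the proximity effect, slot geometry, and the end winding entirely, and uses nominal copper properties at 20 °C.

```python
import math

# Assumed copper properties at 20 °C (temperature-dependent in reality)
RHO = 1.68e-8          # resistivity, ohm * m
MU0 = 4e-7 * math.pi   # vacuum permeability, H/m (mu_r ~ 1 for copper)

def skin_depth(freq_hz, rho=RHO, mu=MU0):
    """Depth at which the current density falls to 1/e of its surface
    value: delta = sqrt(rho / (pi * f * mu))."""
    return math.sqrt(rho / (math.pi * freq_hz * mu))

def rac_over_rdc(radius_m, freq_hz):
    """Crude annulus approximation: the current is assumed confined to
    a surface shell one skin depth thick. Valid only when the skin
    depth is below the conductor radius; proximity effect ignored."""
    d = skin_depth(freq_hz)
    if d >= radius_m:
        return 1.0  # skin effect negligible at this frequency
    return radius_m**2 / (2 * radius_m * d - d**2)

print(skin_depth(50))               # ~9.2 mm for copper at 50 Hz
print(rac_over_rdc(1e-3, 100e3))    # ratio well above 1 at 100 kHz
```

Even this toy model reproduces the qualitative conclusion: the ac-to-dc resistance ratio grows with frequency and with conductor size relative to the skin depth.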
This paper briefly presents a comparison of two innovative signal processing methods for the analysis of EEG and EMG biomedical signals, motivated by the current broad interest in the analysis of various biomedical signals. The first method relies on kernel density estimators, which enable the construction of densitograms for the examined bio-signals. One of its biggest advantages is that it yields statistically filtered signals, which makes the whole signal processing task significantly quicker. The second method is based on basic mathematical operations only. Despite its simplicity, the whole process can be implemented on almost any hardware platform, including those with very limited computational capabilities, and it is also fast. According to the conducted experiments, the method is efficient as well; since it can be implemented on embedded platforms and the algorithm can be rewritten in any programming language, its potential range of application is wide.
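A minimal pure-Python sketch of the kernel-density idea follows, assuming a Gaussian kernel and a synthetic noisy sinusoid in place of real EEG/EMG data; the bandwidth and signal parameters are our illustrative choices, not the paper's.

```python
import math
import random

def gaussian_kde(samples, bandwidth):
    """Return a callable estimating the probability density of the
    signal's amplitude distribution (a 'densitogram')."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return density

# Synthetic stand-in for an EEG/EMG channel: a noisy 10 Hz oscillation
random.seed(1)
signal = [math.sin(2 * math.pi * 10 * t / 500) + random.gauss(0, 0.2)
          for t in range(500)]

density = gaussian_kde(signal, bandwidth=0.15)
# Sinusoid amplitudes pile up near the extremes, so the density near
# +/-1 is large, while far outside the signal range it vanishes
print(density(1.0))
print(density(3.0))
```

Evaluating the density on a fixed amplitude grid gives the densitogram; thresholding it is one way to obtain the statistically filtered signal the abstract mentions.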
This paper introduces a systematic approach to synthesize linear parameter-varying (LPV) representations of nonlinear (NL) systems that are originally defined by control-affine state-space representations. The conversion approach results in LPV state-space representations in the observable canonical form. Based on the relative-degree concept of NL systems, the states of a given NL representation are transformed to new coordinates that provide its normal form. In the SISO case, all nonlinearities of the original system are embedded in one NL function, which is factorized to construct the LPV form; an algorithm is proposed for this purpose. The resulting transformation yields an LPV model whose scheduling parameter depends on the derivatives of the inputs and outputs of the system. In addition, if the states of the NL model can be measured or estimated, the procedure can be modified to provide LPV models scheduled by these states. Examples are included for illustration.
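The factorization idea can be illustrated on a minimal single-state control-affine system; this is our own toy example, not the paper's algorithm, and the scheduling choice shown is one of several admissible factorizations.

```latex
% Control-affine NL system:
\dot{x} = -x^{3} + u, \qquad y = x
% Factor the nonlinearity through the state:
-x^{3} = \underbrace{\left(-x^{2}\right)}_{p(x)} \, x
% Choosing the scheduling signal p = -x^{2} (available when x is
% measured or estimated) yields the LPV form
\dot{x} = p\,x + u, \qquad y = x ,
% which is linear in x for each frozen value of p.
```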
One of the techniques which can be used to evaluate images statistically is the so-called random-walk approach. The resulting Hurst exponent is a measure of the complexity of the picture. Long, fine elements in the image, such as fibres, influence the Hurst exponent significantly. Determination of the Hurst exponent has therefore been suggested as a new method to measure the hairiness of yarns or knitted fabrics, since existing hairiness measurement instruments are based on different, non-comparable measurement principles. While the principal usability of this method for hairiness detection has been shown in former projects, the absolute value of the calculated Hurst exponent depends on the technique used to take the photographic image of a sample and to transfer it into a monochrome picture, and on any further image processing steps. This article gives an overview of edge detection filters, possible definitions of the black-white threshold for the transformation into a monochrome image, and related choices. It shows how these parameters should be chosen for typical textile samples and correlates the challenges of this novel method with well-known problems of common techniques for measuring yarn and fabric hairiness.
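The Hurst-exponent machinery can be sketched in one dimension. Below is a basic rescaled-range (R/S) estimator applied to uncorrelated noise, for which H close to 0.5 is expected; this is our illustration of the general technique, not the article's image-based pipeline, and uncorrected R/S has a known small-sample upward bias.

```python
import math
import random

def hurst_rs(series, window_sizes=(16, 32, 64, 128, 256)):
    """Rescaled-range (R/S) estimate of the Hurst exponent of an
    increment series: the slope of log(R/S) versus log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            chunk = series[start:start + n]
            mean = sum(chunk) / n
            dev = [x - mean for x in chunk]
            walk, cum = [], 0.0
            for d in dev:            # cumulative deviation: the "walk"
                cum += d
                walk.append(cum)
            r = max(walk) - min(walk)               # range of the walk
            s = math.sqrt(sum(d * d for d in dev) / n)  # std deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(math.log(n))
        log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # least-squares slope of log(R/S) vs log(n) is the Hurst estimate
    m = len(log_n)
    mx, my = sum(log_n) / m, sum(log_rs) / m
    return (sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs))
            / sum((x - mx) ** 2 for x in log_n))

random.seed(7)
noise = [random.gauss(0, 1) for _ in range(4096)]
print(hurst_rs(noise))  # roughly 0.5 for uncorrelated noise
```

In the image setting, the series would be derived from the monochrome picture (e.g., pixel runs along scan lines), which is exactly where the thresholding and edge-detection choices discussed in the article enter.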
This paper focuses on evaluating the performance of two aperture averaging methods used to compensate for the effects of air turbulence in free-space optical (FSO) communications. These methods are based on a concentrating lens and on spherical concave mirrors (SCMs). Preliminary experimental results show that the quality of the received signal, in terms of the Q-factor and the scintillation index, is moderately better when employing the lens than the SCMs for all turbulence regimes. However, these results were obtained with different collection areas and focal points; a more rigorous comparison using a lens and SCMs with the same aperture diameters and focal lengths therefore needs to be adopted to ensure conclusive results.
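The two received-signal metrics named in the abstract can be computed from samples as sketched below. The lognormal fading samples and the factor-of-4 block averaging are illustrative assumptions standing in for a larger collection aperture, not the paper's experimental data.

```python
import random
import statistics

def scintillation_index(intensities):
    """Normalized intensity variance: sigma_I^2 = <I^2>/<I>^2 - 1.
    Lower values mean weaker turbulence-induced fading."""
    mean = statistics.fmean(intensities)
    mean_sq = statistics.fmean(i * i for i in intensities)
    return mean_sq / (mean * mean) - 1.0

def q_factor(ones, zeros):
    """Q = (mu_1 - mu_0) / (sigma_1 + sigma_0), computed from the
    received levels of '1' and '0' symbols."""
    mu1, mu0 = statistics.fmean(ones), statistics.fmean(zeros)
    s1, s0 = statistics.pstdev(ones), statistics.pstdev(zeros)
    return (mu1 - mu0) / (s1 + s0)

# Synthetic lognormal fading, a common weak-turbulence model
random.seed(3)
rx = [random.lognormvariate(0, 0.3) for _ in range(10_000)]
print(scintillation_index(rx))   # small aperture: full scintillation
# Aperture averaging: a larger collector averages independent speckles,
# modeled here as block-averaging groups of 4 samples
avg4 = [sum(rx[i:i + 4]) / 4 for i in range(0, len(rx), 4)]
print(scintillation_index(avg4)) # noticeably reduced scintillation
```

The reduction of the scintillation index under averaging is the effect both the lens and the SCMs exploit, which is why equal collection areas are needed for a fair comparison.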
In this paper we focus on the segmentation problem of specific chained configurations in images taken from colon tissues. The proposed technique uses a priori information about the general structure and the relationsh...