The field of nature-inspired computing and optimization techniques has evolved to solve difficult optimization problems in diverse fields of engineering, science, and technology. The firefly attraction process is mimicked in the Firefly Algorithm (FA) for solving optimization problems. In FA, the fireflies are ranked using a sorting algorithm; the original FA was proposed with bubble sort for this ranking. In this paper, quick sort replaces bubble sort to decrease the time complexity of FA. The dataset used consists of unconstrained benchmark functions from CEC 2005 [22]. FA with bubble sort and FA with quick sort are compared with respect to best, worst, mean, standard deviation, number of comparisons, and execution time. The experimental results show that FA with quick sort requires fewer comparisons but more execution time. Increasing the number of fireflies helps the algorithm converge to the optimal solution, while varying the dimension shows that the algorithm performs better at lower dimensions than at higher ones.
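To illustrate the ranking step whose sorting routine is being swapped, here is a minimal Python sketch, under assumed toy data, that orders fireflies by objective value with bubble sort and with quick sort while counting comparisons; the function names and brightness values are illustrative, not taken from the paper.

    # Sketch: rank fireflies by objective value (brightness) while counting
    # comparisons. Names (rank_bubble, rank_quick) are illustrative only.

    def rank_bubble(values):
        vals = list(values)
        idx = list(range(len(vals)))
        comparisons = 0
        for i in range(len(vals) - 1):
            for j in range(len(vals) - 1 - i):
                comparisons += 1
                if vals[j] > vals[j + 1]:          # ascending: best (lowest) first
                    vals[j], vals[j + 1] = vals[j + 1], vals[j]
                    idx[j], idx[j + 1] = idx[j + 1], idx[j]
        return idx, comparisons

    def rank_quick(values):
        comparisons = 0
        def qsort(pairs):
            nonlocal comparisons
            if len(pairs) <= 1:
                return pairs
            pivot = pairs[len(pairs) // 2][1]
            less, equal, greater = [], [], []
            for p in pairs:
                comparisons += 1
                if p[1] < pivot:
                    less.append(p)
                elif p[1] > pivot:
                    greater.append(p)
                else:
                    equal.append(p)
            return qsort(less) + equal + qsort(greater)
        ordered = qsort(list(enumerate(values)))
        return [i for i, _ in ordered], comparisons

    brightness = [3.2, 0.7, 5.1, 1.9, 4.4]          # toy objective values
    print(rank_bubble(brightness))                   # ([1, 3, 0, 4, 2], 10)
    print(rank_quick(brightness))

On larger random populations the quick-sort ranking typically needs far fewer comparisons than the quadratic bubble-sort pass, which is the effect the paper measures.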
Over the years, lossless audio compression has gained popularity as researchers and businesses have become more aware of the need for better quality and higher storage demand. This paper analyses various lossless audio coding algorithms and standards available in the market, focusing on Linear Predictive Coding (LPC) because of its popularity and robustness in audio compression; other prediction methods are also compared to verify this. Advanced representations of LPC, such as LSP decomposition techniques, are also discussed in this paper.
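As a rough illustration of the linear-prediction step at the core of LPC-based lossless coders, the following Python sketch (assuming NumPy, a toy sine-plus-noise signal, and the simple autocorrelation method; not the exact scheme of any standard discussed here) estimates predictor coefficients and shows that the prediction residual is much smaller than the signal, which is what makes the residual cheap to encode losslessly.

    import numpy as np

    # Simplified sketch of LPC prediction: estimate order-p coefficients from the
    # autocorrelation, predict each sample from its past, keep only the residual.
    # A real lossless coder would quantize coefficients and entropy-code the residual.

    def lpc_coefficients(x, order):
        # autocorrelation method: solve R a = r for the predictor coefficients
        r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)])
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        return np.linalg.solve(R, r[1:order + 1])

    fs = 8000
    t = np.arange(fs) / fs
    x = np.sin(2 * np.pi * 440 * t) + 0.01 * np.random.randn(fs)  # toy "audio"

    order = 4
    a = lpc_coefficients(x, order)

    pred = np.zeros_like(x)
    for n in range(order, len(x)):
        pred[n] = a @ x[n - order:n][::-1]        # predict x[n] from its past samples
    residual = x - pred

    print("signal RMS  :", np.sqrt(np.mean(x[order:] ** 2)))
    print("residual RMS:", np.sqrt(np.mean(residual[order:] ** 2)))  # much smaller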
The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We therefore sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that measures patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.
Elongation of vibrato tones while preserving vibrato rate is a non-trivial problem in audio processing. It is well known that linear time-scaling in the time domain changes duration and pitch in lockstep. In the time/frequency domain, frequencies are preserved under linear time-scaling, but vibrato rates are not. Looping can preserve the rate, but synchronizing loops with vibrato cycles is problematic. Our current method is to parameterize harmonic frequency- and amplitude-vs-time variations so that vibrato rate is one of the parameters, along with vibrato depth and frequency drift. A heterodyne/filter method, in which the heterodyne frequency is estimated from an FFT of the full-duration frequency-deviation waveform, is used for vibrato analysis. Harmonic amplitude tremolo parameters are estimated with the same heterodyne/filter method. Once the model is complete, the harmonic parameters can be time-scaled while keeping the vibrato rate intact; as an exception to this treatment, the original attack and decay epochs are retained. The recomputed time-varying harmonic spectrum representation is then converted to a time-domain signal via sinusoidal additive synthesis.
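The following Python sketch illustrates the general idea of rate-preserving time-scaling on a toy frequency track; it is an assumed, simplified parameterization (frame rate, drift smoothing, and all names are illustrative) and not the authors' exact heterodyne/filter pipeline.

    import numpy as np

    # Sketch: estimate the vibrato rate from the FFT of a frequency-deviation
    # waveform, separate a slow drift term, then rebuild a track twice as long
    # while keeping the same vibrato rate.

    frame_rate = 200.0                                   # analysis frames per second (assumed)
    t = np.arange(0, 2.0, 1.0 / frame_rate)
    f0 = 440.0 + 5.0 * np.sin(2 * np.pi * 5.5 * t)       # toy harmonic-frequency track

    deviation = f0 - np.mean(f0)                         # frequency-deviation waveform
    spectrum = np.abs(np.fft.rfft(deviation))
    freqs = np.fft.rfftfreq(len(deviation), d=1.0 / frame_rate)
    rate = freqs[np.argmax(spectrum[1:]) + 1]            # vibrato rate, skipping DC

    win = int(round(frame_rate / rate))                  # one vibrato period
    drift = np.convolve(f0, np.ones(win) / win, mode="same")   # crude low-pass -> drift
    depth = np.max(np.abs((f0 - drift)[win:-win]))       # vibrato depth, away from edges

    stretch = 2.0
    t_new = np.arange(0, stretch * t[-1], 1.0 / frame_rate)
    drift_new = np.interp(t_new / stretch, t, drift)     # time-scale only the slow part
    f0_new = drift_new + depth * np.sin(2 * np.pi * rate * t_new)  # rate preserved

    print("estimated vibrato rate: %.2f Hz, depth: %.2f Hz" % (rate, depth))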
The plasmodium of the true slime mold Physarum polycephalum is a unicellular, multinuclear giant amoeba. Since this cellular organism has some computational abilities, it is attracting much attention in the field of information science. However, previous studies have mainly focused on the optimization behavior of the plasmodium under single-modality stimuli, and there are few studies on how the organism adapts to multi-modal stimuli. We stimulated the plasmodium with a mixture of attractant and repellent stimuli and observed a bifurcation in its chemotactic behavior.
In this manuscript, a new monitoring scheme is introduced that applies a double moving average to exponentially distributed data under the EWMA statistic. The control chart coefficients have been estimated for specified error rates of the in-control and out-of-control processes. The design structure of the proposed chart is constructed, and its performance is evaluated for different parameter settings using the commonly used measure of average run length. The developed design is compared with its existing counterpart, and the proposed chart shows superior performance in the quickest detection of a shifted process. A practical example is included to illustrate its use by quality control personnel.
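A minimal Python sketch of the kind of statistic described, under assumed settings (the span, smoothing constant, and simulation-based limits are illustrative choices, not the paper's design): a moving average of exponential observations, its double moving average, and an EWMA of that series monitored against empirical control limits.

    import numpy as np

    rng = np.random.default_rng(0)

    def dma_ewma(x, w=3, lam=0.2):
        ma = np.convolve(x, np.ones(w) / w, mode="valid")        # moving average
        dma = np.convolve(ma, np.ones(w) / w, mode="valid")      # double moving average
        z = np.empty_like(dma)                                    # EWMA of the DMA series
        z[0] = dma[0]
        for i in range(1, len(dma)):
            z[i] = lam * dma[i] + (1 - lam) * z[i - 1]
        return z

    # simulate in-control exponential data (mean 1) to set empirical limits
    theta0 = 1.0
    in_control = np.array([dma_ewma(rng.exponential(theta0, 50))[-1] for _ in range(5000)])
    lcl, ucl = np.quantile(in_control, [0.00135, 0.99865])        # ~3-sigma-equivalent tails

    # monitor a process whose mean has shifted upward
    z = dma_ewma(rng.exponential(1.5 * theta0, 50))
    signals = np.where((z < lcl) | (z > ucl))[0]
    print("limits: (%.3f, %.3f)" % (lcl, ucl))
    print("first signal index:", signals[0] if signals.size else "none")

In a proper design the limits would be calibrated to a target in-control average run length rather than to fixed tail probabilities; the quantile step above is only a stand-in for that calibration.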
Obesity is a risk factor for heart disease, stroke, diabetes, high blood pressure, and other chronic diseases. Some drugs, including fenofibrate, are used to treat obesity or excessive weight by lowering the level of specific triglycerides. However, different groups have different drug sensitivities and, consequently, there are differences in drug effects. In this study, we assessed both genetic and nongenetic factors that influence drug responses and stratified patients into groups based on differential drug effect and sensitivity. Our methodology of investigating genetic factors and nongenetic factors is applicable to studying differential effects of other drugs, such as statins, and provides an approach to the development of personalized medicine.
The elemental composition of green beans of arabica and robusta coffee from Gayo Highland, Aceh, Indonesia, has been identified using a fundamental Nd:YAG laser at 10 Torr of surrounding air pressure in order to distinguish the characteristics of the two coffees. As a preliminary study, we detected the emission lines K 766.49 nm, Na 588.9 nm, Ca 393.3 nm, the CN band at 388.3 nm, N 337.13 nm, and C 247.8 nm in both coffees. The order of element concentrations from highest to lowest is Ca > K > CN > Na > N > C for arabica and K > Ca > CN > Na > C > N for robusta. The emission intensity of K 766.49 nm is almost the same for both coffees; however, the emission intensity of Na 588.9 nm is lower in arabica coffee. To distinguish arabica from robusta coffee, we take the intensity ratios K/C, Na/C, CN/C, and Ca/C. The CN/C and Ca/C ratios of arabica beans differ significantly from those of robusta beans, so these ratios can be used as markers to discriminate the two kinds of coffee. We also noted that the arabica green bean is 1.3 times harder than the robusta green bean. These findings show that laser-induced plasma spectroscopy can be used for rapid identification of elements in coffee and can potentially be applied to measure the concentration of blended coffee for authentication purposes.
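As a purely illustrative sketch of the ratio-based discrimination idea in Python (the intensity values below are made-up placeholders, not measurements from this work):

    # Illustrative only: placeholder line intensities, not data from the paper.
    LINES = ["K 766.49", "Na 588.9", "Ca 393.3", "CN 388.3", "N 337.13", "C 247.8"]

    def ratios_to_carbon(intensities):
        c = intensities["C 247.8"]
        return {line: intensities[line] / c
                for line in ("K 766.49", "Na 588.9", "Ca 393.3", "CN 388.3")}

    sample = dict(zip(LINES, [1200.0, 300.0, 1500.0, 800.0, 200.0, 150.0]))  # placeholder spectrum
    print(ratios_to_carbon(sample))   # compare CN/C and Ca/C against reference arabica/robusta ranges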
In Indonesia, based on the results of the Basic Health Research 2013, the number of stroke patients increased from 8.3 ‰ (2007) to 12.1 ‰ (2013). These days, some researchers are using electroencephalography (EEG) results as another option for detecting stroke, alongside CT scan images as the gold standard. A previous study on the data of stroke and healthy patients in the National Brain Center Hospital (RS PON) used the Brain Symmetry Index (BSI), Delta-Alpha Ratio (DAR), and Delta-Theta-Alpha-Beta Ratio (DTABR) as features for classification by an Extreme Learning Machine (ELM). That study obtained 85% accuracy with sensitivity above 86% for acute ischemic stroke detection. Using EEG data means dealing with many data dimensions, which can reduce classifier accuracy (the curse of dimensionality). Principal Component Analysis (PCA) can reduce dimensionality and computation cost without decreasing classification accuracy. XGBoost, a scalable tree boosting classifier, can solve real-world-scale problems (the Higgs Boson and Allstate datasets) using a minimal amount of resources. This paper reuses the same data from RS PON and the features from the previous research, preprocessed with PCA and classified with XGBoost, to increase accuracy with fewer electrodes. A specific, smaller set of electrodes improved the accuracy of stroke detection. Our future work will examine algorithms other than PCA to achieve higher accuracy with fewer channels.
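A minimal Python sketch of the preprocessing and classification chain described, i.e. PCA for dimensionality reduction followed by XGBoost; the random feature matrix, shapes, and hyperparameters are placeholders, not the RS PON data or the paper's settings.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score
    from xgboost import XGBClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 64))            # 200 recordings x 64 EEG-derived features (toy data)
    y = rng.integers(0, 2, size=200)          # 0 = healthy, 1 = stroke (toy labels)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    pca = PCA(n_components=10)                # keep the leading principal components
    X_train_p = pca.fit_transform(X_train)
    X_test_p = pca.transform(X_test)

    clf = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
    clf.fit(X_train_p, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test_p)))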
This paper provides a framework for characterizing anisotropic guided waves to locate damage in composite structures. Composite guided wave structural health monitoring is a significant challenge due to anisotropy: wave velocities and attenuation vary as a function of propagation direction. Traditional localization algorithms, such as triangulation and delay-and-sum beamforming, fail for composite monitoring because they rely on isotropic velocity assumptions. Estimating the anisotropic velocities is also challenging because the inverse problem is inherently ill-posed; we cannot solve for an infinite number of directions with a finite number of measurements. This paper addresses these challenges by deriving a physics-based model for unidirectional anisotropy and integrating it with sparse recovery tools and matched field processing to characterize composite guided waves and locate an acoustic source. We validate our approach with experimental laser Doppler vibrometry measurements from a glass fiber reinforced composite panel. We achieve localization accuracies more than 290 times and 49 times better than delay-and-sum beamforming and matched field processing with isotropic assumptions, respectively.
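To illustrate why direction-dependent velocity matters for localization, here is a small Python sketch of a grid-search source locator with an assumed elliptical group-velocity model; it is not the paper's sparse-recovery/matched-field method, and all geometry and velocity numbers are illustrative.

    import numpy as np

    def group_velocity(theta, v_fiber=6000.0, v_transverse=2000.0):
        # simple elliptical anisotropy: fast along the fiber direction (theta = 0)
        return 1.0 / np.sqrt((np.cos(theta) / v_fiber) ** 2 + (np.sin(theta) / v_transverse) ** 2)

    sensors = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5], [0.5, 0.5]])  # sensor positions (m)
    source = np.array([0.31, 0.12])

    def travel_times(src, vel_fn):
        d = sensors - src
        dist = np.linalg.norm(d, axis=1)
        theta = np.arctan2(d[:, 1], d[:, 0])
        return dist / vel_fn(theta)

    observed = travel_times(source, group_velocity)          # synthetic "measurements"

    # grid search: minimize misfit of arrival-time differences (removes unknown onset time)
    xs = ys = np.linspace(0.0, 0.5, 101)
    def locate(vel_fn):
        best, best_err = None, np.inf
        for x in xs:
            for y in ys:
                t = travel_times(np.array([x, y]), vel_fn)
                err = np.sum(((t - t[0]) - (observed - observed[0])) ** 2)
                if err < best_err:
                    best, best_err = (x, y), err
        return best

    print("anisotropic model:", locate(group_velocity))
    print("isotropic model  :", locate(lambda th: np.full_like(th, 4000.0)))

With the anisotropic model the grid search recovers the true source on the grid, whereas the isotropic assumption returns a biased location, which is the failure mode the paper's framework is designed to overcome.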