We present a cryptographic scheme for encrypting 2-D gray scale images by using a large family of fractals. This scheme is based on a transposition of the image elements implemented by a generator of 2-D hierarchical scanning patterns producing a large subset of the (n²)! possible orders defined on a 2-D image of n × n elements. Each pattern defines a distinct order of pixels and can be described by an expression, which is considered as the key of the transposition. This transposition cipher can easily be combined with various substitution ciphers, producing efficient product ciphers operating on pictorial data. Two such ciphers are constructed and their effects on real gray value images are shown. Encryption and decryption algorithms are derived from a parallel algorithm implementing the creation of the family of scanning patterns.
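The abstract does not reproduce the fractal scanning-pattern generator itself, but the keyed-transposition idea it describes is easy to illustrate. A minimal Python sketch follows, assuming a seeded pseudorandom permutation as a stand-in for the paper's fractal scanning pattern; all names here are hypothetical.

```python
import numpy as np

def transpose_encrypt(image: np.ndarray, key: int) -> np.ndarray:
    """Encrypt by permuting pixel positions; the key determines the order.

    The paper derives the permutation from a fractal scanning pattern
    described by a key expression; a seeded pseudorandom permutation
    stands in for it here (assumption, for illustration only).
    """
    perm = np.random.default_rng(key).permutation(image.size)
    return image.ravel()[perm].reshape(image.shape)

def transpose_decrypt(cipher: np.ndarray, key: int) -> np.ndarray:
    """Invert the keyed permutation to recover the plaintext image."""
    perm = np.random.default_rng(key).permutation(cipher.size)
    plain = np.empty_like(cipher).ravel()
    plain[perm] = cipher.ravel()
    return plain.reshape(cipher.shape)

# Round trip on a toy 4 x 4 "image":
img = np.arange(16, dtype=np.uint8).reshape(4, 4)
assert np.array_equal(transpose_decrypt(transpose_encrypt(img, 42), 42), img)
```

Composing such a transposition with any per-pixel substitution stage yields the product ciphers the abstract mentions.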
Many image processing operations can be abstracted into matrix operations. With the help of matrix analysis, we can understand the inherent properties of the operations and thus design better algorithms. In this paper, we propose a matrix decomposition method referred to as identity-plus-row decomposition. The decomposition is particularly useful in the design of parallel projection algorithms on mesh-connected computers. Projection is a frequently used process in image processing and visualization. In volume graphics, projection is used to render the essential content of a three-dimensional volume onto a two-dimensional image plane. For the Radon transform, projection is used to transform the image space into a parameter space. By applying the identity-plus-row matrix decomposition method, we solve the data redistribution problem due to the irregular data access patterns present in those applications on single instruction stream, multiple data stream (SIMD) mesh-connected computers, developing fast algorithms for volume rendering and the Radon transform on SIMD mesh-connected computers. (C) 2001 SPIE and IS&T.
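The identity-plus-row decomposition itself is not spelled out in the abstract. For orientation, here is a sequential reference for the projection operation it parallelizes: a naive Radon transform that rotates the image and sums along one axis. This is a scipy-based sketch; the SIMD mesh mapping is the paper's contribution and is not shown.

```python
import numpy as np
from scipy.ndimage import rotate

def radon(image: np.ndarray, angles_deg) -> np.ndarray:
    """Naive Radon transform: for each angle, rotate the image and
    sum its columns, giving one projection (sinogram row) per angle."""
    return np.stack([
        rotate(image, angle, reshape=False, order=1).sum(axis=0)
        for angle in angles_deg
    ])

sinogram = radon(np.ones((64, 64)), range(0, 180, 10))
print(sinogram.shape)  # (18, 64): 18 angles, 64 detector bins
```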
Inspired by the idea that a threshold surface always intersects the image surface at high-gradient points, an active-surface-based adaptive thresholding algorithm is proposed to obtain the binarized result. In this model, the external force is designed to be repulsive from the image surface; thus, at the equilibrium state, the active surface tends to cover the supporting points of high gradient with a smoothness property, as well as stay away from the image surface locally, which makes the obtained threshold surface properly separate the foreground and background. The algorithm is described in a simple and reasonable energy-functional form, and only two parameters need to be tuned, which makes it convenient to operate. Analysis and comparison of the experimental results reveal that it can not only give the proper thresholding result but also restrain the occurrence of the ghost phenomenon. (C) 2003 SPIE and IS&T.
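The paper's energy functional is not given in the abstract, so the following is only a sketch of the underlying gradient-supported idea (in the spirit of the classic Yanowitz-Bruckstein interpolated threshold surface that this line of work refines): anchor the surface at high-gradient pixels and smooth it everywhere else. The 95th-percentile cutoff and iteration count are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def threshold_surface(image: np.ndarray, iters: int = 200) -> np.ndarray:
    """Build a smooth threshold surface anchored at high-gradient pixels.

    Sketch only: the paper instead evolves an active surface under a
    repulsive external force; here the surface is diffused between
    supports, which captures the same intersect-at-edges behavior.
    """
    img = image.astype(float)
    grad = ndimage.gaussian_gradient_magnitude(img, sigma=1.0)
    support = grad > np.percentile(grad, 95)   # high-gradient support points
    surface = np.full(img.shape, img[support].mean())
    for _ in range(iters):                     # diffuse, clamping the supports
        surface = ndimage.uniform_filter(surface, size=3)
        surface[support] = img[support]
    return surface

# binary = image > threshold_surface(image)
```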
A retrospective of the challenges and developments in computational physical property prediction at Linde Engineering over the last 40 years is given. The outstanding impact of Professor Michael Michelsen's research in this area is highlighted by a transcript of a talk given by Professor Kistenmacher, one of the former heads of the Physical Property Group at Linde Engineering. The reasons why the results of academic research are sometimes slow to be applied by industry are considered. We also give an insight into the current and future challenges facing industry in predicting physical properties and why further research requires more interdisciplinary collaboration in academia. Finally, a proposal is made on how parties from both industry and academia can join forces to establish new standards in physical property modeling.
Chemical composition of biomass feedstock is an important parameter for optimizing the yield and economics of various bioconversion pathways. Although chemical composition of biomass varies among species, varieties, and plant components, there is distinct variation even among stem components, such as nodes and internodes. Separation of morphological components possessing different quality attributes and utilizing them in 'segregated processing' leads to better handling, more efficient processing, and generation of high-value products. Using equipment to separate morphological components such as nodes and internodes of biomass stems that have closely related physical properties (e.g., size, shape, density) is difficult. However, as the nodes and internodes are clearly distinct in appearance by visual observation, the potential of digital image analysis for node and internode identification and quantification was investigated. We used chopped stems of big bluestem, corn, and switchgrass as test materials. Pixel color variation along the length was used as the principle for identifying the nodes and internodes. An algorithm in MATLAB was developed to evaluate the gray value intensity within a narrow computational band along the major axis of nodes and internodes. Several extracted image features, such as minimum, maximum, average, standard deviation, and variation of the computational band gray values; ribbon length of the computational band normalized gray value curve (NGVC); unit ribbon length of NGVC; area under NGVC; and unit area under NGVC, were tested for the identification. Unit area under NGVC was the best feature/parameter for the identification of the nodes and internodes, with an accuracy of about 96.6% (9 incorrect out of 263 objects). This image processing methodology for node and internode identification can form the supporting software for the hardware systems that perform the separation. Published by Elsevier B.V.
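The winning feature is simple to compute once the band profile is extracted. A sketch of the unit area under the NGVC follows; the paper's MATLAB implementation is not available, so this Python version assumes the per-position band-averaged gray profile is extracted upstream.

```python
import numpy as np

def unit_area_under_ngvc(band_gray: np.ndarray) -> float:
    """Unit area under the normalized gray value curve (NGVC).

    band_gray: mean gray value at each position along the object's
    major axis, averaged over the narrow computational band (assumed
    input). The curve is min-max normalized to [0, 1]; the discrete
    area under it divided by its length reduces to its mean.
    """
    g = band_gray.astype(float)
    ngvc = (g - g.min()) / (g.max() - g.min() + 1e-12)
    return float(ngvc.mean())   # area under NGVC per unit ribbon length
```

Nodes produce a localized gray-value excursion along the profile, so their unit area separates from the flatter internode profiles; the 96.6% accuracy refers to a threshold on this feature fitted in the paper.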
Texture synthesis is the ability to create ensembles of images of similar structures from sample textures that have been photographed. The method we employ for texture synthesis is based on histogram matching of images at multiple scales and orientations. This paper reports two fast and, in one case, simple algorithms for histogram matching. We show that the sort-matching and the optimal cumulative distribution function (CDF)-matching (OCM) algorithms provide high computational speed compared to that provided by the conventional approach. The sort-matching algorithm also provides exact histogram matching. Results of texture synthesis using either method show no subjective perceptual differences. The sort-matching algorithm is attractive because of its simplicity and speed; however, as the size of the image increases, the OCM algorithm may be preferred for optimal computational speed. (C) 2000 SPIE and IS&T. [S1017-9909(00)00601-2].
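Sort-matching is compact enough to state in full: sort both images' pixels and give each source pixel the reference value of the same rank, which makes the output histogram identical to the reference histogram. A minimal sketch, assuming equal pixel counts as the algorithm requires:

```python
import numpy as np

def sort_match(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Exact histogram matching by rank assignment (sort-matching)."""
    assert source.size == reference.size, "equal pixel counts required"
    order = np.argsort(source, axis=None, kind="stable")  # source pixel ranks
    matched = np.empty(source.size, dtype=reference.dtype)
    matched[order] = np.sort(reference, axis=None)  # k-th value -> k-th rank
    return matched.reshape(source.shape)
```

The single sort per image is what gives the method its speed advantage over conventional CDF table construction and lookup.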
The increasing availability of remotely sensed data offers a new opportunity to address landslide hazard assessment at larger spatial scales. A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that may experience landslide activity. This system combines a calculation of static landslide susceptibility with satellite-derived rainfall estimates and uses a threshold approach to generate a set of 'nowcasts' that classify potentially hazardous areas. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale near real-time landslide hazard assessment efforts, it requires several modifications before it can be fully realized as an operational tool. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and hazard at the regional scale. This case study calculates a regional susceptibility map using remotely sensed and in situ information and a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America. The susceptibility map is evaluated with a regional rainfall intensity-duration triggering threshold and results are compared with the global algorithm framework for the same event. Evaluation of this regional system suggests that this empirically based approach provides one plausible way to approach some of the data and resolution issues identified in the global assessment. The presented methodology is straightforward to implement, improves upon the global approach, and allows for results to be transferable between regions. The results also highlight several remaining challenges, including the empirical nature of the algorithm framework and adequate information for algorithm validation. Conclusions suggest that integrating additional triggering factors such as soil moisture may help to improve algorithm accuracy.
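The intensity-duration threshold logic is a single power-law comparison. A sketch follows, using Caine's (1980) global coefficients purely as placeholders; the study fits regional values, which the abstract does not report.

```python
def exceeds_id_threshold(intensity_mm_h: float, duration_h: float,
                         alpha: float = 14.82, beta: float = 0.39) -> bool:
    """Test rainfall against an intensity-duration threshold I = alpha * D**(-beta).

    alpha/beta default to Caine's global curve for illustration only;
    the regional system would substitute locally fitted coefficients.
    """
    return intensity_mm_h >= alpha * duration_h ** (-beta)

# A nowcast flags a cell when it is both susceptible and over threshold:
# hazard = susceptible and exceeds_id_threshold(rain_rate_mm_h, storm_hours)
```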
A joint classification-compression scheme that provides the user with added capability to prioritize classes of interest in the compression process is proposed. The dual compression system includes a primary unit for conventional coding of a multispectral image set followed by an auxiliary unit to code the resulting error induced on pixel vectors that represent classes of interest. This technique effectively allows classes of interest in the scene to be coded at a relatively higher level of precision than nonessential classes. Prioritized classes are selected from a thematic map or directly specified by their unique spectral signatures. Using the specified spectral signatures of the prioritized classes as end members, a modified linear spectral unmixing procedure is applied to the original data as well as to the decoded data. The resulting two sets of concentration maps, which represent classes prioritized before and after compression, are compared, and the differences between them are coded via an auxiliary compression unit and transmitted to the receiver along with a conventionally coded image set. At the receiver, the differences found are blended back into the decoded data for enhanced restoration of the prioritized classes. The utility of this approach is that it works with any multispectral compression scheme. This method has been applied to test imagery from various platforms, including NOAA's AVHRR (1.1 km GSD), LANDSAT 5 TM (30 m GSD), and LANDSAT 5 MSS (79 m GSD). (C) 2002 SPIE and IS&T.
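The unmixing step at the heart of the scheme can be sketched as a least-squares solve against the prioritized end members; the paper's modified procedure and the difference coder are not shown, and the array shapes here are assumptions.

```python
import numpy as np

def unmix(pixels: np.ndarray, endmembers: np.ndarray) -> np.ndarray:
    """Unconstrained linear spectral unmixing by least squares.

    pixels: (n_pixels, n_bands); endmembers: (n_classes, n_bands),
    the spectral signatures of the prioritized classes. Returns the
    per-pixel abundance (concentration) maps, (n_pixels, n_classes).
    """
    abundances, *_ = np.linalg.lstsq(endmembers.T, pixels.T, rcond=None)
    return abundances.T

# The auxiliary unit codes the gap between pre- and post-compression maps:
# residual = unmix(original, E) - unmix(decoded, E)
```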
In recent years several algorithms have been reported for automating fringe data collection in photomechanics using the technique of digital image processing (DIP). Recent advances in phase shifting interferometry have offered some hope for full automation of static problems. However, for real-time dynamic studies conventional recording of fringes is a must. Fringe thinning is a very crucial step in extracting data for further processing. The various DIP algorithms for fringe thinning are surveyed and an attempt is made to better explain the mechanism of fringe skeleton extraction by the various algorithms. The algorithm of Ramesh and Pramod is improved to extract fringe skeletons from saddle points in the fringe field. A comparative performance evaluation of these algorithms is discussed with respect to the quality and accuracy of the fringe skeletons extracted and the processing time. Performance evaluation is done on a few computer-generated test images and also on images recorded by the technique of photoelasticity. The improved version of the algorithm of Ramesh and Pramod is found to give better fringe skeletons; it is also the fastest, with a processing time an order of magnitude less than that of the other algorithms. It is proposed that these computer-generated test images could be used as standard test images for checking the performance of any new fringe thinning algorithm.
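For readers unfamiliar with fringe thinning, a deliberately simple minima-tracking sketch conveys what a skeleton extractor does. This is not the Ramesh and Pramod algorithm, only an illustration: dark-fringe skeleton candidates are pixels that are directional intensity minima within the darkest fraction of the image.

```python
import numpy as np

def fringe_skeleton(image: np.ndarray, dark_frac: float = 0.3) -> np.ndarray:
    """Mark candidate skeleton points as horizontal or vertical minima.

    Illustrative only: keep a pixel if it is no brighter than both its
    horizontal (or both vertical) neighbors and lies in the darkest
    dark_frac of the image, where dark fringes sit.
    """
    img = image.astype(float)
    dark = img <= np.quantile(img, dark_frac)
    c = img[1:-1, 1:-1]
    min_h = (c <= img[1:-1, :-2]) & (c <= img[1:-1, 2:])
    min_v = (c <= img[:-2, 1:-1]) & (c <= img[2:, 1:-1])
    skeleton = np.zeros_like(dark)
    skeleton[1:-1, 1:-1] = (min_h | min_v) & dark[1:-1, 1:-1]
    return skeleton
```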
The integration of the internet and mobile phones has opened the door to a new wave of utilizing private vehicles as probes not only for performance evaluation but for traffic control as well, gradually replacing the role of traffic surveillance systems as the dominant source of traffic data. To prepare for such a paradigm shift, one needs to overcome some key institutional barriers, in particular the privacy issue. A Highway Voting System (HVS) is proposed to address this issue, in which drivers provide link- and/or path-based vehicle data to the traffic management system in the form of "votes" in order to receive favorable service from traffic control. The proposed HVS offers a platform that links data from individual vehicles directly with traffic control. In the system, traffic control responds to voting vehicles in a way similar to the current system responding to prioritized vehicles, providing the requested services accordingly. We show in the paper that the proposed "voting" system can effectively resolve the privacy issue which often hampers traffic engineers from getting detailed data from drivers. Strategies to entice drivers into "voting" so as to increase the market penetration level under all traffic conditions are discussed. Though the focus of the paper is on addressing the institutional issues associated with data acquisition from individual vehicles, other research topics associated with the proposed system are identified. Two examples are given to demonstrate the impact of the proposed system on algorithm development and traffic control. (C) 2015 Elsevier Ltd. All rights reserved.