The analysis of biological samples and data stored in biobanks enables the discovery of biomarkers and of the underlying causes of diseases. Biological samples and their associated data are expensive to collect and maintain, and it is important to store and manage them efficiently. During processing, samples and data pass through a number of procedures and techniques, often performed in different locations and by individuals with different access permissions. It is essential to maintain the link between the samples and their associated data throughout the processes they undergo. This paper presents a novel system, based on Radio Frequency Identification (RFID) technology, for tracking samples and data from collection through processing to storage and retrieval. The system ensures the security and reliability of sample data, as well as location-independent recording and updating of sample data as they move along the workflow.
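The abstract does not describe the system's internals; as a hedged illustration, the sample-data link it emphasizes can be sketched as an append-only event log keyed by RFID tag ID. Everything here (the `Sample` class, the `TrackingRegistry` API, field names) is hypothetical and not the paper's implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Sample:
    tag_id: str                        # RFID tag uniquely identifies the sample
    data: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

class TrackingRegistry:
    """Links each RFID tag to its sample data and an append-only event log."""

    def __init__(self):
        self._samples = {}

    def register(self, tag_id, **data):
        self._samples[tag_id] = Sample(tag_id, data)

    def record_event(self, tag_id, location, procedure, operator):
        # Location-independent update: any reader station appends an event
        # against the same tag, so the sample-data link is never broken.
        event = {
            "time": datetime.now(timezone.utc).isoformat(),
            "location": location,
            "procedure": procedure,
            "operator": operator,
        }
        self._samples[tag_id].history.append(event)
        return event

    def trail(self, tag_id):
        return list(self._samples[tag_id].history)

registry = TrackingRegistry()
registry.register("TAG-0001", donor="anonymised-42", type="plasma")
registry.record_event("TAG-0001", "collection-room", "draw", "nurse-7")
registry.record_event("TAG-0001", "lab-B", "centrifuge", "tech-3")
print([e["procedure"] for e in registry.trail("TAG-0001")])  # ['draw', 'centrifuge']
```

The audit trail is append-only on purpose: retrieval at a later stage can reconstruct every procedure and location a sample passed through.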
The due date assignment and scheduling problems arise in production planning when management must set realistic due dates for a number of jobs. Most research on scheduling with due date assignment focuses on the optimal sequencing of independent jobs. However, it is often found in practice that some products are manufactured in a certain order implied, for example, by technological, marketing or assembly requirements, and this can be modeled by imposing precedence constraints on the set of jobs. In classical deterministic scheduling models, the processing conditions, including job processing times, are usually viewed as given constants. However, in many real-life situations, the processing conditions may vary over time, thereby affecting the actual durations of jobs. There are two categories of scheduling models in which the actual processing time of a job depends on its place in the schedule: in scheduling with deterioration, the later a job starts, the longer it takes to process, and in scheduling with learning, the actual processing time of a job gets shorter the later the job is scheduled. We review results on scheduling with due date assignment under such processing conditions, namely given precedence constraints and various scenarios of processing-time deterioration and learning.
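The two effects described above can be illustrated with a small sketch. The formulas used here — actual time p·r^a for a job in position r (positional learning when a < 0) and an added b·S term for start-time deterioration — are common textbook forms assumed for illustration, not necessarily the exact models the survey covers:

```python
def schedule_makespan(base_times, a=0.0, b=0.0):
    """Process jobs in the given order. The job in position r (1-indexed)
    that starts at time S gets actual duration:

        p_actual = p * r**a + b * S

    a < 0 models positional learning (later position -> shorter job);
    b > 0 models start-time deterioration (later start -> longer job).
    Returns (actual durations, makespan)."""
    t = 0.0
    actual = []
    for r, p in enumerate(base_times, start=1):
        dur = p * r ** a + b * t
        actual.append(dur)
        t += dur
    return actual, t

base = [4, 3, 5]
print(schedule_makespan(base)[1])               # 12.0 with no effects
print(schedule_makespan(base, a=-1.0)[1] < 12)  # True: learning shortens
print(schedule_makespan(base, b=0.5)[1] > 12)   # True: deterioration lengthens
```

The sketch makes the key point of such models concrete: the same job set yields different makespans depending purely on where and when each job sits in the sequence, which is what makes due date assignment under these conditions non-trivial.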
Current data sharing and integration among various organizations require a central and trusted authority to first collect data from all data sources and then integrate the collected data. This process tends to complic...
Recently, Ferraris, Lee and Lifschitz proposed a new definition of stable models that does not refer to grounding and applies to the syntax of arbitrary first-order sentences. We show its relation to the idea of loo...
In this paper we investigate potential benefits that an adaptive delayed channel access algorithm can attain for the next-generation wireless LANs, the IEEE 802.11n. We show that the performance of frame aggregation i...
ISBN: 9781605580463 (print)
Complexity in real-world problems is often tackled by a divide-and-conquer strategy, which consists of breaking the problem down into sub-problems and finding local solutions. These solutions are then merged in a bottom-up fashion and optimized to produce the final solution. Applications such as wiring and pipelining in urban areas are typically complex problems: they require searching for a Minimum Steiner tree in huge graphs that model the real-world topology of the urban areas. This paper introduces a new divide-and-conquer approach to solving the Minimum Steiner tree problem in large graphs. The approach, called SC-IAC, combines spectral clustering and ant colony optimization in a two-stage algorithm. The first stage generates graph segments, while the second uses parallel independent ant colonies to find local and global minima of the Steiner tree. Large real-world benchmarks are used to illustrate the efficiency and accuracy of SC-IAC. Copyright 2008 ACM.
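SC-IAC itself (spectral clustering plus parallel ant colonies) is too involved for a short sketch, so as a hedged point of comparison here is the classic Kou–Markowsky–Berman-style 2-approximation for the Steiner tree problem: build the metric closure over the terminal vertices, then take its minimum spanning tree. Function names and the toy graph are illustrative only:

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src; adj maps vertex -> [(neighbor, weight)]."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def steiner_2approx_cost(adj, terminals):
    """Metric-closure MST bound: compute all terminal-to-terminal shortest
    paths, then run Prim's algorithm on the complete terminal graph. The
    result is within a factor of 2 of the optimal Steiner tree cost."""
    closure = {t: dijkstra(adj, t) for t in terminals}
    in_tree = {terminals[0]}
    cost = 0.0
    while len(in_tree) < len(terminals):
        w, v = min(
            (closure[u][v], v)
            for u in in_tree for v in terminals if v not in in_tree
        )
        cost += w
        in_tree.add(v)
    return cost

# Toy star graph: non-terminal Steiner vertex "s" connects the terminals.
adj = {"s": [("a", 1), ("b", 1), ("c", 1)],
       "a": [("s", 1)], "b": [("s", 1)], "c": [("s", 1)]}
print(steiner_2approx_cost(adj, ["a", "b", "c"]))  # 4.0 (optimum via s is 3)
```

On this toy graph the approximation returns cost 4 while the optimal tree through the Steiner vertex s costs 3, inside the guaranteed factor of 2; heuristics like SC-IAC aim to close that gap on much larger instances.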
ISBN: 9781605581026 (print)
Extracting dense sub-components from graphs efficiently is an important objective in a wide range of application domains, from social network analysis to biological network analysis, and from the World Wide Web to stock market analysis. Motivated by this need, several new algorithms based on the (frequent) pattern mining paradigm have recently been proposed to tackle this problem. A limitation of most of these methods is that they are highly sensitive to parameter settings, rely on exhaustive enumeration with exponential time complexity, and often fail to help users understand the underlying distribution of components embedded within the host graph. In this article we propose an approximate algorithm to mine and visualize cohesive subgraphs (dense sub-components) within a large graph. The approach, referred to as Cohesive Subgraph Visualization (CSV), relies on a novel mapping strategy that maps edges and nodes to a multi-dimensional space in which dense areas correspond to cohesive subgraphs. The algorithm then walks through the dense regions in the mapped space to output a visual plot that effectively captures the overall dense sub-component distribution of the graph. Unlike extant algorithms with exponential complexity, CSV has a complexity of O(V² log V) for a fixed mapping dimension, where V is the number of vertices in the graph, although for many real datasets the performance is typically sub-quadratic. We demonstrate the utility of CSV as a stand-alone tool for visual graph exploration and as a pre-filtering step that significantly scales up exact subgraph mining algorithms such as CLAN [33]. Copyright 2008 ACM.
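CSV's exact mapping strategy is not given in the abstract; a hedged toy version of the general idea — map each vertex to its BFS distances from a few pivot vertices, so that members of a cohesive subgraph land close together in the mapped space — can be sketched as follows. The pivot choice and the L1-ball density measure are assumptions for illustration, not CSV's actual mapping:

```python
from collections import deque

def bfs_dist(adj, src):
    """Unweighted shortest-path (hop) distances from src."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def pivot_map(adj, pivots):
    """Map every vertex to a tuple of BFS distances to the pivot vertices.
    Vertices of a cohesive subgraph get similar coordinates."""
    tables = [bfs_dist(adj, p) for p in pivots]
    n = len(adj)  # stand-in for "unreachable"
    return {v: tuple(t.get(v, n) for t in tables) for v in adj}

def mapped_density(coords, radius=1):
    """Count, for each vertex, the other vertices within an L1 ball
    in the mapped space; dense regions flag cohesive subgraphs."""
    def l1(p, q):
        return sum(abs(x - y) for x, y in zip(p, q))
    return {v: sum(1 for u in coords if u != v and l1(coords[u], coords[v]) <= radius)
            for v in coords}

# Clique {a, b, c, d} with a tail d-e-f: clique members cluster in the map.
adj = {"a": ["b", "c", "d"], "b": ["a", "c", "d"], "c": ["a", "b", "d"],
       "d": ["a", "b", "c", "e"], "e": ["d", "f"], "f": ["e"]}
coords = pivot_map(adj, ["a", "f"])
print(mapped_density(coords))
```

On this toy graph the clique vertices receive near-identical coordinates and high mapped-space density, while the tail vertices sit isolated — the kind of contrast a visual walk through the mapped space is meant to expose.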
This paper proposes a systemic framework combining techniques from soft systems methodology (SSM), the unified modelling language (UML) and the naked objects approach to implementing interactive software systems. The framework supports the full development cycle, from business process modelling to software implementation. SSM is used to explore the problem situation, techniques based on the UML are used for detailed design, and the Naked Objects framework is used for implementation. We argue that there are advantages to combining and using the three techniques together to improve the quality of business process modelling and implementation. The proposed systemic framework is explained and justified using Mingers' multimethodology ideas. The approach is being evaluated through a series of action research projects based on real-world case studies.
A number of Denial of Service (DoS) attacks in IEEE 802.11 are due to unauthenticated/unencrypted management and control frames. Current IEEE 802.11 simulators deal with the Physical and MAC layers and do not include the exchange of management and control frames, making it difficult to simulate a DoS attack and its possible solution. A basic IEEE 802.11 network simulator written in Verilog is presented. The basic aim is to design a simulator in a hardware description language (HDL) such as Verilog, since functions and protocols described as state machines are best simulated in an HDL. Besides the simulation of a simple wireless network, the paper also presents the simulation of a spoofed-MAC disassociation DoS attack and one of its possible solutions. The proposed simulator includes the communication setup process and can be used to simulate other DoS attacks and their possible solutions.
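As a hedged illustration of the attack being simulated (written in Python rather than Verilog, and independent of the paper's design), a spoofed disassociation frame demotes an unprotected station out of the associated state, while a station that drops unauthenticated management frames — a hypothetical `protect_mgmt` flag standing in for the paper's proposed solution — is unaffected:

```python
class Station:
    """Tiny 802.11 client state machine. Management frames drive the
    transitions: State 1 (unauthenticated) -> 2 (authenticated) ->
    3 (associated). Frames are (type, claimed_src_mac, authentic)."""

    def __init__(self, ap_mac, protect_mgmt=False):
        self.ap_mac = ap_mac
        self.protect_mgmt = protect_mgmt  # hypothetical protection switch
        self.state = 1

    def handle(self, frame):
        ftype, src, authentic = frame
        if self.protect_mgmt and not authentic:
            return  # protected mode drops unauthenticated management frames
        if src != self.ap_mac:
            return  # frame not (apparently) from our AP
        if ftype == "auth":
            self.state = max(self.state, 2)
        elif ftype == "assoc" and self.state >= 2:
            self.state = 3
        elif ftype == "disassoc":
            self.state = min(self.state, 2)  # drop back to authenticated

sta = Station("AP:01")
for f in [("auth", "AP:01", True), ("assoc", "AP:01", True)]:
    sta.handle(f)
# Spoofed disassociation: the attacker forges the AP's MAC address.
sta.handle(("disassoc", "AP:01", False))
print(sta.state)  # 2 -> the attack succeeds against an unprotected station

protected = Station("AP:01", protect_mgmt=True)
for f in [("auth", "AP:01", True), ("assoc", "AP:01", True)]:
    protected.handle(f)
protected.handle(("disassoc", "AP:01", False))
print(protected.state)  # 3 -> the spoofed frame is ignored
```

The `authentic` flag abstracts away how a real defense would verify the frame (e.g. cryptographic protection of management frames); the point of the sketch is only the state-machine view of the attack that makes an HDL simulation natural.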
Voice over Internet Protocol (VoIP) is an important service with strict Quality-of-Service (QoS) requirements in Wireless Local Area Networks (WLANs). The popular Distributed Coordination Function (DCF) of IEEE 802.11...