The article presents information on the new web-based manuscript submission and peer review system adopted by the publishers of the periodical 'Grass and Forage Science.' The system is called Manuscript Central. Manuscript submission is a step-by-step process, and little special preparation is required beyond having all parts of a manuscript in an electronic format and having access to a computer with an Internet connection, a Web browser, and e-mail. In the system, a completed submission is confirmed by e-mail immediately, and the paper enters the editorial process with no postal delay.
The article provides some concepts and principles concerning e-learning. An editorial in the 'American Journal of Distance Education' suggests that the challenge facing teachers and administrators is to know when to use face-to-face teaching and when to substitute distance learning as an equally or more effective, and less costly, alternative. Face-to-face and online teaching are seen as complementary and synergistic, allowing greater flexibility in learning. Ways to maximize the benefits of e-learning include building on success, avoiding the technology traffic jam, and using good instructional design.
Author: Peserico, E. (MIT Comp Sci & Artificial Intelligence Lab, Cambridge, MA 02139, USA)
In the context of competitive analysis of online algorithms for the k-server problem, it has been conjectured that every randomized, memoryless online algorithm exhibits the highest competitive ratio against an offline adversary that is lazy, i.e., that will issue requests forcing it to move one of its own servers only when this is strictly necessary to force a move on the part of the online algorithm. We prove that, in general, this lazy adversary conjecture fails. Moreover, it fails in a very strong sense: there are adversaries which perform arbitrarily better than any other adversary which is even slightly "lazier."
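As a toy illustration of the lazy-adversary setup (not the paper's construction), the sketch below pits the memoryless RANDOM algorithm on a uniform metric against a lazy adversary: one that only requests points its own servers already cover but the online algorithm does not, so the adversary forces online moves without moving itself. All names and parameters are illustrative.

```python
import random

def lazy_request(adv_config, online_config):
    """A lazy adversary requests a point its own servers cover but the
    online algorithm does not; it would move a server of its own only
    when no such point exists (signalled here by returning None)."""
    uncovered = sorted(set(adv_config) - set(online_config))
    return uncovered[0] if uncovered else None

def random_online_step(online_config, request, rng):
    """Memoryless RANDOM on a uniform metric: if the request is not
    covered, move a uniformly random server to it at cost 1."""
    config = list(online_config)
    if request in config:
        return config, 0
    config[rng.randrange(len(config))] = request
    return config, 1

def simulate(adv_config, online_config, steps, seed=0):
    """Total online cost against the lazy adversary; the adversary's
    own cost stays 0 for as long as it can avoid moving."""
    rng = random.Random(seed)
    cost = 0
    for _ in range(steps):
        r = lazy_request(adv_config, online_config)
        if r is None:
            break  # online now covers every adversary point
        online_config, c = random_online_step(online_config, r, rng)
        cost += c
    return cost
```

Each forced request costs the online algorithm one move while the lazy adversary pays nothing, which is exactly the intuition the (here disproved) conjecture formalizes.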
A problem facing many areas of industry is the rapid increase in data and how to deal with it efficiently. In many cases, these large amounts of data are a useful resource, and the problem then becomes one of how to extract meaning from the data. Data mining is the process of exploring abstract data in the search for valuable and unexpected patterns. The available tools for data mining fall into two broad categories: automated intelligent tools and human perceptual tools. Automated intelligent tools implement well-defined strategies for finding rules and patterns in data. These tools exploit a computer's capability to perform error-free, repetitive tasks and to efficiently process large amounts of data without human intervention. With human perceptual tools, on the other hand, the focus is on keeping the human in the process by displaying data to users and letting them search for patterns. These tools take advantage of the human capability to perform subtle pattern matching tasks.
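The "automated intelligent tool" category can be illustrated with a minimal frequent-pattern pass over transaction data; the function below is a generic sketch of support-thresholded pair counting, not any specific product's algorithm.

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support):
    """Count co-occurring item pairs across transactions and keep those
    whose support (fraction of transactions containing the pair) meets
    the threshold -- a minimal automated pattern-finding pass."""
    counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] += 1
    n = len(transactions)
    return {p: c / n for p, c in counts.items() if c / n >= min_support}
```

A human perceptual tool would instead render the same transactions visually and let the analyst spot the co-occurrences.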
We consider the on-line channel assignment problem in the case of cellular networks and we formalize this problem as an on-line load balancing problem for temporary tasks with restricted assignment. For the latter problem, we provide a general solution (denoted as the cluster algorithm) and we characterize its competitive ratio in terms of the combinatorial properties of the graph representing the network. We then compare the cluster algorithm with the greedy one when applied to the channel assignment problem: it turns out that the competitive ratio of the cluster algorithm is strictly better than the competitive ratio of the greedy algorithm. The cluster method is general enough to be applied to other on-line load balancing problems and, for some topologies, it can be proved to be optimal. (C) 2003 Elsevier B.V. All rights reserved.
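The greedy baseline against which the cluster algorithm is compared can be sketched as follows: each arriving temporary task goes to the least-loaded machine among those permitted to serve it, and releases its load on departure. This is only the greedy rule, with an illustrative event encoding, not the cluster algorithm itself.

```python
def greedy_assign(loads, permitted, weight):
    """Greedy rule for on-line load balancing with restricted
    assignment: place the task on the least-loaded permitted machine."""
    m = min(permitted, key=lambda i: loads[i])
    loads[m] += weight
    return m

def run(events, n_machines):
    """Process ('arrive', task_id, permitted, weight) and
    ('depart', task_id) events; return the peak machine load."""
    loads = [0.0] * n_machines
    placed = {}
    peak = 0.0
    for ev in events:
        if ev[0] == "arrive":
            _, tid, permitted, w = ev
            m = greedy_assign(loads, permitted, w)
            placed[tid] = (m, w)
            peak = max(peak, max(loads))
        else:  # departure releases the task's load
            m, w = placed.pop(ev[1])
            loads[m] -= w
    return peak
```

The competitive ratio compares this peak load against the best peak achievable by an algorithm that knows all arrivals and departures in advance.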
Depending on the context, a policy can be a paper document, a table for selecting options, a sequence of logical assertions to automate operational decisions, or a tool to articulate business goals and service priorities and facilitate decisions in enforcing business rules and service priorities. In this paper, we will analyze these aspects of policies and show how they relate to each other. We will also analyze industry practices, examine applicable standards, and explore some advantages that a policy-based system can offer in such areas as network management, quality of service (QoS), and network security. (C) 2004 Lucent Technologies Inc.
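A first-match rule table is one minimal reading of "a sequence of logical assertions to automate operational decisions"; the sketch below is purely illustrative, with hypothetical rule predicates, action names, and a default-deny fall-through chosen for the example.

```python
def evaluate(policy, request):
    """First-match policy evaluation: each rule is a (predicate, action)
    pair; return the action of the first predicate the request satisfies.
    The fall-through decision is an assumption of this sketch."""
    for predicate, action in policy:
        if predicate(request):
            return action
    return "default-deny"

# Illustrative QoS policy: prioritize voice, rate-limit heavy flows.
qos_policy = [
    (lambda r: r.get("service") == "voice", "priority-queue"),
    (lambda r: r.get("bytes", 0) > 10_000_000, "rate-limit"),
]
```

Rule order matters under first-match semantics, which is one reason policy analysis (conflict and shadowing detection) is a topic in its own right.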
Many existing and emergent applications collect and reference data by geospatial location. Credit card transactions, for example, include addresses of both the place of purchase and the purchaser; telephone records include addresses and sometimes cell phone zones and geocoordinates; and population census tables contain addresses and other location information. These data sets are sources of potentially valuable information that can give their holders a competitive advantage. Government agencies also publish a wealth of statistical information that data analysts can apply to key problems in public health and safety or combine with proprietary data. The difficulty lies in finding the details that reveal the fine structures hidden in this data. Many approaches to analyzing such data exist, for example, statistical models, clustering, and association rules. Effective spatial data mining, however, must focus on finding location-related patterns and relationships. Interactive visual data exploration is important to spatial data mining. The wide area layout data observer involves the analyst in data exploration, thus complementing human perceptual skills, imagination, and flexibility with current computer systems to process large volumes of data and generate sophisticated displays. In this setting, the analyst directly interacts with the data, solving problems by applying domain expertise and general background knowledge to form and validate new hypotheses.
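A toy location-pattern search in the spirit described, grid-binning georeferenced events and reporting dense cells; this is an illustrative sketch, not the wide area layout data observer itself, and the cell size and threshold are arbitrary.

```python
from collections import Counter

def hotspots(points, cell_size, min_count):
    """Bin (x, y) events into square grid cells and report cells whose
    event count meets a threshold -- a minimal search for
    location-related patterns."""
    cells = Counter((int(x // cell_size), int(y // cell_size))
                    for x, y in points)
    return {cell: n for cell, n in cells.items() if n >= min_count}
```

In an interactive tool, the analyst would adjust `cell_size` and `min_count` on the fly and see the dense cells rendered on a map rather than returned as a dictionary.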
This paper is concerned with the design of online scheduling algorithms that exploit extra resources. In particular, it studies how to make use of multiple processors to counteract the lack of future information in online deadline scheduling. Our results extend the previous work that is primarily based on using a faster processor to obtain a performance guarantee. The challenge arises from the fact that jobs are sequential in nature and cannot be executed on more than one processor at the same time. Thus, a faster processor can speed up a job while multiple unit-speed processors cannot.
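A minimal sketch of deadline scheduling on m unit-speed processors, using earliest-deadline-first and respecting the constraint that a sequential job occupies at most one processor per time step; this is illustrative, not the paper's algorithm, and the discrete-time job encoding is an assumption of the example.

```python
import heapq

def edf_unit_steps(jobs, m, horizon):
    """EDF on m unit-speed processors: in each unit time step, the (at
    most m) pending jobs with earliest deadlines each receive one unit
    of work; no job ever runs on two processors at once."""
    completed = []
    pending = []           # heap entries: [deadline, id, remaining work]
    jobs = sorted(jobs, key=lambda j: j["release"])
    i = 0
    for t in range(horizon):
        while i < len(jobs) and jobs[i]["release"] <= t:
            j = jobs[i]
            heapq.heappush(pending, [j["deadline"], j["id"], j["work"]])
            i += 1
        running = [heapq.heappop(pending)
                   for _ in range(min(m, len(pending)))]
        for job in running:
            job[2] -= 1
            if job[2] == 0:
                if t + 1 <= job[0]:      # finished by its deadline
                    completed.append(job[1])
            else:
                heapq.heappush(pending, job)
    return completed
```

With two tight jobs released together, two unit-speed processors complete both while a single processor misses one deadline, which is the kind of gap extra processors are meant to close.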
Advances in information technology and inexpensive high-end computational power are motivating a new generation of methodological paradigms for the efficient information-based real-time operation of large-scale traffic systems equipped with sensor technologies. Critical to their effectiveness are the control architectures that provide a blueprint for the efficient transmission and processing of large amounts of real-time data, and consistency-checking and fault tolerance mechanisms to ensure seamless automated functioning. However, the lack of low-cost, high-performance, and easy-to-build computing environments is a key impediment to the widespread deployment of such architectures in the real-time traffic operations domain. This article proposes an Internet-based on-line control architecture that uses a Beowulf cluster as its computational backbone and provides an automated mechanism for real-time route guidance to drivers. To investigate this concept, the computationally intensive optimization modules are implemented on a low-cost 16-processor Beowulf cluster and a commercially available supercomputer, and the performance of these systems on representative computations is measured. The results highlight the effectiveness of the cluster in generating substantial computational performance scalability, and suggest that its performance is comparable to that of the more expensive supercomputer.
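The scalability comparison reduces to speedup and parallel efficiency computed from measured runtimes; the helper below is a generic sketch with illustrative numbers, not the article's measurements.

```python
def scalability(t_serial, timings):
    """Speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p from
    measured runtimes: t_serial is the 1-processor time, timings maps a
    processor count p to the measured runtime on p processors."""
    return {p: {"speedup": t_serial / t, "efficiency": t_serial / t / p}
            for p, t in timings.items()}
```

For example, a run that takes 160 s serially and 12.5 s on 16 processors yields a speedup of 12.8 and an efficiency of 0.8, the kind of figure used to judge whether a low-cost cluster keeps pace with a supercomputer.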
The Analytical Development Section of Savannah River Technology Center (SRTC) was requested by the Facilities Disposition Projects (FDP) to determine the holdup of enriched uranium in the 321-M facility as part of an overall deactivation project of the facility. The 321-M facility was used to fabricate enriched uranium fuel assemblies, lithium-aluminum target tubes, neptunium assemblies, and miscellaneous components for the production reactors. The results of the holdup assays are essential for determining compliance with the Waste Acceptance Criteria and Material Control & Accountability requirements, and for meeting criticality safety controls. This report covers calibration of the detectors in order to support holdup measurements in the C and D out-gassing ovens. These ovens were used to remove gas entrained in billet assembly material prior to the billets being extruded into rods by the extrusion press. A portable high purity germanium detection system was used to determine highly enriched uranium (HEU) holdup and to determine holdup of U-235, Np-237, and Am-241 that were observed in these components. The detector system was run by an EG&G Dart(TM) system that contains the high voltage power supply and signal processing electronics. A personal computer with Gamma-Vision software was used to control the Dart(TM) MCA and provide space to store and manipulate multiple 4096-channel gamma-ray spectra. The measured Np-237 and Am-241 contents were especially important in these components because their presence is unusual and unexpected in 321-M. It was important to obtain a measured value of these two components to disposition the out-gassing ovens and to determine whether a separate waste stream was necessary for release of these contaminated components to the E-Area Solid Waste Vault. This report presents determination of the calibration constants from first principles for determination of Am-241 and Np-237 using this detection system and compares the values obtained for Np-237 with the
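The calibration constants described reduce, in their simplest first-principles form, to a full-energy-peak efficiency relation between counts, counting time, source activity, and gamma branching ratio; the sketch below uses that generic gamma-assay formula with illustrative numbers, not the report's measured values or geometry corrections.

```python
def efficiency(net_counts, live_time_s, activity_bq, gamma_yield):
    """Full-energy-peak efficiency from a known calibration source:
    counts / (live time * activity * branching ratio)."""
    return net_counts / (live_time_s * activity_bq * gamma_yield)

def holdup_activity(net_counts, live_time_s, eff, gamma_yield):
    """Invert the same relation to estimate the activity (Bq) of
    material held up in a measured component."""
    return net_counts / (live_time_s * eff * gamma_yield)
```

A real holdup assay would also fold in geometry, attenuation, and self-absorption corrections, which is where the first-principles determination in the report goes beyond this bare relation.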