The interest is in characterizing insightfully the power of program self-reference in effective programming systems (epses), the computability-theoretic analogs of programming languages (for the partial computable functions). In an eps in which the constructive form of Kleene's Recursion Theorem (KRT) holds, it is possible to construct, algorithmically, from an arbitrary algorithmic task, a self-referential program that, in a sense, creates a self-copy and then performs that task on the self-copy. In an eps in which the not-necessarily-constructive form of Kleene's Recursion Theorem (krt) holds, such self-referential programs exist, but cannot, in general, be found algorithmically. In an earlier effort, Royer proved that there is no collection of recursive denotational control structures whose implementability characterizes the epses in which KRT holds. One main result herein, proven by a finite injury priority argument, is that the epses in which krt holds are, similarly, not characterized by the implementability of some collection of recursive denotational control structures. On the positive side, however, a characterization of such epses of a rather different sort is shown herein. Though perhaps not the insightful characterization sought after, this surprising result reveals that a hidden and inherent constructivity is always present in krt.
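As a rough, hedged illustration (not taken from the paper), the kind of self-reference KRT provides can be sketched in Python with a quine-style construction: a self-referential program reconstructs its own text and then runs an arbitrary "task" on that self-copy (comment lines aside, the rebuilt text matches the program below).

def task(self_copy):
    # Hypothetical task: merely report the size of the self-copy.
    print("my source has", len(self_copy), "characters")

# Quine trick: keep the program text as data and substitute the data into
# itself, so a self-copy can be handed to task without reading any file.
data = 'def task(self_copy):\n    print("my source has", len(self_copy), "characters")\n\ndata = %r\nsource = data %% data\ntask(source)\n'
source = data % data
task(source)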
The success of a programming system depends as much on the learnability of its language concepts as on the usability of its interface. We argue that learnability can be significantly improved by integrating into the programming system learning supports that allow individuals to educate themselves about the syntax, semantics and applications of a language. Reflecting on our experience with developing novice programming systems, we identify infrastructural characteristics of such systems that can make the integration of learning supports practical. We focus on five core facilities: annotatability, scriptability, monitorability, supplementability and constrainability. Our hope is that our examination of these technical facilities and their tradeoffs can inform the design of future programming systems that better address the educational needs of their users. (C) 2001 Academic Press.
In spite of impressive gains by PL/I, Fortran and Cobol remain the languages in which most of the world's production programs are written and will remain so into the foreseeable future. There is a great deal of theoretical interest in Algol 68 and in extensible languages, but so far at least they have had little practical impact. Problem-oriented languages may very well become the most important language development area in the next five to ten years. In the operating system area, all major computer manufacturers set out to produce very ambitious multiprogramming systems, and they all ran into similar problems. A number of university projects, though not directly comparable to those of the manufacturers, have contributed greatly to a better understanding of operating system principles. Important trends include the increased interest in the development of system measurement and evaluation techniques, and increased use of microprogramming for some programming system functions. [ABSTRACT FROM AUTHOR]
ISBN (print): 9781450350440
This talk examines some trends in the modern development of memory systems and their relation to the massive parallelism in processors and applications. It then draws on some recent work on GPUs to explain the important role of programming systems in bridging the gap; it particularly emphasizes the importance of innovations for enabling better software controllability, more software elasticity, and inter-thread data locality enhancements. The talk further discusses the implications brought to programming systems by the increasingly blurred boundaries among memory, storage, and processing.
The report considers a procedure for systematizing programming languages on the basis of TRIZ evolution. The procedure is illustrated with the development of programming-language paradigms, and an algorithm for presenting the TRIZ-evolution map [1] of those paradigms is described. A complete analysis of the programming paradigms appearing in the evolution of programming languages made it possible to determine the progressing and inherited paradigms of programming languages, and the major contradictions that have arisen in modern programming languages are established [2]. Using TRIZ tools, solutions to these contradictions are derived and a set of properties to be included in the new programming paradigm is defined. The Imperative, Object-Oriented, Functional, and Logic programming paradigms are considered in the report. The principal contradictions that became the "moving force" behind each new paradigm are identified, together with their solutions based on TRIZ tools [3]. The evolution of programming languages is traced by the criterion "from contradiction to contradiction", and a TRIZ-evolution map is constructed with TRIZ tools for the transition to the developing programming paradigm. This approach makes it possible to forecast the appearance of new programming paradigms, new programming languages and new programming methods, and to determine the tendencies of programming-language development and predict the next generations of languages.
Big Data analysis refers to advanced and efficient data mining and machine learning techniques applied to large amounts of data. Research work and results in the area of Big Data analysis are growing continuously, and more and more new and efficient architectures, programming models, systems, and data mining algorithms are proposed. Taking into account the most popular programming models for Big Data analysis (MapReduce, Directed Acyclic Graph, Message Passing, Bulk Synchronous Parallel, Workflow and SQL-like), we analysed the features of the main systems implementing them. Such systems are compared using four classification criteria (i.e. level of abstraction, type of parallelism, infrastructure scale and classes of applications) to help developers and users identify and select the best solution according to their skills, hardware availability, productivity and application needs. [Graphical abstract: a word cloud highlighting the most popular words related to Big Data analysis.]
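As a hedged, minimal sketch (not drawn from the article), the MapReduce model listed above can be illustrated in plain Python: a map step emits key/value pairs, pairs are grouped by key, and a reduce step aggregates each group, here for a word count. All names are illustrative stand-ins for what a real engine such as Hadoop provides.

from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word in one input record.
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(word, counts):
    # Aggregate all values that share the same key.
    return (word, sum(counts))

def run_mapreduce(documents):
    groups = defaultdict(list)
    for doc in documents:                  # map stage
        for key, value in map_phase(doc):
            groups[key].append(value)      # shuffle: group values by key
    return [reduce_phase(k, v) for k, v in groups.items()]  # reduce stage

print(run_mapreduce(["to be or not to be", "to code or not to code"]))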
In the age of the Internet of Things and social media platforms, huge amounts of digital data are generated by and collected from many sources, including sensors, mobile devices, wearable trackers and security cameras. This data, commonly referred to as Big Data, is challenging current storage, processing, and analysis capabilities. New models, languages, systems and algorithms continue to be developed to effectively collect, store, analyze and learn from Big Data. Most of the recent surveys provide a global analysis of the tools that are used in the main phases of Big Data management (generation, acquisition, storage, querying and visualization of data). Differently, this work analyzes and reviews parallel and distributed paradigms, languages and systems used today to analyze and learn from Big Data on scalable computers. In particular, we provide an in-depth analysis of the properties of the main parallel programming paradigms (MapReduce, workflow, BSP, message passing, and SQL-like) and, through programming examples, we describe the most used systems for Big Data analysis (e.g., Hadoop, Spark, and Storm). Furthermore, we discuss and compare the different systems by highlighting the main features of each of them, their diffusion (community of developers and users) and the main advantages and disadvantages of using them to implement Big Data analysis applications. The final goal of this work is to help designers and developers in identifying and selecting the most appropriate programming solution based on their skills, hardware availability, application domains and purposes, and also considering the support provided by the developer community.
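As a hedged companion sketch (not one of the article's own examples), the same word count expressed for Spark, one of the systems discussed above, using the PySpark RDD API; the input path is illustrative.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()
sc = spark.sparkContext

counts = (sc.textFile("input.txt")               # illustrative input path
            .flatMap(lambda line: line.split())  # split lines into words
            .map(lambda word: (word, 1))         # emit (word, 1) pairs
            .reduceByKey(lambda a, b: a + b))    # sum counts per word

for word, n in counts.collect():
    print(word, n)

spark.stop()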
Specialised CAM systems called programming stations are used to automate program generation and achieve tool-path verification in CNC profile cutting. The advantages of using programming stations vis-a-vis other CAD/CAM systems are discussed. It is pointed out that, with today's software tools, it is well worth the effort to develop one's own programming station software rather than to depend on expensive or unsuitable alternatives. Features necessary for programming stations are discussed and their implementation is illustrated in a PASCAL-like pseudo-code. Programming station software developed on the principles outlined is described, and its use is discussed.
Modern interconnects offer remote direct memory access (RDMA) features. Yet, most applications rely on explicit message passing for communication despite its unwanted overheads. The MPI-3.0 standard defines a programming interface for exploiting RDMA networks directly; however, its scalability and practicability have to be demonstrated in practice. In this work, we develop scalable bufferless protocols that implement the MPI-3.0 specification. Our protocols support scaling to millions of cores with negligible memory consumption while providing the highest performance and minimal overheads. To arm programmers, we provide a spectrum of performance models for all critical functions and demonstrate the usability of our library and models with several application studies with up to half a million processes. We show that our design is comparable to, or better than, UPC and Fortran Coarrays in terms of latency, bandwidth and message rate. We also demonstrate application performance improvements with comparable programming complexity.
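As a hedged sketch (generic MPI-3 one-sided code via mpi4py, not the authors' own implementation), the RDMA-style interface in question lets one process write directly into memory exposed by another, with no matching receive on the target side. The script name in the run command is illustrative.

# Run with, e.g.: mpiexec -n 2 python rma_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

buf = np.zeros(1, dtype='i')            # memory exposed for remote access
win = MPI.Win.Create(buf, comm=comm)    # each rank contributes its buffer

if rank == 0:
    data = np.array([42], dtype='i')
    win.Lock(1)                         # passive-target access epoch on rank 1
    win.Put(data, target_rank=1)        # one-sided write into rank 1's window
    win.Unlock(1)                       # completes the Put at the target

comm.Barrier()                          # order the read after the remote write
if rank == 1:
    print("rank 1 received", buf[0])    # prints 42

win.Free()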