ISBN (print): 9781728112763; 9781728112756
This paper describes the development of an auto-active verification technique in the Frama-C framework. We outline the lemma functions method and present the corresponding ACSL extension, its implementation in Frama-C, and its evaluation on a set of string-manipulating functions from the Linux kernel. We illustrate the benefits our approach brings in terms of the effort required to prove lemmas, compared to approaches based on interactive provers such as Coq. Current limitations of the method and of its implementation are discussed.
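The abstract does not show the concrete syntax of the ACSL extension. As a rough sketch of the underlying idea only, written in plain standard ACSL rather than in the paper's extension, a lemma can be stated as the contract of a side-effect-free C function whose loop carries the induction, so that Frama-C/WP and SMT solvers discharge each proof step automatically instead of requiring an interactive Coq proof (the axiomatic definition, the lemma and the function name below are purely illustrative):

/*@ axiomatic SumN {
      logic integer sum_n(integer n);
      axiom sum_zero: sum_n(0) == 0;
      axiom sum_succ: \forall integer n; n > 0 ==> sum_n(n) == sum_n(n - 1) + n;
    }
*/

/* Lemma stated as a contract: 2 * sum_n(n) == n * (n + 1) for every n >= 0.
   The loop plays the role of the induction; each step follows from sum_succ. */
/*@ requires 0 <= n;
    ensures 2 * sum_n(n) == n * (n + 1);
    assigns \nothing;
*/
void lemma_sum_n_closed_form(int n)
{
  /*@ loop invariant 0 <= i <= n;
      loop invariant 2 * sum_n(i) == i * (i + 1);
      loop assigns i;
      loop variant n - i;
  */
  for (int i = 0; i < n; ++i) {
    /* Induction step: sum_n(i + 1) == sum_n(i) + (i + 1) by axiom sum_succ. */
  }
}

Calling such a function (for instance from ghost code) then makes its instantiated postcondition available at the call site; this is the general lemma-function pattern, and the paper's extension and its evaluation on Linux kernel string functions may differ in detail.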
The paper addresses the problem of assessing media text readability based on the correlation between objective and subjective text complexity. Objective text complexity refers to a set of measurable characteristics of a media text (a piece of news published on university websites), including baseline, morphological, syntactic and lexical characteristics, computed automatically for each text; in total, 34 parameters are considered. A subjective text complexity score is assigned to each text by human experts who assess its clarity, structure, cohesion and coherence. The correlation between subjective and objective text complexity is studied using machine learning models. The readability level is measured using machine learning methods and regression-correlation analysis; specifically, an artificial neural network and regression models are used. It is demonstrated that the use of polynomial features and lasso regularization yields a compact, high-quality regression model. The chosen machine learning techniques also make it possible to estimate the impact of individual linguistic features on the readability level.
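As a point of reference only (the abstract does not state the polynomial degree or the regularization strength), a lasso fit over polynomial combinations of the measured features has the standard form

\hat{\beta} = \arg\min_{\beta} \frac{1}{2m} \sum_{i=1}^{m} \Big( y_i - \beta_0 - \sum_{j} \beta_j \phi_j(x_i) \Big)^2 + \lambda \sum_{j} |\beta_j|,

where x_i is the vector of the 34 measured characteristics of text i, the \phi_j are polynomial combinations of those characteristics, y_i is the expert complexity score, and \lambda sets the strength of the \ell_1 penalty. The penalty drives many coefficients \beta_j to exactly zero, which is what yields the compact regression model reported by the authors.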
This paper considers the application of the OpenFOAM solver QGDFoam to the numerical simulation of transonic viscous flows. The developed solver, which implements regularized, or quasi-gas dynamic (QGD), algorithms, is validated on a transonic low-Re jet flow case (Ma = 0.9, Re = 3600). The numerical simulations allow assessing the applicability of the solver to modelling hydrodynamic instabilities and their interaction with a transonic flow. The results of the numerical simulations are compared with experimental observations and with simulations by a Navier-Stokes-based code. The results of the present study provide a guideline for choosing the values of the QGD-algorithm tuning parameters.
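For context (the abstract does not define them), the main tuning parameter of QGD-type regularized algorithms is usually the relaxation time \tau entering the additional dissipative terms, commonly written in the QGD literature as

\tau = \alpha \frac{h}{c_s},

where h is the local cell size, c_s is the speed of sound and \alpha is a dimensionless constant chosen empirically. The guideline mentioned by the authors concerns the choice of such constants; the exact parameterisation used in QGDFoam may differ from this generic form.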
The next phase of LHC operations – High Luminosity LHC (HL-LHC), which aims at a ten-fold increase in the luminosity of proton-proton collisions at the energy of 14 TeV, is expected to start operation in 2027-2028 and will deliver an unprecedented scientific data volume of multi-exabyte scale. This amount of data has to be stored, and the corresponding storage system must ensure fast and reliable data delivery for processing by scientific groups distributed all over the world. The present LHC computing and data processing model will not be able to provide the required infrastructure growth, even taking into account the expected evolution of hardware technology. To address this challenge, new state-of-the-art computing infrastructure technologies are being developed; they are presented here. The possibilities of applying the HL-LHC distributed data handling techniques to other particle and astroparticle physics experiments dealing with large-scale data volumes, such as DUNE, LSST, Belle II, JUNO and SKAO, are also discussed.