In this paper we reexamine how to utilize the previously proposed color-octet axial-vector boson $Z_C$ to explain the $3.4\sigma$ anomaly of the $t\bar{t}$ forward-backward (FB) asymmetry $A_{FB}$ for $m_{t\bar{t}} > 450$ GeV observed by CDF. Our numerical results indicate that the best-fit parameters are $g_A^q = 0.07$, $g_A^Q = 3$, and $M_C = 440$ GeV, obtained by fitting the mass-dependent $A_{FB}$ and total cross-section data provided by a recent CDF measurement. Here $g_A^q$ ($g_A^Q$) and $M_C$ are the axial couplings of $Z_C$ to the first two (the third) generation quarks and the $Z_C$ mass, respectively. We also calculate the one-side forward-backward asymmetry $A_{\rm OFB}$ for top and bottom quark pair production at the LHC, focusing on the new contributions from $Z_C$. Our studies show that $A_{\rm OFB}$ can be utilized to measure the properties of the new particle $Z_C$.
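For reference, the asymmetry in question follows the standard rapidity-based convention used in the CDF analyses (a sketch of the common definition, not a formula quoted from the paper),
\[
A_{FB} \;=\; \frac{N(\Delta y > 0) - N(\Delta y < 0)}{N(\Delta y > 0) + N(\Delta y < 0)},
\qquad \Delta y = y_t - y_{\bar t},
\]
where $y_t$ and $y_{\bar t}$ are the top and antitop rapidities, so a positive $A_{FB}$ means the top quark preferentially follows the incoming proton direction.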
Background: The decreasing cost of DNA sequencing has led to a great increase in our knowledge about genetic variation. While population-scale projects bring important insight into genotype-phenotype relationships, the cost of performing whole-genome sequencing on large samples is still prohibitive. In silico genotype imputation coupled with genotyping-by-arrays is a cost-effective and accurate alternative for genotyping common and uncommon variants. Imputation methods compare the genotypes of the typed variants with large, population-specific reference panels and estimate the genotypes of untyped variants by making use of linkage disequilibrium patterns. The most accurate imputation methods are based on the Li-Stephens hidden Markov model (HMM), which treats the sequence of each chromosome as a mosaic of the haplotypes from the reference panel. Results: Here we assess the accuracy of vicinity-based HMMs, where each untyped variant is imputed using the typed variants in a small window around itself (as small as 1 centimorgan). Locality-based imputation has recently been adopted by machine learning-based genotype imputation approaches. We assess how the parameters of the vicinity-based HMMs impact imputation accuracy in a comprehensive set of benchmarks and show that vicinity-based HMMs can accurately impute common and uncommon variants. Conclusions: Our results indicate that locality-based imputation models can be effectively used for genotype imputation. The parameter settings that we identified can be used by future methods, and vicinity-based HMMs can be used for restructuring and parallelizing new imputation methods. The source code for the vicinity-based HMM implementations is publicly available at https://***/harmancilab/LoHaMMer.
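As a rough illustration of the vicinity-based idea, the sketch below runs a simplified Li-Stephens-style forward-backward pass over only the typed variants inside a small window around the untyped target, then combines the copying posteriors with the reference alleles at the target site. The recombination and error parameters, the window handling, and all names are illustrative assumptions, not the LoHaMMer implementation.

```python
import numpy as np

def impute_in_window(typed_obs, ref_typed, ref_target, anchor, rec_prob=0.01, err=0.001):
    """Simplified windowed Li-Stephens HMM (illustrative sketch, not the LoHaMMer code).

    typed_obs  : (L,) sample alleles at the typed variants inside the window
    ref_typed  : (K, L) reference haplotype alleles at those typed variants
    ref_target : (K,) reference haplotype alleles at the untyped target variant
    anchor     : index of the typed variant closest to the target
    Returns the posterior probability that the target allele is 1.
    """
    K, L = ref_typed.shape
    emit = np.where(ref_typed == typed_obs[None, :], 1.0 - err, err)   # (K, L)

    fwd = np.zeros((K, L))
    fwd[:, 0] = emit[:, 0] / K                     # uniform prior over copied haplotypes
    for l in range(1, L):
        prev = fwd[:, l - 1]
        fwd[:, l] = emit[:, l] * ((1 - rec_prob) * prev + rec_prob * prev.sum() / K)
        fwd[:, l] /= fwd[:, l].sum()               # rescale against underflow

    bwd = np.zeros((K, L))
    bwd[:, -1] = 1.0
    for l in range(L - 2, -1, -1):
        nxt = bwd[:, l + 1] * emit[:, l + 1]
        bwd[:, l] = (1 - rec_prob) * nxt + rec_prob * nxt.sum() / K
        bwd[:, l] /= bwd[:, l].sum()

    post = fwd[:, anchor] * bwd[:, anchor]
    post /= post.sum()                                       # posterior over copied haplotypes
    return float(post @ (ref_target == 1).astype(float))     # P(target allele = 1)
```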
The measurement of the forward-backward asymmetry at the LHC could be an important instrument to pinpoint the features of extra neutral gauge particles obtained by an extension of the gauge symmetry group of the standard model. For definiteness, in this work we consider an extension of the gauge group of the minimal supersymmetric standard model by an extra anomalous U(1) gauge symmetry. We focus on $pp \to e^+e^-$ at the LHC and use four different definitions of the asymmetry, obtained by implementing four different cuts on the directions and momenta of the final states of our process of interest. The calculations are performed without imposing constraints on the charges of the extra $Z'$ bosons of our model, since the anomaly is cancelled by a Green-Schwarz-type mechanism. Our final result is a fit of our data to a polynomial in the charges of the extra U(1), which is used to extract the values of the charges, given the experimental result.
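Schematically, all such asymmetry definitions count events on either side of a chosen "forward" direction; a generic form (a sketch of the common ingredient, not the paper's four cut-dependent definitions) is
\[
A_{FB} = \frac{N(\cos\theta^{*} > 0) - N(\cos\theta^{*} < 0)}{N(\cos\theta^{*} > 0) + N(\cos\theta^{*} < 0)},
\]
where $\theta^{*}$ is the electron angle in the dilepton rest frame and the forward direction is conventionally taken along the boost of the $e^+e^-$ system, since the initiating quark direction is not known event by event at a $pp$ collider. The four definitions used in the paper differ in the cuts imposed on the lepton directions and momenta.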
This study introduces an innovative approach to convex optimization problems, with a specific focus on applications in image and signal processing. The research aims to develop a self-adaptive extra proximal algorithm that incorporates an inertial term to effectively tackle challenges in convex optimization. The study's significance lies in its contribution to advancing optimization techniques in the realm of image deblurring and signal reconstruction. The proposed methodology involves creating a novel self-adaptive extra proximal algorithm and rigorously analyzing its convergence to ensure reliability and effectiveness. Numerical examples, including image deblurring and signal reconstruction tasks using only 10% of the original signal, illustrate the practical applicability and advantages of the algorithm. By introducing an inertial term within the extra proximal framework, the algorithm demonstrates potential for faster convergence and improved optimization outcomes, addressing real-world challenges of image enhancement and signal reconstruction. This research contributes significantly to the field of optimization techniques, particularly in the context of image and signal processing applications.
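To make the inertial-proximal idea concrete, here is a generic FISTA-style inertial proximal-gradient sketch applied to a toy sparse-signal reconstruction from 10% random measurements; the step rule, the inertial weights, and the toy problem are illustrative assumptions and do not reproduce the paper's self-adaptive extra proximal scheme.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_proximal_gradient(A, b, lam=0.01, iters=200):
    """Generic inertial (FISTA-style) proximal-gradient sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.  Illustrative only; the paper's
    self-adaptive extra proximal algorithm uses its own step and extrapolation rules."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2           # 1/L, L = Lipschitz constant of the gradient
    x_prev = x = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)  # inertial extrapolation
        grad = A.T @ (A @ y - b)                     # forward (gradient) step
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)  # backward (prox) step
        t = t_next
    return x

# Toy reconstruction of a sparse signal from 10% random measurements (illustrative).
rng = np.random.default_rng(0)
n, m = 400, 40
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(size=5)
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_hat = inertial_proximal_gradient(A, A @ x_true)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```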
Entity linking (also called entity disambiguation) aims to map the mentions in a given document to their corresponding entities in a target knowledge base. In order to build a high-quality entity linking system, efforts are made in three parts: encoding of the entity, encoding of the mention context, and modeling the coherence among mentions. For the encoding of the entity, we use a long short-term memory (LSTM) network and a convolutional neural network (CNN) to encode the entity context and entity description, respectively. Then, we design a function to combine all the different aspects of entity information in order to generate unified, dense entity embeddings. For the encoding of the mention context, unlike standard attention mechanisms, which can only capture important individual words, we introduce a novel attention-based LSTM model that can effectively capture the important text spans around a given mention with a conditional random field (CRF) layer. In addition, we take the coherence among mentions into consideration with a forward-backward algorithm, which is less time-consuming than previous methods. Our experimental results show that our model achieves competitive, or even better, performance than state-of-the-art models across different datasets.
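As a rough sketch of the coherence step, the forward-backward algorithm can compute, in time linear in the number of mentions, marginal scores over candidate entities when coherence is modeled as pairwise compatibility between adjacent mentions. The score matrices and names below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def coherence_marginals(local_scores, pairwise):
    """Forward-backward over a chain of mentions (illustrative sketch).

    local_scores : list of (C_i,) arrays, mention-context scores per candidate entity
    pairwise     : list of (C_i, C_{i+1}) arrays, coherence scores between adjacent
                   mentions' candidates (e.g. entity-embedding similarities)
    Returns per-mention marginal distributions over candidate entities.
    """
    unary = [np.exp(s) for s in local_scores]    # unnormalized potentials
    binary = [np.exp(p) for p in pairwise]

    n = len(unary)
    fwd = [unary[0]]
    for i in range(1, n):
        msg = binary[i - 1].T @ fwd[-1]          # pass forward message
        fwd.append(unary[i] * msg / msg.sum())   # rescale for numerical stability

    bwd = [np.ones_like(u) for u in unary]
    for i in range(n - 2, -1, -1):
        msg = binary[i] @ (unary[i + 1] * bwd[i + 1])
        bwd[i] = msg / msg.sum()

    marginals = []
    for f, b in zip(fwd, bwd):
        m = f * b
        marginals.append(m / m.sum())            # per-mention entity marginals
    return marginals
```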
We consider monotone inclusion problems in real Hilbert spaces. Proximal splitting algorithms are a very popular technique for solving them and generally achieve weak convergence under mild assumptions. To prove strong convergence of such algorithms, researchers assume strong conditions such as strong convexity or strong monotonicity of the operators involved. The Mann iteration method and the normal S-iteration method are popular methods for solving fixed point problems. We propose a new common fixed point algorithm based on the normal S-iteration method with Tikhonov regularization to find common fixed points of non-expansive operators, and we prove strong convergence of the generated sequence to the set of common fixed points without assuming strong convexity or strong monotonicity. Based on the proposed fixed point algorithm, we propose a forward-backward-type algorithm and a Douglas-Rachford algorithm in connection with Tikhonov regularization to find solutions of monotone inclusion problems. Further, we consider complexly structured monotone inclusion problems, which are very popular these days. We also propose a strongly convergent forward-backward-type primal-dual algorithm and a Douglas-Rachford-type primal-dual algorithm to solve these monotone inclusion problems. Finally, we conduct a numerical experiment on image deblurring problems.
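For orientation, the classical forward-backward iteration for the inclusion $0 \in (A + B)x$, with $A$ maximally monotone and $B$ single-valued and cocoercive, is $x_{k+1} = J_{\lambda A}(x_k - \lambda B x_k)$, where $J_{\lambda A} = (I + \lambda A)^{-1}$ is the resolvent. One standard way of injecting Tikhonov regularization, sketched here as an assumption rather than as this paper's exact scheme, is to perturb $B$ by a vanishing strongly monotone term:
\[
x_{k+1} \;=\; J_{\lambda_k A}\big(x_k - \lambda_k (B x_k + \varepsilon_k x_k)\big),
\qquad \varepsilon_k \to 0, \quad \sum_k \varepsilon_k = \infty,
\]
which, under suitable step-size conditions, typically upgrades weak convergence to strong convergence towards the minimum-norm zero of $A + B$.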
We show that simultaneous precision measurements of the CP-violating phase in the time-dependent $B_s \to J/\psi\,\phi$ study and of the $B_s \to \mu^+\mu^-$ rate, together with measuring $m_{t'}$ by direct search at the LHC, would determine $V_{t's}^{*}V_{t'b}$ and therefore the $b \to s$ quadrangle in the four-generation standard model. The forward-backward asymmetry in $B \to K^{*}\ell^+\ell^-$ provides further discrimination.
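For context, the $b \to s$ quadrangle is the four-term unitarity relation between the $s$ and $b$ columns of the enlarged $4\times 4$ quark mixing matrix, which replaces the familiar three-generation unitarity triangle:
\[
V_{us}^{*}V_{ub} + V_{cs}^{*}V_{cb} + V_{ts}^{*}V_{tb} + V_{t's}^{*}V_{t'b} = 0 .
\]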
ISBN (print): 9789544520106
We introduce a number of novel techniques for lexical substitution, including an application of the forward-backward algorithm, a grammatical-relation-based similarity measure, and a modified form of n-gram matching. We test these techniques on the SemEval-2007 lexical substitution data [McCarthy and Navigli, 2007] to demonstrate their competitive performance. We create a similar (small-scale) dataset for Czech, and our evaluation demonstrates the language independence of the techniques.
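One way to picture the n-gram matching component: substitute each candidate into the sentence and score it by how often the surrounding n-grams occur in a background corpus. The counting function and weighting below are illustrative assumptions, not the exact modified matching described in the paper.

```python
from collections import Counter

def ngram_counts(corpus_tokens, n_max=3):
    """Pre-compute n-gram counts (n = 1..n_max) from a background corpus."""
    counts = Counter()
    for n in range(1, n_max + 1):
        for i in range(len(corpus_tokens) - n + 1):
            counts[tuple(corpus_tokens[i:i + n])] += 1
    return counts

def substitution_score(tokens, pos, candidate, counts, n_max=3):
    """Score a candidate substitute at position `pos` by matched n-gram counts,
    weighting longer n-grams more heavily (illustrative weighting)."""
    sent = tokens[:pos] + [candidate] + tokens[pos + 1:]
    score = 0.0
    for n in range(1, n_max + 1):
        for start in range(max(0, pos - n + 1), min(pos + 1, len(sent) - n + 1)):
            gram = tuple(sent[start:start + n])
            score += n * counts.get(gram, 0)       # longer matches count more
    return score

# Tiny usage example with a toy "corpus".
corpus = "the bright sun rose over the bright hills".split()
counts = ngram_counts(corpus)
sentence = "the brilliant sun rose early".split()
for cand in ["bright", "smart"]:
    print(cand, substitution_score(sentence, 1, cand, counts))
```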
In this article, we offer two modifications of the modified forward-backward splitting method, based on the inertial Tseng method and the viscosity method, for inclusion problems in real Hilbert spaces. Under standard assumptions, such as Lipschitz continuity and monotonicity (also maximal monotonicity), we establish weak and strong convergence of the proposed algorithms. We present numerical experiments to show the efficiency and advantages of the proposed methods, and we also use the proposed algorithms to solve image deblurring and image recovery problems. Our results extend some related works in the literature, and the preliminary experiments also suggest their potential applicability.
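A generic inertial Tseng-type (forward-backward-forward) iteration, sketched on a toy l1-regularized least-squares inclusion, looks as follows; the step size, inertial weight, and toy problem are illustrative assumptions, and the paper's viscosity variant adds a further anchoring term not shown here.

```python
import numpy as np

def soft_threshold(v, tau):
    """Resolvent of the subdifferential of tau*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_tseng(A, b, lam=0.05, alpha=0.3, iters=300):
    """Inertial Tseng (forward-backward-forward) sketch for
    0 in d(lam*||.||_1)(x) + A^T(Ax - b).  Illustrative only, not the paper's algorithm."""
    step = 0.9 / np.linalg.norm(A, 2) ** 2               # step below 1/Lipschitz constant
    x_prev = x = np.zeros(A.shape[1])
    B = lambda z: A.T @ (A @ z - b)                      # single-valued monotone part
    for _ in range(iters):
        w = x + alpha * (x - x_prev)                     # inertial extrapolation
        y = soft_threshold(w - step * B(w), step * lam)  # forward-backward step
        x_prev, x = x, y - step * (B(y) - B(w))          # correction (second forward step)
    return x

# Tiny usage example on a random toy problem (illustrative).
rng = np.random.default_rng(1)
A_mat = rng.normal(size=(30, 60))
b_vec = A_mat @ np.concatenate([np.ones(3), np.zeros(57)])
print(np.round(inertial_tseng(A_mat, b_vec)[:6], 3))
```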