This paper presents enhanced parameterized quantum Hermite-Hadamard type integral inequalities for functions whose third right and left q-derivatives in absolute value are strongly convex functions. We obtain new bounds using Hölder's and power mean inequalities as primary tools. Also, we derive new quantum estimates for q-trapezoidal and q-midpoint type inequalities in specific scenarios, which we illustrate with examples. These outcomes have potential applications to optimization problems in economics.
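The q-calculus setting of this abstract can be made concrete with the basic Jackson q-derivative, $D_q f(t) = \frac{f(t) - f(qt)}{(1-q)t}$; the paper's left and right q-derivatives are parameterized variants of this. A minimal numerical sketch (the test function and evaluation point are illustrative assumptions, not taken from the paper):

```python
def q_derivative(f, t, q):
    """Jackson q-derivative: D_q f(t) = (f(t) - f(q*t)) / ((1 - q) * t), for t != 0."""
    return (f(t) - f(q * t)) / ((1.0 - q) * t)

f = lambda t: t ** 2              # illustrative test function

# For f(t) = t^2 one has D_q f(t) = (1 + q) * t exactly.
val = q_derivative(f, 3.0, 0.5)
assert abs(val - (1 + 0.5) * 3.0) < 1e-12

# As q -> 1, the q-derivative recovers the classical derivative f'(t) = 2t.
print(q_derivative(f, 3.0, 0.999999))   # close to 6
```

As q approaches 1 the classical derivative is recovered, which is why q-analogues of Hermite-Hadamard inequalities reduce to the classical ones in that limit.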
In this paper, using the class of strongly convex functions, which is a subclass of convex functions with stronger versions of analogous properties, we obtain improvements of Jessen's and Jensen's inequalities, their converses, as well as related Jensen-type interpolating inequalities, which serve as a starting point for many significant results in recent investigations. We apply the obtained improvements to so-called strongly f-divergences, a concept of f-divergences for strongly convex functions. As an outcome, we derive stronger estimates for some well-known divergences such as the Kullback-Leibler divergence, χ²-divergence, Hellinger divergence, Bhattacharyya distance, and Jeffreys distance. (C) 2023 Elsevier Inc. All rights reserved.
The study of fractional integral inequalities has attracted the interest of many researchers due to their potential applications in various fields. Estimates obtained via strongly convex functions produce better and sharper bounds when compared to convex functions. To this end, we establish some new Hermite-Hadamard and Fejér type inequalities by means of the Caputo-Fabrizio fractional integral operators for strongly convex functions. In particular, we prove, among other things, that if $\omega : I \to \mathbb{R}$ is a strongly convex function with modulus $c > 0$ and $\alpha, \beta \in I$ with $\alpha < \beta$, then a Hermite-Hadamard type inequality holds in which $B(\alpha) > 0$ appears as a normalization function. Some applications to special means have also been investigated.
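In the classical (non-fractional) case, a known strongly convex refinement of Hermite-Hadamard states that for f strongly convex with modulus c, f((a+b)/2) + c(b-a)²/12 ≤ (1/(b-a))∫f ≤ (f(a)+f(b))/2 - c(b-a)²/6. A numerical sanity check on an illustrative function (f(x) = x⁴ on [1, 2], where f'' ≥ 12 gives modulus c = 6 under the convention f'' ≥ 2c); the choice of function and interval is an assumption for illustration only:

```python
# Check of the strongly convex Hermite-Hadamard refinement:
#   f((a+b)/2) + c*(b-a)**2/12  <=  mean of f on [a,b]  <=  (f(a)+f(b))/2 - c*(b-a)**2/6
# Illustrative choice: f(x) = x**4 on [1, 2]; f''(x) = 12*x**2 >= 12, so modulus c = 6.

f = lambda x: x ** 4
a, b, c = 1.0, 2.0, 6.0

# Average of f over [a, b] via the midpoint rule (exact value is 31/5 = 6.2).
n = 100_000
h = (b - a) / n
avg = sum(f(a + (i + 0.5) * h) for i in range(n)) * h / (b - a)

lower = f((a + b) / 2) + c * (b - a) ** 2 / 12   # 5.0625 + 0.5 = 5.5625
upper = (f(a) + f(b)) / 2 - c * (b - a) ** 2 / 6 # 8.5 - 1.0 = 7.5

assert lower <= avg <= upper
```

The fractional versions in the paper replace the integral mean by Caputo-Fabrizio operators, but the shape of the refinement (a positive correction term proportional to the modulus c) is the same.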
In this paper we use basic properties of strongly convex functions to obtain new inequalities, including Jensen type and Jensen-Mercer type inequalities. Applications for special means are pointed out as well. We also give a Jensen's operator inequality for strongly convex functions. As a corollary, we improve the Hölder-McCarthy inequality under suitable conditions. More precisely, we show that if $\mathrm{Sp}(A) \subset (1, \infty)$, then
$$\langle Ax, x\rangle^{r} \le \langle A^{r}x, x\rangle - \frac{r^{2}-r}{2}\left(\langle A^{2}x, x\rangle - \langle Ax, x\rangle^{2}\right), \qquad r \ge 2,$$
and if $\mathrm{Sp}(A) \subset (0, 1)$, then
$$\langle A^{r}x, x\rangle \le \langle Ax, x\rangle^{r} + \frac{r-r^{2}}{2}\left(\langle Ax, x\rangle^{2} - \langle A^{2}x, x\rangle\right), \qquad 0 < r < 1,$$
for each positive operator $A$ and $x \in H$ with $\|x\| = 1$.
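The first of these operator inequalities can be checked numerically on a finite-dimensional positive operator with spectrum in (1, ∞); the matrix, vector, and exponent below are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Positive operator with Sp(A) in (1, inf), a unit vector x, and exponent r >= 2.
A = np.diag([2.0, 3.0, 1.5])
x = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
r = 3.0

q = lambda M: x @ (M @ x)                 # quadratic form <Mx, x>

lhs = q(A) ** r
rhs = q(A @ A @ A) - (r ** 2 - r) / 2 * (q(A @ A) - q(A) ** 2)
assert lhs <= rhs + 1e-12                 # the improved Holder-McCarthy bound holds
```

For a diagonal A this reduces to the strongly convex refinement of Jensen's inequality for t ↦ tʳ, whose correction term is exactly the variance-type quantity in parentheses.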
In this paper, we obtain some Jensen's and Hermite-Hadamard's type inequalities for lower, upper, and strongly convex functions defined on convex subsets of normed linear spaces. The case of inner product spaces is of interest since in this case the concepts of lower convexity and strong convexity coincide. Applications for univariate functions of a real variable and connections with earlier Hermite-Hadamard type inequalities are also provided.
The Jensen inequality for convex functions holds under the assumption that all of the included weights are nonnegative. If we allow some of the weights to be negative, such an inequality is called the Jensen-Steffensen inequality for convex functions. In this paper we prove the Jensen-Steffensen inequality for strongly convex functions.
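The Jensen-Steffensen setting can be illustrated numerically: with monotone nodes and weights whose partial sums stay between 0 and their total, Jensen's inequality survives even when one weight is negative. The data below is an illustrative assumption; the paper's strongly convex version tightens the right-hand side further.

```python
# Jensen-Steffensen hypotheses: x_1 <= ... <= x_n, and the partial sums P_k of the
# weights satisfy 0 <= P_k <= P_n with P_n > 0 -- individual weights may be negative.
xs = [0.0, 1.0, 2.0]
ps = [2.0, -1.0, 2.0]                     # partial sums: 2, 1, 3 -- all in [0, 3]

Pn = sum(ps)
partial = [sum(ps[:k + 1]) for k in range(len(ps))]
assert all(0 <= P <= Pn for P in partial) and Pn > 0

f = lambda t: t ** 2                      # strongly convex test function
mean = sum(p * x for p, x in zip(ps, xs)) / Pn                # = 1.0
assert f(mean) <= sum(p * f(x) for p, x in zip(ps, xs)) / Pn  # 1.0 <= 7/3
```

Dropping the partial-sum condition (e.g. making the first weight negative instead) breaks the inequality, which is why the Steffensen hypotheses are essential.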
Counterparts of the converse Jensen inequality for strongly convex and strongly midconvex functions are presented. The Jessen inequality and converse Jessen inequality (involving linear positive normalized functionals) for strongly convex functions are also given. (C) 2015 Elsevier Inc. All rights reserved.
The aim of this paper is to find a convenient and practical method to approximate a given real-valued function of multiple variables by linear operators that approximate all strongly convex functions from above (or from below). Our main contribution is to use this additional knowledge to derive sharp error estimates for continuously differentiable functions with Lipschitz continuous gradients. More precisely, we show that the error estimates based on such operators are always controlled by the Lipschitz constants of the gradients, the modulus of strong convexity, and the error associated with using the quadratic function; see Theorems 3.1 and 3.3. Moreover, assuming the function we want to approximate is also strongly convex, we establish sharp upper as well as lower refined bounds for the error estimates; see Corollaries 3.2 and 3.4. As an application, we define and study a class of linear operators on an arbitrary polytope which approximate strongly convex functions from above. Finally, we present a numerical example illustrating the proposed method. (C) 2014 Elsevier Inc. All rights reserved.
In this paper we deal with functions related to generalized convexity and refine Jensen type inequalities satisfied by such functions. Specifically, we extend inequalities satisfied by uniformly convex functions, strongly convex functions, as well as superquadratic functions.
The Delayed Weighted Gradient Method (DWGM) is a two-step gradient algorithm that is efficient for the minimization of large scale strictly convex quadratic functions. It has orthogonality properties that make it competitive with the Conjugate Gradient (CG) method. Both methods compute two step-sizes in sequence (CG minimizes the objective function, DWGM the gradient norm) along two search directions built from first-order information at the current and previous iterations. The objective of this work is to accelerate the recently developed extension of DWGM to nonquadratic strongly convex minimization problems. Our idea is to define the step-sizes of DWGM through a single two-dimensional convex quadratic optimization problem, computing them simultaneously. Convergence of the resulting algorithm is analyzed. Comparative numerical experiments illustrate the effectiveness of our approach.
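As a reference point for the comparison above, the classical CG iteration for a strictly convex quadratic f(x) = ½xᵀAx − bᵀx can be sketched as follows; DWGM's own two-step update is not reproduced here, and the test problem is an illustrative assumption:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A."""
    x = x0.copy()
    r = b - A @ x                         # residual; equals -grad f(x)
    d = r.copy()                          # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)  # conjugacy-preserving update
        d = r_new + beta * d
        r = r_new
    return x

# Illustrative SPD test problem.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
assert np.allclose(A @ x, b)
```

Each iteration chooses its step-size by an exact one-dimensional minimization; the acceleration idea described in the abstract instead couples DWGM's two step-sizes into one two-dimensional quadratic subproblem.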