GENERALIZED GRADIENTS AND APPLICATIONS

1975 | FRANK H. CLARKE
This paper presents a general theory of generalized gradients and normals for a wide class of functions and closed sets. It shows how these concepts extend and unify the usual gradients and normals of smooth functions and manifolds, as well as the subdifferentials and normal cones of convex analysis. A key result is a theorem on the differentiability of functions of the form $ \max\{g(x,u):u\in U\} $, which generalizes and extends results by Danskin and others. The theory is applied to characterize flow-invariant sets, yielding theorems of Bony and Brezis as corollaries.

The paper begins by introducing generalized gradients for locally Lipschitz functions: functions that are not necessarily differentiable everywhere but, by Rademacher's theorem, are differentiable almost everywhere. The generalized gradient at a point is defined as the convex hull of the limits of gradients taken at points of differentiability approaching that point. It is shown to be a nonempty, convex, compact set that depends upper semicontinuously on the point.

The paper then turns to the generalized directional derivative and its relationship to the generalized gradient, proving that the generalized directional derivative is the support function of the generalized gradient. In particular, the generalized directional derivative is convex in the direction.

The theory is then applied to max functions: under suitable conditions, the generalized gradient of a pointwise maximum can be expressed in terms of the generalized gradients of the individual functions. This result is used to characterize the generalized gradient of the distance function to a closed set, which is shown to be related to the normal cone to the set. Normals to arbitrary closed sets are then defined via the closure of the set of all limits of generalized gradients of the distance function to the set.
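In the notation that has since become standard for this theory (a sketch of the definitions summarized above, with $f$ Lipschitz near $x \in \mathbb{R}^n$ and $S$ denoting the null set where $\nabla f$ fails to exist), the construction and the support-function relation read:

```latex
\partial f(x) = \operatorname{co}\left\{ \lim_{i\to\infty} \nabla f(x_i) \,:\, x_i \to x,\ x_i \notin S \right\}

f^{\circ}(x;v) = \limsup_{y \to x,\ t \downarrow 0} \frac{f(y+tv) - f(y)}{t}
              = \max\left\{ \langle \zeta, v \rangle : \zeta \in \partial f(x) \right\}
```

The first equation is the convex-hull-of-limiting-gradients definition described above; the second states that $f^{\circ}(x;\cdot)$ is the support function of $\partial f(x)$, from which its convexity in $v$ is immediate.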
For smooth manifolds and convex sets, this normal cone coincides with the usual normal space and normal cone, respectively. The tangent cone to a set is then defined by duality with the normal cone, and it is shown that a vector is tangent to a set at a point if and only if the directional derivative of the distance function to the set, in that direction, is zero there.

These tools are applied to flow-invariant sets: a closed set is flow-invariant if and only if the set of possible velocities at each of its points is tangent to the set at that point. Theorems of Bony and Brezis on flow invariance follow as corollaries.

The paper concludes by discussing implications for optimization, where the generalized gradient provides a natural setting for very general problems, and notes that the theory extends to arbitrary locally convex linear topological spaces, which provide a more convenient setting for deriving general results.
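The distance-function characterization of tangency lends itself to a quick numerical check. The sketch below is not from the paper; it uses the closed unit disk in $\mathbb{R}^2$ as the set (a convex set, so an ordinary one-sided difference quotient of the distance function already captures the directional derivative): at the boundary point $(1,0)$, tangential and inward directions give derivative $0$, while the outward normal does not.

```python
import math

def dist_to_disk(p):
    # Euclidean distance from p to the closed unit disk in R^2.
    return max(0.0, math.hypot(p[0], p[1]) - 1.0)

def dir_deriv(p, v, t=1e-7):
    # One-sided difference quotient (d(p + t*v) - d(p)) / t for small t > 0.
    q = (p[0] + t * v[0], p[1] + t * v[1])
    return (dist_to_disk(q) - dist_to_disk(p)) / t

x = (1.0, 0.0)                                # a boundary point of the disk
assert abs(dir_deriv(x, (0.0, 1.0))) < 1e-5   # tangential direction: derivative ~ 0
assert abs(dir_deriv(x, (-1.0, 0.0))) < 1e-5  # inward direction: derivative = 0
assert dir_deriv(x, (1.0, 0.0)) > 0.5         # outward normal: derivative ~ 1, not tangent
```

For an autonomous system $\dot{x} = g(x)$, the flow-invariance criterion then amounts to checking that $g(x)$ passes this tangency test at every boundary point of the set.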
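The opening construction can be made concrete in one dimension. The sketch below is not code from the paper (which is purely theoretical): for $f(x) = |x|$, the gradients near $0$ are $\pm 1$, so the generalized gradient at $0$ is the interval $[-1,1]$, and the support-function relation gives a directional derivative of $|v|$ in direction $v$. Since $|x| = \max\{x, -x\}$ with both pieces active at $0$, this also illustrates the max-function result: the generalized gradient there is the convex hull of the active gradients.

```python
def f(x):
    # f(x) = |x| is Lipschitz but not differentiable at 0.
    return abs(x)

def grad(x, h=1e-8):
    # Central difference; valid at points where f is differentiable (x != 0).
    return (f(x + h) - f(x - h)) / (2 * h)

# Gradients at differentiable points approaching 0 from either side.
limiting_gradients = {round(grad(x)) for x in (1e-3, 1e-4, -1e-3, -1e-4)}
assert limiting_gradients == {-1, 1}      # convex hull: the interval [-1, 1]

# Support-function relation at 0: the derivative in direction v is
# max{z * v : z in [-1, 1]} = |v|, attained at one of the endpoints.
for v in (2.0, -3.0, 0.5):
    assert max(z * v for z in (-1.0, 1.0)) == abs(v)
```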