IJPAM: Volume 102, No. 3 (2015)

A SUBGRADIENT METHOD FOR NON-SMOOTH
VECTOR OPTIMIZATION PROBLEMS

Arunima Kumari
Department of Mathematics
Bhagwan Parshuram Institute of Technology
Guru Govind Singh Indraprastha University
Rohini, Delhi 110089, INDIA


Abstract. Vector optimization problems are a significant extension of scalar optimization and have a wide range of applications in economics, decision theory, game theory, information theory, and optimal control theory. Unlike general subgradient methods, bundle methods, and gradient sampling methods for nonsmooth vector optimization problems, which rely on a scalarization approach, this paper proposes a subgradient method that avoids the usual scalarization and works directly with the vector-valued objective to minimize a non-differentiable convex function. A general subgradient method for non-smooth convex optimization is proposed that includes regularization and interior-point variants of Newton's method. The algorithm builds a sequence of efficient points in the interior of the epigraph of the objective function that satisfies the KKT conditions. Under suitable conditions, it is proved that the sequence generated by the algorithm converges to an $\epsilon$-efficient point.
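As background for readers unfamiliar with subgradient schemes, the scalar case can be sketched in a few lines. This is a minimal illustrative sketch of a classical subgradient method with a diminishing step size, not the paper's vector-valued algorithm; the function names, the $\ell_1$ example, and the step rule $1/\sqrt{k}$ are assumptions made for illustration.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=500):
    """Basic subgradient method for a nonsmooth convex f.

    Because f(x_k) need not decrease monotonically along subgradient
    steps, the best iterate found so far is tracked and returned.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, steps + 1):
        g = subgrad(x)                     # any subgradient of f at x
        x = x - (1.0 / np.sqrt(k)) * g     # diminishing step size 1/sqrt(k)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Illustrative example: f(x) = ||x - c||_1 is convex and nonsmooth,
# minimized at c; one valid subgradient is sign(x - c).
c = np.array([1.0, -2.0])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)
x_star, f_star = subgradient_method(f, subgrad, np.zeros(2))
```

With a diminishing, non-summable step size such as $1/\sqrt{k}$, the best objective value converges to the minimum for convex $f$ with bounded subgradients; the vector-valued method of the paper replaces this scalar descent with steps that remain efficient with respect to the ordering cone.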

Received: December 1, 2015

AMS Subject Classification: 90C29, 90C30

Key Words and Phrases: subgradient method, nonsmooth optimization, vector optimization, $\epsilon$-efficient points





DOI: 10.12732/ijpam.v102i3.13

Source:
International Journal of Pure and Applied Mathematics
ISSN printed version: 1311-8080
ISSN on-line version: 1314-3395
Year: 2015
Volume: 102
Issue: 3
Pages: 563 - 578



This work is licensed under the Creative Commons Attribution International License (CC BY).