Mathematical programming formulations for neural combinatorial optimization algorithms

Kiichi Urahama

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Analog neural network algorithms for solving combinatorial optimization problems are analyzed on the basis of the saddle point theorem of mathematical programming theory. A generalized Hopfield network scheme is shown to be a gradient system that searches for a saddle point of the Lagrange function of the continuous relaxation of a 0-1 integer program. This derivation of neural network algorithms provides an interpretation of deterministic annealing procedures in terms of mathematical programming theory. The generalized Hopfield network scheme is also interpreted as a gradient projection method in a Riemannian space. The Lagrange function is shown to be a Lyapunov function for this gradient algorithm, whose convergence is therefore guaranteed by Lyapunov stability theorems.
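
For concreteness, a minimal numerical sketch of the kind of dynamics the abstract describes is given below. It assumes an Arrow-Hurwicz-style descent/ascent on the Lagrangian of a relaxed quadratic 0-1 program with a single cardinality constraint sum(x) = k; the toy instance, the sigmoid output map x = sigmoid(u / T), and all parameter values are illustrative assumptions, not details taken from the paper.

import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def hopfield_anneal(Q, c, k, T0=1.0, T_min=0.05, cool=0.8,
                    eta=0.01, inner_steps=2000, seed=0):
    """Saddle-point (descent/ascent) dynamics on the Lagrangian
        L(x, lam) = x^T Q x + c^T x + lam * (sum(x) - k)
    of the continuous relaxation x in [0, 1]^n of a 0-1 program.
    Neurons keep internal potentials u with outputs x = sigmoid(u / T);
    lowering T pushes x toward the {0, 1} corners (deterministic annealing).
    All schedules and step sizes here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n = len(c)
    u = 0.01 * rng.standard_normal(n)  # internal neuron potentials
    lam = 0.0                          # multiplier for the constraint sum(x) = k
    T = T0
    while T > T_min:
        for _ in range(inner_steps):
            x = sigmoid(u / T)
            grad_x = (Q + Q.T) @ x + c    # d/dx of the quadratic objective
            u -= eta * (grad_x + lam)     # descend in the primal variables
            lam += eta * (np.sum(x) - k)  # ascend in the dual variable
        T *= cool                         # cool the temperature
    return sigmoid(u / T), lam

# Toy instance (hypothetical): choose k = 2 of 4 items to minimize x^T Q x + c^T x.
Q = np.array([[0., 3., 1., 4.],
              [3., 0., 2., 1.],
              [1., 2., 0., 5.],
              [4., 1., 5., 0.]])
c = np.array([1.0, -2.0, 0.5, 1.5])
x, lam = hopfield_anneal(Q, c, k=2)
print("relaxed solution:", np.round(x, 3), " multiplier:", round(lam, 3))

Per the abstract, the Lagrange function itself acts as the Lyapunov function for such dynamics, which is what guarantees their convergence; lowering T realizes the deterministic annealing schedule.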

Original language: English
Pages (from-to): 353-364
Number of pages: 12
Journal: Journal of Artificial Neural Networks
Volume: 2
Issue number: 4
Publication status: Published - Dec 1 1995

Fingerprint

Mathematical programming
Combinatorial optimization
Programming theory
Neural networks
Annealing

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Mathematical programming formulations for neural combinatorial optimization algorithms. / Urahama, Kiichi.

In: Journal of Artificial Neural Networks, Vol. 2, No. 4, 01.12.1995, pp. 353-364.

Research output: Contribution to journal › Article

@article{315dd521dfbc4bf1a7de181406b11d70,
title = "Mathematical programming formulations for neural combinatorial optimization algorithms",
author = "Kiichi Urahama",
year = "1995",
month = "12",
day = "1",
language = "English",
volume = "2",
pages = "353--364",
journal = "Journal of Artificial Neural Networks",
issn = "1073-5828",
number = "4",

}

TY - JOUR

T1 - Mathematical programming formulations for neural combinatorial optimization algorithms

AU - Urahama, Kiichi

PY - 1995/12/1

Y1 - 1995/12/1

UR - http://www.scopus.com/inward/record.url?scp=0029462616&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0029462616&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0029462616

VL - 2

SP - 353

EP - 364

JO - Journal of Artificial Neural Networks

JF - Journal of Artificial Neural Networks

SN - 1073-5828

IS - 4

ER -