This paper examines optimal control processes represented by stochastic sequential dynamical systems involving a parameter, obtained under unique-solution conditions with respect to constant input values. The principle of optimality is then proven for the considered process, and the Bellman equation is constructed by applying the dynamic programming method. Moreover, a particular set, defined as the accessible set, is introduced to show the existence of an optimal control. Finally, the need for further research is discussed.
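As a generic illustration of the dynamic programming method mentioned in the abstract (not the paper's specific model), the Bellman recursion for a finite-horizon stochastic sequential system can be sketched as below. The state set, action set, dynamics, stage cost, and disturbance distribution are hypothetical toy choices made only for this sketch:

```python
# Hypothetical toy instance: finite state, action, and disturbance sets.
STATES = [0, 1, 2]
ACTIONS = [-1, 0, 1]
NOISE = [(0, 0.7), (1, 0.3)]  # disturbance value w with probability p
HORIZON = 3

def f(x, u, w):
    """Toy dynamics x_{t+1} = f(x_t, u_t, w_t), clamped to the state set."""
    return max(0, min(2, x + u + w))

def cost(x, u):
    """Toy stage cost: penalize distance from state 1 and control effort."""
    return (x - 1) ** 2 + abs(u)

def bellman_backup(values_next):
    """One step of the Bellman recursion:
    V_t(x) = min_u [ c(x, u) + E_w V_{t+1}(f(x, u, w)) ]."""
    values, policy = {}, {}
    for x in STATES:
        best = None
        for u in ACTIONS:
            expected = cost(x, u) + sum(p * values_next[f(x, u, w)]
                                        for w, p in NOISE)
            if best is None or expected < best:
                best, policy[x] = expected, u
        values[x] = best
    return values, policy

def solve():
    """Backward induction over the horizon; terminal cost taken as zero.

    Returns the value function and greedy policy at the initial stage.
    """
    V = {x: 0.0 for x in STATES}
    pi = {}
    for _ in range(HORIZON):
        V, pi = bellman_backup(V)
    return V, pi
```

Backward induction of this kind is the standard computational counterpart of the principle of optimality: each backup step assumes the remaining tail of the process is already solved optimally.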
Keywords: Optimal control process, Bellman's equation, Dynamic programming, Stochastic sequential dynamical systems
| Primary Language | English |
| --- | --- |
| Subjects | Mathematics |
| Section | Articles |
| Authors | |
| Publication Date | August 31, 2021 |
| Published Issue | Year 2021, Volume: 10, Issue: 2 |