I have an optimization problem that I want to solve with the nonlinear conjugate gradient method. It is defined as follows:

$$\operatorname{argmin} f(X)$$

Where

$$f(X)=\sum_{s=1}^{T} \left\| F_s X(:,s)-Y(:,s)\right\|_{2}$$
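For concreteness, this is how I evaluate $f$ in code (a NumPy sketch; the shapes, the list-of-matrices representation of $F_s$, and the random data are my assumptions):

```python
import numpy as np

def f(X, F, Y):
    """Objective: sum_s || F_s @ X[:, s] - Y[:, s] ||_2.

    Assumed shapes: F is a list of T matrices, F[s] of shape (m, n);
    X is (n, T); Y is (m, T). One column of X per term in the sum.
    """
    return sum(np.linalg.norm(F[s] @ X[:, s] - Y[:, s])
               for s in range(X.shape[1]))

# Tiny example with hypothetical random data
rng = np.random.default_rng(0)
T, m, n = 3, 4, 5
F = [rng.standard_normal((m, n)) for _ in range(T)]
X = rng.standard_normal((n, T))
Y = rng.standard_normal((m, T))
print(f(X, F, Y))
```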

I want to obtain the step size using the standard Wolfe (curvature) condition. It is defined as: $$g(x_k +\alpha_k d_k )^T d_k \ge \sigma g_k ^T d_k$$

Where $g_k =\nabla_x f(x_k)$ and $d_k =g_k +\beta_k d_{k-1}$. However, this condition is defined for the case where $x$ is a vector. In my problem it is a matrix. Is there a similar condition for matrices, or how can I adapt the condition? Have a pleasant day!
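To make the question concrete, here is what I would try (not sure it is correct): replace the Euclidean inner product $g^T d$ with the Frobenius inner product of the gradient and direction matrices, which is equivalent to vectorizing everything. A minimal sketch, where the `sigma` value and the function/argument names are placeholders of mine:

```python
import numpy as np

def frob_inner(A, B):
    # Frobenius inner product <A, B> = trace(A^T B) = elementwise
    # product summed; reduces to the ordinary dot product a^T b
    # when A and B are vectors.
    return np.sum(A * B)

def curvature_ok(grad_new, grad_k, D_k, sigma=0.1):
    # Matrix analogue of  g(x_k + a_k d_k)^T d_k >= sigma * g_k^T d_k,
    # with each dot product replaced by the Frobenius inner product.
    # sigma is a placeholder value for the Wolfe parameter.
    return frob_inner(grad_new, D_k) >= sigma * frob_inner(grad_k, D_k)
```

Is this vectorization/Frobenius view the right way to carry the condition over to the matrix case?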
