Table rotated 90 degrees

Why is my table rotated 90 degrees? Thanks in advance!

\documentclass[8pt]{extarticle}
\usepackage{array}
\usepackage{pdflscape}
\usepackage{comment}
\usepackage{graphicx}
\usepackage{easytable}
\usepackage{enumitem}
\usepackage{amssymb}
\usepackage{mathtools, nccmath, esdiff}
\usepackage{rotating}
\usepackage{makecell}
\renewcommand{\theadfont}{\normalsize\bfseries}
\usepackage{booktabs}
\usepackage{multirow,hhline,graphicx,array, caption, tabularx}
\usepackage[margin=0.5in]{geometry}
\newcommand{\x}{\mathbf{x}}
\newcommand{\g}{\mathbf{g}}
\newcommand{\h}{\mathbf{h}}
\newcommand{\0}{\mathbf{0}} %<- that's not a good idea
\newcolumntype{M}[1]{>{\centering\arraybackslash}m{#1}}
\makeatletter
\newcommand*{\compress}{\@minipagetrue}
\makeatother
\newlength{\TXcolwd}
\begin{document}
\aboverulesep=0ex
\belowrulesep=0ex
\renewcommand{\theadalign}{tc}
\newgeometry{margin=0.1cm}
\begin{landscape}
\null\vfill
% Table generated by Excel2LaTeX from sheet 'Sheet1'
\begin{table}[htbp]
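% note: this floating table environment inside landscape is what Answer 1 below identifies as the cause of the problem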
\setlist[itemize, 1]{wide=0pt, leftmargin=*, before=\compress, after=\vspace*{\dimexpr\topsep-\baselineskip}}
\setlength{\extrarowheight}{4pt}
\centering
\caption{Summary of NLP Numerical Methods: $x_{k+1} = x_k + \alpha_k s_k$}
\begin{tabularx}{\linewidth}{|c|c|X|X|X|}% }{|p{0.7em}|p{0.4em}|X|X|X|}% p{0.7em}

\toprule
\multirowcell{20}{\rotatebox{90}{Search Direction: $s_k$}}%
&
\multirowcell{9}{\rotatebox{90}{First Order}}
&
Steepest Descent:
\newline
Move in the direction of the negative gradient (steepest downhill direction locally)
\[
x_{k+1} = x_k - \alpha_k \nabla f(x_k)
\]
&
Generalized Reduced Gradient (First Order):
\newline
Move in the direction of the negative reduced gradient in the reduced space; return to the constraint boundary in the state space.
\begin{gather*}
d_{k+1} = d_k - \alpha_k\nabla_d f_R(x_k)^T
\\
[s_{k+1}]_0 = s_k - \alpha_k\bigg(\frac{\partial s}{\partial d}\bigg)_k \bigg(\frac{\partial f_R}{\partial d}\bigg)^T_k
\\
[s_{k+1}]_{j+1} = [s_{k+1}]_j - \bigg[\bigg(\frac{\partial h}{\partial s}\bigg)^{-1}_{k+1}h_{k+1} \bigg]_j
\\
s_{k+1} = [s_{k+1}]_J \text{ where the inner iterations } j \text{ converge at } J
\end{gather*}
&
\multicolumn{1}{c|}{N/A} \\
\cmidrule{2-5}%
&
\multirowcell{11}{\rotatebox{90}{Second Order}} %
&
Newton’s Method:
\newline
Move in the direction of the stationary point of the second order approximation (which is a minimum of the second order approximation if the function is convex)
\[
x_{k+1} = x_k - \alpha_k \nabla^2 f(x_k)^{-1}\nabla f(x_k)^T
\]
BFGS:
\newline
Move in the direction of the minimum of the second order approximation using a positive definite approximation of the Hessian
\begin{gather*}
H_{k+1} = H_k + \bigg[\frac{\Delta y \Delta y^T}{\Delta y^T\Delta x} \bigg]_k - \bigg[\frac{(H\Delta x)(H\Delta x)^T}{\Delta x^TH\Delta x}\bigg]_k
\\
\text{Where }\Delta x = \alpha_k s_k, \Delta y= \nabla f(x_{k+1})-\nabla f(x_k)
\end{gather*}

The BFGS update maintains positive definiteness if $H_0$ is positive definite and each line search satisfies the Wolfe conditions.
&
Generalized Reduced Gradient (Second Order):
\newline
Move in the direction of the stationary point of the second order approximation in the reduced space, then return to the constraint boundary in the state space.
\begin{gather*}
d_{k+1} = d_k - \alpha_k\nabla^2_d f_R(x_k)^{-1}\nabla_d f_R(x_k)^T
\\
[s_{k+1}]_0 = s_k - \bigg[\bigg(\frac{\partial s}{\partial d}\bigg)_k \partial d + \frac{1}{2}\partial d^T \bigg(\frac{\partial^2s}{\partial d^2}\bigg)_k\partial d\bigg]
\\
[s_{k+1}]_{j+1} = [s_{k+1}]_j - \bigg[\bigg(\frac{\partial h}{\partial s}\bigg)^{-1}_{k+1}h_{k+1} \bigg]_j
\\
s_{k+1} = [s_{k+1}]_J \text{ where the inner iterations } j \text{ converge at } J
\end{gather*}

&
\textbf{Sequential Quadratic Programming:}
\newline
Move in the direction of the solution of the second order approximation. The Lagrange-Newton equations can be solved directly for $[s_k;\Delta\lambda]$ at each iteration if the matrix is invertible.
\[
\begin{bmatrix}
\nabla^2_x L & \nabla_x h^T\\
\nabla_x h & 0
\end{bmatrix}_k
\begin{bmatrix}
s_k \\
\Delta\lambda
\end{bmatrix}_k
= -
\begin{bmatrix}
\nabla_x L^T\\
h
\end{bmatrix}_k
\]
Or it can be formulated as a quadratic programming subproblem.
\[
\text{minimize } f_k+\nabla_x f_k s_k+\frac{1}{2} s_k^\top \nabla_x^2 L_k s_k
\]
\[
\text{with respect to } s_k, \text{ subject to } h_k+\nabla_x h_k s_k = 0
\]
BFGS: Use a positive definite approximation of $\nabla_x^2L$
\[
\begin{bmatrix}
H_k & \nabla_x h^T\\
\nabla_x h & 0
\end{bmatrix}_k
\begin{bmatrix}
s_k \\
\Delta\lambda
\end{bmatrix}_k
= -
\begin{bmatrix}
\nabla_x L^T\\
h
\end{bmatrix}_k
\]
Update the Hessian approximation as before only if the gradients at the new and old points imply enough positive curvature: $\Delta x^\top \Delta y>0$.
Active Set Strategy: first assume the inequality constraints $g(x)$ are inactive and ignore them. In each iteration: (1) if a constraint assumed inactive is violated, add it to the active set and treat the active set as equality constraints; (2) if a constraint in the active set has a Lagrange multiplier estimate of zero, remove it from the active set.
\smallskip
\\
\midrule
\multirowcell{15}{\rotatebox{90}{Line Search: $\alpha_k$}}
&
\multirowcell{15}{\rotatebox{90}{}}
&
\multicolumn{2}{p{57em}|}{
\textbf{Inexact Line Search:}
\newline
\textbf{Wolfe Conditions:} Stop line search when a point is found that satisfies the Armijo condition
$f(x_k+\alpha_k s_k) \leq f(x_k)+c_1 \nabla f(x_k) s_k \alpha_k$ and the curvature condition $\nabla f(x_k+\alpha_k s_k) s_k \geq c_2 \nabla f(x_k) s_k$
\newline
(Typical: $c_1 = 10^{-4}$, $c_2 = 0.9$)
\newline
\textbf{Second order optimal step size}
\newline
Optimal step size of the second order approximation, often used as an initial guess of $\alpha$ for line search:
\newline
$\alpha= \bigg(\frac{-\nabla f(x_k)s_k}{s_k^\top \nabla^2 f(x_k)s_k}\bigg)$, which equals 1 for Newton's Method on convex problems
\newline
\textbf{A Bisection Line Search Method:}
\begin{itemize}
\item Initial guess for $\alpha$: use the second order optimal step size
\item Increase $\alpha$ until the curvature condition is satisfied. Then $\alpha_{LO}=0, \alpha_{HI}=\alpha$ is a bracket that contains a local minimum along $s_k$.
\item Bisect until the Wolfe conditions are satisfied:
While the Wolfe conditions are not satisfied, if Armijo is not satisfied set $\alpha_{HI}=\alpha, \alpha=\frac{\alpha_{HI}+\alpha_{LO}}{2}$; else the curvature condition is not satisfied, so set $\alpha_{LO}=\alpha, \alpha = \frac{\alpha_{HI}+\alpha_{LO}}{2}$.

\end{itemize}}
&
Penalty Function:
\newline
Use a penalty function during line search to avoid substantial violation of constraints. With the following choice of penalty weight, the direction $s_k$ is a descent direction.
\[
\phi_k(\alpha_k) = f(x_k+\alpha_k s_k) + w_k\|h_+(x_k+\alpha_k s_k)\|_1
\]
\[
\text{where } w_k = \|\lambda_{k+1}\|_\infty+\epsilon
\]
Simple line search:
\[
\text{while } \phi_k(\alpha_k)>\phi_k(0):\ \alpha_k=\alpha_k/2
\]
\\
\bottomrule
\end{tabularx}%
\label{tab:addlabel}%
\end{table}%
\vfill
\end{landscape}
\restoregeometry

\end{document}

Answer 1

If you want to rotate the table with the landscape environment provided by the pdflscape package, do not use the floating table environment inside landscape. Instead, just use tabular (or tabularx, ...) together with the \captionof command from the capt-of package, as in the following example:

\documentclass[8pt]{extarticle}
\usepackage{pdflscape}
\usepackage{tabularx}
\usepackage{capt-of}
\begin{document}
\begin{landscape}
  \centering
  \captionof{table}{Summary of NLP Numerical Methods: $x_{k+1} = x_k + \alpha_k s_k$}
  \label{tab:addlabel}
  \begin{tabularx}{\linewidth}{|c|c|X|X|X|}
    table contents
  \end{tabularx}
\end{landscape}
\end{document}
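If you would rather keep the floating table markup (for example, to reuse the Excel2LaTeX output unchanged), a common alternative is the [H] specifier from the float package, which turns the float into a fixed, here-placed block so it cannot drift off the rotated pages. A minimal sketch, assuming only that the float package is loaded:

\documentclass[8pt]{extarticle}
\usepackage{pdflscape}
\usepackage{tabularx}
\usepackage{float}% provides the [H] placement specifier
\begin{document}
\begin{landscape}
\begin{table}[H]% [H] = place the table exactly here, inside landscape
  \centering
  \caption{Summary of NLP Numerical Methods: $x_{k+1} = x_k + \alpha_k s_k$}
  \label{tab:addlabel}
  \begin{tabularx}{\linewidth}{|c|c|X|X|X|}
    table contents
  \end{tabularx}
\end{table}
\end{landscape}
\end{document}

Either way the point is the same: the material must not float, so it stays on the pages that the landscape environment rotates.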
