Answer 1
There are two possibilities here: one uses an `aligned` environment nested inside `equation`, together with `\MoveEqLeft` from `mathtools`; the other uses `flalign` and an *ad hoc* alignment point. Both use the medium-size fractions (`\mfrac`) from `nccmath` for the numerical fraction, since I think it looks better:
\documentclass{article}
\usepackage{nccmath}
\usepackage{mathtools}
\begin{document}
\begin{equation}
\label{off-margin}
\begin{aligned}
\MoveEqLeft -\ell (\beta _{0},\boldsymbol{\beta }) + \lambda \|\boldsymbol{\beta }\|_{1} = \\
& \biggl[-\mfrac{1}{n}\biggl(\sum_{i = 1}^{n} (\beta _{0}x_{i0} + \mathbf{x}^T\boldsymbol{\beta })y_i-\log(1 + \exp(\beta _{0}x_{i0} + \mathbf{x}^T\boldsymbol{\beta }))\biggr)\biggr] \\
\text{where} &\phantom{ + } \boldsymbol{\beta } = (\beta _{1},\dots, \beta _{p})^{T}\text{ and } x_{i0} = 1 \text{ for all }i.
\end{aligned}
\end{equation}
Next, we need to show that the negative log-likelihood function is convex. Convexity implies that any local minimum is also the global minimum…
\begin{flalign}
\label{off-margin1}
-\ell (\beta _{0},\boldsymbol{\beta }) & + \lambda \|\boldsymbol{\beta }\|_{1} = \notag \\
&\phantom{ + } \biggl[-\mfrac{1}{n}\biggl(\sum_{i = 1}^{n} (\beta _{0}x_{i0} + \mathbf{x}^T\boldsymbol{\beta })y_i-\log(1 + \exp(\beta _{0}x_{i0} + \mathbf{x}^T\boldsymbol{\beta }))\biggr)\biggr] \\%
\text{where} &\phantom{ + } \boldsymbol{\beta } = (\beta _{1},\dots, \beta _{p})^{T}\text{ and } x_{i0} = 1 \text{ for all }i. \notag
\end{flalign}
\end{document}
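A small note on the first variant: if I recall the `mathtools` documentation correctly, `\MoveEqLeft` accepts an optional argument giving the overhang in em (the default is 2), so writing `\MoveEqLeft[3]` on the first line of the `aligned` block pushes the continuation lines a bit further to the right when the left-hand side is long. The numbers in the sketch below are only illustrative:

\begin{equation}
\begin{aligned}
\MoveEqLeft[3] -\ell (\beta _{0},\boldsymbol{\beta }) + \lambda \|\boldsymbol{\beta }\|_{1} = \\
& \dots
\end{aligned}
\end{equation}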