I have the following figure containing algorithm pseudocode.
\documentclass{article}
\usepackage{graphicx}
\usepackage{amsmath}
\usepackage{psfrag}
\usepackage{amssymb}
\newcommand{\argmin}{\mathop{\mathrm{argmin}}}
\newcommand{\infl}{\eta}
\newcommand{\Ind}{\mathrm{I}}
\date{June 2021}
\begin{document}
\begin{figure}[bth]
\begin{center}
\fbox{\parbox{11.4cm}{
\begin{itemize}
\item[] \textbf{Input:} Data set ${\bf D}= \{ ({\bf x}_1,y_1), ({\bf x}_2,y_2), \ldots,
({\bf x}_m,y_m) \};$
\item[] \hspace{1cm} Base learning algorithm $E$;
\item[] \hspace{1cm} Number of learning rounds $T$;
\item[] \textbf{Process:}
\begin{enumerate}
\item $D_1({\bf x}) = 1/m$ \textit{ \# initialize the weight distribution}
\item \textbf{\textit{for}} $t = 1, \ldots, T;$
\item $ h_t = E({\bf D},D_t); $ \textit{\# train a classifier $h_t$ from ${\bf D}$ under distribution $D_t$}
\item $\epsilon_t = P_{{\bf x}\sim D_t}\left( h_t({\bf x}) \neq f({\bf x}) \right);$ \textit{\# evaluate the error of $h_t$}
\item \textbf{\textit{if}} $\epsilon_t > 0.5 $ \textbf{\textit{then break}}
\item $\alpha_t = \frac{1}{2} \ln\left ( \frac{1 - \epsilon_t}{\epsilon_t} \right ); $ \textit{\# determine the weight of $h_t$}
\item $D_{t+1}({\bf x}) = \frac{D_t({\bf x})}{Z_t} \times \begin{cases}
\exp(-\alpha_t) & \text{if } h_t({\bf x}) = f({\bf x})\\
\exp(\alpha_t) & \text{if } h_t({\bf x}) \neq f({\bf x})
\end{cases} $ \\
\hspace{1.2cm} $ = \frac{D_t({\bf x}) \exp(-\alpha_t f({\bf x}) h_t({\bf x}))}{Z_t}$ \\
\textit{\# update the distribution, where $Z_t$ is a normalization factor which enables $D_{t+1}$ to be a distribution.}
\item \textbf{\textit{end}}
\end{enumerate}
\item[] \textbf{Output:}
$
H({\bf x}) = \mathrm{sign}\left( \sum_{t=1}^{T} \alpha_t h_t({\bf x}) \right) $
\end{itemize}
}}
\end{center}
\caption[]{The AdaBoost algorithm. \label{fig:ABALG}}
\end{figure}
\end{document}
Now I want to put it into a slide I am preparing: \begin{frame}{AdaBoost Algorithm} %code above \end{frame}
I need a way to rescale it so that the last two lines actually appear on the slide (I can leave out the figure caption).
Answer 1
Use \scalebox at a scale of 80%:
\documentclass[11pt]{beamer}
\usetheme{Warsaw}
\usepackage{psfrag}
\usepackage{amssymb}
\newcommand{\argmin}{\mathop{\mathrm{argmin}}}
\newcommand{\infl}{\eta}
\newcommand{\Ind}{\mathrm{I}}
\date{June 2021}
\begin{document}
\begin{frame}
\frametitle{AdaBoost algorithm}
\begin{figure}
\scalebox{0.8}{%
\fbox{\parbox{11.4cm}{%
\begin{itemize}
\item[] \textbf{Input:} Data set ${\bf D}= \{ ({\bf x}_1,y_1), ({\bf x}_2,y_2), \ldots,
({\bf x}_m,y_m) \};$
\item[] \hspace{1cm} Base learning algorithm $E$;
\item[] \hspace{1cm} Number of learning rounds $T$;
\item[] \textbf{Process:}
\begin{enumerate}
\item $D_1({\bf x}) = 1/m$ \textit{ \# initialize the weight distribution}
\item \textbf{\textit{for}} $t = 1, \ldots, T;$
\item $ h_t = E({\bf D},D_t); $ \textit{\# train a classifier $h_t$ from ${\bf D}$ under distribution $D_t$}
\item $\epsilon_t = P_{{\bf x}\sim D_t}\left( h_t({\bf x}) \neq f({\bf x}) \right);$ \textit{\# evaluate the error of $h_t$}
\item \textbf{\textit{if}} $\epsilon_t > 0.5 $ \textbf{\textit{then break}}
\item $\alpha_t = \frac{1}{2} \ln\left ( \frac{1 - \epsilon_t}{\epsilon_t} \right ); $ \textit{\# determine the weight of $h_t$}
\item $D_{t+1}({\bf x}) = \frac{D_t({\bf x})}{Z_t} \times \begin{cases}
\exp(-\alpha_t) & \text{if } h_t({\bf x}) = f({\bf x})\\
\exp(\alpha_t) & \text{if } h_t({\bf x}) \neq f({\bf x})
\end{cases} $ \\
\hspace{1.2cm} $ = \frac{D_t({\bf x}) \exp(-\alpha_t f({\bf x}) h_t({\bf x}))}{Z_t}$ \\
\textit{\# update the distribution, where $Z_t$ is a normalization factor which enables $D_{t+1}$ to be a distribution.}
\item \textbf{\textit{end}}
\end{enumerate}
\item[] \textbf{Output:}
$
H({\bf x}) = \mathrm{sign}\left( \sum_{t=1}^{T} \alpha_t h_t({\bf x}) \right) $
\end{itemize}
}}}
\caption[]{The AdaBoost algorithm. \label{fig:ABALG}}
\end{figure}
\end{frame}
\end{document}
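If the overflow is vertical (the last two lines dropping off the bottom of the frame), a variant of the same idea is to let graphicx compute the factor from a target height instead of picking 0.8 by hand. The sketch below is untested and uses 0.85\textheight as an arbitrary starting value to tune; beamer loads graphicx itself, so \resizebox is already available:

\begin{frame}
\frametitle{AdaBoost algorithm}
\begin{figure}
% Scale the boxed pseudocode so its height is 85% of the text height;
% the ! lets graphicx keep the aspect ratio when choosing the width.
\resizebox{!}{0.85\textheight}{%
\fbox{\parbox{11.4cm}{%
% ... same itemize/enumerate body as in the answer above ...
}}}
\end{figure}
\end{frame}

With a fixed factor such as 0.8 you have to re-check the fit whenever the text changes; a height target keeps the box on the frame automatically, at the cost of possibly shrinking the text more than strictly needed.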