This code causes the caption to be misaligned with the table. Is there any way to fix it?
\documentclass{article}
\usepackage{longtable}
\usepackage{array}
\begin{document}
\begin{center}
\begin{longtable}{|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|>{\tiny}c|}
\caption{All the best classification results achieved by runs on the original data. In these runs [0-1] normalization is also tested.}\label{tab:resultsOriginalAllClassifiers}\\
\hline
Data & Run & Normalize & $TP\_{rate}$ & $FP\_{rate}$ & $TN\_{rate}$ & $FN\_{rate}$ & Precision & Recall & $F\_{measure}$ & AUC\\ \hline\hline \endfirsthead
\hline
Data & Run & Normalize & $TP\_{rate}$ & $FP\_{rate}$ & $TN\_{rate}$ & $FN\_{rate}$ & Precision & Recall & $F\_{measure}$ & AUC\\ \hline\hline \endhead
bt\_10x10\_alldevs & NaiveBayes & none & 0.8 & 0 & 1 & 0.2 & 1 & 0.8 & 0.888889 & 0.94\\ \hline
bt\_10x10\_alldevs & LibSVM & none & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
bt\_10x10\_alldevs & LibSVM & [0,1] & 0.9 & 0.1 & 0.9 & 0.1 & 0.9 & 0.9 & 0.9 & 0.9\\ \hline
bt\_10x10\_alldevs & J48 & none & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
bt\_10x10\_alldevs & RandomForest & none & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
wifi\_5x5\_TF101 & RandomForest & [0,1] & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
\caption{All the best classification results achieved by runs on the original data. In these runs [0,1] normalization is also tested.}
\end{longtable}
\end{center}
\end{document}
Answer 1
The caption is aligned; the table is simply too wide. Also, there is no need to specify \tiny in every column of the preamble; setting it once before the table is enough. The center environment has no effect on longtable. Here I have reduced the inter-column spacing and the width of some of the headings, but the table is still about 3pt too wide, so I set \LTleft and \LTright to let it extend slightly into the margins.
\documentclass{article}
\usepackage{longtable}
\usepackage{array}
\begin{document}
\noindent X\dotfill X
{\tiny
\setlength\tabcolsep{1pt}
\setlength\LTleft{0pt minus 2pt}
\setlength\LTright{0pt minus 2pt}
\begin{longtable}{|c|c|c|c|c|c|c|c|c|c|c|}
\caption{All the best classification results achieved by runs on the original data. In these runs [0-1] normalization is also tested.}\label{tab:resultsOriginalAllClassifiers}\\
\hline
Data & Run & Normalize & $TP\!\_{rate}$ & $FP\!\_{rate}$ & $TN\_{rate}$ & $FN\_{rate}$ & Precision & Recall & $F\!\_{measure}$ & AUC\\ \hline\hline \endfirsthead
\hline
Data & Run & Normalize & $TP\!\_{rate}$ & $FP\!\_{rate}$ & $TN\_{rate}$ & $FN\_{rate}$ & Precision & Recall & $F\!\_{measure}$ & AUC\\ \hline\hline \endhead
bt\_10x10\_alldevs & NaiveBayes & none & 0.8 & 0 & 1 & 0.2 & 1 & 0.8 & 0.888889 & 0.94\\ \hline
bt\_10x10\_alldevs & LibSVM & none & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
bt\_10x10\_alldevs & LibSVM & [0,1] & 0.9 & 0.1 & 0.9 & 0.1 & 0.9 & 0.9 & 0.9 & 0.9\\ \hline
bt\_10x10\_alldevs & J48 & none & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
bt\_10x10\_alldevs & RandomForest & none & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
wifi\_5x5\_TF101 & RandomForest & [0,1] & 1 & 0 & 1 & 0 & 1 & 1 & 1 & 1\\ \hline
\caption{All the best classification results achieved by runs on the original data. In these runs [0,1] normalization is also tested.}
\end{longtable}}
\end{document}
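A note on why the last step works: according to the longtable documentation, \LTleft and \LTright both default to \fill, so a longtable centres itself on the text width and wrapping it in a center environment changes nothing. Replacing that stretch with 0pt minus 2pt pins the table to the margins while letting each side shrink by up to 2pt, so a table up to roughly 4pt wider than \textwidth is absorbed without an overfull \hbox warning. A minimal sketch of just that mechanism (illustrative only, not part of the answer's code):

% Minimal sketch showing how \LTleft/\LTright control a longtable's placement.
\documentclass{article}
\usepackage{longtable}
\begin{document}

% Defaults: \LTleft = \LTright = \fill, i.e. the table centres itself.
\begin{longtable}{|l|l|}
\caption{Centred by default.}\\
\hline
A & B \\ \hline
\end{longtable}

% Remove the stretch and allow 2pt of shrink on each side, as in the answer:
% a table up to about 4pt wider than \textwidth still fits without
% an overfull \hbox warning.
\setlength\LTleft{0pt minus 2pt}
\setlength\LTright{0pt minus 2pt}
\begin{longtable}{|l|l|}
\caption{Pinned to the margins, with a little give.}\\
\hline
A & B \\ \hline
\end{longtable}

\end{document}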