Cross-referencing a LaTeX table in R Markdown

I am trying to cross-reference a LaTeX table in R Markdown. The hyperlink is generated (clicking the ?? jumps to the table), but instead of the table number only ?? is shown. I have tried versions both with and without the raw LaTeX block. I am using pandoc > 2.0 so that raw LaTeX can be used.

How can I cross-reference a raw LaTeX table in R Markdown?

---
header-includes:
   - \usepackage{amsmath, dcolumn, xcolor, hyperref, svg, url, makecell, array, fourier, tabularx, booktabs, caption, float, resizegather, verbatim, threeparttable, pifont, soul}
   - \newcommand{\comm}[1]{\ignorespaces}
output:
  bookdown::pdf_document2:
    keep_tex: yes
    template: template.tex
    fig_caption: true
    citation_package: biblatex
    number_sections: true
toc: true
lot: true
lof: true
graphics: true
biblio-title: References
fontsize: 12pt
geometry: lmargin=3cm, rmargin=2.5cm, tmargin=2.5cm, bmargin=2.5cm
biblio-files: references.bib
classoption: a4paper
---

The gradient boosting method implementations of GBM, XGBoost, LightGBM and CatBoost used in this paper feature different hyperparameters to tune, as shown in Table \@ref(tab:hyperparameter).

\begin{table}[!htbp]
\captionsetup{labelsep=newline, justification=centering}
  \begin{threeparttable}
       \caption{\textit{Hyperparameters tuned for each method}}
     \begin{tabular*}{\linewidth}{l@{\extracolsep{\fill}}cr}
        \toprule
         \multicolumn{1}{l}{\textbf{Parameter}} & \multicolumn{1}{c}{\textbf{Model}} & \multicolumn{1}{c}{\textbf{Range}} \\
        \midrule
        \addlinespace
        \textbf{tree depth} & All &  [1,100]\\ 
        \addlinespace
        \textbf{mtry (see table notes)} & All &  [0,27]\\ 
        \addlinespace
        \textbf{learning rate} & GBM, XGBoost, LightGBM, CatBoost & [0,1]\\
        \addlinespace
        \textbf{loss reduction} & GBM, XGBoost, LightGBM & [0,$\infty$]\\
        \addlinespace
        \textbf{tree depth} & GBM, XGBoost, LightGBM, CatBoost & [1,$\infty$]\\
        \addlinespace
        \textbf{minimal node size} & GBM, XGBoost, LightGBM, CatBoost & [1,$\infty$]\\
        \bottomrule
     \end{tabular*}
    \begin{tablenotes}[flushleft]
      \small
      \item \textit{Note: The upper bound of the hyperparameter \textit{mtry} differs depending on the chosen model. It is 27 for all models except CatBoost, which uses ordered target statistics to encode \textit{factor} variables. The latter, as explained in Section X, does not expand the dataset but encodes the variable itself. Therefore the upper bound of \textit{mtry} for CatBoost is 22 instead of 27.}
    \end{tablenotes}
  \end{threeparttable}
  \label{tab:hyperparameter}
\end{table}


The gradient boosting method implementations of GBM, XGBoost, LightGBM and CatBoost used in this paper feature different hyperparameters to tune, as shown in Table \@ref(tab:hyperparameter_chunk).

```{=latex}    
    \begin{table}[!htbp]
    \captionsetup{labelsep=newline, justification=centering}
      \begin{threeparttable}
           \caption{\textit{Hyperparameters tuned for each method}}
         \begin{tabular*}{\linewidth}{l@{\extracolsep{\fill}}cr}
            \toprule
             \multicolumn{1}{l}{\textbf{Parameter}} & \multicolumn{1}{c}{\textbf{Model}} & \multicolumn{1}{c}{\textbf{Range}} \\
            \midrule
            \addlinespace
            \textbf{tree depth} & All &  [1,100]\\ 
            \addlinespace
            \textbf{mtry (see table notes)} & All &  [0,27]\\ 
            \addlinespace
            \textbf{learning rate} & GBM, XGBoost, LightGBM, CatBoost & [0,1]\\
            \addlinespace
            \textbf{loss reduction} & GBM, XGBoost, LightGBM & [0,$\infty$]\\
            \addlinespace
            \textbf{tree depth} & GBM, XGBoost, LightGBM, CatBoost & [1,$\infty$]\\
            \addlinespace
            \textbf{minimal node size} & GBM, XGBoost, LightGBM, CatBoost & [1,$\infty$]\\
            \bottomrule
         \end{tabular*}
        \begin{tablenotes}[flushleft]
          \small
          \item \textit{Note: The upper bound of the hyperparameter \textit{mtry} differs depending on the chosen model. It is 27 for all models except CatBoost, which uses ordered target statistics to encode \textit{factor} variables. The latter, as explained in Section X, does not expand the dataset but encodes the variable itself. Therefore the upper bound of \textit{mtry} for CatBoost is 22 instead of 27.}
        \end{tablenotes}
      \end{threeparttable}
      \label{tab:hyperparameter_chunk}
    \end{table}
```
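
For comparison, as far as I understand bookdown, \@ref(tab:label) resolves out of the box for tables produced by knitr::kable() in a labelled chunk with a caption. Below is a minimal sketch of that case; the chunk label kable-example and the toy data frame are only illustrative and not part of my actual document.

```{r kable-example, echo=FALSE}
# Toy data frame standing in for the hyperparameter table (illustrative only)
df <- data.frame(
  Parameter = c("tree depth", "mtry", "learning rate"),
  Model     = c("All", "All", "GBM, XGBoost, LightGBM, CatBoost"),
  Range     = c("[1,100]", "[0,27]", "[0,1]")
)
# The caption together with the chunk label registers the table as
# tab:kable-example, which can then be referenced as \@ref(tab:kable-example)
knitr::kable(df, caption = "Hyperparameters tuned for each method",
             booktabs = TRUE)
```

Writing Table \@ref(tab:kable-example) in the text should then show a proper number in the PDF. What I am after is the same behaviour for the raw LaTeX tables above, which I need for threeparttable and tablenotes.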
