So I have this simple piece of code
\documentclass[12pt]{report}
\usepackage{blkarray}
\usepackage{amsmath}
\usepackage[version=4]{mhchem}
\usepackage{amsmath}
\begin{document}
Given perfect data we can use Bayesian inference to infer our rate constants. This allows us to know the state of the system at given times, this is called our sample path denoted as;
\begin{gather*}
$\mathbf x = \{ x(t) : t \in [0,T]\}$
\end{gather*}
Complete information on the sample path means that we what time of reaction occurs at different times\\
The form of the Gillespie algorithm means the likelihood function is constructed as
\begin{gather*}
$L(c;\mathbf x)$ = $\{\prod\limits_{i=1}^n h_{v_i}(x(t_{i_1},c_{v_i}\}$exp$$\{-\int_0^Th_0(x(t),c)dt\}$$
\end{gather*}
\\
A critical observation is that our rate laws can be written in the form,
\begin{align*}
$h_j(x,c_j) = c_jg_j(x)$
\intertext{which leads to;}
$L_j(c_j;\mathbf{x}) = c_j^{r_j}exp\{-c_j\int_0^T g_j(x(t))dt\}$, $j=1,...,v$
\end{align*}\\
We assume we have independent gamma priors for our rate constants
\begin{gather*}
$c_j ~ Ga(a_j,b_j)$, $j=1,..,v$
\end{gather*}\\
Using Bayes Theorem we are able to produce a posterior for our rate constants,
\begin{gather*}
$c_j|\mathbf {x}$ ~ $Ga(a_j+r_j,b_j+\int_0^Tg_j(x(t))dt)$, $j=1,...,v$
\end{gather*}\\
Applying this to our Michaelis-Menten model produces graphs of the prior and posterior seen below
\end{document}
but I keep getting the error

Missing { inserted. <to be read again>}

and I don't understand what it means.
Answer 1

You have lots of $ and $$ inside the gather and align environments, but you shouldn't use them there, because those environments already put you in math mode.

Also, don't use \\ in ordinary text just to go to a new line; leave a blank line to start a new paragraph, or use \newline.

There is a command for "exp": \exp; for ":" there is \colon; and \quad lets you leave some space. Apart from these small points I haven't corrected your formulas, but there are other errors in them, and I'm sure some of the mathematicians here will be happy to improve them.

Also, as David pointed out in his comment, it is better to use \[...\] rather than gather for single-line equations.

Finally, as egreg said, there is no need for \intertext inside gather, and use \dots instead of ....
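Here is a minimal illustration of the first point (my own example, not taken from your code), showing why the "Missing { inserted" error appears:

% Wrong: the $ signs try to leave math mode inside gather*,
% which is what triggers "Missing { inserted."
\begin{gather*}
$a = b$
\end{gather*}

% Right: gather* already switches to math mode
\begin{gather*}
a = b
\end{gather*}

% For a single displayed equation, \[...\] is simpler still
\[
a = b
\]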
\documentclass[12pt]{report}
\usepackage{blkarray}
\usepackage{amsmath}
\usepackage[version=4]{mhchem}
\usepackage{amsmath}
\begin{document}
Given perfect data we can use Bayesian inference to infer our rate constants. This allows us to know the state of the system at given times; this is called our sample path, denoted as:
\[
\mathbf{x} = \{ x(t)\colon t \in [0,T]\}
\]
Complete information on the sample path means that we know which type of reaction occurs at which time.\newline
The form of the Gillespie algorithm means the likelihood function is constructed as
\[
L(c;\mathbf{x}) = \biggl\{\prod_{i=1}^n h_{v_i}\bigl(x(t_{i-1}),c_{v_i}\bigr)\biggr\}\exp\biggl\{-\int_0^T h_0(x(t),c)\,dt\biggr\}
\]
A critical observation is that our rate laws can be written in the form,
\[
h_j(x,c_j) = c_jg_j(x)
\]
which leads to:
\[
L_j(c_j;\mathbf{x}) = c_j^{r_j}\exp\Bigl\{-c_j\int_0^T g_j(x(t))\,dt\Bigr\}, \quad j=1,\dots,v
\]
We assume we have independent gamma priors for our rate constants
\[
c_j \sim \mathrm{Ga}(a_j,b_j), \quad j=1,\dots,v
\]
Using Bayes Theorem we are able to produce a posterior for our rate constants,
\[
c_j \mid \mathbf{x} \sim \mathrm{Ga}\Bigl(a_j+r_j,\; b_j+\int_0^T g_j(x(t))\,dt\Bigr), \quad j=1,\dots,v
\]
Applying this to our Michaelis-Menten model produces the graphs of the prior and posterior seen below.
\end{document}
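One further nit: in math mode ~ produces a tie (a non-breaking space), not a tilde, so for "distributed as" you want \sim, and \mid gives properly spaced conditioning bars:

\[
c_j \mid \mathbf{x} \sim \mathrm{Ga}(a_j, b_j)
\]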