I'm running Nginx and pointing several old domains at a new site.
The first block in the config below does what I need: it redirects the old domain to the new domain.
In the second block I'm trying to forward any request to oldmediaserver.domain, except /robots.txt, to the homepage of the new domain. As it stands, every request gets redirected, including /robots.txt, and I can't work out why.
(The background is that Google has indexed content from the old domain, and I'm trying to get it removed from the search results via Webmaster Tools. That may not work, but it's not what I'm asking for help with here!)
# Old site to new site config
server {
    listen 80;
    listen [::]:80;
    server_name old.domain www.old.domain;
    rewrite ^ $scheme://www.new.domain$request_uri permanent;
}
# Media server Redirect and Robots directive
server {
    listen 80;
    listen [::]:80;
    server_name oldmediaserver.domain www.oldmediaserver.domain;
    location / {
        rewrite / $scheme://www.new.domain/ permanent;
    }
    location /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
    }
    rewrite ^ $scheme://www.new.domain/ permanent;
}
server {
    listen 80 default_server;
    listen [::]:80 default_server;

    root /var/www/website-name/html;

    # Add index.php to the list if you are using PHP
    index index.php index.html index.htm index.nginx-debian.html;

    server_name www.new.domain;

    location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        try_files $uri $uri/ /index.php?$args;
    }

    # pass the PHP scripts to the FastCGI server (php5-fpm over a unix socket)
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        # With php5-fpm:
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }

    # include a file for any 301 redirects (a representative sketch follows this block)
    include includes/website-name-redirects;

    location /members/ {
        try_files $uri $uri/ /index.php?$args;
        auth_basic "Members Login";
        auth_basic_user_file /var/www/website-name/html/.htpasswd;

        location ~ \.php$ {
            include snippets/fastcgi-php.conf;
            # With php5-fpm:
            fastcgi_pass unix:/var/run/php5-fpm.sock;
        }
    }

    # !!! IMPORTANT !!! We need to hide the password file from prying eyes.
    # This will deny access to any hidden file (beginning with a period).
    location ~ /\. { deny all; }
}
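For reference, the included includes/website-name-redirects file is just a flat list of one-off 301 rules. The entries below are made-up placeholders rather than my real paths, but they show the shape of the file:

# includes/website-name-redirects (illustrative placeholders only)
location = /old-page.html {
    return 301 /new-page/;
}
location = /old-section/old-article/ {
    return 301 /new-section/new-article/;
}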
Thanks in advance for any insight!
Answer 1
Thanks to gf_ and Drifter104 for the comments. Drifter104's comment about location matching got me reading up on the different matching modifiers, and I eventually arrived at the config below.
# Media server Redirect and Robots directive
server {
    listen 80;
    listen [::]:80;
    server_name oldmediaserver.domain www.oldmediaserver.domain;
    location ^~ / {
        rewrite ^ $scheme://www.new.domain/ permanent;
    }
    location ^~ /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
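        # Note (my addition, untested): this inline response is served with
        # whatever MIME type "default_type" resolves to, which is often
        # application/octet-stream; adding "default_type text/plain;" in this
        # location should make crawlers see it as plain text.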
    }
}
I'm still not sure I fully understand why this works and the other config didn't, so if anyone can shed more light on it, that would be great!
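One plausible explanation (my reading of the nginx docs, not verified against the original setup): in the first config the final rewrite sat directly at server level, and nginx evaluates server-level rewrite directives before it ever selects a location, so that rewrite fired for every URI, including /robots.txt. Once every rewrite lives inside a location, nginx picks the most specific prefix match first, so /robots.txt reaches its own location. If that is right, plain prefix locations should behave the same as the ^~ versions here, since there are no regex locations competing; a minimal sketch under that assumption:

# Sketch only: same behaviour expected without ^~, provided the
# server-level "rewrite ^ ..." from the original config is removed.
server {
    listen 80;
    listen [::]:80;
    server_name oldmediaserver.domain www.oldmediaserver.domain;

    # Longest-prefix match wins among plain prefix locations,
    # so /robots.txt is handled here, not by "location /".
    location /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
    }

    # Everything else falls through to the catch-all prefix.
    location / {
        rewrite ^ $scheme://www.new.domain/ permanent;
    }
}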