Traefik, docker swarm and portainer. Serving a robots.txt file

I'm working on my homelab and trying to serve a robots.txt file. I'm starting Traefik and Portainer with the following docker-compose file, running in Docker Swarm mode:

version: "3.3"

services:
  traefik:
    container_name: traefik
    image: "traefik:latest"
    restart: unless-stopped
    command:
      - --entrypoints.web.address=:80
      - --entrypoints.websecure.address=:443
      - --providers.docker.network=web
      - --providers.docker=true
      - --api.dashboard=true
      - --api.insecure=true
      - --log.level=DEBUG
      - --certificatesresolvers.leresolver.acme.httpchallenge=true
      - --certificatesresolvers.leresolver.acme.email=SOME_EMAIL@gmail.com
      - --certificatesresolvers.leresolver.acme.storage=./acme.json
      - --certificatesresolvers.leresolver.acme.httpchallenge.entrypoint=web
      - --providers.docker.exposedbydefault=false
      - --providers.file.filename=/dynamic.yaml
      - --providers.docker.swarmMode=true
    ports:
      - 80:80
      - 443:443
      - 8080:8080
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./traefik-data/acme.json:/acme.json
      - ./traefik-data/dynamic.yaml:/dynamic.yaml
    networks:
      - web
    deploy:
      labels:
        - "traefik.enable=true"
        - "traefik.http.routers.http-catchall.rule=hostregexp(`{host:.+}`)"
        - "traefik.http.routers.http-catchall.entrypoints=web"
        - "traefik.http.routers.http-catchall.middlewares=redirect-to-https"
        - "traefik.http.middlewares.redirect-to-https.redirectscheme.scheme=https"
        - "traefik.http.routers.api.rule=Host(`monitor.SOME_DOMAIN.dev`)"
        - "traefik.http.routers.api.service=api@internal"
      placement:
        constraints:
          - node.labels.entrypoint == true
  portainer:
    image: portainer/portainer-ce:latest
    command: -H unix:///var/run/docker.sock
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./portainer-data:/data
    networks:
      - web
    deploy:
      labels:
        # Portainer
        - "traefik.enable=true"
        - "traefik.http.routers.portainer.rule=Host(`portainer.SOME_DOMAIN.dev`)"
        - "traefik.http.routers.portainer.entrypoints=websecure"
        - "traefik.http.services.portainer.loadbalancer.server.port=9000"
        - "traefik.http.routers.portainer.service=portainer"
        - "traefik.http.routers.portainer.tls.certresolver=leresolver"
      placement:
        constraints:
          - node.labels.entrypoint == true
networks:
  web:
    external: true
volumes:
  portainer-data:
    driver: local
    driver_opts:
      o: bind
      type: none
      device: ./portainer-data
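
Since the deploy: labels only take effect under swarm, I deploy this with docker stack deploy rather than docker-compose up (the stack name proxy here is just what I call mine):

docker stack deploy -c docker-compose.yml proxy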

Then I try to start nginx to serve the robots.txt file:

version: "3.9"
services:
  antybots:
    image: nginx:alpine
    container_name: antybots
    volumes:
      - /mnt/config/robots/robots.txt:/usr/share/nginx/html/robots.txt:ro
    deploy:
      labels:
        # Antybots
        - "traefik.enable=true"
        - "traefik.http.routers.antybots.rule=HostRegexp(`{host:.*}`) && Path(`/robots.txt`)"
        - "traefik.http.routers.antybots.entrypoints=web"
        - "traefik.http.services.antybots.loadbalancer.server.port=80"
        - "traefik.http.routers.antybots.service=antybots"
        - traefik.http.routers.antybots.priority=99
    networks:
      - web
networks:
  web:
    external: true
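
I deploy it the same way, and this is how I checked that nginx actually sees the mounted file (in swarm mode container_name is ignored and the task container gets a generated name, hence the lookup):

docker exec "$(docker ps --filter name=antybots -q | head -n 1)" cat /usr/share/nginx/html/robots.txt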

But when I open https://SOME_DOMAIN.dev/robots.txt or https://ANYTHING.SOME_DOMAIN.dev/robots.txt, all I get is either a 404 or a robots.txt with this content:

User-agent: *
Disallow:

which I never put in my robots.txt file.
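
The same thing is reproducible from the command line by following the redirect chain with headers only, which should show the 301 from the redirect middleware followed by whatever the final https response is:

curl -sIL http://SOME_DOMAIN.dev/robots.txt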

As shown above, I've confirmed that the container can see my robots.txt file correctly. I suspect the problem is the redirect to https, caused by these labels:

 - "traefik.http.routers.http-catchall.middlewares=redirect-to-https"
 - "traefik.http.middlewares.redirect-to-https.redirectscheme.scheme=https"

Is there a way to bypass the redirect for this one file and serve it over plain http? Or could this be caused by something else?
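
For what it's worth, one thing I was about to try (untested; the router name antybots-tls is just a placeholder of mine) is giving antybots a second router on the websecure entrypoint, so the post-redirect https request has something to match:

        - "traefik.http.routers.antybots-tls.rule=HostRegexp(`{host:.*}`) && Path(`/robots.txt`)"
        - "traefik.http.routers.antybots-tls.entrypoints=websecure"
        - "traefik.http.routers.antybots-tls.tls=true"
        - "traefik.http.routers.antybots-tls.service=antybots"
        - "traefik.http.routers.antybots-tls.priority=99"

but I'm not sure how TLS works out there, given that the rule is a HostRegexp and my certificates come from the ACME resolver.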
