AWStats cron job "awstats.pl" takes more than 24 hours

We have a problem with AWStats processing our access.log file in the background: the awstats.pl script takes more than 24 hours and never seems to finish.

We run a website with more than 8 million page views per day, which generates a 2 GB Apache access.log file every day.
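For scale, those numbers can be sanity-checked directly on the box (a sketch; the log path matches the file shown in the AWStats output below):

```shell
# Rough volume check on the access log: total lines, total bytes,
# and the average bytes per request line.
LOG=/var/log/apache2/org.mysite-access.log
lines=$(wc -l < "$LOG")
bytes=$(wc -c < "$LOG")
# ~8M lines in ~2 GB works out to roughly 250 bytes per line,
# i.e. about 8M / 86400 s ≈ 93 requests per second on average.
echo "$lines lines, $bytes bytes, $((bytes / lines)) bytes/line"
```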

This is the command log from when we tried running the awstats.pl script manually:

root@hostname:~# /usr/lib/cgi-bin/awstats.pl -config=org.mysite -update

Create/Update database for config "/etc/awstats/awstats.org.mysite" by AWStats version 7.4 (build 20150714)
From data in log file "/var/log/apache2/org.mysite-access.log"...

Phase 1 : First bypass old records, searching new record...
Direct access to last remembered record is out of file.
So searching it from beginning of log file...

Phase 2 : Now process new records (Flush history on disk after 20000 hosts)...
Flush history file on disk (unique url reach flush limit of 5000)
Flush history file on disk (unique url reach flush limit of 5000)
Flush history file on disk (unique url reach flush limit of 5000)
[... the same flush line repeats many more times ...]
^C

We had to stop it here with Ctrl+C because it had already been running for more than an hour.
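For context, the cron job from the title is essentially this entry (a sketch: the file path /etc/cron.d/awstats and the exact schedule are assumptions, not our real crontab). Since Phase 1 above shows AWStats bypassing records it has already counted, a more frequent schedule would mean each run only parses a small increment:

```
# Hypothetical /etc/cron.d/awstats: run the update hourly instead of once
# a day, so each run only has to parse ~1/24 of the daily 2 GB log.
0 * * * * root /usr/lib/cgi-bin/awstats.pl -config=org.mysite -update >/dev/null
```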

We tried disabling DNS lookups (setting DNSLookup to 0), but it didn't help:

Edit the file /etc/awstats/awstats.org.mysite.conf:

 # 0 - No DNS Lookup
 # 1 - DNS Lookup is fully enabled
 # 2 - DNS Lookup is made only from static DNS cache file (if it exists)
 # Default: 2
 DNSLookup=0
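Besides DNSLookup, AWStats has other parse-time knobs in the same config file; a sketch of settings we have not yet tried (the directives are standard AWStats options, but the values and regex here are illustrative assumptions, not our current config):

```
# Skipping static assets cuts the number of records and unique URLs
# AWStats has to track between flushes (illustrative pattern):
SkipFiles="REGEX[\.(css|js|png|gif|jpg|ico)$]"
# Turning detection levels down reduces per-record regex work
# (illustrative values):
LevelForBrowsersDetection=1
LevelForOSDetection=1
LevelForRobotsDetection=1
LevelForWormsDetection=0
```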

Hardware shouldn't be the bottleneck; it's an OVH HOST-128L dedicated server:

  • Intel Xeon D-1520 - 4 cores / 8 threads
  • 128 GB DDR4 ECC 2133 MHz
  • 2 x 480 GB SSD

The operating system is Ubuntu 16.04.3 LTS:

root@hostname:~# cat /etc/*-release

NAME="Ubuntu"
VERSION="16.04.3 LTS (Xenial Xerus)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 16.04.3 LTS"
VERSION_ID="16.04"
VERSION_CODENAME=xenial

So:

  1. What can we do to improve AWStats performance?
  2. Or have we hit AWStats' performance ceiling?
