CentOS: "Connection reset by peer" error when running a Python script

For my college project I am building a traffic-generation tool in Python. I have set up my own Linux server and client on VMware, and I use urllib2 to generate the traffic. The problem I am facing is this: when I run the script on the client machine (it uses multiprocessing to keep sending requests to the Linux server), it works fine for the first few minutes, roughly 2000 requests, but after that it shows a "Connection reset by peer" error and my script crashes. What could be the problem? I tried to work around it, but that did not help.

How can I prevent this timeout error and keep my script running continuously for a few hours? I am using CentOS 6.5.

'''
Traffic Generator Script:
    Here I have used IP aliasing to create multiple clients on a single VM. I have done the same on the server side to create multiple servers. I have around 50 clients and 10 servers.
'''
import multiprocessing
import urllib2
import random
import myurllist    #list of all destination urls for all 10 servers
import time
import socbindtry   #script that binds various virtual/aliased client ips to the script
response_time=[]    #per-process list of response times (note: a plain list is NOT shared between processes)
error_count=multiprocessing.Value('i',0)
def send_request3():    #function to send requests from alias client ip 1
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler3)    #bind to alias client ip1
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):    #myurllist.url is a list of url lists, one per server
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        error_count.value=error_count.value+1
def send_request4():    #function to send requests from alias client ip 2
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler4)    #bind to alias client ip2
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):    #myurllist.url is a list of url lists, one per server
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
            response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        error_count.value=error_count.value+1
#50 such functions are defined here for 50 clients
process=[]
def func():
    global process
    process.append(multiprocessing.Process(target=send_request3))
    process.append(multiprocessing.Process(target=send_request4))
    process.append(multiprocessing.Process(target=send_request5))
    process.append(multiprocessing.Process(target=send_request6))
#append 50 functions here
    for i in range(len(process)):
        process[i].start()
    for i in range(len(process)):
        process[i].join()
    print "All work done..!!"
    return
start=float(time.time())
func()
end=float(time.time())-start
print end
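One common way to survive transient "Connection reset by peer" errors, rather than letting the worker crash, is to retry each request a few times with a short backoff. Below is a minimal, hedged sketch of such a retry wrapper (the helper name `open_with_retry` and its parameters are my own, not from the original script); each worker function could route its `opener.open(x).read()` call through it:

```python
import socket
import time

def open_with_retry(do_open, retries=3, backoff=1.0):
    """Call do_open(); on a transient socket error (such as
    'connection reset by peer'), sleep and retry instead of
    letting the exception kill the worker process.
    do_open: a zero-argument callable performing one request.
    (Illustrative helper, not part of the original script.)"""
    for attempt in range(retries):
        try:
            return do_open()
        except socket.error:
            if attempt == retries - 1:
                raise  # out of retries: re-raise for the caller to count
            time.sleep(backoff * (2 ** attempt))  # exponential backoff
```

A worker would then call, for example, `open_with_retry(lambda: opener.open(x).read())` so that a single reset connection only delays that request instead of aborting the whole loop.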
