Discovering the problem:
A while back on a project, to avoid getting accounts banned (we had very few usable accounts), I adopted the strategy of crawling everything that could be fetched without logging in using cookie-less requests, and only attaching a cookie for content that truly required it.
The plan seemed simple: put a flag in the meta of every Request we yield, then have our CookieMiddleware inspect that flag and decide whether or not to attach a cookie to the request.
The implementation looked roughly like this:
```python
import json
import logging
import random

import pymongo

# Assuming these constants live in the project's settings module
from .settings import MONGO_URI, MONGO_DATABASE, ACCOUNT_COLLECTION


class CookieMiddleware(object):
    """On every request, pick a random account from the account pool."""

    def __init__(self):
        client = pymongo.MongoClient(MONGO_URI)
        self.account_collection = client[MONGO_DATABASE][ACCOUNT_COLLECTION]

    def process_request(self, request, spider):
        if 'target' in request.meta:
            logging.debug('Entered process_request')
            if request.meta['target'] == 'no':
                logging.debug('Leaving this request untouched')
                return
        # Either no flag was set, or the flag asks for a cookie:
        # attach one from a randomly chosen healthy account.
        all_count = self.account_collection.find({'status': 'success'}).count()
        if all_count == 0:
            raise Exception('The account pool is currently empty')
        random_index = random.randint(0, all_count - 1)
        random_account = self.account_collection.find(
            {'status': 'success'})[random_index]
        request.cookies = json.loads(random_account['cookie'])
```
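Stripped of the Scrapy plumbing, the middleware's branching boils down to one decision. A minimal, framework-free sketch of that logic (the function name is mine, for illustration only):

```python
def should_attach_cookie(meta):
    """Mirror the middleware's branching: attach a cookie unless the
    request explicitly opts out with meta['target'] == 'no'."""
    return meta.get('target') != 'no'

print(should_attach_cookie({}))                  # no flag at all -> cookie attached
print(should_attach_cookie({'target': 'yes'}))   # flag asks for a cookie
print(should_attach_cookie({'target': 'no'}))    # flag opts out
```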
The configuration in settings.py was:
```python
DOWNLOADER_MIDDLEWARES = {
    'eyny.middlewares.CookieMiddleware': 550,
}
```
At this point some sharp readers may already see what went wrong. Those of you who, like me, thought this code was perfectly fine: read on.
After writing this I started the project as usual, and even raised the concurrency a bit. The next day, the accounts had been banned. While debugging I saw that requests for URLs that should not carry a cookie were carrying one anyway, with the cookie sitting in the request headers. After two-plus hours of reading the framework's source code, I finally found the cause.
Cause & solution:
In the default_settings.py file under Scrapy's settings directory, a set of DOWNLOADER_MIDDLEWARES_BASE is declared up front. These middlewares are declared as follows:
```python
DOWNLOADER_MIDDLEWARES_BASE = {
    # Engine side
    'scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware': 100,
    'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware': 300,
    'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware': 350,
    'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware': 400,
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': 500,
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': 550,
    'scrapy.downloadermiddlewares.ajaxcrawl.AjaxCrawlMiddleware': 560,
    'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware': 580,
    'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 590,
    'scrapy.downloadermiddlewares.redirect.RedirectMiddleware': 600,
    'scrapy.downloadermiddlewares.cookies.CookiesMiddleware': 700,
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 750,
    'scrapy.downloadermiddlewares.stats.DownloaderStats': 850,
    'scrapy.downloadermiddlewares.httpcache.HttpCacheMiddleware': 900,
    # Downloader side
}
```
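Scrapy merges the user-defined DOWNLOADER_MIDDLEWARES into this BASE dict and, on the request path, calls process_request in ascending priority order. A rough sketch of that merge (not Scrapy's actual implementation, and trimmed to the two middlewares that matter here):

```python
# Trimmed-down BASE dict: just the built-in CookiesMiddleware and a neighbor.
base = {
    'scrapy.downloadermiddlewares.cookies.CookiesMiddleware': 700,
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': 550,
}
# Our project's DOWNLOADER_MIDDLEWARES from settings.py.
user = {'eyny.middlewares.CookieMiddleware': 550}

# Merge and sort by ascending priority -- the process_request call order.
merged = {**base, **user}
order = sorted(merged, key=merged.get)
print(order)
```

Because 550 < 700, our CookieMiddleware runs first, and the built-in CookiesMiddleware gets the last word on the request's cookies.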
You can see that DOWNLOADER_MIDDLEWARES_BASE also declares a CookiesMiddleware, at priority 700, meaning it runs after our CookieMiddleware (550) on the request path. Debugging confirmed this: after our CookieMiddleware ran, the headers carried no cookie, but after scrapy.downloadermiddlewares.cookies.CookiesMiddleware (700) ran, a cookie appeared in the headers. In other words, Scrapy was adding the cookie for us automatically.
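Two documented ways to keep the built-in middleware from re-attaching cookies, shown here as a settings/usage sketch:

```python
# Option 1: disable the built-in CookiesMiddleware globally in settings.py
# by mapping it to None, so only our CookieMiddleware ever touches cookies.
DOWNLOADER_MIDDLEWARES = {
    'eyny.middlewares.CookieMiddleware': 550,
    'scrapy.downloadermiddlewares.cookies.CookiesMiddleware': None,
}

# Option 2: opt out per request -- Scrapy's CookiesMiddleware skips any
# request whose meta carries dont_merge_cookies=True:
#   yield scrapy.Request(url, meta={'target': 'no', 'dont_merge_cookies': True})
```

Option 2 fits this project's goal best: cookie-less requests stay cookie-less, while requests that need an account keep going through the built-in cookie handling.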