Batch-running multiple Scrapy spiders in one process

```python
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project's settings.py so every spider runs with the same configuration
settings = get_project_settings()
process = CrawlerProcess(settings)

# Schedule each spider by its `name` attribute; crawl() only queues it,
# nothing runs yet (replace 'spider1'..'spider5' with your spiders' names)
process.crawl('spider1')
process.crawl('spider2')
process.crawl('spider3')
process.crawl('spider4')
process.crawl('spider5')

# Start the Twisted reactor; this blocks until all queued spiders finish
process.start()
```