0. Contents

1. Background
2. Full code
3. Results

1. Background

A phone-number card application page switches the number's home location by province + city, returning 10 numbers per request.

Packet capture with Fiddler revealed the pattern of the key URL parameters:

provinceCode: two digits

cityCode: three digits

groupKey: in one-to-one correspondence with provinceCode

So the task is to walk through the provinces by hand to collect the list of (provinceCode, groupKey) pairs, then for each pair loop over every candidate cityCode and confirm which URLs are valid.
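
A minimal sketch of that enumeration, assuming the digits 0-9 for cityCode; the URL template is a placeholder because the real query string is masked in this post, and the (provinceCode, groupKey) pair is the sample pair that appears commented out in the full script below:

    # -*- coding: UTF-8 -*-
    import itertools

    # Placeholder template: the real query string is masked in this post.
    fmt = ('https://m.1xxxx.com/xxxxx&provinceCode={provinceCode}'
           '&cityCode={cityCode}&xxxxx&groupKey={groupKey}&xxxxx')

    # Sample pair; the real (provinceCode, groupKey) pairs are collected by hand.
    province_groupkey_list = [('51', '21236872')]

    candidate_urls = []
    for provinceCode, groupKey in province_groupkey_list:
        # cityCode is three digits, so at most 1000 candidates per province
        for cityCode in (''.join(c) for c in itertools.product('0123456789', repeat=3)):
            candidate_urls.append(fmt.format(provinceCode=provinceCode,
                                             cityCode=cityCode,
                                             groupKey=groupKey))

    print len(candidate_urls)   # 1000 candidate urls for this single pair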

When the URL is invalid the server still responds normally (sample below). With multiple squid proxies, however, proxy failures are frequent, so the requests-related exceptions have to be caught to avoid misjudging a URL as invalid.

    # In [88]: r.text
    # Out[88]: u'jsonp_queryMoreNums({"numRetailList":[],"code":"M1","uuid":"a95ca4c6-957e-462a-80cd-0412b
    # d5672df","numArray":[]});'
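
A minimal sketch of how a response like this can be classified, assuming a URL counts as valid only when numArray is non-empty; the regex and the > 10000 filter mirror the full script below, and the sample string is the empty response captured above:

    # -*- coding: UTF-8 -*-
    import json
    import re

    pattern = re.compile(r'({.*?})')   # grab the JSON object out of the jsonp wrapper

    def extract_nums(jsonp_text):
        """Return the numbers found in the jsonp payload, or [] for an invalid URL."""
        m = pattern.search(jsonp_text)
        if m is None:
            return []
        payload = json.loads(m.group())
        return [num for num in payload.get('numArray', []) if num > 10000]

    empty = u'jsonp_queryMoreNums({"numRetailList":[],"code":"M1","uuid":"xxx","numArray":[]});'
    print extract_nums(empty)   # [] -> this URL is skipped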

Getting the home-location info of a number:

    url = 'http://www.ip138.com:8080/search.asp?action=mobile&mobile=%s' % num
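
A minimal sketch of the lookup, assuming the page layout at the time of writing, where the second td.tdc2 cell holds "province city" (a single value for municipalities); the selector is the same one used in the full script below, and the phone number here is a made-up example:

    # -*- coding: UTF-8 -*-
    import requests
    from bs4 import BeautifulSoup as BS

    def lookup_location(num):
        url = 'http://www.ip138.com:8080/search.asp?action=mobile&mobile=%s' % num
        resp = requests.get(url, timeout=10)
        soup = BS(resp.content, 'lxml')
        # The second td.tdc2 cell holds the province and city separated by whitespace.
        fields = soup.select('tr td.tdc2')[1].text.split()
        if len(fields) == 2:
            return fields[0], fields[1]
        return fields[0], fields[0]   # municipalities report a single value

    print u' '.join(lookup_location('13812345678'))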

Converting Chinese to pinyin:

    from pypinyin import lazy_pinyin

    province_pinyin = ''.join(lazy_pinyin(province_zh))
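
For example (outputs shown as comments; the province name is just an illustration):

    # -*- coding: UTF-8 -*-
    from pypinyin import lazy_pinyin

    print lazy_pinyin(u'四川')            # [u'si', u'chuan']
    print ''.join(lazy_pinyin(u'四川'))   # sichuan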

Confirming that the task queue has been fully processed:

https://docs.python.org/2/library/queue.html#module-Queue

    Queue.task_done()

    Indicate that a formerly enqueued task is complete. Used by queue consumer threads. For each get() used to fetch a task, a subsequent call to task_done() tells the queue that the processing on the task is complete.

    If a join() is currently blocking, it will resume when all items have been processed (meaning that a task_done() call was received for every item that had been put() into the queue).

    Raises a ValueError if called more times than there were items placed in the queue.
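
A minimal sketch of the pattern the full script relies on, assuming daemon worker threads that loop forever: every get() is matched by exactly one task_done(), including when an item is re-queued after a failure, so that join() can eventually return:

    # -*- coding: UTF-8 -*-
    import Queue
    import threading
    import time

    task_queue = Queue.Queue()

    def worker():
        while True:
            item = task_queue.get()
            try:
                time.sleep(0.01)         # stand-in for the real HTTP request
            except Exception:
                task_queue.task_done()   # release this get() ...
                task_queue.put(item)     # ... then re-queue the item for another attempt
            else:
                task_queue.task_done()   # one task_done() per successful get()

    for _ in range(4):
        t = threading.Thread(target=worker)
        t.setDaemon(True)
        t.start()

    for item in range(10):
        task_queue.put(item)

    task_queue.join()    # unblocks once task_done() has been called for every item
    print 'all tasks done'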

2. Full code

Referer and URL details have been masked in the code below.

    #!/usr/bin/env python
    # -*- coding: UTF-8 -*-
    import time
    import re
    import json
    import traceback

    import threading
    lock = threading.Lock()   # created but never used below; list.append is already atomic under the GIL
    import Queue
    task_queue = Queue.Queue()
    write_queue = Queue.Queue()
    import requests
    from requests.exceptions import (ConnectionError, ConnectTimeout, ReadTimeout, SSLError,
                                     ProxyError, RetryError, InvalidSchema)
    s = requests.Session()
    s.headers.update({'user-agent': 'Mozilla/5.0 (iPhone; CPU iPhone OS 9_3_5 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Mobile/13G36 MicroMessenger/6.5.12 NetType/4G'})
    # Referer detail hidden; tests showed it is not actually needed
    # s.headers.update({'Referer':'https://servicewechat.com/xxxxxxxx'})
    s.verify = False
    s.mount('https://', requests.adapters.HTTPAdapter(pool_connections=1000, pool_maxsize=1000))
    import copy
    sp = copy.deepcopy(s)   # same session settings, but routed through the squid proxies
    proxies = {'http': 'http://127.0.0.1:3128', 'https': 'https://127.0.0.1:3128'}
    sp.proxies = proxies
    from urllib3.exceptions import InsecureRequestWarning
    from warnings import filterwarnings
    filterwarnings('ignore', category=InsecureRequestWarning)
    from bs4 import BeautifulSoup as BS
    from pypinyin import lazy_pinyin
    import pickle
    import logging

    def get_logger():
        logger = logging.getLogger("threading_example")
        logger.setLevel(logging.DEBUG)
        # fh = logging.FileHandler("d:/threading.log")
        fh = logging.StreamHandler()
        fmt = '%(asctime)s - %(threadName)-10s - %(levelname)s - %(message)s'
        formatter = logging.Formatter(fmt)
        fh.setFormatter(formatter)
        logger.addHandler(fh)
        return logger

    logger = get_logger()

    # An invalid URL still returns normally:
    # In [88]: r.text
    # Out[88]: u'jsonp_queryMoreNums({"numRetailList":[],"code":"M1","uuid":"a95ca4c6-957e-462a-80cd-0412b
    # d5672df","numArray":[]});'

    results = []

    def get_nums():
        global results
        pattern = re.compile(r'({.*?})')  # , re.S | re.I | re.X)
        while True:
            try:  # keep the try block as small as possible
                _url = task_queue.get()
                url = _url + str(int(time.time()*1000))
                resp = sp.get(url, timeout=10)
            except (ConnectionError, ConnectTimeout, ReadTimeout, SSLError,
                    ProxyError, RetryError, InvalidSchema) as err:
                task_queue.task_done()  # call task_done() before re-putting, so task_queue.join() can be released
                task_queue.put(_url)
            except Exception as err:
                # resp may not be bound here, so only log the url and the traceback
                logger.debug('\nurl:{}\nerr: {}\ntraceback: {}'.format(url, err, traceback.format_exc()))
                task_queue.task_done()  # call task_done() before re-putting, so task_queue.join() can be released
                task_queue.put(_url)
            else:
                try:
                    # rst = resp.content
                    # match = rst[rst.index('{'):rst.index('}')+1]
                    # m = re.search(r'({.*?})', resp.content)
                    m = pattern.search(resp.content)
                    match = m.group()
                    rst = json.loads(match)
                    nums = [num for num in rst['numArray'] if num > 10000]
                    nums_len = len(nums)
                    # assert nums_len == 10
                    num = nums[-1]
                    province_zh, city_zh, province_pinyin, city_pinyin = get_num_info(num)
                    result = (str(num), province_zh, city_zh, province_pinyin, city_pinyin, _url)
                    results.append(result)
                    write_queue.put(result)
                    logger.debug(u'results:{} threads: {} task_queue: {} {} {} {} {}'.format(
                        len(results), threading.activeCount(), task_queue.qsize(),
                        num, province_zh, city_zh, _url))
                except (ValueError, AttributeError, IndexError) as err:
                    pass   # empty numArray or no JSON match: the URL is simply not valid
                except Exception as err:
                    # print err, traceback.format_exc()
                    logger.debug('\nstatus_code:{}\nurl:{}\ncontent:{}\nerr: {}\ntraceback: {}'.format(resp.status_code, url, resp.content, err, traceback.format_exc()))
                finally:
                    task_queue.task_done()

    def get_num_info(num):
        try:
            url = 'http://www.ip138.com:8080/search.asp?action=mobile&mobile=%s' % num
            resp = s.get(url)
            soup = BS(resp.content, 'lxml')
            # pro, cit = re.findall(r'<TD class="tdc2" align="center">(.*?)<', resp.content)[0].decode('gbk').split('&nbsp;')
            rst = soup.select('tr td.tdc2')[1].text.split()
            if len(rst) == 2:
                province_zh, city_zh = rst
            else:
                province_zh = city_zh = rst[0]
            province_pinyin = ''.join(lazy_pinyin(province_zh))
            city_pinyin = ''.join(lazy_pinyin(city_zh))
        except Exception as err:
            print err, traceback.format_exc()
            province_zh = city_zh = province_pinyin = city_pinyin = 'xxx'
        return province_zh, city_zh, province_pinyin, city_pinyin

    def write_result():
        with open('10010temp.txt', 'w', 0) as f:  # 'w' truncates the file on open, so open it above the while True loop
            while True:
                try:
                    rst = ' '.join(write_queue.get()) + '\n'
                    f.write(rst.encode('utf-8'))
                    write_queue.task_done()
                except Exception as err:
                    print err, traceback.format_exc()

    if __name__ == '__main__':
        # (provinceCode, groupKey) pairs collected by hand; values masked here
        province_groupkey_list = [
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', ''),
            ('', '')]
        # province_groupkey_list = [('51', '21236872')]
        import itertools
        for (provinceCode, groupKey) in province_groupkey_list:
            # for cityCode in range(1000):
            for cityCode in [''.join(i) for i in itertools.product('', repeat=3)]:
                fmt = 'https://m.1xxxx.com/xxxxx&provinceCode={provinceCode}&cityCode={cityCode}&xxxxx&groupKey={groupKey}&xxxxx'  # URL details masked
                url = fmt.format(provinceCode=provinceCode, cityCode=cityCode, groupKey=groupKey)  # , now=int(float(time.time())*1000))
                task_queue.put(url)
        threads = []
        for i in range(300):
            t = threading.Thread(target=get_nums)  # args takes a tuple, at least (a,)
            threads.append(t)
        t_write_result = threading.Thread(target=write_result)
        threads.append(t_write_result)
        # for t in threads:
        #     t.setDaemon(True)
        #     t.start()
        # while True:
        #     pass
        for t in threads:
            t.setDaemon(True)
            t.start()
        # for t in threads:
        #     t.join()
        task_queue.join()
        print 'task done'
        write_queue.join()
        print 'write done'
        with open('10010temp', 'w') as f:
            pickle.dump(results, f)
        print 'all done'
        # while True:
        #     pass

3. Results

Run the script a few times and confirm that the final results count converges to 339.
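
A minimal sketch for re-checking that count from the pickled results file written by the script (the filename comes from the code above):

    # -*- coding: UTF-8 -*-
    import pickle

    with open('10010temp') as f:
        results = pickle.load(f)

    print len(results)   # expected to settle at 339 after a few runs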
