
Published in 网络资讯, 2020-01-12 02:51

Revisiting Python Web Scraping: urllib3


urllib3 is a powerful, clearly structured HTTP client library for Python. Much of the Python ecosystem already uses urllib3. It provides many important features that the standard-library urllib lacks:

  1. Thread safety
  2. Connection pooling
  3. Client-side SSL/TLS verification
  4. File uploads with multipart encoding
  5. Helpers for retrying requests and handling HTTP redirects
  6. Support for gzip and deflate encoding
  7. Support for HTTP and SOCKS proxies

1. GET requests

urllib3 issues network requests through a connection pool, so before making any request we need to create a pool object, as shown below:

import urllib3

url = "http://httpbin.org"
http = urllib3.PoolManager()
r = http.request('GET', url + "/get")
print(r.data.decode())

# GET with query parameters
r = http.request('GET', 'http://www.baidu.com/s', fields={'wd': '周杰伦'})
print(r.data.decode())

Looking at the source:

def request(self, method, url, fields=None, headers=None, **urlopen_kw):
  • The first parameter, method, is required: the HTTP method to use, e.g. 'GET', 'get', 'POST', 'post', 'PUT', 'DELETE'. It is case-insensitive.
  • The second parameter, url, is required.
  • The third parameter, fields, holds the request parameters and is optional.
  • The fourth parameter, headers, is optional.

The return value of request() is <urllib3.response.HTTPResponse object at 0x000001B3879440B8>.

We can inspect all of its attributes and methods with dir(). Here is a partial excerpt of the dir() output:

# 'data', 'decode_content', 'enforce_content_length', 'fileno', 'flush', 'from_httplib',
# 'get_redirect_location', 'getheader', 'getheaders', 'headers', 'info', 'isatty',
# 'length_remaining', 'read', 'read_chunked', 'readable', 'readinto', 'readline',
# 'readlines', 'reason', 'release_conn', 'retries', 'seek', 'seekable', 'status',
# 'stream', 'strict', 'supports_chunked_reads', 'tell', 'truncate', 'version', 'writable',
# 'writelines'
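A quick way to confirm that the names in the dir() excerpt are real is to check the response class itself; a small sketch, with no network access needed (only class-level methods and properties are checked):

```python
import urllib3

# Spot-check a few names from the dir() listing above. These are
# methods/properties defined on urllib3.response.HTTPResponse, so
# hasattr() works on the class without making a request.
for attr in ('data', 'read', 'stream', 'release_conn', 'get_redirect_location'):
    assert hasattr(urllib3.response.HTTPResponse, attr), attr
print("all attributes present")
```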

2. POST requests

import urllib3

url = "http://httpbin.org"
fields = {'name': 'xfy'}
http = urllib3.PoolManager()
r = http.request('POST', url + "/post", fields=fields)
print(r.data.decode())

As you can see, it is very simple: only the first argument changes from 'get' to 'post'. And unlike with urllib, the parameters no longer need to be converted to bytes.

3. Setting headers

import urllib3

url = "http://httpbin.org"
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36'}
http = urllib3.PoolManager()
r = http.request('GET', url + "/get", headers=headers)
print(r.data.decode())

4. Setting a proxy

import urllib3

url = "http://httpbin.org"
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36'}
proxy = urllib3.ProxyManager('http://101.236.19.165:8866', headers=headers)
r = proxy.request('GET', url + "/ip")
print(r.data.decode())

5. Sending JSON request bodies

When making a request, you can send pre-encoded JSON data by setting the body parameter and the Content-Type header:

import json
import urllib3

url = "http://httpbin.org"
data = {'name': '徐繁韵'}
json_data = json.dumps(data)
http = urllib3.PoolManager()
r = http.request('POST', url + "/post", body=json_data, headers={'Content-Type': 'application/json'})
print(r.data.decode('unicode_escape'))

6. Uploading files

import urllib3

# Tuple form (multipart upload)
with open('a.html', 'rb') as f:
    data = f.read()
http = urllib3.PoolManager()
r = http.request('POST', 'http://httpbin.org/post', fields={'filefield': ('a.html', data, 'text/plain')})
print(r.data.decode())

# Raw binary body
r = http.request('POST', 'http://httpbin.org/post', body=data, headers={'Content-Type': 'image/jpeg'})
print(r.data.decode())

7. Timeouts

import urllib3

# 1. Set a global timeout on the pool
http = urllib3.PoolManager(timeout=3)

# 2. Or set it per request
# http.request('POST', 'http://httpbin.org/post', timeout=3)
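Beyond a single number, urllib3's Timeout object lets the connect and read limits be set separately; a minimal sketch (the values here are illustrative):

```python
import urllib3

# Separate connect/read limits instead of one global number.
timeout = urllib3.Timeout(connect=2.0, read=5.0)
http = urllib3.PoolManager(timeout=timeout)

# Every request made through `http` now inherits these limits;
# a single request can still override them, e.g.:
# r = http.request('GET', 'http://httpbin.org/get', timeout=urllib3.Timeout(read=10.0))
print(timeout.connect_timeout, timeout.read_timeout)
```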

8. Retries and redirects

import urllib3

http = urllib3.PoolManager()

# Retries
r = http.request('POST', 'http://httpbin.org/post', retries=5)  # retry up to 5 times (the default is 3)
print(r.retries)  # Retry(total=5, connect=None, read=None, redirect=0, status=None)

# Disable retries entirely
http.request('POST', 'http://httpbin.org/post', retries=False)

# Disable automatic redirect handling
r = http.request('GET', 'http://httpbin.org/redirect/1', redirect=False)
print(r.retries)  # Retry(total=3, connect=None, read=None, redirect=None, status=None)
print(r.status)   # 302 -- a redirect is not treated as an exception
print(r.data.decode())
print("--------------------")
print(r.get_redirect_location())
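For finer control than a bare integer, retries also accepts a Retry object, which distinguishes connect errors, read errors and redirects, and supports backoff between attempts; a brief sketch (budgets here are illustrative):

```python
import urllib3
from urllib3.util.retry import Retry

# Per-category retry budgets plus exponential backoff between attempts.
retry = Retry(total=5, connect=2, redirect=2, backoff_factor=0.5)
http = urllib3.PoolManager(retries=retry)
print(retry.total, retry.connect, retry.redirect)
```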

9. urllib3 handles HTTPS by itself, but with a warning

Although the request succeeds, it emits the following warning:

InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings

Disabling the warning:

import urllib3

urllib3.disable_warnings()  # suppress all urllib3 warnings
url = "https://www.12306.cn/mormhweb/"
http = urllib3.PoolManager()
r = http.request('GET', url)
print(r.data.decode())
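Rather than silencing the warning, the cleaner fix is to actually verify certificates. A sketch assuming the certifi package is available (it ships alongside requests; this is an assumption, not something the article uses) — without it, ca_certs can point at any CA bundle file on disk:

```python
import urllib3

# Enable certificate verification instead of silencing the warning.
try:
    import certifi              # assumption: certifi is installed
    ca_certs = certifi.where()  # path to its bundled CA certificates
except ImportError:
    ca_certs = None             # fall back to the library defaults

http = urllib3.PoolManager(cert_reqs='CERT_REQUIRED', ca_certs=ca_certs)
# HTTPS requests made through `http` now fail loudly on invalid
# certificates instead of emitting InsecureRequestWarning.
```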

urllib3 is powerful, but not as ergonomic as requests. Knowing the basics above is enough.

