In this post I'll share a hands-on Python crawler example: scraping regional weather data from Amap (高德). Many readers may not be familiar with this yet, so I'm sharing this article for reference. I hope you find it useful; let's dive in.
Code:
import requests
import json


def weatherlist(url1, url2, headers, proxies):
    # Fetch the full city list (grouped by first letter) and parse the JSON.
    resp = requests.get(url=url1, headers=headers, proxies=proxies).content.decode('utf-8')
    city_data = json.loads(resp)
    for cities in city_data["data"]["cityByLetter"].values():
        for city in cities:
            adcode = city["adcode"]
            name = city["name"]
            # Query the weather endpoint for this city's adcode.
            full_url = url2 + adcode
            resp = requests.get(url=full_url, headers=headers, proxies=proxies).content.decode('utf-8')
            weather_data = json.loads(resp)
            print(weather_data)
            try:
                if weather_data["data"]["data"]:
                    for item in weather_data["data"]["data"]:
                        for forecast in item['forecast_data']:
                            weather_name = forecast['weather_name']
                            temp_min = forecast['min_temp']
                            temp_max = forecast['max_temp']
                            # Append one line per forecast day to the output file.
                            with open('weather_list.txt', 'a', encoding='utf-8') as fp:
                                fp.write("City: " + name + " Weather: " + weather_name
                                         + " Max temp: " + temp_max
                                         + " Min temp: " + temp_min + '\n')
            except (KeyError, TypeError):
                # No weather data returned for this adcode.
                print('empty')


if __name__ == '__main__':
    url1 = 'https://www.amap.com/service/cityList'
    url2 = 'https://www.amap.com/service/weather?adcode='
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36',
        'Cookie': 'BIDUPSID=F6BBCD59FE2A646812DB8DAE641A0BE5; PSTM=1573713375; BAIDUID=F6BBCD59FE2A6468D0329C1E2F60212F:FG=1; BD_UPN=12314353; BDORZ=B490B5EBF6F3CD402E515D22BCDA1598; H_PS_PSSID=1452_21098_29568_29221_26350; delPer=0; BD_CK_SAM=1; PSINO=2; H_PS_645EC=50d5uY51q2qJG%2BVlK7rlPmCgY73TcN9qKRz4sPKuBII1GIkIx4QkChitGd4; BDSVRTM=209'
    }
    proxies = {'http': '124.113.217.5:9999', 'https': ''}
    weatherlist(url1, url2, headers, proxies)
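The nested JSON parsing is the trickiest part of the crawler, so here is a minimal offline sketch of just that step. The `sample` payload below is hypothetical: it only mirrors the shape the crawler assumes (`data` → `data` → `forecast_data`), and the real response from the Amap weather endpoint may differ. The helper name `extract_forecasts` is my own, not part of any library.

```python
import json

# Hypothetical sample mimicking the response shape the crawler expects;
# the live https://www.amap.com/service/weather payload may differ.
sample = json.loads("""
{
  "data": {
    "data": [
      {
        "forecast_data": [
          {"weather_name": "Sunny",  "min_temp": "12", "max_temp": "24"},
          {"weather_name": "Cloudy", "min_temp": "10", "max_temp": "21"}
        ]
      }
    ]
  }
}
""")

def extract_forecasts(payload):
    """Flatten the nested records into (weather, min_temp, max_temp) tuples."""
    rows = []
    for city_block in payload.get("data", {}).get("data", []):
        for day in city_block.get("forecast_data", []):
            rows.append((day["weather_name"], day["min_temp"], day["max_temp"]))
    return rows

for name, lo, hi in extract_forecasts(sample):
    print(name, lo, hi)
```

Using `.get(..., [])` at each level means a city with no forecast simply yields no rows, instead of raising the `KeyError` that the crawler's `try/except` has to catch.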
That's all the content of "Python crawler in practice: scraping Amap regional weather". Thanks for reading! I hope this gives you a working understanding of the approach, and that the shared code is helpful. To learn more, follow the Yisu Cloud industry news channel.