Problem with JSON and the Twitter API: I ran into this error while trying to request statuses from my timeline, and it is so vague that I can't figure out what I'm doing wrong. Any help would be appreciated.
My Code:
with open("Output.txt") as input:
    lines = [line for line in input if line.strip()]

with open("Output.txt", "w") as output:
    for items in api.request('statuses/home_timeline', {'count': '200'}):
        x = (items['text'] if 'text' in items else items).encode('ascii', 'ignore')
        current_id = id(x)
        print(current_id)
        print(x.decode('ascii', 'ignore'))
        output.write(x.decode('ascii', 'ignore') + '\n')
    for items in api.request('statuses/home_timeline', {'count': '200'}, {'max_id': str(current_id)}):
        x = (items['text'] if 'text' in items else items).encode('ascii', 'ignore')
        current_id = id(x)
        print(current_id)
        print(x.decode('ascii', 'ignore'))
        output.write(x.decode('ascii', 'ignore') + '\n')
The traceback returned:
Traceback (most recent call last):
File "C:/Users/Brandon/PycharmProjects/untitled/automated_timeline_collector.py", line 29, in <module>
for items in api.request('statuses/home_timeline', {'count': '200'}, {'max_id': str(current_id)}):
File "C:\Python27\lib\site-packages\TwitterAPI\TwitterAPI.py", line 140, in __iter__
for item in self.get_iterator():
File "C:\Python27\lib\site-packages\TwitterAPI\TwitterAPI.py", line 137, in get_iterator
return RestIterator(self.response)
File "C:\Python27\lib\site-packages\TwitterAPI\TwitterAPI.py", line 165, in __init__
resp = response.json()
File "C:\Python27\lib\site-packages\requests\models.py", line 756, in json
return json.loads(self.content.decode(encoding), **kwargs)
File "C:\Python27\lib\json\__init__.py", line 338, in loads
return _default_decoder.decode(s)
File "C:\Python27\lib\json\decoder.py", line 366, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Python27\lib\json\decoder.py", line 384, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
Process finished with exit code 1
The program gets through the first 200 statuses without a problem; the issue is that my way of using the current id to make the code pick up where it left off isn't quite right. Thanks for any help and feedback.
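For reference, there appear to be two separate issues here: `id(x)` is Python's built-in that returns an object's memory address, not the tweet's `id` field, and TwitterAPI's `request()` expects all query parameters in a single dict, so passing `{'max_id': ...}` as a third positional argument never sends it to Twitter. Below is a minimal sketch of the intended cursoring; `FakeAPI` is a hypothetical stand-in (it serves tweets with descending ids, the order the real timeline uses) so the logic can run without credentials, where the real code would use an authenticated `TwitterAPI` instance:

```python
class FakeAPI:
    """Hypothetical stand-in for an authenticated TwitterAPI instance.

    Serves 400 fake tweets with descending ids, newest first, the
    same ordering the real home timeline returns."""
    def __init__(self):
        self.tweets = [{'id': i, 'text': 'tweet %d' % i}
                       for i in range(400, 0, -1)]

    def request(self, resource, params):
        count = int(params.get('count', 200))
        max_id = int(params.get('max_id', self.tweets[0]['id']))
        return [t for t in self.tweets if t['id'] <= max_id][:count]

api = FakeAPI()

collected = []
params = {'count': '200'}          # all parameters go in ONE dict
for _ in range(2):                 # fetch two pages of 200
    page = [item for item in api.request('statuses/home_timeline', params)
            if 'text' in item]
    if not page:
        break
    collected.extend(page)
    # Use the tweet's own 'id' field, not Python's id() builtin;
    # subtract 1 so the oldest tweet of this page isn't returned again.
    params['max_id'] = str(page[-1]['id'] - 1)

print(len(collected))  # 400 distinct tweets
```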
So if what you're getting back isn't JSON, then what is it? – 2014-09-25 03:18:36
Catch the exception and see whether it has more detail. Hitting the occasional bad page may be normal; catch the exception and continue. – tdelaney 2014-09-25 03:51:11
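The debugging step suggested above can be sketched without the Twitter client at all: decode one response body at a time and, when the `ValueError` from the traceback fires, print the raw body to see what the server actually sent (often an HTML error page or an empty string). `parse_page` is a hypothetical helper name:

```python
import json

def parse_page(body):
    """Try to decode one API response body; report a non-JSON page
    instead of crashing, so the loop can continue to the next page."""
    try:
        return json.loads(body)
    except ValueError as e:  # the error raised in the traceback above
        print('skipping bad page: %s' % e)
        print('raw body was: %r' % body)
        return None

parse_page('<html>Rate limit exceeded</html>')  # prints a warning, returns None
parse_page('{"id": 1}')                         # returns {'id': 1}
```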