Getting NoSuchWindowException while scraping Twitter usernames using Selenium
I have been trying to scrape Twitter usernames by going to a user's followers page, but if I leave my PC running, after some time I get this exception, and the browser itself also shows an error (see the screenshot).
I am attaching my code below. I suspect the problem is related to scrolling: when I scroll too much, the browser's memory usage grows until it runs low, but I would still like to know the actual reason.
```python
while scrolling:
    sleep(1)
    page_cards = driver.find_elements_by_xpath('.//div[@data-testid="primaryColumn"]//section//span[contains(text(), "@")]')
    print('cards length:', len(page_cards))
    for card in page_cards[-20:]:
        count = count + 1
        tweet = get_tweet_data(card)
        if tweet:
            tweet_id = ''.join(tweet)
            if tweet_id not in tweet_ids:
                tweet_ids.add(tweet_id)
                data.append(tweet)
                writer.writerow([tweet])
                print(count, 'username: ', tweet)
                # postgres_insert_query = """ INSERT INTO tbl_username (id, u_name) VALUES (%s,%s)"""
                # record_to_insert = (count, tweet)
                # cursor.execute(postgres_insert_query, record_to_insert)
                # connection.commit()
    scroll_attempt = 0
    while True:
        driver.execute_script('window.scrollTo(0, document.body.scrollHeight);')
        sleep(2)
        curr_position = driver.execute_script("return window.pageYOffset;")
        if last_position == curr_position:
            scroll_attempt += 1
            if scroll_attempt >= 3:
                scrolling = False
                break
            else:
                sleep(2)
        else:
            last_position = curr_position
            break
```
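`NoSuchWindowException` is typically raised when the browser window or tab the driver was attached to no longer exists, for example because the tab crashed after the infinitely scrolling page exhausted its memory. One way to keep the scrape from dying mid-loop is to wrap each driver call in a small retry helper that catches the exception and falls back to a default, so the outer loop can exit cleanly and flush the CSV. This is a minimal sketch, not the fix for the underlying memory growth; `safe_call` is a hypothetical helper, and the stand-in exception class below would in real code be `selenium.common.exceptions.NoSuchWindowException`.

```python
class NoSuchWindowException(Exception):
    """Stand-in for selenium.common.exceptions.NoSuchWindowException."""

def safe_call(fn, retries=2, default=None):
    """Call fn(); if the window is gone, retry a few times, then give up.

    Returns fn()'s result on success, or `default` once retries are
    exhausted, so the caller can detect a dead session and stop scrolling
    instead of crashing mid-scrape.
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except NoSuchWindowException:
            if attempt == retries:
                return default
    return default
```

Usage would look like `page_cards = safe_call(lambda: driver.find_elements_by_xpath(...), default=[])`; an empty list then signals the loop to set `scrolling = False` and write out whatever was collected before the window died.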
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow