Selenium opens a Chrome instance after making changes in the code

I have a simple scraper that uses Selenium. It worked as expected until I started debugging the code and then abruptly stopped the debugger mid-run. Since then, whenever I make any change to the code, a Chrome browser instance opens automatically, stays open for a few seconds, repeats all the steps defined in the code (scrolling the webpage, waiting for elements to appear, clicking some elements, and so on), and then quits.

Why is this happening, and how can I fix it? Even the smallest change to the code triggers Selenium to open Chrome.

Edit:

Here is almost the full code:


    import logging
    import re
    import time
    from datetime import datetime

    import selenium.webdriver
    from selenium.common.exceptions import NoSuchElementException, TimeoutException
    from selenium.webdriver.chrome.options import Options
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait

    # logger setup (not shown in the original snippet)
    logger = logging.getLogger(__name__)


    class Scraper:

        def __init__(self, driver=selenium.webdriver.Chrome, scroll_pause=1, timeout=10):
            options = Options()
            options.add_argument("--headless")
            # `chrome_options=` is deprecated (removed in Selenium 4); use `options=`
            self.driver = driver(options=options)
            self.scroll_pause = scroll_pause
            self.timeout = timeout
            self.driver.get("https://www.example.com")
            self.driver.set_window_size(1920, 1080)

        def wait(self, condition):
            return WebDriverWait(self.driver, self.timeout).until(condition)

        def wait_for_el(self, selector):
            return self.wait(EC.presence_of_element_located((By.CSS_SELECTOR, selector)))

        def count_elements(self):
            # despite its name, this returns the matching elements, not their count
            return self.driver.find_elements(By.CSS_SELECTOR, ".element_css_here")


        def scroll_to_bottom(self):
            start = datetime.now()
            while True:
                self.driver.execute_script("window.scrollTo(0, document.body.scrollHeight)")
                time.sleep(self.scroll_pause)
                try:
                    # look the button up once instead of twice
                    button = self.driver.find_element(By.XPATH, '//div[@class="display-flex p5"]/button')
                    if button.is_displayed():
                        button.click()
                        self.wait_for_el(".element_css_here")
                        self.driver.execute_script("window.scrollTo(0, document.body.scrollHeight)")
                except NoSuchElementException:
                    # the button is gone, so we have reached the bottom;
                    # `self.count_elements` without parentheses was a no-op, so actually call it
                    self.count_elements()
                    break
            logger.info(f"Scrolled to bottom in {datetime.now() - start}")


        def scrape(self, page):
            self.driver.get(page)
            try:
                self.wait_for_el(".css_selector_here")
            except TimeoutException:
                self.driver.quit()
                raise  # re-raise, otherwise the code below runs against a closed driver

            total_connections_text = self.driver.find_element(By.CSS_SELECTOR, ".css_selector_here").text
            # renamed from `self.count_elements` so it does not shadow the method above
            self.element_count = int(re.search(r"(\d+)", total_connections_text).group(1))
            logger.info(f"Found {self.element_count} elements")

            logger.info("Scrolling to bottom...")
            self.scroll_to_bottom()
            result = self.get_all_data()
            self.driver.quit()
            return result
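
For context, the class is driven entirely through `scrape`. A hypothetical invocation (not my actual entry point; the URL is illustrative) would look like this:

    # illustrative only, not the real entry point
    scraper = Scraper(scroll_pause=1, timeout=10)
    result = scraper.scrape("https://www.example.com/some-page")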

Second Edit:

I'm using VS Code and I suspect it's VS Code's fault, but I cannot verify that.

Another update:

I just observed this again: when I change VS Code settings, the Selenium script is triggered and opens the Chrome browser.
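
One detail that may be relevant: the scraper launches Chrome in its constructor, so any tool that merely imports or executes this file will open the browser as a side effect if the `Scraper` is created at module level. A minimal sketch (assuming the instantiation happens at module level, which I haven't shown above) of guarding the entry point so that an import alone does not start Chrome:

    # hypothetical entry point; module-level code runs on import,
    # so the guard keeps editor/test tooling from launching Chrome as a side effect
    if __name__ == "__main__":
        scraper = Scraper(scroll_pause=1, timeout=10)
        result = scraper.scrape("https://www.example.com/some-page")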


