Python - Write For Loop items into CSV as new rows

Hi, I've been working on a Python script that collects a list of results and writes them into a CSV file.

My code works, but it writes all the results into one cell (joined by "~"). I would like to write each result on a new line.

I don't have a lot of experience, and I searched to the best of my ability. If something eluded me, could you please let me know what I should have been searching for?

Here is my current code:

person_result_names = WebDriverWait(
    browser, 60, ignored_exceptions=ignored_exceptions
).until(
    EC.visibility_of_element_located((By.ID, 'person_results'))
).find_elements_by_css_selector("#person_info > div > div > div.name.variation")

wait_time(1)
all_persons = [ person.text.replace(',', '').replace('\n', ' ') for person in person_result_names ]
print(f"[**] Found {len(all_persons)} people")
person = '~'.join(all_persons)
print(person)


report_links = browser.find_elements_by_css_selector('#person_results > div > a')
            
all_urls = [ url.get_attribute('href') for url in report_links ]
print(f"[**] Found {len(all_urls)} report urls")
url = '~'.join(all_urls)
print(url)
            
time.sleep(5)

with open('3_results-output.csv', 'a') as fp:
    print(f"{row_id},{person},{url}", file=fp)

Don't worry about the row_id variable in the last line; it's being pulled from another source. Thanks in advance for any helpful solutions.

This is my current output:

ID signifies each search query, so ID 0 has 3 results for both Names and Urls, and ID 1 has 4 results each.

ID | Names                                 | Urls
0  | John Doe~Joe Doe~Jay Doe              | http://www.link.com/1~http://www.link.com/2~http://www.link.com/3
1  | Jane Doe~Janet Doe~Jill Doe~Julia Doe | http://www.link.com/4~http://www.link.com/5~http://www.link.com/6~http://www.link.com/7

This is what I'd like the output to look like:

ID | Names     | Urls
0  | John Doe  | http://www.link.com/1
0  | Joe Doe   | http://www.link.com/2
0  | Jay Doe   | http://www.link.com/3
1  | Jane Doe  | http://www.link.com/4
1  | Janet Doe | http://www.link.com/5
1  | Jill Doe  | http://www.link.com/6
1  | Julia Doe | http://www.link.com/7
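As an aside, rows in the current format could also be reshaped after the fact by splitting each joined cell on "~" and pairing names with URLs positionally. This is only a hypothetical sketch using the sample rows above, not part of the scraper itself:

```python
# Hypothetical illustration: reshaping tilde-joined rows into
# the desired one-row-per-result form.
current_rows = [
    (0, "John Doe~Joe Doe~Jay Doe",
        "http://www.link.com/1~http://www.link.com/2~http://www.link.com/3"),
    (1, "Jane Doe~Janet Doe~Jill Doe~Julia Doe",
        "http://www.link.com/4~http://www.link.com/5~http://www.link.com/6~http://www.link.com/7"),
]

expanded = []
for row_id, names, urls in current_rows:
    # Split each joined cell; zip pairs the nth name with the nth URL.
    for name, url in zip(names.split('~'), urls.split('~')):
        expanded.append((row_id, name, url))
```

Note that zip silently stops at the shorter sequence, so if a name is ever missing its URL (or vice versa) the extra items are dropped.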

Update

Here is the code that ended up working for me. Thanks to @Tim Post for pointing me in the right direction; I just needed the writing method to include all results:

person_result_names = WebDriverWait(
    browser, 60, ignored_exceptions=ignored_exceptions
).until(
    EC.visibility_of_element_located((By.ID, 'person_results'))
).find_elements_by_css_selector("#person_info > div > div > div.name.variation")

wait_time(1)
print("\tScraping People\n")
all_persons = [ person.text.replace(',', '').replace('\n', ' ') for person in person_result_names ]
print(f"[**] Found {len(all_persons)} people")

wait_time(1)

report_links = browser.find_elements_by_css_selector('#person_results > div > a')

all_urls = [ url.get_attribute('href') for url in report_links ]
print(f"[**] Found {len(all_urls)} report urls")

with open('3_results-output.csv', 'a') as fp:
    for person, url in zip(all_persons, all_urls):
        print(f"{row_id},{person},{url}", file=fp)

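One thing worth noting about this working version: because each line is built manually with print, names containing commas had to be stripped with replace(',', ''). A sketch using the csv module instead (with made-up sample data standing in for the scraped lists) quotes such fields automatically, so the commas could be kept:

```python
import csv

# Made-up stand-ins for the scraped results; commas no longer need stripping.
row_id = 0
all_persons = ["Doe, John", "Doe, Joe"]
all_urls = ["http://www.link.com/1", "http://www.link.com/2"]

# 'a' appends as in the original; newline='' avoids blank lines on Windows.
with open('3_results-output.csv', 'a', newline='') as fp:
    writer = csv.writer(fp)
    for person, url in zip(all_persons, all_urls):
        writer.writerow([row_id, person, url])  # comma-containing fields get quoted
```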

Solution 1:[1]

Try the following kind of approach. It uses Python's csv module to handle writing the rows to the file:

import csv

person_result_names = WebDriverWait(
    browser, 60, ignored_exceptions=ignored_exceptions
).until(
    EC.visibility_of_element_located((By.ID, 'person_results'))
).find_elements_by_css_selector("#person_info > div > div > div.name.variation")

wait_time(1)
all_persons = [person.text.replace(',', '').replace('\n', ' ') for person in person_result_names]
print(f"[**] Found {len(all_persons)} people")

report_links = browser.find_elements_by_css_selector('#person_results > div > a')
all_urls = [ url.get_attribute('href') for url in report_links ]

print(f"[**] Found {len(all_urls)} report urls")

with open('3_results-output.csv', 'w', newline='') as f_output:
    csv_output = csv.writer(f_output)
    csv_output.writerow(["ID", "Names", "Urls"])    # Write the header
    
    row_id = 0
    
    for person, url in zip(all_persons, all_urls):
        csv_output.writerow([row_id, person, url])

Obviously you will need to add another loop somewhere to deal with multiple row_id values, but this at least shows how to write to the file correctly. It assumes all_persons and all_urls are of equal length.
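That extra loop might look something like the sketch below. It assumes a hypothetical search_results list in which each entry holds the name and URL lists produced by one search query; enumerate then supplies the incrementing ID:

```python
import csv

# Hypothetical stand-in: each tuple is (names, urls) from one search query.
search_results = [
    (["John Doe", "Joe Doe"], ["http://www.link.com/1", "http://www.link.com/2"]),
    (["Jane Doe"], ["http://www.link.com/3"]),
]

with open('3_results-output.csv', 'w', newline='') as f_output:
    csv_output = csv.writer(f_output)
    csv_output.writerow(["ID", "Names", "Urls"])    # Write the header
    # enumerate provides row_id = 0, 1, ... for each search query
    for row_id, (all_persons, all_urls) in enumerate(search_results):
        for person, url in zip(all_persons, all_urls):
            csv_output.writerow([row_id, person, url])
```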

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Martin Evans