How to use multi-threading to parse a large XML file with Python 3 and Beautiful Soup

We are opening an XML file, reading its contents, and then parsing them with BeautifulSoup, which consumes all of the system's resources, even in Colab. We have tried multi-threading and concurrency in several ways but have been unable to solve the problem.

from bs4 import BeautifulSoup

# Read the whole XML file into memory.
with open('filename.xml', encoding='utf-8') as data2:
    contents2 = data2.read()

**THIS PART MUST BE RUN WITH MULTI-THREADING, AS IT IS PRESENTLY CONSUMING A LOT OF RESOURCES**
extract2 = BeautifulSoup(contents2, 'xml')
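
For reference, the kind of multi-threaded attempt described above might look like the sketch below. It assumes, purely for illustration, that the file is a flat sequence of repeated <record>...</record> elements that can be split into independent chunks; the tag name, the chunking regex, and the worker count are hypothetical and not taken from the question. Note that because BeautifulSoup parsing is CPU-bound, the GIL usually keeps threads from speeding it up, which matches the behaviour we are seeing.

import re
from concurrent.futures import ThreadPoolExecutor
from bs4 import BeautifulSoup

def parse_chunk(chunk):
    # Each worker builds its own soup for a single record.
    return BeautifulSoup(chunk, 'xml')

with open('filename.xml', encoding='utf-8') as fh:
    text = fh.read()

# Hypothetical record boundary; adjust to the real top-level element.
chunks = re.findall(r'<record\b.*?</record>', text, flags=re.DOTALL)

with ThreadPoolExecutor(max_workers=4) as pool:
    soups = list(pool.map(parse_chunk, chunks))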

Any suggestion or a better solution is welcome.



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
