How to improve 3rd-party API query time: asyncio or multithreading?
I developed some Python classes that consume data from a third party API.
When the API takes too long to respond, it breaks the search engine.
I tried to find the best solution (the one with the lowest response time), but it still takes a long time to fetch the data.
I saw a few things about asyncio and multithreading for consuming data, but I didn't understand how to apply them to my process.
If the API that provides the data goes down frequently, wouldn't fetching asynchronously put more load on the third party? Wouldn't it be even more unstable?
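Before reaching for concurrency, the slow-or-flaky-API symptom is often best handled with an explicit timeout and automatic retries, so a hung request fails fast instead of blocking the search engine. A minimal sketch using `requests` with urllib3's `Retry` (the retry counts, backoff factor, and status codes here are illustrative assumptions, not values from the original code):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(retries=3, backoff=0.5):
    """Build a requests.Session that retries transient server errors
    with exponential backoff instead of hanging or failing outright."""
    retry = Retry(
        total=retries,
        backoff_factor=backoff,               # 0.5s, 1s, 2s between attempts
        status_forcelist=(500, 502, 503, 504),  # retry only on server errors
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount('https://', adapter)
    session.mount('http://', adapter)
    return session

session = make_session()
# Inside hit() this would become (timeout values are an assumption):
# response = session.get(self.url, headers=self._headers, timeout=(3.05, 10))
```

A bounded retry with backoff also answers the stability worry: it adds only a few extra requests when the API misbehaves, rather than hammering it.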
I appreciate any help. The query part of the API is below.
```python
def hit(self):
    '''Request the API and return the JSON payload.'''
    response = requests.get(self.url, headers=self._headers)
    json = response.json()
    self._page = json['metadata']['pagination']['next_page']           # set next page
    self._total_pages = json['metadata']['pagination']['total_pages']  # set total pages
    return json['constituents']

def perform(self, chunk_size):
    '''Fetch up to chunk_size pages and return the accumulated records.'''
    i = 0
    data = []
    while self.has_next_page() and i < chunk_size:
        print(f'Page {self._page} of {self._total_pages} - '
              f'{self.percentagem(self._page, self._total_pages):.2f}% of process')
        data += self.hit()
        i += 1
    return data
```
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
