How can this async URL response checker be cleaner/faster?

import asyncio

import aiohttp

# read the list of names to check, one per line
with open('things.txt') as things:
    urls = [url.strip().lower() for url in things]

async def is_site_404(s, url):
    # a HEAD request is enough to read the status code
    async with s.head(f"https://example.com/{url}") as r1:
        if r1.status == 404:
            print('hello i am working')

async def create_tasks(urls):
    tasks = []
    async with aiohttp.ClientSession() as s:
        for url in urls:
            # only check names that are 5-15 alphanumeric characters
            if len(url) >= 5 and len(url) < 16 and url.isalnum():
                task = asyncio.create_task(is_site_404(s, url))
                tasks.append(task)
        return await asyncio.gather(*tasks)

while True:
    asyncio.get_event_loop().run_until_complete(create_tasks(urls))

Hi, this is a basic asynchronous URL response checker that I created. It's pretty fast, but I'm wondering if there is any way to get more requests per second with the base of this code. I have it designed so that it runs forever and just prints whenever it hits a 404 in this example. I'm pretty new to Python and coding in general, and I would like some guidance/advice from anyone who has more experience with this kind of thing. Maybe there is an aiohttp alternative I should use that's faster? Any advice is greatly appreciated.
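
To make it concrete what I mean by "more requests per second": from what I've read, aiohttp's default connection pool is capped at 100 connections, and you can raise that with a TCPConnector and cap in-flight requests with a semaphore. The sketch below is just my rough understanding of that idea, not something I've benchmarked; the names (check, main) and the limit of 200 are placeholders I made up.

import asyncio

import aiohttp

CONCURRENCY = 200  # arbitrary guess, not benchmarked

async def check(sem, session, url):
    # limit how many requests are in flight at once
    async with sem:
        async with session.head(f"https://example.com/{url}") as resp:
            if resp.status == 404:
                print(url, '-> 404')

async def main(urls):
    sem = asyncio.Semaphore(CONCURRENCY)
    # raise the total / per-host connection limits from the defaults
    connector = aiohttp.TCPConnector(limit=CONCURRENCY, limit_per_host=CONCURRENCY)
    async with aiohttp.ClientSession(connector=connector) as session:
        await asyncio.gather(*(check(sem, session, u) for u in urls))

# asyncio.run(main(urls))

Is something along those lines the right direction, or is the bottleneck elsewhere?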