Tornado 4.5 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    def worker():
        while True:
            yield fetch_url()

    q.put(base_url)

    # Start workers, then wait for the work queue to be empty.
    for _ in range(concurrency):
        worker()
    yield …

333 pages | 322.34 KB | 1 year ago
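The excerpt above is flattened and cut off mid-statement; the following is a minimal runnable sketch of the same worker/counter pattern in the Tornado 4.5 generator-coroutine style. fetch_url is stubbed out with gen.sleep, and the start URL and concurrency value are placeholders, since the excerpt elides the real bodies:

    from tornado import gen, ioloop, queues

    q = queues.Queue()
    concurrency = 10

    @gen.coroutine
    def fetch_url(url):
        # Placeholder for the real fetch-and-parse step.
        yield gen.sleep(0.01)

    @gen.coroutine
    def worker():
        while True:
            url = yield q.get()
            try:
                yield fetch_url(url)
            finally:
                q.task_done()  # decrement the counter exactly once per item

    @gen.coroutine
    def main():
        q.put("http://www.tornadoweb.org/en/stable/")  # placeholder base_url
        # Start workers, then wait for the work queue to be empty.
        for _ in range(concurrency):
            worker()
        yield q.join()

    if __name__ == "__main__":
        ioloop.IOLoop.current().run_sync(main)

Each put increments the queue's counter of unfinished items and each task_done decrements it; join resolves only when the counter reaches zero, which is exactly the termination condition the snippet describes.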
Tornado 6.1 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    # follow links beneath the base URL
    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

245 pages | 904.24 KB | 1 year ago
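The async-native excerpt (Tornado 5 and later) is likewise truncated; a minimal runnable sketch of the whole pattern, with fetch_url stubbed via gen.sleep and the start URL, timeout, and concurrency as placeholders, could look like this:

    from datetime import timedelta
    from tornado import gen, ioloop, queues

    q = queues.Queue()
    concurrency = 10

    async def fetch_url(url):
        # Placeholder for the real fetch-and-parse step.
        await gen.sleep(0.01)

    async def worker():
        async for url in q:      # Queue supports async iteration
            if url is None:      # a None sentinel tells the worker to exit
                return
            try:
                await fetch_url(url)
            finally:
                q.task_done()    # decrement the counter exactly once per item

    async def main():
        await q.put("http://www.tornadoweb.org/en/stable/")  # placeholder base_url
        workers = gen.multi([worker() for _ in range(concurrency)])
        await q.join(timeout=timedelta(seconds=300))  # placeholder timeout
        for _ in range(concurrency):
            await q.put(None)    # wake every idle worker so it can exit
        await workers

    if __name__ == "__main__":
        ioloop.IOLoop.current().run_sync(main)

The None sentinels are put only after join resolves, so they never race with real work; each worker consumes exactly one sentinel and returns.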
Tornado 4.5 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    … task_done()

    @gen.coroutine
    def worker():
        while True:
            yield fetch_url()

    q.put(base_url)

    # Start workers, then wait for the work queue to be empty.
    for _ in range(concurrency):
        worker()
    yield q.join(timeou…

222 pages | 833.04 KB | 1 year ago
Tornado 5.1 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    # follow links beneath the base URL
    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

243 pages | 895.80 KB | 1 year ago
Tornado 6.0 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    # follow links beneath the base URL
    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return …

245 pages | 885.76 KB | 1 year ago
Tornado 6.4 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

268 pages | 1.09 MB | 1 year ago
Tornado 6.2 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

260 pages | 1.06 MB | 1 year ago
Tornado 6.4 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

268 pages | 1.09 MB | 1 year ago
Tornado 6.4 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

268 pages | 1.09 MB | 1 year ago
Tornado 6.3 Documentation
containing only base_url. When a worker fetches a page it parses the links and puts new ones in the queue, then calls task_done to decrement the counter once. Eventually, a worker fetches a page whose URLs have all been seen before, and there is also no work left in the queue. Thus that worker’s call to task_done decrements the counter to zero. The main coroutine, which is waiting for join, is unpaused and …

    if new_url.startswith(base_url):
        await q.put(new_url)

    async def worker():
        async for url in q:
            if url is None:
                return
            try:
                await fetch_url(url)
            except Exception as …

264 pages | 1.06 MB | 1 year ago
20 results in total · page 1 of 2