_ApifyRequestQueueSharedClient
An Apify platform implementation of the request queue client. This implementation supports scenarios with multiple producers and multiple consumers.
Methods
__init__
Initialize a new instance.
Preferably use the ApifyRequestQueueClient.open class method to create a new instance.
Parameters
keyword-only api_client: RequestQueueClientAsync
keyword-only metadata: RequestQueueMetadata
keyword-only cache_size: int
keyword-only metadata_getter: Callable[[], Coroutine[Any, Any, ApifyRequestQueueMetadata]]
Returns None
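Since direct construction is discouraged, the sketch below obtains a client through the open class method instead. The import path and the name keyword argument are assumptions made for illustration; check the SDK for the exact signature of ApifyRequestQueueClient.open.

```python
import asyncio

# The import path below is an assumption; adjust it to wherever
# ApifyRequestQueueClient is exposed in your installed SDK version.
from apify.storage_clients import ApifyRequestQueueClient


async def main() -> None:
    # Prefer the `open` class method over calling `__init__` directly.
    # The `name` keyword argument is an assumption about its signature.
    client = await ApifyRequestQueueClient.open(name='my-queue')
    print(await client.is_empty())


asyncio.run(main())
```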
add_batch_of_requests
Add a batch of requests to the queue.
Parameters
requests: Sequence[Request]
The requests to add.
optional, keyword-only forefront: bool = False
Whether to add the requests to the beginning of the queue.
Returns AddRequestsResponse
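A hedged sketch of enqueuing a batch of requests, assuming a client obtained as shown above and that the Request model is importable from crawlee (the import path is believed correct but not confirmed by this page):

```python
from collections.abc import Sequence

from crawlee import Request  # Request model; import path assumed


async def enqueue_start_urls(client) -> None:
    # Build a small batch of requests and add it to the queue in one call.
    requests: Sequence[Request] = [
        Request.from_url('https://example.com/page-1'),
        Request.from_url('https://example.com/page-2'),
    ]
    # forefront=True would place the batch at the beginning of the queue instead.
    response = await client.add_batch_of_requests(requests, forefront=False)
    print(response)
```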
fetch_next_request
Return the next request in the queue to be processed.
Once you successfully finish processing the request, call mark_request_as_handled to mark the request as handled in the queue. If there was an error while processing the request, call reclaim_request instead, so that the queue will give the request to another consumer in a later call to the fetch_next_request method.
Returns Request | None
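The typical consumer loop combines this method with mark_request_as_handled and reclaim_request (both described below). A sketch, assuming a client of this type and a user-supplied process coroutine:

```python
async def consume(client, process) -> None:
    """Drain the queue: fetch, process, then mark handled or reclaim on failure."""
    while True:
        request = await client.fetch_next_request()
        if request is None:
            # No request is currently available for this consumer.
            break
        try:
            await process(request)
        except Exception:
            # Processing failed: return the request so another consumer
            # (or a later fetch_next_request call) can retry it.
            await client.reclaim_request(request)
        else:
            # Success: the request will never be returned by
            # fetch_next_request again.
            await client.mark_request_as_handled(request)
```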
get_request
Get a request by unique key.
Parameters
unique_key: str
Unique key of the request to get.
Returns Request | None
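A small sketch of looking a request up by its unique key. The key value used below is only an assumption; use whatever unique key your requests were created with.

```python
async def show_request(client) -> None:
    # Look a request up by its unique key (assumed here to be the URL).
    request = await client.get_request('https://example.com/page-1')
    if request is None:
        print('No request with that unique key is in the queue.')
    else:
        print(request.url)
```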
is_empty
Check if the queue is empty.
Returns bool
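For example, a coordinating task can poll is_empty to wait until all requests have been both fetched and handled (a sketch, assuming client is an instance of this class):

```python
import asyncio


async def wait_until_drained(client, poll_seconds: float = 5.0) -> None:
    # Keep polling until the queue reports that it has no pending requests.
    while not await client.is_empty():
        await asyncio.sleep(poll_seconds)
    print('Request queue is empty.')
```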
mark_request_as_handled
Mark a request as handled after successful processing.
Handled requests will never again be returned by the fetch_next_request method.
Parameters
request: Request
The request to mark as handled.
Returns ProcessedRequest | None
reclaim_request
Reclaim a failed request back to the queue.
The request will be returned for processing again later by another call to fetch_next_request.
Parameters
request: Request
The request to return to the queue.
optional, keyword-only forefront: bool = False
Whether to add the request to the head or the end of the queue.
Returns ProcessedRequest | None