PHP worker (multiple processes and/or threads)
I'm trying to fetch statistical data from a web service. Each request has a response time of 1-2 seconds, and I have to submit a request for thousands of ids, one at a time. All the requests together add up to a few hours, because of the server's response time.
I want to parallelize the requests as much as possible (the server can handle it). I've installed PHP 7 and pthreads (CLI only), but the maximum number of threads is limited (20 in the Windows PHP CLI), so I would have to start multiple processes.
Is there a simple PHP-based framework/library for multi-process/pthreads and job-queue handling? I don't need a large framework like Symfony or Laravel.
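For context, a bare-bones sketch of the "start multiple processes" approach described above, without any framework: a parent CLI script chunks the ids and launches one child PHP process per chunk with proc_open(). The worker.php script, the chunk size, and the log file names are assumptions, not part of the original post.

$ids    = range(1, 1000);          // placeholder ids
$chunks = array_chunk($ids, 100);  // 100 ids per child process

$processes = [];
foreach ($chunks as $i => $chunk) {
    // worker.php is assumed to read $argv[1], split it on ',' and fetch each id.
    $cmd = 'php worker.php ' . escapeshellarg(implode(',', $chunk));
    $processes[] = proc_open($cmd, [
        1 => ['file', "worker_$i.out", 'w'],  // child stdout -> log file
        2 => ['file', "worker_$i.err", 'w'],  // child stderr -> log file
    ], $pipes);
}

// proc_close() blocks until a child exits, so this waits for all of them.
foreach ($processes as $proc) {
    proc_close($proc);
}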
Workers
You could try using php-resque, which doesn't require pthreads.
You do have to run a local Redis server though (it could also be remote). I believe you can run Redis on Windows, according to this SO post.
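A minimal sketch of that approach, assuming the chrisboulton/php-resque package and a Redis server on localhost:6379; the class name, queue name, endpoint, and worker command are illustrative, not prescribed by the answer.

require 'vendor/autoload.php';

// Tell php-resque where Redis lives (assumption: local default port).
Resque::setBackend('localhost:6379');

// Job class: a worker process instantiates it and calls perform().
class FetchStatsJob
{
    public function perform()
    {
        $id = $this->args['id'];
        // Fetch the statistics for this single id from the web service.
        $data = file_get_contents('http://example.com/stats/' . urlencode($id));
        // ... parse and store $data ...
    }
}

$ids = range(1, 1000); // placeholder ids

// Enqueue one job per id; any number of workers can drain the queue in parallel.
foreach ($ids as $id) {
    Resque::enqueue('stats', 'FetchStatsJob', ['id' => $id]);
}

// Start one or more worker processes (each is its own PHP process), e.g.:
//   QUEUE=stats APP_INCLUDE=jobs.php php bin/resque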
Concurrent requests
You may also want to look into sending concurrent requests using GuzzleHttp; you can find examples of how to use it here.
From the docs:
You can send multiple requests concurrently using promises and asynchronous requests.
use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait on all of the requests to complete. Throws a ConnectException
// if any of the requests fail
$results = Promise\unwrap($promises);

// Wait for the requests to complete, even if some of them fail
$results = Promise\settle($promises)->wait();

// You can access each result using the key provided to the unwrap
// function.
echo $results['image']->getHeader('Content-Length')[0];
echo $results['png']->getHeader('Content-Length')[0];
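For thousands of ids, a fixed concurrency limit is usually more practical than unwrapping one big promise array. Below is a sketch using GuzzleHttp\Pool (Guzzle 6+); the base URI, endpoint path, ids, and concurrency value are placeholders.

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

require 'vendor/autoload.php';

$client = new Client(['base_uri' => 'http://example.com/']);
$ids = range(1, 1000); // placeholder ids

// A generator yields one request per id lazily, so nothing is sent yet.
$requests = function ($ids) {
    foreach ($ids as $id) {
        yield new Request('GET', 'stats/' . $id); // placeholder endpoint
    }
};

$pool = new Pool($client, $requests($ids), [
    'concurrency' => 20, // at most 20 requests in flight at once
    'fulfilled' => function ($response, $index) {
        // handle one successful response here (e.g. parse and store it)
    },
    'rejected' => function ($reason, $index) {
        // handle one failed request here (e.g. log it and retry later)
    },
]);

// Start the transfers and block until the whole pool has finished.
$pool->promise()->wait();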