How to run Locust with multiprocessing on a single machine

I want Locust to use all cores on my PC.

I have many Locust classes and I want to use Locust as a library.

Example of my code:

import time

import gevent
from locust import HttpUser, TaskSet, task, between
from locust.env import Environment
from locust.log import setup_logging
from locust.stats import stats_printer


def index(l):
    l.client.get("/")

def stats(l):
    l.client.get("/stats/requests")

class UserTasks(TaskSet):
    # one can specify tasks like this
    tasks = [index, stats]

    # but it might be convenient to use the @task decorator
    @task
    def page404(self):
        self.client.get("/does_not_exist")

class WebsiteUser(HttpUser):
    """
    User class that does requests to the locust web server running on localhost
    """
    host = "http://127.0.0.1:8089"
    wait_time = between(2, 5)
    tasks = [UserTasks]

def worker():
    env2 = Environment(user_classes=[WebsiteUser])
    env2.create_worker_runner(master_host="127.0.0.1", master_port=50013)
    # env2.runner.start(10, hatch_rate=1)
    env2.runner.greenlet.join()

def master():
    env1 = Environment(user_classes=[WebsiteUser])
    env1.create_master_runner(master_bind_host="127.0.0.1", master_bind_port=50013)
    env1.create_web_ui("127.0.0.1", 8089)
    env1.runner.start(20, hatch_rate=4)
    env1.runner.greenlet.join()

import multiprocessing
from multiprocessing import Process


procs = []

proc = Process(target=master)
procs.append(proc)
proc.start()

time.sleep(5)

for i in range(multiprocessing.cpu_count()):
    proc = Process(target=worker)  # instantiating without any argument
    procs.append(proc)
    proc.start()

for process in procs:
    process.join()

This code doesn't work correctly.

(env) ➜  test_locust python main3.py
You are running in distributed mode but have no worker servers connected. Please connect workers prior to swarming.
Traceback (most recent call last):
  File "src/gevent/greenlet.py", line 854, in gevent._gevent_cgreenlet.Greenlet.run
  File "/home/alex/projects/performance/env/lib/python3.6/site-packages/locust/runners.py", line 532, in client_listener
    client_id, msg = self.server.recv_from_client()
  File "/home/alex/projects/performance/env/lib/python3.6/site-packages/locust/rpc/zmqrpc.py", line 44, in recv_from_client
    msg = Message.unserialize(data[1])
  File "/home/alex/projects/performance/env/lib/python3.6/site-packages/locust/rpc/protocol.py", line 18, in unserialize
    msg = cls(*msgpack.loads(data, raw=False, strict_map_key=False))
  File "msgpack/_unpacker.pyx", line 161, in msgpack._unpacker.unpackb
TypeError: unpackb() got an unexpected keyword argument 'strict_map_key'
2020-08-13T11:21:10Z <Greenlet at 0x7f8cf300c848: <bound method MasterRunner.client_listener of <locust.runners.MasterRunner object at 0x7f8cf2f531d0>>> failed with TypeError

Unhandled exception in greenlet: <Greenlet at 0x7f8cf300c848: <bound method MasterRunner.client_listener of <locust.runners.MasterRunner object at 0x7f8cf2f531d0>>>
Traceback (most recent call last):
  File "src/gevent/greenlet.py", line 854, in gevent._gevent_cgreenlet.Greenlet.run
  File "/home/alex/projects/performance/env/lib/python3.6/site-packages/locust/runners.py", line 532, in client_listener
    client_id, msg = self.server.recv_from_client()
  File "/home/alex/projects/performance/env/lib/python3.6/site-packages/locust/rpc/zmqrpc.py", line 44, in recv_from_client
    msg = Message.unserialize(data[1])
  File "/home/alex/projects/performance/env/lib/python3.6/site-packages/locust/rpc/protocol.py", line 18, in unserialize
    msg = cls(*msgpack.loads(data, raw=False, strict_map_key=False))
  File "msgpack/_unpacker.pyx", line 161, in msgpack._unpacker.unpackb
TypeError: unpackb() got an unexpected keyword argument 'strict_map_key'

ACTUAL RESULT: the workers do not connect to the master and run users without a master.

EXPECTED RESULT: the workers run only under the master.

What is wrong?



Solution 1:[1]

You cannot use multiprocessing together with Locust/gevent (or at least it is known to cause issues): multiprocessing forks the parent after gevent has already set up its event loop, and that state does not survive the fork cleanly, which is why the workers misbehave.

Instead, spawn separate OS processes using subprocess or something completely external to Locust. Perhaps you could modify locust-swarm (https://github.com/SvenskaSpel/locust-swarm) to make it able to run worker processes on the same machine.
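As a minimal sketch of that approach: instead of forking with multiprocessing, launch each master and worker as a fresh interpreter via the Locust command line. The locustfile path and port below are placeholders for your own setup, and this assumes Locust is installed in the interpreter that runs the script; the user classes live in the locustfile rather than being passed in as a library.

```python
import multiprocessing
import subprocess
import sys

LOCUSTFILE = "locustfile.py"  # hypothetical file defining WebsiteUser
MASTER_PORT = 50013


def master_cmd():
    # Master process: serves the web UI and coordinates the workers.
    return [
        sys.executable, "-m", "locust",
        "-f", LOCUSTFILE,
        "--master",
        "--master-bind-host", "127.0.0.1",
        "--master-bind-port", str(MASTER_PORT),
    ]


def worker_cmd():
    # Worker process: connects to the master and runs the users.
    return [
        sys.executable, "-m", "locust",
        "-f", LOCUSTFILE,
        "--worker",
        "--master-host", "127.0.0.1",
        "--master-port", str(MASTER_PORT),
    ]


def run():
    # Each Popen call starts a brand-new interpreter, so every process
    # gets its own clean gevent loop -- no state is inherited via fork.
    procs = [subprocess.Popen(master_cmd())]
    for _ in range(multiprocessing.cpu_count()):  # one worker per core
        procs.append(subprocess.Popen(worker_cmd()))
    for p in procs:
        p.wait()
```

Call `run()` to start the swarm; the key design point is that nothing here imports Locust into the launcher process itself, so gevent is never initialized before the children are created.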

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Cyberwiz