Setting Up Celery Cluster with RabbitMQ

I will be using 3 machines for this example: 1 master node and 2 worker nodes. We need to do the following steps on all of the nodes:

```
sudo pip install celery
sudo rabbitmqctl add_user <username> <password>
sudo rabbitmqctl add_vhost <vhost_name>
sudo rabbitmqctl set_permissions -p <vhost_name> <username> ".*" ".*" ".*"
```

You can replace `<username>`, `<password>` and `<vhost_name>` with whatever you want. We will use these names later on.
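Celery connects to RabbitMQ through an AMQP broker URL built from these credentials. A quick sketch of the URL format (the username, password, host, and vhost below are made-up placeholders; substitute your own values):

```python
# Build the AMQP broker URL Celery expects from the RabbitMQ
# credentials created above (all values are illustrative placeholders).
username = "myuser"
password = "mypassword"
host = "master"        # hostname of the RabbitMQ server
vhost = "myvhost"

broker_url = f"amqp://{username}:{password}@{host}:5672/{vhost}"
print(broker_url)  # amqp://myuser:mypassword@master:5672/myvhost
```

You will pass this URL as the `broker` argument when creating the Celery app.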

You can check the status of the RabbitMQ server using
`sudo rabbitmqctl status`
Start and stop the server using
`sudo rabbitmq-server -detached`
`sudo rabbitmqctl stop`

### Configuring and starting the cluster
Here is a summary of the steps you need to follow to configure and start a RabbitMQ cluster for your Celery project.
First, configure the Erlang cookie: make sure the `.erlang.cookie` file contains the same string on all the servers.
Make sure RabbitMQ is stopped on all nodes before changing the `.erlang.cookie` file.

```
sudo chmod 777 /var/lib/rabbitmq/.erlang.cookie
sudo vi /var/lib/rabbitmq/.erlang.cookie
sudo chmod 400 /var/lib/rabbitmq/.erlang.cookie
```

Make sure the `/etc/hosts` file on each machine has entries for the other machines in the cluster. You can verify this by pinging between machines.
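For example, with hypothetical hostnames and addresses (adjust to your own network), `/etc/hosts` on each node might contain:

```
192.168.1.10  master
192.168.1.11  worker1
192.168.1.12  worker2
```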

Start RabbitMQ in detached mode on the master node.

Then start the RabbitMQ server in detached mode on the worker nodes and run the following commands on each worker:

```
sudo rabbitmqctl stop_app
sudo rabbitmqctl reset
sudo rabbitmqctl cluster rabbit@<server_name/ip>
sudo rabbitmqctl start_app
```

(Note: on newer RabbitMQ releases the `cluster` subcommand has been replaced by `join_cluster`.)

You can check the nodes in the cluster using `sudo rabbitmqctl cluster_status`.

### Starting a Celery worker

`celery worker -A tasks -l info`
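The `-A tasks` flag assumes a module named `tasks.py` defining the Celery app. A minimal sketch (the `add` task and the broker placeholders are illustrative, not part of any standard setup; substitute the user, password, host, and vhost you created above):

```python
# tasks.py -- minimal Celery application sketch (names are illustrative)
from celery import Celery

# Broker URL built from the user/vhost created with rabbitmqctl earlier;
# replace the placeholders with your actual values.
app = Celery("tasks", broker="amqp://<username>:<password>@<master_host>:5672/<vhost_name>")

@app.task
def add(x, y):
    # Trivial example task; it runs on whichever worker picks it up.
    return x + y
```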

To start a Celery worker that consumes from a specific queue:
`celery worker -A tasks -Q <queue_name>`

Use `celery multi start` to start multiple workers, for example `celery multi start w1 w2 -A tasks -l info`.