A simple distributed task queue implementation in Go using Redis, where tasks are added, prioritized, and processed by workers. This system demonstrates how to use Redis lists to manage task queues, enabling both high- and normal-priority tasks to be processed efficiently.
- Task Queueing: Tasks can be added to a queue with either high or normal priority.
- Task Scheduling: Tasks can be scheduled to run at a specified time.
- Worker: A worker processes tasks based on their priority (high priority tasks first).
- REST API: A simple HTTP interface for adding, bulk uploading, and scheduling tasks.
- Go: The programming language for implementing the producer, worker, and queue systems.
- Redis: Used as the underlying message queue to manage and store tasks.
- HTTP Server: Simple HTTP handlers to interact with the task queue through API endpoints.
- Go version 1.18 or later
- Redis (installed locally or accessible via a URL)
git clone https://github.com/likhithkp/distributed-task-queue.git
cd distributed-task-queue
Ensure that you have Redis running. If Redis is not installed locally, you can either download and install Redis or use a hosted solution.
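If Docker is available, one quick way to start a local Redis instance is (the image tag is only an example):
docker run -d -p 6379:6379 redis:7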
Install required Go packages:
go mod tidy
You can start the application by running:
go run main.go
This will start the HTTP server on `http://localhost:3000`.
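For orientation, here is a minimal sketch of how `main.go` might wire these pieces together. The function names (`runWorker`, `handleAddTask`, `handleBulkUpload`) and the Redis address are placeholders for illustration, not the identifiers actually used in this repository:

```go
package main

import (
	"log"
	"net/http"

	"github.com/redis/go-redis/v9"
)

// The worker and handler functions below are stand-ins for the real
// worker and producer packages in this repository.
func runWorker(rdb *redis.Client) { /* see the worker sketch further down */ }

func handleAddTask(rdb *redis.Client) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) { /* enqueue a single task */ }
}

func handleBulkUpload(rdb *redis.Client) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) { /* enqueue a batch of tasks */ }
}

func main() {
	// Connect to Redis (address is an example; adjust for your setup).
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// Run the worker in the background so it drains the queues while
	// the HTTP server keeps accepting new tasks.
	go runWorker(rdb)

	http.HandleFunc("/tasks", handleAddTask(rdb))
	http.HandleFunc("/tasks/bulk", handleBulkUpload(rdb))

	log.Println("listening on :3000")
	log.Fatal(http.ListenAndServe(":3000", nil))
}
```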
You can add a task to the queue by sending a POST request to `/tasks`. For example, using `curl`:
curl -X POST http://localhost:3000/tasks -d '{"priority":true,"task":"high-priority task"}'
This will add a high-priority task to the Redis queue.
You can also upload multiple tasks at once using the `/tasks/bulk` endpoint:
curl -X POST http://localhost:3000/tasks/bulk -d '[{"priority":true,"task":"task 1"}, {"priority":false,"task":"task 2"}]'
This adds multiple tasks in a single request, each with its own priority.
The worker will continuously listen for tasks in the queue and process them. The worker prioritizes high-priority tasks and processes them first. After all high-priority tasks are processed, it will begin processing normal-priority tasks.
- `main.go`: Entry point of the application; sets up the Redis connection, starts the worker, and initializes the HTTP server.
- `producer`: Contains the functions responsible for adding tasks to the queue (`AddToQueue` and `BulkUpload`).
- `queue`: Manages the Redis client connection and queue operations.
- `worker`: Contains the worker logic for fetching and processing tasks from the queue.
- `shared`: Contains common structs, such as `Task` (sketched below).
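Judging from the request bodies in the API section below, the `Task` struct in `shared` presumably looks something like this; the field names and JSON tags are inferred from those bodies rather than copied from the source:

```go
package shared

import "time"

// Task is the unit of work passed from the producer to the worker.
// Priority selects the queue (true = high, false = normal); Time is
// only set for scheduled tasks.
type Task struct {
	Priority bool       `json:"priority"`
	Task     string     `json:"task"`
	Time     *time.Time `json:"time,omitempty"`
}
```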
- Description: Adds a single task to the queue (`POST /tasks`; a producer sketch follows this endpoint).
- Request Body:
{ "priority": true, "task": "Task description" }
- Response: HTTP status `200 OK` if the task was added successfully.
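A minimal sketch of how a producer function such as `AddToQueue` could route a task to a priority-specific Redis list. The key names `tasks:high` and `tasks:normal` (and the exact function signature) are assumptions for illustration, not taken from the repository:

```go
package producer

import (
	"context"
	"encoding/json"

	"github.com/redis/go-redis/v9"
)

type Task struct {
	Priority bool   `json:"priority"`
	Task     string `json:"task"`
}

// AddToQueue serializes the task and pushes it onto the list matching
// its priority, so the worker can drain high-priority tasks first.
func AddToQueue(ctx context.Context, rdb *redis.Client, t Task) error {
	payload, err := json.Marshal(t)
	if err != nil {
		return err
	}
	key := "tasks:normal" // assumed key name
	if t.Priority {
		key = "tasks:high" // assumed key name
	}
	return rdb.LPush(ctx, key, payload).Err()
}
```

Pushing with `LPUSH` and popping with `BRPOP` from the opposite end keeps FIFO ordering within each priority level.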
- Description: Adds multiple tasks to the queue in bulk (`POST /tasks/bulk`; a bulk-upload sketch follows this endpoint).
- Request Body:
[ { "priority": true, "task": "Task 1" }, { "priority": false, "task": "Task 2" } ]
- Response: HTTP status `200 OK` if all tasks were added successfully.
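The bulk endpoint presumably decodes the JSON array and enqueues each element. The sketch below additionally batches the pushes in a Redis pipeline to keep the whole upload to one round trip, which is an optimization assumption rather than something stated above:

```go
package producer

import (
	"context"
	"encoding/json"
	"io"

	"github.com/redis/go-redis/v9"
)

type BulkTask struct {
	Priority bool   `json:"priority"`
	Task     string `json:"task"`
}

// BulkUpload reads a JSON array of tasks and pushes each one onto the
// list matching its priority, in a single pipelined round trip.
func BulkUpload(ctx context.Context, rdb *redis.Client, body io.Reader) error {
	var tasks []BulkTask
	if err := json.NewDecoder(body).Decode(&tasks); err != nil {
		return err
	}
	pipe := rdb.Pipeline()
	for _, t := range tasks {
		payload, err := json.Marshal(t)
		if err != nil {
			return err
		}
		key := "tasks:normal"
		if t.Priority {
			key = "tasks:high"
		}
		pipe.LPush(ctx, key, payload)
	}
	_, err := pipe.Exec(ctx)
	return err
}
```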
- Description: Schedules a task to run at the specified time (a scheduling sketch follows this endpoint).
- Request Body:
{ "priority": true, "task": "Task description", "time": "2006-01-02T15:04:05Z07:00" }
- Response: HTTP status `200 OK` if the task was scheduled successfully.
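The scheduled queue is commonly implemented as a Redis sorted set scored by execution time. Here is a sketch under that assumption; the key `tasks:scheduled` and the function name are made up for this example:

```go
package producer

import (
	"context"
	"encoding/json"
	"time"

	"github.com/redis/go-redis/v9"
)

type ScheduledTask struct {
	Priority bool      `json:"priority"`
	Task     string    `json:"task"`
	Time     time.Time `json:"time"`
}

// Schedule stores the task in a sorted set scored by its Unix execution
// time, so the worker can later ask "which members have a score <= now?".
func Schedule(ctx context.Context, rdb *redis.Client, t ScheduledTask) error {
	payload, err := json.Marshal(t)
	if err != nil {
		return err
	}
	return rdb.ZAdd(ctx, "tasks:scheduled", redis.Z{
		Score:  float64(t.Time.Unix()),
		Member: payload,
	}).Err()
}
```

With this layout, the periodic check described below amounts to a `ZRANGEBYSCORE` from `-inf` to the current time, followed by a `ZREM` and a push onto the appropriate priority list for each due task.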
- The worker processes tasks from the Redis queue (a sketch follows this list):
  - It processes high-priority tasks first.
  - Once all high-priority tasks are completed, it moves on to normal-priority tasks.
- Tasks are processed by popping them from the queue and executing them. If a task fails, it is moved to a dead-letter queue (DLQ) for retries.
- Scheduled tasks are handled by checking the scheduled queue periodically and executing each task when its scheduled time arrives.
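A sketch of the worker loop under the same assumed key names. Listing the high-priority key first makes `BRPOP` drain it before falling back to the normal queue; failed payloads are parked on an assumed `tasks:dlq` list:

```go
package worker

import (
	"context"
	"log"
	"time"

	"github.com/redis/go-redis/v9"
)

// Run blocks until the context is cancelled, popping tasks from Redis.
// BRPOP checks the keys in the order given, so "tasks:high" is always
// drained before "tasks:normal".
func Run(ctx context.Context, rdb *redis.Client) {
	for ctx.Err() == nil {
		res, err := rdb.BRPop(ctx, 5*time.Second, "tasks:high", "tasks:normal").Result()
		if err == redis.Nil {
			continue // nothing arrived within the timeout; poll again
		}
		if err != nil {
			log.Printf("brpop failed: %v", err)
			continue
		}
		// res is [key, value]; the value is the JSON-encoded task.
		payload := res[1]
		if err := process(payload); err != nil {
			// Park the failed task on a dead-letter queue for later retries.
			rdb.LPush(ctx, "tasks:dlq", payload)
		}
	}
}

// process stands in for whatever the real task execution does.
func process(payload string) error {
	log.Printf("processing task: %s", payload)
	return nil
}
```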
- Multiple Workers: Deploy multiple workers to process more tasks concurrently.
- Retry Logic: Implement retries for failed tasks or failed processing attempts.
- Metrics & Monitoring: Add logging and monitoring for better visibility into task processing.
This distributed task queue can serve as a foundation for task scheduling in applications where processing order and prioritization matter.