
Background Tasks with Celery

This page documents how Celery is used for asynchronous task processing in Arctyk ITSM.


Overview

Celery handles background tasks that shouldn't block HTTP requests:

  • Email notifications
  • Report generation
  • Scheduled tasks
  • External API calls
  • Bulk operations

Configuration

Celery Setup

# config/celery.py
import os

from celery import Celery

# Load Django settings before configuring Celery (adjust the path if your settings module differs)
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings')

app = Celery('arctyk')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
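
So that the app is loaded whenever Django starts and @shared_task binds to it, the standard Celery/Django integration also imports it from the project package's __init__.py. A minimal sketch, assuming the project package is config:

# config/__init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)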

Settings

# settings.py
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
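
In production the broker and result backend URLs are usually not hard-coded. A minimal sketch of reading them from the environment, keeping the local Redis instance as a development fallback (the environment variable names are illustrative):

# settings.py
import os

CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://localhost:6379/0')
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', CELERY_BROKER_URL)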

Task Examples

Email Notification Task

# tickets/tasks.py
from celery import shared_task
from django.core.mail import send_mail

from .models import Ticket

@shared_task(bind=True, max_retries=3)
def send_ticket_notification(self, ticket_id, action):
    """Send email when ticket is created/updated."""
    try:
        ticket = Ticket.objects.get(id=ticket_id)

        send_mail(
            subject=f'Ticket #{ticket.id}: {action}',
            message=f'Ticket {ticket.title} was {action}',
            from_email='noreply@arctyk.dev',
            recipient_list=[ticket.assignee.email],
        )
    except Exception as exc:
        # Retry with exponential backoff
        raise self.retry(exc=exc, countdown=60 * (2 ** self.request.retries))

Scheduled Report Task

from datetime import date
from django.core.mail import EmailMessage

@shared_task
def generate_daily_report():
    """Generate and email daily ticket report."""
    tickets = Ticket.objects.filter(created_at__date=date.today())

    # Generate PDF report (project helper that returns the PDF as bytes)
    pdf = generate_pdf_report(tickets)

    # send_mail() does not accept attachments, so build an EmailMessage
    email = EmailMessage(
        subject='Daily Ticket Report',
        body='See attached report',
        from_email='reports@arctyk.dev',
        to=['manager@example.com'],
    )
    email.attach('report.pdf', pdf, 'application/pdf')
    email.send()

Calling Tasks

Async (Non-blocking)

# Queue task for execution
send_ticket_notification.delay(ticket.id, 'created')

ETA (Scheduled)

from datetime import timedelta
from django.utils import timezone

# Run in 1 hour (timezone-aware datetime, consistent with CELERY_TIMEZONE)
send_ticket_notification.apply_async(
    args=[ticket.id, 'reminder'],
    eta=timezone.now() + timedelta(hours=1)
)

Retry on Failure

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def unreliable_task(self):
    try:
        # Task logic
        pass
    except Exception as exc:
        raise self.retry(exc=exc)
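
Celery can also generate the retry boilerplate itself via autoretry_for and retry_backoff. A sketch of an equivalent task, not necessarily how Arctyk's tasks are declared:

# Automatic retries with exponential backoff, capped at max_retries
@shared_task(autoretry_for=(Exception,), retry_backoff=True, max_retries=3)
def unreliable_task_auto():
    ...  # task logic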

Periodic Tasks

Celery Beat

Schedule tasks to run periodically:

# config/celery.py
from celery.schedules import crontab

app.conf.beat_schedule = {
    'daily-report': {
        'task': 'tickets.tasks.generate_daily_report',
        'schedule': crontab(hour=8, minute=0),  # 8 AM daily
    },
    'cleanup-old-sessions': {
        'task': 'core.tasks.cleanup_sessions',
        'schedule': crontab(hour=2, minute=0),  # 2 AM daily
    },
}
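
The cleanup-old-sessions entry refers to a task in core/tasks.py. A minimal sketch of what such a task could look like, assuming it simply clears expired Django sessions:

# core/tasks.py
from celery import shared_task
from django.core.management import call_command

@shared_task
def cleanup_sessions():
    """Remove expired sessions via Django's clearsessions command."""
    call_command('clearsessions')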

Starting Celery Beat

celery -A config beat -l info

Monitoring

Flower

Web-based monitoring tool:

# Install
pip install flower

# Run
celery -A config flower

Access at http://localhost:5555

Task States

  • PENDING - Task waiting to be picked up
  • STARTED - Task execution has begun
  • SUCCESS - Task completed successfully
  • FAILURE - Task failed
  • RETRY - Task is being retried
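
States can be checked programmatically through AsyncResult. A minimal sketch, relying on the result backend configured above (task_id is whatever .delay() or .apply_async() returned):

from celery.result import AsyncResult

result = AsyncResult(task_id)
print(result.state)        # one of the states listed above
if result.failed():
    print(result.traceback)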

Best Practices

  1. Keep tasks idempotent - Safe to run multiple times
  2. Set timeouts - Prevent tasks from running forever (see the sketch after this list)
  3. Use retries - Handle transient failures
  4. Monitor task queue - Prevent backlog
  5. Log appropriately - Debug failed tasks
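
A sketch of item 2 using Celery's per-task time limits (the task name and limit values are illustrative):

from celery.exceptions import SoftTimeLimitExceeded

# soft_time_limit raises SoftTimeLimitExceeded inside the task so it can
# clean up; time_limit hard-kills the worker process shortly after.
@shared_task(bind=True, soft_time_limit=300, time_limit=360)
def long_running_export(self):
    try:
        ...  # task logic
    except SoftTimeLimitExceeded:
        pass  # clean up partial work before the hard limit hits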

Running Celery

Development

# Worker
celery -A config worker -l info

# Beat (scheduler)
celery -A config beat -l info

Production

# Worker with concurrency
celery -A config worker -l info --concurrency=4

# Daemonize with systemd
systemctl start celery celerybeat