build: upgraded deps and local kafka installation
Installs openedx-events, edx-event-bus-kafka, and related packages for local development
of Kafka events. Upgrades deps as a result of installing the events packages.
Provides an example of bringing the Kafka broker up and producing/consuming
a test event.
ENT-8761
iloveagent57 committed May 15, 2024
1 parent 153cc7b commit 661ece1
Showing 12 changed files with 466 additions and 110 deletions.
32 changes: 26 additions & 6 deletions Makefile
@@ -165,6 +165,9 @@ detect_changed_source_translations: ## check if translation files are up-to-date

validate_translations: fake_translations detect_changed_source_translations ## install fake translations and check if translation files are up-to-date

docker_build_no_cache:
	docker-compose build --no-cache

docker_build:
	docker build . -f Dockerfile --target app -t openedx/enterprise-access
	docker build . -f Dockerfile --target app -t openedx/enterprise-access.worker
@@ -190,20 +193,37 @@ dev.provision:
	bash ./provision-enterprise-access.sh

# devstack-themed shortcuts
dev.up: # Starts all containers
# Starts all containers
dev.up: dev.up.redis
	docker-compose up -d

dev.up.build: docker_build
dev.up.build: docker_build dev.up.redis
	docker-compose up -d

dev.up.build-no-cache:
	docker-compose build --no-cache
dev.up.build-no-cache: docker_build_no_cache dev.up.redis
	docker-compose up -d

dev.down: # Kills containers and all of their data that isn't in volumes
dev.up.with-events: dev.up.kafka-control-center dev.up

# Start redis via the devstack docker-compose.yml
dev.up.redis:
	docker-compose -f $(DEVSTACK_WORKSPACE)/devstack/docker-compose.yml up -d redis

# Start kafka via the devstack docker-compose.yml
# https://github.com/openedx-unsupported/devstack/blob/323b475b885a2704489566b262e2895a4dca62b6/docker-compose.yml#L140
dev.up.kafka-control-center:
	docker-compose -f $(DEVSTACK_WORKSPACE)/devstack/docker-compose.yml up -d kafka-control-center

# Useful for just restarting everything related to the event broker
dev.down.kafka-control-center:
	docker-compose -f $(DEVSTACK_WORKSPACE)/devstack/docker-compose.yml down kafka zookeeper schema-registry kafka-control-center

# Kills containers and all of their data that isn't in volumes
dev.down:
	docker-compose down

dev.stop: # Stops containers so they can be restarted
# Stops containers so they can be restarted
dev.stop:
	docker-compose stop

dev.backup:
34 changes: 34 additions & 0 deletions README.rst
@@ -31,6 +31,40 @@ Running migrations
    $ make app-shell
    # python ./manage.py migrate

Setting up openedx-events
^^^^^^^^^^^^^^^^^^^^^^^^^
Ensure you've installed the ``edx_event_bus_kafka`` and ``openedx_events`` requirements. Entering
a shell with ``make app-shell`` and then running ``make requirements`` should install these for you.
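
For example::

    $ make app-shell
    # make requirements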

From your host, run ``make dev.up.with-events``, which starts the local Kafka containers
(broker, zookeeper, schema registry, and control center) for you.
Visit http://localhost:9021/clusters to access the local "Confluent Control Center".
Confluent provides a commercial distribution of Kafka; the Control Center is its web UI for
inspecting brokers, topics, producers, and consumers.
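
For example, to bring everything up and sanity-check that the broker containers are running
(the ``grep`` pattern assumes the default devstack container names)::

    $ make dev.up.with-events
    $ docker ps --format '{{.Names}}' | grep kafka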

Your ``devstack.py`` settings should already be configured to point at this event broker,
and to set up enterprise-access as an Open edX event producer and consumer.

We have an enterprise-specific "ping" event and management command defined to test
that your local event bus is configured correctly. Open a shell with ``make app-shell`` and run::

    ./manage.py consume_enterprise_ping_events

This will consume ping events from the ``dev-enterprise-core`` topic.
You may see a ``Broker: Unknown topic`` error the first time you run it; it will resolve
once you produce a test event below (producing an event creates the topic if it does not
exist). **Leave the consumer running.** You should then see ``enterprise-access-service``
listed as a registered consumer in your local Confluent Control Center.

Now, go over to your **enterprise-subsidy** repository. Make sure its requirements are
installed, specifically the ``edx_event_bus_kafka`` and ``openedx_events`` packages. Open
a shell with ``make app-shell`` in that repo and *produce* a ping event::

    ./manage.py produce_enterprise_ping_event

If the event was successfully produced, you'll see a log message that says
``Message delivered to Kafka event bus: topic=dev-events-testing``.
You should also now see the ``dev-events-testing`` topic in your local Confluent Control
Center, along with the test events being published to it.
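
For orientation, here is a rough sketch of what such a producer-side command might look like.
This is a hypothetical example rather than the actual enterprise-subsidy code: it assumes the
``openedx_events.event_bus.get_producer`` API, mirrors the consumer's signal definition, and,
depending on your openedx-events version, ``send()`` may also expect an ``event_metadata``
argument::

    # Hypothetical producer-side command; the real one lives in enterprise-subsidy.
    from uuid import uuid4

    import attr
    from django.core.management.base import BaseCommand
    from openedx_events.event_bus import get_producer
    from openedx_events.tooling import OpenEdxPublicSignal


    @attr.s(frozen=True)
    class PingData:
        """
        Must match the schema used by the consumer in enterprise-access.
        """
        ping_uuid = attr.ib(type=str)
        ping_message = attr.ib(type=str)


    # Same event_type and data schema as the consumer's signal.
    ENTERPRISE_PING_SIGNAL = OpenEdxPublicSignal(
        event_type="org.openedx.enterprise.core.ping.v1",
        data={"ping": PingData},
    )


    class Command(BaseCommand):
        help = "Produce a single enterprise ping event to the event bus."

        def handle(self, *args, **options):
            # get_producer() loads the implementation named by the EVENT_BUS_PRODUCER setting.
            producer = get_producer()
            producer.send(
                signal=ENTERPRISE_PING_SIGNAL,
                # With EVENT_BUS_TOPIC_PREFIX = 'dev', this base topic becomes 'dev-enterprise-core'.
                topic='enterprise-core',
                event_key_field='ping.ping_uuid',
                event_data={'ping': PingData(ping_uuid=str(uuid4()), ping_message='hello')},
            )

Note that the producer and consumer never import each other's code; they only agree on the
``event_type`` and the attrs schema, which is what lets the event bus act as a loose coupling
point between services.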

A note on creating SubsidyRequestCustomerConfiguration Objects locally
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

@@ -0,0 +1,116 @@
"""
Produce a single event for enterprise-specific testing or health checks.
Implements required ``APP.management.commands.*.Command`` structure.
"""
import json
import logging
from argparse import RawTextHelpFormatter
from pprint import pformat

import attr
from django.conf import settings
from django.core.management.base import BaseCommand
from django.dispatch import receiver
from openedx_events.event_bus import make_single_consumer
from openedx_events.tooling import OpenEdxPublicSignal

logger = logging.getLogger(__name__)


# First define the topic that our consumer will subscribe to.
ENTERPRISE_CORE_TOPIC = getattr(settings, 'EVENT_BUS_ENTERPRISE_CORE_TOPIC', 'enterprise-core')


# Define the shape/schema of the data that our consumer will process.
# It should be identical to the schema used to *produce* the event.
@attr.s(frozen=True)
class PingData:
    """
    Attributes of a ping record.
    """
    ping_uuid = attr.ib(type=str)
    ping_message = attr.ib(type=str)


ENTERPRISE_PING_DATA_SCHEMA = {
    "ping": PingData,
}

# Define a Signal with the type (unique name) of the event to process,
# and tell it about the expected schema of event data. The producer of our ping events
# should emit an identical signal (same event_type and data schema).
ENTERPRISE_PING_SIGNAL = OpenEdxPublicSignal(
    event_type="org.openedx.enterprise.core.ping.v1",
    data=ENTERPRISE_PING_DATA_SCHEMA
)


# Create a receiver function to do the "processing" of the signal data.
@receiver(ENTERPRISE_PING_SIGNAL)
def handle_enterprise_ping_signal(sender, **kwargs):
    logger.info('RECEIVED PING DATA: %s', pformat(kwargs['ping']))


class Command(BaseCommand):
    """
    Mgmt command to consume enterprise ping events.
    """

    help = """
    Consume messages from the enterprise core topic and emit their data with
    a corresponding signal.
    Examples:
        ./manage.py consume_enterprise_ping_events -g enterprise-access-service
        # send extra args, for example pass check_backlog flag to redis consumer
        ./manage.py consume_enterprise_ping_events -g user-activity-service -g enterprise-access-service \\
            --extra '{"check_backlog": true}'
        # send extra args, for example replay events from specific redis msg id.
        ./manage.py consume_enterprise_ping_events -g enterprise-access-service \\
            --extra '{"last_read_msg_id": "1679676448892-0"}'
    """

    def add_arguments(self, parser):
        """
        Add arguments for parsing topic, group, and extra args.
        """
        parser.add_argument(
            '-g', '--group-id',
            nargs='?',
            required=False,
            type=str,
            default='enterprise-access-service',
            help='Consumer group id'
        )
        parser.add_argument(
            '--extra',
            nargs='?',
            type=str,
            required=False,
            help='JSON object to pass additional arguments to the consumer.'
        )

    def create_parser(self, *args, **kwargs):
        parser = super(Command, self).create_parser(*args, **kwargs)
        parser.formatter_class = RawTextHelpFormatter
        return parser

    def handle(self, *args, **options):
        """
        Create consumer based on django settings and consume events.
        """
        try:
            # load additional arguments specific for the underlying implementation of event_bus.
            extra = json.loads(options.get('extra') or '{}')
            event_consumer = make_single_consumer(
                topic=ENTERPRISE_CORE_TOPIC,
                group_id=options['group_id'],
                **extra,
            )
            event_consumer.consume_indefinitely()
        except Exception:  # pylint: disable=broad-except
            logger.exception("Error consuming events")
19 changes: 14 additions & 5 deletions enterprise_access/settings/devstack.py
@@ -61,7 +61,11 @@

# Install django-extensions for improved dev experiences
# https://github.com/django-extensions/django-extensions#using-it
INSTALLED_APPS += ('django_extensions',)
INSTALLED_APPS += (
    'django_extensions',
    'edx_event_bus_kafka',
    'openedx_events',
)

# BEGIN CELERY
CELERY_WORKER_HIJACK_ROOT_LOGGER = True
@@ -109,12 +113,17 @@


################### Kafka Related Settings ##############################
KAFKA_ENABLED = False

KAFKA_BOOTSTRAP_SERVER = 'edx.devstack.kafka:29092'
SCHEMA_REGISTRY_URL = 'http://edx.devstack.schema-registry:8081'
KAFKA_REPLICATION_FACTOR_PER_TOPIC = 1
# "Standard" Kafka settings as defined in https://github.com/openedx/event-bus-kafka/tree/main
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL = 'http://edx.devstack.schema-registry:8081'
EVENT_BUS_KAFKA_BOOTSTRAP_SERVERS = 'edx.devstack.kafka:29092'
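# Use the Kafka implementations of the openedx-events producer and consumer APIs.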
EVENT_BUS_PRODUCER = 'edx_event_bus_kafka.create_producer'
EVENT_BUS_CONSUMER = 'edx_event_bus_kafka.KafkaEventConsumer'
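# Base topic names get prefixed with this value, e.g. 'enterprise-core' becomes 'dev-enterprise-core'.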
EVENT_BUS_TOPIC_PREFIX = 'dev'

# Potentially deprecated kafka settings
KAFKA_ENABLED = False
KAFKA_REPLICATION_FACTOR_PER_TOPIC = 1
COUPON_CODE_REQUEST_TOPIC_NAME = "coupon-code-request-dev"
LICENSE_REQUEST_TOPIC_NAME = "license-request-dev"
ACCESS_POLICY_TOPIC_NAME = "access-policy-dev"
1 change: 1 addition & 0 deletions requirements/base.in
@@ -26,6 +26,7 @@ edx-django-utils
edx-django-release-util
edx-drf-extensions
edx-enterprise-subsidy-client
edx-event-bus-kafka
edx-rbac
edx-rest-api-client
jsonfield2