This is the implementation of the Cloud Humans take-home assignment.
View the challenge description on the official repo, or view our copy here.
The easiest way to run the application is using Docker or Podman as the container solution.
-
First, you will need to change the API keys in the `appsettings.json` file:

```json
"OpenaiKey": "key-overridden-by-user-secrets",
"AzureAiSearchKey": "key-overridden-by-user-secrets"
```
-
Now, with Docker use the following commands:
```shell
# Start the application
docker compose up -d

# Stop the execution started with the previous command
docker compose down
```
💡 Tip: instead of Docker, you could also use the dotnet CLI or an IDE like Visual Studio or JetBrains Rider to run the application.
-
Once the application is running, you can access it via `http://localhost:8080/`. To see the OpenAPI documentation, access `http://localhost:8080/swagger`.
-
Here is the cURL to call the API:
```shell
curl --location 'http://localhost:8080/conversations/completions' \
  --header 'Content-Type: application/json' \
  --data '{
    "helpdeskId": 123456,
    "projectName": "tesla_motors",
    "messages": [
      {
        "role": "USER",
        "content": "Hello! How long does a Tesla battery last before it needs to be replaced?"
      }
    ]
  }'
```
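If you prefer to script the call, the same request can be built programmatically. Below is a minimal Python sketch; the payload and endpoint are taken from the cURL example above, and the actual HTTP call is left commented out since it requires the application to be running:

```python
import json

# Request body matching the schema from the cURL example above
payload = {
    "helpdeskId": 123456,
    "projectName": "tesla_motors",
    "messages": [
        {
            "role": "USER",
            "content": "Hello! How long does a Tesla battery last before it needs to be replaced?",
        }
    ],
}

body = json.dumps(payload)

# With the app running (docker compose up -d), send it, e.g. with urllib:
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8080/conversations/completions",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```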
First of all, we chose to develop the Smart Handover feature. To see the behaviour, just send the following payload, for example:
```json
{
    "helpdeskId": 123456,
    "projectName": "tesla_motors",
    "messages": [
        {
            "role": "USER",
            "content": "Hello! My dog ran from a Tesla. Can you help me?"
        }
    ]
}
```
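To make the idea concrete, here is a deliberately simplified Python sketch of a handover decision. The keyword list and function are hypothetical illustrations only, not the project's actual implementation (which relies on the AI pipeline to decide when a human should take over):

```python
# Illustrative only: a keyword-based stand-in for the handover decision.
# The trigger terms below are hypothetical examples of out-of-scope topics.
OUT_OF_SCOPE_TERMS = ("dog", "cat", "accident", "lawyer")

def needs_handover(message: str) -> bool:
    """Return True when the user message looks out of scope for the bot."""
    lowered = message.lower()
    return any(term in lowered for term in OUT_OF_SCOPE_TERMS)

print(needs_handover("Hello! My dog ran from a Tesla. Can you help me?"))  # True
print(needs_handover("How long does a Tesla battery last?"))               # False
```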
Besides that, given the high-level image above, provided with the challenge, we can assume that ClaudIA is a complex ecosystem with different "modules" collaborating to achieve the business goals.
Based on that we made some technical decisions:
- We adopted the **Vertical Slice Architecture** and separated the WebApi deployment from the Core/Domain project, preparing the structure to receive the next related modules when they are implemented.
- Adopted **Command Query Separation (CQS)** to clarify each feature's behaviour and avoid unintentional side effects, such as mutation from a feature that should only "query" data.
- We also adopted some functional behaviours to improve the application's consistency, like the **Result Pattern**, found as the `Result` wrapper used in most of the operation returns and commands.
- Used the **Simple Factory** concept (e.g., the `Conversation.Create()` method) to build aggregates and commands.
- For the API itself, we decided to use the REPR pattern via FastEndpoints, specifying only one endpoint per file and avoiding coupling.
- Last but not least, we designed our application with DDD (Domain-Driven Design) in mind, considering the Conversation as an aggregate root.
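As an illustration of two of the patterns above (the Result Pattern and the Simple Factory on the aggregate root), here is a minimal Python sketch. The names mirror the C# ones, but this is not the project's actual code:

```python
# Sketch only: Result wrapper + Simple Factory, mirroring Conversation.Create().
from dataclasses import dataclass, field
from typing import Generic, Optional, TypeVar

T = TypeVar("T")

@dataclass(frozen=True)
class Result(Generic[T]):
    """Carries either a value or an error, so callers branch explicitly."""
    value: Optional[T] = None
    error: Optional[str] = None

    @property
    def is_success(self) -> bool:
        return self.error is None

@dataclass
class Conversation:
    helpdesk_id: int
    project_name: str
    messages: list = field(default_factory=list)

    @staticmethod
    def create(helpdesk_id: int, project_name: str, messages: list) -> "Result[Conversation]":
        # Simple Factory: validation lives in one place instead of at every call site
        if not messages:
            return Result(error="A conversation needs at least one message.")
        return Result(value=Conversation(helpdesk_id, project_name, messages))

ok = Conversation.create(123456, "tesla_motors", [{"role": "USER", "content": "Hi"}])
bad = Conversation.create(123456, "tesla_motors", [])
print(ok.is_success, bad.error)
```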
Next steps:
- Based on the request sample provided in the challenge description, which has `helpdeskId` and `projectName`, we can assume that our solution is multi-tenant and should be prepared for that. We should also check the consistency of the informed data against our database. So, in our current structure, we would add a `ConversationsRepository` (using the Repository pattern) as the data access for the Conversation aggregate, keeping the conversation history.
- Add Polly to handle Retry, CircuitBreaker, and similar policies. This would only be needed while we have direct calls to external resources; the proposed architecture would protect these resources with an API Gateway, which would then handle the policies above.
- Improve the services' message responses, preventing Exception messages from leaking unintentionally.
- Add OpenTelemetry with logs and traces to enable observability.
- Add tests: unit tests to cover the domain and all behaviours; integration tests to cover API requests and error behaviours.
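To sketch the first of the next steps above, here is a hypothetical in-memory `ConversationsRepository` in Python, keyed by (`helpdeskId`, `projectName`) to reflect the multi-tenancy note. A real implementation would sit behind the same interface over a database:

```python
# Hypothetical sketch of the proposed ConversationsRepository (not actual code):
# tenant-aware conversation history keyed by (helpdeskId, projectName).
from typing import Dict, List, Tuple

class ConversationsRepository:
    def __init__(self) -> None:
        self._store: Dict[Tuple[int, str], List[dict]] = {}

    def save(self, helpdesk_id: int, project_name: str, messages: List[dict]) -> None:
        # Append new messages to the tenant's conversation history
        self._store.setdefault((helpdesk_id, project_name), []).extend(messages)

    def history(self, helpdesk_id: int, project_name: str) -> List[dict]:
        # Unknown tenants simply have an empty history
        return self._store.get((helpdesk_id, project_name), [])

repo = ConversationsRepository()
repo.save(123456, "tesla_motors", [{"role": "USER", "content": "Hello!"}])
print(len(repo.history(123456, "tesla_motors")))  # 1
print(repo.history(999, "other_project"))         # []
```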
📌 These are the main decisions, feel free to agree or disagree with them, I would love to chat about it. 😊
- OpenAI API Documentation
- Fun fact: we used Gitmoji as a prefix on the commit messages, since it makes the commits easier to understand.