Log Analysis API is an OCaml-based REST API designed to process, analyze, and transform log data. It demonstrates the power of OCaml's functional programming paradigm, type safety, and performance for real-world data processing tasks.
- Filtering: Filter logs by log level (e.g., INFO, WARN, ERROR, DEBUG) or by substrings in the log message.
- Aggregation: Compute aggregates such as count, average, sum, max, and min for numeric log fields.
- Transformation: Convert logs into JSON or CSV formats.
- Schema Endpoint: Expose a JSON schema that describes the expected format for log entries.
- Health Checks: Basic endpoints to verify service liveness and readiness.
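To give a feel for the functional style behind the filtering feature, here is a minimal, self-contained sketch using only the OCaml standard library. The record fields and function names are illustrative assumptions, not the API's actual internal types.

```ocaml
(* Hypothetical in-memory log representation; the real service
   parses these from JSON files in data/. *)
type log_entry = {
  timestamp : string;
  level : string;
  message : string;
}

(* Level filtering is a plain List.filter over the in-memory logs. *)
let filter_by_level lvl logs =
  List.filter (fun e -> e.level = lvl) logs

let logs = [
  { timestamp = "2024-01-01T00:00:00Z"; level = "ERROR"; message = "disk full" };
  { timestamp = "2024-01-01T00:00:01Z"; level = "INFO";  message = "started" };
]

let () =
  let errors = filter_by_level "ERROR" logs in
  Printf.printf "%d\n" (List.length errors)  (* prints "1" *)
```

Filtering by substring would follow the same shape, with the predicate checking the `message` field instead of `level`.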
- OCaml (version 4.08 or later recommended)
- Dune build system
- Opam package manager
- dream
- lwt
- yojson
- lwt_ppx (for PPX syntax, e.g., `let%lwt`)
You can install these using opam:
```
opam install dream lwt yojson lwt_ppx
```
```
git clone https://github.com/ahnineamine/log-analysis-api.git
cd log-analysis-api
dune build
dune exec ./your_executable_name
```
The server starts on port 8080 by default, and the endpoints are reachable at http://localhost:8080/.
Processes a command on the logs loaded from the `data/` folder. Since the logs are read from disk and kept in memory, the request payload only needs to contain the command itself.
Example Payloads:
```json
{
  "type": "Filter",
  "criteria": {
    "level": "ERROR"
  }
}
```
```json
{
  "type": "Filter",
  "criteria": {
    "message_contains": "failed"
  }
}
```
```json
{
  "type": "Aggregate",
  "criteria": {
    "operation": "CountBy",
    "field": "user_id"
  }
}
```
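A `CountBy`-style aggregation can be sketched with the standard-library `Hashtbl`. Here each log is modeled as an association list of string fields, which is only an illustration; the real service works on parsed JSON logs.

```ocaml
(* Count occurrences of each distinct value of [field] across logs.
   Logs are modeled as (field, value) association lists for brevity. *)
let count_by field logs =
  let counts = Hashtbl.create 16 in
  List.iter
    (fun log ->
      match List.assoc_opt field log with
      | Some v ->
          let n = Option.value (Hashtbl.find_opt counts v) ~default:0 in
          Hashtbl.replace counts v (n + 1)
      | None -> ())
    logs;
  counts

let () =
  let logs = [
    [ ("user_id", "alice"); ("level", "ERROR") ];
    [ ("user_id", "bob") ];
    [ ("user_id", "alice") ];
  ] in
  let counts = count_by "user_id" logs in
  Printf.printf "alice=%d bob=%d\n"
    (Hashtbl.find counts "alice") (Hashtbl.find counts "bob")
```

Logs missing the requested field are simply skipped, so the counts only reflect entries that actually carry the field.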
```json
{
  "type": "Transform",
  "format": "CSV"
}
```
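The payloads above map naturally onto an OCaml variant type, which is where the language's type safety pays off: the handler's `match` must cover every command shape. The constructor and field names below are assumptions for illustration, not the API's actual definitions.

```ocaml
(* Hypothetical command types the JSON payloads could decode into. *)
type criteria =
  | Level of string            (* {"level": "ERROR"} *)
  | MessageContains of string  (* {"message_contains": "failed"} *)

type command =
  | Filter of criteria
  | Aggregate of { operation : string; field : string }
  | Transform of { format : string }

(* Exhaustive dispatch: the compiler warns if a command is unhandled. *)
let describe = function
  | Filter (Level l) -> "filter level=" ^ l
  | Filter (MessageContains s) -> "filter message~" ^ s
  | Aggregate { operation; field } -> operation ^ " by " ^ field
  | Transform { format } -> "transform to " ^ format

let () =
  print_endline
    (describe (Aggregate { operation = "CountBy"; field = "user_id" }))
  (* prints "CountBy by user_id" *)
```

Adding a new command variant then forces every handler to be updated before the code compiles, which is the main safety argument for this design.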
Returns a JSON schema describing the expected log format, including standard fields like `timestamp`, `level`, `message`, and `metadata`.
A basic health check endpoint that returns the status of the API.
Place your log files (pretty-printed JSON arrays) in the `data/` folder. Sample log files are provided there for testing. Each file must contain a valid JSON array of log objects.