Replies: 1 comment
- Struggling to get this working with GitHub as opposed to CodeCommit, since we can no longer create new repos.
Serverless Data Lake Framework 2.0.0 is now available.
The workshop has been updated.
For users of SDLF 1.x, version 1 is still available on the master branch. Development of newer versions of SDLF (2.x) happens on branch main. The workshop still contains sections for version 1 as well.
GPG signature files are included for the archives in this release's assets. You can use our public key to verify that the downloaded archive is original and unmodified.
We welcome your feedback!
What’s New
SDLF 2.0 is very much in the same spirit as SDLF 1.0: the constructs are the same, and CloudFormation is still used as the language for provisioning the infrastructure. 2.0 intends to fix long-standing issues with SDLF, extend its usage to more data architecture patterns, and bring commonly requested features to the framework.
- deploy.sh takes care of deploying the CICD infrastructure used to build these modules and registering them in the private CloudFormation registry of each account. Modules are updated whenever a deployment requires them.
- pDomain (which defaults to datalake) can be provided when deploying foundations.
- […] environments (dev, test, prod).
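As a rough illustration of what it means for modules to live in the private CloudFormation registry, a template could consume one of them as shown below. This is a sketch only: the type name Acme::SDLF::Foundations::MODULE and the resource name are invented, and only pDomain is taken from these notes.

```yaml
# Minimal sketch, not the actual SDLF templates: consuming a module that deploy.sh
# has registered in the account's private CloudFormation registry.
# "Acme::SDLF::Foundations::MODULE" is a made-up type name for illustration.
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  Foundations:
    Type: Acme::SDLF::Foundations::MODULE
    Properties:
      pDomain: datalake   # optional; defaults to "datalake"
```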
- Data domains are defined in a repository named sdlf-main, one datadomain-{domain}-{env}.yaml file per domain and environment.
- main, test and dev branches are expected, with per-environment parameters in parameters-{env}.json.
- Each team has its own main repository, sdlf-main-{domain}-{team}.
- Pipelines are defined in pipelines.yaml and datasets in datasets.yaml.
- sdlf-main-{domain}-{team} works the same way everything works in SDLF: main, test and dev branches are expected, with per-environment parameters in parameters-{env}.json.
- […] sdlf-datalakeLibrary. They are no longer needed and have been removed.
- […] pPipelineDetails parameter when defining a dataset in sdlf-dataset. This parameter goes even further and can be used to store more information that stages can use. These details are stored in the Datasets DynamoDB table (as was already the case in SDLF 1.x).
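A minimal sketch of what passing extra details through pPipelineDetails could look like; the module type name, the dataset name, and every key inside the JSON value are assumptions for illustration, not the actual sdlf-dataset schema.

```yaml
# Hypothetical sketch only: a dataset definition storing per-stage settings in
# pPipelineDetails. Stages can later read these details from the Datasets DynamoDB table.
Resources:
  LegislatorsDataset:
    Type: Acme::SDLF::Dataset::MODULE    # made-up type name
    Properties:
      pDatasetName: legislators          # made-up parameter and dataset names
      pPipelineDetails: >-
        {
          "main": {
            "stage-b": { "transform": "heavy_transform", "glue_capacity": 2 }
          }
        }
```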
- Stages can filter the events they consume (pEventPattern in the example), and then process these events on a schedule (pSchedule).
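A sketch of how these two parameters could be wired when instantiating a stage; the type and resource names are made up, and the values simply show the usual EventBridge formats: an event pattern for pEventPattern and a rate() or cron() expression for pSchedule.

```yaml
# Hypothetical sketch: a stage that only reacts to events matching pEventPattern
# and processes the accumulated events on the pSchedule cadence.
Resources:
  StageB:
    Type: Acme::SDLF::Stage::MODULE   # made-up type name
    Properties:
      pEventPattern: >-
        {
          "source": ["aws.s3"],
          "detail-type": ["Object Created"]
        }
      pSchedule: "rate(15 minutes)"   # or e.g. "cron(0 2 * * ? *)"
```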
- sdlf-pipLibrary is now part of an optional feature called Lambda Layer Deployer. Files related to this feature are part of a team main repository (sdlf-main-{domain}-{team}) under the layers folder.
- […] sdlf-utils. Files related to this feature are part of a team main repository (sdlf-main-{domain}-{team}) under the transforms folder.
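Putting the files and folders named above together, a team main repository could be laid out roughly like this (a sketch assembled from the names in these notes; the exact layout may differ):

```
sdlf-main-{domain}-{team}/       # branches: main, test, dev
├── pipelines.yaml               # pipelines definitions
├── datasets.yaml                # datasets definitions
├── parameters-dev.json          # per-environment parameters
├── parameters-test.json
├── parameters-prod.json
├── layers/                      # optional Lambda Layer Deployer files
└── transforms/                  # optional transforms files
```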
- […] sdlf-monitoring, with CloudTrail, ELK forwarding and SNS.
- […] sdlf-monitoring is not deployed.
- sdlf-stage-dataquality will soon be available as an example of how to add a third stage making use of Glue Data Quality.
- Apart from deploy.sh, there are no more shell scripts.

Full Changelog: 1.5.2...2.0.0
New Contributors
Thanks
We thank all the contributors/users for their work on this release!
This discussion was created from the release Serverless Data Lake Framework 2.0.0.