Ready to start a new freelance challenge? Get in touch.
What You'll Do
- Design, build and optimize scalable data pipelines in Databricks (AWS)
- Develop and maintain complex data transformations using Dataiku
- Implement robust ETL/ELT processes within a Lakehouse architecture
- Manage ingestion from multiple source systems (legacy & cloud-based)
- Ensure data quality, integrity, security and performance monitoring
- Support migration from SAS, SageMaker, BW/HANA and other environments
- Document technical designs and best practices
- Transfer knowledge to the operational data engineering team
What You Bring
- 3+ years of experience in SQL and Data Warehousing / Lakehouse environments
- 2+ years hands-on experience with Databricks
- Proven experience with Dataiku in development and automation contexts
- Strong knowledge of AWS services (S3, IAM, compute, etc.)
- Solid Python skills for data transformation
- Experience in platform build-up or migration projects
- Analytical mindset and strong documentation habits
Interested? Apply today!
myNEBIRU: Not into this role, but interested in what NEBIRU does? That's totally fine.
Visit myNEBIRU to see how we can support you - even outside our client missions. Let's build the bridge to your next step, together.
Apply