What if you built the data backbone of an organization?
We are seeking a Senior Data Engineer based in Brussels, Belgium, to design, industrialize, and orchestrate data pipelines within a modern Lakehouse architecture.
Your Responsibilities
Design and maintain scalable data pipelines
Develop ETL processes using SQL & Python
Orchestrate data workflows with Airflow (or equivalent)
Implement a Lakehouse architecture (Bronze / Silver / Gold layers)
Model data (schemas, ERDs)
Ensure data quality, reliability, and performance
Apply Clean Code & CI/CD principles
Containerize solutions using Docker
Work with PySpark for distributed data processing
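To make the Bronze / Silver / Gold layering concrete: Bronze holds raw ingested records, Silver holds cleaned and typed records, and Gold holds business-level aggregates. The sketch below illustrates that flow in plain Python; a production pipeline would use PySpark DataFrames instead, and every field name (order_id, country, amount) is a hypothetical example, not taken from the role description.

```python
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean raw Bronze records: drop malformed rows, normalize and cast types."""
    silver = []
    for row in bronze_rows:
        try:
            silver.append({
                "order_id": int(row["order_id"]),
                "country": row["country"].strip().upper(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine malformed records
    return silver

def to_gold(silver_rows):
    """Aggregate Silver records into a business-level Gold table."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["country"]] += row["amount"]
    return dict(totals)

# Raw Bronze layer: untrusted strings straight from ingestion.
bronze = [
    {"order_id": "1", "country": " be", "amount": "10.5"},
    {"order_id": "2", "country": "BE", "amount": "4.5"},
    {"order_id": "x", "country": "NL", "amount": "oops"},  # dropped in Silver
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'BE': 15.0}
```

Each layer only reads from the layer below it, which is the property that makes the medallion pattern easy to reprocess and audit.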
Ideal Profile
5+ years of experience in Data Engineering
Proven experience with Lakehouse architecture
Strong data modeling skills
Autonomous, proactive, solution-oriented mindset
English: minimum B2 level
French or Dutch: C1 level