
Data Engineer

Work Context

The Federal Judicial Police is a specialized police service primarily responsible for combating organized crime in all its forms. It is one of the three general directorates of the federal police and focuses on conducting investigations in the areas of cybercrime, terrorism, organized crime, drug trafficking, and many others. It provides support and expertise to the entire integrated police force as well as to its national and international partners.

Within this directorate, the operational resources assigned to judicial police operations, the fight against serious and organized crime, special units, and technical and forensic police operations are grouped.

Job Description

A first task consists of setting up an API between a PostgreSQL database and a Python application. Afterwards, you will be responsible, among other things, for developing and optimizing ELT pipelines, in which data from large source databases is loaded into a PostgreSQL database where the necessary transformations are carried out (using dbt).
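The ELT pattern described above can be sketched roughly as follows. This is a minimal illustration only: it uses sqlite3 as an in-memory stand-in for PostgreSQL, and the table and column names are invented for the example. In the actual role, the transformation step would be expressed as a dbt SQL model rather than inline Python.

```python
import sqlite3

def run_elt(source_rows):
    """Sketch of an ELT run: land raw data, then transform it in-database."""
    conn = sqlite3.connect(":memory:")  # stand-in for the Postgres target

    # Load: land the extracted rows unchanged in a staging table.
    conn.execute("CREATE TABLE raw_incidents (category TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_incidents VALUES (?, ?)", source_rows)

    # Transform: aggregate inside the database, the way a dbt model would.
    conn.execute("""
        CREATE TABLE incident_summary AS
        SELECT category, COUNT(*) AS n, SUM(amount) AS total
        FROM raw_incidents
        GROUP BY category
        ORDER BY category
    """)
    return conn.execute("SELECT * FROM incident_summary").fetchall()
```

The key point of ELT (as opposed to ETL) is visible here: the raw data is loaded first and the transformation runs as SQL inside the target database.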

The entire project is developed using Agile methods and is fully on-premise. Presence at the office in Brussels, which is easily accessible by public transport, is required two days per week. The candidate must be able to obtain security clearance.

Profile

DGJ is looking for a driven and detail-oriented Data Engineer with a passion for designing, developing, and maintaining robust on-premise ETL processes and data warehouses. The ideal candidate has extensive experience with SQL databases (and preferably also NoSQL databases) and a proven track record in optimizing data infrastructures to support decision-making processes.

Core Competencies:

  • On-premise development: Expertise in building and maintaining data engineering components in an on-premise infrastructure.
  • ETL Development and Management: Expertise in building and maintaining advanced ETL processes to efficiently extract, transform, and load data from various sources.
  • Data Warehousing: In-depth knowledge of data warehousing concepts and techniques, with experience in designing scalable and reliable data storage solutions. Knowledge of dbt.
  • Database Management (SQL & NoSQL): Proficiency in working with both relational (SQL) and non-relational (NoSQL) databases, including but not limited to MySQL, PostgreSQL, Neo4j, Elasticsearch, MongoDB, and Milvus.
  • API Integration: Experience in developing and integrating APIs.
  • Data Modeling: Experience in designing and implementing data models that improve performance and simplify complex queries.
  • Data Quality and Integrity: Strong focus on ensuring the accuracy, completeness, and reliability of data.
  • Programming Skills: Proficiency in programming languages such as Python for process automation and data analysis.
  • Problem-solving Ability: Excellent analytical skills to solve complex technical challenges and deliver data-driven solutions.
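The API task mentioned in the job description can be sketched, in heavily reduced form, with Python's standard library alone. This is an illustrative assumption, not the actual service: the `/cases` endpoint and the in-memory data are invented, and a real implementation would query PostgreSQL and would likely use a framework such as FastAPI or Flask.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory stand-in for the Postgres-backed data layer (hypothetical rows).
CASES = [{"id": 1, "unit": "cybercrime"}, {"id": 2, "unit": "narcotics"}]

class CaseHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/cases":
            body = json.dumps(CASES).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet; default logs every request to stderr

def serve(port=0):
    """Start the server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), CaseHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A Python client (or the ELT orchestration) could then fetch `/cases` with a plain HTTP GET and receive the rows as JSON.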

Education:

  • At minimum a bachelor's degree in Computer Science, Data Science, or Business Informatics, or equivalent.
  • Demonstrable experience in data engineering, data warehousing, and database management.

Technical Skills:

Databases: MySQL, PostgreSQL, Neo4j, Elasticsearch, MongoDB, Milvus

ETL Tools: dbt, Airflow

Data: Python, SQL

API Development: REST, SOAP, GraphQL

Cloud Platforms: Azure, AWS

BI Tools: Power BI

Languages:

English, plus fluency in one (preferably both) of the national languages (NL/FR). If not fluent in both, passive understanding of the other language is required.

Soft Skills:

  • Strong communication skills
  • Team-oriented and collaborative
  • Innovative thinking
  • Attention to detail
  • Proactive work attitude

This position was originally posted on Pro Unity.

It is publicly accessible, and we recommend applying directly through the Pro Unity website instead of going through third party recruiters.