Data Engineer for NATO
Would you like to join the leading international intergovernmental organization?
The Centre for Maritime Research and Experimentation (CMRE) is an established, world-class scientific research and experimentation facility that organizes and conducts scientific research and technology development centred on the maritime domain. It delivers innovative and field-tested science and technology (S&T) solutions to address defence and security needs of the Alliance.
Responsibilities:
Design, develop, test, and maintain architectures such as databases and data platforms for large-scale data processing, data pipelines, ETL pipelines, workflow systems, etc.
Design, develop, and test data models and data services to manage data efficiently, so that data can be easily accessed and processed by data scientists, developers, and other services, exploiting the data-as-a-product paradigm.
Essential Qualifications & Experience:
A bachelor's degree from a nationally recognised/certified university in information systems, physics, electronics, or another relevant scientific or engineering discipline, plus 2 years of related professional experience.
A minimum of 3 years of professional experience in Python or other relevant languages for building and maintaining data pipelines and processing systems.
Knowledge of data-manipulation libraries such as NumPy, SciPy, pandas, xarray, Matplotlib, etc.
Professional experience with database systems such as SQL Server, Cassandra, Azure Cosmos DB, Redis, PostgreSQL, etc., including schema design and query optimization, and with big data technologies such as Databricks, Azure Machine Learning, Spark, or Kafka for processing large volumes of data efficiently.
Professional experience in data modelling with one or more of the major paradigms: object-oriented, entity-relationship, graph-based, etc.
Knowledge of data pipeline tools for automating AI/ML and ETL jobs, such as Apache Airflow or similar
Professional experience with Agile/Scrum methodologies, Git workflows, code review processes, and collaboration tools (Azure DevOps, JIRA, GitHub, GitLab, etc.)
Understanding of cloud services and infrastructure provided by one or more major Cloud vendors, including data storage and processing capabilities
Understanding of data security and data protection aspects
Professional experience implementing Findable, Accessible, Interoperable, and Reusable (FAIR) principles in data management practices, and an understanding of Data Mesh architecture
Good level of spoken and written English
If you've read the description and feel this role is a great match, we'd love to hear from you! Click "Apply for this job" to be directed to a brief questionnaire. It should only take a few moments to complete, and we'll be in touch promptly if your experience aligns with our needs.
- Department: Data
- Location: La Spezia
- Remote status: Fully Remote