AWS Data Engineer
- Lisbon, Portugal
- Full-Time
- Hybrid
Job Description:
Our Client
Our client is a fintech software company, part of a leading European financial group, delivering advanced software solutions and data-driven services. They specialise in banking-tech platforms and data engineering, supporting the group's digital-transformation ambitions.
Responsibilities:
Data Pipeline Development
- Design and implement cloud-based data pipelines to transform raw data from multiple sources (e.g. APIs, FTP, SFTP) into structured and consumable formats.
- Architect and deploy high-performance data pipelines (batch and streaming), including monitoring and alerting mechanisms for data analysts and business users.
Data Management
- Build and manage large-scale, complex datasets that meet functional and non-functional business requirements.
- Support data governance initiatives, including metadata cataloguing and documentation.
- Implement functional and non-functional data tests to ensure quality and reliability.
Data Modelling and Analysis
- Design data models and create appropriate database objects such as tables, views, procedures, and scripts.
- Analyze and optimize queries to ensure performance and scalability.
Collaboration & Production Support
- Contribute to monitoring, troubleshooting, and user support activities.
- Collaborate with cross-functional teams to gather requirements and deliver effective solutions.
Continuous Improvement
- Develop code in line with established architectural standards, continuously improving scalability and performance.
- Stay current with industry trends and emerging technologies to evolve data engineering practices.
Requirements:
Mandatory Skills
- University degree in Engineering or a related field.
- Proven experience in data pipeline development on AWS, including S3, ECS, Lambda, RDS, SNS/SQS, IAM, and CloudWatch.
- Strong experience with analytical databases (SQL) and unstructured or graph data stores.
- Ability to write complex SQL queries; experience with dbt Core is a plus.
- Proficiency in Python and data-focused libraries such as Pandas and Boto3.
- Solid understanding of data platform architecture and data governance principles.
- Fluent in English; Portuguese and/or French is a plus.
Additional Assets
- Strong experience developing solutions on Snowflake.
- Hands-on experience with CI/CD practices.
- Experience using orchestration tools (e.g. Step Functions, Airflow, Prefect).
- Familiarity with version control systems such as GitHub or GitLab.
- Exposure to data visualization tools (e.g. Tableau) is a plus.
- Knowledge of Infrastructure as Code (e.g. CloudFormation, Terraform).
- AWS certification (e.g. Solutions Architect or equivalent) is preferred.
- Snowflake certification (e.g. SnowPro Core) is preferred.