Data Engineer – SQL/Python/Cloud

Salary: Negotiable
Job type: Permanent
Job location: London
Date posted: 29 April 2019

Data Engineer SQL/Python/Cloud

Location: Central London

Industry: Financial Services

I currently have an exciting opportunity for a Data Engineer with experience working with SQL, Python and Cloud technologies to join a major financial organisation based in Central London.

They are presently going through major growth and have several high-profile projects in the pipeline; as a result, they need an experienced Data Engineer to help facilitate the growth of their business.

The position will focus on SQL Data Warehousing in Azure.

The position offers an above-market-rate salary, exciting prospects and great additional benefits.

Details and responsibilities of the role are as follows:

  • Building ETL pipelines from various data sources to HDFS/HBase
  • Optimising the performance of business-critical queries and dealing with ETL job-related issues
  • Identifying data quality issues across SQL Server, Cloud and SAS, and addressing them immediately to provide a great user experience
  • Gathering and understanding data requirements, working in the team to achieve high-quality data ingestion, and building systems that can process, transform and store the data in various relational and non-relational stores
  • Improving the data ingestion models, ETLs and alerting to maintain data integrity and data availability
  • Extracting and combining data from various heterogeneous data sources
  • Designing, implementing and supporting a platform that can provide ad-hoc access to large data sets
  • Modelling data and metadata to support ad-hoc and pre-built reporting
  • Working with customers to fulfil their data requirements using DW tables, and maintaining metadata for all DW tables

Hands-on experience with the following is required:

– SQL

– Python

– PySpark

– Airflow

– Cloud, Azure/AWS (Azure preferred)

Additional requirements:

  • Previous experience in designing efficient data pipelines using Python
  • Experience with various ingestion patterns for large data sets
  • Experience with data flow monitoring and error/failure handling
  • Knowledge of data masking
  • Knowledge of Agile/Scrum methodology
  • Experience working with structured, semi-structured and unstructured data sets, including social, web logs and real-time data feeds

Useful Information:

  • 37.5 hour working week
  • Bonus
  • Pension
  • Healthcare
  • Personal Development Programmes

Reference: BBBH5099060_1556543952
Consultant: Prem Kumar
prem@thisisrtm.com
