Build end-to-end data pipelines, covering data collection, transformation, quality, and integration, to support data & analytics solutions
Collaborate with solution design and business requirements teams to identify data requirements and assemble large, complex data sets that meet the requirements
Design, implement and fine-tune analytics solutions that meet business and technical requirements
Work collaboratively with the data architect to ensure data model integrity
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
Create data tools for the analytics and data science teams that assist them in building and optimising data models and solutions
Work with the DevOps engineer to support the consistent operation of data & analytics solutions
Work closely with the data analysts & data scientists to design and develop APIs
Qualifications
Bachelor's or Master's degree in a related field (e.g. computer science, information technology)
At least 3 years' experience in SQL/PostgreSQL and data and BI solutions, including integration with third-party tools
Experience working with application server software (e.g. ERP), Spark, Scala, Python, SQL scripting languages, relational databases (e.g. SQL DB/DW), NoSQL platforms (e.g. HBase, MongoDB, Cassandra), cloud technologies (e.g. Azure or AWS)
Highly experienced with processing large and complex datasets and building end-to-end data pipelines using on-premise or cloud-based data platforms
Experience coding in data management, data warehousing, or unstructured data environments
Experience in the energy sector or other asset-intensive industries will be highly regarded
Experience in AWS and Azure cloud platforms and technology
Ability to define and develop data integration patterns and pipelines
Strong knowledge of data modelling, data warehousing, and BI concepts
Self-motivated, able to work independently, with strong attention to detail