Skills: ETL, DB2, MS SQL, Oracle
Principal responsibilities
Work as part of a team to ensure the integrity, performance and reliability of the databases
Work as part of a team migrating from an on-premises Oracle database to Google Cloud Platform
Deliver new implementations on Google Cloud Platform (especially BigQuery solutions)
Develop APIs to expose data warehouse data (on GCP) to internal and external applications via Kong Gateway (see the API sketch after this list)
Provide estimates and documentation for developments planned for the year
Work closely with business users to capture new requirements and pain points
Support the live environment, providing Level 3 support and resolving issues
Perform impact analysis on requested changes
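A minimal sketch of the API work described above, assuming a hypothetical sales_summary table in a BigQuery warehouse dataset; the endpoint, project and table names are illustrative, and Kong Gateway routing and authentication sit in front of the service and are configured separately.

```python
# Minimal sketch: a Flask endpoint exposing BigQuery warehouse data.
# Assumes Flask and google-cloud-bigquery are installed and Application
# Default Credentials are configured; all names below are illustrative.
from flask import Flask, jsonify, request
from google.cloud import bigquery

app = Flask(__name__)
bq = bigquery.Client()

@app.route("/api/v1/sales-summary")
def sales_summary():
    region = request.args.get("region", "EU")
    # Parameterized query: avoids SQL injection in the filter value.
    job = bq.query(
        "SELECT region, sale_date, total_amount "
        "FROM `my-project.warehouse.sales_summary` "  # hypothetical table
        "WHERE region = @region "
        "ORDER BY sale_date DESC LIMIT 100",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("region", "STRING", region)
            ]
        ),
    )
    return jsonify([dict(row) for row in job.result()])

if __name__ == "__main__":
    app.run(port=8080)  # Kong Gateway would proxy this upstream service
```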
Requirements
Experience building ETL pipelines, including SCD types, natively in SQL (a sketch follows this list)
Understanding of dimensional data modelling and the Kimball DW methodology: facts, dimensions, SCD types and star schema design
Proficiency in writing performant, modular SQL using CTEs, window functions, procedures, packages, materialized views and partitioning on MPP data warehouses such as Oracle, BigQuery or Snowflake (see the CTE/window-function sketch below)
Experience integrating data from diverse sources such as DB2, MS SQL and Oracle, and from file formats such as Avro, Parquet and CSV
Experience using Python to ingest data from various sources and to stand up APIs with frameworks such as Flask
Knowledge of GCP BigQuery and its features, such as authorized views, clustering and partition pruning (see the BigQuery sketch below)
Experience using Git, GitHub, Jenkins or similar CI/CD tools
Knowledge of Agile engineering practices and tooling (e.g. Jira, GitHub) is essential
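A minimal sketch of an SCD Type 2 load written natively in SQL and submitted from Python, assuming hypothetical dim_customer and stg_customer tables; the column names and the two-step expire-then-insert pattern are illustrative, not a prescribed design.

```python
# Sketch of an SCD Type 2 load in native SQL, run against BigQuery.
# Table and column names (dim_customer, stg_customer, ...) are hypothetical.
from google.cloud import bigquery

bq = bigquery.Client()

# Step 1: close out current dimension rows whose attributes changed in
# staging (NULL-safe comparison omitted for brevity).
expire = """
MERGE `my-project.warehouse.dim_customer` d
USING `my-project.staging.stg_customer` s
ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND (d.email != s.email OR d.segment != s.segment) THEN
  UPDATE SET is_current = FALSE, valid_to = CURRENT_DATE()
"""

# Step 2: insert a fresh current version for new or just-expired customers.
insert = """
INSERT INTO `my-project.warehouse.dim_customer`
  (customer_id, email, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.segment,
       CURRENT_DATE(), DATE '9999-12-31', TRUE
FROM `my-project.staging.stg_customer` s
LEFT JOIN `my-project.warehouse.dim_customer` d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL
"""

for sql in (expire, insert):
    bq.query(sql).result()  # run sequentially; each waits for completion
```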
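A short sketch of the modular SQL style mentioned above: a CTE plus a ROW_NUMBER() window function to keep only the latest record per customer from a hypothetical raw feed.

```python
# Sketch: deduplicate a staging feed with a CTE and a window function.
from google.cloud import bigquery

sql = """
WITH ranked AS (
  SELECT
    customer_id,
    email,
    segment,
    ROW_NUMBER() OVER (
      PARTITION BY customer_id
      ORDER BY load_ts DESC
    ) AS rn
  FROM `my-project.staging.raw_customer_feed`  -- hypothetical source
)
SELECT customer_id, email, segment
FROM ranked
WHERE rn = 1  -- keep only the most recent row per customer
"""

for row in bigquery.Client().query(sql).result():
    print(dict(row))
```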
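A sketch of the BigQuery features named above, assuming hypothetical warehouse and reporting datasets: a date-partitioned, clustered fact table (enabling partition pruning) and a view authorized to read it, so consumers never need access to the underlying table.

```python
# Sketch: partitioned/clustered table plus an authorized view in BigQuery.
# Project, dataset and column names are illustrative.
from google.cloud import bigquery

bq = bigquery.Client()

# Date partitioning enables partition pruning on sale_date filters;
# clustering on region further reduces scanned bytes on large fact tables.
bq.query("""
CREATE TABLE IF NOT EXISTS `my-project.warehouse.fact_sales`
(
  sale_id STRING,
  region STRING,
  sale_date DATE,
  amount NUMERIC
)
PARTITION BY sale_date
CLUSTER BY region
""").result()

# A view in a separate dataset exposing only aggregated data.
bq.query("""
CREATE VIEW IF NOT EXISTS `my-project.reporting.v_sales_by_region` AS
SELECT region, sale_date, SUM(amount) AS total_amount
FROM `my-project.warehouse.fact_sales`
GROUP BY region, sale_date
""").result()

# Authorize the view to read the warehouse dataset, so consumers of the
# view never need direct access to the underlying table.
dataset = bq.get_dataset("my-project.warehouse")
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role=None,
        entity_type="view",
        entity_id={
            "projectId": "my-project",
            "datasetId": "reporting",
            "tableId": "v_sales_by_region",
        },
    )
)
dataset.access_entries = entries
bq.update_dataset(dataset, ["access_entries"])
```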