Job Description
Title: Data Architect (ETL, DataStage, Matillion, SQL, RDBMS)
Location: Remote
Keywords:
DataStage, Databases, PL/SQL, Databricks or Snowflake, ETL.
The ideal candidate will use DataStage, ADF, Matillion, QLIK, and SQL to build data ingestion and ETL pipelines in a hybrid data and analytics environment, moving data from Oracle, HANA, DB2, and other sources into a Snowflake data warehouse. He/she will follow the client's guidelines and processes and work closely with their IT and business partners, and in some instances will be responsible for documenting as-is and to-be processes in support of continuous improvement and excellence.
8+ years of hands-on experience using DataStage, ADF, Matillion, and QLIK (DataStage 11.7.x experience is preferred).
5+ years of hands-on experience using Snowflake and Databricks and building dimensional data models.
Solid practical experience collecting data from API endpoints such as JIRA and ServiceNow and feeding data to downstream systems via Kafka, APIs, and other means.
Strong ETL experience handling large volumes of data in complex, heterogeneous data warehouses and processing high-volume jobs.
Prior experience building data ingestion pipelines and data replication into cloud environments hosted on the Azure platform.
Hands-on experience creating ETL jobs and performing performance tuning/optimization.
Strong knowledge of databases such as Oracle, IBM DB2, and SAP HANA, and of ERP systems (SAP and JDE).
5+ years of experience in SQL and UNIX shell scripting is an added advantage.
Knowledge of Tivoli Workload Scheduler (TWS)/Maestro and BMC Control-M is an added advantage.
Experience in the supply chain management and distribution domains is a huge plus.
Roles & Responsibilities
The ideal candidate will use DataStage, ADF, Matillion, QLIK, and SQL to build data ingestion and ETL pipelines in a hybrid data and analytics environment, moving data from Oracle, HANA, DB2, and other sources into a Snowflake data warehouse. He/she will follow the client's guidelines and processes and work closely with their IT and business partners, and in some instances will be responsible for documenting as-is and to-be processes in support of continuous improvement and excellence.
Years of Experience
6 to 12 Years