Job Description
Role: ETL Architect with AWS
Location: Chicago, IL (Hybrid)
Experience: 12+ years
Duration: Long term
Must-have skills: AWS Lambda, AWS Step Functions, AWS Glue, Amazon Redshift, Python, Terraform
Key Responsibilities:
· Design, implement, and optimize cloud-based solutions on AWS, leveraging services such as S3 for scalable data storage.
· Develop serverless applications using AWS Lambda for efficient, cost-effective execution of functions (an illustrative sketch follows this list).
· Implement and manage data warehouses on AWS Redshift for high-performance analytics.
· Implement and manage visualizations and reports using Tableau.
· Utilize Matillion for ETL processes, ensuring efficient data extraction, transformation, and loading.
· Collaborate with cross-functional teams to understand data requirements and design effective solutions.
· Write code in Python for data processing, automation, and application development.
· Ensure the security, availability, and performance of AWS-based systems.
· Troubleshoot issues related to AWS cloud services and data engineering processes.
· Document technical solutions, best practices, and guidelines.
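As an illustration of the kind of serverless data-processing work this role involves, the sketch below shows a minimal S3-triggered AWS Lambda handler written in Python. The bucket name, object keys, and filtering rule are hypothetical placeholders chosen for the example, not details of any actual system.

    # Minimal sketch of an S3-triggered Lambda handler (hypothetical example).
    # Assumes the function is subscribed to s3:ObjectCreated events and has
    # read access to the source bucket and write access to the destination bucket.
    import csv
    import io

    import boto3

    s3 = boto3.client("s3")

    DEST_BUCKET = "curated-data-bucket"  # hypothetical destination bucket


    def handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Read the raw CSV object that triggered the event.
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            rows = list(csv.DictReader(io.StringIO(body)))

            # Example transformation: drop rows missing a customer_id value.
            cleaned = [row for row in rows if row.get("customer_id")]

            # Write the cleaned data to a curated prefix for downstream loading,
            # e.g. a COPY into Redshift handled by a separate process.
            out = io.StringIO()
            if cleaned:
                writer = csv.DictWriter(out, fieldnames=cleaned[0].keys())
                writer.writeheader()
                writer.writerows(cleaned)
            s3.put_object(Bucket=DEST_BUCKET, Key="curated/" + key, Body=out.getvalue())

        return {"processed_records": len(event["Records"])}

In practice, orchestration of such functions with Step Functions and provisioning with Terraform would follow the team's existing patterns.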
Required Skills and Qualifications:
· Bachelor’s degree in Computer Science, Information Technology, or a related field.
· Minimum of 5 years of experience in AWS cloud engineering and data operations.
· Proficiency in AWS services, with a focus on S3, Lambda, and Redshift.
· Proficiency in using and managing Tableau.
· Experience with Matillion for ETL processes and data integration.
· Strong programming skills in Python and C#.
· Knowledge of additional AWS services such as Glue and Athena.
· Understanding of data warehousing concepts and best practices.
· Ability to work collaboratively in a team environment.
· Excellent problem-solving and troubleshooting skills.
Preferred Skills:
· AWS certifications related to cloud engineering and data analytics.
· Experience with continuous integration/continuous deployment (CI/CD) pipelines.
· Previous work on large-scale data migration projects.