AWS ETL Developer

TechVilla Solutions

Washington, DC

Posted On: Jun 14, 2024

Job Overview

  • Experience: 10 - 15 Years
  • Salary: Depends on Experience

Required Skills

  • AWS
  • ETL
  • Python
  • SQL
  • Salesforce
  • Hadoop
  • XML
  • Kafka
Job Description

We're seeking a skilled individual with expertise in Amazon Web Services (AWS) and Extract, Transform, Load (ETL) processes. Your role will involve crafting and maintaining ETL pipelines, implementing data integration solutions on AWS cloud services, and fine-tuning data processing workflows. If you're passionate about shaping the future of Salesforce data implementations, this opportunity is for you.

Key Responsibilities
  • Work with our Enterprise customers to migrate their data into the Cloud.
  • Set up ETL processes to move data into the Cloud and refresh it daily.
  • Help the team optimize queries and evaluate different architectures.
  • Work with internal teams and help them make the most of our data lake.
  • Identify processes and tasks that can be automated with internal tools.
  • Participate in data domain technical and business discussions on future architecture direction.
  • Research and evaluate emerging data technologies and industry and market trends to assist in project development and/or operational support activities.
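A daily refresh of the kind described above follows the classic extract-transform-load shape. As a rough illustration only (not part of the role description), here is a minimal stdlib-only sketch in which the function names and fields are hypothetical and the actual AWS reads/writes (e.g. S3 or Redshift) are stood in by plain Python objects:

```python
import csv
import io


def extract(csv_text):
    """Extract: parse raw CSV rows (stand-in for reading from a source system)."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def transform(rows):
    """Transform: normalize types and clean up fields before loading."""
    return [
        {"id": int(row["id"]), "email": row["email"].strip().lower()}
        for row in rows
    ]


def load(rows, destination):
    """Load: append cleaned rows to the destination (stand-in for an S3/Redshift write)."""
    destination.extend(rows)
    return len(rows)


# One "daily" run over a toy source extract.
raw = "id,email\n1, Alice@Example.COM \n2,bob@example.com\n"
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In a production pipeline each stage would be replaced by the appropriate service calls and the run would be scheduled (e.g. by a daily trigger); the sketch only shows the stage boundaries.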



Required Qualifications

  • Minimum 10 years of hands-on experience in Salesforce cross-cloud testing with a proven track record.
  • Mastery of Amazon Web Services (AWS) including S3, Glue, Redshift, and Athena.
  • Proficiency in Extract, Transform, Load (ETL) processes and associated tools.
  • Strong programming skills, especially in languages such as Python or Java.
  • Advanced SQL skills for efficient data querying and manipulation.
  • Experience in data modeling, database concepts, and data warehousing principles.
  • Ability to design, develop, and maintain ETL pipelines and data integration solutions.
  • Familiarity with Salesforce Data Cloud and its implementation.
  • Knowledge of big data frameworks such as Hadoop, Spark, or Kafka for processing and analyzing large datasets.
  • Expertise in JEE, AWS, SOA, web development, web services, XML, JSON, DHTML, and Oracle 11g (or higher)/SQL Server.

Job ID: TS240225

Posted By

Jiya Sharma

Sr. HR