Technical / Functional Skills
- Studying business logic and coordinating with clients to gather requirements
- Experience with Hive, NiFi, Airflow, Snowflake, and Kafka
- Experience in cloud environments, particularly AWS (EC2, S3, Lambda)
- Experience in unit and integration testing of Spark scripts
- Scripting with shells (Bash, csh) and Python
- Designing and implementing large-scale ingest systems in a Big Data environment
- Optimizing all stages of the data life cycle, from initial planning, to ingest, through final display and beyond
- Designing and implementing data extraction, cleansing, transformation, loading, and replication/distribution capabilities
- Developing custom solutions/code to ingest and exploit new and existing data sources
- Working with Sponsor development teams to improve application performance
- Providing support to maintain, optimize, troubleshoot, and configure AWS/Spark/Hadoop environments as needed
- Collaborating with teammates, other service providers, vendors, and users to develop new and more efficient methods
- Experience with CI/CD pipelines, unit tests, integration, and regression testing
- Significant experience with Airflow
- Strong ability to manage competing priorities and communicate with multiple stakeholders
- Bachelor's Degree in Computer Science, Computer Engineering, or a related discipline with experience in software design and development.