The Data Engineer will analyze, design, develop, performance-tune, and test their own code as well as other code, and will support initiatives to build an enterprise integration framework while providing input to improve standards for data mapping, integration, and data traceability across NAPA business functions.
Responsibilities
- Performs the design, development, and implementation of integration processes for the enterprise Data Lake, Data Warehouse, and applications.
- Analyzes requirements and existing resources to create efficient database and integration designs that meet company IT standards.
- Works with project and business analyst leads to develop and clarify in-depth technical requirements.
- Participates in all phases of the integration development lifecycle, including unit testing, quality assurance (QA), and ongoing support.
- Helps with production support as needed.
Required Experience
- 5+ years of experience in large-scale RDBMS environments or Google BigQuery.
- 5+ years of experience with Informatica PowerCenter or IICS.
- Experience in code automation (e.g., pattern-based integration).
- Experience in advanced SQL and PL/SQL techniques.
- Experience in Unix shell and Python scripting.
- Integration design and data modeling skills in Data Lake and Data Warehouse environments.
- Experience with streaming technologies (e.g., STRIIM, Kafka) is a plus.
- Ability to build and analyze complex integration workflows from heterogeneous data sources.
- Strong background in full lifecycle development using multiple platforms or languages.
- Development experience in high-transaction, high-availability systems.