Spark + Java + AWS Developer
Neshent Tech
Columbus, OH
Posted On: Dec 23, 2024
Job Overview
Salary: Depends on Experience
Required Skills
- Java
- Spark
- AWS
- IaC
- Databricks
- Snowflake
- Spring Boot
Job Description
Roles and Responsibilities
- Design and implement robust and scalable solutions using Core Java, Spring Boot, Microservices, and REST APIs.
- Leverage Apache Spark to process large datasets, optimize performance, and create efficient data pipelines.
- Integrate applications with AWS services such as S3, ECS/EKS, DynamoDB, Lambda, and Step Functions for scalable and efficient cloud-based solutions.
- Implement ETL patterns and apply big data engineering practices across Data Lakes and Data Warehouses to process and manage large datasets.
- Apply design patterns to build maintainable, scalable, and efficient code.
- Work with cross-functional teams to ensure integration and scalability of data solutions and services.
Required Qualifications
- Strong experience in Java, including Spring Boot, Microservices, and REST APIs.
- Hands-on experience with Apache Spark for big data processing.
- Familiarity with AWS technologies such as S3, ECS/EKS, DynamoDB, Lambda, and Step Functions.
- Experience in Data Lakes, ETL processes, and Data Warehousing.
- Ability to design and implement data pipelines using Spark and AWS services.
- Strong understanding of design patterns for software development.
- Experience working with AWS cloud architecture and services.
Preferred Skills/Qualifications
- Experience with Terraform for Infrastructure as Code (IaC).
- Familiarity with Databricks for big data analytics.
- Experience with Snowflake or AWS Lake Formation for data storage and management.
- Knowledge of Big Data engineering, data lakes, and data warehouse ETL patterns.
- Previous experience working in agile environments.
- Familiarity with DevOps practices and CI/CD pipelines.
Job ID: NT240514