We are seeking an experienced Hadoop Developer to join our team. As a Hadoop Developer, you will be responsible for designing, developing, and maintaining Hadoop-based applications and solutions. This is a senior-level position that requires strong technical expertise and a solid understanding of the Hadoop ecosystem.
Responsibilities:
- Design, develop, and implement Hadoop-based applications using technologies such as HDFS, MapReduce, Hive, Pig, Spark, etc.
- Collaborate with cross-functional teams to gather and analyze requirements, and translate them into technical specifications.
- Develop and optimize data ingestion and processing pipelines for large-scale data sets.
- Troubleshoot and debug Hadoop applications to identify and resolve issues in a timely manner.
- Monitor and tune Hadoop clusters for performance and scalability.
- Implement security measures and data governance policies to ensure data integrity and privacy.
- Stay up-to-date with the latest developments and advancements in the Hadoop ecosystem and big data technologies.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven work experience as a Hadoop Developer, with at least 5 years in the role.
- Strong proficiency in Hadoop technologies such as HDFS, MapReduce, Hive, Pig, Spark, etc.
- Experience with Hadoop distribution platforms like Cloudera, Hortonworks, or MapR.
- Solid understanding of SQL and relational databases.
- Proficiency in programming languages like Java or Scala.
- Familiarity with data ingestion tools such as Apache Kafka or Apache NiFi.
- Knowledge of data warehousing concepts and ETL processes.
- Experience with data visualization tools like Tableau or Power BI is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work effectively in a fast-paced environment and manage multiple priorities.
If you are a highly skilled and experienced Hadoop Developer with a passion for big data and analytics, we invite you to join our team. Help us build innovative and scalable solutions that leverage the power of Hadoop to unlock valuable insights from data.