$100,000 - $120,000 Per Year
Sep 01, 2022
White Collar Technologies
•Develop architecture and standards for Hadoop clusters and data services technologies (Informatica, NiFi, etc.).
•Implement, manage, and administer the Hadoop and data services infrastructure.
•Manage the day-to-day operations of the Hadoop and data services technologies.
•Collaborate with other infrastructure, analytics, and application teams to make sure services are highly available and performing as expected.
•Performance tuning of application code to improve operational performance.
•Configuration of data services required by analytics and application teams, such as YARN, NiFi, and BDM.
•Capacity planning and determining requirements for applications using services in the data platform.
•Capacity planning for storage requirements for HDFS
•Capacity planning of data movement services and the required throughput of file transfers and endpoints
•Ensure production operations are not impacted, and provide level-3 on-call support by monitoring data services connectivity and performance.
•Manage and review Hadoop log files
•Establish and maintain sound backup and recovery policies and procedures.
•Implement and maintain security of Hadoop and other data services
•Mentor junior team members
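The shell-scripting and HDFS capacity-planning duties above can be sketched with a small monitoring script. This is a minimal illustration, not part of the job description: the 80% threshold is a hypothetical policy, and the report text is a hard-coded stand-in for real `hdfs dfsadmin -report` output.

```shell
#!/bin/sh
# Sketch: extract HDFS capacity figures from `hdfs dfsadmin -report`-style
# output and warn when usage crosses a threshold. The report excerpt below
# is hypothetical sample data; in practice you would capture the output of
# the real command, e.g. report=$(hdfs dfsadmin -report).

THRESHOLD=80  # percent used that triggers a warning (hypothetical policy)

report='Configured Capacity: 1000000000000 (1 TB)
DFS Used: 850000000000 (850 GB)
DFS Remaining: 150000000000 (150 GB)'

# Pull the raw byte counts (third whitespace-separated field on each line).
capacity=$(printf '%s\n' "$report" | awk '/Configured Capacity:/ {print $3}')
used=$(printf '%s\n' "$report" | awk '/DFS Used:/ {print $3}')

# Integer percentage used.
pct=$(( used * 100 / capacity ))

if [ "$pct" -ge "$THRESHOLD" ]; then
    echo "WARNING: HDFS usage at ${pct}% (threshold ${THRESHOLD}%)"
else
    echo "OK: HDFS usage at ${pct}%"
fi
```

A script like this would typically run from cron and feed an alerting tool such as Nagios, in line with the monitoring bullet above.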
Knowledge, Skills and Abilities:
•Excellent knowledge of UNIX/LINUX OS.
•Good understanding of OS concepts, process management and resource scheduling.
•Basics of networking, CPU, memory and storage.
•Ability to demonstrate experience in distributed UNIX environments
•Experience in Big Data technologies (Hadoop, Spark, NiFi, Kafka)
•Experience in Data Services technologies (Informatica, B2B)
•Experience in writing shell scripts
•Familiarity with all the components of the Hadoop ecosystem, such as Apache Pig, Apache Hive, and Apache Mahout.
•Knowledge of configuration management and automation tools such as Puppet or Chef for non-trivial installations.
•Knowledge of cluster monitoring tools like Ambari, Ganglia, or Nagios.
•Knowledge of core Java is a plus for a Hadoop admin but not mandatory.
•Ability to demonstrate proficiency in Microsoft Access, Excel, Word, PowerPoint and Visio.
•Ability to multi-task and work under pressure.
•Ability to be careful and thorough with detail.
•Ability to work both independently and in a collaborative environment.
•Ability to analyze information and use logic to address work related issues and problems.
•Experience in the Healthcare Industry is a plus.
Work Conditions and Physical Demands:
•Primarily sedentary work in a general office environment
•Ability to communicate and exchange information
•Ability to comprehend and interpret documents and data
•Requires occasional standing, walking, lifting, and moving objects (up to 10 lbs.)
•Requires manual dexterity to use computer, telephone and peripherals
•May be required to work extended hours for special business needs
•May be required to travel at least 10% of time based on business needs
•The knowledge typically acquired in the course of attaining a Bachelor’s degree in Computer Science, Mathematics, or a related discipline is required. A combination of education and experience may be used in lieu of a degree.
Minimum Related Work Experience:
•6-10 years’ experience with Hadoop or Informatica/ETL technologies
•2 years’ experience designing and implementing high performance, high volume operational systems
•Proven experience using the Hadoop stack and its components, e.g., Hive, Hue, NiFi, Spark, Cloudera.
•Experience with Linux/Unix OS
•Experience with Hardware, Storage, Networking, Active Directory/Kerberos.
•Experience with Shell Scripting
•Basic Experience with programming languages, e.g., Python, Java
Job ID: WC220319
Oct 07, 2022