Mastery of the technical skills described in this job posting, with an emphasis on GCP
Development and maintenance of data pipelines and platforms on GCP
Strong written and verbal communication skills, with the ability to collaborate well with project managers, data teams, and stakeholders
Organized and resourceful; proactive and creative in developing solutions
Develop, monitor and maintain critical data pipelines and platforms in a cloud-based Big Data environment.
Collaborate with product managers, data infrastructure teams, and project stakeholders to rapidly design and deploy solutions to fulfill business requests.
Bachelor's degree in Computer Science, Software Engineering, or related discipline, or equivalent work experience.
5+ years of Data Engineering experience with cloud-based services and big data technologies, including Amazon EMR (Elastic MapReduce), GCP, and Hive/Spark
Advanced working knowledge and experience in Python and SQL (Presto, Postgres)
Experience building and optimizing big data orchestration and workflows (Airflow)
Solid understanding of security best practices, privacy regulations and compliance
Experience in Martech, CRM, media agency, or audience platforms a plus
Experience with Vertica a plus
Experience with Data Governance practices a plus
Strong analytical skills for working with structured and unstructured datasets
Strong organizational, communication, and interpersonal skills
Excellent decision-making and the ability to work in an agile, fast-paced environment
Driven to self-educate and learn new technologies and approaches