Hadoop Developer with GCP/Duluth, GA (Atlanta) 12 mth+ contract
Employment Type: Full-Time
Visa: H1B okay; no OPT/CPT
Location: Duluth, GA (Atlanta)
Duration: 12+ months
Primary Skillset – Hadoop, Java/Python/Scala (Any Programming language)
Secondary Skillset - GCP
• Spark with strong Java/Python/Scala
• Hands-on experience with Hadoop concepts
• Design big data pipelines using GCP big data products.
• Familiar with standard pipeline architectures for streaming and batch data using various combinations of tools such as Dataflow, BigQuery, Pub/Sub, etc.
• Able to analyze the pros and cons of various architectures from different perspectives – cost, compute, robustness, etc.
• Comfortable with hands-on development as well as discussing design aspects with customer leadership, such as enterprise architects and program managers
• Strong experience in Python, PySpark required
• Working knowledge of Hadoop ecosystem including Hive, HBase, Kafka
• Should have a basic understanding of Machine Learning concepts such as feature engineering, and should have worked in a Data Science/Data Engineering environment
• Comfortable with Docker and Kubernetes including companion orchestration tools such as Argo or Kubeflow
• Led teams in the development of production-quality ETL workflows for the ingestion and transformation of data at scale
• Strong communication skills, and the ability to see the impact of development on the overall product
• Proactively identify and pursue opportunities for optimizing processes
• Should be able to coach, guide, and mentor junior members of the team
• Responsible for the development, support, maintenance, and implementation of big data projects
• GCP Data Engineer certification preferred.