• Bachelor's Degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)
• 4 years of experience in Data Engineering, Data Lake, Data Mesh, Data Warehousing/ETL
• 4 years of experience working on projects implementing solutions across the software development life cycle (SDLC)
• Experience with Continuous Integration/Continuous Deployment (CI/CD) tooling such as Git, Bitbucket, and Jenkins
• 4 years of experience in systems analysis, including defining technical requirements and performing high-level design for complex Data Engineering solutions
• 4 years of experience with Hadoop, Google Cloud Platform (GCP), and Big Data components (specific to the Data Engineering role)
• Expertise in Python, advanced SQL, PySpark (Spark batch and streaming), Airflow, and Kafka
• Exposure to REST API creation and usage
• Experience with functional programming, test-first development (unit and regression testing), and OOP concepts (classes and objects)
• Exposure to DB2 and Teradata is a plus
HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.