Anywhere is at the forefront of digital transformation, building best-in-class products that help our agents and brokers sell more homes, make more money, and work more efficiently.
Data & Analytics (DNA) is Anywhere's data arm. We create innovative analytics, data science, and robust data foundation capabilities to generate data-driven insights that serve the heart of the Anywhere Advisor and Anywhere Brand business. Together with our counterparts in the real estate business, we work daily to deliver differentiating insights (AI & BI) for Strategy and AA & AB Operations.
We're seeking a Principal Engineer to join our Data Platform Team. In this critical position, you'll be responsible for designing, implementing, and managing our data infrastructure. You will work closely with data scientists, software engineers, and other stakeholders to ensure the Data Platform's availability, usability, and integrity.
Data Infrastructure Design and Implementation:
Evaluate, select, and implement new tools and frameworks to expand our data platform capabilities. Design, build, and maintain robust, scalable, and reliable data pipelines and ETL processes. Develop and maintain data infrastructure and platforms using various technologies (e.g., AWS, Snowflake, databases, Kafka streaming). Ensure data quality, consistency, and integrity across the organization. Architect and optimize data ingestion and Snowflake ETLs. Provide production support and enhance the observability of the Data Platform.
Team Leadership and Mentorship:
Lead and mentor data engineers, providing guidance and support to junior team members. Foster a culture of technical excellence and continuous learning. Collaborate with other teams (e.g., data scientists, software engineers, and product managers) to ensure data solutions meet business needs.
Data Security and Compliance:
Implement and maintain data security measures to protect sensitive data. Ensure compliance with data protection regulations and industry standards.
Problem Solving and Innovation:
Identify and solve complex data-related problems. Stay abreast of industry trends and emerging technologies, and identify opportunities to enhance data capabilities. Proactively address performance, scale, complexity, and security considerations.
Skills and Qualifications
Technical Expertise:
10+ years’ experience with a strong understanding of data engineering principles and technologies. 10+ years’ experience with data pipelines, ETL processes, and data warehousing. 5+ years’ experience building data pipelines using Kafka, Kafka Connect, Airflow, and Snowflake. 5+ years’ experience with the Snowflake Data Platform. 5+ years’ experience with AWS data services such as DMS, EMR, Glue, Athena, S3, CloudWatch, Lambda, and IAM. 5+ years’ experience with data quality and data reconciliation. 5+ years’ experience managing production data platforms. 5+ years’ experience building observability (monitoring and alerting) using tools such as Datadog. Proficiency in programming languages (e.g., Java, Python, SQL). Knowledge of data governance, data modeling, and security best practices. Proficiency in CI/CD, IaC, and Agile development.
Leadership and Communication:
Strong leadership and mentoring skills. Excellent communication and collaboration skills. Ability to explain complex technical concepts to both technical and non-technical audiences.
Problem-Solving and Analytical Skills:
Ability to identify and solve complex problems. Strong analytical skills to identify data quality issues and performance bottlenecks.