Bengaluru Millenia, India
Senior Associate_Bigdata Engineer_Data & Analytics_Advisory_Bangalore

Line of Service

Advisory

Industry/Sector

FS X-Sector

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities: 

• Experience working on projects implementing solutions across the software development life cycle (SDLC) 

• Experience working with Continuous Integration/Continuous Deployment tools 

• Experience in systems analysis, including defining technical requirements and performing high-level design for complex solutions 

• Design and implement pipelines for data ingestion and transformation (see the sketch after this list) 

• Manage data pipelines for analytics and operational use 

• Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity 

• Ensure data accuracy and integrity across multiple sources and systems 

• Collaborate with data scientists to support their algorithms and with data analysts to support analytics 

• Partner with the product team to help inform priorities across a set of software products, applications, and/or services 

• Play a crucial role in implementing software and methodologies for data correction, reconciliation, and quality checking 

• Work closely with data science, data analytics, and product teams to drive insights and innovation 

• Experience in systems analysis for data engineering, including defining technical requirements and performing high-level design for complex solutions 

• 3 years of experience in Hadoop or any Cloud Big Data components  

• Expertise in Scala/Python/PySpark/Apache Spark, SQL, scripting, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Airflow, Kafka, or equivalent cloud big data components  

• Expertise in scripting, any RDBMS, Hadoop (OLAP on Hadoop), and dashboard development 

• Microservice Development 

• GCP exposure (Dataproc, GCS, BigQuery) 

• Work with the platform team to resolve any infrastructure-related concerns 

• 5-8 years of experience in data engineering, platform engineering, or data warehousing  

• 4 years of experience working on projects implementing solutions across the software development life cycle (SDLC), end-to-end 
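For illustration only, a minimal sketch of the kind of PySpark ingestion-and-transformation pipeline described in the bullets above; the storage paths, column names, and daily aggregation are hypothetical assumptions, not part of the role description.

# Minimal PySpark batch job: ingest raw CSV, cleanse, aggregate, write Parquet.
# All paths and column names below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("ingest-and-transform-sketch")  # hypothetical job name
    .getOrCreate()
)

# Ingest: read raw files landed by an upstream process (placeholder GCS path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("gs://example-bucket/raw/transactions/")
)

# Transform: deduplicate and build a daily aggregate (columns are assumed).
daily = (
    raw.dropDuplicates(["transaction_id"])
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "account_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Load: write partitioned Parquet for downstream analytics (placeholder path).
(
    daily.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("gs://example-bucket/curated/daily_account_totals/")
)

spark.stop()

In practice, a job like this would typically be scheduled and monitored from an orchestrator such as Airflow, which the skill list above also mentions.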

Mandatory skill sets: 

 Big Data, Python, Hadoop 

Preferred skill sets: 

 Big Data, Python, Hadoop 

Years of experience required: 

 3-10 Years 

Education qualification: 

 BE, B.Tech, MCA, M.Tech 

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Big Data Hadoop

Optional Skills

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Available for Work Visa Sponsorship?

Government Clearance Required?

Job Posting End Date
