Greetings!
Our client, Navy Federal Credit Union, is the world's largest credit union, with over 10 million members, over $149 billion in assets, and over 23,000 employees.
Our client is seeking an IT Engineer - Data-15507-Hybrid. Your profile is rock-solid, and your overall background appears to be a great match for the position.
Please review the information below for details on the position.
Basic Purpose:
Develop strategies for data acquisition, data pipelines, and database implementation. Responsible for designing, building, and integrating data from various sources, and for managing big data. Develop and write complex queries, ensuring they are easily accessible and run smoothly, with the goal of optimizing the performance of Navy Federal’s big data ecosystem within CI/CD pipelines. Recognized as an expert with specialized depth and/or breadth of expertise in the discipline. Solves highly complex problems; takes a broad perspective to identify solutions. Leads functional projects. Works independently.
Responsibilities:
- Provide Data Intelligence and Data Warehousing (DW) solutions and support by leveraging project standards and leading data platforms
- Build and maintain Azure data pipelines using DevSecOps processes
- Define and build data integration processes to be used across the organization
- Build conceptual and logical data models for stakeholders and management
- Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives
- Prepare advanced project implementation plans that highlight major milestones and deliverables, leveraging standard methods and work planning tools
- Document existing and new processes to develop and maintain technical and non-technical reference materials
- Recognize potential issues and risks during analytics project implementation and suggest mitigation strategies
- Coach and mentor project team members in carrying out analytics project implementation activities
- Communicate and own the process of manipulating and merging large datasets
- Perform other duties as assigned
Qualifications and Education Requirements:
- Master’s degree in Information Systems, Computer Science, Engineering, or a related field, or the equivalent combination of education, training, and experience
- Expert skill in Azure Data Factory and Databricks
- Advanced skill in Azure SQL, Azure Data Lake, Azure App Service, Python, and T-SQL
- Experience sourcing, maintaining, and updating data in on-premises and cloud environments
- Knowledge of, and the ability to perform, basic statistical analysis
- Thorough understanding of SQL
- Experience with ETL tools and techniques
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods
- Ability to understand the business problem, determine which aspects of it require optimization, and articulate those aspects in a clear and concise manner
- Ability to understand other projects or functional areas in order to consolidate analytical needs and processes
- Demonstrated change management and/or excellent communication skills
- Understanding of data warehousing, data cleaning, data pipelines, and other analytical techniques required for data usage
- Deep understanding of multiple data-related concepts
- Understanding of the concepts and application of data mapping and requirements building
- Understanding of data models, large datasets, business/technical requirements, BI tools, data warehousing, and statistical programming languages and libraries
- Experience using Git and source control
- Skill in managing the process between updating and maintaining data source systems and implementing data-related requirements