Washington, DC, USA
Legacy Integration Layer Developer
Overview

BigBear.ai is seeking a motivated, career- and team-oriented Legacy Integration Layer Developer in support of the U.S. Department of Homeland Security (DHS) Cybersecurity and Infrastructure Security Agency (CISA) Continuous Diagnostics and Mitigation (CDM) Data Services Program. The CDM Data Services Program is a critical component of CISA's national effort to ensure the defense and resilience of cyberspace. The program's mission is to provide a standardized platform to collect, transform, and integrate cybersecurity data from relevant authoritative data sources into a coherent data set, delivering actionable information to Agency and Federal Dashboards to identify risk areas in support of mitigation and to facilitate coordinated agency and national response to cyber threats.

This is a remote position; the candidate can work from any location within the United States provided they are able to work on an Eastern Time zone schedule.

Our Architecture and Engineering team provides the technical expertise to design, implement, and maintain a multi-cloud, multi-tenant managed service offering, including defining and implementing build, deployment, and monitoring standards. The team delivers end-to-end automation of deployment, monitoring, and infrastructure management in a cloud environment by building and configuring delivery environments that support CI/CD tools using an Agile delivery methodology. In addition, the team works closely with the development team to create an automated continuous integration (CI) and continuous delivery (CD) system.

What you will do

The Legacy Integration Layer Developer conducts the full data development lifecycle, covering requirements from DHS and other OMB initiatives, and provides support across the program. The position also requires building a new data automation practice on the program to address our client's most pressing cybersecurity threat and data needs. The successful candidate will bring a consultative approach that improves the value of the data our customers collect, and will act as a thought leader in applying Big Data practices to our clients' cybersecurity problems, drawing on demonstrated experience designing and developing enterprise data solutions for large clients, bringing new approaches to the team, and presenting white papers and other solutions.

Responsibilities

The Integration Layer Developer will:
- Develop efficient processes to collect, ingest, and transform data from various sources into Splunk.
- Ensure data quality, integrity, and availability.
- Configure and maintain continuous and accurate data ingestion into Splunk.
- Develop complex Splunk searches and dashboards.
- Design and maintain Splunk data models to support efficient searches and analysis.
- Collaborate with other developers, share findings, and report methodologies through documentation and productive discussion.
- Utilize Agile methodologies with Continuous Integration and Continuous Delivery (CI/CD) pipelines.
- Utilize Git commands for version control and delivery via platforms such as Azure DevOps, GitLab, etc.
- Script innovative tools in Python to further the capabilities of the solution.
- Automate manual processes using Python, Bash, PowerShell, or other scripting languages.
- Review, identify, and analyze data from multiple source cybersecurity tools at multiple agencies.
- Interpret data, analyze results using statistical techniques, and support data trends based on customer needs.
- Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.
- Acquire data from primary and other data sources and maintain databases/data systems.
- Identify, analyze, and interpret trends or patterns in complex datasets.
- Analyze source data and data types, and identify data requirements for destination systems.
- Analyze, interpret, and develop data models based on Data Dictionary and Logical Data Model guidance.
- Locate and define new process improvement opportunities.
- Other duties as assigned.
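The responsibilities above center on scripted collection, transformation, and ingestion of cybersecurity tool data into Splunk. As a minimal sketch only, the Python below illustrates one such pattern: pulling JSON findings from a source tool's REST API with bearer-token authentication, normalizing them, and forwarding the events to Splunk's HTTP Event Collector (HEC). The endpoint URLs, tokens, index, sourcetype, and field names are hypothetical placeholders, not program specifics.

    """Sketch: collect findings from a (hypothetical) tool REST API and send them to Splunk HEC."""
    import json
    import os

    import requests

    # Hypothetical source API and Splunk HEC settings, supplied via environment variables here.
    SOURCE_API_URL = os.environ.get("SOURCE_API_URL", "https://tool.example.gov/api/v1/findings")
    SOURCE_API_TOKEN = os.environ["SOURCE_API_TOKEN"]
    HEC_URL = os.environ.get("HEC_URL", "https://splunk.example.gov:8088/services/collector/event")
    HEC_TOKEN = os.environ["HEC_TOKEN"]


    def collect_findings():
        """Collect raw JSON records from the source tool using bearer-token authentication."""
        resp = requests.get(
            SOURCE_API_URL,
            headers={"Authorization": f"Bearer {SOURCE_API_TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("findings", [])  # "findings" is a placeholder response field


    def transform(record):
        """Normalize a raw record into an HEC event payload (placeholder index/sourcetype)."""
        return {
            "event": {
                "id": record.get("id"),
                "severity": record.get("severity", "unknown"),
                "asset": record.get("asset"),
            },
            "sourcetype": "cdm:tool:finding",
            "index": "cdm_data",
        }


    def send_to_splunk(events):
        """Batch events into a single HEC request (newline-delimited JSON payloads)."""
        payload = "\n".join(json.dumps(e) for e in events)
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            data=payload,
            timeout=30,
        )
        resp.raise_for_status()


    if __name__ == "__main__":
        events = [transform(r) for r in collect_findings()]
        if events:
            send_to_splunk(events)

In practice, a collection script of this kind would typically run on a schedule and be paired with Splunk configuration (props/transforms, data models) to keep field extraction consistent.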
What you need to have

- Bachelor's Degree in computer science, data science, or a closely related field and 5+ years of experience; or a Master's Degree and 3+ years of experience; or a PhD and 0 to 3 years of experience; or, in lieu of a Bachelor's degree, 6 additional years of relevant experience.
- Clearance: Able to obtain and maintain a DHS Suitability/Entry on Duty (EOD).
- Experience with programming and scripting languages such as Java, Python, Bash, PowerShell, and R.
- Familiarity working with various API response types such as JSON and XML.
- Familiarity with the Splunk platform, including Universal Forwarders.
- Experience developing complex queries and searches within Splunk (an illustrative scripted-search sketch appears at the end of this posting).
- Ability to troubleshoot and resolve issues related to data processing, such as data integrity issues, parsing errors, or query performance problems.
- Proven ability to analyze complex problems, theorize root causes, and develop creative solutions.
- Proven ability to use multiple REST API authentication types, knowledge of REST methods, and ability to mine APIs to meet data requirements.
- Proficiency with queries, report writing, and presenting findings.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Experience understanding organizational needs, proposing solutions, and managing project execution efforts designed to deliver overall program benefits for Government Agencies.
- Experience collaborating with US Government Agencies, state or local governments, or commercial entities to develop IT service program maturity in accordance with Federal IT mandates and best practices.
- Experience conducting assessments of an Enterprise by reviewing technical documentation and conducting interviews and workshops to identify gaps and develop a tailored solution is highly desired.
- Demonstrated interest in security solution design using existing as well as emerging technologies to deliver enterprise solutions.

Additional Skills

- Demonstrated ability to investigate data and present findings to internal teammates and client audiences.
- Shown interest in keeping up with industry trends and best practices.

What we'd like you to have

- Splunk certifications (Core, Power User, Advanced Power User, etc.)
- Cloud platform certifications (AWS Cloud Practitioner, Azure Fundamentals)
- A security certification such as Security+

About BigBear.ai

BigBear.ai is a leading provider of AI-powered decision intelligence solutions for national security, supply chain management, and digital identity. Customers and partners rely on BigBear.ai's predictive analytics capabilities in highly complex, distributed, mission-based operating environments. Headquartered in Columbia, Maryland, BigBear.ai is a public company traded on the NYSE under the symbol BBAI. For more information, visit https://bigbear.ai/ and follow BigBear.ai on LinkedIn: @BigBear.ai and X: @BigBearai.
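As referenced in the qualifications above, one common way to exercise complex Splunk searches programmatically is through Splunk's REST search API. The Python below is a minimal sketch only: it streams results from the /services/search/jobs/export endpoint using token authentication. The host name, token, SPL query, index, and sourcetype are hypothetical placeholders, not program specifics.

    """Sketch: run an SPL query via Splunk's REST export endpoint and stream JSON results."""
    import json
    import os

    import requests

    SPLUNK_HOST = os.environ.get("SPLUNK_HOST", "https://splunk.example.gov:8089")
    SPLUNK_TOKEN = os.environ["SPLUNK_TOKEN"]

    # Placeholder SPL: summarize findings by severity over the last 24 hours.
    SPL = (
        "search index=cdm_data sourcetype=cdm:tool:finding earliest=-24h "
        "| stats count by severity"
    )


    def run_search(spl):
        """Stream search results from the export endpoint as newline-delimited JSON objects."""
        # If Splunk's management port uses an internal CA, point REQUESTS_CA_BUNDLE at that CA bundle.
        resp = requests.post(
            f"{SPLUNK_HOST}/services/search/jobs/export",
            headers={"Authorization": f"Bearer {SPLUNK_TOKEN}"},
            data={"search": spl, "output_mode": "json"},
            stream=True,
            timeout=120,
        )
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            payload = json.loads(line)
            if "result" in payload:
                yield payload["result"]


    if __name__ == "__main__":
        for row in run_search(SPL):
            print(row)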