Los Angeles, CA • lyle.barner@gmail.com
With 10+ years of experience in complex systems, I find fulfillment in developing creative solutions to difficult problems.
Python • Java • C/C++ • TypeScript/JavaScript • Node.js • SQL • R • Shell • Jupyter • MATLAB • Fortran
Linux • Docker • PyTorch • TensorFlow • Keras • GitHub • Electron • TestRail • Jira • Jenkins
NIST 800-53 • AS9100
I build and maintain the infrastructure, tooling, and pipelines that enable research scientists to develop and ship software rapidly and reliably. Specializing in CI/CD, hybrid cloud infrastructure, and infrastructure-as-code, I reduce engineering overhead while maintaining strong security and observability practices.
Full-Stack Developer
I led a team of 2–3 engineers in building a cross-platform software quality analysis toolset using Python and Electron. I managed agile sprints, oversaw open-source collaboration, and contributed code to drive feature development. Responsibilities included maintaining a robust test automation suite, ensuring backward compatibility, and tracking progress through milestone-based delivery and CI/CD pipelines to support high-quality, scalable software releases.

Assurance Researcher
I enhanced the reliability of mission-critical systems through a data-driven approach to software assurance. I supported development milestones with automated testing and analysis workflows, conducted software security assessments to uncover critical vulnerabilities and evaluate risk, and developed scalable techniques to improve the efficiency and impact of assurance activities across the software lifecycle. I secured over $500K in strategic investment funding to pursue this goal.

I served as a subject matter expert in the deployment, integration, and maintenance of code quality analysis tools. I analyzed software for reliability issues, cybersecurity vulnerabilities, and overall code health, while supporting seamless integration into CI/CD pipelines. I also performed data analysis and generated risk ratings to help teams make informed, security-conscious development decisions.
I applied statistical modeling and Monte Carlo analysis to simulate complex air-to-air engagements, supporting technology readiness assessments of next-generation air platforms. By leveraging exploratory data analysis, I identified key patterns and trends in large datasets. Through data-driven storytelling, I communicated key findings and actionable insights to government partners during monthly review meetings. I implemented software enhancements to extend simulator functionality and developed validation plans to benchmark in-house simulation tools against commercial models.
I researched and implemented a variety of orbital prediction algorithms to enhance proprietary simulation models. I also led the development of verification and validation plans to ensure the accuracy, reliability, and mission alignment of high-fidelity simulation tools.
Notable Coursework: Neural Networks and Deep Learning, Natural Language Processing, Big Data Analytics, Machine Learning Algorithms
Notable Coursework: Software Design Patterns, Project Management, System Design, Database Design
Notable Coursework: Control Systems, Digital Signal Processing, Communication Systems, Computer Vision
I led efforts to advance software supply chain security by investigating tools and techniques for generating and leveraging SBOM data within mission-critical environments. I developed a proof-of-concept toolchain to assess dependency quality and exposure risk through continuous integration pipelines, enabling early detection of vulnerable software elements. Drawing from this work, I now lead the development of institutional guidance for project-level software supply chain risk management, using SBOM analytics and automated tooling to support compliance, transparency, and secure software delivery throughout the development lifecycle.
Currently leading a team of 3–5 engineers to deploy automated static and binary analysis initiatives aimed at enhancing software quality and security. We leverage benchmarking and performance metrics to evaluate tool effectiveness, optimize usage, and inform strategic adoption across the development pipeline. Our work includes exploring advanced binary analysis techniques for scenarios where source code is unavailable, enabling vulnerability detection, code provenance analysis, and risk assessment in closed-source or third-party components. This effort blends leadership in secure software engineering with a strong emphasis on data-driven tool evaluation and scalable automation.
I spearheaded the development of an institutional-level Secure Coding Guideline, standardizing best practices across all mission-critical software development efforts. To enforce compliance, I built and tuned static code analysis pipelines, integrating custom rule sets to align with security policies and streamline DevSecOps workflows. Additionally, I performed threat intelligence analysis on vulnerability data from automated security scans, using risk prioritization and data-driven insights to surface critical issues and guide remediation strategies. This work enhanced software assurance, reduced attack surface, and supported secure software deployment at scale.
Presented for an outstanding group achievement that contributed substantially to NASA's mission.
Presented for making a difference to the operational effectiveness and/or technical capabilities of the organization.
Presented for commitment to excellence and significant contributions to successful contract execution.