✨ About The Role
- The role involves building a real-time data platform for the Department of Defense (DoD) to enhance operators' awareness of critical events by aggregating real-time data from over 750 sensors
- Responsibilities include building Extract, Transform, Load (ETL) pipelines and applying software engineering principles to architect, develop, and deploy large-scale streaming applications
- Key technologies involved are Kafka, Kafka Streams, Pinot, Java, Scala, and Kubernetes
- The position requires hands-on collaboration with a team to process over a billion events daily with millisecond-level latency
- The successful candidate will work on-site up to 50% of the time and must obtain Security+ within the first 90 days of employment
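To make the day-to-day concrete: the streaming work described above centers on per-sensor transforms (filtering malformed events, aggregating per key) of the kind a Kafka Streams topology expresses with `filter()` and `groupByKey().aggregate()`. Below is a minimal plain-Java sketch of that transform shape, with the Kafka wiring omitted; the `Reading` type and `averageBySensor` name are hypothetical, purely for illustration.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SensorEtl {
    // One event from a sensor: id plus measured value (hypothetical schema).
    record Reading(String sensorId, double value) {}

    // Drop malformed readings, then average per sensor -- the same shape of
    // transform a streaming topology applies continuously per event key.
    static Map<String, Double> averageBySensor(List<Reading> batch) {
        return batch.stream()
                .filter(r -> r.sensorId() != null && !Double.isNaN(r.value()))
                .collect(Collectors.groupingBy(
                        Reading::sensorId,
                        Collectors.averagingDouble(Reading::value)));
    }

    public static void main(String[] args) {
        List<Reading> batch = List.of(
                new Reading("s1", 10.0),
                new Reading("s1", 20.0),
                new Reading("s2", Double.NaN),  // dropped as malformed
                new Reading("s2", 5.0));
        System.out.println(averageBySensor(batch));
    }
}
```

In production this logic would run inside a Kafka Streams application over the live sensor topics rather than an in-memory batch, with state stores and windowing handling the billion-events-per-day scale the role describes.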
⚡ Requirements
- Experienced software engineer with 9+ years in the industry, particularly in building and managing fault-tolerant, data-intensive platforms using streaming technologies like Kafka, Pinot, or Flink
- Proficient in working with large-scale relational and non-relational databases such as PostgreSQL, MySQL, MongoDB, and Elasticsearch to meet low-latency requirements
- Skilled in data management including ETL processing, data governance, and data storage
- Strong background in building and releasing software in cloud-native environments and using package managers like Maven, Gradle, and NPM
- Able to collaborate closely with an accomplished team building data infrastructure and platforms on streaming frameworks