✨ About The Role
- The role involves building a real-time data platform for the Department of Defense, aggregating data from over 750 sensors so operators can make time-critical decisions
- Responsibilities include developing data infrastructure and platforms using streaming frameworks such as Apache Kafka, Flink, and Kafka Streams, as well as building ETL pipelines to support the platform's data operations
- Key technologies used in the role include Kafka, Kafka Streams, Pinot, Java, Scala, and Kubernetes, with a focus on processing over a billion events daily with millisecond-level latency
- The position requires hands-on collaboration with a team of accomplished individuals, contributing to the development and implementation of large-scale streaming applications
- The job offers a competitive salary, fully covered health, dental, and vision insurance, a 401(k) with company match, education and training benefits, and a range of other perks
⚡ Requirements
- Experienced software engineer with over 14 years in the industry, specializing in building fault-tolerant, data-intensive platforms using streaming technologies like Kafka, Pinot, or Flink
- Proficient in managing large-scale relational and non-relational databases such as PostgreSQL, MySQL, MongoDB, and Elasticsearch to meet low-latency requirements
- Skilled in data management, ETL processing, data governance, and data storage, with a strong background in cloud-native environments and software release processes
- Familiar with software design and architecture patterns, automation languages like Python or Go, and monitoring/logging solutions in cloud-native environments
- Security-focused individual holding an active Secret clearance, with the potential to obtain and maintain a Top Secret clearance, and able to obtain Security+ certification within the first 90 days of employment