Akuna Capital is a young and booming trading firm with a strong focus on collaboration, cutting-edge technology, data-driven solutions and automation. We specialize in providing liquidity as an options market maker – meaning we are committed to providing competitive quotes at which we are willing to both buy and sell. To do this successfully we design and implement our own low-latency technologies, trading strategies and mathematical models.
Our Founding Partners, including Akuna’s CEO Andrew Killion, first conceptualized Akuna in their hometown of Sydney. They opened the firm’s first office in 2011 in the heart of the derivatives industry and the options capital of the world – Chicago. Today, Akuna is proud to operate from additional offices in Sydney, Shanghai, and Boston.
What you’ll do as a Junior Data Capture Engineer on the Data Engineering Team at Akuna:
Akuna Capital is seeking a hands-on engineer to help take our market data platform to the next level. At Akuna, we believe that our data provides a key competitive advantage and is critical to the success of our business. Our Data Engineering team is composed of world-class talent, and the Data Capture team has been entrusted with building and maintaining our market data capture and processing platform. Our platform starts with gathering data from capture points across global markets and ends with users across Trading, Quantitative and Middle Office teams accessing complex datasets for a wide range of streaming and batch use cases. Along the way, we build tools to capture, transform, monitor and access the data in efficient and intuitive ways, building on the best tools and technologies available for each step of the pipeline. In this role you will:
- Work within a growing global team supporting the strategic role of data at Akuna
- Build and deploy new market data capture, storage and processing capabilities for Akuna
- Support the ongoing growth, design and expansion of our hybrid cloud data platform by building key technology supporting datasets in use throughout the firm
- Assist in identifying how market data is accessed and used, contributing toward efforts to scale out data tooling across the firm
- Produce clean, well-tested and documented code with a clear design to support mission critical applications
- Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
Qualities that make great candidates:
- Bachelor's, Master's or PhD in Computer Science, Engineering, Physics or an equivalent technical field
- 1-3 years of professional experience developing data-heavy tooling and applications
- Knowledge of at least one programming language (Java, C++, Scala, Python, Go, etc.)
- Experience with network capture and TCP/UDP PCAP processing is a plus
- Prior hands-on experience with data platforms and technologies such as Kafka, Delta Lake, Spark and Elasticsearch is a plus
- Demonstrated experience delivering software projects using engineering best practices such as Continuous Integration and Continuous Deployment
- Legal authorization to work in the U.S. on the first day of employment is required, including for F-1 students using OPT or STEM OPT
Please note: If you have applied to multiple roles, you will be asked to complete multiple coding challenges and interviews.