Senior Data/DevOps Engineer
Remote · Portugal, Lisboa · Engineering
We are looking for a Senior Data Engineer with demonstrated experience in engineering stateless data pipelines and the infrastructure/DevOps around them. You'll be part of a small, agile team developing our strategy and risk engine: building reliable data pipelines, fault-tolerant risk systems, and clever integrations with on-chain programs (with the help of our team), creating real-time value. You will work closely with the product team, led by two of the founders, and a solid engineering/financial team to build infrastructure that will allow millions to be part of a financial revolution on-chain.
As a Senior Data Engineer at Basis, you will have a central position in making technical and architectural decisions about our data scaffolding and infrastructure. You are expected to continue to build on the foundations of our data pipelines and financial engineering thus far and recruit strong talent to assist you in conquering the roadmap we are building.
Who we are
BASIS is basis.markets’ fee-sharing token. Holders can stake their BASIS tokens to receive a proportional share of trading fee rewards generated by the basis.markets Decentralised Basis Liquidity Pool (DBLP). BASIS tokens also give access to a high-reward liquidity mining programme as well as discounted deposits into the DBLP in the future.
basis.markets is a decentralised liquidity pool, owned by you, powered by the Basis Trade Engine. The DBLP is a fully managed, cross-exchange, cross-collateralised liquidity pool offering high yields on non-directional trades.
Controlled by the basis.DAO and managed by financial markets experts and a top-tier trading team, basis.markets has the vision of delivering next-level returns, delta-neutral yield, and real-time insights.
Why do we need you?
A key element of our success is providing a solid data infrastructure with the right risk management tools to support our financial engineering. We derive dozens of strategies on a continuous basis, on which we will deploy serious capital. This relies on a fault-tolerant, stateless, replicable deployment that is carefully monitored and has built-in risk controls. It needs to be boringly simple in an unassuming, elegant, and powerful way.
- Take the lead in making technical and architectural decisions regarding our stack and specific implementation of critical supporting services
- Interface and align with the On-Chain, Data Engineering and Quant teams on technical scope, decisions, and implications.
- Spearhead the development of the infrastructure, and of the data pipelines that run on it, to get data from source to sink consistently.
- Focus on standardised, modular, and robust implementation of reusable components and functionality.
- Tackle challenges such as proper abstraction and reusable modular code which can be adapted at the core to derive benefits across all data pipelines.
- Ensure our code base is well kept, tested, and usable across the board for expanding our services and engaging with partner projects.
- Work together with the wider engineering team to bring them into the fold, allowing them to work with the tools you and your team build.
- Research new and upcoming technologies that will considerably improve the quality, cost, speed, and/or development time of our technical ecosystem.
- Review code and help junior developers grow technically in their role.
- Build iteratively by starting with solid interfaces which are designed to be mutable in the future but do not impact the flow of data. Build for composability and continuous improvement.
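As an illustrative sketch only (none of these names come from the basis.markets codebase), the interface-first, composable style described above might look like this in Python: sources, transforms, and sinks agree on a small contract, so any of them can be swapped without disturbing the flow of data.

```python
from typing import Callable, Iterable, Protocol


class Source(Protocol):
    """Anything that yields raw records (file, database, exchange feed)."""
    def read(self) -> Iterable[dict]: ...


class Sink(Protocol):
    """Anything that accepts processed records."""
    def write(self, record: dict) -> None: ...


def run_pipeline(source: Source, sink: Sink,
                 *transforms: Callable[[dict], dict]) -> int:
    """Pull records from the source, apply each transform in order,
    and push the result to the sink. Returns the record count."""
    count = 0
    for record in source.read():
        for transform in transforms:
            record = transform(record)
        sink.write(record)
        count += 1
    return count


# Minimal in-memory implementations, useful for tests:
class ListSource:
    def __init__(self, records: list[dict]):
        self.records = records

    def read(self) -> Iterable[dict]:
        return iter(self.records)


class ListSink:
    def __init__(self):
        self.records: list[dict] = []

    def write(self, record: dict) -> None:
        self.records.append(record)
```

Because the interfaces stay fixed, replacing `ListSource` with, say, an exchange websocket reader is a local change: the pipeline, its transforms, and its monitoring remain untouched.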
Why work with us?
- You have the opportunity to use your skills to accelerate a financial revolution; a massive FU to the traditional banking system.
- You will be working on data streams that constantly change and will require you to learn and explore new technologies. The density of knowledge you will acquire is catalysed by the speed of the ecosystem.
- Grow at a pace faster than you have ever experienced and be exposed to financial inside knowledge you would otherwise never gain.
- We are a small, experienced and completely decentralised team. This grants ultimate flexibility and ownership of your time. We care about shipping and iteratively improving together.
What we're looking for
- Demonstrable work experience (at least 5 years) in developing robust, fault-tolerant, mission-critical data pipelines.
- You have experience in scalable distributed data ETL.
- You have production-level experience with deployed data pipelines that have withstood their challenges, and you know how to prepare for those challenges going forward.
- You understand the impact of the code you are writing; you are able to reason about performance, memory, and how things will work in production before they are built.
- A firm understanding of when (and when not) to use libraries that abstract the management, security and scaffolding of data pipelines.
- Security is permanently at the forefront of every architectural decision.
- You are naturally versed in Python; Go/Rust is a plus. Bash, YAML, and Terraform are your daily bread.
- You can demonstrate reusable interfaces and abstraction design patterns for your data pipelines and how they have helped you compose for different scenarios.
- Experience with financial/time-series data at scale is a huge plus.
- You are comfortable interacting with the basis community on social media and are willing to associate yourself publicly with basis.markets.
- You enjoy a structured project but are not afraid to get started and rework as you go along.
- Perfection is the enemy of speed: you pride yourself on moving fast while making the right decisions early.
- You're pragmatic; you know when to trade off diving deep against quick fixes.
- You are a pioneer: you know sometimes you just have to dive into the deep end, that it's safe to do so, and that gives you energy.
- You are curious and won't stop searching and asking questions until you find the answer.
- You don't wait for answers; you move.