MayStreet · Oct 5th 2020
MayStreet has an immediate opening for a Lead Platform Engineer. This is a full-time remote position. We are looking for a passionate and enthusiastic problem-solver with an interest in capital markets data to join us as we continue providing technology innovation and support to the world’s leading global capital markets companies.
Reporting directly to the CTO, in this role, you will lead a talented and open-minded data team that works on mission-critical applications within capital markets technology. You will have the opportunity to lead and contribute in all phases of the development lifecycle – from suggesting the use of emerging technologies to designing, integrating, and upgrading complex architecture.
Who we are
MayStreet is a global software company headquartered in New York City, serving the world's top capital markets trading companies. We're building the next generation of capital markets technology. Global capital markets are an ocean of fast-moving, interrelated, and complex data. Historically, it's been nearly impossible for all but a select few firms to make use of much of this data. MayStreet solves this problem by managing collection, storage, and API access to uniquely high-quality data sets.
Current strategic pursuits
The Lead Platform Engineer will be responsible for MayStreet's data processing pipelines. MayStreet's mission is "Capturing, Storing and Transforming" the world's market data, and this person will manage the data pipelines that flow between each of those stages. The role has a large operational component and also requires someone who can plan for future growth, both in utilization and in technology selection for the components involved.
The ideal candidate will have 8-10+ years of experience, including time in a leadership role.
Strong proficiency with Linux and an understanding of Linux server architecture and operation. Should be very comfortable with data copying, task scheduling, and Linux daemons.
Significant experience with Python. Ability to write and maintain object-oriented Python applications is a must.
Proficiency with Bash Shell Scripting.
Experience with processing job management, using one or more of the following: Jenkins, Rundeck, Apache Airflow, or other task scheduling software
Time spent working in a batch data processing environment, or another environment where large data jobs are performed using automation
Experience with Git version control systems - GitLab and/or GitHub
Nice to haves, but not required
C/C++ programming experience
Experience with and knowledge of US and International market data
Experience with multicast market data
Experience with monitoring and visualization systems such as Nagios, Prometheus, and Grafana
Experience with time series databases
Experience with AWS S3
Experience with Docker and Kubernetes