Data Engineer with Cloud (AWS and GCP)

Fixed Income Firm #016

Job Responsibilities:

· Build and run our data platform using technologies such as public cloud infrastructure (AWS and GCP), Kafka, databases, and containers

· Develop our data science platform based on the Python data science stack.

· Work closely with our Data Science teams to build and productionize revenue-generating ML models and datasets.

· Build and run ETL pipelines to onboard data into the platform, define schemas, build DAG processing pipelines, and monitor data quality.

· Manage and run mission-critical production services.


Job Requirements:

· Good core programming skills in one of: Python, C++, Node.js, or Java

· Strong experience working with SQL and databases such as MySQL, PostgreSQL, or SQL Server

· Strong understanding of scalable data query engines, e.g. BigQuery, Presto, Snowflake, Spark

· Experience with ETL tooling such as Airflow or Prefect

· Experience using stream processing platforms such as Kafka, Pulsar, Storm, or Kinesis

· Familiarity with the Python data science stack, e.g. Jupyter, pandas, scikit-learn, Dask, TensorFlow

· Broad exposure to at least one cloud platform: AWS, Google Cloud, or Azure

· Strong understanding of Linux. Windows a plus.

· Strong proclivity for automation and DevOps practices

· Experience managing increasing data volume, velocity, and variety

· Financial Services experience a plus but not required.

· BS or higher in a technical field: CS, Physics, Mathematics, etc.

To apply for this job, email your details to
