Data Solutions Architect

Global Growth Equity Investor #019


The Data Solutions Architect will have the following responsibilities:

Responsible for architecture and design across data solutions as part of the Data Strategy team.
Conduct architecture and design reviews of new code and processes, centered on cloud services and solutions.
Design and implement reusable frameworks and methods that can be leveraged for ingestion, transformations, data quality, reporting and analytics.
Evaluate and/or suggest new products, tools, and methodologies to the data team.
Architect and design data solutions using technologies including, but not limited to, Python, Spark, dbt, Snowflake, Azure services, and Databricks.
Leverage Infrastructure-as-Code to build and deploy infrastructure and data pipelines.
Define and develop standards, configurations, and operational/administration guidelines for data, providing it as services or capabilities to Data Science and App Dev teams.
Work closely with the team to help triage data issues and improve the design of solutions.
Design a future-state architecture for the data platform incorporating modern technologies and methodologies.
Modernize legacy applications by migrating them from on-premises to the cloud and off legacy ETL tools.
Lead team discussions around design and development on overall data platform delivery, including data ingestion & preparation, data design & development, and data analytics & reporting.
Work closely with the leads to plan and execute delivery of data projects and capabilities.


Bachelor’s degree in Computer Science, Management Information Systems, or equivalent working experience in information technology.
5+ years as a solutions architect, with solid cloud experience on Azure/AWS/GCP.
At least 3 years of experience working in the cloud (Azure/AWS/GCP).
Experience in the finance industry preferred.
Demonstrated experience in learning, working with, and adopting new technology.
Good grasp of new technologies and approaches, with the ability to be hands-on and build POCs.
Extensive experience and competency in the design and development of cloud-based data integration tools and open-source languages, especially Python, Spark, Databricks, and Snowflake.
Proven expertise in architecting different aspects of a modern data platform, including but not limited to a Lakehouse, Data Engineering, Data Governance, Security, and Reporting and Analytics.
Solid understanding of containers and Kubernetes.
Experience designing and implementing event-based, real-time, or streaming architectures.
Ability to design and lead teams on CI/CD pipelines using Azure DevOps and Terraform as Infrastructure as Code.
Ability to train fellow team members and mentor junior team members on development, architecture and cloud solutions.
Solid experience leading delivery teams and projects.
Demonstrated self-motivation to plan and execute tasks with minimal direction and drive for exceptional, high-quality results.
A strong work ethic and ‘can do’ attitude: motivated, flexible nature, team-player spirit.
Strong written and verbal communication and interpersonal skills.
Ability to communicate complex technical concepts to product owners, business partners, and IT management.
Demonstrated ability to structure communication to promote a proposed idea or solution.
Ability to identify areas for improvement and present and implement viable solutions.
Strong comfort and experience liaising with internal and external contacts.
