Responsibilities:
- Design and implement data models and data architecture for both structured and unstructured data.
- Build data quality rules, data governance practices, and supporting tools from the start.
- Model complex business and functional processes as logical and physical data models.
- Oversee the design, development, and maintenance of ETL and ELT processes.
- Work closely with business units and other technology teams to gather data integration and reporting requirements.
- Continuously assess and optimize existing data pipelines for performance, reliability, and cost-effectiveness.
- Evaluate and implement new tools and technologies that can improve data engineering processes.
- Ensure thorough documentation of data processes, systems, and architecture.

Qualifications:
- Proficiency in SQL and experience with programming languages such as Python or Scala.
- Familiarity with data warehousing solutions (e.g., Snowflake) and related data technologies (e.g., Apache Spark, dbt).
- Experience with cloud platforms (preferably Azure).
- Strong understanding of data modeling techniques and principles.
- Ability to manage multiple projects, prioritize tasks, and meet deadlines.
- Strong verbal and written communication skills to articulate complex concepts.
- Ability to work collaboratively with technical and non-technical stakeholders.
- Proficiency in troubleshooting data issues and optimizing data workflows.
- Experience working with buy-side financial services firms.
- Familiarity with financial products such as Equities and Fixed Income, and with business functions such as Accounting, Risk Management, and Regulatory Reporting.
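To illustrate the "build data quality rules" responsibility, here is a minimal Python sketch of declarative, rules-as-functions quality checks. The rule names, record fields, and overall structure are illustrative assumptions for this posting, not a specific framework or the firm's actual implementation.

```python
# A minimal sketch of declarative data-quality rules (illustrative only).
from typing import Callable

# Each rule maps a record (dict) to True when the record passes the check.
Rule = Callable[[dict], bool]

RULES: dict[str, Rule] = {
    "isin_present": lambda r: bool(r.get("isin")),
    "price_positive": lambda r: r.get("price", 0) > 0,
}

def run_quality_checks(records: list[dict]) -> dict[str, int]:
    """Return the number of failing records per rule."""
    failures = {name: 0 for name in RULES}
    for record in records:
        for name, rule in RULES.items():
            if not rule(record):
                failures[name] += 1
    return failures

trades = [
    {"isin": "US0378331005", "price": 189.25},
    {"isin": "", "price": 42.10},           # fails isin_present
    {"isin": "GB0002634946", "price": -1},  # fails price_positive
]
print(run_quality_checks(trades))  # {'isin_present': 1, 'price_positive': 1}
```

Keeping rules as named, independent functions makes it easy to report failures per rule and to add new checks without touching the pipeline code, which is the same idea that tools like dbt tests apply at warehouse scale.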