*Hybrid, 3 days onsite, 2 days remote*
*We are unable to sponsor, as this is a permanent, full-time role*
A prestigious company is looking for a Data Analytics Engineer. This role requires 3 years of experience working as a data analytics engineer with SQL, Python, Alteryx, AWS/Azure, and Databricks or Snowflake.
Responsibilities:
- Support the design and implementation of cloud infrastructure for an internal analytics zone in collaboration with the Data Platform team, data architects, DevOps, and IT
- Assist in building, testing, and deploying the semantic layer's virtual and physical data models that simplify complex semi-structured data, eliminate multiple definitions of similar data, create query-friendly datasets, and standardize column naming for downstream users developing quantitative analytics, dashboards, and internal risk applications
- Assist in maintaining performance and accuracy SLAs for the semantic layer and other data products through observability practices, ensuring proactive detection of system failures and incident response
- Work with upstream data producers to understand how their systems work, how they generate data, and how that is subject to change over time, in order to help manage schema drift
- Create documentation and testing to ensure data lineage is traceable and semantic layer components are easily discoverable and useful to business users
- Support the implementation of ETL and data serving solutions for large datasets generated by our risk models that meet critical business user SLAs around latency and access patterns
- Promote self-service capabilities and data literacy for business users leveraging the semantic layer, other analytics platforms (e.g., Tableau, Python), and CI/CD tools
- Invest in your continued learning of data engineering best practices, cloud computing, the options trading industry, and financial risk management, with an eye toward improving the maintainability, reliability, and utility of our analytics infrastructure
- Assist risk analysts in solving their analytics questions and challenges, and support ad hoc development with them as needed
Qualifications:
- Bachelor's or Master's degree in a quantitative discipline (e.g., Statistics, Computer Science, Mathematics, Physics, Data Science, Electrical Engineering, Information Systems) or equivalent professional experience
- 3+ years of experience as a data engineer, software engineer, data scientist, financial risk analyst, or business intelligence analyst
- Ability to collaborate with multiple partners (e.g., business users, Data and Solution Architects, Data Governance and IT teams, Data Platform Team, Systems & Infrastructure, Security, DevOps, Networking) to craft solutions that align business goals with internal processes, security, and delivery standards
- Ability to write and optimize complex analytical (SELECT) SQL queries
- Ability to write and optimize Python for custom data pipeline code (virtual environments, scripts vs. modules vs. packages, functional programming, unit testing)
- Experience with a source code version control system, branch management, and pull requests (preferably Git)
- Experience with data visualization/prep tools (preferably Tableau and Alteryx)