Sr. Data Engineer- AWS
Our client is a leading global asset management firm with more than $937B* in assets under management, providing retail and institutional clients a diverse and comprehensive range of investment capabilities to help people get more out of life. The firm is publicly traded on the New York Stock Exchange (IVZ) and has about 7,000 employees in over 20 countries.
Key Responsibilities / Duties:
- Collaborate with business subject matter experts and technology teams to make strategic recommendations on data collection, integration and retention.
- Build ETL and DQ pipelines for both structured and unstructured data from thousands of proprietary and third-party sources, using advanced data management tools and big data technologies.
- Develop usage and access control policies/systems and partner with other stakeholders in continuous improvement processes impacting data quality.
- Obsess over building fault-free capabilities that can be scaled indefinitely across sales & marketing.
- Work closely with other stakeholder groups to validate findings (e.g., with business partners) in a hypothesis-driven approach.
- Work with ambiguity (e.g., imperfect data, loosely defined concepts, ideas, or goals) and translate it into more tangible outputs.
- Communicate effectively and advocate for change.
- Present and depict recommendations so that business stakeholders can make data-informed decisions.
- Convey complex solutions in easy-to-understand terms through data visualization and clear written and verbal communication.
- Actively lead the organization in testing and buying into novel solutions to important problems.
- Suggest metrics on business outcomes.
Work Experience / Knowledge:
- All experience must be hands-on and technical.
- AWS certifications: Developer Associate, DevOps Engineer Professional, Cloud Practitioner, and Big Data Specialty.
- Working knowledge of basic undergraduate level Statistics.
- 10+ years of experience with big data engineering technologies.
- 6+ years of enterprise experience using the AWS stack (private and public VPCs, Airflow, PySpark, Scala, Kinesis, Direct Connect, Redshift, Redshift Spectrum, S3, Glue, Lambda, EMR).
- 10+ years of enterprise knowledge and experience with data management tools including SQL/RDBMS (Oracle, MySQL).
- 5+ years of enterprise knowledge and experience with NoSQL (e.g., MongoDB, Elasticsearch, HBase).
- 10+ years of experience with SQL.
- 10+ years of experience applying object-oriented principles in a strongly typed language such as C++ or Java.
- 3+ years of experience with Python.
- Expert at navigating and developing in a UNIX environment.
- Expert with enterprise big data builds with high availability and disaster recovery using Amazon Web Services (AWS).
- Expert at linking multiple data platforms (social media, news, email, etc.).
- Expert at building data marts for visualization tools (e.g., Power BI, MicroStrategy, Qlik, Tableau).
- Experience with Agile software development and related tooling (Jira, Confluence, Bitbucket).