Ref: 12072021DENG

Data Engineer

Poland, Masovia

Job description

Investment Banking Engineering - Data Engineer - CRM Platform

What We Do

At Goldman Sachs, our Engineers don't just make things - we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets.

Engineering, which comprises our Technology Division and global strategists groups, is at the critical center of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here.

Who We Look For

Goldman Sachs Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile, and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

We are looking for a Data Engineer to join a strong engineering team focused on CRM & Salesforce platform adoption. This is an opportunity to be part of a team responsible for building a brand-new, successful, sustainable, and strategic CRM product for 3,000+ users in the Investment Banking business.

Responsibilities:
  • As a data engineer on a multi-disciplinary team, build and support a robust data pipeline that moves data between Salesforce CRM and multiple backends hosted on-premises and in AWS
  • Build robust data models with an eye on optimizing the data platform and reducing complexity and inefficiency in the process
  • Employ best practices when implementing data pipelines, transformations, and modeling
  • Proactively employ modern techniques to debug data issues
  • Work collaboratively with cross-functional colleagues to improve data quality and build a cohesive data economy within GS
  • Be a role model in driving data governance, data quality, and modern data practices in the work areas you own

Qualifications:
  • Deep working knowledge of Informatica Cloud and other cloud ETL tools such as AWS Data Pipeline
  • Have previously used Salesforce and interacted with its object model
  • Working knowledge of cloud data warehouses, specifically Snowflake
  • Mastery of SQL, data modeling, and data warehousing concepts
  • Deep expertise in programming in Python and in data analysis libraries such as pandas
  • Working knowledge of public cloud platforms such as AWS
  • Experience with DataOps and CI/CD

Preferred Qualifications:
  • Experience with open-source technologies such as Airflow, Beam, Flink, and dbt
  • Experience with data visualization/analytics tools such as Tableau CRM
  • Worked on event streaming pipelines using Kafka
  • Deep experience with NoSQL and relational databases such as MongoDB, Cassandra, Postgres, and MS SQL Server
  • Experience working in highly regulated environments