Responsibilities:
• Lead data engineering initiatives, providing technical guidance and mentorship to the data engineering team.
• Drive innovation initiatives, explore new technologies and tools, and propose innovative solutions to improve efficiency, productivity, and customer satisfaction.
• Design and implement robust data pipelines, ensuring scalability, performance, and reliability.
• Demonstrate strong proficiency in SQL, including writing complex queries, developing stored procedures, and optimizing database performance.
• Utilize extensive experience with GCP services such as BigQuery, Dataflow, Pub/Sub, and Datastream for data storage, querying, and analysis.
• Implement data quality checks, monitoring mechanisms, and data governance practices to ensure high-quality and reliable data.
• Collaborate with cross-functional teams to understand data requirements, provide technical expertise, and deliver impactful data solutions.
• Foster a positive and collaborative team environment, conduct technical training sessions, and mentor team members to enhance their technical skills and career growth.
Desired Profile:
• Proven hands-on experience as a Data Engineer, including leading data engineering initiatives.
• Strong proficiency in SQL.
• Extensive experience with BigQuery, Dataflow, Pub/Sub, Datastream, and similar GCP services.
• Proficiency in Python for data manipulation, scripting, automation, and building data processing workflows.
• Hands-on experience with Apache Airflow or similar workflow orchestration tools.
• Terraform and Tekton skills are good to have.
• Strong problem-solving skills, analytical thinking, and attention to detail.
• Excellent communication, collaboration, and leadership skills.
• Ability to work effectively in a fast-paced environment.