TeizoSoft Private Limited

Lead Data Engineer - ETL

Job Location

Hyderabad, India

Job Description

Key Responsibilities:

- Design, develop, and maintain scalable, efficient data pipelines using Python, Spark, and Airflow.
- Implement ETL/ELT processes to ingest, transform, and load data from various sources into Snowflake.
- Optimize data pipelines for performance, reliability, and maintainability.
- Troubleshoot and resolve data pipeline issues and performance bottlenecks.
- Design and implement data warehousing solutions using Snowflake.
- Manage and optimize AWS cloud infrastructure for data engineering workloads.
- Ensure data security and compliance with industry best practices.
- Implement and manage data governance and data quality processes.
- Lead and mentor a team of data engineers, providing technical guidance and support.
- Foster a collaborative, high-performance team environment.
- Conduct code reviews and ensure adherence to coding standards.
- Participate in recruitment and onboarding processes.
- Use DBT (Data Build Tool) for data transformation and modeling.
- Develop and maintain data models for analytical and reporting purposes.
- Ensure data consistency and accuracy across the data warehouse.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Optimize SQL queries for performance and efficiency.
- Design and implement database schemas and data structures.
- Automate data pipeline deployment and monitoring.
- Use Airflow for workflow orchestration and scheduling.
- Implement CI/CD pipelines for data engineering tasks.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Communicate effectively with technical and non-technical stakeholders.
- Document data pipelines, processes, and infrastructure.

Mandatory Skill Set:

- Python: Advanced proficiency in Python for data engineering tasks.
- Spark: Extensive experience with Apache Spark for large-scale data processing.
- SQL: Strong SQL skills for data manipulation and analysis.
- Snowflake: Proven experience designing and implementing data warehousing solutions on Snowflake.
- Airflow: Experience with Apache Airflow for workflow orchestration.
- AWS: Hands-on experience with AWS cloud services related to data engineering.
- DBT (Data Build Tool): Experience with DBT for data transformation and modeling.

Preferred Qualifications:

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience with other cloud platforms (GCP, Azure).
- Knowledge of data governance and data quality best practices.
- Experience with containerization (Docker, Kubernetes).
- Experience with real-time data streaming.
- Experience with CI/CD tools.
- Experience with data visualization tools.

Key Competencies:

- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Strong leadership and mentorship abilities.
- Ability to manage multiple projects and priorities.
- Strong understanding of the software development lifecycle.

(ref:hirist.tech)


Posted Date: 5/1/2025

Contact Information

Contact Human Resources
TeizoSoft Private Limited

UID: 5114729770
