Digihelic Solutions Private Limited

Data Analyst - Azure/Python/SQL

Job Location

Noida, India

Job Description

About the Role:

We are seeking a highly skilled and experienced Data Engineer to join our team in Noida for an Etihad Time Box requirement. This role demands expertise in building robust and scalable data pipelines, with a strong focus on Azure Databricks, Python, JavaScript, and Cosmos DB. The ideal candidate will have a proven track record of designing, developing, and deploying data solutions in a fast-paced environment. This is a time-sensitive project requiring immediate contribution and delivery.

Responsibilities:

Data Pipeline Development:
- Design, develop, and implement efficient data pipelines using Azure Databricks, Python, and JavaScript.
- Ingest, transform, and load data from various sources into Cosmos DB and other target systems.
- Optimize data pipelines for performance, scalability, and reliability.

Azure Databricks Expertise:
- Leverage Azure Databricks for large-scale data processing and analytics.
- Develop and maintain Databricks notebooks and jobs for data transformation and analysis.
- Optimize Databricks clusters for performance and cost efficiency.

Cosmos DB Integration:
- Design and implement data models for Cosmos DB based on business requirements.
- Develop data ingestion and retrieval processes for Cosmos DB.
- Optimize Cosmos DB queries for performance.

Python and JavaScript Development:
- Develop Python scripts for data processing, automation, and integration.
- Use JavaScript where needed for data manipulation and web-based interactions.

Data Analysis and Troubleshooting:
- Analyze data to identify patterns, trends, and anomalies.
- Troubleshoot data pipeline and system issues.
- Ensure data quality and consistency.

Collaboration and Communication:
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements.
- Communicate effectively with technical and non-technical audiences.
- Provide regular updates on project progress.
Time Box Delivery:
- Work within tight deadlines to deliver high-quality code.
- Adapt quickly to changing requirements.
- Prioritize tasks effectively.

Required Skills:

Experience:
- 5 years of experience in data engineering.

Azure Databricks:
- Strong proficiency in Azure Databricks (Spark, Delta Lake).
- Experience with Databricks notebooks, jobs, and workflows.
- Knowledge of Databricks cluster management.

Programming Languages:
- Expertise in Python for data processing and automation.
- Experience with JavaScript.
- Strong SQL skills.

Cosmos DB:
- Experience with Azure Cosmos DB.
- Knowledge of Cosmos DB data modeling and query optimization.

Data Engineering Concepts:
- Strong understanding of data warehousing and ETL/ELT concepts.
- Experience with data modeling and schema design.
- Experience with data governance concepts.

Problem-Solving:
- Excellent analytical and problem-solving skills.

Communication:
- Strong verbal and written communication skills.

Preferred Skills:
- Experience with other Azure data services (e.g., Azure Data Factory, Azure Event Hubs).
- Experience with CI/CD pipelines.
- Knowledge of cloud security best practices.
- Experience with airline industry data.

Personal Attributes:
- Highly motivated and self-directed.
- Ability to work effectively in a fast-paced environment.
- Strong team player with the ability to collaborate effectively.
- Detail-oriented and organized.
- Ability to deliver under time constraints.
- Strong sense of urgency.

(ref:hirist.tech)

Location: Noida, IN

Posted Date: 5/1/2025

Contact Information

Contact Human Resources
Digihelic Solutions Private Limited

UID: 5105610715
