Your primary role entails designing, developing, and maintaining our data infrastructure, pipelines, and lakehouse. You'll collaborate closely with stakeholders to craft data solutions aligned with business needs, leveraging cloud-based technologies and resolving data infrastructure issues as they arise. Staying abreast of the latest data engineering trends will be essential to continuously enhancing that infrastructure.

This position demands:

  • Minimum 7 years of experience
  • Advanced Data Warehouse Expertise
  • Comprehensive Data Processing Skills
  • Git Proficiency and Collaboration Experience
  • Cloud Services Mastery

The position is open to both onsite and remote work arrangements.

Key Responsibilities:

  • Architect, develop, and maintain scalable and efficient data pipelines and ETL processes to collect, process, and store large volumes of data from diverse sources.
  • Collaborate with data analysts, BI managers, and stakeholders to understand their data requirements and design data solutions that align with business needs.
  • Utilize cloud-based technologies like AWS to manage and deploy data solutions.
  • Identify and troubleshoot issues related to data pipelines, databases, and other data infrastructure components.
  • Implement timely solutions to ensure data integrity and reliability.
  • Stay updated with the latest trends and advancements in data engineering and related technologies.
  • Evaluate potential applications of new technologies to enhance our data infrastructure.

Qualifications:

  • Bachelor’s or Master’s degree in Information Technology, Information Systems, Computer Science, Computer Engineering, or a related field.
  • Demonstrated expertise in architecting and managing modern, cloud-native data warehouses such as Databricks.
  • Proficiency in SQL, NoSQL, and ETL tools such as AWS Glue, ensuring efficient data processing and transformation workflows.
  • Extensive experience in version control using Git and collaboration platforms like GitHub, facilitating seamless teamwork and code management.
  • In-depth knowledge and hands-on experience with cloud services, including Amazon S3 for scalable storage, Amazon RDS for relational databases, AWS Glue for ETL, Amazon Athena for querying data in S3, and Amazon Kinesis for real-time data processing.
  • Strong scripting skills in Python for automation, data manipulation, and integration tasks within data engineering workflows.
  • Proven track record of fostering teamwork and collaboration, essential for successful data engineering projects requiring cross-functional coordination.
  • Familiarity with Databricks Lakehouse architecture is highly desirable, as it demonstrates an understanding of unified data analytics platforms.
  • Possessing relevant data engineering or cloud platform certifications would be highly advantageous, validating expertise and proficiency in the field.

Job Details

Total Positions: 1 Post
Job Shift: Work from Home
Job Type:
Job Location:
Gender: No Preference
Age: 30 - 40 Years
Minimum Education: Bachelors
Career Level: Experienced Professional
Minimum Experience: 7 Years
Apply Before: Aug 15, 2024
Posting Date: Jul 19, 2024

Blueberry Tech INC

Information Technology · 11-50 employees · Islamabad, Karachi, Lahore, Multan
