
Senior Data Engineer


Experience Level: Senior (5+ Years)

Job Overview:

We are seeking a skilled and experienced Senior Data Engineer to join our team. The ideal candidate will have a deep understanding of data architecture, data warehousing, and ETL processes. You will be responsible for designing, developing, and maintaining scalable data pipelines and systems that support our organization's data-driven decision-making. Your expertise will be crucial in ensuring data integrity, availability, and performance across various platforms.

Key Responsibilities:

  • Design and Develop Data Pipelines: Create, optimize, and maintain robust ETL processes to move data from various sources into our data warehouses, ensuring data is clean, accurate, and accessible.
  • Data Modeling: Design and implement efficient data models that support reporting, analytics, and other business requirements.
  • Data Warehousing: Develop and manage data warehouse solutions, ensuring they are optimized for performance and scalability.
  • Data Integration: Integrate data from multiple sources, both internal and external, into a unified and cohesive data system.
  • Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand data needs and translate them into technical requirements.
  • Performance Tuning: Monitor and optimize the performance of data systems and pipelines to ensure low latency and high availability.
  • Data Governance: Implement and enforce best practices for data governance, including data security, privacy, and compliance.
  • Automation: Automate repetitive tasks and data processes to improve efficiency and reduce manual intervention.
  • Mentorship: Provide guidance and mentorship to junior data engineers, helping them grow their skills and knowledge.

Required Qualifications:

  • Experience: 5+ years of experience in data engineering or a related field.
  • Technical Skills:
    • Proficiency in programming languages such as Python, Java, or Scala.
    • Strong experience with SQL and database technologies (e.g., PostgreSQL, MySQL, Oracle).
    • Expertise in ETL tools and frameworks (e.g., Apache NiFi, Apache Airflow, Talend).
    • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services (e.g., Redshift, BigQuery, Azure SQL Data Warehouse).
    • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka).
    • Knowledge of data modeling and data warehousing concepts.
  • Tools: Experience with data visualization tools (e.g., Tableau, Power BI) and version control systems (e.g., Git).
  • Soft Skills: Strong problem-solving skills, attention to detail, and the ability to communicate complex technical concepts to non-technical stakeholders.