Data Engineer

ROR

Are you a skilled Data Engineer looking for a remote-first opportunity to build and maintain cutting-edge data systems? We're seeking a talented individual to join our team and contribute to our data-driven success.

As a Data Engineer, you'll play a crucial role in designing, implementing, and optimizing our data infrastructure. You'll work collaboratively with product owners and operations, ensuring our data systems are scalable, reliable, and compliant.

What you'll do:

  • Collaborate with cross-functional teams, including product owners and operations, to understand data needs and deliver robust solutions.
  • Design and implement efficient ETL and ELT data pipelines.
  • Build and maintain best-in-class data systems, focusing on performance and reliability.
  • Administer and optimize our data warehouse in BigQuery (or a comparable cloud-based analytical data warehouse).
  • Develop and maintain data pipelines between various data systems and our data warehouse using Python, Airflow, and other integration platforms (a minimal sketch of this kind of pipeline follows this list).
  • Create and maintain comprehensive documentation for data architecture and systems.
  • Ship production-quality features independently, considering scalability, reliability, maintainability, and compliance.
  • Conduct code reviews to ensure high-quality, maintainable code.
  • Continuously learn and stay up-to-date with advancements in data engineering practices.
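
To give a concrete flavor of the pipeline work above, here is a minimal sketch of an ELT-style Airflow DAG that loads data into BigQuery and then transforms it in-warehouse. It assumes Airflow 2.4+ with the Google provider installed; the DAG id, task names, and SQL are placeholders, not our production code.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    def extract_and_load(**context):
        # Placeholder: pull rows from a source system and load them into
        # a BigQuery staging table, e.g. with the google-cloud-bigquery
        # client library.
        ...

    with DAG(
        dag_id="orders_elt",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",    # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        load = PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )

        # Transform inside the warehouse (the "T" in ELT), here as a plain
        # SQL job; in practice this step is often a dbt run instead.
        transform = BigQueryInsertJobOperator(
            task_id="transform",
            configuration={
                "query": {
                    "query": "SELECT ...",  # placeholder transformation SQL
                    "useLegacySql": False,
                }
            },
        )

        load >> transform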

What you'll bring:

  • A Bachelor's Degree or equivalent in Information Systems, Computer Science, or a related field, plus 2 years of progressive experience in IT or engineering-related positions.
  • Proficiency with SQL and BigQuery, or a comparable cloud-based analytical data warehouse.
  • Experience using Python for data processing, including common data-focused packages such as Pandas (see the short sketch after this list).
  • Strong understanding of the extract-load-transform (ELT) pattern and best practices.
  • Experience building data pipelines with Airflow and dbt (data build tool) or comparable technologies.
  • Knowledge of data modeling and the Kimball data warehousing methodology.
  • Familiarity with Git and command-line tools.
  • Experience with AWS and CloudFormation, or comparable cloud services and Infrastructure as Code (IaC) tools.
  • Experience with Docker and Docker Compose.
  • Excellent communication and collaboration skills, both verbal and written.
  • The ability to work efficiently and productively in a remote-first environment and learn independently.
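
For a sense of the day-to-day Python work behind that list, here is a small, self-contained sketch of a typical staging-style cleanup with Pandas. The file name and column names are invented for the example.

    import pandas as pd

    # Hypothetical raw export; the file and columns are invented here.
    orders = pd.read_csv("orders_raw.csv", parse_dates=["ordered_at"])

    # Light cleanup typical of a staging step: dedupe on the business key
    # and normalize a free-text status column.
    orders = orders.drop_duplicates(subset="order_id")
    orders["status"] = orders["status"].str.strip().str.lower()

    # Simple daily aggregate as a sanity check before loading downstream.
    daily = (
        orders.groupby(orders["ordered_at"].dt.date)["amount"]
        .sum()
        .rename("daily_revenue")
    )
    print(daily.head())

In an ELT setting, transforms like the aggregate above would usually live in the warehouse (for example as dbt models), with Python handling extraction and loading.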

Salary: $80,000.00 per year

This position may be performed remotely.

If you're passionate about data and thrive in a dynamic, remote-first setting, we encourage you to apply!