Data Infrastructure Engineer

Phoenix Labs is looking to hire a Data Infrastructure Engineer to help expand our analytics pipeline and tools. A successful candidate will have strong communication skills, be able to work side by side with data analysts, developers, and designers, and be passionate about large-scale ETL pipelines. The Data Infrastructure Engineer will be responsible for designing and implementing robust, secure, and scalable data transport services, as well as services that interpret data in real time and integrate it into a variety of applications.

Phoenix Labs has studios in Vancouver, BC and San Mateo, CA. Applicants must live within commuting distance of either studio. This is a full-time, permanent position.

Responsibilities:

  • Maintain and extend our streaming and batch analytics pipeline
  • Extend systems for collating disparate data sources into data warehouses
  • Integrate third-party services into our data pipeline
  • Work with teammates to optimize their workflows by building/extending internal tooling
  • Build systems that leverage data to provide customized player experiences
  • Utilize machine learning to inform the behaviour of other services in real time

Requirements:

  • BSc in Computing Science or equivalent experience
  • Self-motivated and eager to learn
  • Excellent lateral thinking skills
  • Experience developing in Python or an equivalent language
  • Experience building scalable solutions to support data-intensive workloads
  • Experience working with petabyte-scale datasets
  • Demonstrable knowledge of automated testing
  • Passion for multiplayer gaming
  • M.L.A. - Must Love Acronyms

Preferred:

  • Working knowledge of statistics and mathematics
  • Experience with BigQuery, Kafka, or similar tools
  • Familiarity with machine learning and feature engineering
  • Experience with Python’s scientific computing libraries: NumPy, scikit-learn, pandas
  • Familiarity with the R programming language
  • Experience deploying and supporting applications on cloud platforms (GCP, AWS)
  • Experience with modern DevOps tooling: Terraform, Docker, Kubernetes
  • Familiarity with data storage and sharing regulations like GDPR