ROLLER is not your average software-as-a-service company. With a global presence in over 30 countries, we're here to bring some excitement to the leisure and attractions industry and make a difference! Our mission is to help businesses operate smoothly and create fun and memorable guest experiences by providing seamless ticketing, point-of-sale, self-serve kiosks, memberships and digital waiver processes.
But here's the best part: our team. We're a group of 160+ highly passionate, enthusiastic, and down-to-earth professionals located all around the world who are all working together to build something truly remarkable. We're aiming high and believe that the possibilities are endless. As we continue to grow globally, we're excited to write our success story and have fun along the way.
We genuinely love what we do, and we're looking for like-minded people to join us on this amazing journey. If you're ready to be part of a dynamic team and make a real impact, come aboard, and let's create some unforgettable experiences together at ROLLER!
About the Role
We are looking for a talented Data / ML Engineer to join us in advancing our AI initiatives and optimizing our current data infrastructure. You'll work closely with various departments and stakeholders to implement AI solutions that benefit both our customers and internal operations.
You have a passion for solving complex problems with data. You're able to interact with business stakeholders, narrow down problem statements, solve problems for customers, and coordinate the development and release of features and fixes. Your strong technical skills, combined with your communication skills, allow you to be part of something greater by creating and working on best-in-class data solutions.
What You'll Do
- Contribute to the architecture and optimization of the data infrastructure using big data cloud technologies within GCP to enable data-driven decision-making and meaningful insights for our customers and internal stakeholders.
- Implement data warehousing best practices with BigQuery, including data partitioning and clustering to optimize query performance and costs.
- Design, build, and maintain ETL data pipelines using best practice methods and tooling to integrate large quantities of data from disparate internal and external sources.
- Build, optimize, and deploy data models into production for efficient query performance and analysis.
- Implement and monitor ML pipelines for continuous improvement and model retraining.
- Ensure data security, privacy, and compliance with relevant regulations (e.g., GDPR, CCPA).
- Collaborate with customer-facing product teams, internal company teams, and other business stakeholders to translate data-related needs into technical requirements and dashboard designs.
- Provide support for data-related issues and incidents, including customer support requests, investigating and resolving data discrepancies and anomalies to ensure data accuracy.
- Stay up-to-date with the latest advancements in data engineering and machine learning technologies.
About You
- Bachelor’s or Master's degree in Computer Science, Information Technology or a related field.
- 3+ years of proven experience in data engineering with cloud data warehouse technologies, with a focus on designing and delivering solutions on Google Cloud Platform.
- Strong proficiency in designing, implementing, and managing data pipelines using GCP data-focused services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Data Catalog.
- Experience using Python to develop scripts and small programs for job orchestration and/or data manipulation, including familiarity with libraries such as Pandas, NumPy, and PySpark.
- Solid understanding of machine learning algorithms and techniques (e.g., regression, classification, clustering).
- Extensive experience with SQL, with a focus on analysing and validating complex and disparate data sets.
- Proficiency with DevOps tooling, including Git, Terraform, CI/CD pipelines, and infrastructure as code.
- Comfortable working with AI and machine learning tooling and applications.
- Strong hands-on technical skills with the ability to provide guidance in troubleshooting and delivering business features independently.
- Excellent communication skills, the ability to work well within a team environment, and the ability to articulate data to key stakeholders.
Benefits
- You'll get to work on a category-leading product that customers love in a fun, high-growth industry! Check our Capterra and G2 reviews.
- 4 ROLLER Recharge days per year (that is 4 additional days of leave that we all take off together as a team to rest and recuperate)
- Engage in our 'Vibe Tribe', led by our team members, to contribute directly to company-wide initiatives. Regular events and social activities, fundraising and cause-related campaigns... you name it, we're willing to make it happen!
- Team Member Assistance Program to proactively support our team's health and wellbeing - access to coaching, education modules, weekly webinars, and more.
- 16 weeks paid Parental Leave for primary carers and 4 weeks paid Parental Leave for secondary carers.
- Work with a driven, fun, and switched-on team that likes to raise the bar in all we do!
- Individual learning & development budget plus genuine career growth opportunities as we continue to expand!
What You Can Expect
- Initial call with our Talent Acquisition Manager
You'll have an initial call with our Talent Acquisition Manager to chat through your experience to date and salary expectations, and to check off any initial questions you might have.
- Interview with the VP of AI & Data
You'll get to meet with the VP of AI & Data to learn more about the role & ROLLER whilst also talking through your experience in more detail.
- Loop Interviews
This is where you will get to meet our wider ROLLER team to do a 'vibe check' on us to make sure our culture & vibe meet what you are looking for!
- Offer
If all lights are green and the fit feels right, we'll conduct reference checks and you'll receive an offer to join!