- You’re an expert in data engineering and data quality
- You’re a strong developer, and you’ve used SQL/NoSQL to handle large data sets and complex data transformations.
- Experience with big data platforms such as Apache Spark and Hadoop
- Experience with cloud solutions for Big Data (Snowflake, GCP BigQuery, AWS Redshift) is a plus
- Experience with data pipelines (e.g. Airflow) and streaming processing (Kafka, Kinesis, Spark Streaming, Flink)
- Knowledge of Java and/or Python
- Experience supporting Data Scientists (Machine Learning) is a plus
- It’s essential that you’ve got experience of the full SDLC in an equivalent environment.
- You have worked in small, focused Scrum teams delivering event-driven integrations across multiple teams.
- You’re experienced in working within an integration environment alongside testers to ensure end-to-end performance and resilience SLAs can be achieved.
- Experience in writing well-designed, testable, efficient code that follows good coding standards
- Agile mindset and practical experience with software development processes, e.g. Scrum, Kanban, TDD, BDD
- You have experience mentoring other engineers (if you’re applying for a Senior role)
- Experience in the betting and casino domain