An Insurance Technology firm is building a Big Data analytics platform for its flagship product, within a team of Data Engineers and Data Scientists.
You'll be developing Hadoop applications to exploit unique collections of the group's data, taking advantage of TDD, pair programming and continuous delivery/automation. You'll also be building intuitive APIs and services that enable users to interact easily with these applications.
Required Skills and Experience:
- Scala, Java, Kotlin or Python development
- Spark Streaming
- Hadoop Ecosystem
- Degree in Computer Science or a mathematics-based subject
- Experience with large data volumes
- Apache NiFi / Kafka
- Shell Scripting experience
- Apache Giraph
- Working implementation knowledge of Spark MLlib
This is a great opportunity for a strong Data Engineer, with an emphasis on collaboration, to use market-leading technologies in a unique environment and help build a successful Artificial Intelligence-focused Big Data analytics platform and applications.
If you would like to hear more, please send your updated CV and I'll be in touch.