All Data Engineers and DevOps Consultants, please apply here.
We are looking for Data Engineers and DevOps specialists with extensive experience solving complex data engineering and architecture challenges, preferably across more than one of the three major cloud platforms: Microsoft Azure, AWS, and Google Cloud.
What you will be doing:
- Building the infrastructure required for optimal extraction and loading of data from a wide variety of sources, using SQL, Snowflake, and the Azure, AWS, and Google Cloud 'big data' technology stacks.
- Designing and implementing major architectural patterns for data to ensure consistency, security, maintainability, and flexibility, leveraging cloud platforms and solutions.
- Designing, developing, and building security architecture for public and/or hybrid cloud-based systems on Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform.
Data Engineer Skill Set:
- Processing frameworks & programming tools: Spark (Scala/Python/Java), Kafka
- Hadoop platforms & distributions: Cloudera, Hortonworks, MapR, EMR
- Agile and DevOps delivery practices
- Data modelling & data pipeline design
- Cloud platforms: developing data engineering pipelines on AWS, Azure, and GCP, including with cloud PaaS components and Databricks
DevOps Skill Set:
We are looking for a DevOps engineer to work with an established data, analytics, and cloud practice across a number of projects. You will be responsible for building, improving, and maintaining the cloud and big data infrastructure, enabling services to be deployed, run, and operated from development through to production. You will have the chance to work on a variety of cloud and big data projects with highly robust and scalable infrastructure, using some of the most cutting-edge systems and tools, including AWS, Terraform, and Docker.
- Expert skills in Amazon Web Services (EC2, S3, EMR, ELB, Elastic Beanstalk, VPC, Kinesis, Route 53, security groups)
- Solid background in DevOps engineering (load testing, continuous integration, change management, application monitoring, production support)
- Strong experience with automation and configuration management tools (Jenkins, Ansible, Chef)
- Experience with scripting languages (Python, Scala, Go)
- Expert skills in UNIX
- Good understanding of networks and information security
- Ability to work in a fast-paced, dynamic environment
- Any data warehousing, big data, or data analytics experience is a huge bonus, but not essential.