An AMAZING job opportunity for a Big Data Engineer with a global sustainability software company in Stuttgart, Germany! Their software helps companies around the world drive product innovation, improve lifecycle assessments and increase their market value.
What you will be doing:
- Planning, designing and developing advanced cloud applications using modern big data technologies – Hadoop, Apache Kafka, Apache Spark, Apache Hive and Presto
- Restructuring existing code
- Evaluating technologies and deriving actionable recommendations
- Collaborating closely with the Scrum team
What you need to have:
- 3+ years of experience in software engineering with Big Data Technologies (Hadoop, Apache Kafka, Spark, Hive, Presto)
- At least 2 years of experience in product development
- Experienced in software architecture, performance optimisation, modern web frameworks, data modelling, Big Data, and server, network and hosting infrastructure
- Experienced in databases and the current SQL standards
- Experienced in building enterprise business applications
- Excellent knowledge of English
- German language knowledge is NOT necessary
A VERY attractive salary and benefits package is on offer for the right person.
If this sounds like you, we want to hear from you immediately. Click apply, email firstname.lastname@example.org or call +353 87 096 3282.
Following your application for this specific role, Ingenio may contact you regarding other positions that we feel you may be suitable for. If you do not wish to be contacted about other opportunities please let us know. For further information please refer to the Privacy Statement on our website.