Archived project - Real-time Data Engineer (f/m/d) wanted in Berlin/Remote!
Company name visible to PREMIUM members
- Juni 2024
- Dezember 2024
- Greater Berlin area, Germany
- on request
- Remote
- 14.05.2024
- 3658
Project description
For a project, we are looking for a Real-time Data Engineer (f/m/d).
Your tasks will be:
• Design, develop, and maintain real-time data pipelines and streaming applications.
• Architect and implement streaming data ingestion processes from various sources, ensuring reliability, scalability, and low-latency processing.
• Optimize and tune performance of real-time data processing systems for efficiency and throughput.
• Implement monitoring, alerting, and logging solutions to ensure the health and reliability of real-time data infrastructure.
• Develop, maintain, and use deployment pipelines (following the infrastructure-as-code paradigm)
• Produce clean, efficient code based on specifications and guidelines
• Pick up assigned software development tracks and incidents in a self-directed manner
• Collaborate with peers in the assigned projects, such as architects, other members of the Scrum product team, experts, and data architects
• Professionally maintain all software and release updates regularly to address customer and company concerns
• Analyze and test programs and products before formal launch
• Troubleshoot coding problems quickly and efficiently to ensure a productive workplace
• Actively seek ways to improve business software processes and interactions
• Prepare training materials and deliver training to other project team members on the use of software applications
• Join daily stand-up meetings and project meetings, on site or remotely, and work in a scaled Scrum environment
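The tasks above revolve around consume-transform-emit streaming stages with monitoring. As a rough, framework-neutral illustration (using only the Python standard library rather than Kafka or Flink, and with invented names such as `run_stage`), such a stage with basic success/failure metrics might look like:

```python
import json
import queue
import threading

def run_stage(source: queue.Queue, sink: queue.Queue, metrics: dict) -> None:
    """Consume raw events, transform them, and emit them downstream.

    A None item serves as a shutdown sentinel; bad records are counted
    instead of crashing the stage, so they can feed an alerting rule.
    """
    while True:
        raw = source.get()
        if raw is None:  # shutdown sentinel
            break
        try:
            event = json.loads(raw)
            event["value_doubled"] = event["value"] * 2  # example transform
            sink.put(event)
            metrics["processed"] += 1
        except (json.JSONDecodeError, KeyError):
            metrics["failed"] += 1  # malformed record: count it for monitoring

# Feed a few events (one of them malformed) through the stage.
src, dst = queue.Queue(), queue.Queue()
metrics = {"processed": 0, "failed": 0}
worker = threading.Thread(target=run_stage, args=(src, dst, metrics))
worker.start()
for item in ['{"value": 1}', '{"value": 2}', "not json"]:
    src.put(item)
src.put(None)
worker.join()
print(metrics)  # {'processed': 2, 'failed': 1}
```

In a production pipeline the queues would be replaced by Kafka topics (or Flink/Spark Streaming operators) and the counters by exported metrics, but the loop structure is the same.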
Your Know-How:
• Bachelor’s degree or higher in Computer Science, Engineering, or related field.
• 5 years of experience in data engineering, with a focus on real-time data processing.
• Proficiency in programming languages such as Python, Java, or Scala.
• Strong experience with real-time data processing frameworks such as Apache Kafka, Apache Flink, or Apache Spark Streaming.
• Solid understanding of distributed computing principles and microservices architecture.
• Experience with on-premise Kubernetes platforms as well as with cloud platforms such as AWS, GCP, or Azure.
• Fluent German and English language skills (Level C1)
Order type: Contract
Location: Remote, Berlin
Start: 20.06.2024
Duration: 31.12.2024 (extension until 2026 possible)
If you are interested, please let us know your hourly rate and your availability.
We look forward to receiving your application in an MS Word-readable format, quoting reference number 3648.
Categories and skills
IT, Development
Research, Science, Education