Gestamp is an international group dedicated to the design, development and manufacture of metal components for the automotive industry. We currently have more than 100 plants, 13 R&D centers and more than 41,000 employees located in 21 countries.
We are looking for a highly motivated individual to join a growing team working on exciting Industry 4.0 projects. You will help build Gestamp's path towards a SMART FACTORY by integrating new technologies and maximizing the potential they offer, in order to achieve connected, smart and highly efficient manufacturing plants.
Responsibilities
We are looking for a Senior Data Software Engineer with a strong background in designing and building scalable data systems. This role is ideal for someone who enjoys hands-on technical work while mentoring others and influencing the technical direction of the team. You will be a key player in developing reliable data pipelines and systems while ensuring high performance and scalability.
About you
Skills & Knowledge
(What will you do?)
Team Leadership & Delivery: Lead and mentor a team of 4–5 engineers, setting technical direction and best practices for streaming platforms. Drive sprint planning, technical decision-making, and delivery of scalable real-time data solutions. Promote a culture of ownership, continuous improvement, and operational excellence. Collaborate cross-functionally with product, platform, and data teams to align on priorities and architecture.
Stream Processing & Fault Tolerance: Own and guide the implementation of real-time pipelines using Apache Flink, including stateful processing and complex event processing (CEP). Define best practices for checkpointing, savepoints, and exactly-once guarantees. Ensure platform reliability, resilience, and high availability.
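To illustrate the exactly-once idea behind this bullet, here is a minimal, framework-free Python sketch (all names are hypothetical; a real Flink pipeline achieves this with checkpoint barriers and transactional sinks, not a dedup set): a sink that deduplicates on event IDs, so replaying a batch after a simulated failure does not double-count.

```python
# Minimal sketch of "effectively-once" processing via an idempotent sink.
# Hypothetical names; real Flink uses checkpoint barriers + two-phase commit.

class IdempotentSink:
    """Accumulates a running total, ignoring event IDs already applied."""

    def __init__(self):
        self.seen_ids = set()
        self.total = 0

    def write(self, event_id, value):
        if event_id in self.seen_ids:
            return False          # duplicate from a replay: skip it
        self.seen_ids.add(event_id)
        self.total += value
        return True


sink = IdempotentSink()
batch = [("e1", 10), ("e2", 5)]

for eid, v in batch:
    sink.write(eid, v)

# Simulate a crash after a checkpoint: the same batch is replayed.
for eid, v in batch:
    sink.write(eid, v)

print(sink.total)  # 15 — the replay did not double-count
```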
Time Semantics & Windowing: Lead the design of event-time processing strategies, watermarking, and windowing for accurate and scalable computations. Establish standards for handling out-of-order and late-arriving data.
Messaging & Event Systems: Architect and oversee systems based on Apache Kafka (brokers, Kafka Streams, Kafka Connect, Schema Registry) and RabbitMQ (exchanges, queues, routing strategies).
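As a flavor of the RabbitMQ routing strategies mentioned here, a small hedged sketch of topic-exchange matching in plain Python (a hypothetical helper, not the broker's implementation): in a topic exchange, `*` matches exactly one dot-separated word in the routing key and `#` matches zero or more.

```python
# Sketch of RabbitMQ topic-exchange routing-key matching:
# '*' matches exactly one dot-separated word, '#' matches zero or more.
# Hypothetical helper for illustration only.

def topic_match(pattern, routing_key):
    return _match(pattern.split("."), routing_key.split("."))

def _match(pat, key):
    if not pat:
        return not key
    if pat[0] == "#":
        # '#' may swallow zero or more words of the routing key
        return any(_match(pat[1:], key[i:]) for i in range(len(key) + 1))
    if not key:
        return False
    if pat[0] in ("*", key[0]):
        return _match(pat[1:], key[1:])
    return False

print(topic_match("plant.*.temperature", "plant.press1.temperature"))  # True
print(topic_match("plant.#", "plant.press1.line2.speed"))              # True
print(topic_match("plant.*", "plant.press1.line2"))                    # False
```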
Industrial Protocols: Apply a strong understanding of industrial communication protocols, with hands-on experience in OPC UA, MQTT, and AMQP, to integrate IoT devices, messaging systems, and real-time data pipelines.
Infrastructure & Deployment: Guide the adoption of Docker, CI/CD pipelines, and Kubernetes-based deployments. Collaborate with DevOps to ensure robust, automated, and scalable infrastructure.
Observability & Optimization: Use Prometheus, Grafana, and the ELK stack to monitor, tune, and debug streaming pipelines.
Nice to have: