Data Platform Architect (m/f/d) EVENTIM
Posted: 10.05.2025

Data Platform Architect (m/f/d)

Location:
  • Berlin
EVENTIM

Summary

  • Working hours
    Full-time
  • Type
    Permanent position

Desired Skills & Knowledge

API
Scala
Data Integrity
Framework
CRM System
Cloud
Mule ESB
C++
Predictive Analytics
Java
Mentoring
ETL
Security
Compliance
Software Engineering
Programming Experience
Lakehouse
MSc
Snowflake
CDP
AWS
Python
GitHub
GitLab
Continuous Integration
Engineering

Job Description

Data Platform Architect (m/f/d)

  • Full Time

  • With Professional Experience

  • 16.04.2025

  • Bremen, Germany / Berlin, Germany / Hamburg, Germany

We are seeking an experienced Data Platform Architect with 5+ years of experience and a deep understanding of data, AI and automation to design high-impact business systems that drive operational efficiency and increase value for Eventim customers. In this role, you will work closely with business leaders and technical teams to understand complex business needs, architect scalable solutions and integrate modern tools and platforms.

What to expect:

  • Data-Platform Architecture Design: Design and implement a lakehouse-style, multi-cloud data platform on AWS and Google Cloud, balancing cost, performance and resilience.
  • Ingestion & Streaming Frameworks: Design, build, and operate batch, real-time, and streaming pipelines to move data from diverse internal and external sources into the unified platform while preserving scalability, low-latency delivery, and data integrity.
  • Unified Customer Data Model: Architect schemas that collect and reconcile events from marketing, advertising and web-tracking channels into a single customer profile, synchronized with CRM and CDP systems.
  • AI & ML Platform Architecture: Define the end-to-end framework - feature pipelines, distributed training, automated evaluation, model registry, and low-latency serving - on managed cloud AI services, so use cases such as text summarization, anomaly detection, and predictive analytics can move from experimentation to production safely and at scale.
  • Security: Define zero-trust policies, fine-grained IAM, row/column-level security, tokenization, enterprise API-level protections and compliance controls.
  • Governance: Embed lineage, data quality and cataloguing across the platform (OpenLineage, DataHub).
  • Thought Leadership & Mentoring: Set reference architectures, review designs, coach engineers and champion best practices in event-driven, product-oriented data platforms.

What you'll need:

  • MSc in Computer Science, Electrical Engineering, Software Engineering, ML Engineering or related fields (or equivalent experience).
  • 5+ years designing and operating cloud-native data platforms that power large-scale analytics and AI products.
  • Expert data-modeling skills (dimensional, Data Vault, entity-relationship, graph) and hands-on mastery of Snowflake or similar cloud data warehouses.
  • Proven track record building end-to-end, production-grade data pipelines - both batch and streaming - using tools such as Airflow, Databricks, Google Pub/Sub, Dataflow, Kafka, or comparable orchestration frameworks.
  • Experience with API-management (Mulesoft & Keycloak).
  • Deep knowledge of modern ELT/ETL patterns, data quality & observability tool-chains, and data-governance best practices (incl. GDPR and RBAC).
  • Programming fluency in SQL and Python (Scala/Java or C++ a plus) plus CI/CD and Infrastructure-as-Code (Terraform, GitLab CI/CD, GitHub Actions).
  • Working understanding of ML-ops concepts - feature stores, model registries, retrieval-augmented-generation (RAG) architectures - and how to shape data layers that serve them.
  • Strong communication and presentation skills, and awareness of new trends and technologies in the field of information architecture.
  • Fluent in English; German would be ideal.

Profile

Professional Requirements

  • APIs, Airflow, Amazon Web Services, Requirements Analysis, Apache Kafka, Architecture, Architecture Design, Automation, Business Systems, C++, Coaching and Mentoring, Computer Programming, Continuous Integration, GDPR, Data Vault, Data Warehousing, Databricks, Data Pipelines, Data Integrity, Data Management, Data Quality, Data Layers, ETL, Electrical Engineering, Entity-Relationship, Anomaly Detection, Experimentation, Business Requirements, GitHub, GitLab CI, Google Cloud, Governance, Identity Management, Computer Science, Information Architecture, Infrastructure, Java, Catalogues, Customer Relationship Management (CRM), Customer Data Management, Artificial Intelligence, Machine Learning, Machine Learning Operations, Marketing, Modeling Skills, Mulesoft, Predictive Analytics, Python, Role-Based Access Control, SQL, Security Regulations, Scalability, Snowflake, Software Engineering, Streaming, Terraform, Text Summarization, Operational Efficiency

Personal Skills

  • Perseverance, Communication, Thought Leadership

Education

  • Master's degree

Language Skills

  • German, English

Professional Experience

  • With professional experience

Application

    Industry:

    Business Services

    Employer:

    EVENTIM

    Address:

    EVENTIM
    Dingolfinger Str 6
    81673 München
