BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Mimer AI Factory - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Mimer AI Factory
X-ORIGINAL-URL:https://mimer-ai.eu
X-WR-CALDESC:Events for Mimer AI Factory
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Stockholm
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T030000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T030000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20270328T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20271031T030000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Stockholm:20260415T110000
DTEND;TZID=Europe/Stockholm:20260415T123000
DTSTAMP:20260421T104821Z
CREATED:20260224T074823Z
LAST-MODIFIED:20260306T135522Z
UID:3570-1776250800-1776256200@mimer-ai.eu
SUMMARY:Operationalizing AI: MLOps x LLMOps
DESCRIPTION:About the webinar\nIn this MLOps and LLMOps webinar\, we’ll walk through the entire AI lifecycle – from idea and experimentation to production\, deployment\, and continuous monitoring – highlighting how AI differs from traditional software (data-driven\, non-linear\, and sometimes unpredictable even when “done right”). You’ll learn the main deployment patterns (batch/offline\, real-time/online\, and common patterns used in cloud solutions) and the key trade-offs around latency\, scaling\, and operational reliability.\nWe’ll then connect MLOps and LLMOps in a practical way: versioning data/models/prompts\, reproducibility\, CI/CD\, and testing strategies for probabilistic systems.\nWho is the webinar for?\nIt’s aimed at data scientists\, ML engineers\, software engineers\, and AI engineers who want a clear\, production-focused view of how to run ML and LLM solutions end-to-end. It also suits those with no experience building and deploying AI models who are curious about AI/ML/LLM Ops.\nKey takeaways for participants:\n- Key differences between AI and traditional software\n- How these differences translate to model deployment\n- What MLOps and LLMOps are and how they differ\n- Different model deployment strategies\nSpeaker bio:\nMurilo Kuniyoshi Suzart Cunha (https://www.linkedin.com/in/murilo-cunha/)\nMurilo is a machine learning engineer specializing in productionizing models and applying AI Ops best practices\, with a focus on the evolving landscape of LLMOps. He takes a pragmatic approach to machine learning\, ensuring AI initiatives deliver tangible ROI. An experienced international conference speaker and open-source supporter\, Murilo is also the host of the Monkey Patching Podcast.
URL:https://mimer-ai.eu/event/operationalizing-ai-mlops-x-llmops/
ATTACH;FMTTYPE=image/jpeg:https://mimer-ai.eu/wp-content/uploads/2026/02/mimer-webb.jpg
END:VEVENT
END:VCALENDAR