    Snowflake will be at Rev MLOps Conference, May 5th - 6th in NYC.

    Accelerate innovation with closer collaboration across all parts of the data science lifecycle - from data science teams building and deploying models, to business leaders putting models at the heart of operations, to IT teams governing the underlying data and analytics infrastructure.

    Rapidly accelerate data science adoption with the near-unlimited access to data and processing power of Snowflake's Data Cloud paired with Domino’s open and flexible Enterprise MLOps Platform.


    Enable a modern analytics approach for data scientists

    Data scientists can leverage data and processing power in Snowflake’s Data Cloud from within Domino. Break down data silos by unifying internal and third-party data in Snowflake, then quickly developing models in Domino using Jupyter, RStudio, VS Code, or other IDEs.

    Query data live in Snowflake with Snowpark Java/Scala UDFs, without the diminished performance or security challenges of data transfers. Leverage the elastic compute and processing of Snowflake’s Data Cloud from within Domino to build, execute, and productionize models.
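As a rough illustration of the in-database pattern (the table, column, and metric names here are invented for the example, not drawn from either product), the idea is to generate SQL that executes inside Snowflake, so only aggregated results ever leave the warehouse:

```python
# Hypothetical sketch: build feature-engineering SQL that runs inside
# Snowflake rather than pulling raw rows out. EVENTS, USER_ID, AMOUNT,
# and CLICKS are illustrative names, not a real schema.
def in_database_features_sql(table: str, key: str, metrics: dict) -> str:
    """Return SQL that computes per-key aggregate features in-database."""
    cols = ", ".join(
        f"{fn}({col}) AS {col}_{fn}" for col, fn in metrics.items()
    )
    return f"SELECT {key}, {cols} FROM {table} GROUP BY {key}"

sql = in_database_features_sql(
    "EVENTS", "USER_ID", {"AMOUNT": "AVG", "CLICKS": "SUM"}
)
print(sql)
```

A Snowpark or connector session would then execute the generated statement from a Domino workspace; only the string construction is shown here.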

    “By leveraging Domino and Snowflake’s Data Cloud together, Braze has the flexibility to build machine learning models across our databases and share data seamlessly across our organization in real-time.”

    ― Jon Hyman, Co-Founder & CTO, Braze


    Reduce silos and create a research flywheel

    Data science teams can find and build on past work and collaborate freely with their peers, unlocking new ideas and breakthrough insights. Sharing those insights with business stakeholders through Snowflake ensures simultaneous access, backed by the elasticity of Snowflake’s Data Cloud.

    Data scientists can write model-derived data back into Snowflake for end users to access via reporting and SQL-based business intelligence tools.
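One hedged sketch of that write-back step (the SCORES target, staging table, and column names are assumptions for illustration) is an upsert via a standard SQL MERGE, so scheduled reruns update existing rows instead of duplicating them:

```python
# Hypothetical sketch: generate a MERGE statement for writing model scores
# back to Snowflake. Table and column names are illustrative assumptions.
def merge_scores_sql(target: str, staging: str, key: str, score_col: str) -> str:
    """Build an upsert from a staging table into the reporting table."""
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET t.{score_col} = s.{score_col} "
        f"WHEN NOT MATCHED THEN INSERT ({key}, {score_col}) "
        f"VALUES (s.{key}, s.{score_col})"
    )

sql = merge_scores_sql("SCORES", "SCORES_STAGING", "USER_ID", "CHURN_SCORE")
print(sql)
```

End users' BI and SQL tools then read the refreshed SCORES table directly, alongside the rest of the data already in Snowflake.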


    Create enterprise-scale MLOps workflows

    Manage the data science lifecycle from ideation through model deployment and model monitoring. Domino’s powerful scheduling function makes it possible to orchestrate a full workflow: sourcing data from Snowflake, running model code, and writing results such as predictive scores back into Snowflake for end-user consumption, keeping all results in one place alongside existing data.
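The workflow above can be sketched as a three-step pipeline. Snowflake reads and writes are stubbed here with an in-memory dict so the shape is clear; a real scheduled job would replace them with warehouse queries, and the table names and toy model are assumptions:

```python
# Hypothetical sketch of a scheduled scoring job: source data, run the
# model, write results back. The in-memory `warehouse` dict stands in for
# Snowflake; FEATURES/SCORES are illustrative table names.
def run_pipeline(warehouse: dict, model) -> None:
    rows = warehouse["FEATURES"]                      # 1. source data from Snowflake
    scores = [
        {"USER_ID": r["USER_ID"], "SCORE": model(r)}  # 2. run model code
        for r in rows
    ]
    warehouse["SCORES"] = scores                      # 3. write results back

warehouse = {
    "FEATURES": [
        {"USER_ID": 1, "CLICKS": 12},
        {"USER_ID": 2, "CLICKS": 3},
    ]
}
# Toy stand-in model: clamp clicks/10 into [0, 1].
run_pipeline(warehouse, model=lambda r: min(r["CLICKS"] / 10, 1.0))
print(warehouse["SCORES"])
```

A scheduler (such as Domino's scheduled jobs) would run this end to end on a cadence, so the SCORES table stays fresh for downstream consumers.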


    Ensure IT governance and security

    Authenticate with Snowflake at the data science project level, allowing each project to have its own independent connection and Snowflake resources or even use multiple Snowflake accounts.

    Ensure security and flexibility by leveraging Snowflake authentication for Domino data science projects: store user Snowflake credentials as environment variables, or use popular tokenization and identity management solutions.
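A minimal sketch of the environment-variable approach (the variable names are illustrative, not a fixed convention of either product) assembles connection parameters without hard-coding secrets in project code:

```python
import os

# Hypothetical sketch: read per-project Snowflake credentials from
# environment variables. SNOWFLAKE_* names and the COMPUTE_WH default
# are assumptions for illustration.
def snowflake_params_from_env(env=os.environ) -> dict:
    """Build connection settings from environment variables, failing fast
    if required credentials are missing."""
    required = ["SNOWFLAKE_ACCOUNT", "SNOWFLAKE_USER", "SNOWFLAKE_PASSWORD"]
    missing = [k for k in required if k not in env]
    if missing:
        raise KeyError(f"Missing Snowflake credentials: {missing}")
    return {
        "account": env["SNOWFLAKE_ACCOUNT"],
        "user": env["SNOWFLAKE_USER"],
        "password": env["SNOWFLAKE_PASSWORD"],
        "warehouse": env.get("SNOWFLAKE_WAREHOUSE", "COMPUTE_WH"),
    }

# Example with an explicit dict standing in for the project's environment.
params = snowflake_params_from_env({
    "SNOWFLAKE_ACCOUNT": "acme-xy12345",
    "SNOWFLAKE_USER": "ds_user",
    "SNOWFLAKE_PASSWORD": "example-secret",
})
```

Because each Domino project carries its own environment, two projects can point at different Snowflake accounts or roles without sharing credentials.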

    “Snowflake provides the robust scalability, elasticity, and security we need to hold enterprise volumes of health data, while the integration with Domino ensures data scientists can securely and effectively share model output with business stakeholders, clients, and partners.”

    ― Luca Foschini, Co-founder and Chief Data Scientist at Evidation

    “Domino and Snowflake are critical to the future of our business. They allow new use cases to keep flowing, so we can empower individuals to manage their health better before they become patients.”

    ― Biz Phillips, Senior Health Data Scientist

    Case Study

    Empowering individuals to participate in better health outcomes

    Evidation transforms research into production-grade models in as little as eight weeks with Enterprise MLOps.

    Read Case Study


    Tech Talk | Enterprise MLOps in Snowflake's Data Cloud

    Learn how to unite data scientists, business experts, and IT teams to put models into production faster, using optimized data connectors, Snowpark, and enterprise MLOps. Hear from Domino and Snowflake experts, with live Q&A.

    Live on April 28th

    Domino’s growing partner ecosystem helps our customers accelerate the development and delivery of models with key capabilities such as infrastructure automation, seamless collaboration, and automated reproducibility. This greatly increases the productivity of data scientists and removes bottlenecks in the data science lifecycle.