
Move from dbt Core to the dbt platform: What you need to know

Migration
dbt Core
dbt platform
Intermediate

    Introduction

    Moving from dbt Core to dbt streamlines analytics engineering workflows by allowing teams to develop, test, deploy, and explore data products using a single, fully managed software service.

    Explore our three-part guide series on moving from dbt Core to dbt. The series is ideal for users aiming for streamlined workflows and enhanced analytics:

    | Guide | Information | Audience |
    |---|---|---|
    | Move from dbt Core to dbt platform: What you need to know | Understand the considerations and methods needed in your move from dbt Core to dbt platform. | Team leads, Admins |
    | Move from dbt Core to dbt platform: Get started | Learn the steps needed to move from dbt Core to dbt platform. | Developers, Data engineers, Data analysts |
    | Move from dbt Core to dbt platform: Optimization tips | Learn how to optimize your dbt experience with common scenarios and useful tips. | Everyone |

    Why move to the dbt platform?

    If your team is using dbt Core today, you could be reading this guide because:

    • You’ve realized the burden of maintaining that deployment.
    • The person who set it up has since left.
    • You’re interested in what dbt could do to better manage the complexity of your dbt deployment, democratize access to more contributors, or improve security and governance practices.

    Moving from dbt Core to dbt simplifies workflows by providing a fully managed environment that improves collaboration, security, and orchestration. With dbt, you gain access to features like cross-team collaboration (dbt Mesh), version management, streamlined CI/CD, dbt Explorer for comprehensive insights, and more — making it easier to manage complex dbt deployments and scale your data workflows efficiently.

    It's ideal for teams looking to reduce the burden of maintaining their own infrastructure while enhancing governance and productivity.

     What are dbt and dbt Core?
    • dbt is the fastest and most reliable way to deploy dbt. It enables you to develop, test, deploy, and explore data products using a single, fully managed service. Learn more about dbt features.

    • dbt Core is an open-source tool that enables data teams to define and execute data transformations in a cloud data warehouse following analytics engineering best practices. While this can work well for ‘single players’ and small technical teams, all development happens on a command-line interface, and production deployments must be self-hosted and maintained. Maintaining and scaling that infrastructure requires significant, costly work that adds up over time.

    What you'll learn

    Today thousands of companies, with data teams ranging in size from 2 to 2,000, rely on dbt to accelerate data work, increase collaboration, and win the trust of the business. Understanding what you'll need to do to move from your current dbt Core deployment to dbt will help you strategize and plan your move.

    The guide outlines the following steps:

    • Considerations: Learn about the most important things you need to think about when moving from dbt Core to the dbt platform.
    • Plan your move: Considerations you need to make, such as user roles and permissions, onboarding order, current workflows, and more.
    • Move to dbt: Review the steps to move your dbt Core project to dbt, including setting up your account, data platform, and Git repository.
    • Test and validate: Discover how to ensure model accuracy and performance post-move.
    • Transition and training: Learn how to fully transition to dbt and what training and support you may need.
    • Summary: Summarizes key takeaways and what you've learned in this guide.
    • What's next?: Introduces what to expect in the following guides.

    Considerations

    This guide shares the technical adjustments and team collaboration strategies you’ll need to know to move your project from dbt Core to dbt. Each "build your own" deployment of dbt Core will look a little different, but after seeing hundreds of teams make the migration, there are many things in common.

    The most important things you need to think about when moving from dbt Core to dbt:

    • How is your team structured? Are there natural divisions of domain?
    • Should you have one project or multiple? Which dbt resources do you want to standardize and keep central?
    • Who should have permission to view, develop, and administer?
    • How are you scheduling your dbt models to run in production?
    • How are you currently managing continuous integration/continuous deployment (CI/CD) of logical changes (if at all)?
    • How do your data developers prefer to work?
    • How do you manage different data environments and the different behaviors in those environments?

    dbt provides standard mechanisms for tackling these considerations, all of which deliver long-term benefits to your organization:

    • Cross-team collaboration
    • Access control
    • Orchestration
    • Isolated data environments

    If you have rolled out your own dbt Core deployment, you have probably arrived at your own answers to these questions.

    Plan your move

    As you plan your move, consider your workflow and team layout to ensure a smooth transition. Here are some key considerations to keep in mind:

     Start small to minimize risk and maximize learning

    You don’t need to move every team and every developer’s workflow all at once. Many customers with large dbt deployments start by moving one team and one project.

    Once the benefits of a consolidated platform are clear, move the rest of your teams and workflows. While a long-term ‘hybrid’ deployment can be challenging to maintain, it may make sense as a temporary on-ramp.

     User roles and responsibilities

    Assess the users or personas involved in the pre-move, during the move, and post-move.

    • Administrators: Plan for new access controls in dbt, such as deciding what teams can manage themselves and what should be standardized. Determine who will be responsible for setting up and maintaining projects, data platform connections, and environments.
    • Data developers (data analysts, data engineers, analytics engineers, business analysts): Determine onboarding order, workflow adaptation in dbt, training on Cloud CLI or Studio IDE usage, and role changes.
    • Data consumers: Discover data insights by using dbt Explorer to view your project's resources (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state. (Available on Starter and Enterprise plans)

     Onboarding order

    If you have multiple teams of dbt developers, think about how to start your onboarding sequence for dbt:

    • Start with downstream teams (such as business-embedded teams), who may benefit from the Studio IDE as a development experience for less technical users and from sharing features (like auto-deferral and dbt Explorer) to share results with their stakeholders; move more technical teams later.
    • Consider setting up a CI job in dbt (even before development or production jobs) to streamline development workflows. This is especially beneficial if there's no existing CI process.

     Analyze current workflows, review processes, and team structures

    Discover how dbt can help simplify development, orchestration, and testing:

    • Development: Develop dbt models and build, test, run, and version control your dbt projects using the Cloud CLI (command line interface) or the Studio IDE (browser-based).
    • Orchestration: Create custom schedules to run your production jobs. Schedule jobs by day of the week, time of day, or a recurring interval.
      • Set up a CI job to ensure developer effectiveness, and CD jobs to deploy changes as soon as they’re merged.
      • Link deploy jobs together by triggering a job when another one is completed.
      • For the most flexibility, use the dbt API to trigger jobs. This makes sense when you want to integrate dbt execution with other data workflows.
    • Continuous integration (CI): Use CI jobs to run your dbt projects in a temporary schema when new commits are pushed to open pull requests. This build-on-PR functionality is a great way to catch bugs before deploying to production.
      • For many teams, dbt CI represents a major improvement compared to their previous development workflows.
    • How are you defining tests today?: While testing production data is important, it’s not the most efficient way to catch logical errors introduced by developers. Unit testing lets you validate your SQL modeling logic against a small set of static inputs before you materialize the full model in production.

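    As a concrete illustration of that last point, dbt unit tests (available in dbt v1.8+) are defined in YAML against static input rows. The model, source, and column names below are hypothetical:

```yaml
unit_tests:
  - name: test_order_totals_ignore_null_amounts
    model: order_totals              # hypothetical model under test
    given:
      - input: ref('stg_payments')   # static rows stand in for the real table
        rows:
          - {order_id: 1, amount: 10}
          - {order_id: 1, amount: null}
          - {order_id: 2, amount: 5}
    expect:                          # what the model should produce from those rows
      rows:
        - {order_id: 1, total_amount: 10}
        - {order_id: 2, total_amount: 5}
```

    Because the inputs are fixed, a test like this runs in seconds and catches logic regressions before any production data is touched.
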
     Understand access control

    Transition to dbt's access control mechanisms to ensure security and proper access management. dbt administrators can use dbt's permission model to control user-level access in a dbt account:

    • License-based access controls: Users are configured with account-wide license types. These licenses control the things a user can do within the application: view project metadata, develop changes within those projects, or administer access to those projects.
    • Role-based access control (RBAC): Users are assigned to groups with specific permissions on specific projects or on all projects in the account. A user may be a member of multiple groups, and those groups may have permissions on multiple projects. (Available on Enterprise and Enterprise+ plans)

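    To make the group model concrete, here is a conceptual sketch (not dbt's actual permission API) of how a user's effective permissions resolve when they belong to multiple groups, each granting a permission set on specific projects or on all projects:

```python
# Conceptual sketch only: group name -> {project: permission set},
# where "*" stands for "all projects in the account".
from typing import Dict, List, Set

GROUPS: Dict[str, Dict[str, Set[str]]] = {
    "analysts": {"jaffle_shop": {"view", "develop"}},
    "platform-admins": {"*": {"view", "develop", "administer"}},
}

def effective_permissions(user_groups: List[str], project: str) -> Set[str]:
    """Union of the permissions every group grants on this project."""
    perms: Set[str] = set()
    for group in user_groups:
        grants = GROUPS.get(group, {})
        perms |= grants.get("*", set()) | grants.get(project, set())
    return perms

# A user in both groups gets the union of all grants on jaffle_shop:
print(sorted(effective_permissions(["analysts", "platform-admins"], "jaffle_shop")))
# → ['administer', 'develop', 'view']
```

    The key property to plan around is that permissions are additive across groups, so group design (not individual user grants) is where access policy lives.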
     Manage environments

    If you require isolation between production and pre-production data environments due to sensitive data, dbt can support Development, Staging, and Production data environments.

    This provides developers with the benefits of an enhanced workflow while ensuring isolation between Staging and Production data, and locking down permissions on Prod.
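    In dbt Core, this kind of isolation is typically hand-rolled with one profiles.yml target per data environment; the dbt platform replaces it with managed Development, Staging, and Production environments. A minimal sketch of the Core-side pattern this maps onto (profile name, hosts, and schemas are hypothetical):

```yaml
# profiles.yml — one target per data environment; each writes to an
# isolated schema so developer work never touches production data.
jaffle_shop:           # hypothetical profile name
  target: dev          # default target for local development
  outputs:
    dev:
      type: postgres
      host: localhost
      user: alice
      password: "{{ env_var('DBT_DEV_PASSWORD') }}"
      dbname: analytics
      schema: dbt_dev_alice     # per-developer sandbox schema
      threads: 4
    prod:
      type: postgres
      host: prod-db.internal    # hypothetical host
      user: dbt_prod
      password: "{{ env_var('DBT_PROD_PASSWORD') }}"
      dbname: analytics
      schema: analytics         # locked-down production schema
      threads: 8
```
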

    Move to dbt

    This guide is your roadmap to help you think about migration strategies and what moving from dbt Core to dbt could look like.

    After reviewing the considerations and planning your move, you may want to start moving your dbt Core project to dbt:

    • Check out the detailed Move to dbt: Get started guide for useful tasks and insights for a smooth transition from dbt Core to dbt.

    For a more detailed comparison of dbt Core and dbt, check out How dbt compares with dbt Core.

    Test and validate

    After setting the foundations of dbt, it's important to validate your migration to ensure seamless functionality and data integrity:

    • Review your dbt project: Ensure your project compiles correctly and that you can run commands. Make sure your models are accurate and monitor performance post-move.
    • Start cutover: Start the cutover to dbt by creating a dbt job whose commands run only a small subset of the DAG. Validate that tables are populated in the expected databases and schemas, then expand the scope of the job to include more sections of the DAG as you gain confidence in the results.
    • Precision testing: Use unit testing to validate your SQL modeling logic on a small set of static inputs before you materialize your full model in production.
    • Access and permissions: Review and adjust access controls and permissions within dbt to maintain security protocols and safeguard your data.
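    The phased cutover described above maps naturally onto dbt's node selection syntax. A sketch of the progression, using hypothetical model paths:

```shell
# Phase 1: build and validate a small, well-understood slice of the DAG
dbt build --select staging.finance

# Phase 2: widen the scope to that slice plus everything downstream of it
dbt build --select staging.finance+

# Phase 3: run the full project once earlier phases match expectations
dbt build
```
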

    Transition and training

    Once you’ve confirmed that dbt orchestration and CI/CD are working as expected, pause your current orchestration tool and stop or update your current CI/CD process. This doesn’t apply if you’re keeping an external orchestrator (such as Airflow) and have simply swapped dbt Core execution for dbt execution (through the API).

    Familiarize your team with dbt's features and optimize development and deployment processes. Some key features to consider include:

    • Release tracks: Choose a release track for automatic dbt version upgrades, at the cadence appropriate for your team — removing the hassle of manual updates and the risk of version discrepancies. You can also get early access to new functionality, ahead of dbt Core.
    • Development tools: Use the Cloud CLI or Studio IDE to build, test, run, and version control your dbt projects.
    • Documentation and source freshness: Automate storage of documentation and track source freshness in dbt, which streamlines project maintenance.
    • Notifications and logs: Receive immediate notifications for job failures, with direct links to the job details. Access comprehensive logs for all job runs to help with troubleshooting.
    • CI/CD: Use dbt's CI/CD feature to run your dbt projects in a temporary schema whenever new commits are pushed to open pull requests. This helps with catching bugs before deploying to production.

    Beyond your move

    Now that you’ve chosen dbt as your platform, you’ve unlocked the power of streamlining collaboration, enhancing workflow efficiency, and leveraging powerful features for analytics engineering teams. Here are some additional features you can use to unlock the full potential of dbt:

    • Audit logs: Use audit logs to review actions performed by people in your organization. Audit logs contain audited user and system events in real time. You can even export all the activity (beyond the 90 days you can view in dbt). (Available on Enterprise and Enterprise+ plans)
    • dbt APIs: Use dbt's robust APIs to create, read, update, and delete (CRUD) projects, jobs, and environments. The dbt Administrative API and Terraform provider facilitate programmatic access and configuration storage, while the Discovery API offers extensive metadata querying capabilities, such as job data, model configurations, usage, and overall project health. (Available on Starter and Enterprise plans)
    • dbt Explorer: Use dbt Explorer to view your project's resources (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state (once you have a successful job in a Production environment). (Available on Starter and Enterprise plans)
    • dbt Semantic Layer: The dbt Semantic Layer allows you to define universal metrics on top of your models that can then be queried in your business intelligence (BI) tool. This means no more inconsistent metrics — there’s now a centralized way to define these metrics and create visibility in every component of the data flow. (Available on Starter and Enterprise plans)
    • dbt Mesh: Use dbt Mesh to share data models across organizations, enabling data teams to collaborate on shared data models and leverage the work of other teams. (Available on Enterprise and Enterprise+ plans)
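    As an illustration of the Administrative API, the sketch below builds (but does not send) a "trigger job run" request. The account id, job id, and token are placeholders, and the v2 endpoint shape shown here should be verified against the current API reference:

```python
# Sketch: trigger a dbt job run via the Administrative API.
# ACCOUNT_ID, JOB_ID, and API_TOKEN are placeholders.
import json
import urllib.request

ACCOUNT_ID = 12345             # placeholder
JOB_ID = 67890                 # placeholder
API_TOKEN = "<service-token>"  # placeholder; use a scoped service token

url = f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/"
payload = json.dumps({"cause": "Triggered via the Administrative API"}).encode()
request = urllib.request.Request(
    url,
    data=payload,
    headers={
        "Authorization": f"Token {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# To actually trigger the run (and get a run id back for polling):
# with urllib.request.urlopen(request) as resp:
#     run_id = json.load(resp)["data"]["id"]
print(request.full_url)
```

    This is the pattern an external orchestrator (such as Airflow) would use to hand off execution to dbt while keeping scheduling logic where it already lives.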

    Additional help

    • dbt Learn courses: Access our free Learn dbt video courses for on-demand training.
    • dbt Community: Join the dbt Community to connect with other dbt users, ask questions, and share best practices.
    • dbt Support team: Our dbt Support team is always available to help you troubleshoot your dbt issues. Create a support ticket in dbt and we’ll be happy to help!
    • Account management: Enterprise accounts have an account management team available to help troubleshoot solutions and provide account management assistance. Book a demo to learn more. (Available on Enterprise and Enterprise+ plans)

    Summary

    This guide has equipped you with insight and a framework for moving from dbt Core to dbt. It covered the following key areas:

    • Considerations: Understanding the foundational steps required for a successful migration, including evaluating your current setup and identifying key considerations unique to your team's structure and workflow needs.

    • Plan your move: Highlighting the importance of workflow redesign, role-specific responsibilities, and the adoption of new processes to harness dbt's collaborative and efficient environment.

    • Move to dbt: Linking to the guide that outlines technical steps required to transition your dbt Core project to dbt, including setting up your account, data platform, and Git repository.

    • Test and validate: Emphasizing technical transitions, including testing and validating your dbt projects within the dbt ecosystem to ensure data integrity and performance.

    • Transition and training: Sharing useful transition, training, and onboarding information for your team to fully leverage dbt's capabilities, from development tools (the Cloud CLI and Studio IDE) to advanced features such as Catalog, the Semantic Layer, and Mesh.

    What’s next?

    Congratulations on finishing this guide. We hope it's given you insight into the considerations you need to best plan your move to dbt.

    For the next steps, you can continue exploring our three-part guide series on moving from dbt Core to dbt:

    | Guide | Information | Audience |
    |---|---|---|
    | Move from dbt Core to dbt platform: What you need to know | Understand the considerations and methods needed in your move from dbt Core to dbt platform. | Team leads, Admins |
    | Move from dbt Core to dbt platform: Get started | Learn the steps needed to move from dbt Core to dbt platform. | Developers, Data engineers, Data analysts |
    | Move from dbt Core to dbt platform: Optimization tips | Learn how to optimize your dbt experience with common scenarios and useful tips. | Everyone |
