How to use Databricks Lakehouse and Responsible AI

Gain practical skills in using the Databricks Lakehouse platform and knowledge of AI ethics to meet the demands of the industry with this online course from Pragmatic AI Labs.

Duration: 4 weeks

Weekly study: 4 hours

100% online

Master the fundamentals of the Databricks Lakehouse platform

On this four-week course, you’ll delve into the Databricks Lakehouse platform to understand its functionality and architecture.

You’ll gain hands-on experience using the platform to develop practical skills in managing large-scale data operations, developing AI solutions, and implementing ethical AI practices.

Armed with these skills, you’ll be able to build robust, responsible AI systems. This will make you a valuable asset in the rapidly evolving field of AI and data science.

Gain an understanding of data transformation and Delta Live Tables

You’ll delve into data transformation to understand how to develop efficient data pipelines using Delta Live Tables and Jobs.

Learning best practices along the way, you’ll discover how to harness data transformation for effective decision-making.
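
To make this concrete, here is a minimal Delta Live Tables sketch in Python. The table names and storage path are illustrative, and the code runs only inside a Databricks DLT pipeline:

    import dlt
    from pyspark.sql import functions as F

    # Bronze table: raw events read from a hypothetical landing path.
    # `spark` is provided by the Databricks runtime.
    @dlt.table(comment="Raw events loaded from cloud storage")
    def raw_events():
        return spark.read.format("json").load("/mnt/landing/events")

    # Silver table: validated events derived from the bronze table.
    @dlt.table(comment="Validated events")
    @dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")
    def clean_events():
        return dlt.read("raw_events").withColumn("ingested_at", F.current_timestamp())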

Unpack responsible Generative AI

As the AI industry continues to evolve, it’s important to consider how practitioners can build and use generative AI responsibly.

In week three of the course, you’ll explore AI ethics in detail and unpack real-world examples so you can evaluate the ethical implications of GenAI technologies.

Gain fundamental skills in Unity Catalog

Finally, you’ll learn how to apply data governance strategies with Unity Catalog, rounding out your advanced Databricks Lakehouse skills.
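
As a taste of what that governance work looks like, here is a minimal sketch run from a Databricks notebook; the catalog, schema, and group names are illustrative:

    # `spark` is provided by the Databricks runtime; Unity Catalog must be enabled.
    spark.sql("CREATE SCHEMA IF NOT EXISTS main.reporting")

    # Let the (hypothetical) analysts group find and read the schema.
    spark.sql("GRANT USE CATALOG ON CATALOG main TO `analysts`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA main.reporting TO `analysts`")
    spark.sql("GRANT SELECT ON SCHEMA main.reporting TO `analysts`")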

Learning from industry experts and through hands-on exercises, you’ll finish the course with the technical skills and ethical grounding needed in the AI industry.

  • Week 1

    Databricks Lakehouse Platform Fundamentals

    • Databricks Lakehouse Platform

      Learn to leverage Databricks and AI for advanced data analytics. Master the Lakehouse platform, build end-to-end ML pipelines, and responsibly integrate LLMs into your workflows. Gain hands-on experience with clusters and more.

    • Data Transformation with Apache Spark

      Explore data transformation using Apache Spark in Databricks. Learn to set up development environments, use CLI tools, and work with notebooks. Covers multi-language support, repos, and practical exercises in various languages.
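
      For a flavor of the exercises, here is a minimal PySpark transformation sketch; the data and column names are illustrative:

          from pyspark.sql import SparkSession, functions as F

          spark = SparkSession.builder.getOrCreate()  # already available in Databricks notebooks

          # Hypothetical sales records.
          df = spark.createDataFrame(
              [("2024-01-01", "EU", 120.0), ("2024-01-01", "US", 95.5)],
              ["order_date", "region", "amount"],
          )

          # A typical chain: filter, derive a column, aggregate.
          summary = (
              df.filter(F.col("amount") > 0)
                .withColumn("order_date", F.to_date("order_date"))
                .groupBy("region")
                .agg(F.sum("amount").alias("total_amount"))
          )
          summary.show()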

    • Data Management with Delta Lake

      Learn data management with Delta Lake, covering Spark SQL, Catalog Explorer, table creation, querying external sources, Delta Lake pipelines, ACID transactions, and Z-Ordering. Includes a lab, tutorials, and a quiz on RStudio and COPY INTO.
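
      A sketch of these operations as SQL issued from Python; the table is illustrative, and OPTIMIZE ... ZORDER BY requires Databricks:

          # `spark` is provided by the Databricks runtime.
          spark.sql("CREATE TABLE IF NOT EXISTS events "
                    "(event_id BIGINT, ts TIMESTAMP, payload STRING) USING DELTA")
          spark.sql("INSERT INTO events VALUES (1, current_timestamp(), 'hello')")

          # Co-locate rows by event_id to speed up selective queries.
          spark.sql("OPTIMIZE events ZORDER BY (event_id)")

          # Every write above is an ACID transaction recorded in the table history.
          spark.sql("DESCRIBE HISTORY events").show()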

  • Week 2

    Data Transformation and Pipelines

    • Data Pipelines with Delta Live Tables

      Explore Delta Live Tables for automated data pipelines. Learn key components, pipeline types, Auto Loader configuration, event querying, and end-to-end examples. Includes vacuum and garbage collection, plus hands-on labs.
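
      A sketch of Auto Loader ingestion inside a Delta Live Tables pipeline; the paths are placeholders:

          import dlt

          @dlt.table(comment="Streaming ingest via Auto Loader")
          def autoloaded_events():
              # cloudFiles is the Auto Loader source; schemaLocation stores inferred schemas.
              return (
                  spark.readStream.format("cloudFiles")
                       .option("cloudFiles.format", "json")
                       .option("cloudFiles.schemaLocation", "/mnt/schemas/events")
                       .load("/mnt/landing/events")
              )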

    • Workloads with Jobs

      Explore Databricks Jobs for orchestrating workloads. Learn about multi-task workflows, dependencies, job history, dashboards, and failure handling. Includes demos, labs, and quizzes to reinforce concepts and practical application.
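
      A sketch of a two-task workflow built with the databricks-sdk Python client; the notebook paths and cluster ID are placeholders:

          from databricks.sdk import WorkspaceClient
          from databricks.sdk.service import jobs

          w = WorkspaceClient()  # reads credentials from the environment

          # "transform" runs only after "ingest" succeeds.
          created = w.jobs.create(
              name="nightly-pipeline",
              tasks=[
                  jobs.Task(
                      task_key="ingest",
                      notebook_task=jobs.NotebookTask(notebook_path="/Repos/demo/ingest"),
                      existing_cluster_id="1234-567890-abcde123",
                  ),
                  jobs.Task(
                      task_key="transform",
                      depends_on=[jobs.TaskDependency(task_key="ingest")],
                      notebook_task=jobs.NotebookTask(notebook_path="/Repos/demo/transform"),
                      existing_cluster_id="1234-567890-abcde123",
                  ),
              ],
          )
          print(created.job_id)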

    • Data Access with Unity Catalog

      Explore Unity Catalog for data access and governance. Learn about catalogs, metastores, and best practices. Get hands-on with the Python quickstart and object security. Includes an external lab, dynamic catalog building, reflection, and a quiz.
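
      A sketch of the dynamic-catalog idea: walking catalog and schema metadata from a notebook. What it prints depends entirely on your workspace:

          # `spark` is provided by the Databricks runtime.
          for (catalog,) in spark.sql("SHOW CATALOGS").collect():
              for row in spark.sql(f"SHOW SCHEMAS IN {catalog}").collect():
                  print(catalog, row[0])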

  • Week 3

    Responsible Generative AI

    • AI Ethics of Generative Models

      Explore the ethical implications of generative AI. Learn about profit sharing, tragedy of the commons, game theory, and regulatory challenges. Examine issues of bias, negative externalities, and perfect competition.
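
      A toy numerical sketch of the tragedy-of-the-commons incentive the lesson examines; all payoff numbers are illustrative:

          # Each of N actors gains privately from over-using a shared resource,
          # while the damage is split across everyone.
          N = 10
          private_benefit = 5.0   # payoff to the actor who over-uses
          total_damage = 30.0     # damage each act of over-use inflicts overall

          shared_cost = total_damage / N
          print("individual incentive:", private_benefit - shared_cost)  # 2.0 > 0: over-use pays
          print("per-actor outcome if all over-use:",
                private_benefit - total_damage)                          # -25.0: everyone loses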

    • Evaluating Real-World Performance of LLMs

      Learn to assess Large Language Model performance in real-world scenarios. Implement Elo rating systems in Python, Rust, R, and Julia. Gain hands-on experience through coding labs.
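
      A minimal Python version of the Elo update used in these labs; the K-factor and starting ratings are conventional defaults rather than course-specified values:

          def expected_score(r_a: float, r_b: float) -> float:
              """Probability that A beats B under the Elo model."""
              return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

          def update(r_a: float, r_b: float, score_a: float, k: float = 32.0):
              """Return updated ratings; score_a is 1.0, 0.5, or 0.0."""
              e_a = expected_score(r_a, r_b)
              return r_a + k * (score_a - e_a), r_b - k * (score_a - e_a)

          # Model A (1500) beats model B (1600): A gains exactly what B loses.
          print(update(1500.0, 1600.0, 1.0))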

    • Exploring Production LLM Workflows

      Dive into practical LLM deployment using LoRAX and SkyPilot. Understand Ludwig for model fine-tuning. Apply these tools to fine-tune Mistral 7B and launch Mixtral.
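
      A sketch of what the fine-tuning setup can look like with Ludwig (0.8+); the dataset path is a placeholder and the config mirrors common public examples rather than the course’s exact recipe:

          from ludwig.api import LudwigModel

          config = {
              "model_type": "llm",
              "base_model": "mistralai/Mistral-7B-v0.1",
              "adapter": {"type": "lora"},  # parameter-efficient fine-tuning
              "input_features": [{"name": "prompt", "type": "text"}],
              "output_features": [{"name": "response", "type": "text"}],
              "trainer": {"type": "finetune", "epochs": 1},
          }

          model = LudwigModel(config=config)
          model.train(dataset="train.jsonl")  # placeholder dataset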

  • Week 4

    Local LLMOps

    • Getting Started with Local Models

      Explore local AI models with llamafile. Learn to set up and run models like Mixtral, understand system metrics, and get hands-on experience. Dive into related tools like Whisper.cpp.
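
      A sketch of querying a running llamafile from Python through its OpenAI-compatible endpoint; 8080 is llamafile’s default server port:

          import json, urllib.request

          payload = {
              "model": "local",  # llamafile accepts a placeholder model name
              "messages": [{"role": "user", "content": "Say hello in one sentence."}],
          }
          req = urllib.request.Request(
              "http://localhost:8080/v1/chat/completions",
              data=json.dumps(payload).encode(),
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(req) as resp:
              print(json.load(resp)["choices"][0]["message"]["content"])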

    • Getting Started with Rust Candle

      Lesson 2 introduces Rust Candle, covering basic implementation, StarCoder exploration, and Whisper transcription. It also delves into remote AWS development and AI security topics such as sleeper agents and data poisoning.

    • Using Rust Candle

      Explore Rust Candle for LLM applications. Learn serverless inference, CLI and chat inference, and using StarCoder. Implement Rust Candle on an AWS GPU. Includes readings, reflections, and a quiz to reinforce learning.
