There are function(s) to be completed in this section!

...

  1. What is Training?

    • Training is the process of optimizing a model to make better predictions by minimizing a loss function.

    • For your BERT-based model:

      • Intent Classification: Predict the intent of a user query (e.g., addFilter or recipeRequest).

      • Slot Filling: Assign BIO (Begin/Inside/Outside) tags to each token in the query (e.g., B-shortTimeKeyWord for "fast").

  2. How Does the Training Loop Work?

    • Forward Pass: The input data flows through the model to generate predictions.

    • Loss Calculation: The predictions are compared with ground truth labels to calculate the loss.

    • Backward Pass (Backpropagation): Gradients of the loss with respect to the model’s weights are computed, and the optimizer uses them to adjust the weights so predictions improve in the next iteration.

  3. What Are We Optimizing?

    • Intent Loss: Measures how well the model predicts the intent.

    • Slot Loss: Measures how well the model predicts slot tags for each token.

  4. Why Do We Use Two Loss Functions?

    • Your model performs two tasks simultaneously, so each task needs its own loss. The two losses are combined (typically summed) into a single training objective so that both tasks are learned in a balanced way; see the sketch after this list.
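
To make these ideas concrete, here is a minimal sketch of a single training step in PyTorch. It assumes a model that returns both intent logits and slot logits; the model interface, batch keys, and label names are illustrative assumptions, not the assignment’s actual API:

    import torch.nn as nn

    # Two loss functions, one per task (illustrative sketch).
    intent_criterion = nn.CrossEntropyLoss()
    slot_criterion = nn.CrossEntropyLoss(ignore_index=-100)  # -100 marks padding tokens to skip

    def training_step(model, optimizer, batch):
        # Forward pass: the batch flows through the model to produce predictions.
        # For "something fast", the slot labels might be [O, B-shortTimeKeyWord].
        intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])

        # Loss calculation: compare predictions against the ground-truth labels.
        intent_loss = intent_criterion(intent_logits, batch["intent_labels"])
        slot_loss = slot_criterion(
            slot_logits.view(-1, slot_logits.size(-1)),  # (batch * seq_len, num_tags)
            batch["slot_labels"].view(-1),               # (batch * seq_len,)
        )

        # Combine the two task losses into a single training objective.
        loss = intent_loss + slot_loss

        # Backward pass: compute gradients and let the optimizer update the weights.
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

Summing the two losses is the simplest combination; a weighted sum (e.g., loss = intent_loss + 0.5 * slot_loss) is a common variant when one task should carry more weight than the other.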

...

Steps to Complete the train_model Function

The train_model function is incomplete; your task is to fill in the missing pieces. This function is the backbone of the training process for a dual-task model that handles intent classification and slot filling. Through this exercise, you’ll learn how to integrate loss functions, perform forward passes, and optimize a model’s weights effectively. A rough outline of how the pieces fit together follows below.
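
As a rough orientation only (the function name matches the assignment, but every other name here is an assumption carried over from the earlier sketch), a completed train_model typically wraps that single training step in an epoch loop:

    def train_model(model, optimizer, train_loader, num_epochs):
        model.train()  # enable training behavior (e.g., dropout)
        for epoch in range(num_epochs):
            total_loss = 0.0
            for batch in train_loader:
                # Forward pass, dual loss calculation, and backward pass,
                # as in the training_step sketch above.
                total_loss += training_step(model, optimizer, batch)
            print(f"Epoch {epoch + 1}: mean loss = {total_loss / len(train_loader):.4f}")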

...

By completing this assignment, you’ll gain hands-on experience with key components of model training, preparing you for more advanced tasks. Take your time, think critically, and ask for clarification if needed!

...

Note

Reflection Questions: Check out the Questions to Consider in each section.

...