## Recommend

- Use History when adding catids

## Gather

- Maybe more dataset configurations

## Qualify

- Print stats for a dataset

## Fine-tune

- https://www.sbert.net/docs/sentence_transformer/loss_overview.html#loss-table
- Use (anchor, positive) pairs to train a new model
- Use (sentence) + class labels to train a new model
- Implement BatchAllTripletLoss
- Implement a two-phase training regime (see the sketch at the end of this section):
  1. Train with anchored definitions, then...
  2. Train with class labels

## Evaluate

- Print more information about the dataset coverage of UCS
- Allow skipping model testing for this
- Print raw output

## Utility

- Clear caches
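
---

A minimal sketch of the two-phase fine-tune regime above, using the sentence-transformers `SentenceTransformerTrainer` API. Per the linked loss table, (anchor, positive) pairs pair naturally with `MultipleNegativesRankingLoss`, and single sentences with class labels with `BatchAllTripletLoss`. The base model name, the placeholder CatIDs/definitions, the toy class labels, and the output path are assumptions for illustration only; the real data would come from the Gather/Qualify steps.

```python
# Sketch only: base model, placeholder CatIDs/definitions, toy labels, and
# the output path are all illustrative assumptions, not project decisions.
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import BatchAllTripletLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed base model

# Phase 1: (anchor, positive) pairs, e.g. a CatID anchored to its definition.
pairs = Dataset.from_dict({
    "anchor": ["AMBAir", "DSGNSynth"],  # placeholder CatIDs
    "positive": ["room tone, air movement", "synthesized design elements"],
})
SentenceTransformerTrainer(
    model=model,
    train_dataset=pairs,
    loss=MultipleNegativesRankingLoss(model),
).train()

# Phase 2: single sentences with integer class labels via BatchAllTripletLoss.
# Grouping batches by label keeps positives and negatives in every batch.
labelled = Dataset.from_dict({
    "sentence": ["wind through trees", "air conditioner hum",
                 "sci-fi riser", "synth drone swell"],
    "label": [0, 0, 1, 1],  # toy class ids
})
SentenceTransformerTrainer(
    model=model,
    args=SentenceTransformerTrainingArguments(
        output_dir="models/ucs-two-phase",  # hypothetical path
        batch_sampler=BatchSamplers.GROUP_BY_LABEL,
    ),
    train_dataset=labelled,
    loss=BatchAllTripletLoss(model),
).train()
```

Running the phases sequentially on the same `model` object is one way to realize the two-phase regime; whether phase 2 should instead reload the phase-1 checkpoint is an open design choice.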