Documentation - Home Page

Welcome to Trismik's documentation! Here you can read more about our technology and our products.

Adaptive Testing

Adaptive testing is an intelligent evaluation strategy that uses Item Response Theory (IRT) to optimize the evaluation process: it dynamically selects which questions to present to your model, significantly reducing evaluation time while maintaining statistical accuracy.

Key Features:

  • Smart Question Selection: Uses IRT to select the next item to evaluate the model on, rather than serving the whole dataset at once
  • Real-time Adaptation: Adjusts difficulty based on model responses
  • Cost Efficient: Up to 95% reduction in required questions
  • Reliable Results: Scores correlate strongly with full, non-adaptive evaluations
  • Integration Ready: Seamlessly works with Scorebook evaluations
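
The core loop behind IRT-based adaptive testing can be sketched in a few lines. The example below only illustrates the general technique (a 2PL model with maximum-information item selection and a Bayesian ability update); it is not Trismik's implementation, and the item bank and the `ask_model` stub are hypothetical placeholders.

```python
"""Minimal sketch of IRT-based adaptive item selection (2PL model).

Illustrative only: the item parameters and ask_model() are made-up placeholders.
"""
import numpy as np

# Hypothetical item bank: each item has a discrimination (a) and difficulty (b).
ITEM_BANK = [
    {"id": i, "a": a, "b": b}
    for i, (a, b) in enumerate([(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5), (1.3, -0.5)])
]

THETA_GRID = np.linspace(-4, 4, 161)  # candidate ability values


def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))


def item_information(theta, a, b):
    """Fisher information of an item at ability theta (2PL)."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)


def ask_model(item):
    """Placeholder: run the item through your model and score it (True = correct)."""
    return np.random.rand() < 0.5  # stand-in for a real inference + scoring call


def adaptive_test(item_bank, max_items=3):
    posterior = np.ones_like(THETA_GRID)  # flat prior over ability
    remaining = list(item_bank)
    for _ in range(max_items):
        theta_hat = np.sum(THETA_GRID * posterior) / np.sum(posterior)  # EAP estimate
        # Select the unadministered item that is most informative at theta_hat.
        item = max(remaining, key=lambda it: item_information(theta_hat, it["a"], it["b"]))
        remaining.remove(item)
        correct = ask_model(item)
        # Bayesian update of the ability posterior given the response.
        likelihood = p_correct(THETA_GRID, item["a"], item["b"])
        posterior *= likelihood if correct else (1.0 - likelihood)
    return np.sum(THETA_GRID * posterior) / np.sum(posterior)


print("Estimated ability:", adaptive_test(ITEM_BANK))
```

In practice the item bank is calibrated in advance, which is what lets the selector skip questions that would add little information about the model's ability.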

Learn More About Adaptive Testing →

Scorebook

A flexible and extensible framework for evaluating models. Scorebook provides clear contracts for data loading, model inference, and metrics computation, making it easy to run comprehensive evaluations across different datasets, models, hyperparameters, and metrics.

Key Features:

  • Flexible Data Loading: Built-in support for Hugging Face datasets, CSV, JSON, and Python lists
  • Model Agnostic: Works with any model or inference provider
  • Extensible Metric Engine: Use built-in metrics or implement your own
  • Automated Sweeping: Run evaluations across combinations of model hyperparameter configurations
  • Rich Results: Export results to JSON, CSV, and automatically upload results to our dashboard
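
To make those contracts concrete, here is a minimal sketch of how a data loader, an inference function, and a metric can be composed into one evaluation loop. The function names (`load_items`, `run_inference`, `exact_match`, `evaluate`) are hypothetical illustrations of the pattern, not Scorebook's actual API.

```python
"""Illustrative sketch of separating data loading, inference, and metrics.

Hypothetical example only; it does not reflect Scorebook's real interfaces.
"""
from typing import Callable, Iterable


def load_items() -> Iterable[dict]:
    """Data-loading contract: yield items with an input and a reference answer."""
    return [
        {"input": "2 + 2 = ?", "reference": "4"},
        {"input": "Capital of France?", "reference": "Paris"},
    ]


def run_inference(item: dict, temperature: float = 0.0) -> str:
    """Inference contract: map one item (plus hyperparameters) to a model output.
    Replace this stub with a call to your model or inference provider."""
    return "4" if "2 + 2" in item["input"] else "Paris"


def exact_match(prediction: str, reference: str) -> float:
    """Metric contract: score a single prediction against its reference."""
    return float(prediction.strip() == reference.strip())


def evaluate(loader: Callable, infer: Callable, metric: Callable, **hyperparams) -> dict:
    """Run the loop and aggregate scores; a sweep is just one call per configuration."""
    scores = [metric(infer(item, **hyperparams), item["reference"]) for item in loader()]
    return {"n": len(scores), "mean_score": sum(scores) / len(scores), "hyperparams": hyperparams}


print(evaluate(load_items, run_inference, exact_match, temperature=0.0))
```

Because each piece is a plain function, swapping in a different dataset, model, or metric only changes one argument to the evaluation loop.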

Get Started with Scorebook →