Generate and Evaluate Predictions using LLMs & ML models

New: Securely & easily connect models to Label Studio Enterprise

Connecting models to generate predictions in Label Studio Enterprise has never been easier thanks to a significant update to the workflow and UI, and the addition of basic auth. Whether you want to automate labeling or evaluate models, the new ML backend connection enables you to accelerate projects with your choice of models while maintaining full control, customizability, and compliance.

What’s possible with GenAI & ML integrations

Automate labeling predictions across any dataset.

This works similarly to loading a set of existing predictions into Label Studio, except in this case, the labeling task is sent to the model as context and the predictions returned by your model are applied right inside of Label Studio. Annotators can then review and accept predictions rather than tediously entering their own.
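
To make that flow concrete, here is a minimal sketch of a custom connector built with the label-studio-ml SDK. The predict signature and return type vary somewhat between SDK versions, and the stand-in sentiment "model", the data key, and the tag names are placeholders that must match your own labeling configuration.

```python
from label_studio_ml.model import LabelStudioMLBase


def fake_sentiment_model(text):
    """Stand-in for your real model call; replace with your own inference code."""
    return "Positive" if "good" in text.lower() else "Negative"


class SentimentBackend(LabelStudioMLBase):
    """Return a 'choices' prediction for every task Label Studio sends over."""

    def predict(self, tasks, **kwargs):
        predictions = []
        for task in tasks:
            text = task["data"]["text"]          # data key defined by your labeling config
            label = fake_sentiment_model(text)
            predictions.append({
                "result": [{
                    "from_name": "sentiment",    # must match the <Choices> name in your config
                    "to_name": "text",           # must match the <Text> name in your config
                    "type": "choices",
                    "value": {"choices": [label]},
                }],
                "score": 0.9,                    # optional confidence shown in the UI
            })
        return predictions
```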

Speed up labor-intensive data labeling with AI assistance and magic tools.

For example, an image annotation task may require drawing bounding boxes around building signage and then inputting the actual text of each sign. An OCR model can accelerate this task by auto-populating the text of the sign.
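
For a sense of what the returned prediction looks like, the sketch below pairs a bounding-box region with a per-region transcription, linked by a shared region id. The tag names ("bbox", "transcription", "image"), coordinates (percentages of the image size), and text are illustrative and must match your own labeling configuration.

```python
# One OCR prediction as an ML backend might return it to Label Studio.
ocr_prediction = {
    "result": [
        {
            "id": "sign_1",                      # shared id links the box and its text
            "from_name": "bbox",
            "to_name": "image",
            "type": "rectangle",
            "value": {"x": 12.5, "y": 34.0, "width": 20.0, "height": 8.0, "rotation": 0},
        },
        {
            "id": "sign_1",
            "from_name": "transcription",
            "to_name": "image",
            "type": "textarea",
            "value": {
                "x": 12.5, "y": 34.0, "width": 20.0, "height": 8.0, "rotation": 0,
                "text": ["COFFEE & BAKERY"],      # pre-filled text the annotator can correct
            },
        },
    ],
    "score": 0.87,
}
```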

For a document summarization task, an annotator can now easily and interactively prompt an LLM to assist with crafting the summary, saving the successful prompt to be used for the next labeling task.
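
As a rough sketch of how an LLM-backed connector could draft those summaries (not the exact interactive prompting feature in the product), the example below calls the OpenAI API from a custom backend. The model name, system prompt, data key, and tag names are assumptions to adapt to your own setup.

```python
from label_studio_ml.model import LabelStudioMLBase
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


class SummaryBackend(LabelStudioMLBase):
    """Draft a document summary with an LLM so the annotator only has to refine it."""

    # A reusable system prompt; in practice this is the part annotators iterate on.
    SYSTEM_PROMPT = "Summarize the document in 3 concise sentences for a general audience."

    def predict(self, tasks, **kwargs):
        predictions = []
        for task in tasks:
            document = task["data"]["text"]      # data key defined by your labeling config
            response = client.chat.completions.create(
                model="gpt-4o-mini",             # any chat model your account can access
                messages=[
                    {"role": "system", "content": self.SYSTEM_PROMPT},
                    {"role": "user", "content": document},
                ],
            )
            summary = response.choices[0].message.content
            predictions.append({
                "result": [{
                    "from_name": "summary",      # must match the <TextArea> name in your config
                    "to_name": "text",           # must match the <Text> name in your config
                    "type": "textarea",
                    "value": {"text": [summary]},
                }],
            })
        return predictions
```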

Evaluate model performance.

Once a dataset is annotated, you can evaluate your model's performance by comparing its predictions against the ground truth labels. Model evaluation lets you test models before production use, identify challenging edge cases, and discover and respond to data drift once a model is deployed.
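
One simple way to run such a comparison is against a project export. The sketch below assumes a standard Label Studio JSON export from a single-choice classification project where each task carries at least one annotation and one prediction; the file name is hypothetical.

```python
import json

# Compare model predictions against ground-truth annotations in a JSON export.
with open("project_export.json") as f:
    tasks = json.load(f)


def first_choice(results):
    """Pull the first 'choices' label out of a result list, if any."""
    for region in results:
        if region.get("type") == "choices":
            return region["value"]["choices"][0]
    return None


matches = total = 0
for task in tasks:
    gold = first_choice(task["annotations"][0]["result"])
    pred = first_choice(task["predictions"][0]["result"])
    if gold is not None and pred is not None:
        total += 1
        matches += int(gold == pred)

if total:
    print(f"Prediction vs. ground truth agreement: {matches}/{total} ({matches / total:.1%})")
```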

These new capabilities in Label Studio Enterprise make it easier to automate and speed up the labeling workflow within the already familiar annotation UI. Annotators will see AI models instantly offer predictions in the UI that they can easily review, refine, and accept. They’ll also have the tools to evaluate results, refine model prompts, and apply automated predictions across datasets much larger than what can be annotated by hand.

Automation that’s ready for the Enterprise

Enhanced productivity for your team is appealing, but it raises important questions: How do you control which model is used, or bring your own custom models? How can you guarantee the safety and compliance of your data?

The good news is we built AI automation for Label Studio with demanding enterprise requirements in mind.

Teams have full control to select any available model to power these automations—from open source, from a commercial provider, or completely custom. In fact, after training or fine-tuning your own model, your team can now use the same automation workflows to evaluate the model's performance and compare its outputs to existing human annotations.

Enhancements to the ML backend integration in Label Studio Enterprise include:

  • Security. We’ve added support for basic authentication, meaning you can now connect models that require a password or authentication key (see the connection sketch below).
  • Streamlined integration workflow. We’ve simplified the configuration options and improved the UI to make integrating models easier than ever.
  • Support for more models. A refreshed examples library includes support for a variety of models, including connectors to OpenAI, Azure OpenAI, various Hugging Face models, LangChain, Segment Anything, GroundingDINO, YOLO, Tesseract, and more.

Most integrations require very little configuration beyond entering an authentication key. And as with almost every aspect of Label Studio, your team can customize each part of the model integration, adjusting data formats and system prompts to improve accuracy and generate the precise predictions required. A more technical overview covering how models integrate with Label Studio can be found in the documentation.
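
To tie the security and configuration points together, here is a rough sketch of registering a model connection through the Label Studio API with basic auth credentials. Most teams will simply use the connection dialog in the UI; the endpoint and field names below follow the open source ML backend API and should be treated as assumptions, so check the API reference for your Label Studio Enterprise version.

```python
import requests

LABEL_STUDIO_URL = "https://app.humansignal.com"   # or your own Label Studio Enterprise URL
API_TOKEN = "your-label-studio-api-token"

# Field names are assumptions based on the open source ML backend API; verify them
# against the API reference for your version before relying on this in automation.
payload = {
    "project": 42,                                  # hypothetical project id
    "title": "my-ocr-backend",
    "url": "https://models.internal.example.com",   # model server that requires basic auth
    "auth_method": "BASIC_AUTH",
    "basic_auth_user": "ls-service-account",
    "basic_auth_pass": "********",
}

resp = requests.post(
    f"{LABEL_STUDIO_URL}/api/ml",
    headers={"Authorization": f"Token {API_TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("Connected ML backend:", resp.json().get("id"))
```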

Label Studio Enterprise runs on fully managed cloud infrastructure offering HIPAA compliance and SOC 2 certification, or on-premises. And even as you make use of automation, your data in transit between the interface and your models never touches our servers.

When it becomes faster to annotate new data and to evaluate model output, training loops run much more efficiently. You can finally have ML workflows that keep pace with the incredible volumes of data seen at scale.

Which workflows will you automate?

If you are looking to bring ML projects to market faster, the automation capabilities in Label Studio will be a powerful asset for your team and projects. We'd love to offer guidance to help you select models, stand up an integration, and rapidly automate workflows for your entire team.
