Labelbox Testimonials

  • “I found Labelbox and had a conversation with Brian. I was able to ask about the features I needed and about enterprise-level support.”

  • "This is, hands down, the best part of my experience with Labelbox so far, were highly engaged, and the quality of the annotations was amazing."

  • “Labelbox is so easy to use. The documentation is accessible, and the labeling pipeline is straightforward. We just had to upload our data, customize the editor to our exact requirements, and go. We had actually budgeted a week to get it set up, but we were done in a day.”

  • “Using Labelbox was fun for my team. It became a bit of a game to see how much could be labeled and how our data affected the performance of the models. I’m very excited to see the roadmap of how Labelbox will grow. Teams of experts like mine will be working with this tool on the front line of machine learning technology, and interfacing with the experienced team at Labelbox will make great things possible.”

  • "Labelbox has become the foundation of our training data infrastructure. Our data science teams create high quality labeled training data with our internal domain experts as well as external labeling services, all inside Labelbox. And, the support is exceptional!"

  • “We’re pleased with how Labelbox can scale with the growth of our team or expansion of our machine learning initiatives.”

  • “Field management practices and problems impacting the field vary significantly per country, field, climatic zone and field zone. Regional experts are vital to achieving high quality in annotated data.”

  • “Labelbox gave us a lot of versatility by creating thoughtful ontologies with a streamlined interface. It was a critical feature and enabled us to produce high-quality labeled data with minimal errors and inconsistencies. In addition, we really liked the infinite flexibility of how classifications are set up and the product was intuitive for our pathologists to use.”

  • "Core to the enablement of AI based machine learning algorithms for our Intelligence Community and National Security partners is the need to accurately and cost-effectively label vast amounts of training data. Labelbox, offers our partners a state-of-the-art data annotation and data labeling platform for our IC partners to quickly and cost effectively label their AI training data."

  • “Our experts found these straightforward to use and were able to annotate data very quickly to a high standard.”

  • “What Labelbox has massively helped us with is fast iterations on our algorithms, helping us move twice as fast in this domain. The results we received from it are magnificent and their labeling user interface is the best we’ve seen for supporting our annotation efforts.”

  • “Using your platform and Workforce service has been very easy and effortless. The quality of the labels was impressive, especially the care taken to make sure the boundaries of each zone were traced out as accurately as possible. This has really improved the results of our neural network beyond what we could have ever achieved.”

  • “We needed a machine learning pipeline solution and Labelbox was it!”

  • “With the streamlined design of Labelbox, we are able to cut costs on labeling by as much as 50% while maintaining the highest quality in our training data, and get to training our models faster. With human-in-the-loop model-assisted labeling, we expect another huge reduction in the time and cost of the labeling process. After a preliminary model is trained, we can run a loop to generate labels from our model’s inference and feed those back into Labelbox, effectively cutting the labeling load of our labelers to that of reviewing for false positives. That allows us to increase our capabilities and model accuracies exponentially with respect to time for the number of components and defects we can detect and classify.”

  • “Before using the Valohai and Labelbox platforms, we struggled with managing our training data creation infrastructure and manual experiment tracking. We’re now able to concentrate on model building and deployment without extra engineering effort, and are able to speed up model training by over 10X.”