62 Neptune.ai Testimonials

  • “I just had a look at neptune logger after a year and to be honest, I am very impressed with the improvements in UI! Earlier, it was a bit hard to compare experiments with charts. I am excited to try this!”

  • “I tested multiple loggers with pytorch-lightning integrations and found neptune to be the best fit for my needs. Friendly UI, ease of use, and great documentation.”

  • “What we like about Neptune is that it easily hooks into multiple frameworks. Keeping track of machine learning experiments systematically over time and visualizing the output adds a lot of value for us.”

  • "Our ML teams at Waabi continuously run large-scale experiments with ML models. A significant challenge we faced was keeping track of the data they collected from experiments and exporting it in an organized and shareable way."

  • "We evaluated several commercial and open-source solutions. We looked at the features for tracking experiments, the ability to share, the quality of the documentation, and the willingness to add new features. Neptune was the best choice for our use cases."

  • "Clearly, handling the training of more than 7000 separate machine learning models without any specialized tool is practically impossible. We definitely needed a framework able to group and manage the experiments."

  • "As our company has grown from a startup to a sizeable organization of 200 people, robust security and effective user management have become increasingly evident and vital."

  • “Neptune allows us to keep all of our experiments organized in a single space. Being able to see my team’s work results any time I need makes it effortless to track progress and enables easier coordination.”

  • “I had been thinking about systems to track model metadata and it occurred to me I should look for existing solutions before building anything myself. Neptune is definitely satisfying the need to standardize and simplify tracking of experimentation and associated metadata. My favorite feature so far is probably the live tracking of performance metrics, which is helpful to understand and troubleshoot model learning. I also find the web interface to be lightweight, flexible, and intuitive.”

  • “Indeed it was a game-changer for me, as you know AI training workloads are lengthy in nature, sometimes also prone to hanging in colab environment, and just to be able to launch a set of tests trying different hyperparameters with the assurance that the experiment will be correctly recorded in terms of results and hyper-parameters was big for me.”

  • "No more DevOps needed for logging. No more starting VMs just to look at some old logs. No more moving data around to compare TensorBoards."

  • "I used Weights & Biases before Neptune. It’s impressive at the beginning, it works out of the box, and the UI is quite nice. But during the four years I used it, it didn’t improve; they didn’t fully develop the features they were working on. So I appreciate that Neptune has been noticeably improved during the whole time I’ve been using it."

  • “I have been pleasantly surprised with how easy it was to set up Neptune in my PyTorch Lightning projects.”

  • "We primarily use Neptune for training monitoring, particularly for loss tracking, which is crucial to decide whether to stop training if it’s not converging properly. It’s also invaluable for comparing experiments and presenting key insights through an intuitive dashboard to our managers and product owners."