62 Neptune.ai Testimonials

  • "We have a mantra: always be learning. We apply this primarily to our model, which means we’re always running experiments. So me, our CEO, other people in the team—we’re constantly checking the monitoring tool. It has to be nice, smooth, and be able to handle our training data streams consistently."

  • "Clearly, handling the training of more than 7000 separate machine learning models without any specialized tool is practically impossible. We definitely needed a framework able to group and manage the experiments."

  • "When I joined this company, we were running quite a lot of different experiments and it’s really hard to keep track of them all, so I needed something to view the results, or sometimes the intermediate results, of some experiments: what [does] the data frame look like? What [does] the CSV look like? Is it reasonable? Did something go wrong in the process that led to an undesirable result? At first we were doing it manually, just writing some log values to a log server like Splunk."

  • "As our company has grown from a startup to a sizeable organization of 200 people, the need for robust security and effective user management has become increasingly evident and vital."

  • "Building something like a power line is a huge project, so you have to get the design right before you start. The more reasonable designs you see, the better decision you can make. Optioneer can get you design assets in minutes at a fraction of the cost of traditional design methods."

  • “I had been thinking about systems to track model metadata and it occurred to me I should look for existing solutions before building anything myself. Neptune is definitely satisfying the need to standardize and simplify tracking of experimentation and associated metadata. My favorite feature so far is probably the live tracking of performance metrics, which is helpful to understand and troubleshoot model learning. I also find the web interface to be lightweight, flexible, and intuitive.”

  • "No more DevOps needed for logging. No more starting VMs just to look at some old logs. No more moving data around to compare TensorBoards."

  • "I used Weights & Biases before Neptune. It’s impressive at the beginning, it works out of the box, and the UI is quite nice. But during the four years I used it, it didn’t improve; they didn’t fully develop the features they were working on. So I appreciate that Neptune has noticeably improved during the whole time I’ve been using it."

  • “The problem with training models on remote clusters is that every time you want to see what is going on, you need to get your FTP client up, download the logs to a machine with a graphical interface, and plot it. I tried using TensorBoard but it was painful to set up in my situation. With Neptune, seeing training progress was as simple as hitting refresh. The feedback loop between changing the code and seeing whether anything changed is just so much shorter. Much more fun and I get to focus on what I want to do. I really wish that it existed 10 years ago when I was doing my PhD.”

  • "MLflow requires what I like to call software kung fu, because you need to host it yourself. So you have to manage the entire infrastructure — sometimes it’s good, oftentimes it’s not."

  • "Versioning Jupyter notebooks is a great and unique feature."

  • “Within the first few tens of runs, I realized how complete the tracking was – not just one or two numbers, but also the exact state of the code, the best-quality model snapshot stored to the cloud, the ability to quickly add notes on a particular experiment. My old methods were such a mess by comparison.”

  • “I used to keep track of my models with folders on my machine and use naming conventions to save the parameters and model architecture. Whenever I wanted to track something new about the model, I would have to update the naming structure. It was painful. There was a lot of manual work involved. Now everything happens automatically. I can compare models in the online interface that looks great. It saves me a lot of time, and I can focus on my research instead of keeping track of everything manually.”

  • “Neptune allows us to keep all of our experiments organized in a single space. Being able to see my team’s work results any time I need makes it effortless to track progress and enables easier coordination.”

  • "We’ve got a few teams across different countries and different time zones, and prior to Neptune we were just shipping each other zips of TensorBoard logs. Being able to see it all in one space, all logged to a central area, is really great and has helped us compare our results a lot faster and a lot more efficiently."