Neptune.ai Testimonials

  • “Within the first few tens of runs, I realized how complete the tracking was – not just one or two numbers, but also the exact state of the code, the best-quality model snapshot stored to the cloud, the ability to quickly add notes on a particular experiment. My old methods were such a mess by comparison.”

  • “The problem with training models on remote clusters is that every time you want to see what is going on, you need to get your FTP client up, download the logs to a machine with a graphical interface, and plot it. I tried using TensorBoard but it was painful to set up in my situation. With Neptune, seeing training progress was as simple as hitting refresh. The feedback loop between changing the code and seeing whether anything changed is just so much shorter. Much more fun and I get to focus on what I want to do. I really wish that it existed 10 years ago when I was doing my PhD.”

  • “What we like about Neptune is that it easily hooks into multiple frameworks. Keeping track of machine learning experiments systematically over time and visualizing the output adds a lot of value for us.”

  • "When I joined this company, we were doing quite many different experiments and it’s really hard to keep track of them all so I needed something to just view the result or sometimes or also it’s intermediate results of some experiments like what [does] the data frame look like? What [does] the CSV look like? Is it reasonable? Is there something that went wrong between the process that resulted in an undesirable result? So we were doing it manually first but just writing some log value to some log server like a Splunk."

  • “I just had a look at neptune logger after a year and to be honest, I am very impressed with the improvements in UI! Earlier, it was a bit hard to compare experiments with charts. I am excited to try this!”

  • “I used to keep track of my models with folders on my machine and use naming conventions to save the parameters and model architecture. Whenever I wanted to track something new about the model, I would have to update the naming structure. It was painful. There was a lot of manual work involved. Now everything happens automatically. I can compare models in the online interface that looks great. It saves me a lot of time, and I can focus on my research instead of keeping track of everything manually.”

  • "MLflow requires what I like to call software kung fu, because you need to host it yourself. So you have to manage the entire infrastructure — sometimes it’s good, oftentimes it’s not."

  • "One of the biggest challenges [we had] was managing the pipelines and the process itself because we had 40 to 50 different pipelines. Depending on the exact use case or what kind of data we’d like to output, we could have different combinations for running them to get different outputs. So basically, the entire system isn’t so simple."

  • "I like the dashboards because we need several metrics, so you code the dashboard once, have those styles, and easily see them on one screen. Then, any other person can view the same thing, so that’s pretty nice."

  • “I didn’t expect this level of support.”

  • "So I would say the main argument for using Neptune is that you can be sure that nothing gets lost, everything is transparent, and I can always go back in history and compare."

  • "Neptune works flawlessly, and integrating it with PyTorch Lightning was very smooth."

  • “Indeed it was a game-changer for me. As you know, AI training workloads are lengthy in nature, and sometimes prone to hanging in the Colab environment, so just being able to launch a set of tests trying different hyperparameters, with the assurance that the experiment will be correctly recorded in terms of results and hyperparameters, was big for me.”

  • “The last few hours have been my first w/ Neptune and I’m really appreciative of how much time it’s saved me not having to fiddle w/ matplotlib in addition to everything else.”

  • "We initially aimed for a GKE deployment for our experiment tracking tool. However, the other solution we explored had a rigid installation process and limited support, making it unsuitable for our needs. Thankfully, Neptune’s on-premise installation offered the flexibility and adjustability we required. The process was well-prepared, and their engineers were incredibly helpful, answering all our questions and even guiding us through a simpler deployment approach. Neptune’s on-prem solution and supportive team saved the day, making it a win for us."