January '20 Newsletter

AI-Native Software Infrastructure

Determined AI provides an intuitive deep learning training platform so that you can focus on models rather than infrastructure. We tightly integrate hyperparameter tuning and distributed training so that you can experiment efficiently and iterate rapidly. The platform is designed to help teams of researchers scalably share cloud or on-premises GPU resources, and your workflow is automatically tracked in our metadata repository so that you can reproduce your models and collaborate with your colleagues.

3-minute demo of Determined AI

  • Forbes: 2020 predictions for AI, ML, and DL - Our CEO, Evan Sparks, contributed this article to Forbes. In it, he outlines his predictions on what will shape the machine learning industry landscape this year.

  • Machine Learning is hard. A recent paper from Luke Oakden-Rayner and Jared Dunnmon indicates that some results applying deep learning to radiology are in question because of a subtle data leakage problem. Ben Recht summarizes in a tweet.

  • RAdam is an alternative adaptive optimizer that automatically warms up the learning rate by rectifying its variance in early training steps. In some early internal tests, we’ve seen nice performance on large distributed vision problems! (See the sketch after this list.)

  • Wei Hu and Simon Du summarize some of their recent work on Neural Tangent Kernels, an interesting alternative to neural networks, on a standard benchmark task. Their results are promising but still fall short of the performance of deep networks.

  • The GLUE NLP benchmark has been largely saturated by recent advances in language models (BERT, ELMo, etc.). In response, SuperGLUE offers a new, harder set of benchmarks.

  • Adrian Colyer summarizes recent work from Cynthia Rudin over on The Morning Paper. While we definitely agree that simpler, explainable models are better, the best performing models in NLP, computer vision, speech, and an increasing number of other areas are deep learning models, and therefore black boxes. More generally, there are a lot of reasons not to like deep learning (non-convexity, ad-hoc architectures, massive compute requirements), but people put up with it because, right now, no other class of models is competitive in these domains.

  • Escaping the Dark Ages of AI Infrastructure - Evan Sparks talks about the practical complexities of operationalizing AI at scale and touches on some fundamental policy questions from data privacy to job replacement.

  • A recent Google Cloud OnAir interview explains how our GCP customers are leveraging Determined AI to get the most out of their cloud deep learning spend.
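For the curious, here is a minimal sketch of the RAdam update rule, written by us in numpy with our own variable names, following the rectification formula from the RAdam paper (Liu et al., 2019). The idea: while the variance of the adaptive learning rate is still intractable in early steps, fall back to plain momentum, which acts as an automatic warmup.

    import numpy as np

    def radam_step(theta, grad, m, v, t, lr=1e-3,
                   beta1=0.9, beta2=0.999, eps=1e-8):
        """One RAdam update for parameters theta; t counts from 1."""
        # Standard Adam moment estimates.
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad ** 2
        m_hat = m / (1 - beta1 ** t)

        # Length of the approximated simple moving average (SMA).
        rho_inf = 2.0 / (1.0 - beta2) - 1.0
        rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)

        if rho_t > 4.0:
            # Variance is tractable: apply the rectified adaptive update.
            v_hat = np.sqrt(v / (1 - beta2 ** t))
            r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                          ((rho_inf - 4) * (rho_inf - 2) * rho_t))
            theta = theta - lr * r_t * m_hat / (v_hat + eps)
        else:
            # Early steps: un-adapted momentum update (implicit warmup).
            theta = theta - lr * m_hat
        return theta, m, v

In practice each parameter tensor carries its own m and v (initialized to zero), and the rectification term r_t smoothly approaches 1 as training progresses.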

Watch your inbox for future news and updates. You can also keep up with all our latest activities on Twitter or LinkedIn. If you have a question or comment, or would like to start a conversation with us, please reply to this email or contact us on our website.

Try Determined AI for free! Request sandbox access today.
