Fast.ai launches fastai v1, a deep learning library for PyTorch

Fast.ai today announced the full 1.0 release of fastai, a free, open-source deep learning library that runs on top of Facebook's PyTorch framework.

The project has been under development for 18 months, and its release arrives the same day as PyTorch 1.0, which includes deeper integration with Caffe2 and ONNX, a series of integrations with cloud providers like Google Cloud and Azure Machine Learning, and partnerships with hardware providers like Intel and Qualcomm.

“Fastai is the first deep learning library to provide a single consistent interface to all the most commonly used deep learning applications for vision, text, tabular data, time series, and collaborative filtering. This is important for practitioners, because it means if you've learnt to create computer vision models with fastai, then you can use the same approach to create natural language processing (NLP) models, or any of the other types of model we support,” Fast.ai cofounder Jeremy Howard said in a Medium post today.

Fastai supports transfer learning and is designed to serve researchers and developers alike. It also incorporates recent advances from the Fast.ai team that allowed them to train ImageNet in less than 30 minutes.

The first version of fastai was released in September 2017 and has since been used to do things like carry out transfer learning with computer vision, execute art projects, and create Clara, a neural net made by an OpenAI research fellow that generates music.

Fastai v1 can work with preinstalled datasets on Google Cloud; it also works with AWS SageMaker and with preconfigured environments in the AWS Deep Learning AMIs.

Fastai is free and available via GitHub, conda, and pip, with broader AWS support coming soon.
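For readers who want to try it, installation at the time of the v1 release looked roughly like the following; the package and channel names reflect the fastai project's published instructions, and should be verified against the current install docs:

```shell
# Install fastai v1 via conda, pulling PyTorch from its own channel as a dependency
conda install -c pytorch -c fastai fastai

# Or install from PyPI with pip
pip install fastai
```

Either route installs the library and its PyTorch dependency; a CUDA-capable GPU is recommended for training but not required to install.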

Fast.ai seeks to democratize access to deep learning with tutorials, tools, and state-of-the-art AI models. More than 200,000 people have taken Fast.ai's seven-week course, Practical Deep Learning for Coders.
