Some of the most important Python libraries used for AI and machine learning
Almost every data manipulation, analysis, and numerical computation task is handled by libraries in this stack: NumPy is the go-to library for mathematical computation, Pandas for data analysis, IPython for an interactive console, and Matplotlib for data visualization.
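A minimal sketch of the stack working together: NumPy arrays feeding a Pandas DataFrame. The column names are illustrative only.

```python
import numpy as np
import pandas as pd

values = np.arange(6).reshape(3, 2)           # 3x2 array: [[0, 1], [2, 3], [4, 5]]
df = pd.DataFrame(values, columns=["a", "b"])

print(df["a"].mean())    # column-wise statistics come for free
print(df.describe())     # quick summary of every numeric column
```

From here, `df.plot()` would hand the data straight to Matplotlib.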
TensorFlow, Google's open-source machine learning framework, is a high-performance library and one of the most widely used tools for building neural networks.
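A minimal sketch of defining a small network with TensorFlow's Keras API; the layer sizes here are arbitrary, chosen purely for illustration.

```python
import tensorflow as tf

# A tiny two-layer classifier: 4 input features, 3 output classes.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

Calling `model.fit(X, y)` on real data would then train it end to end.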
PyTorch is probably TensorFlow's biggest rival: another deep learning library, and a blazingly fast one. To quote their website directly: “PyTorch is not a Python binding into a monolithic C++ framework. It is built to be deeply integrated into Python. You can use it naturally like you would use numpy / scipy / scikit-learn etc. You can write your new neural network layers in Python itself, using your favorite libraries and use packages such as Cython and Numba.”
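A minimal sketch of what the quote describes: PyTorch's NumPy-like tensor API, plus a toy custom layer written in plain Python (the `Scale` layer is invented here for illustration).

```python
import torch
import torch.nn as nn

class Scale(nn.Module):
    """Toy layer that multiplies its input by a learnable scalar."""
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return self.weight * x

layer = Scale()
x = torch.arange(3.0)      # tensor([0., 1., 2.]) - feels much like numpy
y = layer(x)
y.sum().backward()         # autograd computes gradients automatically
print(layer.weight.grad)   # d/dw sum(w*x) = sum(x) = 3
```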
Scikit-learn is the go-to library for machine learning; its primary focus is an accessible, simple (in a word, pythonic) interface. Training a random forest model is as simple as this: RandomForestClassifier().fit(X, y)
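A minimal sketch of that estimator API in context, using the iris dataset bundled with scikit-learn so the example is self-contained.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

print(clf.predict(X[:2]))   # predicted classes for the first two samples
print(clf.score(X, y))      # accuracy on the training data
```

Every estimator in the library follows the same fit/predict pattern, which is exactly what makes it so approachable.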
Interactive visualizations for humans? Plotly has you covered. And it gets better: Cufflinks hooks Plotly up directly to Pandas DataFrames.
Want professionally styled graphs with as few lines of code as possible? Want to customize them freely using your much-beloved Matplotlib? Seaborn does just that, and it also works seamlessly with Pandas DataFrames.
With computer vision "taking over" machine learning, OpenCV has become one of the most popular libraries in the field. It represents images as NumPy arrays, so it fits naturally into the rest of the stack.
Yet another deep learning library, but with an interesting distinction: it is probably the easiest way to create models for the Intel Movidius Neural Compute Stick. Take a look at both versions and take advantage of this highly flexible framework.
Python can be very slow compared to C++ and other compiled languages, but Numba is one way to fix that. With a simple decorator, some type annotations, and perhaps a few syntactic adjustments, your Python code is just-in-time compiled to optimized machine instructions. Remember to thank the LLVM compiler!
This isn't a unified stack as in SciPy's case, but it is still a major player in NLP. First is NLTK, an extensive library with equally extensive abilities. Next comes gensim, whose motto is “topic modelling for humans”: a similarly rich API with excellent documentation. And nobody could forget spaCy, with its focus on memory-efficient solutions; as its developers say, it was “written from the ground up in carefully memory-managed Cython.”