Data Science and AI guides for PhD students

Here follow a few guides for those who need to learn AI. The guides are provided free of charge by researchers at Chalmers & GU and are intended to support PhD students.

Further Machine Learning Resources

This chapter has been a quick tour of machine learning in Python, primarily using the tools within the Scikit-Learn library. As long as the chapter

Read »

Application: A Face Detection Pipeline

This chapter has explored a number of the central concepts and algorithms of machine learning. But moving from these concepts to real-world application can be

Read »

In-Depth: Kernel Density Estimation

In the previous section we covered Gaussian mixture models (GMM), which are a kind of hybrid between a clustering estimator and a density estimator. Recall

Read »

In Depth: Gaussian Mixture Models

The k-means clustering model explored in the previous section is simple and relatively easy to understand, but its simplicity leads to practical challenges in its application.

Read »

In-Depth: Manifold Learning

We have seen how principal component analysis (PCA) can be used in the dimensionality reduction task—reducing the number of features of a dataset while maintaining

Read »

In Depth: Principal Component Analysis

Up until now, we have been looking in depth at supervised learning estimators: those estimators that predict labels based on labeled training data. Here we

Read »

In-Depth: Decision Trees and Random Forests

Previously we have looked in depth at a simple generative classifier (naive Bayes; see In Depth: Naive Bayes Classification) and a powerful discriminative classifier (support vector

Read »

In-Depth: Support Vector Machines

Support vector machines (SVMs) are a particularly powerful and flexible class of supervised algorithms for both classification and regression. In this section, we will develop

Read »

In Depth: Linear Regression

Just as naive Bayes (discussed earlier in In Depth: Naive Bayes Classification) is a good starting point for classification tasks, linear regression models are a good

Read »

In Depth: Naive Bayes Classification

The previous four sections have given a general overview of the concepts of machine learning. In this section and the ones that follow, we will

Read »

Feature Engineering

The previous sections outline the fundamental ideas of machine learning, but all of the examples assume that you have numerical data in a tidy, [n_samples, n_features] format.

Read »

Hyperparameters and Model Validation

In the previous section, we saw the basic recipe for applying a supervised machine learning model: Choose a class of model Choose model hyperparameters Fit

Read »
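The recipe this excerpt lists (choose a class of model, choose model hyperparameters, fit) can be sketched in a few lines of Scikit-Learn; the dataset and hyperparameter values below are illustrative choices, not ones taken from the guide itself:

```python
# A minimal sketch of the supervised-learning recipe: choose a model class,
# choose hyperparameters, fit to training data, then evaluate on held-out data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load an example dataset and hold out a validation split
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = KNeighborsClassifier(n_neighbors=3)  # steps 1-2: model class + hyperparameters
model.fit(X_train, y_train)                  # step 3: fit to training data
accuracy = model.score(X_test, y_test)       # validate on data the model has not seen
print(f"held-out accuracy: {accuracy:.2f}")
```

Scoring on a held-out split rather than the training data is the point of the chapter: training-set accuracy alone cannot reveal overfitting.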

Introducing Scikit-Learn

There are several Python libraries which provide solid implementations of a range of machine learning algorithms. One of the best known is Scikit-Learn, a package that

Read »

What Is Machine Learning?

Before we take a look at the details of various machine learning methods, let’s start by looking at what machine learning is, and what it

Read »

Further Resources

Matplotlib Resources A single chapter in a book can never hope to cover all the available features and plot types available in Matplotlib. As with

Read »

Visualization with Seaborn

Matplotlib has proven to be an incredibly useful and popular visualization tool, but even avid users will admit it often leaves much to be desired.

Read »

Three-Dimensional Plotting in Matplotlib

Matplotlib was initially designed with only two-dimensional plotting in mind. Around the time of the 1.0 release, some three-dimensional plotting utilities were built on top

Read »

Customizing Matplotlib: Configurations and Stylesheets

Matplotlib’s default plot settings are often the subject of complaint among its users. While much is slated to change in the 2.0 Matplotlib release in

Read »