Day-18 Logistic Regression in PyTorch

  • Yesterday we went through Linear Regression, so today let's pay off our debt XDD
  • Remember the ugly Logistic Regression Python code from Day-09 QQ? Today we'll solve the same problem with PyTorch, and see how we should go about rewriting it in PyTorch~

As usual, on to the code~

  • Building Logistic Regression in PyTorch
# 1) model build
# the sigmoid function is handed off to torch.sigmoid
class LogisticRegression(nn.Module):

    def __init__(self, input_dim):
        super(LogisticRegression, self).__init__()

        # define layers
        self.linear = nn.Linear(input_dim, 1)

    def forward(self, x):
        y_predicted = torch.sigmoid(self.linear(x))

        return y_predicted

model = LogisticRegression(n_features)
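As the comment above notes, the sigmoid is handed off to `torch.sigmoid`. A minimal torch-free sketch of what it computes element-wise (using only the standard `math` library):

```python
import math

# What torch.sigmoid computes for each element:
# sigma(z) = 1 / (1 + e^(-z)), squashing any real number into (0, 1)
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0.0))  # 0.5: a linear output of 0 maps to probability 0.5
```

This is why the model's output can be read as a probability: large positive linear outputs approach 1, large negative ones approach 0.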
  • The Cross Entropy loss function is likewise handed to PyTorch's Binary Cross Entropy
# BCE stands for Binary Cross Entropy
criterion = nn.BCELoss()
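Under the hood, `nn.BCELoss` (with its default mean reduction) averages the binary cross-entropy over the batch. A torch-free sketch of the formula, with made-up labels and probabilities for illustration:

```python
import math

# BCE for one prediction: -(y*log(p) + (1-y)*log(1-p))
def bce(y, p):
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

ys = [1.0, 0.0, 1.0]   # hypothetical labels
ps = [0.9, 0.2, 0.6]   # hypothetical predicted probabilities
mean_bce = sum(bce(y, p) for y, p in zip(ys, ps)) / len(ys)
print(round(mean_bce, 4))  # 0.2798
```

Note how a confident correct prediction (0.9 for label 1) contributes far less loss than a borderline one (0.6 for label 1).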
  • For the Optimizer we again call on our trusty Gradient Descent
# SGD stands for stochastic gradient descent
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)
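The update that `optimizer.step()` applies is plain gradient descent on every parameter. A one-line sketch of the rule, with made-up numbers:

```python
# w <- w - lr * dL/dw, applied to each entry of model.parameters()
lr = 0.01
w, grad = 0.5, 2.0        # hypothetical weight and its gradient
w_new = w - lr * grad
print(w_new)  # 0.48
```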

Full version

import numpy as np
import pandas as pd

import torch
import torch.nn as nn

# 0) data import and preprocessing
from sklearn import datasets

iris = datasets.load_iris()
# print(iris.DESCR)

# use pandas as dataframe and merge features and targets
feature = pd.DataFrame(iris.data, columns=iris.feature_names)
target = pd.DataFrame(iris.target, columns=['target'])
iris_data = pd.concat([feature, target], axis=1)

# keep only sepal length in cm, sepal width in cm and target
iris_data = iris_data[['sepal length (cm)', 'sepal width (cm)', 'target']]

# keep only Iris-Setosa and Iris-Versicolour classes
iris_data = iris_data[iris_data.target <= 1]
iris_data.head(5)

feature = iris_data[['sepal length (cm)', 'sepal width (cm)']]
target = iris_data[['target']]

n_samples, n_features = feature.shape

# split training data and testing data
from sklearn.model_selection import train_test_split

feature_train, feature_test, target_train, target_test = train_test_split(
    feature, target, test_size=0.3, random_state=4
)

from sklearn.preprocessing import StandardScaler

sc = StandardScaler()
feature_train = sc.fit_transform(feature_train)
feature_test = sc.transform(feature_test)  # reuse training-set statistics; do not refit on test data
target_train = np.array(target_train)
target_test = np.array(target_test)

# change data to torch
feature_train = torch.from_numpy(feature_train.astype(np.float32))
feature_test = torch.from_numpy(feature_test.astype(np.float32))
target_train = torch.from_numpy(target_train.astype(np.float32))
target_test = torch.from_numpy(target_test.astype(np.float32))

# 1) model build
# the sigmoid function is handed off to torch.sigmoid
class LogisticRegression(nn.Module):

    def __init__(self, input_dim):
        super(LogisticRegression, self).__init__()

        # define layers
        self.linear = nn.Linear(input_dim, 1)

    def forward(self, x):
        y_predicted = torch.sigmoid(self.linear(x))

        return y_predicted

model = LogisticRegression(n_features)

# 2) loss and optimizer
learning_rate = 0.01
# BCE stands for Binary Cross Entropy
criterion = nn.BCELoss()
# SGD stands for stochastic gradient descent
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)

# 3) Training loop
epochs = 100
for epoch in range(epochs):
    # forward pass and loss
    y_predicted = model(feature_train)
    loss = criterion(y_predicted, target_train)

    # backward pass
    loss.backward()

    # optimizer
    optimizer.step()

    # reset accumulated gradients before the next iteration
    optimizer.zero_grad()

    if (epoch + 1) % 10 == 0:
        print(f'epoch {epoch + 1}: loss = {loss.item():.8f}')

# checking testing accuracy
with torch.no_grad():
    y_predicted = model(feature_test)
    y_predicted_cls = y_predicted.round()
    acc = y_predicted_cls.eq(target_test).sum().item() / float(target_test.shape[0])
    print(f'accuracy = {acc:.4f}')
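The accuracy check above thresholds each predicted probability at 0.5 and counts matches against the true labels. A torch-free sketch of the same idea, with made-up model outputs:

```python
# Threshold probabilities at 0.5, then count matches with the labels
probs  = [0.92, 0.31, 0.55, 0.48]   # hypothetical model outputs
labels = [1, 0, 1, 1]

preds = [1 if p >= 0.5 else 0 for p in probs]
acc = sum(int(p == t) for p, t in zip(preds, labels)) / len(labels)
print(acc)  # 0.75: three of the four predictions match
```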

Daily Summary

  • As you can see, we can now use PyTorch to flexibly build the model we need and carry out both training and prediction
  • Logistic Regression is not very different from yesterday's Linear Regression; the main difference is the extra sigmoid function, and PyTorch provides a sigmoid function as well
  • The reason for demonstrating once more something we have already written is to stress that, when writing a PyTorch model class, the most important parts are `__init__` and `forward()`: they declare, respectively, the layers the data passes through and the flow to execute. In the example above, `forward()` tells us that we first do a linear computation and only then apply sigmoid. This also means that whenever we encounter an unfamiliar network, we can read the author's `forward()` function to understand how it passes data along, so this can be said to be the most important part of the whole model
  • By now we have rebuilt nearly all of our code with PyTorch; only one last piece remains, our dataset. Tomorrow let's look at PyTorch's Dataset and DataLoader~
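To make the `forward()` point concrete: in our model it is simply sigmoid composed with a linear layer. Here is that same composition for a single sample, written without torch and with made-up weights and bias:

```python
import math

w = [0.8, -0.5]  # hypothetical learned weights
b = 0.1          # hypothetical bias

def linear(x):
    # what nn.Linear computes for one sample: w . x + b
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def forward(x):
    # the data flow declared in our model's forward(): sigmoid(linear(x))
    return 1.0 / (1.0 + math.exp(-linear(x)))

p = forward([1.0, 2.0])
print(round(p, 4))  # linear gives -0.1, sigmoid(-0.1) ~ 0.475
```

Reading `forward()` top to bottom is exactly how you would trace the data flow of an unfamiliar network.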
