
Learning Artificial Intelligence from Zero - Python-Pytorch Learning (III)


Preface

This post mainly supplements the previous one with more about requires_grad, plus an introduction to linear regression.

Turning off gradient tracking

Turning off gradient tracking (requires_grad) for a tensor is fairly simple; just read the code below.

print("============ off requires_grad ==============")
x = (3, requires_grad=True)
print(x)
x.requires_grad_(False) # turn off tensor calculation for x

print("x after turning off tensor computation for x:", x) # no more requires_grad property

x = (3, requires_grad=True)
print("New x with tensor calculation:", x)
y = () # Remove the tensor-added property from x and return the normal tensor
print("y has no tensor properties:", y)
print("x also has tensor properties:", x)
print("Tensor additional attributes for x removed from ============ region ==============")
with torch.no_grad():
    y = x+2
    print("y has no tensor properties:", y)
print("x also has tensor properties:", x)

An interesting example

Code 1 is as follows; it runs fine.

x = torch.tensor(1.0)
y = torch.tensor(2.0)
w = torch.tensor(1.0, requires_grad=True)
y_hat = w * x
loss = (y_hat - y) ** 2
print(loss)
loss.backward()
print(w.grad)

Code 2 is as follows; it does not work.

x = torch.tensor([1.0, 2.0])
y = torch.tensor([1.0, 2.0])
w = torch.tensor([1.0, 2.0], requires_grad=True)
y_hat = w * x
loss = (y_hat - y) ** 2
print(loss)
loss.backward()  # raises: grad can be implicitly created only for scalar outputs
print(w.grad)    # never reached

This is because the loss in code 1 is a single value, a scalar, so backward can be called on it.
The loss in code 2 is a vector, so backward cannot be called on it directly.
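If you really need gradients from a vector-valued loss, you can first reduce it to a scalar (for example with sum() or mean()) or pass backward an explicit gradient tensor of the same shape. A minimal sketch of both options, using the same data as code 2:

import torch

x = torch.tensor([1.0, 2.0])
y = torch.tensor([1.0, 2.0])
w = torch.tensor([1.0, 2.0], requires_grad=True)

# Option 1: reduce the vector loss to a scalar, then backward works
loss = (w * x - y) ** 2
loss.sum().backward()
print(w.grad)

# Option 2: keep the vector loss but pass a gradient of the same shape
w.grad = None  # clear the gradient accumulated above
loss = (w * x - y) ** 2
loss.backward(torch.ones_like(loss))
print(w.grad)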

Linear regression

A lot of videos and articles say that deep learning starts with understanding linear regression, and when you go looking for videos on linear regression, there are piles of them.
In fact, there is no need to watch all those courses and lose those hours; you might not even understand it after spending all that time.
Linear regression is worth learning, but you do not have to binge videos to learn it; it can actually be explained in a few simple sentences. It is just that nobody explains it plainly, and everyone seems to expect us to spend an extraordinary amount of time researching it and figuring it out on our own.

A Quick Understanding of Linear Regression

First, understand what linear means. If A = 2 and B = 4, the naked eye can tell that B is twice A. We say A and B are related; what kind of relationship? A linear relationship. That is all linear means here: the two numbers are related in this way.
As mentioned in the last post, terminology is what holds us back when learning, and the term linear is a concrete example of that.
Regression is the process of finding out that B is twice A. Simply put, linear regression is finding the numbers that specify the relationship between A and B.
Expressed as a function, the relationship between A and B is y = wx + b, where A plays the role of x and B plays the role of y. By eye, the result is w = 2 and b = 0.
Now replace A and B with two matrices; then w is also a matrix and b is still a constant. When we find w and b, we have found the linear relationship between A and B.
At this point, we already have an idea of what linear regression is about, without having to sit through 30 or 40 videos on it.
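As a tiny sanity check of the formula (my own illustration, not from any course material): with A playing x and B playing y, the values w = 2 and b = 0 do reproduce B from A.

x, y = 2.0, 4.0        # A and B from the example above
w, b = 2.0, 0.0        # the relationship we spotted by eye
print(y == w * x + b)  # True: y = wx + b holds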

Coding

Let's look directly at the code. x is the input feature and y is the target value.
For example, suppose we have a picture of frog A; its matrix is y. Then we take a picture of frog B; x is the matrix of frog B.
Linear regression then computes the linear relationship (w and b) between frog B and frog A.
Here the input feature x is hard-coded rather than read from frog B's matrix, and y is also hard-coded rather than read from frog A.
Then we define w as a tensor of the same shape as x, and b as a scalar tensor initialized to 0.
Then, using the knowledge from before, we call backward to compute the gradients, which gives w.grad and b.grad.
w.grad and b.grad are tensors of the same shape as w and b, and we use them to correct w and b. When correcting, we multiply by the learning rate learning_rate so that each step only changes them a little.
Then we iterate many times to arrive at our relationship (w and b).
The code is as follows:

import torch

# Input features and target values
x = torch.tensor([1.0, 2.0])
y = torch.tensor([115.0, 21.0])

# Weight and bias initialization
w = torch.tensor([1.0, 2.0], requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

# Learning rate
learning_rate = 0.01

# Optimize with multiple iterations
for epoch in range(100):
    # Prediction
    y_hat = w * x + b

    # Loss function
    loss = (y_hat - y).pow(2).mean()

    # Backpropagation
    loss.backward()

    # Update the weights and bias
    with torch.no_grad():
        w -= learning_rate * w.grad
        b -= learning_rate * b.grad

    # Zero the gradients
    w.grad.zero_()
    b.grad.zero_()

    print(f'Epoch {epoch + 1}, Loss: {loss.item()}')

# Final model parameters
print("Final weights:", w)
print("Final bias:", b)

Running it produces the output shown below:
[screenshot: per-epoch loss and the final w and b after 100 iterations]

As the screenshot shows, I looped 100 times, but the loss is still fairly large. The meaning of loss is: the closer it is to 0, the more accurate the values of w and b are.
Of course, if frogs A and B really are nothing alike, the loss could still be huge even after 1000 iterations.
Here, after 100 iterations, w = [51.8260, -9.4314] and b = 45.1103.
Plugging x, w, and b into y = wx + b gives y_pred = 51.8260 * 1 + 45.1103 = 96.9363, while the first element of our y is 115.0.
You can see that the value predicted from x by wx + b is already getting close to the true value of y.
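You can also let PyTorch compute this check for you. A minimal sketch, meant to be run right after the training loop above, that prints the model's predictions next to the targets:

with torch.no_grad():            # no gradients needed for a prediction check
    y_pred = w * x + b
print("Predictions:", y_pred)    # should approach y as training continues
print("Targets:    ", y)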

Now change the loop to 2000 iterations and run it again:
[screenshot: training output after 2000 iterations]

Plugging x, w, and b into y = wx + b gives y_pred = 62.4444 * 1 + 52.5554 = 114.9998,
and the first element of our y is 115.0.
As you can see, the predicted value is now very close to the true value.
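For reference, the same training loop is usually written with PyTorch's built-in SGD optimizer, which performs the w -= learning_rate * w.grad update and the gradient zeroing for you. Below is a sketch of an equivalent loop with the same data and hyperparameters; it is not the code from this post, just the idiomatic form.

import torch

x = torch.tensor([1.0, 2.0])
y = torch.tensor([115.0, 21.0])
w = torch.tensor([1.0, 2.0], requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

optimizer = torch.optim.SGD([w, b], lr=0.01)  # same learning rate as above

for epoch in range(2000):
    y_hat = w * x + b
    loss = (y_hat - y).pow(2).mean()

    optimizer.zero_grad()  # clear old gradients
    loss.backward()        # compute new gradients
    optimizer.step()       # apply the gradient-descent update to w and b

print("Final weights:", w)
print("Final bias:", b)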

Portal:
Learning Artificial Intelligence from Zero - Python-Pytorch Learning (I)
Learning Artificial Intelligence from Zero - Python-Pytorch Learning (II)

That's enough studying for now.


Note: This post is original, please contact the author for authorization and attribution for any form of reproduction!




/kiba/p/18350389