Opening this thread as a placeholder.
Week 1 homework topic: [AI Training Session 4 Homework Thread]
AI Training Week 1 Homework
For the CIFAR-10 dataset, compute basic statistics (average, maximum, and minimum width/height of the boxes/images), apply data preprocessing (rotation augmentation and RandomCrop), and visualize the data before and after preprocessing to compare the changes.
Code:
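As a rough reference for this assignment, here is a minimal sketch using randomly generated 32x32 images as a stand-in for CIFAR-10 (the synthetic data, the 24x24 crop size, and the random_crop helper are assumptions for illustration, not part of the original post): it computes the width/height statistics, applies a rotation and a RandomCrop-style crop, and plots an image before and after preprocessing.

import numpy as np
import matplotlib.pyplot as plt

# Stand-in for CIFAR-10: 100 random 32x32x3 uint8 images (hypothetical data;
# the real assignment would load the CIFAR-10 batches instead)
images = np.random.randint(0, 256, size=(100, 32, 32, 3), dtype=np.uint8)

# Basic statistics over image heights/widths (for CIFAR-10 every image is 32x32)
heights = np.array([img.shape[0] for img in images])
widths = np.array([img.shape[1] for img in images])
print("height mean/max/min:", heights.mean(), heights.max(), heights.min())
print("width mean/max/min:", widths.mean(), widths.max(), widths.min())

# Two simple preprocessing steps: a 90-degree rotation and a RandomCrop-style crop
def random_crop(img, size=24):
    h, w = img.shape[:2]
    top = np.random.randint(0, h - size + 1)
    left = np.random.randint(0, w - size + 1)
    return img[top:top + size, left:left + size]

idx = 0
rotated = np.rot90(images[idx])
cropped = random_crop(images[idx])

# Visualize the image before and after preprocessing
fig, axes = plt.subplots(1, 3, figsize=(9, 3))
for ax, im, title in zip(axes, [images[idx], rotated, cropped],
                         ["original", "rotated", "random crop"]):
    ax.imshow(im)
    ax.set_title(title)
    ax.axis("off")
plt.show()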
Session 4, Week 2 Homework
Basic assignment: generate random data yourself and implement linear fitting of the data with MegEngine.
Code:
import numpy as np
import matplotlib.pyplot as plt

def generate_random_examples(n=100, noise=5):
    # Randomly pick the ground-truth slope w and intercept b
    w = np.random.randint(5, 10)
    b = np.random.randint(-10, 10)
    print("The real w: {}, b: {}".format(w, b))
    data = np.zeros((n, ))
    label = np.zeros((n, ))
    # Sample n points on the line y = w * x + b with uniform noise
    for i in range(n):
        data[i] = np.random.uniform(-10, 10)
        label[i] = w * data[i] + b + np.random.uniform(-noise, noise)
        plt.scatter(data[i], label[i], marker=".")
    plt.show()
    return data, label
original_data, original_label = generate_random_examples()
epochs = 5   # number of training epochs
lr = 0.01    # learning rate
data = original_data
label = original_label
n = len(data)

# Parameters to be learned
w = 0
b = 0

def linear_model(x):
    return w * x + b

# Full-batch gradient descent with hand-derived gradients of the MSE loss
for epoch in range(epochs):
    loss = 0
    sum_grad_w = 0
    sum_grad_b = 0
    for i in range(n):
        pred = linear_model(data[i])
        loss += (pred - label[i]) ** 2
        sum_grad_w += 2 * (pred - label[i]) * data[i]
        sum_grad_b += 2 * (pred - label[i])
    loss = loss / n
    grad_w = sum_grad_w / n
    grad_b = sum_grad_b / n
    w = w - lr * grad_w
    b = b - lr * grad_b
    print("epoch = {}, w = {:.3f}, b = {:.3f}, loss = {:.3f}".format(epoch, w, b, loss))
x = np.array([-10, 10])
y = w * x + b
plt.scatter(data, label, marker=".")
plt.plot(x, y, "-b")
plt.show()
The MegEngine version:

import numpy as np
import matplotlib.pyplot as plt

def gen_random_data(n, noise):
    # Randomly pick the ground-truth slope a and intercept b
    a = np.random.randint(-20, 20)
    b = np.random.randint(-20, 20)
    print("generated a: {}, b: {}".format(a, b))
    # initialize data and label
    data = np.zeros((n, ))
    label = np.zeros((n, ))
    # generate n random samples with noise
    for i in range(n):
        data[i] = np.random.uniform(-10, 10)
        label[i] = a * data[i] + b + np.random.uniform(-noise, noise)
        plt.scatter(data[i], label[i], marker=".")
    plt.title('generated random data')
    plt.show()
    return data, label

gen_data, gen_label = gen_random_data(100, 20)
import megengine as meg
import megengine.functional as F
from megengine.autodiff import GradManager
import megengine.optimizer as optim

# Hyperparameters
epochs = 5
lr = 0.01

# Wrap the generated data as MegEngine tensors, cast to float32 (MegEngine's default floating dtype)
data = meg.tensor(gen_data.astype("float32"))
label = meg.tensor(gen_label.astype("float32"))

# Initialize the parameters
w = meg.Parameter([0.0])
b = meg.Parameter([0.0])

# Define the model
def linear_model(x):
    return F.mul(w, x) + b

# Set up autodiff and the optimizer
gm = GradManager().attach([w, b])
optimizer = optim.SGD([w, b], lr=lr)

# Training loop
for epoch in range(epochs):
    with gm:
        pred = linear_model(data)
        loss = F.loss.square_loss(pred, label)
        gm.backward(loss)
    optimizer.step().clear_grad()
    # Show the parameters
    print("epoch = {}, w = {:.3f}, b = {:.3f}, loss = {:.3f}".format(epoch, w.item(), b.item(), loss.item()))

# Plot the fitted line against the generated data
x = np.array([-10, 10])
y = w.numpy() * x + b.numpy()
plt.scatter(data.numpy(), label.numpy(), marker=".")
plt.plot(x, y, "-b")
plt.show()