
Commit 7cb2c55

Morvan Zhou authored and Morvan Zhou committed
update
1 parent 21fa225 commit 7cb2c55

File tree

1 file changed: +12 / -4 lines changed


tutorial-contents/503_visualize_gradient_descent.py (+12 / -4)
@@ -18,17 +18,24 @@
     [5, 1],
     [2, 4.5]][2]
 
-x = np.linspace(-1, 1, 200, dtype=np.float32)
+x = np.linspace(-1, 1, 200, dtype=np.float32)     # x data
+
+# Test (1): Visualize a simple linear function with two parameters,
+# you can change LR to 1 to see the different pattern in gradient descent.
 
-# test 1
 # y_fun = lambda a, b: a * x + b
 # tf_y_fun = lambda a, b: a * x + b
 
-# test 2
+
+# Test (2): Using Tensorflow as a calibrating tool for empirical formula like following.
+
 # y_fun = lambda a, b: a * x**3 + b * x**2
 # tf_y_fun = lambda a, b: a * x**3 + b * x**2
 
-# test 3
+
+# Test (3): Most simplest two parameters and two layers Neural Net, and their local & global minimum,
+# you can try different INIT_PARAMS set to visualize the gradient descent.
+
 y_fun = lambda a, b: np.sin(b*np.cos(a*x))
 tf_y_fun = lambda a, b: tf.sin(b*tf.cos(a*x))

@@ -50,6 +57,7 @@
     result, _ = sess.run([pred, train_op])     # training
 
 
+# visualization codes:
 print('a=', a_, 'b=', b_)
 plt.figure(1)
 plt.scatter(x, y, c='b')     # plot data
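
For readers following along, below is a minimal, self-contained sketch of how the y_fun / tf_y_fun pair touched by this diff feeds the training loop and the plotting code in the second hunk. It uses TensorFlow 1.x APIs to match the sess.run() call above; the learning rate, the "true" and initial parameter values, the noise level, and the step count are illustrative assumptions, not necessarily the exact values in 503_visualize_gradient_descent.py.

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

LR = 0.1                       # assumed learning rate (the Test (1) comment suggests trying 1.0)
TRUE_PARAMS = [1.2, 2.5]       # assumed "true" (a, b) used to generate the data
INIT_PARAMS = [2.0, 4.5]       # assumed starting point for gradient descent

x = np.linspace(-1, 1, 200, dtype=np.float32)          # x data
y_fun = lambda a, b: np.sin(b * np.cos(a * x))         # NumPy version: generates target data
tf_y_fun = lambda a, b: tf.sin(b * tf.cos(a * x))      # TF version: builds the computation graph

noise = np.random.randn(200) / 10
y = (y_fun(*TRUE_PARAMS) + noise).astype(np.float32)   # noisy targets

a, b = [tf.Variable(initial_value=p, dtype=tf.float32) for p in INIT_PARAMS]
pred = tf_y_fun(a, b)                                  # prediction from the trainable parameters
mse = tf.reduce_mean(tf.square(y - pred))              # mean squared error
train_op = tf.train.GradientDescentOptimizer(LR).minimize(mse)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(400):
        a_, b_ = sess.run([a, b])                      # record current parameters
        result, _ = sess.run([pred, train_op])         # training

# visualization codes:
print('a=', a_, 'b=', b_)
plt.figure(1)
plt.scatter(x, y, c='b')                               # plot data
plt.plot(x, result, 'r-', lw=2)                        # plot the fitted curve
plt.show()

In the full tutorial file, the parameter values recorded at each step are presumably what feed the cost-surface visualization that the new "# visualization codes:" comment labels, so the descent path can be drawn over the loss landscape.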
