
Using the Adam optimizer in TensorFlow

Date: 2024-07-22 07:26 / Author: Anonymous


# Adam

# Compute the first and second moments
m_w = beta1 * m_w + (1 - beta1) * grads[0]             # first moment m_w, same form as in SGDM
m_b = beta1 * m_b + (1 - beta1) * grads[1]             # first moment m_b, same form as in SGDM
v_w = beta2 * v_w + (1 - beta2) * tf.square(grads[0])  # second moment v_w, same form as in RMSProp
v_b = beta2 * v_b + (1 - beta2) * tf.square(grads[1])  # second moment v_b, same form as in RMSProp

# Compute the bias-corrected moments
m_w_correction = m_w / (1 - tf.pow(beta1, int(global_step)))  # bias-corrected first moment m_w
m_b_correction = m_b / (1 - tf.pow(beta1, int(global_step)))  # bias-corrected first moment m_b
v_w_correction = v_w / (1 - tf.pow(beta2, int(global_step)))  # bias-corrected second moment v_w
v_b_correction = v_b / (1 - tf.pow(beta2, int(global_step)))  # bias-corrected second moment v_b

# Update the parameters
w1.assign_sub(learning_rate * m_w_correction / tf.sqrt(v_w_correction))  # update w1
b1.assign_sub(learning_rate * m_b_correction / tf.sqrt(v_b_correction))  # update b1
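Why divide by (1 - beta1**t)? Because the moments start at zero, the raw moving averages underestimate the true gradient statistics in early steps. A short pure-Python sketch (the constant gradient g below is a made-up illustration) shows that the correction exactly restores the scale:

```python
beta1 = 0.9
g = 2.0   # hypothetical constant gradient
m = 0.0   # first moment, initialized at zero

for t in range(1, 4):
    m = beta1 * m + (1 - beta1) * g   # raw EMA, biased toward 0 early on
    m_corr = m / (1 - beta1 ** t)     # bias-corrected estimate
    print(t, round(m, 4), round(m_corr, 4))
# the raw m climbs 0.2, 0.38, 0.542..., while m_corr equals g = 2.0 at every step
```

For a constant gradient the corrected estimate recovers g exactly; in training, where gradients vary, it removes the systematic zero-initialization bias the same way.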

 


E:\Anaconda3\envs\TF2\python.exe C:/Users/Administrator/PycharmProjects/untitled8/iris数据集分类.py
epoch: 0, loss: 0.26738786324858665
test_accuracy: 0.5666666666666667
-------------------------------------------------
epoch: 1, loss: 0.22386804968118668
test_accuracy: 0.6333333333333333
-------------------------------------------------
epoch: 2, loss: 0.17018816992640495
test_accuracy: 0.6333333333333333
-------------------------------------------------
... (omitted) ...
-------------------------------------------------
epoch: 498, loss: 0.010133910043805372
test_accuracy: 1.0
-------------------------------------------------
epoch: 499, loss: 0.010131825423741248
test_accuracy: 1.0
-------------------------------------------------
total_time: 17.978153705596924
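As a sanity check of the update rule itself, the same Adam steps can be run as a self-contained pure-Python sketch on a toy quadratic f(w) = (w - 3)^2. The hyperparameter values are illustrative, not the ones used in the run above, and a small eps is added in the denominator for numerical stability (the snippet above omits it):

```python
import math

learning_rate, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-7
w, m, v = 0.0, 0.0, 0.0   # parameter and its two moments

for step in range(1, 3001):
    grad = 2.0 * (w - 3.0)                    # df/dw for f(w) = (w - 3)^2
    m = beta1 * m + (1 - beta1) * grad        # first moment
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment
    m_hat = m / (1 - beta1 ** step)           # bias corrections
    v_hat = v / (1 - beta2 ** step)
    w -= learning_rate * m_hat / (math.sqrt(v_hat) + eps)

print(w)  # w should end close to the minimum at 3.0
```

The same damped-oscillation behavior is what drives the steadily falling loss in the log above: the normalized step size stays roughly at learning_rate early on, then shrinks as the first moment averages out near the optimum.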