26. Which of the following activation functions can avoid the vanishing gradient problem?
(A) ReLU
(B) tanh
(C) Sigmoid
(D) ELU
(E) None of the above

Answer: (hidden; login required to view)

Detailed explanation (1 entry)

The ReLU formula is very simple: f(x) = max(0, x). For x > 0 its derivative is a constant 1, so gradients passed backward through many layers are not repeatedly scaled down. By contrast, the derivatives of sigmoid and tanh saturate toward 0 as |x| grows, which is what causes the vanishing gradient problem. ELU behaves like ReLU for x > 0 (derivative 1), so it likewise avoids saturation on the positive side.
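To make the saturation argument concrete, here is a minimal NumPy sketch (not part of the original explanation; the function names and sample points are illustrative choices) that evaluates each activation's derivative at a few inputs:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # peaks at 0.25; shrinks toward 0 as |x| grows

def d_tanh(x):
    return 1.0 - np.tanh(x) ** 2  # peaks at 1.0; shrinks toward 0 as |x| grows

def d_relu(x):
    return (x > 0).astype(float)  # exactly 1 for every x > 0

def d_elu(x, alpha=1.0):
    # 1 for x > 0; smoothly decays (but stays nonzero) for x <= 0
    return np.where(x > 0, 1.0, alpha * np.exp(x))

xs = np.array([-10.0, -2.0, 0.5, 2.0, 10.0])
for name, d in [("sigmoid", d_sigmoid), ("tanh", d_tanh),
                ("ReLU", d_relu), ("ELU", d_elu)]:
    print(f"{name:>7}: {np.round(d(xs), 8)}")
```

At x = 10 the sigmoid derivative is about 4.5e-5 and tanh's about 8.2e-9, while ReLU and ELU both return exactly 1, so a product of such factors across many layers stays at 1 rather than vanishing geometrically.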