29. Suppose there is a two-input neuron with weights w1 = 3, w2 = 2, and bias b = 1.2. Let the
activation function f be the ReLU (Rectified Linear Unit) function, defined as f(x) = max(0, x).
Given inputs p1 = -5 and p2 = 6, what is the output of this neuron?
(A) 0
(B) 1.2
(C) 2.4
(D) -1.5
(E) -1.8
Detailed explanation
The neuron first computes the net input, i.e. the weighted sum of the inputs plus the bias: n = w1*p1 + w2*p2 + b = (3)(-5) + (2)(6) + 1.2 = -15 + 12 + 1.2 = -1.8. The ReLU activation then gives f(-1.8) = max(0, -1.8) = 0, so the output is 0 and the answer is (A).
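The arithmetic can be checked with a short Python sketch. The helper names relu and neuron_output below are illustrative (not from the source); the code simply forms the weighted sum plus bias and applies ReLU.

def relu(x: float) -> float:
    # ReLU activation: f(x) = max(0, x)
    return max(0.0, x)

def neuron_output(weights, inputs, bias):
    # Net input: weighted sum of inputs plus bias, then ReLU
    net = sum(w * p for w, p in zip(weights, inputs)) + bias
    return relu(net)

# Values from the question: w1 = 3, w2 = 2, b = 1.2, p1 = -5, p2 = 6
print(neuron_output([3, 2], [-5, 6], 1.2))  # prints 0.0, matching answer (A)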