# ----------
#
# There are two functions to finish:
# First, in activate(), write the sigmoid activation function.
# Second, in update(), write the gradient descent update rule. Updates should be
# performed online, revising the weights after each data point.
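A minimal sketch of what the two finished functions might look like. The class, constructor, and parameter names here are assumptions for illustration; the original exercise's surrounding scaffolding is not shown. `activate()` applies the sigmoid to the weighted sum of inputs, and `update()` performs online gradient descent, revising the weights after each data point using the sigmoid-unit rule w ← w + η·(y − ŷ)·ŷ·(1 − ŷ)·x.

```python
import numpy as np

class SigmoidUnit:
    """Hypothetical sigmoid neuron sketch; the class name and constructor
    signature are assumptions, not part of the original exercise."""

    def __init__(self, weights, learning_rate=0.1):
        self.weights = np.asarray(weights, dtype=float)
        self.eta = learning_rate

    def activate(self, inputs):
        # Sigmoid activation: 1 / (1 + e^(-w.x))
        z = np.dot(self.weights, np.asarray(inputs, dtype=float))
        return 1.0 / (1.0 + np.exp(-z))

    def update(self, X, y):
        # Online gradient descent: update the weights after each data
        # point (not once per batch), using the sigmoid-unit rule
        #   w <- w + eta * (y - yhat) * yhat * (1 - yhat) * x
        # where yhat*(1 - yhat) is the derivative of the sigmoid.
        for x_i, y_i in zip(np.asarray(X, dtype=float), y):
            yhat = self.activate(x_i)
            self.weights += self.eta * (y_i - yhat) * yhat * (1 - yhat) * x_i
```

With zero initial weights the unit outputs 0.5 for any input, and a single online update nudges the weights toward the observed label.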