$$\mbox{Logistic regression: }\mathbf{z} = \sigma(\mathbf{h}) = \frac{1}{1 + e^{-\mathbf{h}}}$$
$$\mbox{Cross-entropy loss: } J(\mathbf{w}) = -(\mathbf{y} \log(\mathbf{z}) + (1 - \mathbf{y})\log(1 - \mathbf{z})) $$
$$ \mbox{Use chain rule: } \frac{\partial{J(\mathbf{w})}}{\partial{\mathbf{w}}} = \frac{\partial{J(\mathbf{w})}}{\partial{\mathbf{z}}} \frac{\partial{\mathbf{z}}}{\partial{\mathbf{h}}} \frac{\partial{\mathbf{h}}}{\partial{\mathbf{w}}}$$
$$\mbox{Gradient descent: } \mathbf{w} = \mathbf{w} - \alpha \frac{\partial{J(\mathbf{w})}}{\partial{\mathbf{w}}} $$
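Before the derivation, here is a minimal NumPy sketch of the whole procedure. It assumes the samples are stacked in a design matrix $X$ so that $\mathbf{h} = X\mathbf{w}$ (the problem statement doesn't fix this), and it uses the gradient $X^T(\mathbf{z}-\mathbf{y})$ that the derivation below arrives at; the names (`sigmoid`, `fit_logistic`, `alpha`, the synthetic data) are purely illustrative.

```python
import numpy as np

def sigmoid(h):
    # elementwise logistic function: z = 1 / (1 + exp(-h))
    return 1.0 / (1.0 + np.exp(-h))

def fit_logistic(X, y, alpha=0.01, n_iter=2000):
    # plain gradient descent on the summed cross-entropy loss:
    #   w <- w - alpha * dJ/dw,   with dJ/dw = X^T (z - y)  (derived below)
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = sigmoid(X @ w)       # predictions for all samples
        grad = X.T @ (z - y)     # gradient of the loss w.r.t. w
        w -= alpha * grad
    return w

# tiny synthetic example, just to exercise the code
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.uniform(size=100) < sigmoid(X @ w_true)).astype(float)
print(fit_logistic(X, y))
```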
Let's denote the inner/Frobenius product by
$$A:B = \sum_{ij} A_{ij} B_{ij}$$
and the elementwise/Hadamard product by
$$A\odot B$$
and elementwise/Hadamard division by
$$A\oslash B$$
and note that the $\log()$ and $\sigma()$ functions are to be applied elementwise.
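As a quick aside (not part of the derivation), these products correspond to familiar NumPy operations; $A$ and $B$ below are arbitrary example arrays:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])

print(np.sum(A * B))   # A : B   inner/Frobenius product (a scalar), here 70.0
print(A * B)           # A ⊙ B   elementwise/Hadamard product
print(A / B)           # A ⊘ B   elementwise/Hadamard division
print(np.log(A))       # log() applied elementwise
```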
For convenience, let's use a modified loss function that collects the cross-entropy into an explicit scalar via the Frobenius product
$$L = -\Big(\mathbf{y}:\log(\mathbf{z}) + (1-\mathbf{y}):\log(1-\mathbf{z})\Big)$$
Then the differential and gradient of $L$ can be calculated as
$$\eqalign{
dL &= -\Big(\mathbf{y}\oslash\mathbf{z} - (1-\mathbf{y})\oslash(1-\mathbf{z})\Big):d\mathbf{z} \cr
   &= -\Big(\mathbf{y}\oslash\mathbf{z} - (1-\mathbf{y})\oslash(1-\mathbf{z})\Big):\Big(\mathbf{z}\odot(1-\mathbf{z})\odot d\mathbf{h}\Big) \cr
   &= -\Big(\mathbf{y}\odot(1-\mathbf{z}) - (1-\mathbf{y})\odot\mathbf{z}\Big):d\mathbf{h} \cr
   &= (\mathbf{z}-\mathbf{y}):d\mathbf{h} \cr
\frac{\partial{L}}{\partial{\mathbf{h}}} &= \mathbf{z}-\mathbf{y} \cr
}$$
where we used $d\mathbf{z} = \mathbf{z}\odot(1-\mathbf{z})\odot d\mathbf{h}$ from the logistic function and the identity $A:(B\odot C) = (A\odot B):C$. And since $L$ is just $J(\mathbf{w})$ written as a scalar, the gradient of the original cost function is
$$\frac{\partial{J(\mathbf{w})}}{\partial{\mathbf{w}}} = \left(\frac{\partial{\mathbf{h}}}{\partial{\mathbf{w}}}\right)^T (\mathbf{z}-\mathbf{y})$$
which is the quantity that goes into the gradient descent update above.
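Finally, a sanity check of this result: the analytic gradient can be compared against central finite differences. As in the earlier sketch, this assumes $\mathbf{h}=X\mathbf{w}$ so that $\partial\mathbf{h}/\partial\mathbf{w}=X$, and the variable names are illustrative.

```python
import numpy as np

def sigmoid(h):
    return 1.0 / (1.0 + np.exp(-h))

def loss(w, X, y):
    # summed cross-entropy L = -(y:log(z) + (1-y):log(1-z)), with z = sigmoid(Xw)
    z = sigmoid(X @ w)
    return -np.sum(y * np.log(z) + (1 - y) * np.log(1 - z))

def analytic_grad(w, X, y):
    # the result derived above, with dh/dw = X
    return X.T @ (sigmoid(X @ w) - y)

def numeric_grad(w, X, y, eps=1e-6):
    # central finite differences, one coordinate at a time
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (loss(w + e, X, y) - loss(w - e, X, y)) / (2 * eps)
    return g

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))
y = rng.integers(0, 2, size=20).astype(float)
w = rng.normal(size=4)
# maximum absolute discrepancy; should be tiny (round-off level)
print(np.max(np.abs(analytic_grad(w, X, y) - numeric_grad(w, X, y))))
```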