Broadcasting involves 2 steps:
give all arrays the same number of dimensions
expand the size-1 dimensions to match the other arrays
With your inputs, (41,6) and (41,), one is 2d and the other 1d. Broadcasting can change the 1d array to (1,41), but it does not automatically expand in the other direction to (41,1). That gives
(41,6) (1,41)
and neither (41,41) nor (6,41) matches the other. So you need to change your y to (41,1) or your x to (6,41):
x.T*y
x*y[:,None]
I'm assuming, of course, that you want element by element multiplication, not the np.dot matrix product.
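A quick sketch of the two fixes, with placeholder data in the shapes from the question (the actual values of x and y are assumptions):

```python
import numpy as np

x = np.ones((41, 6))    # stand-in for the (41,6) array
y = np.arange(41.0)     # stand-in for the (41,) array

# Option 1: reshape y to (41,1) so it broadcasts across the columns of x
r1 = x * y[:, None]           # shape (41, 6)

# Option 2: transpose x to (6,41) so y's implicit (1,41) lines up,
# then transpose back
r2 = (x.T * y).T              # shape (41, 6)
```

Both routes give the same element-by-element product.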
Not exactly sure what you are trying to achieve. Maybe you could give an example of your input and your expected output. One possibility is:
import numpy as np
x = np.array([[1, 2], [1, 2], [1, 2]])
y = np.array([1, 2, 3])
res = x * np.transpose(np.array([y,]*2))
This will multiply each column of x with y, so the result of the above example is:
array([[1, 2],
       [2, 4],
       [3, 6]])
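The same column-wise multiplication can be written more directly with broadcasting, without building the stacked-and-transposed copy of y (a sketch using the sample arrays above):

```python
import numpy as np

x = np.array([[1, 2], [1, 2], [1, 2]])
y = np.array([1, 2, 3])

# y[:, None] has shape (3, 1), which broadcasts across the 2 columns of x
res = x * y[:, None]
print(res)   # [[1 2] [2 4] [3 6]]
```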
You could use None (or np.newaxis) to expand A to match B:
>>> A = np.arange(10)
>>> B = np.random.random((10,3,5))
>>> C0 = np.array([A[i]*B[i,:,:] for i in range(len(A))])
>>> C1 = A[:,None,None] * B
>>> np.allclose(C0, C1)
True
But this hard-codes the number of dimensions of B. Borrowing from @ajcr, with enough transposes we can get implicit broadcasting to work for the general case:
>>> C3 = (A * B.T).T
>>> np.allclose(C0, C3)
True
Alternatively, you could use einsum to provide the generality. In retrospect it's probably overkill here compared with the transpose route, but it's handy when the multiplications are more complicated.
>>> C2 = np.einsum('i,i...->i...', A, B)
>>> np.allclose(C0, C2)
True
and
>>> B = np.random.random((10,4))
>>> D0 = np.array([A[i]*B[i,:] for i in range(len(A))])
>>> D2 = np.einsum('i,i...->i...', A, B)
>>> np.allclose(D0, D2)
True
Although I like the einsum notation, I'll add a little variety to the mix.
You can add enough extra dimensions to a so that it will broadcast across b.
>>> a.shape
(3,)
>>> b.shape
(3,2)
b has more dimensions than a
extra_dims = b.ndim - a.ndim
Add the extra dimension(s) to a
new_shape = a.shape + (1,)*extra_dims # (3,1)
new_a = a.reshape(new_shape)
Multiply
new_a * b
As a function:
def f(a, b):
    '''Product across the first dimension of b.

    Assumes a is 1-dimensional.
    Raises AssertionError if a.ndim > b.ndim or
    the first dimensions differ.
    '''
    assert a.shape[0] == b.shape[0], 'First dimension is different'
    assert b.ndim >= a.ndim, 'a has more dimensions than b'
    # add extra dimensions so that a will broadcast
    extra_dims = b.ndim - a.ndim
    newshape = a.shape + (1,)*extra_dims
    new_a = a.reshape(newshape)
    return new_a * b
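A quick usage check of the function with the (3,) and (3,2) shapes from above (the sample values here are my own, not from the answer):

```python
import numpy as np

def f(a, b):
    # same logic as the function above: pad a with trailing size-1 axes
    assert a.shape[0] == b.shape[0], 'First dimension is different'
    assert b.ndim >= a.ndim, 'a has more dimensions than b'
    new_a = a.reshape(a.shape + (1,) * (b.ndim - a.ndim))
    return new_a * b

a = np.array([1, 2, 3])
b = np.arange(6).reshape(3, 2)   # [[0 1] [2 3] [4 5]]
out = f(a, b)
print(out)   # [[0 1] [4 6] [12 15]]
```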
I'm trying to multiply A = 20x1x10 with B = 10x20 and I'm supposed to get a 20x1x20 matrix as the output. So far I've tried
C = numpy.matmul(A,B) # and C = A @ B
But all of them seem to result in a 20x20x20 matrix. Any idea how to proceed from here?
For better context, A = jacobian output of derivative of an activation function in a neural network, with shape of (N samples * 1 * values) and B = changes in output of the layer
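For what it's worth, a minimal check of np.matmul with these exact shapes (random placeholder data, since the real arrays aren't shown) gives the expected (20, 1, 20), since matmul treats A as a stack of 20 (1, 10) matrices and broadcasts B against each:

```python
import numpy as np

A = np.random.random((20, 1, 10))   # stand-in for the jacobian output
B = np.random.random((10, 20))      # stand-in for the layer-output changes

# matmul multiplies each (1, 10) slice of A by the (10, 20) matrix B,
# stacking the 20 resulting (1, 20) products into a (20, 1, 20) array
C = np.matmul(A, B)
print(C.shape)   # (20, 1, 20)
```

So a 20x20x20 result suggests the arrays being multiplied had different shapes than described.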