No out of bounds check during advanced array indexing #8127
Comments
@kc611 thank you for reporting this. IIRC there is a keyword argument, `boundscheck=True`, that can be passed to the `@numba.njit` decorator. Is this perhaps what you are looking for?

Thank you, that was exactly what I was looking for.
So, it does work for basic indexing; the same cannot be said for advanced indexing:

```python
import numpy as np
import numba

@numba.njit(boundscheck=True)
def something():
    arr = np.ones((3, 3))
    arr[:, np.array([2, 3, 5, 100])] = 0
    return arr

print(something())          # Does not raise an error; silently ignores the out-of-bounds indices.
print(something.py_func())  # Raises an out-of-bounds IndexError.
```

Note: Planning for a new implementation of advanced indexing is already underway, so this needs to be handled/labelled correctly so that nobody takes it up until the respective PR is made.
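For reference, this is the behavior plain NumPy exhibits (and what `something.py_func()` falls back to): out-of-bounds fancy indices raise an `IndexError` rather than being ignored. A minimal pure-NumPy reproduction:

```python
import numpy as np

# Pure NumPy: advanced (fancy) indexing bounds-checks every index and raises.
arr = np.ones((3, 3))
try:
    arr[:, np.array([2, 3, 5, 100])] = 0
except IndexError as e:
    print("IndexError:", e)  # e.g. "index 3 is out of bounds for axis 1 with size 3"
```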
@kc611 suggest adding:

```diff
diff --git a/numba/np/arrayobj.py b/numba/np/arrayobj.py
index 850bd1d..5b1d48e 100644
--- a/numba/np/arrayobj.py
+++ b/numba/np/arrayobj.py
@@ -1617,7 +1617,8 @@ def fancy_setslice(context, builder, sig, args, index_types, indices):
         dest_ptr = cgutils.get_item_pointer2(context, builder, dest_data,
                                              dest_shapes, dest_strides,
                                              aryty.layout, dest_indices,
-                                             wraparound=False)
+                                             wraparound=False,
+                                             boundscheck=context.enable_boundscheck)
         store_item(context, builder, aryty, val, dest_ptr)
```

See if that helps?
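In spirit, the `boundscheck` flag threaded into the pointer computation guards each index against its dimension before the element pointer is formed. A rough pure-Python sketch of that semantics (the function name and structure here are illustrative, not Numba's actual code generation):

```python
def bounds_checked_offset(shape, strides, indices, boundscheck=True):
    """Illustrative model only: turn per-axis indices into a byte offset,
    optionally raising on out-of-bounds indices (what boundscheck=True adds).
    """
    offset = 0
    for axis, (idx, dim, stride) in enumerate(zip(indices, shape, strides)):
        if boundscheck and not (0 <= idx < dim):
            raise IndexError(
                f"index {idx} is out of bounds for axis {axis} with size {dim}")
        offset += idx * stride
    return offset

# A C-contiguous float64 (3, 3) array has strides (24, 8) in bytes.
print(bounds_checked_offset((3, 3), (24, 8), (0, 2)))  # 16
```

With `boundscheck=False`, the same out-of-bounds index would simply produce a bogus offset, which is exactly the silent behavior the issue describes.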
Marking this as a good first issue for any newcomer.

Numba silently ignores any indices that are out of bounds for a given array shape. This behavior is present in both basic and advanced indexing.

Not sure if we should be considering this a bug or a feature :-)

It should be easily fixed by adding a bounds check during indexing at function runtime.
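To see why the silent behavior is risky, here is a rough illustration (pure NumPy, assuming a C-contiguous `int64` array) of what unchecked stride arithmetic does with an out-of-bounds index:

```python
import numpy as np

# Without a runtime check, an index is just multiplied into a byte offset,
# so an out-of-bounds index silently addresses a *different* element of the
# buffer (or memory past it) instead of raising.
arr = np.arange(6, dtype=np.int64).reshape(2, 3)  # strides (24, 8) bytes
i, j = 0, 4                  # j=4 is out of bounds for axis 1 (size 3)
offset = i * arr.strides[0] + j * arr.strides[1]  # 32 bytes
element = arr.ravel()[offset // arr.itemsize]
print(element)               # lands on arr[1, 1], not an error
```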