Update graph_utils.py #24
Conversation
Update numpy syntax
@kristiandroste thanks for opening this! Quick question: is there any reason for using
instead of something like
I'm wondering if one of these is more backwards compatible than the other.
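For reference, the two calls in question (the older `nx.linalg.graphmatrix.adjacency_matrix(G).toarray()` versus `nx.to_numpy_array(G)`, which this PR ultimately settles on) should produce the same dense array on networkx >= 2.x; a minimal sketch:

```python
import networkx as nx
import numpy as np

# Small example graph: a path 0 - 1 - 2
G = nx.Graph([(0, 1), (1, 2)])

# The newer, documented call (networkx >= 2.0). It avoids going through
# a scipy-backed sparse matrix, and avoids the nx.linalg.graphmatrix
# module path, which has moved between networkx releases.
A = nx.to_numpy_array(G)

print(A)
```

Since `to_numpy_array` has been a stable public entry point since networkx 2.0, it is the safer choice for backwards compatibility.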
@calebgeniesse I tried running the Jupyter notebook tutorials with the line you suggest. The trefoil knot still runs fine, and I actually got further through the Haxby fMRI notebook. When using
Not sure whether this would be a compatibility issue or not. I tried downgrading kmapper to v1.1.6, which led to another couple of errors. To get around them I changed lines 78 and 79 in .../kmapper/cover.py from
to
and I commented out line 169 in .../dyneusr/mapper/wrappers.py as follows
Finally, I'm stuck with another networkx-related error at the same line (5) in step 4.2 of the Haxby notebook:
This is more than what you asked for, but I hope it's informative 🙂
This definitely helps. Thanks for sharing these errors. My guess is that, similar to the error we discussed over email, kmapper is returning an empty graph for some reason. If possible, can you share the error when using
Otherwise, it seems like
If so, we can either (1) resolve this PR, and then open a separate PR to address the kmapper compatibility; or (2) resolve both issues here, if you are okay with waiting. I'll need to spend a bit more time investigating the kmapper issues. Thanks again for helping with this!
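One quick way to test the empty-graph hypothesis is to guard the result before building the visualization. The helper below is hypothetical (not dyneusr code); it only assumes that kmapper's `map` returns a dict with a `nodes` key:

```python
def check_mapper_graph(graph):
    """Raise early if a kmapper-style graph dict came back empty."""
    if not graph or not graph.get("nodes"):
        raise ValueError(
            "kmapper returned an empty graph; try a coarser cover "
            "(fewer cubes, more overlap) or a different lens."
        )
    return graph

# An empty result, shaped like kmapper's output dict:
empty = {"nodes": {}, "links": {}, "meta_data": {}}
try:
    check_mapper_graph(empty)
except ValueError as e:
    print("caught:", e)
```

Failing here, rather than deep inside a networkx call, would make the two error reports above easier to tell apart.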
No worries, no rush on my end. I'm happy to help. I agree, it seems like
Here's the full error I get at step 3 (line 43 of cell 11) when I use
Ok, looks like the same error just caught earlier, i.e., an empty graph returned by kmapper. Let's go with the second version? (i.e.,
We can keep this open to fix the kmapper stuff. Note, we may need to open a PR with kmapper if it's something wrong with that code. But first, let me try and reproduce the issue. What versions of Python and NumPy are you running?
Yeah, let's go with
I've been running
@kristiandroste which notebook gives you the errors? Were you able to run 02_haxby_fmri_short.ipynb? I couldn't reproduce the errors using
Can you try going back to
02_haxby_fmri_short runs properly regardless of which kmapper version I use: 1.1.6, 1.2.0, or 2.0.1.
Ok great, so the errors are in 02_haxby_fmri.ipynb?
Edit: Never mind, I was able to reproduce this error. Testing now.
That's correct. For the purpose of documentation, here's the full error I get at line 5 of cell 14 (after I've made the aforementioned edit from
Thanks @kristiandroste! I was able to reproduce the errors. Note, I was able to reduce the error to a warning by installing
Sounds good. Likewise, if I can make any progress with
So I think I found the underlying issue. There is a special undocumented option I added a while back to help shift the cover bins a bit, so that points near the boundaries were accounted for properly. See

```python
cover = optimize_cover(
    X, r=15, g=0.67,
    scale_r=not True,
    scale_limits=True
)
```

The corresponding code is defined in dyneusr/mapper/utils.py, see e.g.,

```python
# Define optimized limits
limits = None
if scale_limits is True:
    offset = p_overlap / float(n_cubes)
    limits = [[-offset, 1 + offset] for _ in range(ndim)]
    n_cubes += 2  # * ndim
```

Not 100% sure why this doesn't work with

Edit: To clarify, when I use
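The offset arithmetic in that snippet can be checked directly: with the `r=15, g=0.67` settings shown above, it reproduces the `cover.limits` values reported for the failing case (plain Python, no dyneusr required):

```python
# scale_limits=True pads each dimension by overlap / n_cubes on both sides,
# and adds one extra cube per side to cover the padded region.
n_cubes, p_overlap, ndim = 15, 0.67, 2

offset = p_overlap / float(n_cubes)
limits = [[-offset, 1 + offset] for _ in range(ndim)]
n_cubes += 2

print(round(offset, 8))  # the 0.04466667 seen in cover.limits
print(limits)
```

Note that these limits only make sense if the lens has already been scaled to [0, 1]; otherwise most points fall outside the padded unit interval.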
For more context, when I use

```python
>>> cover.limits
array([[-0.04466667,  1.04466667],
       [-0.04466667,  1.04466667]])
>>> cube_centers = cover.fit(lens)
>>> cube_centers
[array([-0.01262745]),
 array([0.05145098]),
 array([0.11552941]),
 array([0.17960784]),
 array([0.24368627]),
 array([0.30776471]),
 array([0.37184314]),
 array([0.43592157]),
 array([0.5]),
 array([0.56407843]),
 array([0.62815686]),
 array([0.69223529]),
 array([0.75631373]),
 array([0.82039216]),
 array([0.88447059]),
 array([0.94854902]),
 array([1.01262745])]
>>> hyper_cubes = cover.transform(lens, cube_centers)
>>> [_.shape for _ in hyper_cubes]
[(3, 2), (2, 2), (1, 2), (1, 2), (1, 2), (2, 2), (2, 2), (3, 2),
 (3, 2), (4, 2), (4, 2), (4, 2), (3, 2), (2, 2), (1, 2)]
```

and when I use

```python
>>> cover.limits
None
>>> cube_centers = cover.fit(lens)
>>> cube_centers
[array([-17.97072105]),
 array([-15.59127478]),
 array([-13.2118285]),
 array([-10.83238223]),
 array([-8.45293595]),
 array([-6.07348968]),
 array([-3.6940434]),
 array([-1.31459713]),
 array([1.06484915]),
 array([3.44429542]),
 array([5.82374169]),
 array([8.20318797]),
 array([10.58263424]),
 array([12.96208052]),
 array([15.34152679])]
>>> hyper_cubes = cover.transform(lens, cube_centers)
>>> [_.shape for _ in hyper_cubes]
[(26, 2), (40, 2), (48, 2), (47, 2), (47, 2), (46, 2), (50, 2), (58, 2),
 (60, 2), (56, 2), (49, 2), (47, 2), (51, 2), (50, 2), (32, 2)]
```

So it seems to me like it has to do with the scaling of these limits or the lens itself. Thanks again for catching this. I'll take a closer look as soon as possible.
@kristiandroste it seems like others are running into the
I'm going to make the final change we discussed, from

`nx.linalg.graphmatrix.adjacency_matrix(G).toarray()`

to

`A = nx.to_numpy_array(G)`

After making the changes and closing this PR, I will open a new PR here to resolve the
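As a hypothetical sketch of the updated call in context (the wrapper function is illustrative, not dyneusr's actual graph_utils.py code): `nx.to_numpy_array` also accepts an explicit `nodelist`, which pins the row/column ordering of the returned matrix to the node order you pass in.

```python
import networkx as nx

def graph_to_matrix(G, nodelist=None):
    # Hypothetical wrapper showing the updated call;
    # nodelist fixes the row/column ordering of the output.
    return nx.to_numpy_array(G, nodelist=nodelist)

G = nx.Graph([("a", "b"), ("b", "c")])
A = graph_to_matrix(G, nodelist=["a", "b", "c"])
print(A)
```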
use `A = nx.to_numpy_array(G)`