
Texture metamers #15

Open · wants to merge 61 commits into master
Conversation

billbrod
Member

Big changes here:

  • Adds metamer-related code. This includes new info in the makefile for downloading and creating the metamer images, a new notebook for analyzing and visualizing model performance, and a new module, util/metamers.py, with some helper functions. Those functions are written to work with the metamer code, but may be worth generalizing (especially create_image_struct)

  • Adds a plot module, moving the cortical_image function over from util and adding several new functions for visualizing model predictions.

  • Expands documentation

  • Adds cluster_submit.py to help submit multiple versions of the model to the cluster. Especially useful for metamer stuff, but should be helpful in general.

billbrod and others added 30 commits December 13, 2016 15:37
talks about the (potentially) required matlab packages
This uses Eero's textureSynth and matlabPyrTools libraries to create
several V1 and V2 metamers (with different white noise seeds) from some
input images
model_df requires a voxel column for things to work out, but we were
assuming that "voxel_idx" would be a key in the results dictionary; this
is not always the case, so we now handle that better. (previously, we
would often accidentally insert something as voxel even if there were no
corresponding key in results)
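A minimal sketch of the more defensive handling described above, assuming `results` is the model's output dictionary and `model_df` the corresponding pandas DataFrame (the integer-index fallback is an assumption for illustration):

```python
import numpy as np

def ensure_voxel_column(model_df, results):
    """Add a 'voxel' column to model_df, only using results['voxel_idx']
    when that key actually exists; otherwise fall back to a plain integer
    index rather than inserting an unrelated value."""
    if 'voxel_idx' in results:
        model_df['voxel'] = np.asarray(results['voxel_idx'])
    elif 'voxel' not in model_df.columns:
        model_df['voxel'] = np.arange(len(model_df))
    return model_df
```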
I don't know why I couldn't figure this out originally, but now we use
STIMULI_IDX_% and VOXEL_IDX directly for the matlab code without
creating a temporary txt file to contain their values
Instead of littering up the top level of this directory, we create a
Kay2013_comparison folder, where we place all the .csv, .mat, and .svg
files associated with the full and sweep calls
With this commit, the makefile contains the code to create the metamer
stimuli for the model (using the createMetamers.m script added earlier),
and the new Metamers.ipynb contains the code to run the model on these
images and visualize the results
We were making V1 metamers incorrectly: a V1 metamer is simply made by
randomizing the phase while otherwise matching the two-dimensional power
spectrum; we don't use textureSynthesis for this at all (this uses a
function in knkutils instead).
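For reference, a rough numpy sketch of that procedure, assuming the knkutils function amounts to keeping the amplitude spectrum and substituting the (conjugate-symmetric) phase of a white-noise image; the actual pipeline does this in MATLAB:

```python
import numpy as np

def phase_randomize(img, seed=0):
    """Return an image with the same two-dimensional power spectrum as
    `img` but randomized phase (using the phase of a white-noise image
    keeps the result real-valued)."""
    rng = np.random.RandomState(seed)
    amplitude = np.abs(np.fft.fft2(img))
    noise_phase = np.angle(np.fft.fft2(rng.rand(*img.shape)))
    return np.real(np.fft.ifft2(amplitude * np.exp(1j * noise_phase)))
```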
The angle was getting improperly converted to radians, resulting in some
weird values. This fixes it.
For visualizing the results of the metamers, we want to see what happens
on the surface. Therefore, we need some extra info.
Several major changes here:

 - createMetamers now takes seeds instead of numVersions, so users set
   the seeds to use directly

 - corrects how V1 metamers are made: they should be a phase-randomized
   version of the original image, which we then clip to make sure it
   lies within the [0, 255] range (code to rescale exists as well,
   though it's not recommended)

 - Adds V2MetamerScaled, which rescales the V2 metamer so it has the
   same overall energy as the V1 metamer (though its range is reduced);
   see the sketch after this list

 - Makefile now includes separate recipes for the Brodatz and
   Freeman2013 metamers, since they have slightly different options. No
   recipe included for getting the Freeman2013_stimuli, since I'm not
   sure Corey wants me making that public
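A rough numpy sketch of the clipping and rescaling steps above (the real code lives in createMetamers.m; reading "overall energy" as the sum of squared pixel values is an assumption):

```python
import numpy as np

def postprocess_metamers(v1_metamer, v2_metamer):
    """Clip the V1 metamer to the valid [0, 255] pixel range, then rescale
    the V2 metamer so its overall energy matches the V1 metamer's."""
    v1_metamer = np.clip(v1_metamer, 0, 255)
    scale = np.sqrt((v1_metamer ** 2).sum() / (v2_metamer ** 2).sum())
    return v1_metamer, v2_metamer * scale
```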
With this, we're working towards finalizing the notebook, though it's
still not done. Major additions:

 - starts on explanatory text

 - code to create SNR dataframes

 - code to run full and "dumb V1" versions of the model

 - visualizations of predicted responses and SNR

 - visualizations of predicted responses projected onto the
   image ("cortical images"), though this needs refinement

 - starts visualizations of predicted responses projected onto the
   cortex, both inflated (using pysurfer) and flattened (using custom
   code), though this requires more work.

Eventually, some of this code, especially the various visualizations,
will end up in other places in the SCO repository. Additionally, I will
write a script to make use of the cluster, since running the model can
take some time.
The biggest changes right now make the plotting functions better.
Currently the flat-cortex plotting works with faceting and is somewhat
more general (it can handle both predictions and SNR values), but I want
to continue to generalize it and move it into the actual SCO library
This commit moves several functions (save_results_dict,
create_bootstrap_df, create_met_df) from Metamers.ipynb into
util/__init__.py or util/metamers.py, where they'll be more useful
This is a simple way to submit multiple models to the cluster. Currently, it's set up for use with the metamer testing and can only run one of a small number of model permutations, but it's hopefully not too hard to modify
With these changes, output from jobs is put in the same place as the scripts, and we increase the walltime (since jobs took close to 5 hours)
I don't include code to automatically download the base images, but this
will process them. The important new step is making sure they're the
right size (that is, their dimensions can be divided by 2 five times;
this probably changes with different NUM_SCALES and NUM_ORIENTATIONS,
but we won't worry about that for now).
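A sketch of that size constraint, assuming a center crop is acceptable; with five scales, each dimension must be divisible by 2**5:

```python
def crop_to_pyramid_size(img, num_scales=5):
    """Center-crop a 2-D image array so both dimensions are divisible by
    2**num_scales, i.e. so the image can be halved num_scales times."""
    factor = 2 ** num_scales
    h, w = img.shape[:2]
    new_h, new_w = (h // factor) * factor, (w // factor) * factor
    top, left = (h - new_h) // 2, (w - new_w) // 2
    return img[top:top + new_h, left:left + new_w]
```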
This rearranges util code, putting most of it into util.core and leaving
util/__init__.py with just the necessary imports

Unfortunately, this means we can't import cortical_image in the top
level init, because we need to grab calc_surface_sco in
util/metamers.py, which is also imported in util/core.py
With this, we can now pass extra_cols to those two dataframe
construction functions, for the purpose of adding extra columns from
model_df when necessary. This is a bit cleaner (and faster) than doing
it after the fact
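A hypothetical illustration of the extra_cols idea (the real functions and their signatures differ): requested metadata columns from model_df are carried along in the same grouping pass instead of being merged in afterwards.

```python
def summarize_with_extras(model_df, group_cols, value_col, extra_cols=None):
    """Average value_col within each group of model_df (assumed to be a
    pandas DataFrame) and carry along extra metadata columns, taking the
    first value per group, in the same pass."""
    agg = {value_col: 'mean'}
    for col in (extra_cols or []):
        agg[col] = 'first'
    return model_df.groupby(group_cols, as_index=False).agg(agg)
```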
Made the regular expressions used to parse the filenames a little more
general, allowed them to be set in the metamers.main call, and added a
brief note in both cluster_submit and metamers.main about needing to set
them
A bunch of plotting functions moved here from the metamers notebook and
from sco/util
I was bootstrapping incorrectly. Before bootstrapping, we want to
average values so that each (voxel, image_name) pair has one value; we
then resample these a number of times equal to the number of unique
image_name values. (We make it possible to swap image_name for another
column, though the documentation isn't clear.)
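A sketch of the corrected procedure in pandas/numpy (column names such as 'voxel' and the response column are assumptions, not the exact ones in the code):

```python
import numpy as np
import pandas as pd

def bootstrap_df(df, value_col, unit_col='image_name', n_boots=1000, seed=0):
    """First average so each (voxel, unit_col) pair has one value, then,
    within each voxel, resample those values with replacement (sample size
    equal to the number of unique unit_col values) and average."""
    rng = np.random.RandomState(seed)
    per_pair = df.groupby(['voxel', unit_col], as_index=False)[value_col].mean()
    boots = []
    for voxel, grp in per_pair.groupby('voxel'):
        vals = grp[value_col].values
        samples = rng.choice(vals, size=(n_boots, len(vals)), replace=True)
        boots.append(pd.DataFrame({'voxel': voxel,
                                   'bootstrap': np.arange(n_boots),
                                   value_col: samples.mean(axis=1)}))
    return pd.concat(boots, ignore_index=True)
```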
Two main changes:

 - bugfix in metamers.main: create_SNR_df doesn't take sample_num
   arg any more

 - adds model_steps option to metamers.main, allowing the user to run
   only a subset of the steps
create_image_df is a little more robust:

 - allows the user to specify the regex for image_seed, image_type, and
   image_name in the call; if the user sets image_seed's or image_type's
   regex to None, that column is not added to the df (see the sketch
   after this list)

 - 'normalized_stimulus_images' no longer needs to be included in
   results
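An illustration of how those regexes are used, with a made-up filename convention (the actual default patterns live in the code); a regex of None simply skips that column:

```python
import re

fname = 'V2Metamer_im13_seed1.png'                 # made-up naming convention
image_name_re = r'(im\d+)'
image_type_re = r'^(\w+?Metamer|original)'
image_seed_re = r'seed(\d+)'                       # pass None to skip this column

row = {'image_name': re.search(image_name_re, fname).group(1)}
for col, regex in [('image_type', image_type_re), ('image_seed', image_seed_re)]:
    if regex is not None:
        match = re.search(regex, fname)
        row[col] = match.group(1) if match else None
# row == {'image_name': 'im13', 'image_type': 'V2Metamer', 'image_seed': '1'}
```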
This commit gets rid of image_df, which took too long to create and
manipulate, and replaces it with image_struct, a dictionary containing
the image arrays and a label DataFrame with all the metadata we need.

It also overhauls the creation of this structure to be a bit more
general and updates plot_images and img_plotter to work with it.
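A sketch of what image_struct might look like under that description (key and column names are illustrative, not necessarily the ones in the code):

```python
import numpy as np
import pandas as pd

labels = pd.DataFrame({'image_name': ['im13', 'im13', 'im18'],
                       'image_type': ['original', 'V2Metamer', 'original'],
                       'image_seed': [None, 0, None]})
image_struct = {
    'images': np.zeros((len(labels), 256, 256)),  # one 2-D image per row of labels
    'labels': labels,                             # metadata DataFrame
}
```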
For plot_predicted_responses and plot_flat_cortex, adds set_minmax, a
variable which determines whether all of the images have the same
minimum / maximum or are allowed to have their own values
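A minimal matplotlib illustration of the set_minmax behavior: when True, every panel shares one vmin/vmax; when False, each panel scales independently.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_panels(images, set_minmax=True):
    """Plot a row of images, optionally forcing a shared color scale."""
    vmin = min(im.min() for im in images) if set_minmax else None
    vmax = max(im.max() for im in images) if set_minmax else None
    fig, axes = plt.subplots(1, len(images))
    for ax, im in zip(np.atleast_1d(axes), images):
        ax.imshow(im, vmin=vmin, vmax=vmax, cmap='gray')
        ax.axis('off')
    return fig
```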
Since a lot of the early work done by plot_images and
plot_cortical_images is the same, this creates a new helper function to
handle it
simple little function to plot the cortical image difference between two
subsets of a plot_df column (probably model)
Forgot to change the calls to sco.plot functions within
plot_cortical_images_diff from absolute imports via sco.plot (as when
testing in the notebook) to relative imports (now that it lives within
sco.plot)
Based on the function with the same name that was in Metamers.ipynb,
this will average across images (similar to plot_flat_cortex) and plot
on the inflated brain using pysurfer and its connection to
freesurfer. It is less flexible because of that reliance, and I do not
think it is as good, but it may still be useful
This commit cleans up the Compare with Kay2013 notebook a bit from the
last commit.
With this commit, the analysis of how the SCO model handles the V1 and
V2 metamers should be easier for others to run:

 - Makefile now includes ability to download ImageNet and Freeman2013
   seed images from OSF project page

 - Makefile's paths use ~ for home directory wherever possible

 - README expanded with information on running the model on the cluster
   and re-creating the Compare with Kay2013 and metamer analyses

 - Metamers notebook cleaned up so it now mainly uses plotting code
   found in sco.plot and has much more explanation of what's happening.

Still need to: re-run Freeman2013 metamer analysis and confirm that it
looks the same as earlier.