
Sonify ongoing research #34

Open
Daniel-Mietchen opened this issue Mar 4, 2017 · 18 comments

Comments

@Daniel-Mietchen
Collaborator

There are a number of sonifications of "recent activity" streams, e.g.

What about building something like this based on activity streams relevant to research?

Examples could be:

  • paper published
  • dataset downloaded
  • software forked
  • DOI cited in tweet
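As a sketch, the event types listed above could be represented as simple stream records. All field names and values here are hypothetical, purely for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ResearchEvent:
    """One item in a research activity stream (hypothetical schema)."""
    kind: str        # e.g. "paper-published", "dataset-downloaded"
    subject: str     # identifier of the thing acted on, e.g. a DOI
    source: str      # where the event was observed
    occurred_at: datetime

events = [
    ResearchEvent("paper-published", "10.1234/example.doi", "crossref",
                  datetime(2017, 3, 4, tzinfo=timezone.utc)),
    ResearchEvent("software-forked", "github.com/example/repo", "github",
                  datetime(2017, 3, 4, tzinfo=timezone.utc)),
]

# The set of distinct event kinds would drive the choice of sounds.
kinds = {e.kind for e in events}
```

A sonifier would consume such a stream and trigger a sound per record.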
@Daniel-Mietchen Daniel-Mietchen changed the title Listen to science Sonify ongoing research Mar 4, 2017

kshamash commented Mar 4, 2017

I really like this idea. Anyone else want to work on it?


puterd commented Mar 4, 2017

me too!

@Daniel-Mietchen
Collaborator Author

@JosephMcArthur
Member

Sources may include:

  • Crossref Event Data
  • the SHARE API
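As a sketch of how the Crossref Event Data source could be consumed: the query API serves JSON events. The endpoint and parameter names below are as I understand them from the Event Data documentation (worth verifying before relying on them), and the response is an invented, truncated sample rather than a live call:

```python
import json
from urllib.parse import urlencode

# Crossref Event Data query API base URL (check the current docs before use).
BASE = "https://api.eventdata.crossref.org/v1/events"

def build_query(source: str, rows: int = 100, mailto: str = "you@example.org") -> str:
    """Build a query URL for recent events from one source (parameter names assumed)."""
    return BASE + "?" + urlencode({"source": source, "rows": rows, "mailto": mailto})

# Invented sample response, standing in for a live request:
sample = json.loads("""{
  "status": "ok",
  "message": {
    "events": [
      {"source_id": "twitter", "relation_type_id": "discusses",
       "obj_id": "https://doi.org/10.1234/example",
       "occurred_at": "2017-03-04T12:00:00Z"}
    ]
  }
}""")

events = sample["message"]["events"]
url = build_query("twitter", rows=10)
```

Each event names a subject, an object (often a DOI), and a relation type, which is exactly the kind of typed stream a sonifier could map to sounds.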


afandian commented Mar 4, 2017

This is a fantastic idea. I've done something here, though it's more for status updates than research. http://status.eventdata.crossref.org/thing-action-service/index.html . So sorry that I can't be there today. I'll think about it. I'm afraid there's not much you can do with the CED at this point in time, but maybe soon.


afandian commented Mar 4, 2017

I would be interested to hear ideas about this beyond just "it beeps when something happens". Can sound be used to communicate information in a way that offers new insight? I played around with some ideas for the link in my above comment, but I couldn't come up with anything better than "a different type of event makes a different pitched sound". The same with visualisation. I know this is a do-a-thon not a think-a-thon, but I'm very interested in collaborating with anyone (after this weekend, sadly) who has ideas and wants to try them out.

With CED, we have a large number of scholarly articles 'interacting' with a large number of 'things' (including Wikipedia articles but also tweets, blogs, articles etc). It's a challenge to find a useful visualisation, and I think a useful sonification will be just as challenging. But a fun challenge!


puterd commented Mar 4, 2017

Hey! Yes, this is exactly what we were discussing at the start. I'm going to create the audio files and want to do something more informative/interesting. It could be more like an inventory, so a clap indicates one thing whereas a snare indicates another? Say, if the paper mentions interdisciplinarity, it could be indicated by a clap. Another thing might be something to indicate a new discipline or a new co-ordination of subjects within an article. It's quite difficult to co-ordinate the meaning though, and we're still stuck on the technical parts. I would be interested in working on this after the weekend for definite!


puterd commented Mar 4, 2017

It would be useful to interview some people who have to crawl through lots of scientific papers, asking questions like: what information do they immediately need or want when skimming lots of articles?


Daniel-Mietchen commented Mar 4, 2017

In terms of sonifications, there are some basic choices for each sound:

  • pitch
  • timbre
  • duration
  • volume

In addition, individual sounds can be combined, and interact via things like rhythm and harmony.

None of this has to be linear, so one could imagine, for instance, having thresholds above or below which things happen differently.
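One way to make those choices concrete, as a sketch only (the event types, scale, and thresholds below are all invented for illustration): give each event type its own timbre, derive pitch and volume from the current activity level, and use a threshold above which the mapping switches, so a spike sounds qualitatively different rather than just faster:

```python
# Sketch: map one research event to the four basic sound parameters
# (pitch, timbre, duration, volume). All mappings are illustrative.

PENTATONIC = [220.0, 247.5, 277.2, 330.0, 370.0]  # rough A-based pentatonic, Hz

TIMBRE = {  # one instrument per event type (hypothetical types)
    "paper-published": "bell",
    "dataset-downloaded": "clap",
    "software-forked": "snare",
    "doi-cited-in-tweet": "pluck",
}

def sonify(event_type: str, magnitude: int, burst_threshold: int = 10) -> dict:
    """Return sound parameters for one event.

    magnitude: e.g. how many events of this type arrived in the last minute.
    Above burst_threshold the mapping changes non-linearly: a long, loud,
    fixed low tone instead of short beeps, marking a spike in activity.
    """
    if magnitude > burst_threshold:
        return {"timbre": TIMBRE.get(event_type, "click"),
                "pitch_hz": PENTATONIC[0], "duration_s": 1.5, "volume": 1.0}
    return {
        "timbre": TIMBRE.get(event_type, "click"),
        "pitch_hz": PENTATONIC[magnitude % len(PENTATONIC)],  # pitch tracks activity
        "duration_s": 0.2,
        "volume": min(1.0, 0.3 + 0.07 * magnitude),           # volume grows with activity
    }

quiet = sonify("paper-published", 2)
burst = sonify("paper-published", 25)
```

Restricting pitches to one pentatonic scale is one cheap way to get the "harmony" interaction: simultaneous events from different streams then tend to sound consonant rather than random.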


afandian commented Mar 4, 2017

Also worth thinking about what objects you're tracking. Articles (that have DOIs)? Wikipedia articles? Particular words mentioned somehow?

EDIT: Sorry I'm assuming Wikipedia because of Daniel's background. CED offers a much broader set of data, although Wikipedia is an important part.


Daniel-Mietchen commented Mar 6, 2017

The close above was just part of housekeeping when wrapping up the event.

I would certainly like to see this project develop further, and am happy to help as far as I can.

Besides the information sources already mentioned above, one could consider things like
https://openaccessbutton.org/request or a filtered version of https://hypothes.is/stream .

EDIT: @afandian the scope of this ticket is research as a whole. Most of what I do still has no wiki component.


Daniel-Mietchen commented Mar 14, 2017

Another possible source of signal here would be the activity streams from places like ImpactStory.
Example: https://impactstory.org/u/0000-0001-7542-0286/timeline

@Daniel-Mietchen
Collaborator Author

Here is a remake of Wikipulse: http://waldyrious.net/wikispeed/ .

@Daniel-Mietchen
Collaborator Author

Some interesting things popping up under "sounds of research" and related searches.

@JosephMcArthur
Member

I'm now working on this with a friend at NWSPK. We've just done listentotwitter.joshbalfour.co.uk (not launched) and have got Share going for this.

@Daniel-Mietchen
Collaborator Author

That's great!

Based on your Twitter sonifyer, I also found https://github.com/musalbas/listentotwitter , which can give things like
http://listentotwitter.com/science
http://listentotwitter.com/research
http://listentotwitter.com/sciences
http://listentotwitter.com/scientist
http://listentotwitter.com/scientists
http://listentotwitter.com/researcher
http://listentotwitter.com/researchers

These can be run in parallel, thus giving a certain background of activity related to these terms (works with hashtags as well).

@Daniel-Mietchen
Collaborator Author

There is a sonification do-a-thon at the International Conference of Auditory Display on June 21, where we will provide options for remote participation.

@monperrus

FYI, we're collecting sonification examples at rethread-studio/rethread#5
