@emdupre
Created April 1, 2019 16:59
Pull download keys for all files listed in an OSF repository
import json
import re

import requests

repo = '5hju4'  # my example repository, update as appropriate
query = '?filter[name][contains]=confounds_regressors.tsv'  # my example query, update or leave blank as appropriate
url = 'https://api.osf.io/v2/nodes/{0}/files/osfstorage/{1}'.format(repo, query)

guids = []
while True:
    # The OSF API paginates results, so walk the file listing page by page.
    resp = requests.get(url)
    resp.raise_for_status()
    data = json.loads(resp.content)
    for i in data['data']:
        # Extract the BIDS subject label from the file name and store it
        # alongside the file's identifier.
        sub = re.search(r'sub-(\S+)_task', i['attributes']['name']).group(1)
        guids.append((sub, i['id']))
    # Follow the 'next' link until there are no more pages.
    url = data['links']['next']
    if url is None:
        break
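
If you want the file contents rather than just their identifiers, a minimal download sketch could walk the same paginated listing and stream each match to disk. This assumes each file entity in the OSF v2 response exposes a direct download URL under links['download'] (folders will not have one); check your own response if that key is missing:

import requests

repo = '5hju4'
query = '?filter[name][contains]=confounds_regressors.tsv'
url = 'https://api.osf.io/v2/nodes/{0}/files/osfstorage/{1}'.format(repo, query)

while url is not None:
    resp = requests.get(url)
    resp.raise_for_status()
    data = resp.json()
    for entity in data['data']:
        name = entity['attributes']['name']
        # Assumption: file entities carry a direct download URL in their links;
        # entries without one (e.g. folders) are skipped.
        download_url = entity['links'].get('download')
        if download_url is None:
            continue
        with requests.get(download_url, stream=True) as dl:
            dl.raise_for_status()
            with open(name, 'wb') as out:
                for chunk in dl.iter_content(chunk_size=8192):
                    out.write(chunk)
    url = data['links']['next']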
@nicofarr

Hi @emdupre,
Thanks! After making those changes I do get 384 guids, corresponding to our folders.

Now, how do you query the hashes to get the contents of each folder?

By the way, where did you find this info? Is it from the API guide on the OSF website? We're having a hard time extracting the relevant information from that documentation...
