Merged
8 changes: 2 additions & 6 deletions _viash.yaml
@@ -47,14 +47,10 @@ references:

 info:
   image: The name of the image file to use for the component on the website.
-  # Step 5: Replace the task_template to the name of the task.
 test_resources:
   - type: s3
-    path: s3://openproblems-data/resources_test/common/
-    dest: resources_test/common
-  - type: s3
-    path: s3://openproblems-data/resources_test/task_template/
-    dest: resources_test/task_template
+    path: s3://openproblems-data/resources_test/task_spatial_segmentation/
+    dest: resources_test/task_spatial_segmentation

# Step 6: Update the authors of the task.
authors:
3 changes: 3 additions & 0 deletions scripts/create_test_resources/README.md
@@ -0,0 +1,3 @@
Here we generate a small test dataset used by `viash test`. Note that the file structure here is slightly simplified compared to `scripts/create_resources`, as we only have one dataset.

The data is copied from the `task_ist_preprocessing` test resources by `mouse_brain_combined.sh`.
32 changes: 32 additions & 0 deletions scripts/create_test_resources/mouse_brain_combined.sh
@@ -0,0 +1,32 @@
#!/bin/bash

set -e

# run all commands from the root of the repository
REPO_ROOT=$(git rev-parse --show-toplevel)
cd "$REPO_ROOT"

# -p: create the directory only if it does not already exist
mkdir -p resources_test/task_spatial_segmentation/mouse_brain_combined

# these files were generated by https://github.com/openproblems-bio/task_ist_preprocessing/tree/main/scripts/create_test_resources
# we can just copy them for now

aws s3 sync --profile op \
s3://openproblems-data/resources_test/task_ist_preprocessing/mouse_brain_combined/raw_ist.zarr \
resources_test/task_spatial_segmentation/mouse_brain_combined/raw_ist.zarr

aws s3 cp --profile op \
s3://openproblems-data/resources_test/task_ist_preprocessing/mouse_brain_combined/scrnaseq_reference.h5ad \
resources_test/task_spatial_segmentation/mouse_brain_combined/scrnaseq_reference.h5ad

# ...additional preprocessing if needed ...

# sync back to s3 (--dryrun only reports what would be uploaded; remove it to actually sync)
aws s3 sync --profile op \
"resources_test/task_spatial_segmentation/mouse_brain_combined/" \
"s3://openproblems-data/resources_test/task_spatial_segmentation/mouse_brain_combined/" \
--delete --dryrun
3 changes: 3 additions & 0 deletions src/base/setup_spatialdata_partial.yaml
@@ -0,0 +1,3 @@
setup:
- type: python
pypi: ["spatialdata", "anndata>=0.12.0", "zarr>=3.0.0"]
13 changes: 13 additions & 0 deletions src/base/setup_txsim_partial.yaml
@@ -0,0 +1,13 @@
setup:
- type: python
pypi: ["spatialdata==0.5.0", "anndata>=0.12.0", "pyarrow<22.0.0", "zarr<3.0.0"]
# 1. Remove the pyarrow pin once https://github.com/scverse/spatialdata/issues/1007 is fixed.
#    This is actually fixed as of the spatialdata 0.6.0 release; however, that release pulls
#    in zarr 3.0.0, and a zarr file saved with zarr 3.0.0 cannot be loaded with zarr<3.0.0
#    (PathNotFoundError: nothing found at path '').
# 2. sopa currently enforces zarr<3.0.0, so we need to save all our data with zarr<3.0.0.
#    Once this is fixed (https://github.com/gustaveroussy/sopa/issues/347):
#    - remove the restriction on spatialdata
#    - remove zarr<3.0.0
#    - remove pyarrow<22.0.0
#    - recreate all the datasets (scripts/create_resources/combine/process_datasets.sh)
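
The v2/v3 incompatibility described in the comments above can be recognized from the store layout alone: a Zarr v3 store keeps a `zarr.json` file at its root, while a v2 store uses `.zgroup`/`.zarray` marker files. A minimal sketch of such a format check (a hypothetical helper, not part of this PR):

```shell
#!/bin/bash
# Report the on-disk Zarr format of a store directory:
# v3 stores have a zarr.json at the root; v2 stores have .zgroup or .zarray.
zarr_store_format() {
  local store="$1"
  if [ -f "$store/zarr.json" ]; then
    echo 3
  elif [ -f "$store/.zgroup" ] || [ -f "$store/.zarray" ]; then
    echo 2
  else
    echo "nothing found at path '$store'" >&2
    return 1
  fi
}

# demo: a minimal fake v2 store
demo=$(mktemp -d)
touch "$demo/.zgroup"
zarr_store_format "$demo"   # prints 2
rm -rf "$demo"
```

A check like this makes it easy to verify, before recreating datasets, which stores were written with which zarr major version.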