github.com/DataBiosphere/terra-scientific-pipelines-service/SubsetVcfByBedFile
# Terra Scientific Pipelines Service

## Overview
Terra Scientific Pipelines Service, or teaspoons, facilitates running a defined set of scientific pipelines in Terra on behalf of users who cannot run them directly. The most common reason is that a pipeline accesses proprietary data that users are not allowed to read directly, but that may be used as, e.g., a reference panel for imputation.
## Supported pipelines

Current supported pipelines are:

- [in development] Imputation (TODO add link/info)
## Architecture

WIP architecture doc: linked LucidChart.
## Development

This codebase is in initial development.
### Requirements

This service is written in Java 17 and uses Postgres 13.

To run locally, you'll also need:

- jq - install with `brew install jq`
- vault - see DSP's setup instructions here
  - Note that for Step 7, "Create a GitHub Personal Access Token", you'll want to choose the "Tokens (classic)" option, not the fine-grained access token option.
- Java 17 - can be installed manually or through IntelliJ, which will install it for you when importing the project
- Postgres 13 - multiple options work here; as long as you have a Postgres instance running on localhost:5432, the local app will connect appropriately:
  - Postgres.app: https://postgresapp.com/
  - Brew: https://formulae.brew.sh/formula/postgresql@13
### Tech stack

- Java 17 (Temurin)
- Postgres 13.1
- Gradle - build automation tool
- SonarQube - static code analysis for security and coverage
- Trivy - security scanner for Docker images
- Jib - Docker image builder for Java
### Local development

To run locally:

- Make sure you have the requirements installed from above. We recommend IntelliJ as an IDE.
- Clone the repo. (If you see broken imports, build the project to get the generated sources.)
- Run the commands in `scripts/postgres-init.sql` in your local Postgres instance.
- Run `scripts/write-config.sh`. You will need to be authenticated to access Vault.
- Run `./gradlew bootRun` to spin up the server.
- Navigate to http://localhost:8080/#
- If this is your first time deploying to any environment, be sure to use the admin endpoint `/api/admin/v1/updatePipelineWorkspaceId/{pipelineName}/{workspaceId}` to set your pipeline's workspace id. The workspace id can be found on the Terra UI workspace dashboard or via the Rawls GET workspace endpoint.
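As a sketch of that last step (the HTTP method and auth header below are assumptions for illustration, not taken from the service's OpenAPI definition; the pipeline name and workspace id are placeholders), the admin call against a locally running server might be built like this:

```shell
# Placeholder values -- substitute your real pipeline name and workspace id
# (the all-zeros UUID below is NOT a real workspace).
PIPELINE_NAME="imputation"
WORKSPACE_ID="00000000-0000-0000-0000-000000000000"

# Build the admin endpoint URL for a server running on localhost:8080.
URL="http://localhost:8080/api/admin/v1/updatePipelineWorkspaceId/${PIPELINE_NAME}/${WORKSPACE_ID}"
echo "$URL"

# Then call it with a bearer token, e.g. (method shown is an assumption):
#   curl -X PATCH -H "Authorization: Bearer ${TOKEN}" "$URL"
```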
### Local development with debugging

If using IntelliJ (the only IDE we use on the team), you can run the server with a debugger. Follow the steps above, but instead of running `./gradlew bootRun` to spin up the server, run (debug) the App.java class through IntelliJ and set breakpoints in the code. Be sure to set `GOOGLE_APPLICATION_CREDENTIALS=config/tsps-sa.json` in the Run/Debug configuration's Environment Variables.
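If you prefer a terminal to the IntelliJ run configuration, the same environment variable can be exported in the shell before starting the server. One option (a sketch, not the team's documented workflow) is Gradle's standard `--debug-jvm` flag, which suspends the JVM until a debugger attaches on port 5005:

```shell
# Same credential setup as the IntelliJ Run/Debug configuration, done in a shell.
export GOOGLE_APPLICATION_CREDENTIALS=config/tsps-sa.json
echo "$GOOGLE_APPLICATION_CREDENTIALS"

# Start the server and wait for a remote debugger to attach on port 5005:
#   ./gradlew bootRun --debug-jvm
```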
### Running Tests/Linter Locally

- Testing
  - Run `./gradlew service:test` to run tests
- Linting
  - Run `./gradlew spotlessCheck` to run linter checks
  - Run `./gradlew :service:spotlessApply` to fix any issues the linter finds
#### (Optional) Install pre-commit hooks

A pre-commit hook at `scripts/git-hooks/pre-commit` has been provided to help ensure all submitted changes are formatted correctly. To install all hooks in `scripts/git-hooks`, run:

```
git config core.hooksPath scripts/git-hooks
```
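To confirm the hook path took effect, you can read the setting back with `git config --get` (shown here in a throwaway repo so the commands are self-contained; in practice, run them from the repo root):

```shell
# Demonstration in a temporary repo; in the real repo just run the
# `git config` lines from the repository root.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config core.hooksPath scripts/git-hooks
git config --get core.hooksPath   # prints: scripts/git-hooks
```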
### Running SonarQube locally

SonarQube is a static analysis tool that scans code for a wide range of issues, including maintainability problems and possible bugs. Get more information from the DSP SonarQube docs.

If you get a build failure due to SonarQube and want to debug the problem locally, you need to get the sonar token from Vault before running the Gradle task:

```
export SONAR_TOKEN=$(vault read -field=sonar_token secret/secops/ci/sonarcloud/tsps)
./gradlew sonarqube
```

Running this task produces no output unless your project has errors. To generate a report, run using `--info`:

```
./gradlew sonarqube --info
```
### Connecting to the database

To connect to the Teaspoons database, we have a script in dsp-scripts that does all the setup for you. Clone that repo and make sure you're either on Broad Internal wifi or connected to the VPN. Then run the following command:

```
./firecloud/psql-connect.sh dev tsps
```
### Deploying to dev

Upon merging to main, the dev environment will be automatically deployed via the GitHub Action Bump, Tag, Publish, and Deploy (that workflow is defined here). The two tasks report-to-sherlock and set-version-in-dev will prompt Sherlock to deploy the new version to dev. You can check the status of the deployment in Beehive and in ArgoCD.

For more information about deployment to dev, check out DevOps' excellent documentation.
### Tracing

We use OpenTelemetry for tracing, so that every request has a tracing span that can be viewed in Google Cloud Trace. (This is not yet fully set up here; to be done in TSPS-107.) See this DSP blog post for more info.
Running the end-to-end tests
The end-to-end test is specified in .github/workflows/run-e2e-tests.yaml. It calls the test script defined
in the dsp-reusable-workflows repo.
The end-to-end test is automatically run nightly on the dev environment.
To run the test against a specific feature branch:
- Grab the image tag for your feature branch.
If you've opened a PR, you can find the image tag as follows:
- go to the Bump, Tag, Publish, and Deploy workflow that's triggered each time you push to your branch
- From there, go to the tag-publish-docker-deploy task
- Expand the "Construct docker image name and tag" step
- The first line should contain the image tag, something like "0.0.81-6761487".
- Navigate to the e2e-test GHA workflow
- Click on the "Run workflow" button and select your branch from the dropdown
- Enter the image tag from step 1 in the "Custom image tag" field
- If you've updated the end-to-end test in the dsp-resuable-workflows repo, enter either a commit hash or your git branch name. If you don't need to change the test, leave the default as main.
- Click the green "Run workflow" button.
## Files

- github.com-DataBiosphere-terra-scientific-pipelines-service-SubsetVcfByBedFile_2.3.0.zip (1.5 kB, md5:3cdee7cda5648bc19ee2a78094b4f66d)
## Additional details

### Related works

Is identical to:

- https://dockstore.org/aliases/workflow-versions/10.5281-zenodo.18968442
- https://dockstore.org/workflows/github.com/DataBiosphere/terra-scientific-pipelines-service/SubsetVcfByBedFile:2.3.0
- https://dockstore.org/api/ga4gh/trs/v2/tools/%23workflow%2Fgithub.com%2FDataBiosphere%2Fterra-scientific-pipelines-service%2FSubsetVcfByBedFile/versions/2.3.0/PLAIN-WDL/descriptor/SubsetVcfByBedFile.wdl