Compare commits
163 Commits
| SHA1 |
|---|
| 47b746c4b1 |
| 92fe759b0f |
| 356914f306 |
| 9f3748cc49 |
| f7063484d9 |
| b8014de7c3 |
| f4c457043a |
| dffe43beee |
| 3f1ac10290 |
| 85f980be0a |
| d37af01b3a |
| 6ae0e56118 |
| f1478561c1 |
| 236390e57b |
| 5725f8a22f |
| 57d05eab38 |
| 62ff8e8a28 |
| 91fdaa89f9 |
| 27f9a397b2 |
| a76180fd48 |
| ffeef91166 |
| 9cf99ec78d |
| 5f4e1b7207 |
| 9729004982 |
| 5a6f2072d7 |
| 189ba1d18c |
| 3474d4e7e5 |
| f1f69ad32f |
| b10d9a8cec |
| 5277cd16f4 |
| 850257f977 |
| 6826b7ba59 |
| dac129d16c |
| 2af2fd9b11 |
| 777dddbcaf |
| 3a34212899 |
| 6447047b86 |
| cebed936ae |
| 3744123361 |
| 7eaa94a705 |
| 399249faa3 |
| 26f095cdd2 |
| ff366fa6ba |
| d547ca12a1 |
| 8df84579a8 |
| 35221a7495 |
| a6a8427eed |
| ea9ae2abf3 |
| 505e78404e |
| 4c451962ae |
| d740ac8b9e |
| 8e7a272ee7 |
| 8931e921be |
| 81d4f775e9 |
| 9a379e74f2 |
| a5b442cd2a |
| 81919bface |
| a78d33fd5f |
| 26685633ce |
| 3b75177a4c |
| 9a0d419e18 |
| c4f5f2c1fd |
| 3e4982425d |
| 992ab6f168 |
| 1d9a263d23 |
| ea402ceaa9 |
| 55856a35dc |
| 3413b9ba71 |
| ebb8f31974 |
| 43921790fa |
| 83ec579214 |
| 0bb3ba7e82 |
| 410dd003f7 |
| 7d69ca27ae |
| d662e8509a |
| 928a331a40 |
| 0674128568 |
| ea5752f59b |
| 8d078e672b |
| 75639c176c |
| 3fe12a8030 |
| e2a6b19a2e |
| 6ede139084 |
| 366917c768 |
| d38e29160c |
| 847a5b2a6d |
| 1bd387b848 |
| bf6c0887d6 |
| 5c559f9652 |
| fa091db463 |
| 0506d76ea9 |
| d9ac3dc05b |
| 431bfb5a46 |
| d2e05ee4a0 |
| 6412e537a8 |
| b6f9d2697e |
| 11e8569c56 |
| f150e7baaf |
| f91af1148d |
| 179982dedc |
| 5efae0c64e |
| f172b00737 |
| b08f4e5790 |
| 7368f42e8b |
| 7049a902a4 |
| 84b3df524f |
| 6bca03aa7c |
| ca8668e5e5 |
| ea97f49ead |
| cd06b1b216 |
| 2efe4cbb2f |
| dfa0ef31b2 |
| ca8613ff1b |
| 89ef0de20c |
| cfe4f3dd4f |
| 02d58facbb |
| 8f2c90087d |
| 3365255a5b |
| ebb691449b |
| 454203a583 |
| 0efe638b30 |
| 7598e6d779 |
| 38c663e4f1 |
| 2cef1be491 |
| ec75498b29 |
| a28ecec025 |
| 8e769bd126 |
| 22892991f6 |
| 833bc12d4d |
| 9492ec8edc |
| 7a5c21dd36 |
| afe9f22379 |
| 448e35635f |
| 5d110da419 |
| 92629471ee |
| 04f26baaaa |
| dfe7f2eae9 |
| 39e713a07f |
| dc21d202c7 |
| 6fd4bd3171 |
| 96e380f838 |
| 4129d7ad7b |
| 8a39603554 |
| 8d3471cfab |
| f38b6aedba |
| df3ffadbbb |
| c2e805b29a |
| 4d91c6a3d5 |
| fe8997aa6f |
| 9cf6bdf753 |
| 05d23aa268 |
| 6a4778ad38 |
| 884ca7c948 |
| f57a616979 |
| 9b904d980f |
| afafcd0a22 |
| 9b7b8344de |
| 86fa94538a |
| d7835c7dc7 |
| 95f80fbe91 |
| 1e43e0c4bb |
| e3183ed0c9 |
| 097fefdfe7 |
.github/ISSUE_TEMPLATE/bug_report.md (vendored, new file, 38 lines)
@@ -0,0 +1,38 @@

```markdown
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**
 - OS: [e.g. iOS]
 - Browser [e.g. chrome, safari]
 - Version [e.g. 22]

**Smartphone (please complete the following information):**
 - Device: [e.g. iPhone6]
 - OS: [e.g. iOS8.1]
 - Browser [e.g. stock browser, safari]
 - Version [e.g. 22]

**Additional context**
Add any other context about the problem here.
```
.github/ISSUE_TEMPLATE/feature_request.md (vendored, new file, 20 lines)
@@ -0,0 +1,20 @@

```markdown
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
```
.github/ct/chart-schema.yaml (vendored, new file, 23 lines)
@@ -0,0 +1,23 @@

```yaml
name: str()
home: str()
version: str()
apiVersion: str()
appVersion: any(str(), num(), required=False)
type: str()
dependencies: any(required=False)
description: str()
keywords: list(str(), required=False)
sources: list(str(), required=False)
maintainers: list(include('maintainer'), required=False)
icon: str(required=False)
engine: str(required=False)
condition: str(required=False)
tags: str(required=False)
deprecated: bool(required=False)
kubeVersion: str(required=False)
annotations: map(str(), str(), required=False)
---
maintainer:
  name: str()
  email: str(required=False)
  url: str(required=False)
```
.github/ct/config.yaml (vendored, new file, 15 lines)
@@ -0,0 +1,15 @@

```yaml
debug: true
remote: origin
chart-yaml-schema: .github/ct/chart-schema.yaml
validate-maintainers: false
validate-chart-schema: true
validate-yaml: true
check-version-increment: true
chart-dirs:
  - charts
helm-extra-args: --timeout 300s
upgrade: true
skip-missing-values: true
release-label: release
release-name-template: "helm-v{{ .Version }}"
target-branch: master
```
.github/workflows/build-images.yaml (vendored, new file, 84 lines)
@@ -0,0 +1,84 @@

```yaml
name: Build Container Images

on:
  push:
    tags:
      - "image/v*"
    paths-ignore:
      - "charts/**"
  pull_request:
    branches: [master]
    paths-ignore:
      - "charts/**"

env:
  IMAGES: docker.io/hapiproject/hapi
  PLATFORMS: linux/amd64,linux/arm64/v8

jobs:
  build:
    name: Build
    runs-on: ubuntu-22.04
    steps:
      - name: Container meta for default (distroless) image
        id: docker_meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.IMAGES }}
          tags: |
            type=match,pattern=image/(.*),group=1,enable=${{github.event_name != 'pull_request'}}

      - name: Container meta for tomcat image
        id: docker_tomcat_meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.IMAGES }}
          tags: |
            type=match,pattern=image/(.*),group=1,enable=${{github.event_name != 'pull_request'}}
          flavor: |
            suffix=-tomcat,onlatest=true

      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to DockerHub
        uses: docker/login-action@v3
        if: github.event_name != 'pull_request'
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Cache Docker layers
        uses: actions/cache@v3
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-

      - name: Build and push default (distroless) image
        id: docker_build
        uses: docker/build-push-action@v5
        with:
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.docker_meta.outputs.tags }}
          labels: ${{ steps.docker_meta.outputs.labels }}
          platforms: ${{ env.PLATFORMS }}
          target: default

      - name: Build and push tomcat image
        id: docker_build_tomcat
        uses: docker/build-push-action@v5
        with:
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.docker_tomcat_meta.outputs.tags }}
          labels: ${{ steps.docker_tomcat_meta.outputs.labels }}
          platforms: ${{ env.PLATFORMS }}
          target: tomcat
```
.github/workflows/chart-release.yaml (vendored, new file, 41 lines)
@@ -0,0 +1,41 @@

```yaml
name: Release Charts

on:
  push:
    branches:
      - main
    paths:
      - "charts/**"

jobs:
  release:
    runs-on: ubuntu-22.04
    steps:
      - name: Add workspace as safe directory
        run: |
          git config --global --add safe.directory /__w/FHIRFLARE-IG-Toolkit/FHIRFLARE-IG-Toolkit

      - name: Checkout
        uses: actions/checkout@8e5e7e5ab8b370d6c329ec480221332ada57f0ab # v3.5.2
        with:
          fetch-depth: 0

      - name: Configure Git
        run: |
          git config user.name "$GITHUB_ACTOR"
          git config user.email "$GITHUB_ACTOR@users.noreply.github.com"

      - name: Update dependencies
        run: find charts/ ! -path charts/ -maxdepth 1 -type d -exec helm dependency update {} \;

      - name: Add Helm Repositories
        run: |
          helm repo add hapifhir https://hapifhir.github.io/hapi-fhir-jpaserver-starter/
          helm repo update

      - name: Run chart-releaser
        uses: helm/chart-releaser-action@be16258da8010256c6e82849661221415f031968 # v1.5.0
        with:
          config: .github/ct/config.yaml
        env:
          CR_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
```
.github/workflows/chart-test.yaml (vendored, new file, 73 lines)
@@ -0,0 +1,73 @@

```yaml
name: Lint and Test Charts

on:
  pull_request:
    branches:
      - master
    paths:
      - "charts/**"

jobs:
  lint:
    runs-on: ubuntu-22.04
    container: quay.io/helmpack/chart-testing:v3.11.0@sha256:f2fd21d30b64411105c7eafb1862783236a219d29f2292219a09fe94ca78ad2a
    steps:
      - name: Install helm-docs
        working-directory: /tmp
        env:
          HELM_DOCS_URL: https://github.com/norwoodj/helm-docs/releases/download/v1.14.2/helm-docs_1.14.2_Linux_x86_64.tar.gz
        run: |
          curl -LSs $HELM_DOCS_URL | tar xz && \
          mv ./helm-docs /usr/local/bin/helm-docs && \
          chmod +x /usr/local/bin/helm-docs && \
          helm-docs --version

      - name: Add workspace as safe directory
        run: |
          git config --global --add safe.directory /__w/hapi-fhir-jpaserver-starter/hapi-fhir-jpaserver-starter

      - name: Checkout
        uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
        with:
          fetch-depth: 0

      - name: Check if documentation is up-to-date
        run: helm-docs && git diff --exit-code HEAD

      - name: Run chart-testing (lint)
        run: ct lint --config .github/ct/config.yaml

  test:
    runs-on: ubuntu-22.04
    strategy:
      matrix:
        k8s-version: [1.30.8, 1.31.4, 1.32.0]
    needs:
      - lint
    steps:
      - name: Checkout
        uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
        with:
          fetch-depth: 0

      - name: Set up chart-testing
        uses: helm/chart-testing-action@e6669bcd63d7cb57cb4380c33043eebe5d111992 # v2.6.1

      - name: Run chart-testing (list-changed)
        id: list-changed
        run: |
          changed=$(ct list-changed --config .github/ct/config.yaml)
          if [[ -n "$changed" ]]; then
            echo "::set-output name=changed::true"
          fi

      - name: Create k8s Kind Cluster
        uses: helm/kind-action@dda0770415bac9fc20092cacbc54aa298604d140 # v1.8.0
        if: ${{ steps.list-changed.outputs.changed == 'true' }}
        with:
          cluster_name: kind-cluster-k8s-${{ matrix.k8s-version }}
          node_image: kindest/node:v${{ matrix.k8s-version }}

      - name: Run chart-testing (install)
        run: ct install --config .github/ct/config.yaml
        if: ${{ steps.list-changed.outputs.changed == 'true' }}
```
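The `list-changed` step in the workflow above publishes its result with the deprecated `::set-output` workflow command. On current GitHub runners the same result is achieved by appending a `key=value` line to the file named by `$GITHUB_OUTPUT`. A minimal sketch of that pattern, with the `changed` value hard-coded in place of the real `ct list-changed` call:

```shell
# Simulate GITHUB_OUTPUT with a temp file; on a real runner the
# variable is provided by GitHub Actions.
GITHUB_OUTPUT="$(mktemp)"

# Stand-in for: changed=$(ct list-changed --config .github/ct/config.yaml)
changed="hapi-fhir-jpaserver"

# Same logic as the workflow step, using the newer output mechanism.
if [ -n "$changed" ]; then
  echo "changed=true" >> "$GITHUB_OUTPUT"
fi

# The runner parses lines of the form key=value; a later step then
# sees steps.list-changed.outputs.changed == 'true'.
cat "$GITHUB_OUTPUT"
```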
.github/workflows/docker-publish.yml (vendored, new file, 58 lines)
@@ -0,0 +1,58 @@

```yaml
# This workflow builds and pushes a multi-architecture Docker image to GitHub Container Registry (ghcr.io).
#
# The Docker meta step is required because GitHub repository names can contain uppercase letters, but Docker image tags must be lowercase.
# The docker/metadata-action@v5 normalizes the repository name to lowercase, ensuring the build and push steps use a valid image tag.
#
# This workflow builds for both AMD64 and ARM64 architectures using Docker Buildx and QEMU emulation.

name: Build and Push Docker image

on:
  push:
    branches:
      - main
      - '*' # This will run the workflow on any branch
  workflow_dispatch: # This enables manual triggering

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/${{ github.repository }}

      - name: Set normalized image name
        run: |
          if [[ "${{ github.ref_name }}" == "main" ]]; then
            echo "IMAGE_NAME=$(echo ${{ steps.meta.outputs.tags }} | sed 's/:main/:latest/')" >> $GITHUB_ENV
          else
            echo "IMAGE_NAME=${{ steps.meta.outputs.tags }}" >> $GITHUB_ENV
          fi

      - name: Build and push multi-architecture Docker image
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./docker/Dockerfile
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ env.IMAGE_NAME }}
```
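The comments at the top of this workflow note that Docker image references must be lowercase while GitHub repository names may contain uppercase letters, which is why `docker/metadata-action` normalizes the name. The same transformation can be sketched in plain shell; the repository name below is only an illustrative example:

```shell
# Hypothetical repository name containing uppercase letters.
repo="Owner/FHIRFLARE-IG-Toolkit"

# Lowercase it the way a valid image reference requires.
image="ghcr.io/$(printf '%s' "$repo" | tr '[:upper:]' '[:lower:]')"

echo "$image"   # ghcr.io/owner/fhirflare-ig-toolkit
```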
.github/workflows/jekyll.yml (vendored, new file, 40 lines)
@@ -0,0 +1,40 @@

```yaml
name: Build and Deploy Jekyll site

on:
  push:
    branches: [ main, master ]
  pull_request:
    branches: [ main, master ]
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: website

    steps:
      - uses: actions/checkout@v3

      - name: Setup Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.2'
          bundler-cache: true
          working-directory: website

      - name: Install dependencies
        run: |
          bundle install

      - name: Build site
        run: bundle exec jekyll build

      - name: Deploy to gh_pages branch
        uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./website/_site
          publish_branch: gh-pages
          keep_files: true
```
.gitignore (vendored, new file, 7 lines)
@@ -0,0 +1,7 @@

```
/logs/
/.pydevproject
/__pycache__/
/myenv/
/tmp/
/Gemfile.lock
/_site/
```
.project (new file, 23 lines)
@@ -0,0 +1,23 @@

```xml
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
    <name>FHIRFLARE-IG-Toolkit</name>
    <comment></comment>
    <projects>
    </projects>
    <buildSpec>
        <buildCommand>
            <name>org.python.pydev.PyDevBuilder</name>
            <arguments>
            </arguments>
        </buildCommand>
        <buildCommand>
            <name>org.eclipse.wst.validation.validationbuilder</name>
            <arguments>
            </arguments>
        </buildCommand>
    </buildSpec>
    <natures>
        <nature>org.eclipse.wst.jsdt.core.jsNature</nature>
        <nature>org.python.pydev.pythonNature</nature>
    </natures>
</projectDescription>
```
||||
7
.settings/.jsdtscope
Normal file
@ -0,0 +1,7 @@
|
||||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<classpath>
|
||||
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.JRE_CONTAINER"/>
|
||||
<classpathentry kind="con" path="org.eclipse.wst.jsdt.launching.baseBrowserLibrary"/>
|
||||
<classpathentry kind="src" path=""/>
|
||||
<classpathentry kind="output" path=""/>
|
||||
</classpath>
|
||||
1
.settings/org.eclipse.wst.jsdt.ui.superType.container
Normal file
@ -0,0 +1 @@
|
||||
org.eclipse.wst.jsdt.launching.JRE_CONTAINER
|
||||
1
.settings/org.eclipse.wst.jsdt.ui.superType.name
Normal file
@ -0,0 +1 @@
|
||||
Global
|
||||
378
Build and Run for first time.bat
Normal file
@ -0,0 +1,378 @@
|
||||
@echo off
|
||||
setlocal enabledelayedexpansion
|
||||
|
||||
REM --- Configuration ---
|
||||
set REPO_URL_HAPI=https://github.com/hapifhir/hapi-fhir-jpaserver-starter.git
|
||||
set REPO_URL_CANDLE=https://github.com/FHIR/fhir-candle.git
|
||||
set CLONE_DIR_HAPI=hapi-fhir-jpaserver
|
||||
set CLONE_DIR_CANDLE=fhir-candle
|
||||
set SOURCE_CONFIG_DIR=hapi-fhir-setup
|
||||
set CONFIG_FILE=application.yaml
|
||||
|
||||
REM --- Define Paths ---
|
||||
set SOURCE_CONFIG_PATH=..\%SOURCE_CONFIG_DIR%\target\classes\%CONFIG_FILE%
|
||||
set DEST_CONFIG_PATH=%CLONE_DIR_HAPI%\target\classes\%CONFIG_FILE%
|
||||
|
||||
REM --- NEW: Define a variable for the custom FHIR URL and server type ---
|
||||
set "CUSTOM_FHIR_URL_VAL="
|
||||
set "SERVER_TYPE="
|
||||
set "CANDLE_FHIR_VERSION="
|
||||
|
||||
REM === MODIFIED: Prompt for Installation Mode ===
|
||||
:GetModeChoice
|
||||
SET "APP_MODE=" REM Clear the variable first
|
||||
echo.
|
||||
echo Select Installation Mode:
|
||||
echo 1. Lite (Excludes local HAPI FHIR Server - No Git/Maven/Dotnet needed)
|
||||
echo 2. Custom URL (Uses a custom FHIR Server - No Git/Maven/Dotnet needed)
|
||||
echo 3. Hapi (Includes local HAPI FHIR Server - Requires Git & Maven)
|
||||
echo 4. Candle (Includes local FHIR Candle Server - Requires Git & Dotnet)
|
||||
CHOICE /C 1234 /N /M "Enter your choice (1, 2, 3, or 4):"
|
||||
|
||||
IF ERRORLEVEL 4 (
|
||||
SET APP_MODE=standalone
|
||||
SET SERVER_TYPE=candle
|
||||
goto :GetCandleFhirVersion
|
||||
)
|
||||
IF ERRORLEVEL 3 (
|
||||
SET APP_MODE=standalone
|
||||
SET SERVER_TYPE=hapi
|
||||
goto :ModeSet
|
||||
)
|
||||
IF ERRORLEVEL 2 (
|
||||
SET APP_MODE=standalone
|
||||
goto :GetCustomUrl
|
||||
)
|
||||
IF ERRORLEVEL 1 (
|
||||
SET APP_MODE=lite
|
||||
goto :ModeSet
|
||||
)
|
||||
|
||||
REM If somehow neither was chosen (e.g., Ctrl+C), loop back
|
||||
echo Invalid input. Please try again.
|
||||
goto :GetModeChoice
|
||||
|
||||
:GetCustomUrl
|
||||
set "CONFIRMED_URL="
|
||||
:PromptUrlLoop
|
||||
echo.
|
||||
set /p "CUSTOM_URL_INPUT=Please enter the custom FHIR server URL: "
|
||||
echo.
|
||||
echo You entered: !CUSTOM_URL_INPUT!
|
||||
set /p "CONFIRM_URL=Is this URL correct? (Y/N): "
|
||||
if /i "!CONFIRM_URL!" EQU "Y" (
|
||||
set "CONFIRMED_URL=!CUSTOM_URL_INPUT!"
|
||||
goto :ConfirmUrlLoop
|
||||
) else (
|
||||
goto :PromptUrlLoop
|
||||
)
|
||||
:ConfirmUrlLoop
|
||||
echo.
|
||||
echo Please re-enter the URL to confirm it is correct:
|
||||
set /p "CUSTOM_URL_INPUT=Re-enter URL: "
|
||||
if /i "!CUSTOM_URL_INPUT!" EQU "!CONFIRMED_URL!" (
|
||||
set "CUSTOM_FHIR_URL_VAL=!CUSTOM_URL_INPUT!"
|
||||
echo.
|
||||
echo Custom URL confirmed: !CUSTOM_FHIR_URL_VAL!
|
||||
goto :ModeSet
|
||||
) else (
|
||||
echo.
|
||||
echo URLs do not match. Please try again.
|
||||
goto :PromptUrlLoop
|
||||
)
|
||||
|
||||
:GetCandleFhirVersion
|
||||
echo.
|
||||
echo Select the FHIR version for the Candle server:
|
||||
echo 1. R4 (4.0)
|
||||
echo 2. R4B (4.3)
|
||||
echo 3. R5 (5.0)
|
||||
CHOICE /C 123 /N /M "Enter your choice (1, 2, or 3):"
|
||||
IF ERRORLEVEL 3 (
|
||||
SET CANDLE_FHIR_VERSION=r5
|
||||
goto :ModeSet
|
||||
)
|
||||
IF ERRORLEVEL 2 (
|
||||
SET CANDLE_FHIR_VERSION=r4b
|
||||
goto :ModeSet
|
||||
)
|
||||
IF ERRORLEVEL 1 (
|
||||
SET CANDLE_FHIR_VERSION=r4
|
||||
goto :ModeSet
|
||||
)
|
||||
echo Invalid input. Please try again.
|
||||
goto :GetCandleFhirVersion
|
||||
|
||||
:ModeSet
|
||||
IF "%APP_MODE%"=="" (
|
||||
echo Invalid choice detected after checks. Exiting.
|
||||
goto :eof
|
||||
)
|
||||
echo Selected Mode: %APP_MODE%
|
||||
echo Server Type: %SERVER_TYPE%
|
||||
echo.
|
||||
REM === END MODIFICATION ===
|
||||
|
||||
|
||||
REM === Conditionally Execute Server Setup ===
|
||||
IF "%SERVER_TYPE%"=="hapi" (
|
||||
echo Running Hapi server setup...
|
||||
echo.
|
||||
|
||||
REM --- Step 0: Clean up previous clone (optional) ---
|
||||
echo Checking for existing directory: %CLONE_DIR_HAPI%
|
||||
if exist "%CLONE_DIR_HAPI%" (
|
||||
echo Found existing directory, removing it...
|
||||
rmdir /s /q "%CLONE_DIR_HAPI%"
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to remove existing directory: %CLONE_DIR_HAPI%
|
||||
goto :error
|
||||
)
|
||||
echo Existing directory removed.
|
||||
) else (
|
||||
echo Directory does not exist, proceeding with clone.
|
||||
)
|
||||
echo.
|
||||
|
||||
REM --- Step 1: Clone the HAPI FHIR server repository ---
|
||||
echo Cloning repository: %REPO_URL_HAPI% into %CLONE_DIR_HAPI%...
|
||||
git clone "%REPO_URL_HAPI%" "%CLONE_DIR_HAPI%"
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to clone repository. Check Git installation and network connection.
|
||||
goto :error
|
||||
)
|
||||
echo Repository cloned successfully.
|
||||
echo.
|
||||
|
||||
REM --- Step 2: Navigate into the cloned directory ---
|
||||
echo Changing directory to %CLONE_DIR_HAPI%...
|
||||
cd "%CLONE_DIR_HAPI%"
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to change directory to %CLONE_DIR_HAPI%.
|
||||
goto :error
|
||||
)
|
||||
echo Current directory: %CD%
|
||||
echo.
|
||||
|
||||
REM --- Step 3: Build the HAPI server using Maven ---
|
||||
echo ===> "Starting Maven build (Step 3)...""
|
||||
cmd /c "mvn clean package -DskipTests=true -Pboot"
|
||||
echo ===> Maven command finished. Checking error level...
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Maven build failed or cmd /c failed
|
||||
cd ..
|
||||
goto :error
|
||||
)
|
||||
echo Maven build completed successfully. ErrorLevel: %errorlevel%
|
||||
echo.
|
||||
|
||||
REM --- Step 4: Copy the configuration file ---
|
||||
echo ===> "Starting file copy (Step 4)..."
|
||||
echo Copying configuration file...
|
||||
echo Source: %SOURCE_CONFIG_PATH%
|
||||
echo Destination: target\classes\%CONFIG_FILE%
|
||||
xcopy "%SOURCE_CONFIG_PATH%" "target\classes\" /Y /I
|
||||
echo ===> xcopy command finished. Checking error level...
|
||||
if errorlevel 1 (
|
||||
echo WARNING: Failed to copy configuration file. Check if the source file exists.
|
||||
echo The script will continue, but the server might use default configuration.
|
||||
) else (
|
||||
echo Configuration file copied successfully. ErrorLevel: %errorlevel%
|
||||
)
|
||||
echo.
|
||||
|
||||
REM --- Step 5: Navigate back to the parent directory ---
|
||||
echo ===> "Changing directory back (Step 5)..."
|
||||
cd ..
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to change back to the parent directory. ErrorLevel: %errorlevel%
|
||||
goto :error
|
||||
)
|
||||
echo Current directory: %CD%
|
||||
echo.
|
||||
|
||||
) ELSE IF "%SERVER_TYPE%"=="candle" (
|
||||
echo Running FHIR Candle server setup...
|
||||
echo.
|
||||
|
||||
REM --- Step 0: Clean up previous clone (optional) ---
|
||||
echo Checking for existing directory: %CLONE_DIR_CANDLE%
|
||||
if exist "%CLONE_DIR_CANDLE%" (
|
||||
echo Found existing directory, removing it...
|
||||
rmdir /s /q "%CLONE_DIR_CANDLE%"
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to remove existing directory: %CLONE_DIR_CANDLE%
|
||||
goto :error
|
||||
)
|
||||
echo Existing directory removed.
|
||||
) else (
|
||||
echo Directory does not exist, proceeding with clone.
|
||||
)
|
||||
echo.
|
||||
|
||||
REM --- Step 1: Clone the FHIR Candle server repository ---
|
||||
echo Cloning repository: %REPO_URL_CANDLE% into %CLONE_DIR_CANDLE%...
|
||||
git clone "%REPO_URL_CANDLE%" "%CLONE_DIR_CANDLE%"
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to clone repository. Check Git and Dotnet installation and network connection.
|
||||
goto :error
|
||||
)
|
||||
echo Repository cloned successfully.
|
||||
echo.
|
||||
|
||||
REM --- Step 2: Navigate into the cloned directory ---
|
||||
echo Changing directory to %CLONE_DIR_CANDLE%...
|
||||
cd "%CLONE_DIR_CANDLE%"
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to change directory to %CLONE_DIR_CANDLE%.
|
||||
goto :error
|
||||
)
|
||||
echo Current directory: %CD%
|
||||
echo.
|
||||
|
||||
REM --- Step 3: Build the FHIR Candle server using Dotnet ---
|
||||
echo ===> "Starting Dotnet build (Step 3)...""
|
||||
dotnet publish -c Release -f net9.0 -o publish
|
||||
echo ===> Dotnet command finished. Checking error level...
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Dotnet build failed. Check Dotnet SDK installation.
|
||||
cd ..
|
||||
goto :error
|
||||
)
|
||||
echo Dotnet build completed successfully. ErrorLevel: %errorlevel%
|
||||
echo.
|
||||
|
||||
REM --- Step 4: Navigate back to the parent directory ---
|
||||
echo ===> "Changing directory back (Step 4)..."
|
||||
cd ..
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Failed to change back to the parent directory. ErrorLevel: %errorlevel%
|
||||
goto :error
|
||||
)
|
||||
echo Current directory: %CD%
|
||||
echo.
|
||||
|
||||
) ELSE (
|
||||
echo Running Lite setup, skipping server build...
|
||||
REM Ensure the server directories don't exist in Lite mode
|
||||
if exist "%CLONE_DIR_HAPI%" (
|
||||
echo Found existing HAPI directory in Lite mode. Removing it to avoid build issues...
|
||||
rmdir /s /q "%CLONE_DIR_HAPI%"
|
||||
)
|
||||
if exist "%CLONE_DIR_CANDLE%" (
|
||||
echo Found existing Candle directory in Lite mode. Removing it to avoid build issues...
|
||||
rmdir /s /q "%CLONE_DIR_CANDLE%"
|
||||
)
|
||||
REM Create empty placeholder files to satisfy Dockerfile COPY commands in Lite mode.
|
||||
mkdir "%CLONE_DIR_HAPI%\target\classes" 2> nul
|
||||
mkdir "%CLONE_DIR_HAPI%\custom" 2> nul
|
||||
echo. > "%CLONE_DIR_HAPI%\target\ROOT.war"
|
||||
echo. > "%CLONE_DIR_HAPI%\target\classes\application.yaml"
|
||||
mkdir "%CLONE_DIR_CANDLE%\publish" 2> nul
|
||||
echo. > "%CLONE_DIR_CANDLE%\publish\fhir-candle.dll"
|
||||
echo Placeholder files created for Lite mode build.
|
||||
echo.
|
||||
)
|
||||
|
||||
REM === MODIFIED: Update docker-compose.yml to set APP_MODE and HAPI_FHIR_URL ===
|
||||
echo Updating docker-compose.yml with APP_MODE=%APP_MODE% and HAPI_FHIR_URL...
|
||||
(
|
||||
echo version: '3.8'
|
||||
echo services:
|
||||
echo fhirflare:
|
||||
echo build:
|
||||
echo context: .
|
||||
IF "%SERVER_TYPE%"=="hapi" (
|
||||
echo dockerfile: Dockerfile.hapi
|
||||
) ELSE IF "%SERVER_TYPE%"=="candle" (
|
||||
echo dockerfile: Dockerfile.candle
|
||||
) ELSE (
|
||||
echo dockerfile: Dockerfile.lite
|
||||
)
|
||||
echo ports:
|
||||
IF "%SERVER_TYPE%"=="candle" (
|
||||
echo - "5000:5000"
|
||||
echo - "5001:5826"
|
||||
) ELSE (
|
||||
echo - "5000:5000"
|
||||
echo - "8080:8080"
|
||||
)
|
||||
echo volumes:
|
||||
echo - ./instance:/app/instance
|
||||
echo - ./static/uploads:/app/static/uploads
|
||||
IF "%SERVER_TYPE%"=="hapi" (
|
||||
echo - ./instance/hapi-h2-data/:/app/h2-data # Keep volume mounts consistent
|
||||
)
|
||||
echo - ./logs:/app/logs
|
||||
IF "%SERVER_TYPE%"=="hapi" (
|
||||
echo - ./hapi-fhir-jpaserver/target/ROOT.war:/usr/local/tomcat/webapps/ROOT.war
|
||||
echo - ./hapi-fhir-jpaserver/target/classes/application.yaml:/usr/local/tomcat/conf/application.yaml
|
||||
) ELSE IF "%SERVER_TYPE%"=="candle" (
|
||||
echo - ./fhir-candle/publish/:/app/fhir-candle-publish/
|
||||
)
|
||||
echo environment:
|
||||
echo - FLASK_APP=app.py
|
||||
echo - FLASK_ENV=development
|
||||
echo - NODE_PATH=/usr/lib/node_modules
|
||||
echo - APP_MODE=%APP_MODE%
|
||||
echo - APP_BASE_URL=http://localhost:5000
|
||||
IF DEFINED CUSTOM_FHIR_URL_VAL (
|
||||
echo - HAPI_FHIR_URL=!CUSTOM_FHIR_URL_VAL!
|
||||
) ELSE (
|
||||
IF "%SERVER_TYPE%"=="candle" (
|
||||
echo - HAPI_FHIR_URL=http://localhost:5826/fhir/%CANDLE_FHIR_VERSION%
|
||||
echo - ASPNETCORE_URLS=http://0.0.0.0:5826
|
||||
) ELSE (
|
||||
echo - HAPI_FHIR_URL=http://localhost:8080/fhir
|
||||
)
|
||||
)
|
||||
echo command: supervisord -c /etc/supervisord.conf
|
||||
) > docker-compose.yml.tmp
|
||||
|
||||
REM Check if docker-compose.yml.tmp was created successfully
|
||||
if not exist docker-compose.yml.tmp (
|
||||
echo ERROR: Failed to create temporary docker-compose file.
|
||||
goto :error
|
||||
)
|
||||
|
||||
REM Replace the original docker-compose.yml
|
||||
del docker-compose.yml /Q > nul 2>&1
|
||||
ren docker-compose.yml.tmp docker-compose.yml
|
||||
echo docker-compose.yml updated successfully.
|
||||
echo.
|
||||
|
||||
REM --- Step 6: Build Docker images ---
|
||||
echo ===> Starting Docker build (Step 6)...
|
||||
docker-compose build --no-cache
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Docker Compose build failed. Check Docker installation and docker-compose.yml file. ErrorLevel: %errorlevel%
|
||||
goto :error
|
||||
)
|
||||
echo Docker images built successfully. ErrorLevel: %errorlevel%
|
||||
echo.
|
||||
|
||||
REM --- Step 7: Start Docker containers ---
|
||||
echo ===> Starting Docker containers (Step 7)...
|
||||
docker-compose up -d
|
||||
if errorlevel 1 (
|
||||
echo ERROR: Docker Compose up failed. Check Docker installation and container configurations. ErrorLevel: %errorlevel%
|
||||
goto :error
|
||||
)
|
||||
echo Docker containers started successfully. ErrorLevel: %errorlevel%
|
||||
echo.
|
||||
|
||||
echo ====================================
|
||||
echo Script finished successfully! (Mode: %APP_MODE%)
|
||||
echo ====================================
|
||||
goto :eof
|
||||
|
||||
:error
|
||||
echo ------------------------------------
|
||||
echo An error occurred. Script aborted.
|
||||
echo ------------------------------------
|
||||
pause
|
||||
exit /b 1
|
||||
|
||||
:eof
|
||||
echo Script execution finished.
|
||||
pause
|
||||
DockerCommands.MD (new file, 26 lines)
@@ -0,0 +1,26 @@

```markdown
Docker Commands.MD


<HAPI-server.>
to pull and clone:
git clone https://github.com/hapifhir/hapi-fhir-jpaserver-starter.git hapi-fhir-jpaserver

to build:
mvn clean package -DskipTests=true -Pboot

to run:
java -jar target/ROOT.war


<rest-of-the-app:>

docker-compose build --no-cache
docker-compose up -d


<useful-stuff:>

docker cp <CONTAINERID>:/app/PATH/Filename.ext .   (copies the file into the folder you ran the command from)

docker exec -it <CONTAINERID> bash   (opens a bash session inside the container)
```
73
Dockerfile
@ -1,22 +1,73 @@
FROM python:3.9-slim

WORKDIR /app
# Base image with Python and Java
FROM tomcat:10.1-jdk17

# Install build dependencies, Node.js 18, and coreutils (for stdbuf)
RUN apt-get update && apt-get install -y --no-install-recommends \
    python3 python3-pip python3-venv curl coreutils git \
    && curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# ADDED: Install the .NET SDK for the FHIR Candle server
# This makes the image a universal base for either server type
RUN apt-get update && apt-get install -y dotnet-sdk-6.0

# Install GoFSH and SUSHI
RUN npm install -g gofsh fsh-sushi

# ADDED: Download the latest HL7 FHIR Validator CLI
RUN mkdir -p /app/validator_cli
WORKDIR /app/validator_cli
# Download the validator JAR
RUN curl -L -o validator_cli.jar "https://github.com/hapifhir/org.hl7.fhir.core/releases/latest/download/validator_cli.jar"

# Set permissions for the downloaded file
RUN chmod 755 validator_cli.jar

# Change back to the main app directory for the next steps
WORKDIR /app
# Set up Python environment
RUN python3 -m venv /app/venv
ENV PATH="/app/venv/bin:$PATH"

# ADDED: Uninstall old fhirpath in case it's in requirements.txt
RUN pip uninstall -y fhirpath || true
# ADDED: Install the new fhirpathpy library
RUN pip install --no-cache-dir fhirpathpy

# Copy Flask files
COPY requirements.txt .
# Install requirements (including Pydantic - check version compatibility if needed)
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
COPY services.py .
COPY forms.py .
COPY package.py .
COPY templates/ templates/
COPY static/ static/
COPY tests/ tests/

# Ensure /tmp is writable as a fallback
RUN mkdir -p /tmp && chmod 777 /tmp
# Ensure /tmp, /app/h2-data, /app/static/uploads, and /app/logs are writable
RUN mkdir -p /tmp /app/h2-data /app/static/uploads /app/logs && chmod 777 /tmp /app/h2-data /app/static/uploads /app/logs

EXPOSE 5000
ENV FLASK_APP=app.py
ENV FLASK_ENV=development
CMD ["flask", "run", "--host=0.0.0.0"]
# Copy pre-built HAPI WAR and configuration
COPY hapi-fhir-jpaserver/target/ROOT.war /usr/local/tomcat/webapps/
COPY hapi-fhir-jpaserver/target/classes/application.yaml /usr/local/tomcat/conf/
COPY hapi-fhir-jpaserver/target/classes/application.yaml /app/config/application.yaml
COPY hapi-fhir-jpaserver/target/classes/application.yaml /usr/local/tomcat/webapps/app/config/application.yaml
COPY hapi-fhir-jpaserver/custom/ /usr/local/tomcat/webapps/custom/

# ADDED: Copy pre-built Candle DLL files
COPY fhir-candle/publish/ /app/fhir-candle-publish/

# Install supervisord
RUN pip install supervisor

# Configure supervisord
COPY supervisord.conf /etc/supervisord.conf

# Expose ports
EXPOSE 5000 8080 5001

# Start supervisord
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
64
Dockerfile.candle
Normal file
@ -0,0 +1,64 @@
# Base image with Python and .NET
FROM mcr.microsoft.com/dotnet/sdk:9.0

# Install build dependencies, Node.js 18, and coreutils (for stdbuf)
RUN apt-get update && apt-get install -y --no-install-recommends \
    python3 python3-pip python3-venv \
    default-jre-headless \
    curl coreutils git \
    && curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && rm -rf /var/lib/apt/lists/*

# Install GoFSH and SUSHI
RUN npm install -g gofsh fsh-sushi

# ADDED: Download the latest HL7 FHIR Validator CLI
RUN mkdir -p /app/validator_cli
WORKDIR /app/validator_cli
# Download the validator JAR
RUN curl -L -o validator_cli.jar "https://github.com/hapifhir/org.hl7.fhir.core/releases/latest/download/validator_cli.jar"

# Set permissions for the downloaded file
RUN chmod 755 validator_cli.jar

# Change back to the main app directory for the next steps
WORKDIR /app
# Set up Python environment
RUN python3 -m venv /app/venv
ENV PATH="/app/venv/bin:$PATH"

# ADDED: Uninstall old fhirpath in case it's in requirements.txt
RUN pip uninstall -y fhirpath || true
# ADDED: Install the new fhirpathpy library
RUN pip install --no-cache-dir fhirpathpy

# Copy Flask files
COPY requirements.txt .
# Install requirements (including Pydantic - check version compatibility if needed)
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
COPY services.py .
COPY forms.py .
COPY package.py .
COPY templates/ templates/
COPY static/ static/
COPY tests/ tests/

# Ensure /tmp, /app/h2-data, /app/static/uploads, and /app/logs are writable
RUN mkdir -p /tmp /app/h2-data /app/static/uploads /app/logs && chmod 777 /tmp /app/h2-data /app/static/uploads /app/logs

# Copy pre-built Candle DLL files
COPY fhir-candle/publish/ /app/fhir-candle-publish/

# Install supervisord
RUN pip install supervisor

# Configure supervisord
COPY supervisord.conf /etc/supervisord.conf

# Expose ports
EXPOSE 5000 5001

# Start supervisord
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
66
Dockerfile.hapi
Normal file
@ -0,0 +1,66 @@
# Base image with Python, Java, and Maven
FROM tomcat:10.1-jdk17

# Install build dependencies, Node.js 18, and coreutils (for stdbuf)
RUN apt-get update && apt-get install -y --no-install-recommends \
    python3 python3-pip python3-venv curl coreutils git maven \
    && curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && rm -rf /var/lib/apt/lists/*

# Install GoFSH and SUSHI
RUN npm install -g gofsh fsh-sushi

# ADDED: Download the latest HL7 FHIR Validator CLI
RUN mkdir -p /app/validator_cli
WORKDIR /app/validator_cli
# Download the validator JAR
RUN curl -L -o validator_cli.jar "https://github.com/hapifhir/org.hl7.fhir.core/releases/latest/download/validator_cli.jar"

# Set permissions for the downloaded file
RUN chmod 755 validator_cli.jar

# Change back to the main app directory for the next steps
WORKDIR /app
# Set up Python environment
RUN python3 -m venv /app/venv
ENV PATH="/app/venv/bin:$PATH"

# ADDED: Uninstall old fhirpath in case it's in requirements.txt
RUN pip uninstall -y fhirpath || true
# ADDED: Install the new fhirpathpy library
RUN pip install --no-cache-dir fhirpathpy

# Copy Flask files
COPY requirements.txt .
# Install requirements (including Pydantic - check version compatibility if needed)
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
COPY services.py .
COPY forms.py .
COPY package.py .
COPY templates/ templates/
COPY static/ static/
COPY tests/ tests/

# Ensure /tmp, /app/h2-data, /app/static/uploads, and /app/logs are writable
RUN mkdir -p /tmp /app/h2-data /app/static/uploads /app/logs && chmod 777 /tmp /app/h2-data /app/static/uploads /app/logs

# Copy pre-built HAPI WAR and configuration
COPY hapi-fhir-jpaserver/target/ROOT.war /usr/local/tomcat/webapps/
COPY hapi-fhir-jpaserver/target/classes/application.yaml /usr/local/tomcat/conf/
COPY hapi-fhir-jpaserver/target/classes/application.yaml /app/config/application.yaml
COPY hapi-fhir-jpaserver/target/classes/application.yaml /usr/local/tomcat/webapps/app/config/application.yaml
COPY hapi-fhir-jpaserver/custom/ /usr/local/tomcat/webapps/custom/

# Install supervisord
RUN pip install supervisor

# Configure supervisord
COPY supervisord.conf /etc/supervisord.conf

# Expose ports
EXPOSE 5000 8080

# Start supervisord
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
60
Dockerfile.lite
Normal file
@ -0,0 +1,60 @@
# Base image with Python and Node.js
FROM python:3.9-slim

# Install JRE and other dependencies for the validator
# We need `default-jre-headless` for a minimal Java Runtime Environment
RUN apt-get update && apt-get install -y --no-install-recommends \
    default-jre-headless \
    curl coreutils git \
    && curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && rm -rf /var/lib/apt/lists/*

# Install GoFSH and SUSHI
RUN npm install -g gofsh fsh-sushi

# ADDED: Download the latest HL7 FHIR Validator CLI
RUN mkdir -p /app/validator_cli
WORKDIR /app/validator_cli
# Download the validator JAR
RUN curl -L -o validator_cli.jar "https://github.com/hapifhir/org.hl7.fhir.core/releases/latest/download/validator_cli.jar"

# Set permissions for the downloaded file
RUN chmod 755 validator_cli.jar

# Change back to the main app directory for the next steps
WORKDIR /app
# Set up Python environment
RUN python3 -m venv /app/venv
ENV PATH="/app/venv/bin:$PATH"

# ADDED: Uninstall old fhirpath in case it's in requirements.txt
RUN pip uninstall -y fhirpath || true
# ADDED: Install the new fhirpathpy library
RUN pip install --no-cache-dir fhirpathpy

# Copy Flask files
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
COPY services.py .
COPY forms.py .
COPY package.py .
COPY templates/ templates/
COPY static/ static/
COPY tests/ tests/

# Ensure necessary directories are writable
RUN mkdir -p /app/static/uploads /app/logs && chmod 777 /app/static/uploads /app/logs

# Install supervisord
RUN pip install supervisor

# Configure supervisord
COPY supervisord.conf /etc/supervisord.conf

# Expose ports
EXPOSE 5000

# Start supervisord
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
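All four Dockerfiles above copy a `supervisord.conf` into `/etc/supervisord.conf`, but the file itself is not part of this diff. As a rough illustration only — the program names, paths, and options below are assumptions, not the project's actual configuration — a minimal `supervisord.conf` for the Lite image could look like:

```ini
; Hypothetical sketch of a minimal supervisord.conf for the Lite image.
; Program names and command paths are illustrative assumptions.
[supervisord]
nodaemon=true                        ; stay in the foreground so Docker tracks the process
logfile=/app/logs/supervisord.log

[program:flask]
command=/app/venv/bin/flask run --host=0.0.0.0 --port=5000
directory=/app
environment=FLASK_APP="app.py"
autorestart=true
stdout_logfile=/app/logs/flask.log
redirect_stderr=true
```

The HAPI and Candle variants would add a second `[program:...]` section for the embedded FHIR server, which is why those images expose the extra 8080/5001 ports.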
861
README.md
@ -1,239 +1,704 @@
# FHIRFLARE IG Toolkit

## Overview

FHIRFLARE IG Toolkit is a Flask-based web application designed to simplify the management of FHIR Implementation Guides (IGs). It allows users to import, process, view, and manage FHIR packages, with features to handle duplicate dependencies and visualize processed IGs. The toolkit is built to assist developers, researchers, and healthcare professionals working with FHIR standards.
The FHIRFLARE IG Toolkit is a Flask-based web application designed to streamline the management, processing, validation, and deployment of FHIR Implementation Guides (IGs) and test data. It offers a user-friendly interface for importing IG packages, extracting metadata, validating FHIR resources or bundles, pushing IGs to FHIR servers, converting FHIR resources to FHIR Shorthand (FSH), uploading complex test data sets with dependency management, and retrieving/splitting FHIR bundles. The toolkit includes live consoles for real-time feedback, making it an essential tool for FHIR developers and implementers.

This tool was initially developed as an IG package viewer within a larger project I was building. As the requirements expanded and it became clear there was a strong need for a lightweight, purpose-built solution to interact with IG packages, I decided to decouple it from the original codebase and release it as a standalone open-source utility for broader community use.
The application can run in four modes:

### Key Features
- **Import FHIR Packages**: Download FHIR IGs and their dependencies by specifying package names and versions.
- **Manage Duplicates**: Detect and highlight duplicate packages with different versions, using color-coded indicators.
- **Process IGs**: Extract and process FHIR resources, including structure definitions, must-support elements, and examples.
- **View Details**: Explore processed IGs with detailed views of resource types and examples.
- **Database Integration**: Store processed IGs in a SQLite database for persistence.
- **User-Friendly Interface**: Built with Bootstrap for a responsive and intuitive UI.

## Installation Modes (Lite, Custom URL, Hapi, and Candle)

This toolkit now offers four primary installation modes, configured via a simple command-line prompt during setup:

* **Lite Mode:**
    * Includes the Flask frontend, SQLite database, and core tooling (GoFSH, SUSHI) without an embedded FHIR server.
    * Requires you to provide a URL for an external FHIR server when using features like the FHIR API Explorer and validation.
    * Does not require Git, Maven, or .NET for setup.
    * Ideal for users who will always connect to an existing external FHIR server.
* **Custom URL Mode:**
    * Similar to Lite Mode, but prompts you for a specific external FHIR server URL to use as the default for the toolkit.
    * The application will be pre-configured to use your custom URL for all FHIR-related operations.
    * Does not require Git, Maven, or .NET for setup.
* **Hapi Mode:**
    * Includes the full FHIRFLARE toolkit and an embedded HAPI FHIR server.
    * The docker-compose configuration will proxy requests to the internal HAPI server, which is accessible via `http://localhost:8080/fhir`.
    * Requires Git and Maven during the initial build process to compile the HAPI FHIR server.
    * Ideal for users who want a self-contained, offline development and testing environment.
* **Candle Mode:**
    * Includes the full FHIRFLARE toolkit and an embedded FHIR Candle server.
    * The docker-compose configuration will proxy requests to the internal Candle server, which is accessible via `http://localhost:5001/fhir/<version>`.
    * Requires Git and the .NET SDK during the initial build process.
    * Ideal for users who want to use the .NET-based FHIR server for development and testing.
## Installation Modes (Lite vs. Standalone)

This toolkit offers two primary installation modes to suit different needs:

* **Standalone Version - Hapi / Candle:**
    * Includes the full FHIRFLARE Toolkit application **and** an embedded HAPI FHIR server running locally within the Docker environment.
    * Allows for local FHIR resource validation using HAPI FHIR's capabilities.
    * Enables the "Use Local HAPI" option in the FHIR API Explorer and FHIR UI Operations pages, proxying requests to the internal HAPI server (`http://localhost:8080/fhir`).
    * Requires Git and Maven during the initial build process (via the `.bat` script or manual steps) to prepare the HAPI FHIR server.
    * Ideal for users who want a self-contained environment for development and testing or who don't have readily available external FHIR servers.

* **Standalone Version - Custom Mode:**
    * Includes the full FHIRFLARE Toolkit application **and** points the environment variable at your chosen custom FHIR URL endpoint, allowing for custom setups.

* **Lite Version:**
    * Includes the FHIRFLARE Toolkit application **without** the embedded HAPI FHIR server.
    * Requires users to provide URLs for external FHIR servers when using features like the FHIR API Explorer and FHIR UI Operations pages. The "Use Local HAPI" option will be disabled in the UI.
    * Resource validation relies solely on local checks against downloaded StructureDefinitions, which may be less comprehensive than HAPI FHIR's validation (e.g., for terminology bindings or complex invariants).
    * **Does not require Git or Maven** for setup if using the `.bat` script or running the pre-built Docker image.
    * Ideal for users who primarily want to use the IG management, processing, and FSH conversion features, or who will always connect to existing external FHIR servers.
## Features

* **Import IGs:** Download FHIR IG packages and dependencies from a package registry, supporting flexible version formats (e.g., `1.2.3`, `1.1.0-preview`, `current`) and dependency pulling modes (Recursive, Patch Canonical, Tree Shaking).
* **Enhanced Package Search and Import:**
    * Interactive page (`/search-and-import`) to search for FHIR IG packages from configured registries.
    * Displays package details, version history, dependencies, and dependents.
    * Utilizes a local database cache (`CachedPackage`) for faster subsequent searches.
    * Background task to refresh the package cache from registries (`/api/refresh-cache-task`).
    * Direct import from search results.
* **Manage IGs:** View, process, unload, or delete downloaded IGs, with duplicate detection and resolution.
* **Process IGs:** Extract resource types, profiles, must-support elements, examples, and profile relationships (`structuredefinition-compliesWithProfile` and `structuredefinition-imposeProfile`).
* **Validate FHIR Resources/Bundles:** Validate single FHIR resources or bundles against selected IGs, with detailed error and warning reports (alpha feature). *Note: Lite version uses local SD checks only.*
* **Push IGs:** Upload IG resources (and optionally dependencies) to a target FHIR server. Features include:
    * Real-time console output.
    * Authentication support (Bearer Token).
    * Filtering by resource type or specific files to skip.
    * Semantic comparison to skip uploading identical resources (override with **Force Upload** option).
    * Correct handling of canonical resources (searching by URL/version before deciding POST/PUT).
    * Dry run mode for simulation.
    * Verbose logging option.
* **Upload Test Data:** Upload complex sets of test data (individual JSON/XML files or ZIP archives) to a target FHIR server. Features include:
    * Robust parsing of JSON and XML (using the `fhir.resources` library when available).
    * Automatic dependency analysis based on resource references within the uploaded set.
    * Topological sorting to ensure resources are uploaded in the correct order.
    * Cycle detection in dependencies.
    * Choice of individual resource uploads or a single transaction bundle.
    * **Optional Pre-Upload Validation:** Validate resources against a selected profile package before uploading.
    * **Optional Conditional Uploads (Individual Mode):** Check resource existence (GET) and use conditional `If-Match` headers for updates (PUT) or create resources (PUT/POST). Falls back to simple PUT if unchecked.
    * Configurable error handling (stop on first error or continue).
    * Authentication support (Bearer Token).
    * Streaming progress log via the UI.
    * Handles large numbers of files using a custom form parser.
* **Profile Relationships:** Display and validate `compliesWithProfile` and `imposeProfile` extensions in the UI (configurable).
* **FSH Converter:** Convert FHIR JSON/XML resources to FHIR Shorthand (FSH) using GoFSH, with advanced options (Package context, Output styles, Log levels, FHIR versions, Fishing Trip, Dependencies, Indentation, Meta Profile handling, Alias File, No Alias). Includes a waiting spinner.
* **Retrieve and Split Bundles:**
    * Retrieve specified resource types as bundles from a FHIR server.
    * Optionally fetch referenced resources, either individually or as full bundles for each referenced type.
    * Split uploaded ZIP files containing bundles into individual resource JSON files.
    * Download retrieved/split resources as a ZIP archive.
    * Streaming progress log via the UI for retrieval operations.
* **FHIR Interaction UIs:** Explore FHIR server capabilities and interact with resources using the "FHIR API Explorer" (simple GET/POST/PUT/DELETE) and "FHIR UI Operations" (Swagger-like interface based on CapabilityStatement). *Note: Lite version requires custom server URLs.*
* **HAPI FHIR Configuration (Standalone Mode):**
    * A dedicated page (`/config-hapi`) to view and edit the `application.yaml` configuration for the embedded HAPI FHIR server.
    * Allows modification of HAPI FHIR properties directly from the UI.
    * Option to restart the HAPI FHIR server (Tomcat) to apply changes.
* **API Support:** RESTful API endpoints for importing, pushing, retrieving metadata, validating, uploading test data, and retrieving/splitting bundles.
* **Live Console:** Real-time logs for push, validation, upload test data, FSH conversion, and bundle retrieval operations.
* **Configurable Behavior:** Control validation modes and display options via `app.config`.
* **Theming:** Supports light and dark modes.
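The dependency analysis behind "Upload Test Data" (reference extraction, topological sorting, cycle detection) can be sketched in a few lines of Python. This is an illustrative sketch, not the toolkit's actual implementation; the only assumptions are dict-shaped FHIR resources and literal references of the form `ResourceType/id`.

```python
from graphlib import TopologicalSorter, CycleError  # Python 3.9+

def upload_order(resources):
    """Return resources sorted so referenced resources upload first.

    `resources` is a list of FHIR resources as dicts; only literal
    references ("ResourceType/id") pointing inside the set are considered.
    TopologicalSorter raises CycleError on circular references.
    """
    by_key = {f"{r['resourceType']}/{r['id']}": r for r in resources}

    def refs(node, found):
        # Recursively collect every in-set "reference" value in the resource.
        if isinstance(node, dict):
            for k, v in node.items():
                if k == "reference" and isinstance(v, str) and v in by_key:
                    found.add(v)
                else:
                    refs(v, found)
        elif isinstance(node, list):
            for item in node:
                refs(item, found)
        return found

    graph = {key: refs(res, set()) for key, res in by_key.items()}
    # static_order() yields dependencies before their dependents.
    return [by_key[key] for key in TopologicalSorter(graph).static_order()]

resources = [
    {"resourceType": "Observation", "id": "obs1",
     "subject": {"reference": "Patient/pat1"}},
    {"resourceType": "Patient", "id": "pat1"},
]
ordered = upload_order(resources)
print([f"{r['resourceType']}/{r['id']}" for r in ordered])
# → ['Patient/pat1', 'Observation/obs1']
```

The Observation references the Patient, so the Patient sorts first; uploading in this order avoids dangling references on servers that enforce referential integrity.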
## Technology Stack

This application is built using the following technologies:

* **Backend:**
    * **Python:** The primary programming language.
    * **Flask:** A lightweight web framework for building the application.
    * **SQLAlchemy:** An ORM (Object-Relational Mapper) for interacting with the database.
* **Frontend:**
    * **HTML:** For structuring the web pages.
    * **CSS:** Styling is primarily provided by Bootstrap.
    * **Bootstrap 5.3.3:** A CSS framework for responsive and consistent design.
    * **Bootstrap Icons 1.11.3:** A library of icons for use within the user interface.
    * **JavaScript:** For client-side interactivity, particularly in the IG details view.
* **Data Storage:**
    * **SQLite:** A lightweight, file-based database.
* **Other:**
    * **tarfile:** Python's built-in module for working with tar archives.
    * **requests:** A Python library for making HTTP requests.
    * **json:** Python's built-in module for working with JSON data.
* Python 3.12+, Flask 2.3.3, Flask-SQLAlchemy 3.0.5, Flask-WTF 1.2.1
* Jinja2, Bootstrap 5.3.3, JavaScript (ES6), Lottie-Web 5.12.2
* SQLite
* Docker, Docker Compose, Supervisor
* Node.js 18+ (for GoFSH/SUSHI), GoFSH, SUSHI
* HAPI FHIR (Standalone version only)
* Requests 2.31.0, Tarfile, Logging, Werkzeug
* fhir.resources (optional, for robust XML parsing)
## Prerequisites

Before setting up the project, ensure you have the following installed:
- **Python 3.8+**
- **Docker** (optional, for containerized deployment)
- **pip** (Python package manager)
* **Docker:** Required for containerized deployment (both versions).
* **Git & Maven:** Required **only** for building the **Standalone** version from source using the `.bat` script or manual steps. Not required for the Lite version build or for running pre-built Docker Hub images.
* **Windows:** Required if using the `.bat` scripts.
## Setup Instructions

### 1. Clone the Repository
```bash
git clone https://github.com/your-username/FLARE-FHIR-IG-Toolkit.git
cd FLARE-FHIR-IG-Toolkit
```

### Running Pre-built Images (General Users)

This is the easiest way to get started without needing Git or Maven. Choose the version you need:

**Lite Version (No local HAPI FHIR):**

```bash
# Pull the latest Lite image
docker pull ghcr.io/sudo-jhare/fhirflare-ig-toolkit-lite:latest

# Run the Lite version (maps port 5000 for the UI)
# You'll need to create local directories for persistent data first:
# mkdir instance logs static static/uploads instance/hapi-h2-data
docker run -d \
  -p 5000:5000 \
  -v ./instance:/app/instance \
  -v ./static/uploads:/app/static/uploads \
  -v ./instance/hapi-h2-data:/app/h2-data \
  -v ./logs:/app/logs \
  --name fhirflare-lite \
  ghcr.io/sudo-jhare/fhirflare-ig-toolkit-lite:latest
```

**Standalone Version (Includes local HAPI FHIR):**

```bash
# Pull the latest Standalone image
docker pull ghcr.io/sudo-jhare/fhirflare-ig-toolkit-standalone:latest

# Run the Standalone version (maps ports 5000 and 8080)
# You'll need to create local directories for persistent data first:
# mkdir instance logs static static/uploads instance/hapi-h2-data
docker run -d \
  -p 5000:5000 \
  -p 8080:8080 \
  -v ./instance:/app/instance \
  -v ./static/uploads:/app/static/uploads \
  -v ./instance/hapi-h2-data:/app/h2-data \
  -v ./logs:/app/logs \
  --name fhirflare-standalone \
  ghcr.io/sudo-jhare/fhirflare-ig-toolkit-standalone:latest
```
### Building from Source (Developers)

**Using Windows `.bat` Scripts (Standalone Version Only):**

First Time Setup — run `Build and Run for first time.bat`:

```bat
cd "<project folder>"
git clone https://github.com/hapifhir/hapi-fhir-jpaserver-starter.git hapi-fhir-jpaserver
copy .\hapi-fhir-Setup\target\classes\application.yaml .\hapi-fhir-jpaserver\target\classes\application.yaml
mvn clean package -DskipTests=true -Pboot
docker-compose build --no-cache
docker-compose up -d
```

This clones the HAPI FHIR server, copies configuration, builds the project, and starts the containers.

Subsequent Runs — run `Run.bat`:

```bat
cd "<project folder>"
docker-compose up -d
```

This starts the Flask app (port 5000) and HAPI FHIR server (port 8080).

Access the Application:

* Flask UI: http://localhost:5000
* HAPI FHIR server: http://localhost:8080
**Manual Setup (Linux/MacOS/Windows):**

Preparation (Standalone Version Only):

```bash
cd <project folder>
git clone https://github.com/hapifhir/hapi-fhir-jpaserver-starter.git hapi-fhir-jpaserver
cp ./hapi-fhir-Setup/target/classes/application.yaml ./hapi-fhir-jpaserver/target/classes/application.yaml
```

Build:

```bash
# Build HAPI FHIR (Standalone Version Only)
mvn clean package -DskipTests=true -Pboot

# Build Docker Image (Specify APP_MODE=lite in docker-compose.yml for Lite version)
docker-compose build --no-cache
```

Run:

```bash
docker-compose up -d
```

Access the Application:

* Flask UI: http://localhost:5000
* HAPI FHIR server (Standalone only): http://localhost:8080
**Local Development (Without Docker):**

Clone the Repository:

```bash
git clone https://github.com/Sudo-JHare/FHIRFLARE-IG-Toolkit.git
cd FHIRFLARE-IG-Toolkit
```

Install Dependencies:

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```

Install Node.js, GoFSH, and SUSHI (for FSH Converter):

```bash
# Example for Debian/Ubuntu
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo bash -
sudo apt-get install -y nodejs
# Install globally
npm install -g gofsh fsh-sushi
```

Set Environment Variables:

```bash
export FLASK_SECRET_KEY='your-secure-secret-key'
export API_KEY='your-api-key'
# Optional: Set APP_MODE to 'lite' if desired
# export APP_MODE='lite'
```

Initialize Directories:

```bash
mkdir -p instance static/uploads logs
# Ensure write permissions if needed
# chmod -R 777 instance static/uploads logs
```

Run the Application:

```bash
export FLASK_APP=app.py
flask run
```

Access at http://localhost:5000.
Usage
|
||||
Import an IG
|
||||
### Search, View Details, and Import Packages
|
||||
Navigate to **Search and Import Packages** (`/search-and-import`).
|
||||
1. The page will load a list of available FHIR Implementation Guide packages from a local cache or by fetching from configured registries.
|
||||
* A loading animation and progress messages are shown if fetching from registries.
|
||||
* The timestamp of the last cache update is displayed.
|
||||
2. Use the search bar to filter packages by name or author.
|
||||
3. Packages are paginated for easier Browse.
|
||||
4. For each package, you can:
|
||||
* View its latest official and absolute versions.
|
||||
* Click on the package name to navigate to a **detailed view** (`/package-details/<name>`) showing:
|
||||
* Comprehensive metadata (author, FHIR version, canonical URL, description).
|
||||
* A full list of available versions with publication dates.
|
||||
* Declared dependencies.
|
||||
* Other packages that depend on it (dependents).
|
||||
* Version history (logs).
|
||||
* Directly import a specific version using the "Import" button on the search page or the details page.
|
||||
5. **Cache Management:**
|
||||
* A "Clear & Refresh Cache" button is available to trigger a background task (`/api/refresh-cache-task`) that clears the local database and in-memory cache and fetches the latest package information from all configured registries. Progress is shown via a live log.
|
||||
|
||||
### Import an IG

1. Enter a package name (e.g., `hl7.fhir.au.core`) and a version (e.g., `1.1.0-preview`).
2. Choose a dependency mode:
   * **Current Recursive**: Import all dependencies listed in `package.json` recursively.
   * **Patch Canonical Versions**: Import only canonical FHIR packages (e.g., `hl7.fhir.r4.core`).
   * **Tree Shaking**: Import only dependencies containing resources actually used by the main package.
3. Click **Import** to download the package and its dependencies.
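Of the three modes, Tree Shaking is the least obvious: conceptually, it keeps only the dependencies that provide at least one resource the main package actually references. The sketch below illustrates that idea with set intersection over canonical URLs; the package names, the `tree_shake` helper, and the data shapes are illustrative, not the toolkit's actual implementation.

```python
def tree_shake(main_refs, dependencies):
    """Keep only dependencies that provide at least one resource
    actually referenced by the main package (illustrative sketch)."""
    kept = {}
    for dep_name, resources in dependencies.items():
        # Intersect the main package's references with what this dependency provides.
        used = main_refs & set(resources)
        if used:
            kept[dep_name] = sorted(used)
    return kept

# The main package references two profiles; only deps providing them survive.
main_refs = {
    "http://hl7.org/fhir/StructureDefinition/Patient",
    "http://hl7.org.au/fhir/StructureDefinition/au-address",
}
deps = {
    "hl7.fhir.r4.core": ["http://hl7.org/fhir/StructureDefinition/Patient"],
    "hl7.fhir.au.base": ["http://hl7.org.au/fhir/StructureDefinition/au-address"],
    "hl7.terminology.r4": ["http://terminology.hl7.org/CodeSystem/v2-0203"],
}
print(sorted(tree_shake(main_refs, deps)))
```

Under this model, `hl7.terminology.r4` would be dropped because nothing in the main package references its resources.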
### Manage IGs

Go to **Manage FHIR Packages** (`/view-igs`) to view downloaded and processed IGs. Available actions:

* **Process**: Extract metadata (resource types, profiles, must-support elements, examples).
* **Unload**: Remove processed IG data from the database.
* **Delete**: Remove package files from the filesystem.

Duplicate packages are highlighted for resolution.
### View Processed IGs

After processing, view IG details at `/view-ig/<id>`, including:

* Resource types and profiles.
* Must-support elements and examples.
* Profile relationships (`compliesWithProfile`, `imposeProfile`) when `DISPLAY_PROFILE_RELATIONSHIPS` is enabled.
* An interactive StructureDefinition viewer (Differential, Snapshot, Must Support, Key Elements, Constraints, Terminology, Search Params).
### Validate FHIR Resources/Bundles

Navigate to **Validate FHIR Sample** (`/validate-sample`).

1. Select a package (e.g., `hl7.fhir.au.core#1.1.0-preview`).
2. Choose **Single Resource** or **Bundle** mode.
3. Paste or upload FHIR JSON/XML (e.g., a Patient resource).
4. Submit to view validation errors and warnings.

Note: validation is an alpha feature; report issues on GitHub (remove any PHI first).
### Push IGs to a FHIR Server

Go to **Push IGs** (`/push-igs`).

1. Select a downloaded package.
2. Enter the target FHIR server URL.
3. Configure authentication (None or Bearer Token).
4. Choose options: **Include Dependencies**, **Force Upload** (skips the comparison check), **Dry Run**, and **Verbose Log**.
5. Optionally filter by resource types (comma-separated) or skip specific files (paths within the package, comma- or newline-separated).
6. Click **Push to FHIR Server** to upload the resources. Canonical resources are checked before upload, and identical resources are skipped unless **Force Upload** is checked.
7. Monitor progress in the live console.
### Upload Test Data

Navigate to **Upload Test Data** (`/upload-test-data`).

1. Enter the target FHIR server URL.
2. Configure authentication (None or Bearer Token).
3. Select one or more `.json`/`.xml` files, or a single `.zip` file containing test resources.
4. Optionally check **Validate Resources Before Upload?** and select a validation profile package.
5. Choose an upload mode:
   * **Individual Resources**: Uploads each resource one by one in dependency order.
   * **Transaction Bundle**: Uploads all resources in a single transaction.
6. Optionally check **Use Conditional Upload (Individual Mode Only)?** to use `If-Match` headers for updates.
7. Choose error handling:
   * **Stop on First Error**: Halts the process if any validation or upload fails.
   * **Continue on Error**: Reports errors but attempts to process and upload the remaining resources.
8. Click **Upload and Process**. The tool parses the files, optionally validates them, analyzes dependencies, topologically sorts the resources, and uploads them according to the selected options.
9. Monitor progress in the streaming log output.
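The dependency-ordered upload in Individual Resources mode can be pictured as a topological sort over resource references: a resource is uploaded only after everything it references. A simplified sketch using Kahn's algorithm follows; the resource shapes (a flat `refs` list) and the `upload_order` helper are illustrative assumptions, not the toolkit's actual code.

```python
from collections import deque

def upload_order(resources):
    """Topologically sort resources so referenced targets upload first (Kahn's algorithm)."""
    ids = {f"{r['resourceType']}/{r['id']}" for r in resources}
    # For each resource, keep only references that point at other resources in this batch.
    deps = {f"{r['resourceType']}/{r['id']}": [ref for ref in r.get("refs", []) if ref in ids]
            for r in resources}
    indegree = {rid: len(targets) for rid, targets in deps.items()}
    dependents = {rid: [] for rid in deps}
    for rid, targets in deps.items():
        for target in targets:
            dependents[target].append(rid)
    queue = deque(rid for rid, d in indegree.items() if d == 0)
    order = []
    while queue:
        rid = queue.popleft()
        order.append(rid)
        for child in dependents[rid]:
            indegree[child] -= 1
            if indegree[child] == 0:
                queue.append(child)
    if len(order) != len(deps):
        raise ValueError("Circular references detected")
    return order

resources = [
    {"resourceType": "Observation", "id": "obs1", "refs": ["Patient/pat1"]},
    {"resourceType": "Patient", "id": "pat1", "refs": []},
    {"resourceType": "Encounter", "id": "enc1", "refs": ["Patient/pat1"]},
]
print(upload_order(resources))  # ['Patient/pat1', 'Observation/obs1', 'Encounter/enc1']
```

In this toy batch, the Patient is uploaded first because both the Observation and the Encounter reference it.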
### Convert FHIR to FSH

Navigate to **FSH Converter** (`/fsh-converter`).

1. Optionally select a package for context (e.g., `hl7.fhir.au.core#1.1.0-preview`).
2. Choose an input mode:
   * **Upload File**: Upload a FHIR JSON/XML file.
   * **Paste Text**: Paste FHIR JSON/XML content.
3. Configure options:
   * **Output Style**: `file-per-definition`, `group-by-fsh-type`, `group-by-profile`, or `single-file`.
   * **Log Level**: `error`, `warn`, `info`, or `debug`.
   * **FHIR Version**: R4, R4B, R5, or auto-detect.
   * **Fishing Trip**: Enable round-trip validation with SUSHI, generating a comparison report.
   * **Dependencies**: Specify additional packages (e.g., `hl7.fhir.us.core@6.1.0`, one per line).
   * **Indent Rules**: Enable context path indentation for readable FSH.
   * **Meta Profile**: Choose `only-one`, `first`, or `none` for `meta.profile` handling.
   * **Alias File**: Upload an FSH file with aliases (e.g., `$MyAlias = http://example.org`).
   * **No Alias**: Disable automatic alias generation.
4. Click **Convert to FSH** to generate and display the FSH output; a themed (light/dark) waiting spinner is shown during processing.
5. If Fishing Trip is enabled, view the comparison report via the "Click here for SUSHI Validation" badge button.
6. Download the result as a `.fsh` file.
### Retrieve and Split Bundles

Navigate to **Retrieve/Split Data** (`/retrieve-split-data`).

**Retrieve Bundles from Server:**

1. Enter the FHIR server URL (defaults to the proxy if empty).
2. Select one or more resource types to retrieve (e.g., Patient, Observation).
3. Optionally check **Fetch Referenced Resources**.
   * If checked, you can also check **Fetch Full Reference Bundles** to retrieve entire bundles for each referenced type (e.g., all Patients if a Patient is referenced) instead of individual resources by ID.
4. Click **Retrieve Bundles**.
5. Monitor progress in the streaming log. A ZIP file containing the retrieved bundles/resources will be prepared for download.

**Split Uploaded Bundles:**

1. Upload a ZIP file containing FHIR bundles (JSON format).
2. Click **Split Bundles**.
3. A ZIP file containing the individual resources extracted from the bundles will be prepared for download.
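Splitting boils down to extracting each entry's `resource` from a bundle and writing it out under its type and id. A minimal sketch of that transformation (the `split_bundle` helper and the filename convention are illustrative assumptions):

```python
import json

def split_bundle(bundle):
    """Return {filename: resource} for each entry in a FHIR bundle (sketch)."""
    out = {}
    for entry in bundle.get("entry", []):
        resource = entry.get("resource")
        if not resource:
            continue  # skip entries without an inline resource
        name = f"{resource['resourceType']}-{resource.get('id', 'unknown')}.json"
        out[name] = resource
    return out

bundle = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {"resourceType": "Patient", "id": "pat1"}},
        {"resource": {"resourceType": "Observation", "id": "obs1"}},
    ],
}
files = split_bundle(bundle)
print(sorted(files))  # ['Observation-obs1.json', 'Patient-pat1.json']
```

Each value could then be serialized with `json.dumps` and added to the output ZIP.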
### Explore FHIR Operations

Navigate to **FHIR UI Operations** (`/fhir-ui-operations`).

1. Toggle between the local HAPI server (`/fhir`) and a custom FHIR server.
2. Click **Fetch Metadata** to load the server's CapabilityStatement.
3. Select a resource type (e.g., Patient, Observation) or **System** to view operations:
   * System operations: `GET /metadata`, `POST /`, `GET /_history`, `GET/POST /$diff`, `POST /$reindex`, `POST /$expunge`, etc.
   * Resource operations: `GET Patient/:id`, `POST Observation/_search`, etc.
4. Use **Try it out** to input parameters or request bodies, then **Execute** to view results in JSON, XML, or narrative formats.
### Configure Embedded HAPI FHIR Server (Standalone Mode)

For users running the **Standalone version**, which includes an embedded HAPI FHIR server:

1. Navigate to **Configure HAPI FHIR** (`/config-hapi`).
2. The page displays the contents of the HAPI FHIR server's `application.yaml` file.
3. You can edit the configuration directly in the text area.
   * *Caution: incorrect modifications can break the HAPI FHIR server.*
4. Click **Save Configuration** to apply your changes to the `application.yaml` file.
5. Click **Restart Tomcat** to restart the HAPI FHIR server and load the new configuration. The restart may take a few moments.
## API Usage

### Import IG

```bash
curl -X POST http://localhost:5000/api/import-ig \
  -H "Content-Type: application/json" \
  -H "X-API-Key: your-api-key" \
  -d '{"package_name": "hl7.fhir.au.core", "version": "1.1.0-preview", "dependency_mode": "recursive"}'
```

Returns `complies_with_profiles`, `imposed_profiles`, and `duplicate_packages_present` information.
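The same call can be scripted from Python using only the standard library; the endpoint, header, and payload mirror the curl example, while the actual send is left commented out so the snippet can be inspected without a running server.

```python
import json
import urllib.request

payload = {
    "package_name": "hl7.fhir.au.core",
    "version": "1.1.0-preview",
    "dependency_mode": "recursive",
}
req = urllib.request.Request(
    "http://localhost:5000/api/import-ig",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "X-API-Key": "your-api-key"},
    method="POST",
)

# To actually send the request (requires the app running on localhost:5000):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```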
### Refresh Package Cache (Background Task)

```bash
curl -X POST http://localhost:5000/api/refresh-cache-task \
  -H "X-API-Key: your-api-key"
```

### 4. Set Up the Instance Directory

The application uses an `instance` directory to store the SQLite database and FHIR packages. Create this directory and set appropriate permissions:

```bash
mkdir instance
mkdir instance/fhir_packages
chmod -R 777 instance  # Ensure the directory is writable
```
### Push IG

```bash
curl -X POST http://localhost:5000/api/push-ig \
  -H "Content-Type: application/json" \
  -H "Accept: application/x-ndjson" \
  -H "X-API-Key: your-api-key" \
  -d '{
    "package_name": "hl7.fhir.au.core",
    "version": "1.1.0-preview",
    "fhir_server_url": "http://localhost:8080/fhir",
    "include_dependencies": true,
    "force_upload": false,
    "dry_run": false,
    "verbose": false,
    "auth_type": "none"
  }'
```

Returns a streaming NDJSON response with progress updates and a final summary.
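Because the response is NDJSON, a client consumes it line by line, decoding each non-empty line as an independent JSON object. A minimal sketch of that pattern follows; the message shapes (`type`, `message`, `summary`) are illustrative assumptions, not the exact fields the API emits.

```python
import json

def parse_ndjson(lines):
    """Decode an NDJSON stream into a list of message dicts, skipping blank lines."""
    return [json.loads(line) for line in lines if line.strip()]

# Example stream as a push endpoint might emit it (illustrative shapes).
stream = [
    '{"type": "progress", "message": "Uploading StructureDefinition 1/10"}',
    '',
    '{"type": "complete", "summary": {"uploaded": 9, "skipped": 1}}',
]
messages = parse_ndjson(stream)
summary = next(m for m in messages if m["type"] == "complete")
print(summary["summary"])  # {'uploaded': 9, 'skipped': 1}
```

With a real HTTP client, the same loop would iterate over the response body's lines as they arrive rather than over a prebuilt list.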
### Upload Test Data

```bash
curl -X POST http://localhost:5000/api/upload-test-data \
  -H "X-API-Key: your-api-key" \
  -H "Accept: application/x-ndjson" \
  -F "fhir_server_url=http://your-fhir-server/fhir" \
  -F "auth_type=bearerToken" \
  -F "auth_token=YOUR_TOKEN" \
  -F "upload_mode=individual" \
  -F "error_handling=continue" \
  -F "validate_before_upload=true" \
  -F "validation_package_id=hl7.fhir.r4.core#4.0.1" \
  -F "use_conditional_uploads=true" \
  -F "test_data_files=@/path/to/your/patient.json" \
  -F "test_data_files=@/path/to/your/observations.zip"
```

Returns a streaming NDJSON response with progress updates and a final summary. Uses `multipart/form-data` for file uploads.
### Retrieve Bundles

```bash
curl -X POST http://localhost:5000/api/retrieve-bundles \
  -H "X-API-Key: your-api-key" \
  -H "Accept: application/x-ndjson" \
  -F "fhir_server_url=http://your-fhir-server/fhir" \
  -F "resources=Patient" \
  -F "resources=Observation" \
  -F "validate_references=true" \
  -F "fetch_reference_bundles=false"
```

Returns a streaming NDJSON response with progress. The `X-Zip-Path` header in the final response part contains the path to download the ZIP archive (e.g., `/tmp/retrieved_bundles_datetime.zip`).
### Split Bundles

```bash
curl -X POST http://localhost:5000/api/split-bundles \
  -H "X-API-Key: your-api-key" \
  -H "Accept: application/x-ndjson" \
  -F "split_bundle_zip_path=@/path/to/your/bundles.zip"
```

Returns a streaming NDJSON response. The `X-Zip-Path` header in the final response part contains the path to download the ZIP archive of split resources.
### Validate Resource/Bundle

Not yet exposed via the API; use the UI at `/validate-sample`.
## Configuration Options

Located in `app.py`:

* `VALIDATE_IMPOSED_PROFILES` (default: `True`): Validates resources against imposed profiles during push.
* `DISPLAY_PROFILE_RELATIONSHIPS` (default: `True`): Shows `compliesWithProfile` and `imposeProfile` in the UI.
* `FHIR_PACKAGES_DIR` (default: `/app/instance/fhir_packages`): Stores `.tgz` packages and metadata.
* `UPLOAD_FOLDER` (default: `/app/static/uploads`): Stores GoFSH output files and FSH comparison reports.
* `SECRET_KEY`: Required for CSRF protection and sessions. Set via environment variable or directly.
* `API_KEY`: Required for API authentication. Set via environment variable or directly.
* `MAX_CONTENT_LENGTH` (default: Flask default): Maximum size of the HTTP request body (e.g., `16 * 1024 * 1024` for 16 MB). Important for large uploads.
* `MAX_FORM_PARTS` (default: Werkzeug default, often 1000): Maximum number of form parts. Overridden for `/api/upload-test-data` by `CustomFormDataParser`.
### Get HAPI FHIR Configuration (Standalone Mode)

```bash
curl -X GET http://localhost:5000/api/config \
  -H "X-API-Key: your-api-key"
```
### 5. Initialize the Database

The application uses a SQLite database (`instance/fhir_ig.db`) to store processed IGs. The database is created automatically when you first run the application, but you can also initialize it manually:

```bash
python -c "from app import db; db.create_all()"
```
### Save HAPI FHIR Configuration (Standalone Mode)

```bash
curl -X POST http://localhost:5000/api/config \
  -H "Content-Type: application/json" \
  -H "X-API-Key: your-api-key" \
  -d '{"your_yaml_key": "your_value", ...}'  # Send the full YAML content as JSON
```
### 6. Run the Application Locally

Start the Flask development server:

```bash
python app.py
```

The application will be available at `http://localhost:5000`.

### Restart HAPI FHIR Server (Standalone Mode)

```bash
curl -X POST http://localhost:5000/api/restart-tomcat \
  -H "X-API-Key: your-api-key"
```
### 7. (Optional) Run with Docker

If you prefer to run the application in a Docker container:

```bash
docker build -t flare-fhir-ig-toolkit .
docker run -p 5000:5000 -v $(pwd)/instance:/app/instance flare-fhir-ig-toolkit
```

The application will be available at `http://localhost:5000`.

## Testing

The project includes a test suite covering UI, API, database, file operations, and security.

Test prerequisites: `pytest` (test runner) and `pytest-mock` (mocking). Install them with `pip install pytest pytest-mock`.

Running the tests:

```bash
cd <project folder>
pytest tests/test_app.py -v
```

Test coverage includes:

* **UI pages**: homepage, Import IG, Manage IGs, Push IGs, Validate Sample, View Processed IG, FSH Converter, Upload Test Data, Retrieve/Split Data.
* **API endpoints**: `POST /api/import-ig`, `POST /api/push-ig`, `GET /get-structure`, `GET /get-example`, `POST /api/upload-test-data`, `POST /api/retrieve-bundles`, `POST /api/split-bundles`.
* **Database**: IG processing, unloading, viewing.
* **File operations**: package processing, deletion, FSH output, ZIP handling.
* **Security**: CSRF protection, flash messages, secret key.
* **FSH Converter**: form submission, file/text input, GoFSH execution, Fishing Trip comparison.
* **Upload Test Data**: parsing, dependency graph, sorting, upload modes, validation, conditional uploads.

## Database Management

The application uses a SQLite database (`instance/fhir_ig.db`) to store processed IGs. Below are steps to create, purge, and recreate the database.

### Creating the Database

The database is created automatically when you first run the application. To create it manually:

```bash
python -c "from app import db; db.create_all()"
```

This creates `instance/fhir_ig.db` with the necessary tables.

## Development Notes

### Background

The toolkit addresses the need for a comprehensive FHIR IG management tool, with recent enhancements for resource validation, FSH conversion with advanced GoFSH features, flexible versioning, improved IG pushing, dependency-aware test data uploading, and bundle retrieval/splitting, making it a versatile platform for FHIR developers.
### Purging the Database

To purge the database (delete all data while keeping the schema):

1. Stop the application if it's running.
2. Drop all tables and recreate them:

```bash
python -c "from app import db; db.drop_all(); db.create_all()"
```

3. Alternatively, delete the database file and recreate it:

```bash
rm instance/fhir_ig.db
python -c "from app import db; db.create_all()"
```
### Technical Decisions

* **Flask**: Lightweight and flexible for web development.
* **SQLite**: Simple for development; consider PostgreSQL for production.
* **Bootstrap 5.3.3**: Responsive UI with custom styling.
* **Lottie-Web**: Renders themed animations for the FSH conversion waiting spinner.
* **GoFSH/SUSHI**: Integrated via Node.js for advanced FSH conversion and round-trip validation.
* **Docker**: Ensures consistent deployment with Flask and HAPI FHIR.
* **Flexible versioning**: Supports non-standard IG versions (e.g., `-preview`, `-ballot`).
* **Live console/streaming**: Real-time feedback for complex operations (Push, Upload Test Data, FSH, Retrieve Bundles).
* **Validation**: Alpha feature with ongoing FHIRPath improvements.
* **Dependency management**: Uses a topological sort for the Upload Test Data feature.
* **Form parsing**: Uses a custom Werkzeug parser for Upload Test Data to handle large numbers of files.
### Recent Updates

* **Package search enhancements**: caching, detailed views (dependencies, dependents, version history), and background cache refresh.
* **Upload Test Data enhancements (April 2025)**:
  * Added optional pre-upload validation against selected IG profiles.
  * Added optional conditional uploads (GET + POST/PUT with `If-Match`) for individual mode.
  * Implemented robust XML parsing using the `fhir.resources` library (when available).
  * Fixed `413 Request Entity Too Large` errors for large file counts using a custom Werkzeug `FormDataParser`.
  * Paths: `templates/upload_test_data.html`, `app.py`, `services.py`, `forms.py`.
* **Push IG enhancements (April 2025)**:
  * Added semantic comparison to skip uploading identical resources.
  * Added a "Force Upload" option to bypass the comparison.
  * Improved handling of canonical resources (search before PUT/POST).
  * Added filtering of specific files to skip during push.
  * More detailed summary report in the stream response.
  * Paths: `templates/cp_push_igs.html`, `app.py`, `services.py`.
* **Waiting spinner for FSH Converter (April 2025)**:
  * Added a themed (light/dark) Lottie animation spinner during FSH execution.
  * Paths: `templates/fsh_converter.html`, `static/animations/`, `static/js/lottie-web.min.js`.
* **Advanced FSH Converter (April 2025)**:
  * Added support for GoFSH advanced options: `--fshing-trip`, `--dependency`, `--indent`, `--meta-profile`, `--alias-file`, `--no-alias`.
  * Displays Fishing Trip comparison reports.
  * Paths: `templates/fsh_converter.html`, `app.py`, `services.py`, `forms.py`.
* **(New) Retrieve and Split Data (May 2025)**:
  * Added UI and API for retrieving bundles from a FHIR server by resource type.
  * Added options to fetch referenced resources (individually or as full type bundles).
  * Added functionality to split uploaded ZIP files of bundles into individual resources.
  * Streaming log for retrieval and ZIP download for results.
  * Paths: `templates/retrieve_split_data.html`, `app.py`, `services.py`, `forms.py`.
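The semantic comparison used by Push IG can be pictured as comparing resources after stripping server-managed fields, so that a resource already on the server does not get re-uploaded just because its `meta` differs. The sketch below illustrates the idea; the exact set of ignored fields is an assumption here, not the toolkit's actual rule.

```python
def semantically_equal(a, b, ignored=("meta", "text")):
    """Compare two FHIR resources ignoring volatile, server-managed fields (sketch)."""
    strip = lambda r: {k: v for k, v in r.items() if k not in ignored}
    return strip(a) == strip(b)

local = {"resourceType": "StructureDefinition", "id": "x",
         "url": "http://example.org/sd/x"}
remote = {"resourceType": "StructureDefinition", "id": "x",
          "url": "http://example.org/sd/x",
          "meta": {"versionId": "3", "lastUpdated": "2025-04-01T00:00:00Z"}}
print(semantically_equal(local, remote))  # True -> upload would be skipped
```

With "Force Upload" checked, this check is bypassed and the resource is pushed regardless.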
### Known Issues and Workarounds

* **Favicon 404**: Clear the browser cache or verify `/app/static/favicon.ico`.
* **CSRF errors**: Set `FLASK_SECRET_KEY` and ensure `{{ form.hidden_tag() }}` is present in forms.
* **Import fails**: Check the package name/version and connectivity.
* **Validation accuracy**: Alpha feature; report issues on GitHub (remove any PHI).
* **Package parsing**: Non-standard `.tgz` filenames may parse incorrectly; the fallback uses name-only parsing.
* **Permissions**: Ensure `instance/` and `static/uploads/` are writable.
* **GoFSH/SUSHI errors**: Check `./logs/flask_err.log` for `ERROR:services:GoFSH failed`. Ensure valid FHIR inputs and a working SUSHI installation.
* **Upload Test Data XML parsing**: Relies on the `fhir.resources` library for full validation, with basic parsing as a fallback. Complex XML structures might not be fully analyzed for dependencies under basic parsing, so prefer JSON for reliable dependency analysis.
* **413 Request Entity Too Large**: Primarily handled by `CustomFormDataParser` for `/api/upload-test-data`; check the parser's `max_form_parts` limit if the error persists. `MAX_CONTENT_LENGTH` in `app.py` controls the overall request size, and reverse-proxy limits (`client_max_body_size` in Nginx) may also apply.
### Recreating the Database

To completely recreate the database (e.g., after making schema changes):

1. Stop the application if it's running.
2. Delete the existing database file:

```bash
rm instance/fhir_ig.db
```

3. Recreate the database:

```bash
python -c "from app import db; db.create_all()"
```

4. Restart the application:

```bash
python app.py
```

**Note**: Ensure the `instance` directory has write permissions (`chmod -R 777 instance`) to avoid permission errors when creating or modifying the database.
### Future Improvements

* **Upload Test Data**: Improve XML parsing further (direct XML-to-`fhir.resources` object mapping if possible), add a visual progress bar, an upload-order preview, transaction bundle size splitting, and a "Clear Target Server" option (with confirmation).
* **Validation**: Enhance FHIRPath for complex constraints; add an API endpoint.
* **Sorting**: Sort IG versions in `/view-igs` (e.g., ascending).
* **Duplicate resolution**: Options to keep the latest version or merge resources.
* **Production database**: Support PostgreSQL.
* **Error reporting**: Detailed validation error paths in the UI.
* **FSH enhancements**: Add an API endpoint for FSH conversion; support inline instance construction.
* **FHIR operations**: Add complex parameter support (e.g., `/$diff` with `left`/`right`).
* **Retrieve/Split Data**: Add options to filter resources during retrieval (e.g., by date or specific IDs).
### Completed Items

* Testing suite with basic coverage.
* API endpoints for `POST /api/import-ig` and `POST /api/push-ig`.
* Flexible versioning (`-preview`, `-ballot`).
* CSRF fixes for forms.
* Resource validation UI (alpha).
* FSH Converter with advanced GoFSH features and waiting spinner.
* Push IG enhancements (force upload, semantic comparison, canonical handling, skip files).
* Upload Test Data feature with dependency sorting, multiple upload modes, pre-upload validation, conditional uploads, robust XML parsing, and a fix for large file counts.
* Retrieve and Split Data functionality with reference fetching and ZIP download.
### Far-Distant Improvements

* **Cache service**: Use Redis for IG metadata caching.
* **Database optimization**: Composite index on `ProcessedIg.package_name` and `ProcessedIg.version`.
## Usage

### Importing FHIR Packages

1. Navigate to the "Import IGs" page (`/import-ig`).
2. Enter the package name (e.g., `hl7.fhir.us.core`) and version (e.g., `1.0.0` or `current`).
3. Click "Fetch & Download IG" to download the package and its dependencies.
4. You'll be redirected to the "Manage FHIR Packages" page to view the downloaded packages.

### Managing FHIR Packages

- **View Downloaded Packages**: The "Manage FHIR Packages" page (`/view-igs`) lists all downloaded packages.
- **Handle Duplicates**: Duplicate packages with different versions are highlighted with color-coded rows (e.g., yellow for one group, light blue for another).
- **Process Packages**: Click "Process" to extract and store package details in the database.
- **Delete Packages**: Click "Delete" to remove a package from the filesystem.

### Viewing Processed IGs

- Processed packages are listed in the "Processed Packages" section.
- Click "View" to see detailed information about a processed IG, including resource types and examples.
- Click "Unload" to remove a processed IG from the database.
## Project Structure

```
FHIRFLARE-IG-Toolkit/
├── app.py                            # Main Flask application
├── Build and Run for first time.bat  # Windows script for first-time Docker setup
├── docker-compose.yml                # Docker Compose configuration
├── Dockerfile                        # Docker configuration
├── forms.py                          # Form definitions
├── LICENSE.md                        # Apache 2.0 License
├── README.md                         # Project documentation
├── requirements.txt                  # Python dependencies
├── Run.bat                           # Windows script for running Docker
├── services.py                       # Logic for IG import, processing, validation, pushing, FSH conversion, test data upload, retrieve/split
├── supervisord.conf                  # Supervisor configuration
├── hapi-fhir-Setup/
│   ├── README.md                     # HAPI FHIR setup instructions
│   └── target/
│       └── classes/
│           └── application.yaml      # HAPI FHIR configuration
├── instance/
│   ├── fhir_ig.db                    # SQLite database
│   ├── fhir_ig.db.old                # Database backup
│   └── fhir_packages/                # Stored IG packages and metadata
│       └── ... (example packages) ...
├── logs/
│   ├── flask.log                     # Flask application logs
│   ├── flask_err.log                 # Flask error logs
│   ├── supervisord.log               # Supervisor logs
│   ├── supervisord.pid               # Supervisor PID file
│   ├── tomcat.log                    # Tomcat logs for HAPI FHIR
│   └── tomcat_err.log                # Tomcat error logs
├── static/
│   ├── animations/
│   │   ├── loading-dark.json         # Dark theme spinner animation
│   │   └── loading-light.json        # Light theme spinner animation
│   ├── favicon.ico                   # Application favicon
│   ├── FHIRFLARE.png                 # Application logo
│   ├── js/
│   │   └── lottie-web.min.js         # Lottie library for spinner
│   └── uploads/
│       ├── output.fsh                # Generated FSH output (temp location)
│       └── fsh_output/               # GoFSH output directory
│           └── ... (example GoFSH output) ...
├── templates/
│   ├── base.html                     # Base template
│   ├── cp_downloaded_igs.html        # UI for managing IGs
│   ├── cp_push_igs.html              # UI for pushing IGs
│   ├── cp_view_processed_ig.html     # UI for viewing processed IGs
│   ├── fhir_ui.html                  # UI for FHIR API explorer
│   ├── fhir_ui_operations.html       # UI for FHIR server operations
│   ├── fsh_converter.html            # UI for FSH conversion
│   ├── import_ig.html                # UI for importing IGs
│   ├── index.html                    # Homepage
│   ├── retrieve_split_data.html      # UI for Retrieve and Split Data
│   ├── upload_test_data.html         # UI for Uploading Test Data
│   ├── validate_sample.html          # UI for validating resources/bundles
│   ├── config_hapi.html              # UI for HAPI FHIR Configuration
│   └── _form_helpers.html            # Form helper macros
├── tests/
│   └── test_app.py                   # Test suite
└── hapi-fhir-jpaserver/              # HAPI FHIR server resources (Standalone version)
```
## Development Notes

### Contributing

1. Fork the repository.
2. Create a feature branch (`git checkout -b feature/your-feature`).
3. Commit your changes (`git commit -m "Add your feature"`).
4. Push to your branch (`git push origin feature/your-feature`).
5. Open a Pull Request.

Ensure code follows PEP 8 and includes tests in `tests/test_app.py`.
### Background

The FHIRFLARE IG Toolkit was developed to address the need for a user-friendly tool to manage FHIR Implementation Guides. The project focuses on providing a seamless experience for importing, processing, and analyzing FHIR packages, with a particular emphasis on handling duplicate dependencies, a common challenge in FHIR development.

### Technical Decisions

- **Flask**: Chosen for its lightweight and flexible nature, making it ideal for a small to medium-sized web application.
- **SQLite**: Used as the database for simplicity and ease of setup. For production use, consider switching to a more robust database like PostgreSQL.
- **Bootstrap**: Integrated for a responsive and professional UI, with custom CSS to handle duplicate package highlighting.
- **Docker Support**: Added to simplify deployment and ensure consistency across development and production environments.

### Known Issues and Workarounds

- **Bootstrap CSS Conflicts**: Early versions of the application had issues with Bootstrap's table background styles (`--bs-table-bg`) overriding custom row colors for duplicate packages. This was resolved by setting `--bs-table-bg` to `transparent` for the affected table (see `templates/cp_downloaded_igs.html`).
- **Database Permissions**: The `instance` directory must be writable by the application. If you encounter permission errors, ensure the directory has the correct permissions (`chmod -R 777 instance`).
- **Package Parsing**: Some FHIR package filenames may not follow the expected `name-version.tgz` format, leading to parsing issues. The application includes a fallback that treats such files as name-only packages, but this may need further refinement.
### Future Improvements

- **Sorting Versions**: Add sorting for package versions in the "Manage FHIR Packages" view to display them in a consistent order (e.g., ascending or descending).
- **Advanced Duplicate Handling**: Implement options to resolve duplicates (e.g., keep the latest version, merge resources).
- **Production Database**: Support PostgreSQL or MySQL for better scalability in production environments.
- **Testing**: Add unit tests using `pytest` to cover core functionality, especially package processing and database operations.
- **Inbound API for IG Packages**: Develop API endpoints that allow external tools to push IG packages to FHIRFLARE. The API should automatically resolve dependencies, return a list of dependencies, and identify any duplicate dependencies. For example:
  - Endpoint: `POST /api/import-ig`
  - Request: `{ "package_name": "hl7.fhir.us.core", "version": "1.0.0" }`
  - Response: `{ "status": "success", "dependencies": ["hl7.fhir.r4.core#4.0.1"], "duplicates": ["hl7.fhir.r4.core#4.0.1 (already exists as 5.0.0)"] }`
- **Outbound API for Pushing IGs to FHIR Servers**: Create an outbound API to push a chosen IG (with its dependencies) to a FHIR server, or allow pushing a single IG without dependencies. The API should process the server's responses and provide feedback. For example:
  - Endpoint: `POST /api/push-ig`
  - Request: `{ "package_name": "hl7.fhir.us.core", "version": "1.0.0", "fhir_server_url": "https://fhir-server.example.com", "include_dependencies": true }`
  - Response: `{ "status": "success", "pushed_packages": ["hl7.fhir.us.core#1.0.0", "hl7.fhir.r4.core#4.0.1"], "server_response": "Resources uploaded successfully" }`
- **Far-Distant Improvements**:
  - **Cache Service for IGs**: Implement a cache service to store all IGs, allowing quick querying of package metadata without reprocessing. This could use an in-memory store like Redis to improve performance.
  - **Database Index Optimization**: Modify the database structure to use a composite index on `package_name` and `version` (e.g., `ProcessedIg.package_name + ProcessedIg.version` as a unique key). This would allow the `/view-igs` page and API endpoints to directly query specific packages (e.g., `/api/ig/hl7.fhir.us.core/1.0.0`) without scanning the entire table.
## Contributing

Contributions are welcome! To contribute:

1. Fork the repository.
2. Create a new branch (`git checkout -b feature/your-feature`).
3. Make your changes and commit them (`git commit -m "Add your feature"`).
4. Push to your branch (`git push origin feature/your-feature`).
5. Open a Pull Request.

Please ensure your code follows the project's coding style and includes appropriate tests.
## Troubleshooting

- **Database Issues**: If the SQLite database (`instance/fhir_ig.db`) cannot be created, ensure the `instance` directory is writable. You may need to adjust permissions (`chmod -R 777 instance`).
- **Package Download Fails**: Verify your internet connection and ensure the package name and version are correct.
- **Colors Not Displaying**: If table row colors for duplicates are not showing, inspect the page with the browser developer tools (F12) to check for CSS conflicts with Bootstrap.

## License

This project is licensed under the Apache License, Version 2.0. See the [LICENSE.md](LICENSE.md) file for details.

## Contact

For questions or support, please open an issue on GitHub or contact the maintainers at [your-email@example.com](mailto:your-email@example.com).
## Troubleshooting

- **Favicon 404**: Clear the browser cache or verify `/app/static/favicon.ico`: `docker exec -it <container_name> curl http://localhost:5000/static/favicon.ico`
- **CSRF Errors**: Set `FLASK_SECRET_KEY` and ensure `{{ form.hidden_tag() }}` is present in forms.
- **Import Fails**: Check the package name/version and connectivity.
- **Validation Accuracy**: Alpha feature; report issues on GitHub (remove PHI).
- **Package Parsing**: Non-standard `.tgz` filenames may parse incorrectly. The fallback uses name-only parsing.
- **Permissions**: Ensure `instance/` and `static/uploads/` are writable: `chmod -R 777 instance static/uploads logs`
- **GoFSH/SUSHI Errors**: Check `./logs/flask_err.log` for `ERROR:services:GoFSH failed`. Ensure valid FHIR inputs and a working SUSHI installation: `docker exec -it <container_name> sushi --version`
- **413 Request Entity Too Large**: Increase `MAX_CONTENT_LENGTH` and `MAX_FORM_PARTS` in `app.py`. If using a reverse proxy (e.g., Nginx), increase its `client_max_body_size` setting as well. Ensure the application/container is fully restarted/rebuilt.
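For reference, a hedged sketch of those Flask settings (the values shown are arbitrary examples, not the project's defaults; `MAX_FORM_PARTS` is honoured by recent Flask/Werkzeug releases):

```python
from flask import Flask

app = Flask(__name__)
# Cap the request body size; requests beyond this limit return HTTP 413.
app.config["MAX_CONTENT_LENGTH"] = 16 * 1024 * 1024  # 16 MB (example value)
# Raise the multipart form-part limit for uploads with many parts.
app.config["MAX_FORM_PARTS"] = 2000  # example value
```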
151
README_INTEGRATION FHIRVINE as Moduel in FLARE.md
Normal file
@@ -0,0 +1,151 @@

# Integrating FHIRVINE as a Module in FHIRFLARE

## Overview

FHIRFLARE is a Flask-based FHIR Implementation Guide (IG) toolkit for managing and validating FHIR packages. This guide explains how to integrate FHIRVINE, a SMART on FHIR proxy, as a module within FHIRFLARE, enabling OAuth2 authentication and FHIR request proxying directly in the application. This modular approach embeds FHIRVINE’s functionality into FHIRFLARE, avoiding the need for a separate proxy service.

## Prerequisites

- FHIRFLARE repository cloned: `https://github.com/Sudo-JHare/FHIRFLARE-IG-Toolkit`.
- FHIRVINE repository cloned: `<fhirvine-repository-url>`.
- Python 3.11 and dependencies installed (`requirements.txt` from both projects).
- A FHIR server (e.g., `http://hapi.fhir.org/baseR4`).

## Integration Steps

### 1. Prepare FHIRFLARE Structure

Ensure FHIRFLARE’s file structure supports modular integration. It should look like:

```
FHIRFLARE-IG-Toolkit/
├── app.py
├── services.py
├── templates/
├── static/
└── requirements.txt
```

### 2. Copy FHIRVINE Files into FHIRFLARE

FHIRVINE’s core functionality (OAuth2 proxy, app registration) will be integrated as a Flask Blueprint.

- **Copy Files**:

  - Copy `smart_proxy.py`, `forms.py`, `models.py`, and `app.py` (relevant parts) from FHIRVINE into a new `fhirvine/` directory in FHIRFLARE:

    ```
    FHIRFLARE-IG-Toolkit/
    ├── fhirvine/
    │   ├── smart_proxy.py
    │   ├── forms.py
    │   ├── models.py
    │   └── __init__.py
    ```

  - Copy FHIRVINE’s templates (e.g., `app_gallery/`, `configure/`, `test_client.html`) into `FHIRFLARE-IG-Toolkit/templates/` while maintaining their folder structure.

- **Add Dependencies**:

  - Add FHIRVINE’s dependencies to `requirements.txt` (e.g., `authlib`, `flasgger`, `flask-sqlalchemy`).

### 3. Modify FHIRVINE Code as a Module

- **Create Blueprint in** `fhirvine/__init__.py`:

  ```python
  from flask import Blueprint

  fhirvine_bp = Blueprint('fhirvine', __name__, template_folder='templates')

  from .smart_proxy import *
  ```

  This registers FHIRVINE as a Flask Blueprint.

- **Update** `smart_proxy.py`:

  - Replace direct `app.route` decorators with `fhirvine_bp.route`. For example:

    ```python
    @fhirvine_bp.route('/authorize', methods=['GET', 'POST'])
    def authorize():
        # Existing authorization logic
    ```

### 4. Integrate FHIRVINE Blueprint into FHIRFLARE

- **Update** `app.py` **in FHIRFLARE**:

  - Import and register the FHIRVINE Blueprint:

    ```python
    from flask import Flask

    from fhirvine import fhirvine_bp
    from fhirvine.models import database, RegisteredApp, OAuthToken, AuthorizationCode, Configuration
    from fhirvine.smart_proxy import configure_oauth

    app = Flask(__name__)
    app.config.from_mapping(
        SECRET_KEY='your-secure-random-key',
        SQLALCHEMY_DATABASE_URI='sqlite:////app/instance/fhirflare.db',
        SQLALCHEMY_TRACK_MODIFICATIONS=False,
        FHIR_SERVER_URL='http://hapi.fhir.org/baseR4',
        PROXY_TIMEOUT=10,
        TOKEN_DURATION=3600,
        REFRESH_TOKEN_DURATION=86400,
        ALLOWED_SCOPES='openid profile launch launch/patient patient/*.read offline_access'
    )

    database.init_app(app)
    configure_oauth(app, db=database, registered_app_model=RegisteredApp, oauth_token_model=OAuthToken, auth_code_model=AuthorizationCode)

    app.register_blueprint(fhirvine_bp, url_prefix='/fhirvine')
    ```

### 5. Update FHIRFLARE Templates

- **Add FHIRVINE Links to Navbar**:

  - In `templates/base.html`, add links to FHIRVINE features:

    ```html
    <li class="nav-item">
      <a class="nav-link" href="{{ url_for('fhirvine.app_gallery') }}">App Gallery</a>
    </li>
    <li class="nav-item">
      <a class="nav-link" href="{{ url_for('fhirvine.test_client') }}">Test Client</a>
    </li>
    ```

### 6. Run and Test

- **Install Dependencies**:

  ```bash
  pip install -r requirements.txt
  ```

- **Run FHIRFLARE**:

  ```bash
  flask db upgrade
  flask run --host=0.0.0.0 --port=8080
  ```

- **Access FHIRVINE Features**:

  - App Gallery: `http://localhost:8080/fhirvine/app-gallery`
  - Test Client: `http://localhost:8080/fhirvine/test-client`
  - Proxy Requests: Use `/fhirvine/oauth2/proxy/<path>` within FHIRFLARE.

## Using FHIRVINE in FHIRFLARE

- **Register Apps**: Use `/fhirvine/app-gallery` to register SMART apps within FHIRFLARE.
- **Authenticate**: Use `/fhirvine/oauth2/authorize` for OAuth2 flows.
- **Proxy FHIR Requests**: FHIRFLARE can now make FHIR requests via `/fhirvine/oauth2/proxy`, leveraging FHIRVINE’s authentication.
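As a rough illustration, a proxied resource read through the module might look like the sketch below. The base URL, path layout, and token handling are assumptions for illustration, not FHIRVINE's confirmed API; the access token would come from the `/fhirvine/oauth2/authorize` flow.

```python
import json
import urllib.request

BASE = "http://localhost:8080"  # assumed FHIRFLARE base URL


def proxy_url(resource_path: str) -> str:
    """Build a proxied FHIR URL under the FHIRVINE blueprint prefix."""
    return f"{BASE}/fhirvine/oauth2/proxy/{resource_path.lstrip('/')}"


if __name__ == "__main__":
    req = urllib.request.Request(
        proxy_url("Patient/example"),
        headers={
            # Token obtained via the OAuth2 authorize flow (placeholder).
            "Authorization": "Bearer <access-token>",
            "Accept": "application/fhir+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["resourceType"])
```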

## Troubleshooting

- **Route Conflicts**: Ensure no overlapping routes between FHIRFLARE and FHIRVINE.
- **Database Issues**: Verify `SQLALCHEMY_DATABASE_URI` points to the same database.
- **Logs**: Check `flask run` logs for errors.
25
Run.bat
Normal file
@@ -0,0 +1,25 @@

REM --- Step 1: Start Docker containers ---
echo ===^> Starting Docker containers (Step 1)...
docker-compose up -d
if errorlevel 1 (
    echo ERROR: Docker Compose up failed. Check Docker installation and container configurations. ErrorLevel: %errorlevel%
    goto :error
)
echo Docker containers started successfully. ErrorLevel: %errorlevel%
echo.

echo ====================================
echo Script finished successfully!
echo ====================================
goto :end

:error
echo ------------------------------------
echo An error occurred. Script aborted.
echo ------------------------------------
pause
exit /b 1

:end
echo Script execution finished.
pause
28
__init__.py
@@ -1,28 +0,0 @@

# app/modules/fhir_ig_importer/__init__.py

from flask import Blueprint

# --- Module Metadata ---
metadata = {
    'module_id': 'fhir_ig_importer',  # Matches folder name
    'display_name': 'FHIR IG Importer',
    'description': 'Imports FHIR Implementation Guide packages from a registry.',
    'version': '0.1.0',
    # No main nav items, will be accessed via Control Panel
    'nav_items': []
}
# --- End Module Metadata ---

# Define Blueprint
# We'll mount this under the control panel later
bp = Blueprint(
    metadata['module_id'],
    __name__,
    template_folder='templates',
    # Define a URL prefix if mounting standalone, but we'll likely register
    # it under /control-panel via app/__init__.py later
    # url_prefix='/fhir-importer'
)

# Import routes after creating blueprint
from . import routes, forms  # Import forms too
1
charts/.gitignore
vendored
Normal file
@@ -0,0 +1 @@

/hapi-fhir-jpaserver-0.20.0.tgz

1
charts/fhirflare-ig-toolkit/.gitignore
vendored
Normal file
@@ -0,0 +1 @@

/rendered/
16
charts/fhirflare-ig-toolkit/Chart.yaml
Normal file
@@ -0,0 +1,16 @@

apiVersion: v2
name: fhirflare-ig-toolkit
version: 0.5.0
description: Helm chart for deploying the fhirflare-ig-toolkit application
type: application
appVersion: "latest"
icon: https://github.com/jgsuess/FHIRFLARE-IG-Toolkit/raw/main/static/FHIRFLARE.png
keywords:
  - fhir
  - healthcare
  - ig-toolkit
  - implementation-guide
home: https://github.com/jgsuess/FHIRFLARE-IG-Toolkit
maintainers:
  - name: Jörn Guy Süß
    email: jgsuess@gmail.com
152
charts/fhirflare-ig-toolkit/templates/_helpers.tpl
Normal file
@@ -0,0 +1,152 @@

{{/*
Expand the name of the chart.
*/}}
{{- define "fhirflare-ig-toolkit.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}

{{/*
Create a default fully qualified app name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
If release name contains chart name it will be used as a full name.
*/}}
{{- define "fhirflare-ig-toolkit.fullname" -}}
{{- if .Values.fullnameOverride }}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- $name := default .Chart.Name .Values.nameOverride }}
{{- if contains $name .Release.Name }}
{{- .Release.Name | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }}
{{- end }}
{{- end }}
{{- end }}

{{/*
Create chart name and version as used by the chart label.
*/}}
{{- define "fhirflare-ig-toolkit.chart" -}}
{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }}
{{- end }}

{{/*
Common labels
*/}}
{{- define "fhirflare-ig-toolkit.labels" -}}
helm.sh/chart: {{ include "fhirflare-ig-toolkit.chart" . }}
{{ include "fhirflare-ig-toolkit.selectorLabels" . }}
{{- if .Chart.AppVersion }}
app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
{{- end }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}

{{/*
Selector labels
*/}}
{{- define "fhirflare-ig-toolkit.selectorLabels" -}}
app.kubernetes.io/name: {{ include "fhirflare-ig-toolkit.name" . }}
app.kubernetes.io/instance: {{ .Release.Name }}
{{- end }}

{{/*
Create the name of the service account to use
*/}}
{{- define "hapi-fhir-jpaserver.serviceAccountName" -}}
{{- if .Values.serviceAccount.create }}
{{- default (include "hapi-fhir-jpaserver.fullname" .) .Values.serviceAccount.name }}
{{- else }}
{{- default "default" .Values.serviceAccount.name }}
{{- end }}
{{- end }}

{{/*
Create a default fully qualified postgresql name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
*/}}
{{- define "hapi-fhir-jpaserver.postgresql.fullname" -}}
{{- $name := default "postgresql" .Values.postgresql.nameOverride -}}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" -}}
{{- end -}}

{{/*
Get the Postgresql credentials secret name.
*/}}
{{- define "hapi-fhir-jpaserver.postgresql.secretName" -}}
{{- if .Values.postgresql.enabled -}}
{{- if .Values.postgresql.auth.existingSecret -}}
{{- printf "%s" .Values.postgresql.auth.existingSecret -}}
{{- else -}}
{{- printf "%s" (include "hapi-fhir-jpaserver.postgresql.fullname" .) -}}
{{- end -}}
{{- else }}
{{- if .Values.externalDatabase.existingSecret -}}
{{- printf "%s" .Values.externalDatabase.existingSecret -}}
{{- else -}}
{{ printf "%s-%s" (include "hapi-fhir-jpaserver.fullname" .) "external-db" }}
{{- end -}}
{{- end -}}
{{- end -}}

{{/*
Get the Postgresql credentials secret key.
*/}}
{{- define "hapi-fhir-jpaserver.postgresql.secretKey" -}}
{{- if .Values.postgresql.enabled -}}
{{- if .Values.postgresql.auth.username -}}
{{- printf "%s" .Values.postgresql.auth.secretKeys.userPasswordKey -}}
{{- else -}}
{{- printf "%s" .Values.postgresql.auth.secretKeys.adminPasswordKey -}}
{{- end -}}
{{- else }}
{{- if .Values.externalDatabase.existingSecret -}}
{{- printf "%s" .Values.externalDatabase.existingSecretKey -}}
{{- else -}}
{{- printf "postgres-password" -}}
{{- end -}}
{{- end -}}
{{- end -}}

{{/*
Add environment variables to configure database values
*/}}
{{- define "hapi-fhir-jpaserver.database.host" -}}
{{- ternary (include "hapi-fhir-jpaserver.postgresql.fullname" .) .Values.externalDatabase.host .Values.postgresql.enabled -}}
{{- end -}}

{{/*
Add environment variables to configure database values
*/}}
{{- define "hapi-fhir-jpaserver.database.user" -}}
{{- if .Values.postgresql.enabled -}}
{{- printf "%s" .Values.postgresql.auth.username | default "postgres" -}}
{{- else -}}
{{- printf "%s" .Values.externalDatabase.user -}}
{{- end -}}
{{- end -}}

{{/*
Add environment variables to configure database values
*/}}
{{- define "hapi-fhir-jpaserver.database.name" -}}
{{- ternary .Values.postgresql.auth.database .Values.externalDatabase.database .Values.postgresql.enabled -}}
{{- end -}}

{{/*
Add environment variables to configure database values
*/}}
{{- define "hapi-fhir-jpaserver.database.port" -}}
{{- ternary "5432" .Values.externalDatabase.port .Values.postgresql.enabled -}}
{{- end -}}

{{/*
Create the JDBC URL from the host, port and database name.
*/}}
{{- define "hapi-fhir-jpaserver.database.jdbcUrl" -}}
{{- $host := (include "hapi-fhir-jpaserver.database.host" .) -}}
{{- $port := (include "hapi-fhir-jpaserver.database.port" .) -}}
{{- $name := (include "hapi-fhir-jpaserver.database.name" .) -}}
{{- $appName := .Release.Name -}}
{{ printf "jdbc:postgresql://%s:%d/%s?ApplicationName=%s" $host (int $port) $name $appName }}
{{- end -}}
91
charts/fhirflare-ig-toolkit/templates/deployment.yaml
Normal file
@@ -0,0 +1,91 @@

apiVersion: apps/v1
kind: Deployment
metadata:
  name: {{ include "fhirflare-ig-toolkit.fullname" . }}
  labels:
    {{- include "fhirflare-ig-toolkit.labels" . | nindent 4 }}
spec:
  replicas: {{ .Values.replicaCount | default 1 }}
  selector:
    matchLabels:
      {{- include "fhirflare-ig-toolkit.selectorLabels" . | nindent 6 }}
  strategy:
    type: Recreate
  template:
    metadata:
      labels:
        {{- include "fhirflare-ig-toolkit.selectorLabels" . | nindent 8 }}
      {{- with .Values.podAnnotations }}
      annotations:
        {{- toYaml . | nindent 8 }}
      {{- end }}
    spec:
      {{- with .Values.imagePullSecrets }}
      imagePullSecrets:
        {{- toYaml . | nindent 8 }}
      {{- end }}
      securityContext:
        {{- toYaml .Values.podSecurityContext | nindent 8 }}
      containers:
        - name: {{ .Chart.Name }}
          securityContext:
            {{- toYaml .Values.securityContext | nindent 12 }}
          image: "{{ .Values.image.repository }}:{{ .Values.image.tag | default .Chart.AppVersion }}"
          imagePullPolicy: {{ .Values.image.pullPolicy }}
          args: ["supervisord", "-c", "/etc/supervisord.conf"]
          env:
            - name: APP_BASE_URL
              value: {{ .Values.config.appBaseUrl | default "http://localhost:5000" | quote }}
            - name: APP_MODE
              value: {{ .Values.config.appMode | default "lite" | quote }}
            - name: FLASK_APP
              value: {{ .Values.config.flaskApp | default "app.py" | quote }}
            - name: FLASK_ENV
              value: {{ .Values.config.flaskEnv | default "development" | quote }}
            - name: HAPI_FHIR_URL
              value: {{ .Values.config.externalHapiServerUrl | default "http://external-hapi-fhir:8080/fhir" | quote }}
            - name: NODE_PATH
              value: {{ .Values.config.nodePath | default "/usr/lib/node_modules" | quote }}
            - name: TMPDIR
              value: "/tmp-dir"
          ports:
            - name: http
              containerPort: {{ .Values.service.port | default 5000 }}
              protocol: TCP
          volumeMounts:
            - name: logs
              mountPath: /app/logs
            - name: tmp-dir
              mountPath: /tmp-dir
          {{- with .Values.resources }}
          resources:
            {{- toYaml . | nindent 12 }}
          {{- end }}
          {{- with .Values.livenessProbe }}
          livenessProbe:
            {{- toYaml . | nindent 12 }}
          {{- end }}
          {{- with .Values.readinessProbe }}
          readinessProbe:
            {{- toYaml . | nindent 12 }}
          {{- end }}
      volumes:
        - name: logs
          emptyDir: {}
        - name: tmp-dir
          emptyDir: {}
      # Always require Intel 64-bit architecture nodes
      nodeSelector:
        kubernetes.io/arch: amd64
        {{- with .Values.nodeSelector }}
        # Merge with user-defined nodeSelectors if any
        {{- toYaml . | nindent 8 }}
        {{- end }}
      {{- with .Values.affinity }}
      affinity:
        {{- toYaml . | nindent 8 }}
      {{- end }}
      {{- with .Values.tolerations }}
      tolerations:
        {{- toYaml . | nindent 8 }}
      {{- end }}
36
charts/fhirflare-ig-toolkit/templates/ingress.yaml
Normal file
@@ -0,0 +1,36 @@

{{- if .Values.ingress.enabled -}}
{{- $fullName := include "fhirflare-ig-toolkit.fullname" . -}}
{{- if semverCompare ">=1.19-0" .Capabilities.KubeVersion.GitVersion }}
apiVersion: networking.k8s.io/v1
{{- else if semverCompare ">=1.14-0" .Capabilities.KubeVersion.GitVersion }}
apiVersion: networking.k8s.io/v1beta1
{{- else }}
apiVersion: extensions/v1beta1
{{- end }}
kind: Ingress
metadata:
  name: {{ $fullName }}
  labels:
    {{- include "fhirflare-ig-toolkit.labels" . | nindent 4 }}
  {{- with .Values.ingress.annotations }}
  annotations:
    {{- toYaml . | nindent 4 }}
  {{- end }}
spec:
  rules:
    - http:
        paths:
          - path: /
            {{- if semverCompare ">=1.19-0" .Capabilities.KubeVersion.GitVersion }}
            pathType: Prefix
            backend:
              service:
                name: {{ $fullName }}
                port:
                  number: {{ .Values.service.port | default 5000 }}
            {{- else }}
            backend:
              serviceName: {{ $fullName }}
              servicePort: {{ .Values.service.port | default 5000 }}
            {{- end }}
{{- end }}
18
charts/fhirflare-ig-toolkit/templates/service.yaml
Normal file
@@ -0,0 +1,18 @@

apiVersion: v1
kind: Service
metadata:
  name: {{ include "fhirflare-ig-toolkit.fullname" . }}
  labels:
    {{- include "fhirflare-ig-toolkit.labels" . | nindent 4 }}
spec:
  type: {{ .Values.service.type | default "ClusterIP" }}
  ports:
    - name: http
      port: {{ .Values.service.port | default 5000 }}
      targetPort: {{ .Values.service.port | default 5000 }}
      protocol: TCP
      {{- if and (eq .Values.service.type "NodePort") .Values.service.nodePort }}
      nodePort: {{ .Values.service.nodePort }}
      {{- end }}
  selector:
    {{- include "fhirflare-ig-toolkit.selectorLabels" . | nindent 4 }}
@@ -0,0 +1,41 @@

apiVersion: v1
kind: Pod
metadata:
  name: "{{ .Release.Name }}-fhirflare-test-endpoint"
  labels:
    helm.sh/chart: "{{ .Chart.Name }}-{{ .Chart.Version }}"
    app.kubernetes.io/name: {{ .Chart.Name }}
    app.kubernetes.io/instance: {{ .Release.Name }}
    app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
    app.kubernetes.io/managed-by: {{ .Release.Service }}
    app.kubernetes.io/component: tests
  annotations:
    "helm.sh/hook": test
spec:
  restartPolicy: Never
  containers:
    - name: test-fhirflare-endpoint
      image: curlimages/curl:8.12.1
      command: ["curl", "--fail-with-body", "--retry", "5", "--retry-delay", "10"]
      args: ["http://fhirflare:5000"]
      securityContext:
        allowPrivilegeEscalation: false
        capabilities:
          drop:
            - ALL
        privileged: false
        readOnlyRootFilesystem: true
        runAsGroup: 65534
        runAsNonRoot: true
        runAsUser: 65534
        seccompProfile:
          type: RuntimeDefault
      resources:
        limits:
          cpu: 150m
          ephemeral-storage: 2Gi
          memory: 192Mi
        requests:
          cpu: 100m
          ephemeral-storage: 50Mi
          memory: 128Mi
89
charts/fhirflare-ig-toolkit/values.yaml
Normal file
@@ -0,0 +1,89 @@

# Default values for fhirflare-ig-toolkit
replicaCount: 1

image:
  repository: ghcr.io/jgsuess/fhirflare-ig-toolkit
  pullPolicy: Always
  tag: "latest"

imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""

# FHIRflare specific configuration
config:
  # Application mode: "lite" means using external HAPI server, "standalone" means running with embedded HAPI server
  appMode: "lite"
  # URL for the external HAPI FHIR server when in lite mode
  externalHapiServerUrl: "http://external-hapi-fhir:8080/fhir"
  appBaseUrl: "http://localhost:5000"
  flaskApp: "app.py"
  flaskEnv: "development"
  nodePath: "/usr/lib/node_modules"

service:
  type: ClusterIP
  port: 5000
  nodePort: null

podAnnotations: {}

# podSecurityContext:
#   fsGroup: 65532
#   fsGroupChangePolicy: OnRootMismatch
#   runAsNonRoot: true
#   runAsGroup: 65532
#   runAsUser: 65532
#   seccompProfile:
#     type: RuntimeDefault

# securityContext:
#   allowPrivilegeEscalation: false
#   capabilities:
#     drop:
#       - ALL
#   privileged: false
#   readOnlyRootFilesystem: true
#   runAsGroup: 65532
#   runAsNonRoot: true
#   runAsUser: 65532
#   seccompProfile:
#     type: RuntimeDefault

resources:
  limits:
    cpu: 500m
    memory: 512Mi
    ephemeral-storage: 1Gi
  requests:
    cpu: 100m
    memory: 128Mi
    ephemeral-storage: 100Mi

livenessProbe:
  httpGet:
    path: /
    port: http
  initialDelaySeconds: 30
  periodSeconds: 10
  timeoutSeconds: 5
  failureThreshold: 6
  successThreshold: 1

readinessProbe:
  httpGet:
    path: /
    port: http
  initialDelaySeconds: 5
  periodSeconds: 10
  timeoutSeconds: 5
  failureThreshold: 6
  successThreshold: 1

nodeSelector: {}
tolerations: []
affinity: {}

ingress:
  # -- whether to create a primitive Ingress to expose the FHIR server HTTP endpoint
  enabled: false
23
charts/install.sh
Executable file
@@ -0,0 +1,23 @@

#!/bin/bash
#
# FHIRFLARE-IG-Toolkit Installation Script
#
# Description:
#   This script installs the FHIRFLARE-IG-Toolkit Helm chart into a Kubernetes cluster.
#   It adds the FHIRFLARE-IG-Toolkit Helm repository and then installs the chart
#   in the 'flare' namespace, creating the namespace if it doesn't exist.
#
# Usage:
#   ./install.sh
#
# Requirements:
#   - Helm (v3+)
#   - kubectl configured with access to your Kubernetes cluster
#

# Add the FHIRFLARE-IG-Toolkit Helm repository
helm repo add flare https://jgsuess.github.io/FHIRFLARE-IG-Toolkit/

# Install the FHIRFLARE-IG-Toolkit chart in the 'flare' namespace
helm install flare/fhirflare-ig-toolkit --namespace flare --create-namespace --generate-name --set hapi-fhir-jpaserver.postgresql.primary.persistence.storageClass=gp2 --atomic
21
docker-compose.yml
Normal file
@@ -0,0 +1,21 @@

version: '3.8'
services:
  fhirflare:
    build:
      context: .
      dockerfile: Dockerfile.lite
    ports:
      - "5000:5000"
      - "8080:8080"
    volumes:
      - ./instance:/app/instance
      - ./static/uploads:/app/static/uploads
      - ./logs:/app/logs
    environment:
      - FLASK_APP=app.py
      - FLASK_ENV=development
      - NODE_PATH=/usr/lib/node_modules
      - APP_MODE=standalone
      - APP_BASE_URL=http://localhost:5000
      - HAPI_FHIR_URL=https://smile.sparked-fhir.com/aucore/fhir/DEFAULT/
    command: supervisord -c /etc/supervisord.conf
22
docker-compose/all-in-one/docker-compose.yml
Normal file
@@ -0,0 +1,22 @@

# This docker-compose file uses ephemeral Docker named volumes for all data storage.
# These volumes persist only as long as the Docker volumes exist and are deleted if you run `docker-compose down -v`.
# No data is stored on the host filesystem. If you want persistent storage, replace these with host mounts.
services:
  fhirflare-standalone:
    image: ${FHIRFLARE_IMAGE:-ghcr.io/sudo-jhare/fhirflare-ig-toolkit-standalone:latest}
    container_name: fhirflare-standalone
    ports:
      - "5000:5000"
      - "8080:8080"
    volumes:
      - fhirflare-instance:/app/instance
      - fhirflare-uploads:/app/static/uploads
      - fhirflare-h2-data:/app/h2-data
      - fhirflare-logs:/app/logs
    restart: unless-stopped

volumes:
  fhirflare-instance:
  fhirflare-uploads:
  fhirflare-h2-data:
  fhirflare-logs:
5
docker-compose/all-in-one/down.sh
Executable file
@@ -0,0 +1,5 @@

#!/bin/bash

# Stop and remove all containers defined in the Docker Compose file,
# along with any anonymous volumes attached to them.
docker compose down --volumes

5
docker-compose/all-in-one/up.sh
Executable file
@@ -0,0 +1,5 @@

#!/bin/bash

# Run Docker Compose
docker compose up --detach --force-recreate --renew-anon-volumes --always-recreate-deps
18
docker-compose/lite/local/application.yaml
Normal file
@@ -0,0 +1,18 @@

hapi.fhir:
  ig_runtime_upload_enabled: false
  narrative_enabled: true
  logical_urls:
    - http://terminology.hl7.org/*
    - https://terminology.hl7.org/*
    - http://snomed.info/*
    - https://snomed.info/*
    - http://unitsofmeasure.org/*
    - https://unitsofmeasure.org/*
    - http://loinc.org/*
    - https://loinc.org/*
  cors:
    allow_Credentials: true
    allowed_origin:
      - '*'
  tester.home.name: FHIRFLARE Tester
  inline_resource_storage_below_size: 4000
50
docker-compose/lite/local/docker-compose.yml
Normal file
@@ -0,0 +1,50 @@

services:
  fhirflare:
    image: ${FHIRFLARE_IMAGE:-ghcr.io/sudo-jhare/fhirflare-ig-toolkit-lite:latest}
    ports:
      - "5000:5000"
    # Ephemeral Docker named volumes for all data storage. No data is stored on the host filesystem.
    volumes:
      - fhirflare-instance:/app/instance
      - fhirflare-uploads:/app/static/uploads
      - fhirflare-h2-data:/app/h2-data
      - fhirflare-logs:/app/logs
    environment:
      - FLASK_APP=app.py
      - FLASK_ENV=development
      - NODE_PATH=/usr/lib/node_modules
      - APP_MODE=lite
      - APP_BASE_URL=http://localhost:5000
      - HAPI_FHIR_URL=http://fhir:8080/fhir
    command: supervisord -c /etc/supervisord.conf

  fhir:
    container_name: hapi
    image: "hapiproject/hapi:v8.2.0-1"
    ports:
      - "8080:8080"
    configs:
      - source: hapi
        target: /app/config/application.yaml
    depends_on:
      - db

  db:
    image: "postgres:17.2-bookworm"
    restart: always
    environment:
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: admin
      POSTGRES_DB: hapi
    volumes:
      - ./hapi.postgress.data:/var/lib/postgresql/data

configs:
  hapi:
    file: ./application.yaml

volumes:
  fhirflare-instance:
  fhirflare-uploads:
  fhirflare-h2-data:
  fhirflare-logs:
5
docker-compose/lite/local/down.sh
Executable file
@@ -0,0 +1,5 @@

#!/bin/bash

# Stop and remove all containers defined in the Docker Compose file,
# along with any anonymous volumes attached to them.
docker compose down --volumes
19
docker-compose/lite/local/readme.md
Normal file
@@ -0,0 +1,19 @@

# FHIRFLARE IG Toolkit

This directory provides scripts and configuration to start and stop a FHIRFLARE instance with an attached HAPI FHIR server using Docker Compose.

## Usage

- To start the FHIRFLARE toolkit and HAPI server:
  ```sh
  ./up.sh
  ```

- To stop and remove the containers and volumes:
  ```sh
  ./down.sh
  ```

The web interface will be available at [http://localhost:5000](http://localhost:5000) and the HAPI FHIR server at [http://localhost:8080/fhir](http://localhost:8080/fhir).

For more details, see the configuration files in this directory.
5
docker-compose/lite/local/up.sh
Executable file
@@ -0,0 +1,5 @@

#!/bin/bash

# Run Docker Compose
docker compose up --detach --force-recreate --renew-anon-volumes --always-recreate-deps
25
docker-compose/lite/remote/docker-compose.yml
Normal file
@ -0,0 +1,25 @@
services:
  fhirflare:
    image: ${FHIRFLARE_IMAGE:-ghcr.io/sudo-jhare/fhirflare-ig-toolkit-lite:latest}
    ports:
      - "5000:5000"
    # Ephemeral Docker named volumes for all data storage. No data is stored on the host filesystem.
    volumes:
      - fhirflare-instance:/app/instance
      - fhirflare-uploads:/app/static/uploads
      - fhirflare-h2-data:/app/h2-data
      - fhirflare-logs:/app/logs
    environment:
      - FLASK_APP=app.py
      - FLASK_ENV=development
      - NODE_PATH=/usr/lib/node_modules
      - APP_MODE=lite
      - APP_BASE_URL=http://localhost:5000
      - HAPI_FHIR_URL=https://cdr.fhirlab.net/fhir
    command: supervisord -c /etc/supervisord.conf

volumes:
  fhirflare-instance:
  fhirflare-uploads:
  fhirflare-h2-data:
  fhirflare-logs:
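The `image:` line above uses shell-style default expansion, and Compose interpolation follows the same rule: `${FHIRFLARE_IMAGE:-…}` falls back to the GHCR image only when the variable is unset or empty. A quick sketch of that behavior (the `localhost:5001` tag is a hypothetical example, not a value from this repo):

```shell
# Unset: the fallback after ':-' is used.
unset FHIRFLARE_IMAGE
echo "${FHIRFLARE_IMAGE:-ghcr.io/sudo-jhare/fhirflare-ig-toolkit-lite:latest}"

# Set: the variable's own value wins.
FHIRFLARE_IMAGE=localhost:5001/fhirflare:dev   # hypothetical local registry tag
echo "${FHIRFLARE_IMAGE:-ghcr.io/sudo-jhare/fhirflare-ig-toolkit-lite:latest}"
```

Exporting `FHIRFLARE_IMAGE` before running `./up.sh` is therefore enough to pin a specific build without editing the compose file.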
5
docker-compose/lite/remote/down.sh
Executable file
@ -0,0 +1,5 @@
#!/bin/bash

# Stop and remove all containers defined in the Docker Compose file, along with
# any anonymous volumes attached to them and the named volumes it declares.
docker compose down --volumes
19
docker-compose/lite/remote/readme.md
Normal file
@ -0,0 +1,19 @@
# FHIRFLARE IG Toolkit

This directory provides scripts and configuration to start and stop a FHIRFLARE instance that talks to a remote HAPI FHIR server, using Docker Compose.

## Usage

- To start the FHIRFLARE toolkit (run from this directory):
  ```sh
  ./up.sh
  ```

- To stop and remove the containers and volumes:
  ```sh
  ./down.sh
  ```

The web interface will be available at [http://localhost:5000](http://localhost:5000); FHIR requests go to the remote HAPI server configured in `docker-compose.yml` (https://cdr.fhirlab.net/fhir).

For more details, see the configuration files in this directory.
5
docker-compose/lite/remote/up.sh
Executable file
@ -0,0 +1,5 @@
#!/bin/bash

# Run Docker Compose in detached mode, recreating containers, dependencies,
# and anonymous volumes so each start begins from a clean state.
docker compose up --detach --force-recreate --renew-anon-volumes --always-recreate-deps
66
docker/Dockerfile
Normal file
@ -0,0 +1,66 @@
# ------------------------------------------------------------------------------
# Dockerfile for FHIRFLARE-IG-Toolkit (Optimized for Python/Flask)
#
# This Dockerfile builds a container for the FHIRFLARE-IG-Toolkit application.
#
# Key Features:
# - Uses python:3.11-slim as the base image for a minimal, secure Python runtime.
# - Installs Node.js and global NPM packages (gofsh, fsh-sushi) for FHIR IG tooling.
# - Sets up a Python virtual environment and installs all Python dependencies.
# - Installs and configures Supervisor to manage the Flask app and related processes.
# - Copies all necessary application code, templates, static files, and configuration.
# - Exposes ports 5000 (Flask) and 8080 (optional, for compatibility).
# - Entrypoint runs Supervisor for process management.
#
# Notes:
# - The Dockerfile is optimized for Python. Tomcat/Java is not included.
# - Node.js is installed because the FHIR IG tooling (GoFSH/SUSHI) requires it.
# - The image is suitable for development and production with minimal changes.
# ------------------------------------------------------------------------------

# Optimized Dockerfile for Python (Flask)
FROM python:3.11-slim AS base

# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    coreutils \
    && rm -rf /var/lib/apt/lists/*

# Install Node.js for GoFSH/SUSHI
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - \
    && apt-get install -y --no-install-recommends nodejs \
    && npm install -g gofsh fsh-sushi \
    && rm -rf /var/lib/apt/lists/*

# Set workdir
WORKDIR /app

# Copy requirements and install Python dependencies
COPY requirements.txt .
RUN python -m venv /app/venv \
    && . /app/venv/bin/activate \
    && pip install --upgrade pip \
    && pip install --no-cache-dir -r requirements.txt \
    && pip uninstall -y fhirpath || true \
    && pip install --no-cache-dir fhirpathpy \
    && pip install supervisor

# Copy application files
COPY app.py .
COPY services.py .
COPY forms.py .
COPY package.py .
COPY templates/ templates/
COPY static/ static/
COPY tests/ tests/
COPY supervisord.conf /etc/supervisord.conf

# Expose ports
EXPOSE 5000 8080

# Set environment
ENV PATH="/app/venv/bin:$PATH"

# Start supervisord
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
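The `ENV PATH="/app/venv/bin:$PATH"` line is what makes the venv's interpreter and scripts the default at runtime without ever sourcing `activate`: whichever `python` appears first on `PATH` wins. A minimal sketch of the same mechanism (the `/tmp/demo-venv` path is a throwaway for illustration; `--without-pip` just keeps creation fast):

```shell
# Create a bare venv and resolve `python` with its bin/ prepended to PATH.
python3 -m venv --without-pip /tmp/demo-venv
PATH="/tmp/demo-venv/bin:$PATH" command -v python   # prints /tmp/demo-venv/bin/python
```

This is why the later `CMD` can invoke `supervisord` (installed into the venv by pip) directly: `/app/venv/bin` precedes the system directories on `PATH`.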
7
docker/build-docker.sh
Executable file
@ -0,0 +1,7 @@
#!/bin/bash
# Build FHIRFLARE-IG-Toolkit Docker image

# Build the image using the Dockerfile in this directory, with the repository
# root (..) as the build context so the COPY paths resolve.
docker build -f Dockerfile -t fhirflare-ig-toolkit:latest ..

echo "Docker image built successfully"
376
forms.py
@ -1,19 +1,369 @@
# forms.py
from flask_wtf import FlaskForm
from wtforms import StringField, SelectField, TextAreaField, BooleanField, SubmitField, FileField, PasswordField, SelectMultipleField
from wtforms.validators import DataRequired, Regexp, ValidationError, URL, Optional, InputRequired
from flask import request
import json
import xml.etree.ElementTree as ET
import re
import logging
import os

logger = logging.getLogger(__name__)

class RetrieveSplitDataForm(FlaskForm):
    """Form for retrieving FHIR bundles and splitting them into individual resources."""
    fhir_server_url = StringField('FHIR Server URL', validators=[URL(), Optional()],
                                  render_kw={'placeholder': 'e.g., https://hapi.fhir.org/baseR4'})
    auth_type = SelectField('Authentication Type (for Custom URL)', choices=[
        ('none', 'None'),
        ('bearerToken', 'Bearer Token'),
        ('basicAuth', 'Basic Authentication')
    ], default='none', validators=[Optional()])
    auth_token = StringField('Bearer Token', validators=[Optional()],
                             render_kw={'placeholder': 'Enter Bearer Token', 'type': 'password'})
    basic_auth_username = StringField('Username', validators=[Optional()],
                                      render_kw={'placeholder': 'Enter Basic Auth Username'})
    basic_auth_password = PasswordField('Password', validators=[Optional()],
                                        render_kw={'placeholder': 'Enter Basic Auth Password'})
    validate_references = BooleanField('Fetch Referenced Resources', default=False,
                                       description="If checked, fetches resources referenced by the initial bundles.")
    fetch_reference_bundles = BooleanField('Fetch Full Reference Bundles (instead of individual resources)', default=False,
                                           description="Requires 'Fetch Referenced Resources'. Fetches e.g. /Patient instead of Patient/id for each reference.",
                                           render_kw={'data-dependency': 'validate_references'})
    split_bundle_zip = FileField('Upload Bundles to Split (ZIP)', validators=[Optional()],
                                 render_kw={'accept': '.zip'})
    submit_retrieve = SubmitField('Retrieve Bundles')
    submit_split = SubmitField('Split Bundles')

    def validate(self, extra_validators=None):
        if not super().validate(extra_validators):
            return False
        if self.fetch_reference_bundles.data and not self.validate_references.data:
            self.fetch_reference_bundles.errors.append('Cannot fetch full reference bundles unless "Fetch Referenced Resources" is also checked.')
            return False
        if self.auth_type.data == 'bearerToken' and self.submit_retrieve.data and not self.auth_token.data:
            self.auth_token.errors.append('Bearer Token is required when Bearer Token authentication is selected.')
            return False
        if self.auth_type.data == 'basicAuth' and self.submit_retrieve.data:
            if not self.basic_auth_username.data:
                self.basic_auth_username.errors.append('Username is required for Basic Authentication.')
                return False
            if not self.basic_auth_password.data:
                self.basic_auth_password.errors.append('Password is required for Basic Authentication.')
                return False
        if self.split_bundle_zip.data:
            if not self.split_bundle_zip.data.filename.lower().endswith('.zip'):
                self.split_bundle_zip.errors.append('File must be a ZIP file.')
                return False
        return True

class IgImportForm(FlaskForm):
    """Form for importing Implementation Guides."""
    package_name = StringField('Package Name', validators=[
        DataRequired(),
        Regexp(r'^[a-zA-Z0-9][a-zA-Z0-9\-\.]*[a-zA-Z0-9]$', message="Invalid package name format.")
    ], render_kw={'placeholder': 'e.g., hl7.fhir.au.core'})
    package_version = StringField('Package Version', validators=[
        DataRequired(),
        Regexp(r'^[a-zA-Z0-9\.\-]+$', message="Invalid version format. Use alphanumeric characters, dots, or hyphens (e.g., 1.2.3, 1.1.0-preview, current).")
    ], render_kw={'placeholder': 'e.g., 1.1.0-preview'})
    dependency_mode = SelectField('Dependency Mode', choices=[
        ('recursive', 'Current Recursive'),
        ('patch-canonical', 'Patch Canonical Versions'),
        ('tree-shaking', 'Tree Shaking (Only Used Dependencies)')
    ], default='recursive')
    submit = SubmitField('Import')

class ManualIgImportForm(FlaskForm):
    """Form for manually importing Implementation Guides via file or URL."""
    import_mode = SelectField('Import Mode', choices=[
        ('file', 'Upload File'),
        ('url', 'From URL')
    ], default='file', validators=[DataRequired()])
    tgz_file = FileField('IG Package File (.tgz)', validators=[Optional()],
                         render_kw={'accept': '.tgz'})
    tgz_url = StringField('IG Package URL', validators=[Optional(), URL()],
                          render_kw={'placeholder': 'e.g., https://example.com/hl7.fhir.au.core-1.1.0-preview.tgz'})
    dependency_mode = SelectField('Dependency Mode', choices=[
        ('recursive', 'Current Recursive'),
        ('patch-canonical', 'Patch Canonical Versions'),
        ('tree-shaking', 'Tree Shaking (Only Used Dependencies)')
    ], default='recursive')
    resolve_dependencies = BooleanField('Resolve Dependencies', default=True,
                                        render_kw={'class': 'form-check-input'})
    submit = SubmitField('Import')

    def validate(self, extra_validators=None):
        if not super().validate(extra_validators):
            return False
        mode = self.import_mode.data
        has_file = request and request.files and self.tgz_file.name in request.files and request.files[self.tgz_file.name].filename != ''
        has_url = bool(self.tgz_url.data)  # True only if a non-empty URL was supplied

        # Ensure exactly one input method is used
        inputs_provided = sum([has_file, has_url])
        if inputs_provided != 1:
            if inputs_provided == 0:
                self.import_mode.errors.append('Please provide input for one import method (File or URL).')
            else:
                self.import_mode.errors.append('Please use only one import method at a time.')
            return False

        # Validate based on import mode
        if mode == 'file':
            if not has_file:
                self.tgz_file.errors.append('A .tgz file is required for File import.')
                return False
            if not self.tgz_file.data.filename.lower().endswith('.tgz'):
                self.tgz_file.errors.append('File must be a .tgz file.')
                return False
        elif mode == 'url':
            if not has_url:
                self.tgz_url.errors.append('A valid URL is required for URL import.')
                return False
            if not self.tgz_url.data.lower().endswith('.tgz'):
                self.tgz_url.errors.append('URL must point to a .tgz file.')
                return False
        else:
            self.import_mode.errors.append('Invalid import mode selected.')
            return False

        return True

class ValidationForm(FlaskForm):
    """Form for validating FHIR samples."""
    package_name = StringField('Package Name', validators=[DataRequired()])
    version = StringField('Package Version', validators=[DataRequired()])
    include_dependencies = BooleanField('Include Dependencies', default=True)
    mode = SelectField('Validation Mode', choices=[
        ('single', 'Single Resource'),
        ('bundle', 'Bundle')
    ], default='single')
    sample_input = TextAreaField('Sample Input', validators=[DataRequired()])
    submit = SubmitField('Validate')

class FSHConverterForm(FlaskForm):
    """Form for converting FHIR resources to FSH."""
    package = SelectField('FHIR Package (Optional)', choices=[('', 'None')], validators=[Optional()])
    input_mode = SelectField('Input Mode', choices=[
        ('file', 'Upload File'),
        ('text', 'Paste Text')
    ], validators=[DataRequired()])
    fhir_file = FileField('FHIR Resource File (JSON/XML)', validators=[Optional()])
    fhir_text = TextAreaField('FHIR Resource Text (JSON/XML)', validators=[Optional()])
    output_style = SelectField('Output Style', choices=[
        ('file-per-definition', 'File per Definition'),
        ('group-by-fsh-type', 'Group by FSH Type'),
        ('group-by-profile', 'Group by Profile'),
        ('single-file', 'Single File')
    ], validators=[DataRequired()])
    log_level = SelectField('Log Level', choices=[
        ('error', 'Error'),
        ('warn', 'Warn'),
        ('info', 'Info'),
        ('debug', 'Debug')
    ], validators=[DataRequired()])
    fhir_version = SelectField('FHIR Version', choices=[
        ('', 'Auto-detect'),
        ('4.0.1', 'R4'),
        ('4.3.0', 'R4B'),
        ('5.0.0', 'R5')
    ], validators=[Optional()])
    fishing_trip = BooleanField('Run Fishing Trip (Round-Trip Validation with SUSHI)', default=False)
    dependencies = TextAreaField('Dependencies (e.g., hl7.fhir.us.core@6.1.0)', validators=[Optional()])
    indent_rules = BooleanField('Indent Rules with Context Paths', default=False)
    meta_profile = SelectField('Meta Profile Handling', choices=[
        ('only-one', 'Only One Profile (Default)'),
        ('first', 'First Profile'),
        ('none', 'Ignore Profiles')
    ], validators=[DataRequired()])
    alias_file = FileField('Alias FSH File', validators=[Optional()])
    no_alias = BooleanField('Disable Alias Generation', default=False)
    submit = SubmitField('Convert to FSH')

    def validate(self, extra_validators=None):
        if not super().validate(extra_validators):
            return False
        has_file_in_request = request and request.files and self.fhir_file.name in request.files and request.files[self.fhir_file.name].filename != ''
        if self.input_mode.data == 'file' and not has_file_in_request:
            if not self.fhir_file.data:
                self.fhir_file.errors.append('File is required when input mode is Upload File.')
                return False
        if self.input_mode.data == 'text' and not self.fhir_text.data:
            self.fhir_text.errors.append('Text input is required when input mode is Paste Text.')
            return False
        if self.input_mode.data == 'text' and self.fhir_text.data:
            try:
                content = self.fhir_text.data.strip()
                if not content: pass
                elif content.startswith('{'): json.loads(content)
                elif content.startswith('<'): ET.fromstring(content)
                else:
                    self.fhir_text.errors.append('Text input must be valid JSON or XML.')
                    return False
            except (json.JSONDecodeError, ET.ParseError):
                self.fhir_text.errors.append('Invalid JSON or XML format.')
                return False
        if self.dependencies.data:
            for dep in self.dependencies.data.splitlines():
                dep = dep.strip()
                if dep and not re.match(r'^[a-zA-Z0-9\-\.]+@[a-zA-Z0-9\.\-]+$', dep):
                    self.dependencies.errors.append(f'Invalid dependency format: "{dep}". Use package@version (e.g., hl7.fhir.us.core@6.1.0).')
                    return False
        has_alias_file_in_request = request and request.files and self.alias_file.name in request.files and request.files[self.alias_file.name].filename != ''
        alias_file_data = self.alias_file.data or (request.files.get(self.alias_file.name) if request else None)
        if alias_file_data and alias_file_data.filename:
            if not alias_file_data.filename.lower().endswith('.fsh'):
                self.alias_file.errors.append('Alias file should have a .fsh extension.')
        return True

class TestDataUploadForm(FlaskForm):
    """Form for uploading FHIR test data."""
    fhir_server_url = StringField('Target FHIR Server URL', validators=[DataRequired(), URL()],
                                  render_kw={'placeholder': 'e.g., http://localhost:8080/fhir'})
    auth_type = SelectField('Authentication Type', choices=[
        ('none', 'None'),
        ('bearerToken', 'Bearer Token'),
        ('basic', 'Basic Authentication')
    ], default='none')
    auth_token = StringField('Bearer Token', validators=[Optional()],
                             render_kw={'placeholder': 'Enter Bearer Token', 'type': 'password'})
    username = StringField('Username', validators=[Optional()],
                           render_kw={'placeholder': 'Enter Basic Auth Username'})
    password = PasswordField('Password', validators=[Optional()],
                             render_kw={'placeholder': 'Enter Basic Auth Password'})
    test_data_file = FileField('Select Test Data File(s)', validators=[InputRequired("Please select at least one file.")],
                               render_kw={'multiple': True, 'accept': '.json,.xml,.zip'})
    validate_before_upload = BooleanField('Validate Resources Before Upload?', default=False,
                                          description="Validate resources against selected package profile before uploading.")
    validation_package_id = SelectField('Validation Profile Package (Optional)',
                                        choices=[('', '-- Select Package for Validation --')],
                                        validators=[Optional()],
                                        description="Select the processed IG package to use for validation.")
    upload_mode = SelectField('Upload Mode', choices=[
        ('individual', 'Individual Resources'),
        ('transaction', 'Transaction Bundle')
    ], default='individual')
    use_conditional_uploads = BooleanField('Use Conditional Upload (Individual Mode Only)?', default=True,
                                           description="If checked, checks resource existence (GET) and uses If-Match (PUT) or creates (PUT). If unchecked, uses simple PUT for all.")
    error_handling = SelectField('Error Handling', choices=[
        ('stop', 'Stop on First Error'),
        ('continue', 'Continue on Error')
    ], default='stop')
    submit = SubmitField('Upload and Process')

    def validate(self, extra_validators=None):
        if not super().validate(extra_validators):
            return False
        if self.validate_before_upload.data and not self.validation_package_id.data:
            self.validation_package_id.errors.append('Please select a package to validate against when pre-upload validation is enabled.')
            return False
        if self.use_conditional_uploads.data and self.upload_mode.data == 'transaction':
            self.use_conditional_uploads.errors.append('Conditional Uploads only apply to the "Individual Resources" mode.')
            return False
        if self.auth_type.data == 'bearerToken' and not self.auth_token.data:
            self.auth_token.errors.append('Bearer Token is required when Bearer Token authentication is selected.')
            return False
        if self.auth_type.data == 'basic':
            if not self.username.data:
                self.username.errors.append('Username is required for Basic Authentication.')
                return False
            if not self.password.data:
                self.password.errors.append('Password is required for Basic Authentication.')
                return False
        return True

class FhirRequestForm(FlaskForm):
    """Form for sending ad-hoc requests to a FHIR server."""
    fhir_server_url = StringField('FHIR Server URL', validators=[URL(), Optional()],
                                  render_kw={'placeholder': 'e.g., https://hapi.fhir.org/baseR4'})
    auth_type = SelectField('Authentication Type (for Custom URL)', choices=[
        ('none', 'None'),
        ('bearerToken', 'Bearer Token'),
        ('basicAuth', 'Basic Authentication')
    ], default='none', validators=[Optional()])
    auth_token = StringField('Bearer Token', validators=[Optional()],
                             render_kw={'placeholder': 'Enter Bearer Token', 'type': 'password'})
    basic_auth_username = StringField('Username', validators=[Optional()],
                                      render_kw={'placeholder': 'Enter Basic Auth Username'})
    basic_auth_password = PasswordField('Password', validators=[Optional()],
                                        render_kw={'placeholder': 'Enter Basic Auth Password'})
    submit = SubmitField('Send Request')

    def validate(self, extra_validators=None):
        if not super().validate(extra_validators):
            return False
        if self.fhir_server_url.data:
            if self.auth_type.data == 'bearerToken' and not self.auth_token.data:
                self.auth_token.errors.append('Bearer Token is required when Bearer Token authentication is selected for a custom URL.')
                return False
            if self.auth_type.data == 'basicAuth':
                if not self.basic_auth_username.data:
                    self.basic_auth_username.errors.append('Username is required for Basic Authentication with a custom URL.')
                    return False
                if not self.basic_auth_password.data:
                    self.basic_auth_password.errors.append('Password is required for Basic Authentication with a custom URL.')
                    return False
        return True

class IgYamlForm(FlaskForm):
    """Form to select IGs for YAML generation."""
    igs = SelectMultipleField('Select IGs to include', validators=[DataRequired()])
    generate_yaml = SubmitField('Generate YAML')

# --- New ValidationForm class for the new page ---
# Note: this intentionally replaces the earlier ValidationForm definition above.
class ValidationForm(FlaskForm):
    """Form for validating a single FHIR resource with various options."""
    # Added fields to match the HTML template
    package_name = StringField('Package Name', validators=[Optional()])
    version = StringField('Package Version', validators=[Optional()])

    # Main content fields
    fhir_resource = TextAreaField('FHIR Resource (JSON or XML)', validators=[InputRequired()],
                                  render_kw={'placeholder': 'Paste your FHIR JSON here...'})
    fhir_version = SelectField('FHIR Version', choices=[
        ('4.0.1', 'R4 (4.0.1)'),
        ('3.0.2', 'STU3 (3.0.2)'),
        ('5.0.0', 'R5 (5.0.0)')
    ], default='4.0.1', validators=[DataRequired()])

    # Flags and options from validator settings.pdf
    do_native = BooleanField('Native Validation (doNative)')
    hint_about_must_support = BooleanField('Must Support (hintAboutMustSupport)')
    assume_valid_rest_references = BooleanField('Assume Valid Rest References (assumeValidRestReferences)')
    no_extensible_binding_warnings = BooleanField('Extensible Binding Warnings (noExtensibleBindingWarnings)')
    show_times = BooleanField('Show Times (-show-times)')
    allow_example_urls = BooleanField('Allow Example URLs (-allow-example-urls)')
    check_ips_codes = BooleanField('Check IPS Codes (-check-ips-codes)')
    allow_any_extensions = BooleanField('Allow Any Extensions (-allow-any-extensions)')
    tx_routing = BooleanField('Show Terminology Routing (-tx-routing)')

    # SNOMED CT options
    snomed_ct_version = SelectField('Select SNOMED Version', choices=[
        ('', '-- No selection --'),
        ('intl', 'International edition (900000000000207008)'),
        ('us', 'US edition (731000124108)'),
        ('uk', 'United Kingdom Edition (999000041000000102)'),
        ('es', 'Spanish Language Edition (449081005)'),
        ('nl', 'Netherlands Edition (11000146104)'),
        ('ca', 'Canadian Edition (20611000087101)'),
        ('dk', 'Danish Edition (554471000005108)'),
        ('se', 'Swedish Edition (45991000052106)'),
        ('au', 'Australian Edition (32506021000036107)'),
        ('be', 'Belgium Edition (11000172109)')
    ], validators=[Optional()])

    # Text input fields
    profiles = StringField('Profiles',
                           validators=[Optional()],
                           render_kw={'placeholder': 'e.g. http://hl7.org/fhir/us/core/StructureDefinition/us-core-patient'})
    extensions = StringField('Extensions',
                             validators=[Optional()],
                             render_kw={'placeholder': 'e.g. http://hl7.org/fhir/StructureDefinition/elementdefinition-namespace'})
    terminology_server = StringField('Terminology Server',
                                     validators=[Optional()],
                                     render_kw={'placeholder': 'e.g. http://tx.fhir.org'})

    submit = SubmitField('Validate')
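The package-name and dependency patterns above are plain character-class regexes, so they behave identically under POSIX ERE. A quick sketch checking the same patterns with `grep -E` (patterns copied from `IgImportForm` and `FSHConverterForm.validate`):

```shell
# Package name: alnum at both ends, dots/hyphens allowed in between.
name_pat='^[a-zA-Z0-9][a-zA-Z0-9.-]*[a-zA-Z0-9]$'
# Dependency: package@version, both sides dotted/hyphenated alnum.
dep_pat='^[a-zA-Z0-9.-]+@[a-zA-Z0-9.-]+$'

echo 'hl7.fhir.au.core'       | grep -Eq "$name_pat" && echo name-ok
echo 'hl7.fhir.us.core@6.1.0' | grep -Eq "$dep_pat"  && echo dep-ok
echo '.starts-with-dot'       | grep -Eq "$name_pat" || echo name-rejected
```

Note the dependency pattern requires exactly the `package@version` shape: a missing version (`hl7.fhir.us.core@`) or a doubled `@` fails to match.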
111
hapi-fhir-Setup/README.md
Normal file
@ -0,0 +1,111 @@
# Application Build and Run Guide - MANUAL STEPS

This guide outlines the steps to set up, build, and run the application, including the HAPI FHIR server component and the rest of the application managed via Docker Compose.

## Prerequisites

Before you begin, ensure you have the following installed on your system:

* [Git](https://git-scm.com/)
* [Maven](https://maven.apache.org/)
* [Java Development Kit (JDK)](https://www.oracle.com/java/technologies/downloads/) (Ensure compatibility with the HAPI FHIR version)
* [Docker](https://www.docker.com/products/docker-desktop/)
* [Docker Compose](https://docs.docker.com/compose/install/) (Often included with Docker Desktop)

## Setup and Build

Follow these steps to clone the necessary repository and build the components.

### 1. Clone and Build the HAPI FHIR Server

First, clone the HAPI FHIR JPA Server Starter project and build the server application.

```sh
# Step 1: Clone the repository
git clone https://github.com/hapifhir/hapi-fhir-jpaserver-starter.git hapi-fhir-jpaserver

# Navigate into the cloned directory
cd hapi-fhir-jpaserver
```

Copy the file `hapi-fhir-setup/target/classes/application.yaml` into the `hapi-fhir-jpaserver/target/classes/` directory created above.

```sh
# Step 2: Build the HAPI server package (skipping tests, using 'boot' profile)
# This creates the runnable WAR file in the 'target/' directory
mvn clean package -DskipTests=true -Pboot

# Return to the parent directory (or your project root)
cd ..
```

### 2. Build the Rest of the Application (Docker)

Next, build the Docker images for the remaining parts of the application as defined in your docker-compose.yml file. Run this command from the root directory where your docker-compose.yml file is located.

```sh
# Step 3: Build Docker images without using cache
docker-compose build --no-cache
```

## Running the Application

### Option A: Running the Full Application (Recommended)

Use Docker Compose to start all services, including the HAPI FHIR server if it is configured in your docker-compose.yml. Run this from the root directory containing your docker-compose.yml.

```sh
# Step 4: Start all services defined in docker-compose.yml in detached mode
docker-compose up -d
```

### Option B: Running the HAPI FHIR Server Standalone (Debugging Only)

This method runs only the HAPI FHIR server directly using the built WAR file. Use this primarily for debugging the server in isolation.

```sh
# Navigate into the HAPI server directory where you built it
cd hapi-fhir-jpaserver

# Run the WAR file directly using Java
java -jar target/ROOT.war

# Note: You might need to configure ports or database connections
# separately when running this way, depending on the application's needs.

# Remember to navigate back when done
# cd ..
```

## Useful Docker Commands

Here are some helpful commands for interacting with your running Docker containers:

**Copying files from a container:** To copy a file from a running container to your local machine's current directory:

```sh
# Syntax: docker cp <CONTAINER_ID_OR_NAME>:<PATH_IN_CONTAINER> <LOCAL_DESTINATION_PATH>
docker cp <CONTAINER_ID>:/app/PATH/Filename.ext .
```

(Replace `<CONTAINER_ID>` and `/app/PATH/Filename.ext` with actual values. `.` refers to the current directory on your host machine.)

**Accessing a container's shell:** To get an interactive bash shell inside a running container:

```sh
# Syntax: docker exec -it <CONTAINER_ID_OR_NAME> bash
docker exec -it <CONTAINER_ID> bash
```

(Replace `<CONTAINER_ID>` with the actual container ID or name. You can find this using `docker ps`.)

**Viewing running containers:**

```sh
docker ps
```

**Viewing application logs:**

```sh
# Follow logs for all services
docker-compose logs -f

# Follow logs for a specific service
docker-compose logs -f <SERVICE_NAME>
```

(Replace `<SERVICE_NAME>` with the name defined in your docker-compose.yml.)

**Stopping the application:** To stop the services started with `docker-compose up -d`:

```sh
docker-compose down
```
342
hapi-fhir-Setup/target/classes/application.yaml
Normal file
@ -0,0 +1,342 @@
#Uncomment the "servlet" and "context-path" lines below to make the fhir endpoint available at /example/path/fhir instead of the default value of /fhir
server:
  #  servlet:
  #    context-path: /example/path
  port: 8080
#Adds the option to go to eg. http://localhost:8080/actuator/health for seeing the running configuration
#see https://docs.spring.io/spring-boot/docs/current/reference/html/actuator.html#actuator.endpoints
management:
  #The following configuration will enable the actuator endpoints at /actuator/health, /actuator/info, /actuator/prometheus, /actuator/metrics. For security purposes, only /actuator/health is enabled by default.
  endpoints:
    enabled-by-default: false
    web:
      exposure:
        include: 'health' # or e.g. 'info,health,prometheus,metrics' or '*' for all
  endpoint:
    info:
      enabled: true
    metrics:
      enabled: true
    health:
      enabled: true
      probes:
        enabled: true
      group:
        liveness:
          include:
            - livenessState
            - readinessState
    prometheus:
      enabled: true
  prometheus:
    metrics:
      export:
        enabled: true
spring:
  main:
    allow-circular-references: true
  flyway:
    enabled: false
    baselineOnMigrate: true
    fail-on-missing-locations: false
  datasource:
    #url: 'jdbc:h2:file:./target/database/h2'
    url: jdbc:h2:file:/app/h2-data/fhir;DB_CLOSE_DELAY=-1;AUTO_SERVER=TRUE
    #url: jdbc:h2:mem:test_mem
    username: sa
    password: null
    driverClassName: org.h2.Driver
    max-active: 15

    # database connection pool size
    hikari:
      maximum-pool-size: 10
  jpa:
    properties:
      hibernate.format_sql: false
      hibernate.show_sql: false

      #Hibernate dialect is automatically detected except Postgres and H2.
      #If using H2, then supply the value of ca.uhn.fhir.jpa.model.dialect.HapiFhirH2Dialect
      #If using postgres, then supply the value of ca.uhn.fhir.jpa.model.dialect.HapiFhirPostgresDialect
      hibernate.dialect: ca.uhn.fhir.jpa.model.dialect.HapiFhirH2Dialect
      #      hibernate.hbm2ddl.auto: update
      #      hibernate.jdbc.batch_size: 20
      #      hibernate.cache.use_query_cache: false
      #      hibernate.cache.use_second_level_cache: false
      #      hibernate.cache.use_structured_entries: false
      #      hibernate.cache.use_minimal_puts: false
### These settings will enable fulltext search with lucene or elastic
|
||||
hibernate.search.enabled: false
|
||||
### lucene parameters
|
||||
# hibernate.search.backend.type: lucene
|
||||
# hibernate.search.backend.analysis.configurer: ca.uhn.fhir.jpa.search.HapiHSearchAnalysisConfigurers$HapiLuceneAnalysisConfigurer
|
||||
# hibernate.search.backend.directory.type: local-filesystem
|
||||
# hibernate.search.backend.directory.root: target/lucenefiles
|
||||
# hibernate.search.backend.lucene_version: lucene_current
|
||||
### elastic parameters ===> see also elasticsearch section below <===
|
||||
# hibernate.search.backend.type: elasticsearch
|
||||
# hibernate.search.backend.analysis.configurer: ca.uhn.fhir.jpa.search.HapiHSearchAnalysisConfigurers$HapiElasticAnalysisConfigurer
|
||||
hapi:
|
||||
fhir:
|
||||
### This flag when enabled to true, will avail evaluate measure operations from CR Module.
|
||||
### Flag is false by default, can be passed as command line argument to override.
|
||||
cr:
|
||||
enabled: false
|
||||
caregaps:
|
||||
reporter: "default"
|
||||
section_author: "default"
|
||||
cql:
|
||||
use_embedded_libraries: true
|
||||
compiler:
|
||||
### These are low-level compiler options.
|
||||
### They are not typically needed by most users.
|
||||
# validate_units: true
|
||||
# verify_only: false
|
||||
# compatibility_level: "1.5"
|
||||
error_level: Info
|
||||
signature_level: All
|
||||
# analyze_data_requirements: false
|
||||
# collapse_data_requirements: false
|
||||
# translator_format: JSON
|
||||
# enable_date_range_optimization: true
|
||||
enable_annotations: true
|
||||
enable_locators: true
|
||||
enable_results_type: true
|
||||
enable_detailed_errors: true
|
||||
# disable_list_traversal: false
|
||||
# disable_list_demotion: false
|
||||
# enable_interval_demotion: false
|
||||
# enable_interval_promotion: false
|
||||
# disable_method_invocation: false
|
||||
# require_from_keyword: false
|
||||
# disable_default_model_info_load: false
|
||||
runtime:
|
||||
debug_logging_enabled: false
|
||||
# enable_validation: false
|
||||
# enable_expression_caching: true
|
||||
terminology:
|
||||
valueset_preexpansion_mode: REQUIRE # USE_IF_PRESENT, REQUIRE, IGNORE
|
||||
valueset_expansion_mode: PERFORM_NAIVE_EXPANSION # AUTO, USE_EXPANSION_OPERATION, PERFORM_NAIVE_EXPANSION
|
||||
valueset_membership_mode: USE_EXPANSION # AUTO, USE_VALIDATE_CODE_OPERATION, USE_EXPANSION
|
||||
code_lookup_mode: USE_VALIDATE_CODE_OPERATION # AUTO, USE_VALIDATE_CODE_OPERATION, USE_CODESYSTEM_URL
|
||||
data:
|
||||
search_parameter_mode: USE_SEARCH_PARAMETERS # AUTO, USE_SEARCH_PARAMETERS, FILTER_IN_MEMORY
|
||||
terminology_parameter_mode: FILTER_IN_MEMORY # AUTO, USE_VALUE_SET_URL, USE_INLINE_CODES, FILTER_IN_MEMORY
|
||||
profile_mode: DECLARED # ENFORCED, DECLARED, OPTIONAL, TRUST, OFF
|
||||
|
||||
cdshooks:
|
||||
enabled: false
|
||||
clientIdHeaderName: client_id
|
||||
|
||||
### This enables the swagger-ui at /fhir/swagger-ui/index.html as well as the /fhir/api-docs (see https://hapifhir.io/hapi-fhir/docs/server_plain/openapi.html)
|
||||
openapi_enabled: true
|
||||
### This is the FHIR version. Choose between, DSTU2, DSTU3, R4 or R5
|
||||
fhir_version: R4
|
||||
### Flag is false by default. This flag enables runtime installation of IG's.
|
||||
ig_runtime_upload_enabled: false
|
||||
### This flag when enabled to true, will avail evaluate measure operations from CR Module.
|
||||
|
||||
### enable to use the ApacheProxyAddressStrategy which uses X-Forwarded-* headers
|
||||
### to determine the FHIR server address
|
||||
# use_apache_address_strategy: false
|
||||
### forces the use of the https:// protocol for the returned server address.
|
||||
### alternatively, it may be set using the X-Forwarded-Proto header.
|
||||
# use_apache_address_strategy_https: false
|
||||
### enables the server to overwrite defaults on HTML, css, etc. under the url pattern of eg. /content/custom **
|
||||
### Folder with custom content MUST be named custom. If omitted then default content applies
|
||||
custom_content_path: ./custom
|
||||
### enables the server host custom content. If e.g. the value ./configs/app is supplied then the content
|
||||
### will be served under /web/app
|
||||
#app_content_path: ./configs/app
|
||||
### enable to set the Server URL
|
||||
# server_address: http://hapi.fhir.org/baseR4
|
||||
# defer_indexing_for_codesystems_of_size: 101
|
||||
### Flag is true by default. This flag filters resources during package installation, allowing only those resources with a valid status (e.g. active) to be installed.
|
||||
# validate_resource_status_for_package_upload: false
|
||||
# install_transitive_ig_dependencies: true
|
||||
#implementationguides:
|
||||
### example from registry (packages.fhir.org)
|
||||
# swiss:
|
||||
# name: swiss.mednet.fhir
|
||||
# version: 0.8.0
|
||||
# reloadExisting: false
|
||||
# installMode: STORE_AND_INSTALL
|
||||
# example not from registry
|
||||
# ips_1_0_0:
|
||||
# packageUrl: https://build.fhir.org/ig/HL7/fhir-ips/package.tgz
|
||||
# name: hl7.fhir.uv.ips
|
||||
# version: 1.0.0
|
||||
# supported_resource_types:
|
||||
# - Patient
|
||||
# - Observation
|
||||
##################################################
|
||||
# Allowed Bundle Types for persistence (defaults are: COLLECTION,DOCUMENT,MESSAGE)
|
||||
##################################################
|
||||
# allowed_bundle_types: COLLECTION,DOCUMENT,MESSAGE,TRANSACTION,TRANSACTIONRESPONSE,BATCH,BATCHRESPONSE,HISTORY,SEARCHSET
|
||||
# allow_cascading_deletes: true
|
||||
# allow_contains_searches: true
|
||||
# allow_external_references: true
|
||||
# allow_multiple_delete: true
|
||||
# allow_override_default_search_params: true
|
||||
# auto_create_placeholder_reference_targets: false
|
||||
# mass_ingestion_mode_enabled: false
|
||||
### tells the server to automatically append the current version of the target resource to references at these paths
|
||||
# auto_version_reference_at_paths: Device.patient, Device.location, Device.parent, DeviceMetric.parent, DeviceMetric.source, Observation.device, Observation.subject
|
||||
# ips_enabled: false
|
||||
# default_encoding: JSON
|
||||
# default_pretty_print: true
|
||||
# default_page_size: 20
|
||||
# delete_expunge_enabled: true
|
||||
# enable_repository_validating_interceptor: true
|
||||
# enable_index_missing_fields: false
|
||||
# enable_index_of_type: true
|
||||
# enable_index_contained_resource: false
|
||||
# upliftedRefchains_enabled: true
|
||||
# resource_dbhistory_enabled: false
|
||||
### !!Extended Lucene/Elasticsearch Indexing is still a experimental feature, expect some features (e.g. _total=accurate) to not work as expected!!
|
||||
### more information here: https://hapifhir.io/hapi-fhir/docs/server_jpa/elastic.html
|
||||
advanced_lucene_indexing: false
|
||||
bulk_export_enabled: false
|
||||
bulk_import_enabled: false
|
||||
# language_search_parameter_enabled: true
|
||||
# enforce_referential_integrity_on_delete: false
|
||||
# This is an experimental feature, and does not fully support _total and other FHIR features.
|
||||
# enforce_referential_integrity_on_delete: false
|
||||
# enforce_referential_integrity_on_write: false
|
||||
# etag_support_enabled: true
|
||||
# expunge_enabled: true
|
||||
# client_id_strategy: ALPHANUMERIC
|
||||
# server_id_strategy: SEQUENTIAL_NUMERIC
|
||||
# fhirpath_interceptor_enabled: false
|
||||
# filter_search_enabled: true
|
||||
# graphql_enabled: true
|
||||
narrative_enabled: true
|
||||
mdm_enabled: false
|
||||
mdm_rules_json_location: "mdm-rules.json"
|
||||
## see: https://hapifhir.io/hapi-fhir/docs/interceptors/built_in_server_interceptors.html#jpa-server-retry-on-version-conflicts
|
||||
# userRequestRetryVersionConflictsInterceptorEnabled : false
|
||||
# local_base_urls:
|
||||
# - https://hapi.fhir.org/baseR4
|
||||
# pre_expand_value_sets: true
|
||||
# enable_task_pre_expand_value_sets: true
|
||||
# pre_expand_value_sets_default_count: 1000
|
||||
# pre_expand_value_sets_max_count: 1000
|
||||
# maximum_expansion_size: 1000
|
||||
|
||||
logical_urls:
|
||||
- http://terminology.hl7.org/*
|
||||
- https://terminology.hl7.org/*
|
||||
- http://snomed.info/*
|
||||
- https://snomed.info/*
|
||||
- http://unitsofmeasure.org/*
|
||||
- https://unitsofmeasure.org/*
|
||||
- http://loinc.org/*
|
||||
- https://loinc.org/*
|
||||
# partitioning:
|
||||
# allow_references_across_partitions: false
|
||||
# partitioning_include_in_search_hashes: false
|
||||
# conditional_create_duplicate_identifiers_enabled: false
|
||||
cors:
|
||||
allow_Credentials: true
|
||||
# These are allowed_origin patterns, see: https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/web/cors/CorsConfiguration.html#setAllowedOriginPatterns-java.util.List-
|
||||
allowed_origin:
|
||||
- '*'
|
||||
|
||||
# Search coordinator thread pool sizes
|
||||
search-coord-core-pool-size: 20
|
||||
search-coord-max-pool-size: 100
|
||||
search-coord-queue-capacity: 200
|
||||
|
||||
# Search Prefetch Thresholds.
|
||||
|
||||
# This setting sets the number of search results to prefetch. For example, if this list
|
||||
# is set to [100, 1000, -1] then the server will initially load 100 results and not
|
||||
# attempt to load more. If the user requests subsequent page(s) of results and goes
|
||||
# past 100 results, the system will load the next 900 (up to the following threshold of 1000).
|
||||
# The system will progressively work through these thresholds.
|
||||
# A threshold of -1 means to load all results. Note that if the final threshold is a
|
||||
# number other than -1, the system will never prefetch more than the given number.
|
||||
search_prefetch_thresholds: 13,503,2003,-1
|
||||
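The prefetch behaviour described in the comments above can be sketched as a small function (a minimal illustration of the threshold progression, not HAPI's actual implementation; the function name and sample offsets are made up):

```python
def prefetch_target(thresholds, requested_offset):
    """Return how many results the server would prefetch for a page
    request ending at requested_offset, given progressive thresholds.
    A threshold of -1 means "load all results"."""
    for t in thresholds:
        if t == -1 or requested_offset <= t:
            return t
    # Final threshold is a plain number: never prefetch beyond it.
    return thresholds[-1]

# With the configured value search_prefetch_thresholds: 13,503,2003,-1
thresholds = [13, 503, 2003, -1]
print(prefetch_target(thresholds, 10))    # fits in the first batch -> 13
print(prefetch_target(thresholds, 100))   # past 13, prefetch up to 503
print(prefetch_target(thresholds, 5000))  # past all numeric thresholds -> -1 (all)
```

The server only pays the cost of loading a large result set when a client actually pages that far, which is why the first threshold is kept small.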

    # comma-separated package names, will be @ComponentScan'ed by Spring to allow for creating custom Spring beans
    #custom-bean-packages:

    # comma-separated list of fully qualified interceptor classes.
    # classes listed here will be fetched from the Spring context when combined with 'custom-bean-packages',
    # or will be instantiated via reflection using a no-arg constructor; then registered with the server
    #custom-interceptor-classes:

    # comma-separated list of fully qualified provider classes.
    # classes listed here will be fetched from the Spring context when combined with 'custom-bean-packages',
    # or will be instantiated via reflection using a no-arg constructor; then registered with the server
    #custom-provider-classes:

    # Threadpool size for BATCH'ed GETs in a bundle.
    # bundle_batch_pool_size: 10
    # bundle_batch_pool_max_size: 50

    # logger:
    #   error_format: 'ERROR - ${requestVerb} ${requestUrl}'
    #   format: >-
    #     Path[${servletPath}] Source[${requestHeader.x-forwarded-for}]
    #     Operation[${operationType} ${operationName} ${idOrResourceName}]
    #     UA[${requestHeader.user-agent}] Params[${requestParameters}]
    #     ResponseEncoding[${responseEncodingNoDefault}]
    #   log_exceptions: true
    #   name: fhirtest.access
    # max_binary_size: 104857600
    # max_page_size: 200
    # retain_cached_searches_mins: 60
    # reuse_cached_search_results_millis: 60000
    tester:
      home:
        name: FHIRFLARE Tester
        server_address: http://localhost:8080/fhir
        refuse_to_fetch_third_party_urls: false
        fhir_version: R4
      global:
        name: Global Tester
        server_address: "http://hapi.fhir.org/baseR4"
        refuse_to_fetch_third_party_urls: false
        fhir_version: R4
    # validation:
    #   requests_enabled: true
    #   responses_enabled: true
    # binary_storage_enabled: true
    inline_resource_storage_below_size: 4000
    # bulk_export_enabled: true
    # subscription:
    #   resthook_enabled: true
    #   websocket_enabled: false
    #   polling_interval_ms: 5000
    #   immediately_queued: false
    #   email:
    #     from: some@test.com
    #     host: google.com
    #     port:
    #     username:
    #     password:
    #     auth:
    #     startTlsEnable:
    #     startTlsRequired:
    #     quitWait:
    # lastn_enabled: true
    # store_resource_in_lucene_index_enabled: true
    ### This is the configuration for normalized quantity search level; the default is 0
    ###   0: NORMALIZED_QUANTITY_SEARCH_NOT_SUPPORTED - default
    ###   1: NORMALIZED_QUANTITY_STORAGE_SUPPORTED
    ###   2: NORMALIZED_QUANTITY_SEARCH_SUPPORTED
    # normalized_quantity_search_level: 2
#elasticsearch:
#  debug:
#    pretty_print_json_log: false
#    refresh_after_write: false
#  enabled: false
#  password: SomePassword
#  required_index_status: YELLOW
#  rest_url: 'localhost:9200'
#  protocol: 'http'
#  schema_management_strategy: CREATE
#  username: SomeUsername
migrations/README (new file, 1 line)
@ -0,0 +1 @@
Single-database configuration for Flask.
migrations/__pycache__/env.cpython-312.pyc (new binary file)
migrations/alembic.ini (new file, 50 lines)
@ -0,0 +1,50 @@
# A generic, single database configuration.

[alembic]
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false


# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic,flask_migrate

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[logger_flask_migrate]
level = INFO
handlers =
qualname = flask_migrate

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
migrations/env.py (new file, 113 lines)
@ -0,0 +1,113 @@
import logging
from logging.config import fileConfig

from flask import current_app

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
logger = logging.getLogger('alembic.env')


def get_engine():
    try:
        # this works with Flask-SQLAlchemy<3 and Alchemical
        return current_app.extensions['migrate'].db.get_engine()
    except (TypeError, AttributeError):
        # this works with Flask-SQLAlchemy>=3
        return current_app.extensions['migrate'].db.engine


def get_engine_url():
    try:
        return get_engine().url.render_as_string(hide_password=False).replace(
            '%', '%%')
    except AttributeError:
        return str(get_engine().url).replace('%', '%%')


# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
config.set_main_option('sqlalchemy.url', get_engine_url())
target_db = current_app.extensions['migrate'].db

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def get_metadata():
    if hasattr(target_db, 'metadatas'):
        return target_db.metadatas[None]
    return target_db.metadata


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url, target_metadata=get_metadata(), literal_binds=True
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """

    # this callback is used to prevent an auto-migration from being generated
    # when there are no changes to the schema
    # reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
    def process_revision_directives(context, revision, directives):
        if getattr(config.cmd_opts, 'autogenerate', False):
            script = directives[0]
            if script.upgrade_ops.is_empty():
                directives[:] = []
                logger.info('No changes in schema detected.')

    conf_args = current_app.extensions['migrate'].configure_args
    if conf_args.get("process_revision_directives") is None:
        conf_args["process_revision_directives"] = process_revision_directives

    connectable = get_engine()

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=get_metadata(),
            **conf_args
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
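The process_revision_directives callback in env.py above drops the generated revision when autogenerate detects no schema changes. That guard can be exercised in isolation with stand-in objects (a sketch; the Fake* classes are mine, and Alembic itself is not required):

```python
class FakeOps:
    """Stand-in for Alembic's UpgradeOps container."""
    def __init__(self, empty):
        self._empty = empty
    def is_empty(self):
        return self._empty

class FakeScript:
    """Stand-in for the generated MigrationScript directive."""
    def __init__(self, empty):
        self.upgrade_ops = FakeOps(empty)

def drop_if_empty(directives):
    """Mirror of the callback body: clear the directives list in place
    when the single generated script contains no upgrade operations."""
    script = directives[0]
    if script.upgrade_ops.is_empty():
        directives[:] = []
    return directives

print(drop_if_empty([FakeScript(empty=True)]))       # [] -> no migration file written
print(len(drop_if_empty([FakeScript(empty=False)]))) # 1  -> revision is kept
```

Because the callback mutates `directives` in place rather than returning a value, Alembic sees an empty list and skips writing the revision file.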
migrations/script.py.mako (new file, 24 lines)
@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
package.py (new file, 123 lines)
@ -0,0 +1,123 @@
from flask import Blueprint, jsonify, current_app, render_template
import os
import tarfile
import json
from datetime import datetime
import time
from services import pkg_version, safe_parse_version

package_bp = Blueprint('package', __name__)

@package_bp.route('/logs/<name>')
def logs(name):
    """
    Fetch logs for a package, listing each version with its publication date.

    Args:
        name (str): The name of the package.

    Returns:
        Rendered template with logs or an error message.
    """
    try:
        in_memory_cache = current_app.config.get('MANUAL_PACKAGE_CACHE', [])
        if not in_memory_cache:
            current_app.logger.error(f"No in-memory cache found for package logs: {name}")
            return "<p class='text-muted'>Package cache not found.</p>"

        package_data = next((pkg for pkg in in_memory_cache if isinstance(pkg, dict) and pkg.get('name', '').lower() == name.lower()), None)
        if not package_data:
            current_app.logger.error(f"Package not found in cache: {name}")
            return "<p class='text-muted'>Package not found.</p>"

        # Get the versions list with pubDate
        versions = package_data.get('all_versions', [])
        if not versions:
            current_app.logger.warning(f"No versions found for package: {name}. Package data: {package_data}")
            return "<p class='text-muted'>No version history found for this package.</p>"

        current_app.logger.debug(f"Found {len(versions)} versions for package {name}: {versions[:5]}...")

        logs = []
        now = time.time()
        for version_info in versions:
            if not isinstance(version_info, dict):
                current_app.logger.warning(f"Invalid version info for {name}: {version_info}")
                continue
            version = version_info.get('version', '')
            pub_date_str = version_info.get('pubDate', '')
            if not version or not pub_date_str:
                current_app.logger.warning(f"Skipping version info with missing version or pubDate: {version_info}")
                continue

            # Parse pubDate and calculate "when"
            when = "Unknown"
            try:
                pub_date = datetime.strptime(pub_date_str, "%a, %d %b %Y %H:%M:%S %Z")
                pub_time = pub_date.timestamp()
                time_diff = now - pub_time
                days_ago = int(time_diff / 86400)
                if days_ago < 1:
                    hours_ago = int(time_diff / 3600)
                    if hours_ago < 1:
                        minutes_ago = int(time_diff / 60)
                        when = f"{minutes_ago} minute{'s' if minutes_ago != 1 else ''} ago"
                    else:
                        when = f"{hours_ago} hour{'s' if hours_ago != 1 else ''} ago"
                else:
                    when = f"{days_ago} day{'s' if days_ago != 1 else ''} ago"
            except ValueError as e:
                current_app.logger.warning(f"Failed to parse pubDate '{pub_date_str}' for version {version}: {e}")

            logs.append({
                "version": version,
                "pubDate": pub_date_str,
                "when": when
            })

        if not logs:
            current_app.logger.warning(f"No valid version entries with pubDate for package: {name}")
            return "<p class='text-muted'>No version history found for this package.</p>"

        # Sort logs by version number (newest first)
        logs.sort(key=lambda x: safe_parse_version(x.get('version', '0.0.0a0')), reverse=True)

        current_app.logger.debug(f"Rendering logs for {name} with {len(logs)} entries")
        return render_template('package.logs.html', logs=logs)

    except Exception as e:
        current_app.logger.error(f"Error in logs endpoint for {name}: {str(e)}", exc_info=True)
        return "<p class='text-danger'>Error loading version history.</p>", 500

@package_bp.route('/dependents/<name>')
def dependents(name):
    """
    HTMX endpoint to fetch packages that depend on the current package.
    Returns an HTML fragment with a table of dependent packages.
    """
    in_memory_cache = current_app.config.get('MANUAL_PACKAGE_CACHE', [])
    package_data = next((pkg for pkg in in_memory_cache if isinstance(pkg, dict) and pkg.get('name', '').lower() == name.lower()), None)

    if not package_data:
        return "<p class='text-danger'>Package not found.</p>"

    # Find dependents: packages whose dependencies include the current package
    dependents = []
    for pkg in in_memory_cache:
        if not isinstance(pkg, dict):
            continue
        dependencies = pkg.get('dependencies', [])
        for dep in dependencies:
            dep_name = dep.get('name', '')
            if dep_name.lower() == name.lower():
                dependents.append({
                    "name": pkg.get('name', 'Unknown'),
                    "version": pkg.get('latest_absolute_version', 'N/A'),
                    "author": pkg.get('author', 'N/A'),
                    "fhir_version": pkg.get('fhir_version', 'N/A'),
                    "version_count": pkg.get('version_count', 0),
                    "canonical": pkg.get('canonical', 'N/A')
                })
                break

    return render_template('package.dependents.html', dependents=dependents)
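The nested minutes/hours/days branching in logs() can be factored into a small standalone helper; a sketch (the function name is mine, not part of the module):

```python
def humanize_age(seconds):
    """Render an age in seconds as 'N minutes/hours/days ago',
    matching the branching used in the logs() view above."""
    days = int(seconds / 86400)
    if days >= 1:
        return f"{days} day{'s' if days != 1 else ''} ago"
    hours = int(seconds / 3600)
    if hours >= 1:
        return f"{hours} hour{'s' if hours != 1 else ''} ago"
    minutes = int(seconds / 60)
    return f"{minutes} minute{'s' if minutes != 1 else ''} ago"

print(humanize_age(90))      # 1 minute ago
print(humanize_age(7200))    # 2 hours ago
print(humanize_age(259200))  # 3 days ago
```

Factoring this out would also make the pluralisation logic testable independently of Flask and the package cache.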
@ -4,3 +4,11 @@ Werkzeug==2.3.7
requests==2.31.0
Flask-WTF==1.2.1
WTForms==3.1.2
Pytest
pyyaml==6.0.1
fhir.resources==8.0.0
Flask-Migrate==4.1.0
cachetools
beautifulsoup4
feedparser==6.0.11
flasgger
routes.py (deleted file, 162 lines)
@ -1,162 +0,0 @@
# app/modules/fhir_ig_importer/routes.py

import requests
import os
import tarfile  # Needed for find_and_extract_sd
import gzip
import json
import io
import re
from flask import (render_template, redirect, url_for, flash, request,
                   current_app, jsonify, send_file)
from flask_login import login_required
from app.decorators import admin_required
from werkzeug.utils import secure_filename
from . import bp
from .forms import IgImportForm
# Import the services module
from . import services
# Import ProcessedIg model for get_structure_definition
from app.models import ProcessedIg
from app import db


# --- Helper: Find/Extract SD ---
# Moved from services.py to be local to the routes that use it, or keep in services and call services.find_and_extract_sd
def find_and_extract_sd(tgz_path, resource_identifier):
    """Helper to find and extract SD json from a given tgz path by ID, Name, or Type."""
    sd_data = None; found_path = None; logger = current_app.logger  # Use current_app logger
    if not tgz_path or not os.path.exists(tgz_path): logger.error(f"File not found in find_and_extract_sd: {tgz_path}"); return None, None
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            logger.debug(f"Searching for SD matching '{resource_identifier}' in {os.path.basename(tgz_path)}")
            for member in tar:
                if member.isfile() and member.name.startswith('package/') and member.name.lower().endswith('.json'):
                    if os.path.basename(member.name).lower() in ['package.json', '.index.json', 'validation-summary.json', 'validation-oo.json']: continue
                    fileobj = None
                    try:
                        fileobj = tar.extractfile(member)
                        if fileobj:
                            content_bytes = fileobj.read(); content_string = content_bytes.decode('utf-8-sig'); data = json.loads(content_string)
                            if isinstance(data, dict) and data.get('resourceType') == 'StructureDefinition':
                                sd_id = data.get('id'); sd_name = data.get('name'); sd_type = data.get('type')
                                if resource_identifier == sd_type or resource_identifier == sd_id or resource_identifier == sd_name:
                                    sd_data = data; found_path = member.name; logger.info(f"Found matching SD for '{resource_identifier}' at path: {found_path}"); break
                    except Exception as e: logger.warning(f"Could not read/parse potential SD {member.name}: {e}")
                    finally:
                        if fileobj: fileobj.close()
        if sd_data is None: logger.warning(f"SD matching '{resource_identifier}' not found within archive {os.path.basename(tgz_path)}")
    except Exception as e: logger.error(f"Error reading archive {tgz_path} in find_and_extract_sd: {e}", exc_info=True); raise
    return sd_data, found_path
# --- End Helper ---
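The helper above can be exercised without a real implementation guide by building a tiny .tgz in the same package/ layout; a self-contained sketch (the file name and sample StructureDefinition are made up, and extract_sd is a stripped-down stand-in for find_and_extract_sd without the Flask logging):

```python
import io
import json
import os
import tarfile
import tempfile

def write_fake_package(tgz_path):
    """Create a minimal FHIR-package-style archive containing one
    StructureDefinition under the package/ prefix."""
    sd = {"resourceType": "StructureDefinition", "id": "my-patient",
          "name": "MyPatient", "type": "Patient"}
    with tarfile.open(tgz_path, "w:gz") as tar:
        payload = json.dumps(sd).encode("utf-8")
        info = tarfile.TarInfo("package/StructureDefinition-my-patient.json")
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))

def extract_sd(tgz_path, identifier):
    """Match a StructureDefinition by id, name, or type, as the
    route helper does."""
    with tarfile.open(tgz_path, "r:gz") as tar:
        for member in tar:
            if not (member.isfile() and member.name.startswith("package/")
                    and member.name.lower().endswith(".json")):
                continue
            data = json.loads(tar.extractfile(member).read().decode("utf-8-sig"))
            if data.get("resourceType") == "StructureDefinition" and \
               identifier in (data.get("id"), data.get("name"), data.get("type")):
                return data, member.name
    return None, None

path = os.path.join(tempfile.mkdtemp(), "fake.pkg.tgz")
write_fake_package(path)
sd, found = extract_sd(path, "Patient")  # match on type
print(sd["id"], found)
```

Decoding with utf-8-sig mirrors the original helper: published packages sometimes ship JSON with a BOM, which plain utf-8 decoding would surface as a leading junk character.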
|
||||
|
||||
# --- Route for the main import page ---
|
||||
@bp.route('/import-ig', methods=['GET', 'POST'])
|
||||
@login_required
|
||||
@admin_required
|
||||
def import_ig():
|
||||
"""Handles FHIR IG recursive download using services."""
|
||||
form = IgImportForm()
|
||||
template_context = {"title": "Import FHIR IG", "form": form, "results": None }
|
||||
if form.validate_on_submit():
|
||||
package_name = form.package_name.data; package_version = form.package_version.data
|
||||
template_context.update(package_name=package_name, package_version=package_version)
|
||||
flash(f"Starting full import for {package_name}#{package_version}...", "info"); current_app.logger.info(f"Calling import service for: {package_name}#{package_version}")
|
||||
try:
|
||||
# Call the CORRECT orchestrator service function
|
||||
import_results = services.import_package_and_dependencies(package_name, package_version)
|
||||
template_context["results"] = import_results
|
||||
# Flash summary messages
|
||||
dl_count = len(import_results.get('downloaded', {})); proc_count = len(import_results.get('processed', set())); error_count = len(import_results.get('errors', []))
|
||||
if dl_count > 0: flash(f"Downloaded/verified {dl_count} package file(s).", "success")
|
||||
if proc_count < dl_count and dl_count > 0 : flash(f"Dependency data extraction failed for {dl_count - proc_count} package(s).", "warning")
|
||||
if error_count > 0: flash(f"{error_count} total error(s) occurred.", "danger")
|
||||
elif dl_count == 0 and error_count == 0: flash("No packages needed downloading or initial package failed.", "info")
|
||||
elif error_count == 0: flash("Import process completed successfully.", "success")
|
||||
except Exception as e:
|
||||
fatal_error = f"Critical unexpected error during import: {e}"; template_context["fatal_error"] = fatal_error; current_app.logger.error(f"Critical import error: {e}", exc_info=True); flash(fatal_error, "danger")
|
||||
return render_template('fhir_ig_importer/import_ig_page.html', **template_context)
|
||||
return render_template('fhir_ig_importer/import_ig_page.html', **template_context)
|
||||
|
||||
|
||||
# --- Route to get StructureDefinition elements ---
@bp.route('/get-structure')
@login_required
@admin_required
def get_structure_definition():
    """API endpoint to fetch SD elements and pre-calculated Must Support paths."""
    package_name = request.args.get('package_name')
    package_version = request.args.get('package_version')
    resource_identifier = request.args.get('resource_type')

    error_response_data = {"elements": [], "must_support_paths": []}
    if not all([package_name, package_version, resource_identifier]):
        error_response_data["error"] = "Missing required query parameters: package_name, package_version, resource_type"
        return jsonify(error_response_data), 400
    current_app.logger.info(f"Request for structure: {package_name}#{package_version} / {resource_identifier}")

    # Find the primary package file
    package_dir_name = 'fhir_packages'
    download_dir = os.path.join(current_app.instance_path, package_dir_name)
    # Use the service helper so filenames stay consistent with the importer
    filename = services._construct_tgz_filename(package_name, package_version)
    tgz_path = os.path.join(download_dir, filename)
    if not os.path.exists(tgz_path):
        error_response_data["error"] = f"Package file not found: {filename}"
        return jsonify(error_response_data), 404

    sd_data = None
    found_path = None
    try:
        # Search the requested package first
        sd_data, found_path = find_and_extract_sd(tgz_path, resource_identifier)
        # Fall back to the core package if the SD was not found
        if sd_data is None:
            core_pkg_name = "hl7.fhir.r4.core"
            core_pkg_version = "4.0.1"  # TODO: Make dynamic
            core_filename = services._construct_tgz_filename(core_pkg_name, core_pkg_version)
            core_tgz_path = os.path.join(download_dir, core_filename)
            if os.path.exists(core_tgz_path):
                current_app.logger.info(f"Trying fallback search in {core_pkg_name}...")
                sd_data, found_path = find_and_extract_sd(core_tgz_path, resource_identifier)
            else:
                current_app.logger.warning(f"Core package {core_tgz_path} not found.")
    except Exception as e:
        error_msg = f"Error searching package(s): {e}"
        current_app.logger.error(error_msg, exc_info=True)
        error_response_data["error"] = error_msg
        return jsonify(error_response_data), 500

    if sd_data is None:
        error_response_data["error"] = f"StructureDefinition for '{resource_identifier}' not found."
        return jsonify(error_response_data), 404

    # Extract elements: prefer the snapshot, fall back to the differential
    elements = sd_data.get('snapshot', {}).get('element', [])
    if not elements:
        elements = sd_data.get('differential', {}).get('element', [])

    # Fetch pre-calculated Must Support paths from the DB
    must_support_paths = []
    try:
        stmt = db.select(ProcessedIg).filter_by(package_name=package_name, package_version=package_version)
        processed_ig_record = db.session.scalar(stmt)
        if processed_ig_record:
            all_ms_paths_dict = processed_ig_record.must_support_elements
            must_support_paths = all_ms_paths_dict.get(resource_identifier, [])
        else:
            current_app.logger.warning(f"No ProcessedIg record found for {package_name}#{package_version}")
    except Exception as e:
        current_app.logger.error(f"Error fetching MS paths from DB: {e}", exc_info=True)

    current_app.logger.info(f"Returning {len(elements)} elements for {resource_identifier} from {found_path or 'Unknown File'}")
    return jsonify({"elements": elements, "must_support_paths": must_support_paths})

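The snapshot-then-differential fallback used when extracting elements can be isolated as a small pure function; the helper name below is illustrative, not part of the codebase:

```python
def extract_elements(sd: dict) -> list:
    """Prefer the StructureDefinition's snapshot elements; fall back to the differential."""
    elements = sd.get('snapshot', {}).get('element', [])
    if not elements:
        elements = sd.get('differential', {}).get('element', [])
    return elements

print(extract_elements({'snapshot': {'element': [{'path': 'Patient'}]}}))  # [{'path': 'Patient'}]
print(extract_elements({}))  # []
```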
# --- Route to get raw example file content ---
@bp.route('/get-example')
@login_required
@admin_required
def get_example_content():
    """API endpoint to return the raw bytes of an example file from a package."""
    package_name = request.args.get('package_name')
    package_version = request.args.get('package_version')
    example_member_path = request.args.get('filename')
    if not all([package_name, package_version, example_member_path]):
        return jsonify({"error": "Missing required query parameters: package_name, package_version, filename"}), 400
    current_app.logger.info(f"Request for example: {package_name}#{package_version} / {example_member_path}")

    package_dir_name = 'fhir_packages'
    download_dir = os.path.join(current_app.instance_path, package_dir_name)
    pkg_filename = services._construct_tgz_filename(package_name, package_version)  # Use service helper
    tgz_path = os.path.join(download_dir, pkg_filename)
    if not os.path.exists(tgz_path):
        return jsonify({"error": f"Package file not found: {pkg_filename}"}), 404

    # Basic security check: only allow members under package/ and reject path traversal
    if not example_member_path.startswith('package/') or '..' in example_member_path:
        return jsonify({"error": "Invalid example file path."}), 400

    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            try:
                example_member = tar.getmember(example_member_path)
            except KeyError:
                return jsonify({"error": f"Example file '{example_member_path}' not found."}), 404
            example_fileobj = tar.extractfile(example_member)
            if not example_fileobj:
                return jsonify({"error": "Could not extract example file."}), 500
            try:
                content_bytes = example_fileobj.read()
            finally:
                example_fileobj.close()
        return content_bytes  # Flask accepts raw bytes as the response body
    except tarfile.TarError as e:
        err_msg = f"Error reading {tgz_path}: {e}"
        current_app.logger.error(err_msg)
        return jsonify({"error": err_msg}), 500
    except Exception as e:
        err_msg = f"Unexpected error getting example {example_member_path}: {e}"
        current_app.logger.error(err_msg, exc_info=True)
        return jsonify({"error": err_msg}), 500
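The member-path guard in `get_example_content` reduces to a small predicate; a standalone sketch of the same rule (the helper name is illustrative, not part of the codebase):

```python
def is_safe_member_path(path: str) -> bool:
    """Accept only archive members under package/ with no upward traversal."""
    return path.startswith('package/') and '..' not in path

print(is_safe_member_path('package/Patient-example.json'))  # True
print(is_safe_member_path('package/../../etc/passwd'))      # False
```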
 5376  services.py
  395  setup_linux.sh (new file)
@@ -0,0 +1,395 @@
#!/bin/bash

# --- Configuration ---
REPO_URL_HAPI="https://github.com/hapifhir/hapi-fhir-jpaserver-starter.git"
REPO_URL_CANDLE="https://github.com/FHIR/fhir-candle.git"
CLONE_DIR_HAPI="hapi-fhir-jpaserver"
CLONE_DIR_CANDLE="fhir-candle"
SOURCE_CONFIG_DIR="hapi-fhir-Setup"
CONFIG_FILE="application.yaml"

# --- Define Paths ---
SOURCE_CONFIG_PATH="../${SOURCE_CONFIG_DIR}/target/classes/${CONFIG_FILE}"
DEST_CONFIG_PATH="${CLONE_DIR_HAPI}/target/classes/${CONFIG_FILE}"

APP_MODE=""
CUSTOM_FHIR_URL_VAL=""
SERVER_TYPE=""
CANDLE_FHIR_VERSION=""

# --- Error Handling Function ---
handle_error() {
    echo "------------------------------------"
    echo "An error occurred: $1"
    echo "Script aborted."
    echo "------------------------------------"
    exit 1
}

# === MODIFIED: Prompt for Installation Mode ===
get_mode_choice() {
    echo ""
    echo "Select Installation Mode:"
    echo "1. Lite (Excludes local HAPI FHIR Server - No Git/Maven/Dotnet needed)"
    echo "2. Custom URL (Uses a custom FHIR Server - No Git/Maven/Dotnet needed)"
    echo "3. Hapi (Includes local HAPI FHIR Server - Requires Git & Maven)"
    echo "4. Candle (Includes local FHIR Candle Server - Requires Git & Dotnet)"

    while true; do
        read -r -p "Enter your choice (1, 2, 3, or 4): " choice
        case "$choice" in
            1)
                APP_MODE="lite"
                break
                ;;
            2)
                APP_MODE="standalone"
                get_custom_url_prompt
                break
                ;;
            3)
                APP_MODE="standalone"
                SERVER_TYPE="hapi"
                break
                ;;
            4)
                APP_MODE="standalone"
                SERVER_TYPE="candle"
                get_candle_fhir_version
                break
                ;;
            *)
                echo "Invalid input. Please try again."
                ;;
        esac
    done
    echo "Selected Mode: $APP_MODE"
    echo "Server Type: $SERVER_TYPE"
    echo
}

# === NEW: Prompt for Custom URL ===
get_custom_url_prompt() {
    local confirmed_url=""
    while true; do
        echo
        read -r -p "Please enter the custom FHIR server URL: " custom_url_input
        echo
        echo "You entered: $custom_url_input"
        read -r -p "Is this URL correct? (Y/N): " confirm_url
        if [[ "$confirm_url" =~ ^[Yy]$ ]]; then
            confirmed_url="$custom_url_input"
            break
        else
            echo "URL not confirmed. Please re-enter."
        fi
    done

    while true; do
        echo
        read -r -p "Please re-enter the URL to confirm it is correct: " custom_url_input
        if [ "$custom_url_input" = "$confirmed_url" ]; then
            CUSTOM_FHIR_URL_VAL="$custom_url_input"
            echo
            echo "Custom URL confirmed: $CUSTOM_FHIR_URL_VAL"
            break
        else
            echo
            echo "URLs do not match. Please try again."
            confirmed_url="$custom_url_input"
        fi
    done
}

# === NEW: Prompt for Candle FHIR version ===
get_candle_fhir_version() {
    echo ""
    echo "Select the FHIR version for the Candle server:"
    echo "1. R4 (4.0)"
    echo "2. R4B (4.3)"
    echo "3. R5 (5.0)"
    while true; do
        read -r -p "Enter your choice (1, 2, or 3): " choice
        case "$choice" in
            1)
                CANDLE_FHIR_VERSION=r4
                break
                ;;
            2)
                CANDLE_FHIR_VERSION=r4b
                break
                ;;
            3)
                CANDLE_FHIR_VERSION=r5
                break
                ;;
            *)
                echo "Invalid input. Please try again."
                ;;
        esac
    done
}

# Call the function to get mode choice
get_mode_choice

# === Conditionally Execute Server Setup ===
case "$SERVER_TYPE" in
    "hapi")
        echo "Running Hapi server setup..."
        echo

        # --- Step 0: Clean up previous clone (optional) ---
        echo "Checking for existing directory: $CLONE_DIR_HAPI"
        if [ -d "$CLONE_DIR_HAPI" ]; then
            echo "Found existing directory, removing it..."
            rm -rf "$CLONE_DIR_HAPI"
            if [ $? -ne 0 ]; then
                handle_error "Failed to remove existing directory: $CLONE_DIR_HAPI"
            fi
            echo "Existing directory removed."
        else
            echo "Directory does not exist, proceeding with clone."
        fi
        echo

        # --- Step 1: Clone the HAPI FHIR server repository ---
        echo "Cloning repository: $REPO_URL_HAPI into $CLONE_DIR_HAPI..."
        git clone "$REPO_URL_HAPI" "$CLONE_DIR_HAPI"
        if [ $? -ne 0 ]; then
            handle_error "Failed to clone repository. Check Git installation and network connection."
        fi
        echo "Repository cloned successfully."
        echo

        # --- Step 2: Navigate into the cloned directory ---
        echo "Changing directory to $CLONE_DIR_HAPI..."
        cd "$CLONE_DIR_HAPI" || handle_error "Failed to change directory to $CLONE_DIR_HAPI."
        echo "Current directory: $(pwd)"
        echo

        # --- Step 3: Build the HAPI server using Maven ---
        echo "===> Starting Maven build (Step 3)..."
        mvn clean package -DskipTests=true -Pboot
        if [ $? -ne 0 ]; then
            echo "ERROR: Maven build failed."
            cd ..
            handle_error "Maven build process resulted in an error."
        fi
        echo "Maven build completed successfully."
        echo

        # --- Step 4: Copy the configuration file ---
        echo "===> Starting file copy (Step 4)..."
        echo "Copying configuration file..."
        INITIAL_SCRIPT_DIR=$(pwd)
        ABSOLUTE_SOURCE_CONFIG_PATH="${INITIAL_SCRIPT_DIR}/../${SOURCE_CONFIG_DIR}/target/classes/${CONFIG_FILE}"

        echo "Source: $ABSOLUTE_SOURCE_CONFIG_PATH"
        echo "Destination: target/classes/$CONFIG_FILE"

        if [ ! -f "$ABSOLUTE_SOURCE_CONFIG_PATH" ]; then
            echo "WARNING: Source configuration file not found at $ABSOLUTE_SOURCE_CONFIG_PATH."
            echo "The script will continue, but the server might use default configuration."
        else
            cp "$ABSOLUTE_SOURCE_CONFIG_PATH" "target/classes/"
            if [ $? -ne 0 ]; then
                echo "WARNING: Failed to copy configuration file. Check that the source file exists and is readable."
                echo "The script will continue, but the server might use default configuration."
            else
                echo "Configuration file copied successfully."
            fi
        fi
        echo

        # --- Step 5: Navigate back to the parent directory ---
        echo "===> Changing directory back (Step 5)..."
        cd .. || handle_error "Failed to change back to the parent directory."
        echo "Current directory: $(pwd)"
        echo
        ;;

    "candle")
        echo "Running FHIR Candle server setup..."
        echo

        # --- Step 0: Clean up previous clone (optional) ---
        echo "Checking for existing directory: $CLONE_DIR_CANDLE"
        if [ -d "$CLONE_DIR_CANDLE" ]; then
            echo "Found existing directory, removing it..."
            rm -rf "$CLONE_DIR_CANDLE"
            if [ $? -ne 0 ]; then
                handle_error "Failed to remove existing directory: $CLONE_DIR_CANDLE"
            fi
            echo "Existing directory removed."
        else
            echo "Directory does not exist, proceeding with clone."
        fi
        echo

        # --- Step 1: Clone the FHIR Candle server repository ---
        echo "Cloning repository: $REPO_URL_CANDLE into $CLONE_DIR_CANDLE..."
        git clone "$REPO_URL_CANDLE" "$CLONE_DIR_CANDLE"
        if [ $? -ne 0 ]; then
            handle_error "Failed to clone repository. Check Git and Dotnet SDK installation and network connection."
        fi
        echo "Repository cloned successfully."
        echo

        # --- Step 2: Navigate into the cloned directory ---
        echo "Changing directory to $CLONE_DIR_CANDLE..."
        cd "$CLONE_DIR_CANDLE" || handle_error "Failed to change directory to $CLONE_DIR_CANDLE."
        echo "Current directory: $(pwd)"
        echo

        # --- Step 3: Build the FHIR Candle server using Dotnet ---
        echo "===> Starting Dotnet build (Step 3)..."
        dotnet publish -c Release -f net9.0 -o publish
        if [ $? -ne 0 ]; then
            handle_error "Dotnet build failed. Check Dotnet SDK installation."
        fi
        echo "Dotnet build completed successfully."
        echo

        # --- Step 4: Navigate back to the parent directory ---
        echo "===> Changing directory back (Step 4)..."
        cd .. || handle_error "Failed to change back to the parent directory."
        echo "Current directory: $(pwd)"
        echo
        ;;

    *) # Lite mode or custom URL; no local server to build
        echo "Running Lite setup, skipping server build..."
        if [ -d "$CLONE_DIR_HAPI" ]; then
            echo "Found existing HAPI directory in Lite mode. Removing it to avoid build issues..."
            rm -rf "$CLONE_DIR_HAPI"
        fi
        if [ -d "$CLONE_DIR_CANDLE" ]; then
            echo "Found existing Candle directory in Lite mode. Removing it to avoid build issues..."
            rm -rf "$CLONE_DIR_CANDLE"
        fi
        mkdir -p "${CLONE_DIR_HAPI}/target/classes"
        mkdir -p "${CLONE_DIR_HAPI}/custom"
        touch "${CLONE_DIR_HAPI}/target/ROOT.war"
        touch "${CLONE_DIR_HAPI}/target/classes/application.yaml"
        mkdir -p "${CLONE_DIR_CANDLE}/publish"
        touch "${CLONE_DIR_CANDLE}/publish/fhir-candle.dll"
        echo "Placeholder files and directories created for Lite mode build."
        echo
        ;;
esac

# === MODIFIED: Update docker-compose.yml to set APP_MODE, HAPI_FHIR_URL, and the Dockerfile ===
echo "Updating docker-compose.yml with APP_MODE=$APP_MODE and HAPI_FHIR_URL..."
DOCKER_COMPOSE_TMP="docker-compose.yml.tmp"
DOCKER_COMPOSE_ORIG="docker-compose.yml"

# Precedence: explicit custom URL > local Candle server > local HAPI server
if [ -n "$CUSTOM_FHIR_URL_VAL" ]; then
    HAPI_URL_TO_USE="$CUSTOM_FHIR_URL_VAL"
elif [ "$SERVER_TYPE" = "candle" ]; then
    HAPI_URL_TO_USE="http://localhost:5826/fhir/${CANDLE_FHIR_VERSION}"
else
    HAPI_URL_TO_USE="http://localhost:8080/fhir"
fi
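The URL precedence above (custom URL first, then Candle, then the local HAPI default) can be mirrored as a small pure function; this Python paraphrase is illustrative only and not part of the repository:

```python
def pick_fhir_url(custom_url: str, server_type: str, candle_version: str) -> str:
    """Mirror the script's precedence: custom URL > candle > default local HAPI."""
    if custom_url:
        return custom_url
    if server_type == "candle":
        return f"http://localhost:5826/fhir/{candle_version}"
    return "http://localhost:8080/fhir"

print(pick_fhir_url("", "candle", "r4"))  # http://localhost:5826/fhir/r4
print(pick_fhir_url("https://fhir.example.org", "hapi", ""))  # https://fhir.example.org
```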

DOCKERFILE_TO_USE="Dockerfile.lite"
if [ "$SERVER_TYPE" = "hapi" ]; then
    DOCKERFILE_TO_USE="Dockerfile.hapi"
elif [ "$SERVER_TYPE" = "candle" ]; then
    DOCKERFILE_TO_USE="Dockerfile.candle"
fi

cat << EOF > "$DOCKER_COMPOSE_TMP"
version: '3.8'
services:
  fhirflare:
    build:
      context: .
      dockerfile: ${DOCKERFILE_TO_USE}
    ports:
EOF

if [ "$SERVER_TYPE" = "candle" ]; then
    cat << EOF >> "$DOCKER_COMPOSE_TMP"
      - "5000:5000"
      - "5001:5826"
EOF
else
    cat << EOF >> "$DOCKER_COMPOSE_TMP"
      - "5000:5000"
      - "8080:8080"
EOF
fi

cat << EOF >> "$DOCKER_COMPOSE_TMP"
    volumes:
      - ./instance:/app/instance
      - ./static/uploads:/app/static/uploads
      - ./logs:/app/logs
EOF

if [ "$SERVER_TYPE" = "hapi" ]; then
    cat << EOF >> "$DOCKER_COMPOSE_TMP"
      - ./instance/hapi-h2-data/:/app/h2-data # Keep volume mounts consistent
      - ./hapi-fhir-jpaserver/target/ROOT.war:/usr/local/tomcat/webapps/ROOT.war
      - ./hapi-fhir-jpaserver/target/classes/application.yaml:/usr/local/tomcat/conf/application.yaml
EOF
elif [ "$SERVER_TYPE" = "candle" ]; then
    cat << EOF >> "$DOCKER_COMPOSE_TMP"
      - ./fhir-candle/publish/:/app/fhir-candle-publish/
EOF
fi

cat << EOF >> "$DOCKER_COMPOSE_TMP"
    environment:
      - FLASK_APP=app.py
      - FLASK_ENV=development
      - NODE_PATH=/usr/lib/node_modules
      - APP_MODE=${APP_MODE}
      - APP_BASE_URL=http://localhost:5000
      - HAPI_FHIR_URL=${HAPI_URL_TO_USE}
EOF

if [ "$SERVER_TYPE" = "candle" ]; then
    cat << EOF >> "$DOCKER_COMPOSE_TMP"
      - ASPNETCORE_URLS=http://0.0.0.0:5826
EOF
fi

cat << EOF >> "$DOCKER_COMPOSE_TMP"
    command: supervisord -c /etc/supervisord.conf
EOF

if [ ! -f "$DOCKER_COMPOSE_TMP" ]; then
    handle_error "Failed to create temporary docker-compose file ($DOCKER_COMPOSE_TMP)."
fi

# Replace the original docker-compose.yml
mv "$DOCKER_COMPOSE_TMP" "$DOCKER_COMPOSE_ORIG"
echo "docker-compose.yml updated successfully."
echo

# --- Step 6: Build Docker images ---
echo "===> Starting Docker build (Step 6)..."
docker-compose build --no-cache
if [ $? -ne 0 ]; then
    handle_error "Docker Compose build failed. Check Docker installation and docker-compose.yml file."
fi
echo "Docker images built successfully."
echo

# --- Step 7: Start Docker containers ---
echo "===> Starting Docker containers (Step 7)..."
docker-compose up -d
if [ $? -ne 0 ]; then
    handle_error "Docker Compose up failed. Check Docker installation and container configurations."
fi
echo "Docker containers started successfully."
echo

echo "===================================="
echo "Script finished successfully! (Mode: $APP_MODE)"
echo "===================================="
exit 0
  BIN  (image resized) Before: 1.5 MiB | After: 706 KiB
  BIN  static/FHIRFLARE.png.old (new file, 1.5 MiB)
    1  static/animations/Validation abstract.json (new file)
    1  static/animations/import-fire.json (new file)
    1  static/animations/loading-dark.json (new file)
49070  static/animations/loading-light.json (new file)
  390  static/css/animation.css (new file)
@@ -0,0 +1,390 @@
body {
    width: 100%;
    font-family: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
}

/* Apply overflow and height constraints only for fire animation page
body.fire-animation-page {
    overflow: hidden;
    height: 100vh;
}
*/
/* Fire animation overlay */
.fire-on {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    background: linear-gradient(#1d4456, #112630);
    opacity: 1;
    z-index: 1;
    transition: all 1200ms linear;
}

.section-center {
    position: relative;
    width: 300px; /* Reduced size for landing page */
    height: 300px;
    margin: 0 auto;
    display: block;
    overflow: hidden;
    border: 8px solid rgba(0,0,0,.2);
    border-radius: 50%;
    z-index: 5;
    background-color: #1d4456;
    box-shadow: 0 0 50px 5px rgba(255,148,0,.1);
    transition: all 500ms linear;
}

/* Wood and star using local images.
   Note: Jinja expressions like url_for() are not rendered in static CSS
   files, so paths relative to static/css/ are used instead. */
.wood {
    position: absolute;
    z-index: 21;
    left: 50%;
    bottom: 12%;
    width: 80px;
    margin-left: -40px;
    height: 30px;
    background-image: url('../img/wood.png');
    background-size: 80px 30px;
    border-radius: 5px;
}

.star {
    z-index: 2;
    position: absolute;
    top: 138px;
    left: 18px;
    background-image: url('../img/star.png');
    background-size: 11px 11px;
    width: 11px;
    height: 11px;
    opacity: 0.4;
    animation: starShine 3.5s linear infinite;
    transition: all 1200ms linear;
}

.wood-circle {
    position: absolute;
    z-index: 20;
    left: 50%;
    bottom: 11%;
    width: 100px;
    margin-left: -50px;
    height: 20px;
    border-radius: 100%;
    background-color: #0a171d;
}

.circle {
    position: absolute;
    z-index: 6;
    right: -225px;
    bottom: -337px;
    width: 562px;
    height: 525px;
    border-radius: 100%;
    background-color: #112630;
}

/* Moon */
.moon {
    position: absolute;
    top: 37px;
    left: 86px;
    width: 60px;
    height: 60px;
    background-color: #b2b7bc;
    border-radius: 50%;
    box-shadow: inset -15px 1.5px 0 0px #c0c3c9, 0 0 7px 3px rgba(228,228,222,.4);
    z-index: 1;
    animation: brilla-moon 4s alternate infinite;
    transition: all 2000ms linear;
}

.moon div:nth-child(1) {
    position: absolute;
    top: 50%;
    left: 10%;
    width: 12%;
    height: 12%;
    border-radius: 50%;
    border: 1px solid #adaca2;
    box-shadow: inset 1.5px -0.75px 0 0px #85868b;
    opacity: 0.4;
}

.moon div:nth-child(2) {
    position: absolute;
    top: 20%;
    left: 38%;
    width: 16%;
    height: 16%;
    border-radius: 50%;
    border: 1px solid #adaca2;
    box-shadow: inset 1.5px -0.75px 0 0px #85868b;
    opacity: 0.4;
}

.moon div:nth-child(3) {
    position: absolute;
    top: 60%;
    left: 45%;
    width: 20%;
    height: 20%;
    border-radius: 50%;
    border: 1px solid #adaca2;
    box-shadow: inset 1.5px -0.75px 0 0px #85868b;
    opacity: 0.4;
}

@keyframes brilla-moon {
    0% { box-shadow: inset -15px 1.5px 0 0px #c0c3c9, 0 0 7px 3px rgba(228,228,222,.4); }
    50% { box-shadow: inset -15px 1.5px 0 0px #c0c3c9, 0 0 11px 6px rgba(228,228,222,.4); }
}

/* Shooting stars */
.shooting-star {
    z-index: 2;
    width: 1px;
    height: 37px;
    border-bottom-left-radius: 50%;
    border-bottom-right-radius: 50%;
    position: absolute;
    top: 0;
    left: -52px;
    background: linear-gradient(to bottom, rgba(255, 255, 255, 0), white);
    animation: animShootingStar 6s linear infinite;
    transition: all 2000ms linear;
}

@keyframes animShootingStar {
    from { transform: translateY(0px) translateX(0px) rotate(-45deg); opacity: 1; height: 3px; }
    to { transform: translateY(960px) translateX(960px) rotate(-45deg); opacity: 1; height: 600px; }
}

.shooting-star-2 {
    z-index: 2;
    width: 1px;
    height: 37px;
    border-bottom-left-radius: 50%;
    border-bottom-right-radius: 50%;
    position: absolute;
    top: 0;
    left: 150px;
    background: linear-gradient(to bottom, rgba(255, 255, 255, 0), white);
    animation: animShootingStar-2 9s linear infinite;
    transition: all 2000ms linear;
}

@keyframes animShootingStar-2 {
    from { transform: translateY(0px) translateX(0px) rotate(-45deg); opacity: 1; height: 3px; }
    to { transform: translateY(1440px) translateX(1440px) rotate(-45deg); opacity: 1; height: 600px; }
}

/* Stars */
.star.snd { top: 75px; left: 232px; animation-delay: 1s; }
.star.trd { top: 97px; left: 75px; animation-delay: 1.4s; }
.star.fth { top: 15px; left: 150px; animation-delay: 1.8s; }
.star.fith { top: 63px; left: 165px; animation-delay: 2.2s; }

@keyframes starShine {
    0% { transform: scale(0.3) rotate(0deg); opacity: 0.4; }
    25% { transform: scale(1) rotate(360deg); opacity: 1; }
    50% { transform: scale(0.3) rotate(720deg); opacity: 0.4; }
    100% { transform: scale(0.3) rotate(0deg); opacity: 0.4; }
}

/* Trees */
.tree-1 {
    position: relative;
    top: 112px;
    left: 37px;
    width: 0;
    height: 0;
    z-index: 8;
    border-bottom: 67px solid #0a171d;
    border-left: 22px solid transparent;
    border-right: 22px solid transparent;
}

.tree-1:before {
    position: absolute;
    bottom: -82px;
    left: 50%;
    margin-left: -3px;
    width: 6px;
    height: 22px;
    z-index: 7;
    content: '';
    background-color: #000;
}

.tree-2 {
    position: relative;
    top: 0;
    left: 187px;
    width: 0;
    height: 0;
    z-index: 8;
    border-bottom: 67px solid #0a171d;
    border-left: 22px solid transparent;
    border-right: 22px solid transparent;
}

.tree-2:before {
    position: absolute;
    bottom: -82px;
    left: 50%;
    margin-left: -3px;
    width: 6px;
    height: 22px;
    z-index: 7;
    content: '';
    background-color: #000;
}

/* Fire */
.fire {
    position: absolute;
    z-index: 39;
    width: 2px;
    margin-left: -1px;
    left: 50%;
    bottom: 60px;
    transition: all 1200ms linear;
}

.fire span {
    display: block;
    position: absolute;
    bottom: -11px;
    margin-left: -15px;
    height: 0;
    width: 0;
    border: 22px solid #febd08; /* Main flame: yellow-orange */
    border-radius: 50%;
    border-top-left-radius: 0;
    left: -6px;
    box-shadow: 0 0 7px 3px rgba(244,110,28,0.8), 0 0 15px 7px rgba(244,110,28,0.6), 0 0 22px 11px rgba(244,110,28,0.3);
    transform: scale(0.45, 0.75) rotate(45deg);
    animation: brilla-fire 2.5s alternate infinite;
    z-index: 9;
    transition: all 1200ms linear;
}

.fire span:nth-child(2) {
    left: -16px;
    border: 22px solid #e63946; /* Outside flame: red */
    box-shadow: 0 0 7px 3px rgba(230,57,70,0.8), 0 0 15px 7px rgba(230,57,70,0.6), 0 0 22px 11px rgba(230,57,70,0.3);
    transform: scale(0.3, 0.55) rotate(15deg);
    z-index: 8;
    animation: brilla-fire-red 1.5s alternate infinite;
}

.fire span:nth-child(3) {
    left: 3px;
    border: 22px solid #e63946; /* Outside flame: red */
    box-shadow: 0 0 7px 3px rgba(230,57,70,0.8), 0 0 15px 7px rgba(230,57,70,0.6), 0 0 22px 11px rgba(230,57,70,0.3);
    transform: scale(0.3, 0.55) rotate(80deg);
    z-index: 8;
    animation: brilla-fire-red 2s alternate infinite;
}

.fire span:after {
    display: block;
    position: absolute;
    bottom: -22px;
    content: '';
    margin-left: -3px;
    height: 22px;
    width: 9px;
    background-color: rgba(244,110,28,0.7);
    border-radius: 80px;
    border-top-right-radius: 0;
    border-bottom-right-radius: 0;
    box-shadow: 0 0 15px 7px rgba(244,110,28,0.7);
    left: -6px;
    opacity: 0.8;
    transform: rotate(-50deg);
}

.fire span:nth-child(2):after {
    background-color: rgba(230,57,70,0.7); /* Match red flame */
    box-shadow: 0 0 15px 7px rgba(230,57,70,0.7);
}

.fire span:nth-child(3):after {
    background-color: rgba(230,57,70,0.7); /* Match red flame */
    box-shadow: 0 0 15px 7px rgba(230,57,70,0.7);
}

@keyframes brilla-fire {
    0%, 100% { box-shadow: 0 0 7px 3px rgba(244,110,28,0.8), 0 0 15px 7px rgba(244,110,28,0.6), 0 0 22px 11px rgba(244,110,28,0.3); }
    50% { box-shadow: 0 0 10px 5px rgba(244,110,28,0.8), 0 0 21px 10px rgba(244,110,28,0.6), 0 0 31px 15px rgba(244,110,28,0.3); }
}

@keyframes brilla-fire-red {
    0%, 100% { box-shadow: 0 0 7px 3px rgba(230,57,70,0.8), 0 0 15px 7px rgba(230,57,70,0.6), 0 0 22px 11px rgba(230,57,70,0.3); }
    50% { box-shadow: 0 0 10px 5px rgba(230,57,70,0.8), 0 0 21px 10px rgba(230,57,70,0.6), 0 0 31px 15px rgba(230,57,70,0.3); }
}

/* Smoke */
.smoke {
    position: absolute;
    z-index: 40;
    width: 2px;
    margin-left: -1px;
    left: 50%;
    bottom: 79px;
    opacity: 0;
    transition: all 800ms linear;
}

.smoke span {
    display: block;
    position: absolute;
    bottom: -26px;
    left: -6px;
    margin-left: -15px;
    height: 0;
    width: 0;
    border: 22px solid rgba(0, 0, 0, .6);
    border-radius: 16px;
    border-bottom-left-radius: 0;
    border-top-right-radius: 0;
    opacity: 0;
    transform: scale(0.2, 0.2) rotate(-45deg);
}

@keyframes smokeLeft {
    0% { transform: scale(0.2, 0.2) translate(0, 0) rotate(-45deg); }
    10% { opacity: 1; transform: scale(0.2, 0.3) translate(0, -3px) rotate(-45deg); }
    60% { opacity: 0.6; transform: scale(0.3, 0.5) translate(-7px, -60px) rotate(-45deg); }
    100% { opacity: 0; transform: scale(0.4, 0.8) translate(-15px, -90px) rotate(-45deg); }
}

@keyframes smokeRight {
    0% { transform: scale(0.2, 0.2) translate(0, 0) rotate(-45deg); }
    10% { opacity: 1; transform: scale(0.2, 0.3) translate(0, -3px) rotate(-45deg); }
    60% { opacity: 0.6; transform: scale(0.3, 0.5) translate(7px, -60px) rotate(-45deg); }
    100% { opacity: 0; transform: scale(0.4, 0.8) translate(15px, -90px) rotate(-45deg); }
}

.smoke .s-0 { animation: smokeLeft 7s 0s infinite; }
.smoke .s-1 { animation: smokeRight 7s 0.7s infinite; }
.smoke .s-2 { animation: smokeLeft 7s 1.4s infinite; }
.smoke .s-3 { animation: smokeRight 7s 2.1s infinite; }
.smoke .s-4 { animation: smokeLeft 7s 2.8s infinite; }
.smoke .s-5 { animation: smokeRight 7s 3.5s infinite; }
.smoke .s-6 { animation: smokeLeft 7s 4.2s infinite; }
.smoke .s-7 { animation: smokeRight 7s 4.9s infinite; }
.smoke .s-8 { animation: smokeLeft 7s 5.6s infinite; }
.smoke .s-9 { animation: smokeRight 7s 6.3s infinite; }

/* Fire-off state (light theme) */
body:not(.fire-on) .section-center { box-shadow: 0 0 50px 5px rgba(200,200,200,.2); }
body:not(.fire-on) .smoke { opacity: 1; transition-delay: 0.8s; }
body:not(.fire-on) .fire span { bottom: -26px; transform: scale(0.15, 0.15) rotate(45deg); }
  223  static/css/fire-animation.css (new file)
@@ -0,0 +1,223 @@
/* /app/static/css/fire-animation.css */

/* Removed html, body, stage styles */

.minifire-container { /* Add a wrapper for positioning/sizing */
  display: flex;
  align-items: center;
  justify-content: center;
  position: relative;
  height: 130px; /* Overall height of the animation area */
  overflow: hidden; /* Hide parts extending beyond the container */
  background-color: #270537; /* Optional: Add a background */
  border-radius: 4px;
  border: 1px solid #444;
  margin-top: 0.5rem; /* Space above animation */
}

.minifire-campfire {
  position: relative;
  /* Base size significantly reduced (original was 600px) */
  width: 150px;
  height: 150px;
  transform-origin: bottom center;
  /* Scale down slightly more if needed, adjusted positioning based on origin */
  transform: scale(0.8) translateY(15px); /* Pushes it down slightly */
}

/* --- Scaled Down Logs --- */
.minifire-log {
  position: absolute;
  width: 60px; /* 238/4 */
  height: 18px; /* 70/4 */
  border-radius: 8px; /* 32/4 */
  background: #781e20;
  overflow: hidden;
  opacity: 0.99;
  transform-origin: center center;
  box-shadow: 0 0 1px 0.5px rgba(0,0,0,0.15); /* Scaled shadow */
}

.minifire-log:before {
  content: '';
  display: block;
  position: absolute;
  top: 50%;
  left: 9px; /* 35/4 */
  width: 2px; /* 8/4 */
  height: 2px; /* 8/4 */
  border-radius: 8px; /* 32/4 */
  background: #b35050;
  transform: translate(-50%, -50%);
  z-index: 3;
  /* Scaled box-shadows */
  box-shadow: 0 0 0 0.5px #781e20, /* 2.5/4 -> 0.6 -> 0.5 */
              0 0 0 2.5px #b35050, /* 10.5/4 -> 2.6 -> 2.5 */
              0 0 0 3.5px #781e20, /* 13/4 -> 3.25 -> 3.5 */
              0 0 0 5.5px #b35050, /* 21/4 -> 5.25 -> 5.5 */
              0 0 0 6px #781e20,   /* 23.5/4 -> 5.9 -> 6 */
              0 0 0 8px #b35050;   /* 31.5/4 -> 7.9 -> 8 */
}

.minifire-streak {
  position: absolute;
  height: 1px; /* Min height */
  border-radius: 5px; /* 20/4 */
  background: #b35050;
}
/* Scaled streaks */
.minifire-streak:nth-child(1) { top: 3px; width: 23px; } /* 10/4, 90/4 */
.minifire-streak:nth-child(2) { top: 3px; left: 25px; width: 20px; } /* 10/4, 100/4, 80/4 */
.minifire-streak:nth-child(3) { top: 3px; left: 48px; width: 8px; } /* 10/4, 190/4, 30/4 */
.minifire-streak:nth-child(4) { top: 6px; width: 33px; } /* 22/4, 132/4 */
.minifire-streak:nth-child(5) { top: 6px; left: 36px; width: 12px; } /* 22/4, 142/4, 48/4 */
.minifire-streak:nth-child(6) { top: 6px; left: 50px; width: 7px; } /* 22/4, 200/4, 28/4 */
.minifire-streak:nth-child(7) { top: 9px; left: 19px; width: 40px; } /* 34/4, 74/4, 160/4 */
.minifire-streak:nth-child(8) { top: 12px; left: 28px; width: 10px; } /* 46/4, 110/4, 40/4 */
.minifire-streak:nth-child(9) { top: 12px; left: 43px; width: 14px; } /* 46/4, 170/4, 54/4 */
.minifire-streak:nth-child(10) { top: 15px; left: 23px; width: 28px; } /* 58/4, 90/4, 110/4 */

/* Scaled Log Positions (Relative to 150px campfire) */
.minifire-log:nth-child(1) { bottom: 25px; left: 25px; transform: rotate(150deg) scaleX(0.75); z-index: 20; } /* 100/4, 100/4 */
.minifire-log:nth-child(2) { bottom: 30px; left: 35px; transform: rotate(110deg) scaleX(0.75); z-index: 10; } /* 120/4, 140/4 */
.minifire-log:nth-child(3) { bottom: 25px; left: 17px; transform: rotate(-10deg) scaleX(0.75); } /* 98/4, 68/4 */
.minifire-log:nth-child(4) { bottom: 20px; left: 55px; transform: rotate(-120deg) scaleX(0.75); z-index: 26; } /* 80/4, 220/4 */
.minifire-log:nth-child(5) { bottom: 19px; left: 53px; transform: rotate(-30deg) scaleX(0.75); z-index: 25; } /* 75/4, 210/4 */
.minifire-log:nth-child(6) { bottom: 23px; left: 70px; transform: rotate(35deg) scaleX(0.85); z-index: 30; } /* 92/4, 280/4 */
.minifire-log:nth-child(7) { bottom: 18px; left: 75px; transform: rotate(-30deg) scaleX(0.75); z-index: 20; } /* 70/4, 300/4 */

/* --- Scaled Down Sticks --- */
.minifire-stick {
  position: absolute;
  width: 17px; /* 68/4 */
  height: 5px; /* 20/4 */
  border-radius: 3px; /* 10/4 */
  box-shadow: 0 0 1px 0.5px rgba(0,0,0,0.1);
  background: #781e20;
  transform-origin: center center;
}
.minifire-stick:before {
  content: '';
  display: block;
  position: absolute;
  bottom: 100%;
  left: 7px; /* 30/4 -> 7.5 */
  width: 1.5px; /* 6/4 */
  height: 5px; /* 20/4 */
  background: #781e20;
  border-radius: 3px; /* 10/4 */
  transform: translateY(50%) rotate(32deg);
}
.minifire-stick:after {
  content: '';
  display: block;
  position: absolute;
  top: 0;
  right: 0;
  width: 5px; /* 20/4 */
  height: 5px; /* 20/4 */
  background: #b35050;
  border-radius: 3px; /* 10/4 */
}
/* Scaled Stick Positions */
.minifire-stick:nth-child(1) { left: 40px; bottom: 41px; transform: rotate(-152deg) scaleX(0.8); z-index: 12; } /* 158/4, 164/4 */
.minifire-stick:nth-child(2) { left: 45px; bottom: 8px; transform: rotate(20deg) scaleX(0.9); } /* 180/4, 30/4 */
.minifire-stick:nth-child(3) { left: 100px; bottom: 10px; transform: rotate(170deg) scaleX(0.9); } /* 400/4, 38/4 */
.minifire-stick:nth-child(3):before { display: none; }
.minifire-stick:nth-child(4) { left: 93px; bottom: 38px; transform: rotate(80deg) scaleX(0.9); z-index: 20; } /* 370/4, 150/4 */
.minifire-stick:nth-child(4):before { display: none; }

/* --- Scaled Down Fire --- */
.minifire-fire .minifire-flame {
  position: absolute;
  transform-origin: bottom center;
  opacity: 0.9;
}

/* Red Flames */
.minifire-fire__red .minifire-flame {
  width: 12px; /* 48/4 */
  border-radius: 12px; /* 48/4 */
  background: #e20f00;
  box-shadow: 0 0 20px 5px rgba(226,15,0,0.4); /* Scaled shadow */
}
/* Scaled positions/heights */
.minifire-fire__red .minifire-flame:nth-child(1) { left: 35px; height: 40px; bottom: 25px; animation: minifire-fire 2s 0.15s ease-in-out infinite alternate; } /* 138/4, 160/4, 100/4 */
.minifire-fire__red .minifire-flame:nth-child(2) { left: 47px; height: 60px; bottom: 25px; animation: minifire-fire 2s 0.35s ease-in-out infinite alternate; } /* 186/4, 240/4, 100/4 */
.minifire-fire__red .minifire-flame:nth-child(3) { left: 59px; height: 75px; bottom: 25px; animation: minifire-fire 2s 0.1s ease-in-out infinite alternate; } /* 234/4, 300/4, 100/4 */
.minifire-fire__red .minifire-flame:nth-child(4) { left: 71px; height: 90px; bottom: 25px; animation: minifire-fire 2s 0s ease-in-out infinite alternate; } /* 282/4, 360/4, 100/4 */
.minifire-fire__red .minifire-flame:nth-child(5) { left: 83px; height: 78px; bottom: 25px; animation: minifire-fire 2s 0.45s ease-in-out infinite alternate; } /* 330/4, 310/4, 100/4 */
.minifire-fire__red .minifire-flame:nth-child(6) { left: 95px; height: 58px; bottom: 25px; animation: minifire-fire 2s 0.3s ease-in-out infinite alternate; } /* 378/4, 232/4, 100/4 */
.minifire-fire__red .minifire-flame:nth-child(7) { left: 107px; height: 35px; bottom: 25px; animation: minifire-fire 2s 0.1s ease-in-out infinite alternate; } /* 426/4, 140/4, 100/4 */

/* Orange Flames */
.minifire-fire__orange .minifire-flame {
  width: 12px; border-radius: 12px; background: #ff9c00;
  box-shadow: 0 0 20px 5px rgba(255,156,0,0.4);
}
.minifire-fire__orange .minifire-flame:nth-child(1) { left: 35px; height: 35px; bottom: 25px; animation: minifire-fire 2s 0.05s ease-in-out infinite alternate; }
.minifire-fire__orange .minifire-flame:nth-child(2) { left: 47px; height: 53px; bottom: 25px; animation: minifire-fire 2s 0.1s ease-in-out infinite alternate; }
.minifire-fire__orange .minifire-flame:nth-child(3) { left: 59px; height: 63px; bottom: 25px; animation: minifire-fire 2s 0.35s ease-in-out infinite alternate; }
.minifire-fire__orange .minifire-flame:nth-child(4) { left: 71px; height: 75px; bottom: 25px; animation: minifire-fire 2s 0.4s ease-in-out infinite alternate; }
.minifire-fire__orange .minifire-flame:nth-child(5) { left: 83px; height: 65px; bottom: 25px; animation: minifire-fire 2s 0.5s ease-in-out infinite alternate; }
.minifire-fire__orange .minifire-flame:nth-child(6) { left: 95px; height: 51px; bottom: 25px; animation: minifire-fire 2s 0.35s ease-in-out infinite alternate; }
.minifire-fire__orange .minifire-flame:nth-child(7) { left: 107px; height: 28px; bottom: 25px; animation: minifire-fire 2s 0.1s ease-in-out infinite alternate; }

/* Yellow Flames */
.minifire-fire__yellow .minifire-flame {
  width: 12px; border-radius: 12px; background: #ffeb6e;
  box-shadow: 0 0 20px 5px rgba(255,235,110,0.4);
}
.minifire-fire__yellow .minifire-flame:nth-child(1) { left: 47px; height: 35px; bottom: 25px; animation: minifire-fire 2s 0.6s ease-in-out infinite alternate; }
.minifire-fire__yellow .minifire-flame:nth-child(2) { left: 59px; height: 43px; bottom: 30px; animation: minifire-fire 2s 0.4s ease-in-out infinite alternate; } /* Adjusted bottom slightly */
.minifire-fire__yellow .minifire-flame:nth-child(3) { left: 71px; height: 60px; bottom: 25px; animation: minifire-fire 2s 0.38s ease-in-out infinite alternate; }
.minifire-fire__yellow .minifire-flame:nth-child(4) { left: 83px; height: 50px; bottom: 25px; animation: minifire-fire 2s 0.22s ease-in-out infinite alternate; }
.minifire-fire__yellow .minifire-flame:nth-child(5) { left: 95px; height: 36px; bottom: 25px; animation: minifire-fire 2s 0.18s ease-in-out infinite alternate; }

/* White Flames */
.minifire-fire__white .minifire-flame {
  width: 12px; border-radius: 12px; background: #fef1d9;
  box-shadow: 0 0 20px 5px rgba(254,241,217,0.4);
}
.minifire-fire__white .minifire-flame:nth-child(1) { left: 39px; width: 8px; height: 25px; bottom: 25px; animation: minifire-fire 2s 0.22s ease-in-out infinite alternate; } /* Scaled width too */
.minifire-fire__white .minifire-flame:nth-child(2) { left: 45px; width: 8px; height: 30px; bottom: 25px; animation: minifire-fire 2s 0.42s ease-in-out infinite alternate; }
.minifire-fire__white .minifire-flame:nth-child(3) { left: 59px; height: 43px; bottom: 25px; animation: minifire-fire 2s 0.32s ease-in-out infinite alternate; }
.minifire-fire__white .minifire-flame:nth-child(4) { left: 71px; height: 53px; bottom: 25px; animation: minifire-fire 2s 0.8s ease-in-out infinite alternate; }
.minifire-fire__white .minifire-flame:nth-child(5) { left: 83px; height: 43px; bottom: 25px; animation: minifire-fire 2s 0.85s ease-in-out infinite alternate; }
.minifire-fire__white .minifire-flame:nth-child(6) { left: 95px; width: 8px; height: 28px; bottom: 25px; animation: minifire-fire 2s 0.64s ease-in-out infinite alternate; }
.minifire-fire__white .minifire-flame:nth-child(7) { left: 102px; width: 8px; height: 25px; bottom: 25px; animation: minifire-fire 2s 0.32s ease-in-out infinite alternate; }

/* --- Scaled Down Sparks --- */
.minifire-spark {
  position: absolute;
  width: 1.5px; /* 6/4 */
  height: 5px; /* 20/4 */
  background: #fef1d9;
  border-radius: 5px; /* 18/4 -> 4.5 */
  z-index: 50;
  transform-origin: bottom center;
  transform: scaleY(0);
}
/* Scaled spark positions/animations */
.minifire-spark:nth-child(1) { left: 40px; bottom: 53px; animation: minifire-spark 1s 0.4s linear infinite; } /* 160/4, 212/4 */
.minifire-spark:nth-child(2) { left: 45px; bottom: 60px; animation: minifire-spark 1s 1s linear infinite; } /* 180/4, 240/4 */
.minifire-spark:nth-child(3) { left: 52px; bottom: 80px; animation: minifire-spark 1s 0.8s linear infinite; } /* 208/4, 320/4 */
.minifire-spark:nth-child(4) { left: 78px; bottom: 100px; animation: minifire-spark 1s 2s linear infinite; } /* 310/4, 400/4 */
.minifire-spark:nth-child(5) { left: 90px; bottom: 95px; animation: minifire-spark 1s 0.75s linear infinite; } /* 360/4, 380/4 */
.minifire-spark:nth-child(6) { left: 98px; bottom: 80px; animation: minifire-spark 1s 0.65s linear infinite; } /* 390/4, 320/4 */
.minifire-spark:nth-child(7) { left: 100px; bottom: 70px; animation: minifire-spark 1s 1s linear infinite; } /* 400/4, 280/4 */
.minifire-spark:nth-child(8) { left: 108px; bottom: 53px; animation: minifire-spark 1s 1.4s linear infinite; } /* 430/4, 210/4 */

/* --- Keyframes (Rename to avoid conflicts) --- */
/* Use the same keyframe logic, just rename them */
@keyframes minifire-fire {
  0%   { transform: scaleY(1); }
  28%  { transform: scaleY(0.7); }
  38%  { transform: scaleY(0.8); }
  50%  { transform: scaleY(0.6); }
  70%  { transform: scaleY(0.95); }
  82%  { transform: scaleY(0.58); }
  100% { transform: scaleY(1); }
}
@keyframes minifire-spark {
  0%, 35% { transform: scaleY(0) translateY(0); opacity: 0; }
  50% { transform: scaleY(1) translateY(0); opacity: 1; }
  /* Adjusted translateY for smaller scale */
  70% { transform: scaleY(1) translateY(-3px); opacity: 1; } /* 10/4 -> 2.5 -> 3 */
  75% { transform: scaleY(1) translateY(-3px); opacity: 0; }
  100% { transform: scaleY(0) translateY(0); opacity: 0; }
}
Before Size: 281 KiB → After Size: 222 KiB

BIN  static/favicon.ico.old  Normal file  (281 KiB)
BIN  static/img/star.png  Normal file  (436 B)
BIN  static/img/wood.png  Normal file  (26 KiB)

1  static/js/htmx.min.js  vendored  Normal file
1  static/js/lottie.min.js  vendored  Normal file

299  static/uploads/fsh_output/fshing-trip-comparison.html  Normal file
@@ -0,0 +1,299 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8" />
  <title>FSHing Trip Comparison</title>
  <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.9.0/styles/github.min.css" />
  <link rel="stylesheet" type="text/css" href="https://cdn.jsdelivr.net/npm/diff2html/bundles/css/diff2html.min.css" />
  <script type="text/javascript" src="https://cdn.jsdelivr.net/npm/diff2html/bundles/js/diff2html-ui.min.js"></script>
  <script>
    document.addEventListener('DOMContentLoaded', () => {
      const targetElement = document.getElementById('diff');
      const diff2htmlUi = new Diff2HtmlUI(targetElement);
      diff2htmlUi.fileListToggle(false);
      diff2htmlUi.synchronisedScroll();
      diff2htmlUi.highlightCode();
      // Wrap each file diff in a collapsible <details> element.
      const diffs = document.getElementsByClassName('d2h-file-wrapper');
      for (const diff of diffs) {
        diff.innerHTML = `
          <details>
            <summary>${diff.getElementsByClassName('d2h-file-name')[0].innerHTML}</summary>
            ${diff.innerHTML}
          </details>`;
      }
    });
  </script>
</head>
<body style="text-align: center; font-family: 'Source Sans Pro', sans-serif">
  <h1>FSHing Trip Comparison</h1>
  <div id="diff">
    <div class="d2h-file-list-wrapper d2h-light-color-scheme">
      <div class="d2h-file-list-header">
        <span class="d2h-file-list-title">Files changed (1)</span>
        <a class="d2h-file-switch d2h-hide">hide</a>
        <a class="d2h-file-switch d2h-show">show</a>
      </div>
      <ol class="d2h-file-list">
        <li class="d2h-file-list-line">
          <span class="d2h-file-name-wrapper">
            <svg aria-hidden="true" class="d2h-icon d2h-moved" height="16" title="renamed" version="1.1"
                 viewBox="0 0 14 16" width="14">
              <path d="M6 9H3V7h3V4l5 4-5 4V9z m8-7v12c0 0.55-0.45 1-1 1H1c-0.55 0-1-0.45-1-1V2c0-0.55 0.45-1 1-1h12c0.55 0 1 0.45 1 1z m-1 0H1v12h12V2z"></path>
            </svg> <a href="#d2h-898460" class="d2h-file-name">../tmp/{tmpkn9pyg0k/input.json → tmpb_q8uf73/fsh-generated/data}/fsh-index.json</a>
            <span class="d2h-file-stats">
              <span class="d2h-lines-added">+10</span>
              <span class="d2h-lines-deleted">-0</span>
            </span>
          </span>
        </li>
      </ol>
    </div><div class="d2h-wrapper d2h-light-color-scheme">
      <div id="d2h-898460" class="d2h-file-wrapper" data-lang="json">
        <div class="d2h-file-header">
          <span class="d2h-file-name-wrapper">
            <svg aria-hidden="true" class="d2h-icon" height="16" version="1.1" viewBox="0 0 12 16" width="12">
              <path d="M6 5H2v-1h4v1zM2 8h7v-1H2v1z m0 2h7v-1H2v1z m0 2h7v-1H2v1z m10-7.5v9.5c0 0.55-0.45 1-1 1H1c-0.55 0-1-0.45-1-1V2c0-0.55 0.45-1 1-1h7.5l3.5 3.5z m-1 0.5L8 2H1v12h10V5z"></path>
            </svg> <span class="d2h-file-name">../tmp/{tmpkn9pyg0k/input.json → tmpb_q8uf73/fsh-generated/data}/fsh-index.json</span>
            <span class="d2h-tag d2h-moved d2h-moved-tag">RENAMED</span></span>
          <label class="d2h-file-collapse">
            <input class="d2h-file-collapse-input" type="checkbox" name="viewed" value="viewed">
            Viewed
          </label>
        </div>
        <div class="d2h-files-diff">
          <div class="d2h-file-side-diff">
            <div class="d2h-code-wrapper">
              <table class="d2h-diff-table">
                <tbody class="d2h-diff-tbody">
                  <tr>
                    <td class="d2h-code-side-linenumber d2h-info"></td>
                    <td class="d2h-info">
                      <div class="d2h-code-side-line">@@ -0,0 +1,10 @@</div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-code-side-emptyplaceholder d2h-cntx d2h-emptyplaceholder"></td>
                    <td class="d2h-cntx d2h-emptyplaceholder">
                      <div class="d2h-code-side-line d2h-code-side-emptyplaceholder">
                        <span class="d2h-code-line-prefix"> </span>
                        <span class="d2h-code-line-ctn"><br></span>
                      </div>
                    </td>
                  </tr>
                </tbody>
              </table>
            </div>
          </div>
          <div class="d2h-file-side-diff">
            <div class="d2h-code-wrapper">
              <table class="d2h-diff-table">
                <tbody class="d2h-diff-tbody">
                  <tr>
                    <td class="d2h-code-side-linenumber d2h-info"></td>
                    <td class="d2h-info">
                      <div class="d2h-code-side-line"> </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">1</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">[</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">2</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn"> {</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">3</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">  "outputFile": "Encounter-discharge-1.json",</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">4</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">  "fshName": "discharge-1",</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">5</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">  "fshType": "Instance",</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">6</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">  "fshFile": "instances/discharge-1.fsh",</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">7</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">  "startLine": 1,</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">8</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">  "endLine": 12</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">9</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn"> }</span>
                      </div>
                    </td>
                  </tr><tr>
                    <td class="d2h-code-side-linenumber d2h-ins">10</td>
                    <td class="d2h-ins">
                      <div class="d2h-code-side-line">
                        <span class="d2h-code-line-prefix">+</span>
                        <span class="d2h-code-line-ctn">]</span>
                      </div>
                    </td>
                  </tr>
                </tbody>
              </table>
            </div>
          </div>
        </div>
      </div>
    </div>
  </div>
</body>
</html>
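The page above is diff2html output for a single renamed file whose ten lines are all additions, rendered side-by-side against empty placeholder rows. For reference, a unified diff with the same `@@ -0,0 +1,10 @@` hunk header can be reproduced with Python's standard `difflib` (the `a/`/`b/` file labels here are illustrative):

```python
import difflib

# The new file's ten lines, as shown in the right-hand pane above.
new_lines = [
    "[",
    " {",
    '  "outputFile": "Encounter-discharge-1.json",',
    '  "fshName": "discharge-1",',
    '  "fshType": "Instance",',
    '  "fshFile": "instances/discharge-1.fsh",',
    '  "startLine": 1,',
    '  "endLine": 12',
    " }",
    "]",
]

# Diffing against an empty "before" file yields a pure-addition hunk.
diff = "\n".join(difflib.unified_diff(
    [], new_lines,
    fromfile="a/fsh-index.json", tofile="b/fsh-index.json",
    lineterm="",
))
print(diff)
```

The output starts with the `---`/`+++` headers, then the `@@ -0,0 +1,10 @@` hunk with ten `+` lines, matching the "+10 / -0" stats in the file list.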
1  static/uploads/fsh_output/input/fsh/aliases.fsh  Normal file
@@ -0,0 +1 @@
Alias: $v3-ActCode = http://terminology.hl7.org/CodeSystem/v3-ActCode

2  static/uploads/fsh_output/input/fsh/index.txt  Normal file
@@ -0,0 +1,2 @@
Name              Type      File
banks-mia-leanne  Instance  instances/banks-mia-leanne.fsh

16  static/uploads/fsh_output/input/fsh/instances/aspirin.fsh  Normal file
@@ -0,0 +1,16 @@
Instance: aspirin
InstanceOf: AllergyIntolerance
Usage: #example
* meta.profile = "http://hl7.org.au/fhir/core/StructureDefinition/au-core-allergyintolerance"
* clinicalStatus = $allergyintolerance-clinical#active
* clinicalStatus.text = "Active"
* verificationStatus = $allergyintolerance-verification#confirmed
* verificationStatus.text = "Confirmed"
* category = #medication
* criticality = #unable-to-assess
* code = $sct#387458008
* code.text = "Aspirin allergy"
* patient = Reference(Patient/hayes-arianne)
* recordedDate = "2024-02-10"
* recorder = Reference(PractitionerRole/specialistphysicians-swanborough-erick)
* asserter = Reference(PractitionerRole/specialistphysicians-swanborough-erick)
@@ -0,0 +1,49 @@
Instance: banks-mia-leanne
InstanceOf: Patient
Usage: #example
* meta.profile = "http://hl7.org.au/fhir/core/StructureDefinition/au-core-patient"
* extension[0].url = "http://hl7.org/fhir/StructureDefinition/individual-genderIdentity"
* extension[=].extension.url = "value"
* extension[=].extension.valueCodeableConcept = $sct#446141000124107 "Identifies as female gender"
* extension[+].url = "http://hl7.org/fhir/StructureDefinition/individual-pronouns"
* extension[=].extension.url = "value"
* extension[=].extension.valueCodeableConcept = $loinc#LA29519-8 "she/her/her/hers/herself"
* extension[+].url = "http://hl7.org/fhir/StructureDefinition/individual-recordedSexOrGender"
* extension[=].extension[0].url = "value"
* extension[=].extension[=].valueCodeableConcept = $sct#248152002
* extension[=].extension[=].valueCodeableConcept.text = "Female"
* extension[=].extension[+].url = "type"
* extension[=].extension[=].valueCodeableConcept = $sct#1515311000168102 "Biological sex at birth"
* identifier.extension[0].url = "http://hl7.org.au/fhir/StructureDefinition/ihi-status"
* identifier.extension[=].valueCoding = $ihi-status-1#active
* identifier.extension[+].url = "http://hl7.org.au/fhir/StructureDefinition/ihi-record-status"
* identifier.extension[=].valueCoding = $ihi-record-status-1#verified "verified"
* identifier.type = $v2-0203#NI
* identifier.type.text = "IHI"
* identifier.system = "http://ns.electronichealth.net.au/id/hi/ihi/1.0"
* identifier.value = "8003608333647261"
* name.use = #usual
* name.family = "Banks"
* name.given[0] = "Mia"
* name.given[+] = "Leanne"
* telecom[0].system = #phone
* telecom[=].value = "0270102724"
* telecom[=].use = #work
* telecom[+].system = #phone
* telecom[=].value = "0491574632"
* telecom[=].use = #mobile
* telecom[+].system = #phone
* telecom[=].value = "0270107520"
* telecom[=].use = #home
* telecom[+].system = #email
* telecom[=].value = "mia.banks@myownpersonaldomain.com"
* telecom[+].system = #phone
* telecom[=].value = "270107520"
* telecom[=].use = #home
* gender = #female
* birthDate = "1983-08-25"
* address.line = "50 Sebastien St"
* address.city = "Minjary"
* address.state = "NSW"
* address.postalCode = "2720"
* address.country = "AU"
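The bracket markers throughout the instance above are FSH "soft indexing": `[0]` addresses the first array element, `[+]` advances to the next index, and `[=]` reuses the current one. As an illustrative sketch, the first two telecom entries written with hard indices would be equivalent to:

```fsh
// Equivalent hard-indexed form of the first two telecom entries
* telecom[0].system = #phone
* telecom[0].value = "0270102724"  // [=] stayed on index 0
* telecom[0].use = #work
* telecom[1].system = #phone       // [+] advanced to index 1
* telecom[1].value = "0491574632"
* telecom[1].use = #mobile
```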
@ -0,0 +1,12 @@
|
||||
Instance: discharge-1
|
||||
InstanceOf: Encounter
|
||||
Usage: #example
|
||||
* meta.profile = "http://hl7.org.au/fhir/core/StructureDefinition/au-core-encounter"
|
||||
* status = #finished
|
||||
* class = $v3-ActCode#EMER "emergency"
|
||||
* subject = Reference(Patient/ronny-irvine)
|
||||
* period
|
||||
* start = "2023-02-20T06:15:00+10:00"
|
||||
* end = "2023-02-20T18:19:00+10:00"
|
||||
* location.location = Reference(Location/murrabit-hospital)
|
||||
* serviceProvider = Reference(Organization/murrabit-hospital)
|
||||
15  static/uploads/fsh_output/input/fsh/instances/vkc.fsh  Normal file
@@ -0,0 +1,15 @@
Instance: vkc
InstanceOf: Condition
Usage: #example
* meta.profile = "http://hl7.org.au/fhir/core/StructureDefinition/au-core-condition"
* clinicalStatus = $condition-clinical#active "Active"
* category = $condition-category#encounter-diagnosis "Encounter Diagnosis"
* severity = $sct#24484000 "Severe"
* code = $sct#317349009 "Vernal keratoconjunctivitis"
* bodySite = $sct#368601006 "Entire conjunctiva of left eye"
* subject = Reference(Patient/italia-sofia)
* onsetDateTime = "2023-10-01"
* recordedDate = "2023-10-02"
* recorder = Reference(PractitionerRole/generalpractitioner-guthridge-jarred)
* asserter = Reference(PractitionerRole/generalpractitioner-guthridge-jarred)
1  static/uploads/fsh_output/input/input.json  Normal file
@@ -0,0 +1 @@
{"resourceType":"Encounter","id":"discharge-1","meta":{"profile":["http://hl7.org.au/fhir/core/StructureDefinition/au-core-encounter"]},"text":{"status":"generated","div":"<div xmlns=\"http://www.w3.org/1999/xhtml\"><p class=\"res-header-id\"><b>Generated Narrative: Encounter discharge-1</b></p><a name=\"discharge-1\"> </a><a name=\"hcdischarge-1\"> </a><a name=\"discharge-1-en-AU\"> </a><div style=\"display: inline-block; background-color: #d9e0e7; padding: 6px; margin: 4px; border: 1px solid #8da1b4; border-radius: 5px; line-height: 60%\"><p style=\"margin-bottom: 0px\"/><p style=\"margin-bottom: 0px\">Profile: <a href=\"StructureDefinition-au-core-encounter.html\">AU Core Encounter</a></p></div><p><b>status</b>: Finished</p><p><b>class</b>: <a href=\"http://terminology.hl7.org/6.2.0/CodeSystem-v3-ActCode.html#v3-ActCode-EMER\">ActCode EMER</a>: emergency</p><p><b>subject</b>: <a href=\"Patient-ronny-irvine.html\">Ronny Lawrence Irvine Male, DoB: ( DVA Number:\u00a0QX827261)</a></p><p><b>period</b>: 2023-02-20 06:15:00+1000 --> 2023-02-20 18:19:00+1000</p><h3>Locations</h3><table class=\"grid\"><tr><td style=\"display: none\">-</td><td><b>Location</b></td></tr><tr><td style=\"display: none\">*</td><td><a href=\"Location-murrabit-hospital.html\">Location Murrabit Public Hospital</a></td></tr></table><p><b>serviceProvider</b>: <a href=\"Organization-murrabit-hospital.html\">Organization Murrabit Public Hospital</a></p></div>"},"status":"finished","class":{"system":"http://terminology.hl7.org/CodeSystem/v3-ActCode","code":"EMER","display":"emergency"},"subject":{"reference":"Patient/ronny-irvine"},"period":{"start":"2023-02-20T06:15:00+10:00","end":"2023-02-20T18:19:00+10:00"},"location":[{"location":{"reference":"Location/murrabit-hospital"}}],"serviceProvider":{"reference":"Organization/murrabit-hospital"}}
8  static/uploads/fsh_output/sushi-config.yaml  Normal file
@@ -0,0 +1,8 @@
canonical: http://example.org
fhirVersion: 4.3.0
FSHOnly: true
applyExtensionMetadataToRoot: false
id: example
name: Example
dependencies:
  hl7.fhir.us.core: 6.1.0
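This configuration drives SUSHI, the FSH compiler: run from the directory containing `sushi-config.yaml` and `input/fsh/`, it compiles the `.fsh` files into FHIR JSON under `fsh-generated/` (with `FSHOnly: true` skipping the IG publisher scaffolding). A typical invocation, assuming Node.js is available, is:

```shell
# Install the FSH compiler once (published on npm as fsh-sushi)
npm install -g fsh-sushi

# Compile everything under input/fsh/ against sushi-config.yaml
sushi .
```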
static/uploads/output.fsh  Normal file  (+55)
@@ -0,0 +1,55 @@
Alias: $sct = http://snomed.info/sct
Alias: $loinc = http://loinc.org
Alias: $ihi-status-1 = https://healthterminologies.gov.au/fhir/CodeSystem/ihi-status-1
Alias: $ihi-record-status-1 = https://healthterminologies.gov.au/fhir/CodeSystem/ihi-record-status-1
Alias: $v2-0203 = http://terminology.hl7.org/CodeSystem/v2-0203

Instance: banks-mia-leanne
InstanceOf: Patient
Usage: #example
* meta.profile = "http://hl7.org.au/fhir/core/StructureDefinition/au-core-patient"
* extension[0].url = "http://hl7.org/fhir/StructureDefinition/individual-genderIdentity"
* extension[=].extension.url = "value"
* extension[=].extension.valueCodeableConcept = $sct#446141000124107 "Identifies as female gender"
* extension[+].url = "http://hl7.org/fhir/StructureDefinition/individual-pronouns"
* extension[=].extension.url = "value"
* extension[=].extension.valueCodeableConcept = $loinc#LA29519-8 "she/her/her/hers/herself"
* extension[+].url = "http://hl7.org/fhir/StructureDefinition/individual-recordedSexOrGender"
* extension[=].extension[0].url = "value"
* extension[=].extension[=].valueCodeableConcept = $sct#248152002
* extension[=].extension[=].valueCodeableConcept.text = "Female"
* extension[=].extension[+].url = "type"
* extension[=].extension[=].valueCodeableConcept = $sct#1515311000168102 "Biological sex at birth"
* identifier.extension[0].url = "http://hl7.org.au/fhir/StructureDefinition/ihi-status"
* identifier.extension[=].valueCoding = $ihi-status-1#active
* identifier.extension[+].url = "http://hl7.org.au/fhir/StructureDefinition/ihi-record-status"
* identifier.extension[=].valueCoding = $ihi-record-status-1#verified "verified"
* identifier.type = $v2-0203#NI
* identifier.type.text = "IHI"
* identifier.system = "http://ns.electronichealth.net.au/id/hi/ihi/1.0"
* identifier.value = "8003608333647261"
* name.use = #usual
* name.family = "Banks"
* name.given[0] = "Mia"
* name.given[+] = "Leanne"
* telecom[0].system = #phone
* telecom[=].value = "0270102724"
* telecom[=].use = #work
* telecom[+].system = #phone
* telecom[=].value = "0491574632"
* telecom[=].use = #mobile
* telecom[+].system = #phone
* telecom[=].value = "0270107520"
* telecom[=].use = #home
* telecom[+].system = #email
* telecom[=].value = "mia.banks@myownpersonaldomain.com"
* telecom[+].system = #phone
* telecom[=].value = "270107520"
* telecom[=].use = #home
* gender = #female
* birthDate = "1983-08-25"
* address.line = "50 Sebastien St"
* address.city = "Minjary"
* address.state = "NSW"
* address.postalCode = "2720"
* address.country = "AU"
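The `identifier.*` rules in output.fsh above compile down to a single FHIR `Identifier` element. A hand-written sketch of the JSON SUSHI would roughly emit for the IHI identifier (illustrative, not actual SUSHI output -- the system URIs come from the aliases declared at the top of the file):

```python
import json

# Hand-written approximation of the Identifier element the FSH rules
# above produce: two AU extensions, a v2-0203 NI type, and the IHI value.
identifier = {
    "extension": [
        {
            "url": "http://hl7.org.au/fhir/StructureDefinition/ihi-status",
            "valueCoding": {
                "system": "https://healthterminologies.gov.au/fhir/CodeSystem/ihi-status-1",
                "code": "active",
            },
        },
        {
            "url": "http://hl7.org.au/fhir/StructureDefinition/ihi-record-status",
            "valueCoding": {
                "system": "https://healthterminologies.gov.au/fhir/CodeSystem/ihi-record-status-1",
                "code": "verified",
                "display": "verified",
            },
        },
    ],
    "type": {
        "coding": [
            {"system": "http://terminology.hl7.org/CodeSystem/v2-0203", "code": "NI"}
        ],
        "text": "IHI",
    },
    "system": "http://ns.electronichealth.net.au/id/hi/ihi/1.0",
    "value": "8003608333647261",
}

print(json.dumps(identifier, indent=2))
```

Note how the FSH `[=]` (same element) and `[+]` (next element) soft indices map onto positions in the `extension` array.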
supervisord.conf  Normal file  (+50)
@@ -0,0 +1,50 @@
[supervisord]
nodaemon=true
logfile=/app/logs/supervisord.log
logfile_maxbytes=50MB
logfile_backups=10
pidfile=/app/logs/supervisord.pid

[program:flask]
command=/app/venv/bin/python /app/app.py
directory=/app
environment=FLASK_APP="app.py",FLASK_ENV="development",NODE_PATH="/usr/lib/node_modules"
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=10
stdout_logfile=/app/logs/flask.log
stdout_logfile_maxbytes=10MB
stdout_logfile_backups=5
stderr_logfile=/app/logs/flask_err.log
stderr_logfile_maxbytes=10MB
stderr_logfile_backups=5

[program:tomcat]
command=/usr/local/tomcat/bin/catalina.sh run
directory=/usr/local/tomcat
environment=SPRING_CONFIG_LOCATION="file:/usr/local/tomcat/conf/application.yaml",NODE_PATH="/usr/lib/node_modules"
autostart=true
autorestart=true
startsecs=30
stopwaitsecs=30
stdout_logfile=/app/logs/tomcat.log
stdout_logfile_maxbytes=10MB
stdout_logfile_backups=5
stderr_logfile=/app/logs/tomcat_err.log
stderr_logfile_maxbytes=10MB
stderr_logfile_backups=5

[program:candle]
command=dotnet /app/fhir-candle-publish/fhir-candle.dll
directory=/app/fhir-candle-publish
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=10
stdout_logfile=/app/logs/candle.log
stdout_logfile_maxbytes=10MB
stdout_logfile_backups=5
stderr_logfile=/app/logs/candle_err.log
stderr_logfile_maxbytes=10MB
stderr_logfile_backups=5
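The supervisord.conf above uses INI syntax, so it can be inspected with Python's `configparser` -- handy for sanity-checking which services the container supervises. A sketch over a trimmed excerpt of the file:

```python
import configparser

# Trimmed excerpt of the supervisord.conf above; enough to show the
# three supervised programs (flask, tomcat, and the fhir-candle server).
CONF = """\
[supervisord]
nodaemon=true
logfile=/app/logs/supervisord.log

[program:flask]
command=/app/venv/bin/python /app/app.py
autostart=true
startsecs=10

[program:tomcat]
command=/usr/local/tomcat/bin/catalina.sh run
startsecs=30

[program:candle]
command=dotnet /app/fhir-candle-publish/fhir-candle.dll
startsecs=10
"""

cp = configparser.ConfigParser()
cp.read_string(CONF)
programs = [s.split(":", 1)[1] for s in cp.sections() if s.startswith("program:")]
print(programs)  # ['flask', 'tomcat', 'candle']
```

Note the longer `startsecs=30` for Tomcat: supervisord only marks a process RUNNING after it stays up that long, and a JVM app needs more warm-up time than the Flask or .NET processes.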
templates/_flash_messages.html  Normal file  (+24)
@@ -0,0 +1,24 @@
{# templates/_flash_messages.html #}

{# Check if there are any flashed messages #}
{% with messages = get_flashed_messages(with_categories=true) %}
  {% if messages %}
    {# Loop through messages and display them as Bootstrap alerts #}
    {% for category, message in messages %}
      {# Map Flask message categories (e.g., 'error', 'success', 'warning') to Bootstrap alert classes #}
      {% set alert_class = 'alert-info' %} {# Default class #}
      {% if category == 'error' or category == 'danger' %}
        {% set alert_class = 'alert-danger' %}
      {% elif category == 'success' %}
        {% set alert_class = 'alert-success' %}
      {% elif category == 'warning' %}
        {% set alert_class = 'alert-warning' %}
      {% endif %}

      <div class="alert {{ alert_class }} alert-dismissible fade show" role="alert">
        {{ message }}
        <button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
      </div>
    {% endfor %}
  {% endif %}
{% endwith %}
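The category-to-class mapping in the template above is simple enough to mirror as a plain Python function, which makes the logic unit-testable outside Jinja (a sketch; the template itself remains the source of truth):

```python
# Mirror of the template's Jinja if/elif chain: map a Flask flash
# category to the Bootstrap alert class the partial renders.
def alert_class(category: str) -> str:
    if category in ("error", "danger"):
        return "alert-danger"
    if category == "success":
        return "alert-success"
    if category == "warning":
        return "alert-warning"
    return "alert-info"  # default, matching {% set alert_class = 'alert-info' %}

print(alert_class("error"))   # alert-danger
print(alert_class("notice"))  # alert-info
```

Any category not handled explicitly (e.g. Flask's default `"message"`) falls through to `alert-info`, exactly as in the template.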