mirror of https://github.com/Sudo-JHare/FHIRFLARE-IG-Toolkit.git
synced 2025-06-15 00:40:00 +00:00

commit cf055a67a8 ("rebase")
parent 18202a9b99
@ -1,45 +0,0 @@
We need to create the actual database file and the user table inside it. We use Flask-Migrate for this. These commands must be run inside the running container.

Find your container ID (if you don't know it):

Bash

docker ps

(Copy the CONTAINER ID or name for webapp-base.)

Initialize the migration repository (ONLY RUN ONCE EVER). This creates a migrations folder in your project (inside the container, and locally if you map volumes later, but the command runs inside):

Bash

docker exec <CONTAINER_ID_or_NAME> flask db init

(Replace <CONTAINER_ID_or_NAME>.)

Create the first migration script. Flask-Migrate compares your models (app/models.py) to the (non-existent) database and generates a script to create the necessary tables:

Bash

docker exec <CONTAINER_ID_or_NAME> flask db migrate -m "Initial migration; create user table."

(You can change the message after -m.) This creates a script file inside the migrations/versions/ directory.

Apply the migration to the database. This runs the script generated in the previous step, actually creating the app.db file (in /app/instance/) and the user table inside it:

Bash

docker exec <CONTAINER_ID_or_NAME> flask db upgrade

After running these docker exec flask db ... commands, you should have a migrations folder in your project root locally (because it was created by code running inside the container using the project files mounted/copied) and an app.db file inside the /app/instance/ directory within the container.

Your database is now set up with a user table!

flask db init
flask db migrate -m "Initial migration; create user table."
flask db upgrade

flask db migrate -m "Add role column to user table"
flask db upgrade

flask db migrate -m "Add ModuleRegistry table"
flask db upgrade

flask db migrate -m "Add ProcessedIg table"
flask db upgrade
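To confirm the upgrade actually created the tables, you can inspect the SQLite file directly. A minimal sketch using only the standard library — the `list_tables` helper is hypothetical, and the `/app/instance/app.db` path is the one the walkthrough above creates:

```python
import sqlite3

def list_tables(db_path):
    """Return the table names in a SQLite database file."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"
        ).fetchall()
    return sorted(r[0] for r in rows)

# After `flask db upgrade`, the instance DB should contain at least the
# 'user' table plus Alembic's bookkeeping table, e.g.:
#   list_tables('/app/instance/app.db')
```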
37 Dockerfile
@ -1,27 +1,30 @@
# 1. Base Image: Use an official Python runtime as a parent image
FROM python:3.10-slim AS base
FROM python:3.9-slim

# Set environment variables
# Prevents python creating .pyc files
ENV PYTHONDONTWRITEBYTECODE=1
# Prevents python buffering stdout/stderr
ENV PYTHONUNBUFFERED=1

# 2. Set Work Directory: Create and set the working directory in the container
# Set working directory
WORKDIR /app

# 3. Install Dependencies: Copy only the requirements file first to leverage Docker cache
# Install system dependencies (if any needed by services.py, e.g., for FHIR processing)
RUN apt-get update && apt-get install -y --no-install-recommends \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements file and install Python dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# 4. Copy Application Code: Copy the rest of your application code into the work directory
COPY . .
# Copy application files
COPY app.py .
COPY services.py . # Assuming you have this; replace with actual file if different
COPY instance/ instance/ # Pre-create instance dir if needed for SQLite/packages

# 5. Expose Port: Tell Docker the container listens on port 5000
# Ensure instance directory exists for SQLite DB and FHIR packages
RUN mkdir -p /app/instance/fhir_packages

# Expose Flask port
EXPOSE 5000

# 6. Run Command: Specify the command to run when the container starts
# Using "flask run --host=0.0.0.0" makes the app accessible from outside the container
# Note: FLASK_APP should be set, often via ENV or run.py structure
# Note: For development, FLASK_DEBUG=1 might be useful (e.g., ENV FLASK_DEBUG=1)
# Set environment variables
ENV FLASK_APP=app.py
ENV FLASK_ENV=development

# Run the app
CMD ["flask", "run", "--host=0.0.0.0"]
162 app.py Normal file
@ -0,0 +1,162 @@
from flask import Flask, render_template_string, request, redirect, url_for, flash
from flask_sqlalchemy import SQLAlchemy
import os
import services  # Assuming your existing services module for FHIR IG handling

app = Flask(__name__)
app.config['SECRET_KEY'] = 'your-secret-key-here'
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///instance/fhir_ig.db'
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
app.config['FHIR_PACKAGES_DIR'] = os.path.join(app.instance_path, 'fhir_packages')
os.makedirs(app.config['FHIR_PACKAGES_DIR'], exist_ok=True)

db = SQLAlchemy(app)

# Simplified ProcessedIg model (no user-related fields)
class ProcessedIg(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    package_name = db.Column(db.String(128), nullable=False)
    version = db.Column(db.String(32), nullable=False)
    processed_date = db.Column(db.DateTime, nullable=False)
    resource_types_info = db.Column(db.JSON, nullable=False)  # List of resource type metadata
    must_support_elements = db.Column(db.JSON, nullable=True)  # Dict of MS elements
    examples = db.Column(db.JSON, nullable=True)  # Dict of example filepaths

# Landing page with two buttons
@app.route('/')
def index():
    return render_template_string('''
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>FHIR IG Toolkit</title>
    <style>
        body { font-family: Arial, sans-serif; text-align: center; padding: 50px; }
        .button { padding: 15px 30px; margin: 10px; font-size: 16px; }
    </style>
</head>
<body>
    <h1>FHIR IG Toolkit</h1>
    <p>Simple tool for importing and viewing FHIR Implementation Guides.</p>
    <a href="{{ url_for('import_ig') }}"><button class="button">Import FHIR IG</button></a>
    <a href="{{ url_for('view_igs') }}"><button class="button">View Downloaded IGs</button></a>
</body>
</html>
''')

# Import IG route
@app.route('/import-ig', methods=['GET', 'POST'])
def import_ig():
    if request.method == 'POST':
        name = request.form.get('name')
        version = request.form.get('version', 'latest')
        try:
            # Call your existing service to download package and dependencies
            result = services.import_package_and_dependencies(name, version, app.config['FHIR_PACKAGES_DIR'])
            downloaded_files = result.get('downloaded', [])
            for file_path in downloaded_files:
                # Process each downloaded package
                package_info = services.process_package_file(file_path)
                processed_ig = ProcessedIg(
                    package_name=package_info['name'],
                    version=package_info['version'],
                    processed_date=package_info['processed_date'],
                    resource_types_info=package_info['resource_types_info'],
                    must_support_elements=package_info.get('must_support_elements'),
                    examples=package_info.get('examples')
                )
                db.session.add(processed_ig)
            db.session.commit()
            flash(f"Successfully imported {name} {version} and dependencies!", "success")
            return redirect(url_for('view_igs'))
        except Exception as e:
            flash(f"Error importing IG: {str(e)}", "error")
            return redirect(url_for('import_ig'))
    return render_template_string('''
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Import FHIR IG</title>
    <style>
        body { font-family: Arial, sans-serif; padding: 20px; }
        .form { max-width: 400px; margin: 0 auto; }
        .field { margin: 10px 0; }
        input[type="text"] { width: 100%; padding: 5px; }
        .button { padding: 10px 20px; }
        .message { color: {% if category == "success" %}green{% else %}red{% endif %}; }
    </style>
</head>
<body>
    <h1>Import FHIR IG</h1>
    <form class="form" method="POST">
        <div class="field">
            <label>Package Name:</label>
            <input type="text" name="name" placeholder="e.g., hl7.fhir.us.core" required>
        </div>
        <div class="field">
            <label>Version (optional):</label>
            <input type="text" name="version" placeholder="e.g., 1.0.0 or latest">
        </div>
        <button class="button" type="submit">Import</button>
        <a href="{{ url_for('index') }}"><button class="button" type="button">Back</button></a>
    </form>
    {% with messages = get_flashed_messages(with_categories=True) %}
        {% if messages %}
            {% for category, message in messages %}
                <p class="message">{{ message }}</p>
            {% endfor %}
        {% endif %}
    {% endwith %}
</body>
</html>
''')

# View Downloaded IGs route
@app.route('/view-igs')
def view_igs():
    igs = ProcessedIg.query.all()
    return render_template_string('''
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>View Downloaded IGs</title>
    <style>
        body { font-family: Arial, sans-serif; padding: 20px; }
        table { width: 80%; margin: 20px auto; border-collapse: collapse; }
        th, td { padding: 10px; border: 1px solid #ddd; text-align: left; }
        th { background-color: #f2f2f2; }
        .button { padding: 10px 20px; }
    </style>
</head>
<body>
    <h1>Downloaded FHIR IGs</h1>
    <table>
        <tr>
            <th>Package Name</th>
            <th>Version</th>
            <th>Processed Date</th>
            <th>Resource Types</th>
        </tr>
        {% for ig in igs %}
        <tr>
            <td>{{ ig.package_name }}</td>
            <td>{{ ig.version }}</td>
            <td>{{ ig.processed_date }}</td>
            <td>{{ ig.resource_types_info | length }} types</td>
        </tr>
        {% endfor %}
    </table>
    <a href="{{ url_for('index') }}"><button class="button">Back</button></a>
</body>
</html>
''', igs=igs)

# Initialize DB
with app.app_context():
    db.create_all()

if __name__ == '__main__':
    app.run(debug=True)
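The ProcessedIg model stores structured package metadata in JSON columns. As a framework-free sketch of the same idea using only the standard library (the table and column names mirror the model above; the sample package name and version are illustrative, and the manual json.dumps/json.loads round-trip is what SQLAlchemy's db.JSON type does for you):

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("""
    CREATE TABLE processed_ig (
        id INTEGER PRIMARY KEY,
        package_name TEXT NOT NULL,
        version TEXT NOT NULL,
        resource_types_info TEXT NOT NULL  -- JSON-encoded list
    )
""")

# Hypothetical metadata for one processed package
info = [{'name': 'Patient', 'must_support': True}]
conn.execute(
    "INSERT INTO processed_ig (package_name, version, resource_types_info) VALUES (?, ?, ?)",
    ('hl7.fhir.us.core', '1.0.0', json.dumps(info)),
)

# Reading it back decodes the JSON column into Python structures
row = conn.execute("SELECT resource_types_info FROM processed_ig").fetchone()
resource_types = json.loads(row[0])
```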
@ -1,33 +0,0 @@
# app/__init__.py

import datetime
import os
import importlib
import logging
from flask import Flask, render_template, Blueprint, current_app
from config import Config
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_login import LoginManager

# Instantiate Extensions
db = SQLAlchemy()
migrate = Migrate()

def create_app(config_class='config.DevelopmentConfig'):
    app = Flask(__name__)
    app.config.from_object(config_class)

    db.init_app(app)
    migrate.init_app(app, db)

    from app.fhir_ig_importer import bp as fhir_ig_importer_bp
    app.register_blueprint(fhir_ig_importer_bp)

    @app.route('/')
    def index():
        return render_template('index.html')

    return app

from app import models
@ -1,14 +0,0 @@
# app/core/__init__.py
# Initialize the core blueprint

from flask import Blueprint

# Create a Blueprint instance for core routes
# 'core' is the name of the blueprint
# __name__ helps Flask locate the blueprint's resources (like templates)
# template_folder='templates' specifies a blueprint-specific template folder (optional)
bp = Blueprint('core', __name__, template_folder='templates')

# Import the routes module associated with this blueprint
# This import is at the bottom to avoid circular dependencies
from . import routes  # noqa: F401 E402
@ -1,18 +0,0 @@
# app/core/routes.py
# Defines routes for the core part of the application (e.g., home page)

from flask import render_template
from . import bp  # Import the blueprint instance defined in __init__.py

# --- Core Routes ---

@bp.route('/')
@bp.route('/index')
def index():
    """Renders the main home page of the application."""
    # This will look for 'index.html' first in the blueprint's template folder
    # (if defined, e.g., 'app/core/templates/index.html')
    # and then fall back to the main application's template folder ('app/templates/index.html')
    return render_template('index.html', title='Home')

# Add other core routes here (e.g., about page, contact page) if needed
@ -1,19 +0,0 @@
# app/decorators.py
from functools import wraps
from flask_login import current_user
from flask import abort

def admin_required(func):
    """
    Decorator to ensure the user is logged in and has the 'admin' role.
    Aborts with 403 Forbidden if conditions are not met.
    """
    @wraps(func)
    def decorated_view(*args, **kwargs):
        # Check if user is logged in and has the admin role (using the property we added)
        if not current_user.is_authenticated or not current_user.is_admin:
            # If not admin, return a 403 Forbidden error
            abort(403)
        # If admin, proceed with the original route function
        return func(*args, **kwargs)
    return decorated_view
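The admin_required gate is easy to exercise outside Flask. A minimal framework-free sketch of the same wraps pattern — StubUser and the PermissionError stand in for flask_login's current_user and flask's abort(403), and control_panel is a hypothetical protected view:

```python
from functools import wraps

class StubUser:
    """Hypothetical stand-in for flask_login's current_user."""
    def __init__(self, is_authenticated, is_admin):
        self.is_authenticated = is_authenticated
        self.is_admin = is_admin

current_user = StubUser(is_authenticated=True, is_admin=False)

def admin_required(func):
    """Same gate as app/decorators.py, with abort(403) replaced by an exception."""
    @wraps(func)
    def decorated_view(*args, **kwargs):
        if not current_user.is_authenticated or not current_user.is_admin:
            raise PermissionError("403 Forbidden")
        return func(*args, **kwargs)
    return decorated_view

@admin_required
def control_panel():
    return "ok"
```

Because of @wraps, the decorated view keeps its original __name__, which is what lets Flask's url_for keep resolving the endpoint.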
@ -1,28 +0,0 @@
# app/modules/fhir_ig_importer/__init__.py

from flask import Blueprint

# --- Module Metadata ---
metadata = {
    'module_id': 'fhir_ig_importer',  # Matches folder name
    'display_name': 'FHIR IG Importer',
    'description': 'Imports FHIR Implementation Guide packages from a registry.',
    'version': '0.1.0',
    # No main nav items, will be accessed via Control Panel
    'nav_items': []
}
# --- End Module Metadata ---

# Define Blueprint
# We'll mount this under the control panel later
bp = Blueprint(
    metadata['module_id'],
    __name__,
    template_folder='templates',
    # Define a URL prefix if mounting standalone, but we'll likely register
    # it under /control-panel via app/__init__.py later
    # url_prefix='/fhir-importer'
)

# Import routes after creating blueprint
from . import routes, forms  # Import forms too
@ -1,19 +0,0 @@
# app/modules/fhir_ig_importer/forms.py

from flask_wtf import FlaskForm
from wtforms import StringField, SubmitField
from wtforms.validators import DataRequired, Regexp

class IgImportForm(FlaskForm):
    """Form for specifying an IG package to import."""
    # Basic validation for FHIR package names (e.g., hl7.fhir.r4.core)
    package_name = StringField('Package Name (e.g., hl7.fhir.au.base)', validators=[
        DataRequired(),
        Regexp(r'^[a-zA-Z0-9]+(\.[a-zA-Z0-9]+)+$', message='Invalid package name format.')
    ])
    # Basic validation for version (e.g., 4.1.0, current)
    package_version = StringField('Package Version (e.g., 4.1.0 or current)', validators=[
        DataRequired(),
        Regexp(r'^[a-zA-Z0-9\.\-]+$', message='Invalid version format.')
    ])
    submit = SubmitField('Fetch & Download IG')
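The two Regexp validators above can be checked in isolation with the re module; the patterns are copied verbatim from IgImportForm, and the sample inputs are illustrative:

```python
import re

# The package-name pattern requires at least two dot-separated alphanumeric segments;
# the version pattern allows alphanumerics, dots, and hyphens.
PACKAGE_NAME_RE = re.compile(r'^[a-zA-Z0-9]+(\.[a-zA-Z0-9]+)+$')
PACKAGE_VERSION_RE = re.compile(r'^[a-zA-Z0-9\.\-]+$')

assert PACKAGE_NAME_RE.match('hl7.fhir.au.base')      # dotted name: accepted
assert not PACKAGE_NAME_RE.match('hl7')               # no dot: rejected
assert not PACKAGE_NAME_RE.match('hl7..fhir')         # empty segment: rejected
assert PACKAGE_VERSION_RE.match('4.1.0')              # semver-style: accepted
assert PACKAGE_VERSION_RE.match('current')            # symbolic version: accepted
assert not PACKAGE_VERSION_RE.match('4.1.0 beta')     # whitespace: rejected
```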
@ -1,162 +0,0 @@
# app/modules/fhir_ig_importer/routes.py

import requests
import os
import tarfile  # Needed for find_and_extract_sd
import gzip
import json
import io
import re
from flask import (render_template, redirect, url_for, flash, request,
                   current_app, jsonify, send_file)
from flask_login import login_required
from app.decorators import admin_required
from werkzeug.utils import secure_filename
from . import bp
from .forms import IgImportForm
# Import the services module
from . import services
# Import ProcessedIg model for get_structure_definition
from app.models import ProcessedIg
from app import db


# --- Helper: Find/Extract SD ---
# Moved from services.py to be local to routes that use it, or keep in services and call services.find_and_extract_sd
def find_and_extract_sd(tgz_path, resource_identifier):
    """Helper to find and extract SD json from a given tgz path by ID, Name, or Type."""
    sd_data = None; found_path = None; logger = current_app.logger  # Use current_app logger
    if not tgz_path or not os.path.exists(tgz_path): logger.error(f"File not found in find_and_extract_sd: {tgz_path}"); return None, None
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            logger.debug(f"Searching for SD matching '{resource_identifier}' in {os.path.basename(tgz_path)}")
            for member in tar:
                if member.isfile() and member.name.startswith('package/') and member.name.lower().endswith('.json'):
                    if os.path.basename(member.name).lower() in ['package.json', '.index.json', 'validation-summary.json', 'validation-oo.json']: continue
                    fileobj = None
                    try:
                        fileobj = tar.extractfile(member)
                        if fileobj:
                            content_bytes = fileobj.read(); content_string = content_bytes.decode('utf-8-sig'); data = json.loads(content_string)
                            if isinstance(data, dict) and data.get('resourceType') == 'StructureDefinition':
                                sd_id = data.get('id'); sd_name = data.get('name'); sd_type = data.get('type')
                                if resource_identifier == sd_type or resource_identifier == sd_id or resource_identifier == sd_name:
                                    sd_data = data; found_path = member.name; logger.info(f"Found matching SD for '{resource_identifier}' at path: {found_path}"); break
                    except Exception as e: logger.warning(f"Could not read/parse potential SD {member.name}: {e}")
                    finally:
                        if fileobj: fileobj.close()
        if sd_data is None: logger.warning(f"SD matching '{resource_identifier}' not found within archive {os.path.basename(tgz_path)}")
    except Exception as e: logger.error(f"Error reading archive {tgz_path} in find_and_extract_sd: {e}", exc_info=True); raise
    return sd_data, found_path
# --- End Helper ---


# --- Route for the main import page ---
@bp.route('/import-ig', methods=['GET', 'POST'])
@login_required
@admin_required
def import_ig():
    """Handles FHIR IG recursive download using services."""
    form = IgImportForm()
    template_context = {"title": "Import FHIR IG", "form": form, "results": None}
    if form.validate_on_submit():
        package_name = form.package_name.data; package_version = form.package_version.data
        template_context.update(package_name=package_name, package_version=package_version)
        flash(f"Starting full import for {package_name}#{package_version}...", "info"); current_app.logger.info(f"Calling import service for: {package_name}#{package_version}")
        try:
            # Call the CORRECT orchestrator service function
            import_results = services.import_package_and_dependencies(package_name, package_version)
            template_context["results"] = import_results
            # Flash summary messages
            dl_count = len(import_results.get('downloaded', {})); proc_count = len(import_results.get('processed', set())); error_count = len(import_results.get('errors', []))
            if dl_count > 0: flash(f"Downloaded/verified {dl_count} package file(s).", "success")
            if proc_count < dl_count and dl_count > 0: flash(f"Dependency data extraction failed for {dl_count - proc_count} package(s).", "warning")
            if error_count > 0: flash(f"{error_count} total error(s) occurred.", "danger")
            elif dl_count == 0 and error_count == 0: flash("No packages needed downloading or initial package failed.", "info")
            elif error_count == 0: flash("Import process completed successfully.", "success")
        except Exception as e:
            fatal_error = f"Critical unexpected error during import: {e}"; template_context["fatal_error"] = fatal_error; current_app.logger.error(f"Critical import error: {e}", exc_info=True); flash(fatal_error, "danger")
        return render_template('fhir_ig_importer/import_ig_page.html', **template_context)
    return render_template('fhir_ig_importer/import_ig_page.html', **template_context)


# --- Route to get StructureDefinition elements ---
@bp.route('/get-structure')
@login_required
@admin_required
def get_structure_definition():
    """API endpoint to fetch SD elements and pre-calculated Must Support paths."""
    package_name = request.args.get('package_name'); package_version = request.args.get('package_version'); resource_identifier = request.args.get('resource_type')
    error_response_data = {"elements": [], "must_support_paths": []}
    if not all([package_name, package_version, resource_identifier]): error_response_data["error"] = "Missing query parameters"; return jsonify(error_response_data), 400
    current_app.logger.info(f"Request for structure: {package_name}#{package_version} / {resource_identifier}")

    # Find the primary package file
    package_dir_name = 'fhir_packages'; download_dir = os.path.join(current_app.instance_path, package_dir_name)
    # Use service helper for consistency
    filename = services._construct_tgz_filename(package_name, package_version)
    tgz_path = os.path.join(download_dir, filename)
    if not os.path.exists(tgz_path): error_response_data["error"] = f"Package file not found: (unknown)"; return jsonify(error_response_data), 404

    sd_data = None; found_path = None; error_msg = None
    try:
        # Call the local helper function correctly
        sd_data, found_path = find_and_extract_sd(tgz_path, resource_identifier)
        # Fallback check
        if sd_data is None:
            core_pkg_name = "hl7.fhir.r4.core"; core_pkg_version = "4.0.1"  # TODO: Make dynamic
            core_filename = services._construct_tgz_filename(core_pkg_name, core_pkg_version)
            core_tgz_path = os.path.join(download_dir, core_filename)
            if os.path.exists(core_tgz_path):
                current_app.logger.info(f"Trying fallback search in {core_pkg_name}...")
                sd_data, found_path = find_and_extract_sd(core_tgz_path, resource_identifier)  # Call local helper
            else: current_app.logger.warning(f"Core package {core_tgz_path} not found.")
    except Exception as e:
        error_msg = f"Error searching package(s): {e}"; current_app.logger.error(error_msg, exc_info=True); error_response_data["error"] = error_msg; return jsonify(error_response_data), 500

    if sd_data is None: error_msg = f"SD for '{resource_identifier}' not found."; error_response_data["error"] = error_msg; return jsonify(error_response_data), 404

    # Extract elements
    elements = sd_data.get('snapshot', {}).get('element', [])
    if not elements: elements = sd_data.get('differential', {}).get('element', [])

    # Fetch pre-calculated Must Support paths from DB
    must_support_paths = []
    try:
        stmt = db.select(ProcessedIg).filter_by(package_name=package_name, package_version=package_version); processed_ig_record = db.session.scalar(stmt)
        if processed_ig_record: all_ms_paths_dict = processed_ig_record.must_support_elements; must_support_paths = all_ms_paths_dict.get(resource_identifier, [])
        else: current_app.logger.warning(f"No ProcessedIg record found for {package_name}#{package_version}")
    except Exception as e: current_app.logger.error(f"Error fetching MS paths from DB: {e}", exc_info=True)

    current_app.logger.info(f"Returning {len(elements)} elements for {resource_identifier} from {found_path or 'Unknown File'}")
    return jsonify({"elements": elements, "must_support_paths": must_support_paths})


# --- Route to get raw example file content ---
@bp.route('/get-example')
@login_required
@admin_required
def get_example_content():
    # ... (Function remains the same as response #147) ...
    package_name = request.args.get('package_name'); package_version = request.args.get('package_version'); example_member_path = request.args.get('filename')
    if not all([package_name, package_version, example_member_path]): return jsonify({"error": "Missing query parameters"}), 400
    current_app.logger.info(f"Request for example: {package_name}#{package_version} / {example_member_path}")
    package_dir_name = 'fhir_packages'; download_dir = os.path.join(current_app.instance_path, package_dir_name)
    pkg_filename = services._construct_tgz_filename(package_name, package_version)  # Use service helper
    tgz_path = os.path.join(download_dir, pkg_filename)
    if not os.path.exists(tgz_path): return jsonify({"error": f"Package file not found: {pkg_filename}"}), 404
    # Basic security check on member path
    safe_member_path = secure_filename(example_member_path.replace("package/", ""))  # Allow paths within package/
    if not example_member_path.startswith('package/') or '..' in example_member_path: return jsonify({"error": "Invalid example file path."}), 400

    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            try: example_member = tar.getmember(example_member_path)  # Use original path here
            except KeyError: return jsonify({"error": f"Example file '{example_member_path}' not found."}), 404
            example_fileobj = tar.extractfile(example_member)
            if not example_fileobj: return jsonify({"error": "Could not extract example file."}), 500
            try: content_bytes = example_fileobj.read()
            finally: example_fileobj.close()
            return content_bytes  # Return raw bytes
    except tarfile.TarError as e: err_msg = f"Error reading {tgz_path}: {e}"; current_app.logger.error(err_msg); return jsonify({"error": err_msg}), 500
    except Exception as e: err_msg = f"Unexpected error getting example {example_member_path}: {e}"; current_app.logger.error(err_msg, exc_info=True); return jsonify({"error": err_msg}), 500
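Both routes above pull a single member out of a .tgz FHIR package by path. A minimal, self-contained sketch of that pattern — it builds a throwaway archive first so it runs anywhere, and the package/member names are illustrative:

```python
import io
import json
import os
import tarfile
import tempfile

# Build a tiny package archive (stand-in for a downloaded IG .tgz)
tmpdir = tempfile.mkdtemp()
tgz_path = os.path.join(tmpdir, 'example.pkg-1.0.0.tgz')
payload = json.dumps({'resourceType': 'StructureDefinition', 'id': 'Patient'}).encode('utf-8')
with tarfile.open(tgz_path, 'w:gz') as tar:
    info = tarfile.TarInfo(name='package/StructureDefinition-Patient.json')
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# Extract one member by path, mirroring get_example_content:
# getmember raises KeyError for a missing path, extractfile yields a file object
with tarfile.open(tgz_path, 'r:gz') as tar:
    member = tar.getmember('package/StructureDefinition-Patient.json')
    fileobj = tar.extractfile(member)
    try:
        # utf-8-sig tolerates a BOM, as in find_and_extract_sd
        data = json.loads(fileobj.read().decode('utf-8-sig'))
    finally:
        fileobj.close()
```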
@ -1,345 +0,0 @@
|
||||
# app/modules/fhir_ig_importer/services.py
|
||||
|
||||
import requests
|
||||
import os
|
||||
import tarfile
|
||||
import gzip
|
||||
import json
|
||||
import io
|
||||
import re
|
||||
import logging
|
||||
from flask import current_app
|
||||
from collections import defaultdict
|
||||
|
||||
# Constants
|
||||
FHIR_REGISTRY_BASE_URL = "https://packages.fhir.org"
|
||||
DOWNLOAD_DIR_NAME = "fhir_packages"
|
||||
|
||||
# --- Helper Functions ---
|
||||
|
||||
def _get_download_dir():
|
||||
"""Gets the absolute path to the download directory, creating it if needed."""
|
||||
logger = logging.getLogger(__name__)
|
||||
instance_path = None # Initialize
|
||||
try:
|
||||
# --- FIX: Indent code inside try block ---
|
||||
instance_path = current_app.instance_path
|
||||
logger.debug(f"Using instance path from current_app: {instance_path}")
|
||||
except RuntimeError:
|
||||
# --- FIX: Indent code inside except block ---
|
||||
logger.warning("No app context for instance_path, constructing relative path.")
|
||||
instance_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'instance'))
|
||||
logger.debug(f"Constructed instance path: {instance_path}")
|
||||
|
||||
# This part depends on instance_path being set above
|
||||
if not instance_path:
|
||||
logger.error("Fatal Error: Could not determine instance path.")
|
||||
return None
|
||||
|
||||
download_dir = os.path.join(instance_path, DOWNLOAD_DIR_NAME)
|
||||
try:
|
||||
# --- FIX: Indent code inside try block ---
|
||||
os.makedirs(download_dir, exist_ok=True)
|
||||
return download_dir
|
||||
except OSError as e:
|
||||
# --- FIX: Indent code inside except block ---
|
||||
logger.error(f"Fatal Error creating dir {download_dir}: {e}", exc_info=True)
|
||||
return None
|
||||
|
||||
def sanitize_filename_part(text): # Public version
|
||||
"""Basic sanitization for name/version parts of filename."""
|
||||
# --- FIX: Indent function body ---
|
||||
safe_text = "".join(c if c.isalnum() or c in ['.', '-'] else '_' for c in text)
|
||||
safe_text = re.sub(r'_+', '_', safe_text) # Uses re
|
||||
safe_text = safe_text.strip('_-.')
|
||||
return safe_text if safe_text else "invalid_name"
|
||||
|
||||
def _construct_tgz_filename(name, version):
|
||||
"""Constructs the standard filename using the sanitized parts."""
|
||||
# --- FIX: Indent function body ---
|
||||
return f"{sanitize_filename_part(name)}-{sanitize_filename_part(version)}.tgz"
|
||||
|
||||
def find_and_extract_sd(tgz_path, resource_identifier): # Public version
|
||||
"""Helper to find and extract SD json from a given tgz path by ID, Name, or Type."""
|
||||
# --- FIX: Ensure consistent indentation ---
|
||||
sd_data = None
|
||||
found_path = None
|
||||
logger = logging.getLogger(__name__)
|
||||
if not tgz_path or not os.path.exists(tgz_path):
|
||||
logger.error(f"File not found in find_and_extract_sd: {tgz_path}")
|
||||
return None, None
|
||||
try:
|
||||
with tarfile.open(tgz_path, "r:gz") as tar:
|
||||
logger.debug(f"Searching for SD matching '{resource_identifier}' in {os.path.basename(tgz_path)}")
|
||||
for member in tar:
|
||||
if not (member.isfile() and member.name.startswith('package/') and member.name.lower().endswith('.json')):
|
||||
continue
|
||||
if os.path.basename(member.name).lower() in ['package.json', '.index.json', 'validation-summary.json', 'validation-oo.json']:
|
||||
continue
|
||||
|
||||
fileobj = None
|
||||
try:
|
||||
fileobj = tar.extractfile(member)
|
||||
if fileobj:
|
||||
content_bytes = fileobj.read()
|
||||
content_string = content_bytes.decode('utf-8-sig')
|
||||
data = json.loads(content_string)
|
||||
if isinstance(data, dict) and data.get('resourceType') == 'StructureDefinition':
|
||||
sd_id = data.get('id')
|
||||
sd_name = data.get('name')
|
||||
sd_type = data.get('type')
|
||||
# Match if requested identifier matches ID, Name, or Base Type
|
||||
if resource_identifier == sd_type or resource_identifier == sd_id or resource_identifier == sd_name:
|
||||
sd_data = data
|
||||
found_path = member.name
|
||||
logger.info(f"Found matching SD for '{resource_identifier}' at path: {found_path}")
|
||||
break # Stop searching once found
|
||||
except Exception as e:
|
||||
# Log issues reading/parsing individual files but continue search
|
||||
logger.warning(f"Could not read/parse potential SD {member.name}: {e}")
|
||||
finally:
|
||||
if fileobj: fileobj.close() # Ensure resource cleanup
|
||||
|
||||
if sd_data is None:
|
||||
logger.warning(f"SD matching '{resource_identifier}' not found within archive {os.path.basename(tgz_path)}")
|
||||
except tarfile.TarError as e:
|
||||
logger.error(f"TarError reading {tgz_path} in find_and_extract_sd: {e}")
|
||||
raise tarfile.TarError(f"Error reading package archive: {e}") from e
|
||||
except FileNotFoundError as e:
|
||||
logger.error(f"FileNotFoundError reading {tgz_path} in find_and_extract_sd: {e}")
|
||||
raise
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error in find_and_extract_sd for {tgz_path}: {e}", exc_info=True)
|
||||
raise
|
||||
return sd_data, found_path
|
||||
|
||||
# --- Core Service Functions ---

def download_package(name, version):
    """Downloads a single FHIR package. Returns (save_path, error_message)."""
    logger = logging.getLogger(__name__)
    download_dir = _get_download_dir()
    if not download_dir:
        return None, "Could not get/create download directory."

    package_id = f"{name}#{version}"
    package_url = f"{FHIR_REGISTRY_BASE_URL}/{name}/{version}"
    filename = _construct_tgz_filename(name, version)  # Uses the public sanitizer via the helper
    save_path = os.path.join(download_dir, filename)

    if os.path.exists(save_path):
        logger.info(f"Exists: (unknown)")
        return save_path, None

    logger.info(f"Downloading: {package_id} -> (unknown)")
    try:
        with requests.get(package_url, stream=True, timeout=90) as r:
            r.raise_for_status()
            with open(save_path, 'wb') as f:
                logger.debug(f"Opened {save_path} for writing.")
                for chunk in r.iter_content(chunk_size=8192):
                    if chunk:
                        f.write(chunk)
        logger.info(f"Success: Downloaded (unknown)")
        return save_path, None
    except requests.exceptions.RequestException as e:
        err_msg = f"Download error for {package_id}: {e}"
        logger.error(err_msg)
        return None, err_msg
    except OSError as e:
        err_msg = f"File save error for (unknown): {e}"
        logger.error(err_msg)
        return None, err_msg
    except Exception as e:
        err_msg = f"Unexpected download error for {package_id}: {e}"
        logger.error(err_msg, exc_info=True)
        return None, err_msg

def extract_dependencies(tgz_path):
    """Extracts the dependencies dict from package.json. Returns (dep_dict or None on error, error_message)."""
    logger = logging.getLogger(__name__)
    package_json_path = "package/package.json"
    dependencies = {}
    error_message = None
    if not tgz_path or not os.path.exists(tgz_path):
        return None, f"File not found at {tgz_path}"
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            package_json_member = tar.getmember(package_json_path)
            package_json_fileobj = tar.extractfile(package_json_member)
            if package_json_fileobj:
                try:
                    package_data = json.loads(package_json_fileobj.read().decode('utf-8-sig'))
                    dependencies = package_data.get('dependencies', {})
                finally:
                    package_json_fileobj.close()
            else:
                raise FileNotFoundError(f"Could not extract {package_json_path}")
    except KeyError:
        error_message = f"'{package_json_path}' not found in {os.path.basename(tgz_path)}."
        logger.warning(error_message)  # A missing package.json is acceptable
    except (json.JSONDecodeError, UnicodeDecodeError) as e:
        error_message = f"Parse error in {package_json_path}: {e}"
        logger.error(error_message)
        dependencies = None  # Parsing failed
    except (tarfile.TarError, FileNotFoundError) as e:
        error_message = f"Archive error {os.path.basename(tgz_path)}: {e}"
        logger.error(error_message)
        dependencies = None  # Archive read failed
    except Exception as e:
        error_message = f"Unexpected error extracting deps: {e}"
        logger.error(error_message, exc_info=True)
        dependencies = None
    return dependencies, error_message

# --- Recursive Import Orchestrator ---
def import_package_and_dependencies(initial_name, initial_version):
    """Orchestrates recursive download and dependency extraction."""
    logger = logging.getLogger(__name__)
    logger.info(f"Starting recursive import for {initial_name}#{initial_version}")
    results = {'requested': (initial_name, initial_version), 'processed': set(), 'downloaded': {}, 'all_dependencies': {}, 'errors': []}
    pending_queue = [(initial_name, initial_version)]
    processed_lookup = set()

    while pending_queue:
        name, version = pending_queue.pop(0)
        package_id_tuple = (name, version)

        if package_id_tuple in processed_lookup:
            continue

        logger.info(f"Processing: {name}#{version}")
        processed_lookup.add(package_id_tuple)

        save_path, dl_error = download_package(name, version)

        if dl_error:
            error_msg = f"Download failed for {name}#{version}: {dl_error}"
            results['errors'].append(error_msg)
            if package_id_tuple == results['requested']:
                logger.error("Aborting import: Initial package download failed.")
                break
            else:
                continue
        else:  # Download OK
            results['downloaded'][package_id_tuple] = save_path
            dependencies, dep_error = extract_dependencies(save_path)
            if dep_error:
                results['errors'].append(f"Dependency extraction failed for {name}#{version}: {dep_error}")
            elif dependencies is not None:
                results['all_dependencies'][package_id_tuple] = dependencies
                results['processed'].add(package_id_tuple)
                logger.debug(f"Dependencies for {name}#{version}: {list(dependencies.keys())}")
                for dep_name, dep_version in dependencies.items():
                    if isinstance(dep_name, str) and isinstance(dep_version, str) and dep_name and dep_version:
                        dep_tuple = (dep_name, dep_version)
                        if dep_tuple not in processed_lookup and dep_tuple not in pending_queue:
                            pending_queue.append(dep_tuple)
                            logger.debug(f"Added to queue: {dep_name}#{dep_version}")
                    else:
                        logger.warning(f"Skipping invalid dependency '{dep_name}': '{dep_version}' in {name}#{version}")

    proc_count = len(results['processed'])
    dl_count = len(results['downloaded'])
    err_count = len(results['errors'])
    logger.info(f"Import finished. Processed: {proc_count}, Downloaded/Verified: {dl_count}, Errors: {err_count}")
    return results

# --- Package File Content Processor (V6.2 - Fixed MS path handling) ---
def process_package_file(tgz_path):
    """Extracts types, profile status, must-support (MS) elements, and examples from a downloaded .tgz package in a single pass."""
    logger = logging.getLogger(__name__)
    logger.info(f"Processing package file details (V6.2 logic): {tgz_path}")

    results = {'resource_types_info': [], 'must_support_elements': {}, 'examples': {}, 'errors': []}
    resource_info = defaultdict(lambda: {'name': None, 'type': None, 'is_profile': False, 'ms_flag': False, 'ms_paths': set(), 'examples': set()})

    if not tgz_path or not os.path.exists(tgz_path):
        results['errors'].append(f"Package file not found: {tgz_path}")
        return results

    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            for member in tar:
                if not member.isfile() or not member.name.startswith('package/') or not member.name.lower().endswith(('.json', '.xml', '.html')):
                    continue
                member_name_lower = member.name.lower()
                base_filename_lower = os.path.basename(member_name_lower)
                fileobj = None
                if base_filename_lower in ['package.json', '.index.json', 'validation-summary.json', 'validation-oo.json']:
                    continue

                is_example = member.name.startswith('package/example/') or 'example' in base_filename_lower
                is_json = member_name_lower.endswith('.json')

                try:  # Process an individual member
                    if is_json:
                        fileobj = tar.extractfile(member)
                        if not fileobj:
                            continue
                        content_bytes = fileobj.read()
                        content_string = content_bytes.decode('utf-8-sig')
                        data = json.loads(content_string)
                        if not isinstance(data, dict) or 'resourceType' not in data:
                            continue

                        resource_type = data['resourceType']
                        entry_key = resource_type
                        is_sd = False

                        if resource_type == 'StructureDefinition':
                            is_sd = True
                            profile_id = data.get('id') or data.get('name')
                            sd_type = data.get('type')
                            sd_base = data.get('baseDefinition')
                            is_profile_sd = bool(sd_base)
                            if not profile_id or not sd_type:
                                logger.warning(f"SD missing ID or Type: {member.name}")
                                continue
                            entry_key = profile_id

                        entry = resource_info[entry_key]
                        entry.setdefault('type', resource_type)  # Ensure the type key exists

                        if is_sd:
                            entry['name'] = entry_key
                            entry['type'] = sd_type
                            entry['is_profile'] = is_profile_sd
                            if not entry.get('sd_processed'):
                                has_ms = False
                                ms_paths_for_sd = set()
                                for element_list in [data.get('snapshot', {}).get('element', []), data.get('differential', {}).get('element', [])]:
                                    for element in element_list:
                                        if isinstance(element, dict) and element.get('mustSupport') is True:
                                            # Check the path safely: only record MS elements that have a path
                                            element_path = element.get('path')
                                            if element_path:
                                                ms_paths_for_sd.add(element_path)
                                                has_ms = True  # Mark MS found once a path is added
                                            else:
                                                logger.warning(f"Found mustSupport=true without path in element of {entry_key}")
                                if ms_paths_for_sd:
                                    entry['ms_paths'] = ms_paths_for_sd  # Store the set of paths
                                if has_ms:
                                    entry['ms_flag'] = True
                                    logger.debug(f"  Found MS elements in {entry_key}")
                                entry['sd_processed'] = True  # Mark the MS check as done

                        elif is_example:  # JSON example
                            key_to_use = None
                            profile_meta = data.get('meta', {}).get('profile', [])
                            if profile_meta and isinstance(profile_meta, list):
                                for profile_url in profile_meta:
                                    profile_id_from_meta = profile_url.split('/')[-1]
                                    if profile_id_from_meta in resource_info:
                                        key_to_use = profile_id_from_meta
                                        break
                            if not key_to_use:
                                key_to_use = resource_type
                            if key_to_use not in resource_info:
                                resource_info[key_to_use].update({'name': key_to_use, 'type': resource_type})
                            resource_info[key_to_use]['examples'].add(member.name)

                    elif is_example:  # XML/HTML examples
                        # Associate non-JSON examples by guessing the profile/type from the filename
                        guessed_type = base_filename_lower.split('-')[0].capitalize()
                        guessed_profile_id = base_filename_lower.split('-')[0]
                        key_to_use = None
                        if guessed_profile_id in resource_info:
                            key_to_use = guessed_profile_id
                        elif guessed_type in resource_info:
                            key_to_use = guessed_type
                        if key_to_use:
                            resource_info[key_to_use]['examples'].add(member.name)
                        else:
                            logger.warning(f"Could not associate non-JSON example {member.name}")

                except Exception as e:
                    logger.warning(f"Could not process member {member.name}: {e}", exc_info=False)
                finally:
                    if fileobj:
                        fileobj.close()
            # -- End member loop --

            # Final formatting (kept inside the main try block)
            final_list = []
            final_ms_elements = {}
            final_examples = {}
            logger.debug(f"Formatting results from resource_info keys: {list(resource_info.keys())}")
            for key, info in resource_info.items():
                display_name = info.get('name') or key
                base_type = info.get('type')
                if display_name or base_type:
                    logger.debug(f"  Formatting item '{display_name}': type='{base_type}', profile='{info.get('is_profile', False)}', ms_flag='{info.get('ms_flag', False)}'")
                    final_list.append({'name': display_name, 'type': base_type, 'is_profile': info.get('is_profile', False), 'must_support': info.get('ms_flag', False)})  # 'must_support' mirrors 'ms_flag'
                    if info['ms_paths']:
                        final_ms_elements[display_name] = sorted(list(info['ms_paths']))
                    if info['examples']:
                        final_examples[display_name] = sorted(list(info['examples']))
                else:
                    logger.warning(f"Skipping formatting for key: {key}")

            results['resource_types_info'] = sorted(final_list, key=lambda x: (not x.get('is_profile', False), x.get('name', '')))
            results['must_support_elements'] = final_ms_elements
            results['examples'] = final_examples

    except Exception as e:
        err_msg = f"Error processing package file {tgz_path}: {e}"
        logger.error(err_msg, exc_info=True)
        results['errors'].append(err_msg)

    # Log summary counts
    final_types_count = len(results['resource_types_info'])
    ms_count = sum(1 for r in results['resource_types_info'] if r['must_support'])
    total_ms_paths = sum(len(v) for v in results['must_support_elements'].values())
    total_examples = sum(len(v) for v in results['examples'].values())
    logger.info(f"V6.2 extraction: {final_types_count} items ({ms_count} MS; {total_ms_paths} MS paths; {total_examples} examples) from {os.path.basename(tgz_path)}")

    return results
@ -1,364 +0,0 @@
# app/modules/fhir_ig_importer/services.py

import requests
import os
import tarfile
import gzip
import json
import io
import re
import logging
from flask import current_app
from collections import defaultdict

# Constants
FHIR_REGISTRY_BASE_URL = "https://packages.fhir.org"
DOWNLOAD_DIR_NAME = "fhir_packages"

# --- Helper Functions ---

def _get_download_dir():
    """Gets the absolute path to the download directory, creating it if needed."""
    logger = logging.getLogger(__name__)
    try:
        instance_path = current_app.instance_path
    except RuntimeError:
        logger.warning("No app context for instance_path, constructing relative path.")
        instance_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'instance'))
        logger.debug(f"Constructed instance path: {instance_path}")
    download_dir = os.path.join(instance_path, DOWNLOAD_DIR_NAME)
    try:
        os.makedirs(download_dir, exist_ok=True)
        return download_dir
    except OSError as e:
        logger.error(f"Fatal Error: Could not create dir {download_dir}: {e}", exc_info=True)
        return None

def sanitize_filename_part(text):
    """Basic sanitization for creating filenames."""
    safe_text = "".join(c if c.isalnum() or c in ['.', '-'] else '_' for c in text)
    safe_text = re.sub(r'_+', '_', safe_text)
    safe_text = safe_text.strip('_-.')
    return safe_text if safe_text else "invalid_name"

def _construct_tgz_filename(name, version):
    return f"{sanitize_filename_part(name)}-{sanitize_filename_part(version)}.tgz"

# --- Helper to Find/Extract SD ---
def _find_and_extract_sd(tgz_path, resource_type_to_find):
    sd_data = None
    found_path = None
    logger = current_app.logger if current_app else logging.getLogger(__name__)
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            logger.debug(f"Searching for SD type '{resource_type_to_find}' in {tgz_path}")
            potential_paths = [
                f'package/StructureDefinition-{resource_type_to_find.lower()}.json',
                f'package/StructureDefinition-{resource_type_to_find}.json'
            ]
            member_found = None
            for potential_path in potential_paths:
                try:
                    member_found = tar.getmember(potential_path)
                    if member_found:
                        break
                except KeyError:
                    pass

            if not member_found:
                for member in tar:
                    if member.isfile() and member.name.startswith('package/') and member.name.lower().endswith('.json'):
                        filename_lower = os.path.basename(member.name).lower()
                        if filename_lower in ['package.json', '.index.json', 'validation-summary.json', 'validation-oo.json']:
                            continue
                        sd_fileobj = None
                        try:
                            sd_fileobj = tar.extractfile(member)
                            if sd_fileobj:
                                content_bytes = sd_fileobj.read()
                                content_string = content_bytes.decode('utf-8-sig')
                                data = json.loads(content_string)
                                if isinstance(data, dict) and data.get('resourceType') == 'StructureDefinition' and data.get('type') == resource_type_to_find:
                                    member_found = member
                                    break
                        except Exception:
                            pass
                        finally:
                            if sd_fileobj:
                                sd_fileobj.close()

            if member_found:
                sd_fileobj = None
                try:
                    sd_fileobj = tar.extractfile(member_found)
                    if sd_fileobj:
                        content_bytes = sd_fileobj.read()
                        content_string = content_bytes.decode('utf-8-sig')
                        sd_data = json.loads(content_string)
                        found_path = member_found.name
                        logger.info(f"Found matching SD at path: {found_path}")
                except Exception as e:
                    logger.warning(f"Could not read/parse member {member_found.name} after finding it: {e}")
                    sd_data = None
                    found_path = None
                finally:
                    if sd_fileobj:
                        sd_fileobj.close()

    except tarfile.TarError as e:
        logger.error(f"TarError reading {tgz_path}: {e}")
        raise
    except FileNotFoundError:
        logger.error(f"FileNotFoundError reading {tgz_path}")
        raise
    except Exception as e:
        logger.error(f"Unexpected error in _find_and_extract_sd for {tgz_path}: {e}", exc_info=True)
        raise
    return sd_data, found_path

# --- Core Service Functions ---
def download_package(name, version):
    logger = logging.getLogger(__name__)
    download_dir = _get_download_dir()
    if not download_dir:
        return None, "Could not get/create download directory."

    package_id = f"{name}#{version}"
    package_url = f"{FHIR_REGISTRY_BASE_URL}/{name}/{version}"
    filename = _construct_tgz_filename(name, version)
    save_path = os.path.join(download_dir, filename)

    if os.path.exists(save_path):
        logger.info(f"Package already exists: (unknown)")
        return save_path, None

    logger.info(f"Downloading: {package_id} -> (unknown)")
    try:
        with requests.get(package_url, stream=True, timeout=90) as r:
            r.raise_for_status()
            with open(save_path, 'wb') as f:
                for chunk in r.iter_content(chunk_size=8192):
                    f.write(chunk)
        logger.info(f"Success: Downloaded (unknown)")
        return save_path, None
    except requests.exceptions.RequestException as e:
        err_msg = f"Download error for {package_id}: {e}"
        logger.error(err_msg)
        return None, err_msg
    except OSError as e:
        err_msg = f"File save error for (unknown): {e}"
        logger.error(err_msg)
        return None, err_msg
    except Exception as e:
        err_msg = f"Unexpected download error for {package_id}: {e}"
        logger.error(err_msg, exc_info=True)
        return None, err_msg

def extract_dependencies(tgz_path):
    logger = logging.getLogger(__name__)
    package_json_path = "package/package.json"
    dependencies = {}
    error_message = None
    if not tgz_path or not os.path.exists(tgz_path):
        return None, f"File not found at {tgz_path}"
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            package_json_member = tar.getmember(package_json_path)
            package_json_fileobj = tar.extractfile(package_json_member)
            if package_json_fileobj:
                try:
                    package_data = json.loads(package_json_fileobj.read().decode('utf-8-sig'))
                    dependencies = package_data.get('dependencies', {})
                finally:
                    package_json_fileobj.close()
            else:
                raise FileNotFoundError(f"Could not extract {package_json_path}")
    except KeyError:
        error_message = f"'{package_json_path}' not found in {os.path.basename(tgz_path)}."
        logger.warning(error_message)
    except (json.JSONDecodeError, UnicodeDecodeError) as e:
        error_message = f"Parse error in {package_json_path} from {os.path.basename(tgz_path)}: {e}"
        logger.error(error_message)
        dependencies = None
    except (tarfile.TarError, FileNotFoundError) as e:
        error_message = f"Archive error {os.path.basename(tgz_path)}: {e}"
        logger.error(error_message)
        dependencies = None
    except Exception as e:
        error_message = f"Unexpected error extracting deps: {e}"
        logger.error(error_message, exc_info=True)
        dependencies = None
    return dependencies, error_message

def import_package_and_dependencies(initial_name, initial_version):
    logger = logging.getLogger(__name__)
    logger.info(f"Starting recursive import for {initial_name}#{initial_version}")
    results = {'requested': (initial_name, initial_version), 'processed': set(), 'downloaded': {}, 'all_dependencies': {}, 'errors': []}
    pending_queue = [(initial_name, initial_version)]
    processed_lookup = set()

    while pending_queue:
        name, version = pending_queue.pop(0)
        package_id_tuple = (name, version)

        if package_id_tuple in processed_lookup:
            continue

        logger.info(f"Processing: {name}#{version}")
        processed_lookup.add(package_id_tuple)

        save_path, dl_error = download_package(name, version)

        if dl_error:
            error_msg = f"Download failed for {name}#{version}: {dl_error}"
            results['errors'].append(error_msg)
            if package_id_tuple == results['requested']:
                logger.error("Aborting import: Initial package download failed.")
                break
            else:
                continue
        else:
            results['downloaded'][package_id_tuple] = save_path
            dependencies, dep_error = extract_dependencies(save_path)

            if dep_error:
                results['errors'].append(f"Dependency extraction failed for {name}#{version}: {dep_error}")
            elif dependencies is not None:
                results['all_dependencies'][package_id_tuple] = dependencies
                results['processed'].add(package_id_tuple)
                logger.debug(f"Dependencies for {name}#{version}: {list(dependencies.keys())}")
                for dep_name, dep_version in dependencies.items():
                    if isinstance(dep_name, str) and isinstance(dep_version, str) and dep_name and dep_version:
                        dep_tuple = (dep_name, dep_version)
                        if dep_tuple not in processed_lookup and dep_tuple not in pending_queue:
                            pending_queue.append(dep_tuple)
                            logger.debug(f"Added to queue: {dep_name}#{dep_version}")
                    else:
                        logger.warning(f"Skipping invalid dependency entry '{dep_name}': '{dep_version}' in {name}#{version}")

    proc_count = len(results['processed'])
    dl_count = len(results['downloaded'])
    err_count = len(results['errors'])
    logger.info(f"Import finished. Processed: {proc_count}, Downloaded/Verified: {dl_count}, Errors: {err_count}")
    return results

def process_package_file(tgz_path):
    logger = logging.getLogger(__name__)
    logger.info(f"Processing package file details: {tgz_path}")
    results = {'resource_types_info': [], 'must_support_elements': {}, 'examples': {}, 'errors': []}
    resource_info = {}

    if not os.path.exists(tgz_path):
        results['errors'].append(f"Package file not found: {tgz_path}")
        return results

    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            for member in tar:
                if not member.isfile():
                    continue
                member_name_lower = member.name.lower()
                base_filename_lower = os.path.basename(member_name_lower)
                fileobj = None

                if member.name.startswith('package/') and member_name_lower.endswith('.json') and \
                        base_filename_lower not in ['package.json', '.index.json', 'validation-summary.json']:
                    try:
                        fileobj = tar.extractfile(member)
                        if not fileobj:
                            continue
                        content_string = fileobj.read().decode('utf-8-sig')
                        data = json.loads(content_string)

                        if isinstance(data, dict) and data.get('resourceType'):
                            resource_type = data['resourceType']

                            if member.name.startswith('package/example/'):
                                ex_type = resource_type
                                entry = resource_info.setdefault(ex_type, {
                                    'base_type': ex_type,
                                    'ms_flag': False,
                                    'ms_paths': [],
                                    'examples': [],
                                    'sd_processed': False
                                })
                                entry['examples'].append(member.name)
                                continue

                            if resource_type == 'StructureDefinition':
                                profile_id = data.get('id') or data.get('name')
                                fhir_type = data.get('type')

                                if not profile_id:
                                    logger.warning(f"StructureDefinition missing id or name: {member.name}")
                                    continue

                                entry = resource_info.setdefault(profile_id, {
                                    'base_type': fhir_type,
                                    'ms_flag': False,
                                    'ms_paths': [],
                                    'examples': [],
                                    'sd_processed': False
                                })

                                if entry['sd_processed']:
                                    continue

                                ms_paths = []
                                has_ms = False
                                for element_list in [data.get('snapshot', {}).get('element', []), data.get('differential', {}).get('element', [])]:
                                    for element in element_list:
                                        if not isinstance(element, dict):
                                            continue
                                        if element.get('mustSupport') is True:
                                            path = element.get('path')
                                            if path:
                                                ms_paths.append(path)
                                                has_ms = True
                                            for t in element.get('type', []):
                                                for ext in t.get('extension', []):
                                                    ext_url = ext.get('url')
                                                    if ext_url:
                                                        ms_paths.append(f"{path}.type.extension[{ext_url}]")
                                                        has_ms = True
                                            for ext in element.get('extension', []):
                                                ext_url = ext.get('url')
                                                if ext_url:
                                                    ms_paths.append(f"{path}.extension[{ext_url}]")
                                                    has_ms = True
                                if ms_paths:
                                    entry['ms_paths'] = sorted(set(ms_paths))
                                if has_ms:
                                    entry['ms_flag'] = True
                                entry['sd_processed'] = True

                    except Exception as e:
                        logger.warning(f"Could not read/parse member {member.name}: {e}")
                    finally:
                        if fileobj:
                            fileobj.close()

                elif (member.name.startswith('package/example/') or ('example' in base_filename_lower and member.name.startswith('package/'))) \
                        and (member_name_lower.endswith('.xml') or member_name_lower.endswith('.html')):
                    guessed_type = base_filename_lower.split('-', 1)[0].capitalize()
                    if guessed_type in resource_info:
                        resource_info[guessed_type]['examples'].append(member.name)

    except Exception as e:
        err_msg = f"Error processing package file {tgz_path}: {e}"
        logger.error(err_msg, exc_info=True)
        results['errors'].append(err_msg)

    # --- New logic: merge profiles of the same base_type ---
    merged_info = {}
    grouped_by_type = defaultdict(list)

    for profile_id, entry in resource_info.items():
        base_type = entry['base_type'] or profile_id
        grouped_by_type[base_type].append((profile_id, entry))

    for base_type, profiles in grouped_by_type.items():
        merged_paths = set()
        merged_examples = []
        has_ms = False

        for _, profile_entry in profiles:
            merged_paths.update(profile_entry.get('ms_paths', []))
            merged_examples.extend(profile_entry.get('examples', []))
            if profile_entry.get('ms_flag'):
                has_ms = True

        merged_info[base_type] = {
            'base_type': base_type,
            'ms_flag': has_ms,
            'ms_paths': sorted(merged_paths),
            'examples': sorted(merged_examples),
        }

    results['resource_types_info'] = sorted([
        {'name': k, 'base_type': v.get('base_type'), 'must_support': v['ms_flag']}
        for k, v in merged_info.items()
    ], key=lambda x: x['name'])

    results['must_support_elements'] = {
        k: v['ms_paths'] for k, v in merged_info.items() if v['ms_paths']
    }

    results['examples'] = {
        k: v['examples'] for k, v in merged_info.items() if v['examples']
    }

    logger.info(f"Extracted {len(results['resource_types_info'])} profiles "
                f"({sum(1 for r in results['resource_types_info'] if r['must_support'])} with MS; "
                f"{sum(len(v) for v in results['must_support_elements'].values())} MS paths; "
                f"{sum(len(v) for v in results['examples'].values())} examples) from {tgz_path}")

    return results

# --- Remove or Comment Out old/unused functions ---
# def _fetch_package_metadata(package_name, package_version): ... (REMOVED)
# def resolve_all_dependencies(initial_package_name, initial_package_version): ... (REMOVED)
# def process_ig_import(package_name, package_version): ... (OLD orchestrator - REMOVED)
@ -1,315 +0,0 @@
|
||||
# app/modules/fhir_ig_importer/services.py
|
||||
|
||||
import requests
|
||||
import os
|
||||
import tarfile
|
||||
import gzip
|
||||
import json
|
||||
import io
|
||||
import re
|
||||
import logging
|
||||
from flask import current_app
|
||||
|
||||
# Constants
|
||||
FHIR_REGISTRY_BASE_URL = "https://packages.fhir.org"
|
||||
DOWNLOAD_DIR_NAME = "fhir_packages"
|
||||
|
||||
# --- Helper Functions ---
|
||||
|
||||
def _get_download_dir():
|
||||
"""Gets the absolute path to the download directory, creating it if needed."""
|
||||
logger = logging.getLogger(__name__)
|
||||
try:
|
||||
instance_path = current_app.instance_path
|
||||
# logger.debug(f"Using instance path from current_app: {instance_path}") # Can be noisy
|
||||
except RuntimeError:
|
||||
logger.warning("No app context for instance_path, constructing relative path.")
|
||||
instance_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'instance'))
|
||||
logger.debug(f"Constructed instance path: {instance_path}")
|
||||
download_dir = os.path.join(instance_path, DOWNLOAD_DIR_NAME)
|
||||
try:
|
||||
os.makedirs(download_dir, exist_ok=True)
|
||||
return download_dir
|
||||
except OSError as e:
|
||||
logger.error(f"Fatal Error: Could not create dir {download_dir}: {e}", exc_info=True)
|
||||
return None
|
||||
|
||||
def sanitize_filename_part(text):
|
||||
"""Basic sanitization for creating filenames."""
|
||||
# Replace common invalid chars, keep ., -
|
||||
safe_text = "".join(c if c.isalnum() or c in ['.', '-'] else '_' for c in text)
|
||||
# Replace multiple underscores with single one
|
||||
safe_text = re.sub(r'_+', '_', safe_text)
|
||||
# Remove leading/trailing underscores/hyphens/periods
|
||||
safe_text = safe_text.strip('_-.')
|
||||
return safe_text if safe_text else "invalid_name" # Ensure not empty
|
||||
|
||||
def _construct_tgz_filename(name, version):
|
||||
"""Constructs the standard filename for the package."""
|
||||
return f"{sanitize_filename_part(name)}-{sanitize_filename_part(version)}.tgz"


# --- Helper to Find/Extract SD ---
def _find_and_extract_sd(tgz_path, resource_type_to_find):
    """Helper to find and extract SD json from a given tgz path."""
    sd_data = None
    found_path = None
    logger = current_app.logger if current_app else logging.getLogger(__name__)  # Use app logger if possible
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            logger.debug(f"Searching for SD type '{resource_type_to_find}' in {tgz_path}")
            # Prioritize paths like 'package/StructureDefinition-[Type].json'
            potential_paths = [
                f'package/StructureDefinition-{resource_type_to_find.lower()}.json',
                f'package/StructureDefinition-{resource_type_to_find}.json'
            ]
            member_found = None
            for potential_path in potential_paths:
                try:
                    member_found = tar.getmember(potential_path)
                    if member_found:
                        break
                except KeyError:
                    pass

            # If specific paths failed, iterate
            if not member_found:
                for member in tar:
                    if member.isfile() and member.name.startswith('package/') and member.name.lower().endswith('.json'):
                        filename_lower = os.path.basename(member.name).lower()
                        if filename_lower in ['package.json', '.index.json', 'validation-summary.json', 'validation-oo.json']:
                            continue
                        sd_fileobj = None
                        try:
                            sd_fileobj = tar.extractfile(member)
                            if sd_fileobj:
                                content_bytes = sd_fileobj.read()
                                content_string = content_bytes.decode('utf-8-sig')
                                data = json.loads(content_string)
                                if isinstance(data, dict) and data.get('resourceType') == 'StructureDefinition' and data.get('type') == resource_type_to_find:
                                    member_found = member
                                    break
                        except Exception:
                            pass
                        finally:
                            if sd_fileobj:
                                sd_fileobj.close()

            if member_found:
                sd_fileobj = None
                try:
                    sd_fileobj = tar.extractfile(member_found)
                    if sd_fileobj:
                        content_bytes = sd_fileobj.read()
                        content_string = content_bytes.decode('utf-8-sig')
                        sd_data = json.loads(content_string)
                        found_path = member_found.name
                        logger.info(f"Found matching SD at path: {found_path}")
                except Exception as e:
                    logger.warning(f"Could not read/parse member {member_found.name} after finding it: {e}")
                    sd_data = None
                    found_path = None
                finally:
                    if sd_fileobj:
                        sd_fileobj.close()

    except tarfile.TarError as e:
        logger.error(f"TarError reading {tgz_path}: {e}")
        raise
    except FileNotFoundError:
        logger.error(f"FileNotFoundError reading {tgz_path}")
        raise
    except Exception as e:
        logger.error(f"Unexpected error in _find_and_extract_sd for {tgz_path}: {e}", exc_info=True)
        raise
    return sd_data, found_path
# --- End Helper ---
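The member-lookup pattern used by the helper above can be exercised against a synthetic archive. This is a minimal, self-contained sketch (the file name and StructureDefinition content are illustrative, not taken from a real package):

```python
import io
import json
import tarfile

# Build a tiny .tgz in memory mimicking the
# 'package/StructureDefinition-<type>.json' layout.
sd = {"resourceType": "StructureDefinition", "type": "Patient", "name": "DemoPatient"}
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    payload = json.dumps(sd).encode("utf-8")
    info = tarfile.TarInfo(name="package/StructureDefinition-patient.json")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))
buf.seek(0)

# Same lookup the helper tries first: exact member path, then decode
# with 'utf-8-sig' to tolerate a BOM.
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    member = tar.getmember("package/StructureDefinition-patient.json")
    data = json.loads(tar.extractfile(member).read().decode("utf-8-sig"))

print(data["type"])  # Patient
```

Decoding with `utf-8-sig` matters because some published packages include a byte-order mark that plain `utf-8` would surface as a stray character.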


# --- Core Service Functions ---

def download_package(name, version):
    """ Downloads a single FHIR package. Returns (save_path, error_message) """
    logger = logging.getLogger(__name__)
    download_dir = _get_download_dir()
    if not download_dir:
        return None, "Could not get/create download directory."

    package_id = f"{name}#{version}"
    package_url = f"{FHIR_REGISTRY_BASE_URL}/{name}/{version}"
    filename = _construct_tgz_filename(name, version)
    save_path = os.path.join(download_dir, filename)

    if os.path.exists(save_path):
        logger.info(f"Package already exists: {save_path}")
        return save_path, None

    logger.info(f"Downloading: {package_id} -> {save_path}")
    try:
        with requests.get(package_url, stream=True, timeout=90) as r:
            r.raise_for_status()
            with open(save_path, 'wb') as f:
                for chunk in r.iter_content(chunk_size=8192):
                    f.write(chunk)
        logger.info(f"Success: Downloaded {filename}")
        return save_path, None
    except requests.exceptions.RequestException as e:
        err_msg = f"Download error for {package_id}: {e}"
        logger.error(err_msg)
        return None, err_msg
    except OSError as e:
        err_msg = f"File save error for {save_path}: {e}"
        logger.error(err_msg)
        return None, err_msg
    except Exception as e:
        err_msg = f"Unexpected download error for {package_id}: {e}"
        logger.error(err_msg, exc_info=True)
        return None, err_msg


def extract_dependencies(tgz_path):
    """ Extracts dependencies dict from package.json. Returns (dep_dict, error_message) """
    logger = logging.getLogger(__name__)
    package_json_path = "package/package.json"
    dependencies = {}
    error_message = None
    if not tgz_path or not os.path.exists(tgz_path):
        return None, f"File not found at {tgz_path}"
    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            package_json_member = tar.getmember(package_json_path)  # Raises KeyError if not found
            package_json_fileobj = tar.extractfile(package_json_member)
            if package_json_fileobj:
                try:
                    package_data = json.loads(package_json_fileobj.read().decode('utf-8-sig'))
                    dependencies = package_data.get('dependencies', {})
                finally:
                    package_json_fileobj.close()
            else:
                raise FileNotFoundError(f"Could not extract {package_json_path}")
    except KeyError:
        error_message = f"'{package_json_path}' not found in {os.path.basename(tgz_path)}."
        logger.warning(error_message)  # No deps is okay
    except (json.JSONDecodeError, UnicodeDecodeError) as e:
        error_message = f"Parse error in {package_json_path} from {os.path.basename(tgz_path)}: {e}"
        logger.error(error_message)
        dependencies = None  # Parsing failed
    except (tarfile.TarError, FileNotFoundError) as e:
        error_message = f"Archive error {os.path.basename(tgz_path)}: {e}"
        logger.error(error_message)
        dependencies = None  # Archive read failed
    except Exception as e:
        error_message = f"Unexpected error extracting deps: {e}"
        logger.error(error_message, exc_info=True)
        dependencies = None
    return dependencies, error_message
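The `package/package.json` read above follows the FHIR npm-style package layout. A minimal round-trip sketch against an in-memory archive (package names here are illustrative):

```python
import io
import json
import tarfile

# Minimal package.json as found in FHIR npm-style packages.
pkg = {"name": "example.fhir.demo", "version": "0.1.0",
       "dependencies": {"hl7.fhir.r4.core": "4.0.1"}}
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    payload = json.dumps(pkg).encode("utf-8")
    info = tarfile.TarInfo(name="package/package.json")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))
buf.seek(0)

# Extract and parse the manifest the same way extract_dependencies does.
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    member = tar.getmember("package/package.json")
    data = json.loads(tar.extractfile(member).read().decode("utf-8-sig"))
deps = data.get("dependencies", {})
print(deps)  # {'hl7.fhir.r4.core': '4.0.1'}
```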


# --- Recursive Import Orchestrator (Corrected Indentation) ---
def import_package_and_dependencies(initial_name, initial_version):
    """Orchestrates recursive download and dependency extraction."""
    logger = logging.getLogger(__name__)
    logger.info(f"Starting recursive import for {initial_name}#{initial_version}")
    results = {'requested': (initial_name, initial_version), 'processed': set(), 'downloaded': {}, 'all_dependencies': {}, 'errors': []}
    pending_queue = [(initial_name, initial_version)]
    processed_lookup = set()

    while pending_queue:
        name, version = pending_queue.pop(0)
        package_id_tuple = (name, version)

        if package_id_tuple in processed_lookup:
            continue

        logger.info(f"Processing: {name}#{version}")
        processed_lookup.add(package_id_tuple)

        # 1. Download
        save_path, dl_error = download_package(name, version)

        if dl_error:
            error_msg = f"Download failed for {name}#{version}: {dl_error}"
            results['errors'].append(error_msg)
            if package_id_tuple == results['requested']:
                logger.error("Aborting import: Initial package download failed.")
                break  # Stop processing queue if initial download fails
            else:
                continue  # Skip dependency extraction for this failed download
        else:
            # Download succeeded (or file existed)
            results['downloaded'][package_id_tuple] = save_path

            # --- FIX: Indent dependency extraction under the else block ---
            # 2. Extract Dependencies from downloaded file
            dependencies, dep_error = extract_dependencies(save_path)

            if dep_error:
                results['errors'].append(f"Dependency extraction failed for {name}#{version}: {dep_error}")
                # Still mark as 'downloaded', but not 'processed'; allow queue processing to continue.
            elif dependencies is not None:  # Not None means extraction attempt happened (even if deps is {})
                results['all_dependencies'][package_id_tuple] = dependencies
                results['processed'].add(package_id_tuple)  # Mark as successfully processed (downloaded + deps extracted)
                logger.debug(f"Dependencies for {name}#{version}: {list(dependencies.keys())}")
                # Add new dependencies to queue
                for dep_name, dep_version in dependencies.items():
                    # Basic validation of dependency entry format
                    if isinstance(dep_name, str) and isinstance(dep_version, str) and dep_name and dep_version:
                        dep_tuple = (dep_name, dep_version)
                        if dep_tuple not in processed_lookup:  # Check processed_lookup to prevent re-queueing
                            if dep_tuple not in pending_queue:  # Avoid duplicate queue entries
                                pending_queue.append(dep_tuple)
                                logger.debug(f"Added to queue: {dep_name}#{dep_version}")
                    else:
                        logger.warning(f"Skipping invalid dependency entry '{dep_name}': '{dep_version}' in {name}#{version}")
            # --- End Indentation Fix ---

    # Final Summary Log
    proc_count = len(results['processed'])
    dl_count = len(results['downloaded'])
    err_count = len(results['errors'])
    logger.info(f"Import finished. Processed: {proc_count}, Downloaded/Verified: {dl_count}, Errors: {err_count}")
    return results


# --- Package File Content Processor ---
def process_package_file(tgz_path):
    """ Extracts types, MS elements, and examples from a downloaded .tgz package (Single Pass). """
    logger = logging.getLogger(__name__)
    logger.info(f"Processing package file details: {tgz_path}")
    results = {'resource_types_info': [], 'must_support_elements': {}, 'examples': {}, 'errors': []}
    resource_info = {}  # Temp dict: {'Type': {'ms_flag': False, 'ms_paths': [], 'examples': [], 'sd_processed': False}}

    if not os.path.exists(tgz_path):
        results['errors'].append(f"Package file not found: {tgz_path}")
        return results

    try:
        with tarfile.open(tgz_path, "r:gz") as tar:
            for member in tar:
                if not member.isfile():
                    continue
                member_name_lower = member.name.lower()
                base_filename_lower = os.path.basename(member_name_lower)
                fileobj = None

                # Check if JSON file inside package/ (excluding known non-resources)
                if member.name.startswith('package/') and member_name_lower.endswith('.json') and \
                        base_filename_lower not in ['package.json', '.index.json', 'validation-summary.json']:
                    try:
                        fileobj = tar.extractfile(member)
                        if not fileobj:
                            continue
                        content_bytes = fileobj.read()
                        content_string = content_bytes.decode('utf-8-sig')
                        data = json.loads(content_string)

                        if isinstance(data, dict) and 'resourceType' in data:
                            resource_type = data['resourceType']
                            # Ensure entry exists for this type
                            type_entry = resource_info.setdefault(resource_type, {'ms_flag': False, 'ms_paths': [], 'examples': [], 'sd_processed': False})

                            # If it's an example file, record it
                            if member.name.startswith('package/example/'):
                                type_entry['examples'].append(member.name)
                                logger.debug(f"Found example for {resource_type}: {member.name}")

                            # If it's a StructureDefinition, process MS flags (only once per type)
                            if resource_type == 'StructureDefinition' and not type_entry['sd_processed']:
                                sd_type = data.get('type')
                                if sd_type and sd_type in resource_info:  # Check against types we've already seen
                                    ms_paths_for_type = []
                                    has_ms = False
                                    for element_list in [data.get('snapshot', {}).get('element', []), data.get('differential', {}).get('element', [])]:
                                        for element in element_list:
                                            if isinstance(element, dict) and element.get('mustSupport') is True:
                                                element_path = element.get('path')
                                                if element_path:
                                                    ms_paths_for_type.append(element_path)
                                                    has_ms = True
                                    if ms_paths_for_type:
                                        resource_info[sd_type]['ms_paths'] = sorted(list(set(ms_paths_for_type)))  # Store unique sorted paths
                                    if has_ms:
                                        resource_info[sd_type]['ms_flag'] = True
                                    resource_info[sd_type]['sd_processed'] = True  # Mark as processed for MS
                                    logger.debug(f"Processed SD for {sd_type}, MS found: {has_ms}")
                                else:
                                    logger.warning(f"SD {member.name} defines type '{sd_type}' which wasn't found as a resource type key.")

                    except Exception as e:
                        logger.warning(f"Could not read/parse member {member.name}: {e}")
                    finally:
                        if fileobj:
                            fileobj.close()

                # Also find XML and HTML examples
                elif (member.name.startswith('package/example/') or ('example' in base_filename_lower and member.name.startswith('package/'))) \
                        and (member_name_lower.endswith('.xml') or member_name_lower.endswith('.html')):
                    # Try to guess type (imperfect)
                    guessed_type = base_filename_lower.split('-', 1)[0].capitalize()
                    if guessed_type in resource_info:  # Only add if type is known
                        resource_info[guessed_type]['examples'].append(member.name)
                        logger.debug(f"Found non-JSON example for {guessed_type}: {member.name}")

    except Exception as e:
        err_msg = f"Error processing package file {tgz_path}: {e}"
        logger.error(err_msg, exc_info=True)
        results['errors'].append(err_msg)

    # Format final results
    results['resource_types_info'] = sorted([{'name': rt, 'must_support': info['ms_flag']} for rt, info in resource_info.items()], key=lambda x: x['name'])
    results['must_support_elements'] = {rt: info['ms_paths'] for rt, info in resource_info.items() if info['ms_paths']}
    results['examples'] = {rt: sorted(info['examples']) for rt, info in resource_info.items() if info['examples']}

    # Logging final counts
    final_types_count = len(results['resource_types_info'])
    ms_count = sum(1 for r in results['resource_types_info'] if r['must_support'])
    total_ms_paths = sum(len(v) for v in results['must_support_elements'].values())
    total_examples = sum(len(v) for v in results['examples'].values())
    logger.info(f"Extracted {final_types_count} types ({ms_count} with MS; {total_ms_paths} MS paths; {total_examples} examples) from {tgz_path}")

    return results


# --- Remove or Comment Out old/unused functions ---
# def _fetch_package_metadata(package_name, package_version): ... (REMOVED)
# def resolve_all_dependencies(initial_package_name, initial_package_version): ... (REMOVED)
# def process_ig_import(package_name, package_version): ... (OLD orchestrator - REMOVED)
@@ -1,124 +0,0 @@
{# app/modules/fhir_ig_importer/templates/fhir_ig_importer/import_ig_page.html #}
{% extends "base.html" %} {# Or your control panel base template "cp_base.html" #}
{% from "_form_helpers.html" import render_field %} {# Assumes you have this macro #}

{% block content %} {# Or your specific CP content block #}
<div class="container mt-4">
    <h2 class="mb-4"><i class="bi bi-bootstrap-reboot me-2"></i>Import FHIR Implementation Guide</h2>

    <div class="row">
        <div class="col-md-8">
            {# --- Import Form --- #}
            <div class="card mb-4">
                <div class="card-body">
                    <h5 class="card-title">Enter Package Details</h5>
                    <form method="POST" action="{{ url_for('.import_ig') }}" novalidate> {# Use relative endpoint for blueprint #}
                        {{ form.hidden_tag() }} {# CSRF token #}
                        <div class="mb-3">
                            {{ render_field(form.package_name, class="form-control" + (" is-invalid" if form.package_name.errors else ""), placeholder="e.g., hl7.fhir.au.base") }}
                        </div>
                        <div class="mb-3">
                            {{ render_field(form.package_version, class="form-control" + (" is-invalid" if form.package_version.errors else ""), placeholder="e.g., 4.1.0 or latest") }}
                        </div>
                        <div class="mb-3">
                            {{ form.submit(class="btn btn-primary", value="Fetch & Download IG") }}
                        </div>
                    </form>
                </div>
            </div>

            {# --- Results Section --- #}
            <div id="results">
                {# Display Fatal Error if passed #}
                {% if fatal_error %}
                <div class="alert alert-danger" role="alert">
                    <h5 class="alert-heading">Critical Error during Import</h5>
                    {{ fatal_error }}
                </div>
                {% endif %}

                {# Display results if the results dictionary exists (meaning POST happened) #}
                {% if results %}
                <div class="card">
                    <div class="card-body">
                        {# Use results.requested for package info #}
                        <h5 class="card-title">Import Results for: <code>{{ results.requested[0] }}#{{ results.requested[1] }}</code></h5>
                        <hr>

                        {# --- Process Errors --- #}
                        {% if results.errors %}
                        <h6 class="mt-4 text-danger">Errors Encountered ({{ results.errors|length }}):</h6>
                        <ul class="list-group list-group-flush mb-3">
                            {% for error in results.errors %}
                            <li class="list-group-item list-group-item-danger py-1">{{ error }}</li>
                            {% endfor %}
                        </ul>
                        {% endif %}

                        {# --- Downloaded Files --- #}
                        <h6 class="mt-4">Downloaded Packages ({{ results.downloaded|length }} / {{ results.processed|length }} processed):</h6>
                        {# Check results.downloaded dictionary #}
                        {% if results.downloaded %}
                        <ul class="list-group list-group-flush mb-3">
                            {# Iterate through results.downloaded items #}
                            {% for (name, version), path in results.downloaded.items()|sort %}
                            <li class="list-group-item py-1 d-flex justify-content-between align-items-center">
                                <code>{{ name }}#{{ version }}</code>
                                {# Display relative path cleanly #}
                                <small class="text-muted ms-2">{{ path | replace('/app/', '') | replace('\\', '/') }}</small>
                                <span class="badge bg-success rounded-pill ms-auto">Downloaded</span> {# Moved badge to end #}
                            </li>
                            {% endfor %}
                        </ul>
                        <p><small>Files saved in the server's `instance/fhir_packages` directory.</small></p>
                        {% else %}
                        <p><small>No packages were successfully downloaded.</small></p>
                        {% endif %}

                        {# --- All Dependencies Found --- #}
                        <h6 class="mt-4">Consolidated Dependencies Found:</h6>
                        {# Calculate unique_deps based on results.all_dependencies #}
                        {% set unique_deps = {} %}
                        {% for pkg_id, deps_dict in results.all_dependencies.items() %}
                            {% for dep_name, dep_version in deps_dict.items() %}
                                {% set _ = unique_deps.update({(dep_name, dep_version): true}) %}
                            {% endfor %}
                        {% endfor %}

                        {% if unique_deps %}
                        <p><small>Unique direct dependencies found across all processed packages:</small></p>
                        <dl class="row">
                            {% for (name, version), _ in unique_deps.items()|sort %}
                            <dt class="col-sm-4 text-muted"><code>{{ name }}</code></dt>
                            <dd class="col-sm-8">{{ version }}</dd>
                            {% endfor %}
                        </dl>
                        {% else %}
                        <div class="alert alert-secondary" role="alert" style="padding: 0.5rem 1rem;">
                            No explicit dependencies found in any processed package metadata.
                        </div>
                        {% endif %}
                        {# --- End Dependency Result --- #}

                    </div> {# End card-body #}
                </div> {# End card #}
                {% endif %} {# End if results #}
            </div> {# End results #}
            {# --- End Results Section --- #}

        </div>
        <div class="col-md-4">
            {# --- Instructions Panel --- #}
            <div class="card bg-light">
                <div class="card-body">
                    <h5 class="card-title">Instructions</h5>
                    <p class="card-text">Enter the official FHIR package name (e.g., <code>hl7.fhir.au.base</code>) and the desired version (e.g., <code>4.1.0</code>, <code>latest</code>).</p>
                    <p class="card-text">The system will download the package and attempt to list its direct dependencies.</p>
                    <p class="card-text"><small>Downloads are saved to the server's `instance/fhir_packages` folder.</small></p>
                </div>
            </div>
        </div>
    </div> {# End row #}
</div> {# End container #}
{% endblock %}
@@ -1,86 +0,0 @@
# app/models.py
from app import db
from flask_login import UserMixin
from werkzeug.security import generate_password_hash, check_password_hash
from datetime import datetime
import json

# --- ProcessedIg Model (MODIFIED for Examples) ---
class ProcessedIg(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    package_name = db.Column(db.String(150), nullable=False, index=True)
    package_version = db.Column(db.String(50), nullable=False, index=True)
    processed_at = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
    status = db.Column(db.String(50), default='processed', nullable=True)
    # Stores list of dicts: [{'name': 'Type', 'must_support': bool}, ...]
    resource_types_info_json = db.Column(db.Text, nullable=True)
    # Stores dict: {'TypeName': ['path1', 'path2'], ...}
    must_support_elements_json = db.Column(db.Text, nullable=True)
    # --- ADDED: Store example files found, grouped by type ---
    # Structure: {'TypeName': ['example1.json', 'example1.xml'], ...}
    examples_json = db.Column(db.Text, nullable=True)
    # --- End Add ---

    __table_args__ = (db.UniqueConstraint('package_name', 'package_version', name='uq_processed_ig_name_version'),)

    # Property for resource_types_info
    @property
    def resource_types_info(self):
        if self.resource_types_info_json:
            try:
                return json.loads(self.resource_types_info_json)
            except json.JSONDecodeError:
                return []
        return []

    @resource_types_info.setter
    def resource_types_info(self, types_info_list):
        if types_info_list and isinstance(types_info_list, list):
            sorted_list = sorted(types_info_list, key=lambda x: x.get('name', ''))
            self.resource_types_info_json = json.dumps(sorted_list)
        else:
            self.resource_types_info_json = None

    # Property for must_support_elements
    @property
    def must_support_elements(self):
        if self.must_support_elements_json:
            try:
                return json.loads(self.must_support_elements_json)
            except json.JSONDecodeError:
                return {}
        return {}

    @must_support_elements.setter
    def must_support_elements(self, ms_elements_dict):
        if ms_elements_dict and isinstance(ms_elements_dict, dict):
            self.must_support_elements_json = json.dumps(ms_elements_dict)
        else:
            self.must_support_elements_json = None

    # --- ADDED: Property for examples ---
    @property
    def examples(self):
        """Returns the stored example filenames as a Python dict."""
        if self.examples_json:
            try:
                # Return dict {'TypeName': ['file1.json', 'file2.xml'], ...}
                return json.loads(self.examples_json)
            except json.JSONDecodeError:
                return {}  # Return empty dict on parse error
        return {}

    @examples.setter
    def examples(self, examples_dict):
        """Stores a Python dict of example filenames as a JSON string."""
        if examples_dict and isinstance(examples_dict, dict):
            # Sort filenames within each list? Optional.
            # for key in examples_dict: examples_dict[key].sort()
            self.examples_json = json.dumps(examples_dict)
        else:
            self.examples_json = None
    # --- End Add ---

    def __repr__(self):
        count = len(self.resource_types_info)
        ms_count = sum(1 for item in self.resource_types_info if item.get('must_support'))
        ex_count = sum(len(v) for v in self.examples.values())  # Count total example files
        return f"<ProcessedIg {self.package_name}#{self.package_version} ({self.status}, {count} types, {ms_count} MS, {ex_count} examples)>"
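The model above stores structured data in `Text` columns as JSON and exposes it through paired property getters/setters. That pattern works independently of SQLAlchemy; a minimal standalone sketch of the same round-trip (class name hypothetical, no database involved):

```python
import json

# Standalone sketch of the JSON-backed property pattern used by ProcessedIg:
# a text attribute stores serialized JSON; a property exposes a native dict.
class ExamplesHolder:
    def __init__(self):
        self.examples_json = None

    @property
    def examples(self):
        if self.examples_json:
            try:
                return json.loads(self.examples_json)
            except json.JSONDecodeError:
                return {}  # Tolerate corrupt stored JSON
        return {}

    @examples.setter
    def examples(self, examples_dict):
        if examples_dict and isinstance(examples_dict, dict):
            self.examples_json = json.dumps(examples_dict)
        else:
            self.examples_json = None

holder = ExamplesHolder()
holder.examples = {"Patient": ["Patient-example.json"]}
print(holder.examples["Patient"])  # ['Patient-example.json']
```

The trade-off of this design is that the JSON columns are opaque to SQL queries; it suits read-mostly metadata like these example listings.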
@@ -1,14 +0,0 @@
/* app/static/css/custom.css */
/* Add custom styles here to override or supplement Bootstrap */

body {
    /* Example: Add a subtle background pattern or color */
    /* background-color: #f8f9fa; */
}

.footer {
    /* Ensure footer stays at the bottom if needed, though */
    /* d-flex flex-column min-vh-100 on body usually handles this */
}

/* Add more custom styles as needed */
@@ -1,12 +0,0 @@
/* app/static/js/main.js */
// Add global JavaScript functions here

// Example: Initialize Bootstrap tooltips or popovers if used
// document.addEventListener('DOMContentLoaded', function () {
//     var tooltipTriggerList = [].slice.call(document.querySelectorAll('[data-bs-toggle="tooltip"]'))
//     var tooltipList = tooltipTriggerList.map(function (tooltipTriggerEl) {
//         return new bootstrap.Tooltip(tooltipTriggerEl)
//     })
// });

console.log("Custom main.js loaded.");
@@ -1,12 +0,0 @@
{% extends "base.html" %}

{% block content %}
<div class="text-center py-5">
    <h1 class="display-1">404</h1>
    <h2>Page Not Found</h2>
    <p class="lead">
        Sorry, we couldn't find the page you were looking for.
    </p>
    <a href="{{ url_for('core.index') }}" class="btn btn-primary mt-3">Go Back Home</a>
</div>
{% endblock %}
@@ -1,12 +0,0 @@
{% extends "base.html" %}

{% block content %}
<div class="text-center py-5">
    <h1 class="display-1">500</h1>
    <h2>Internal Server Error</h2>
    <p class="lead">
        Sorry, something went wrong on our end. We are looking into it.
    </p>
    <a href="{{ url_for('core.index') }}" class="btn btn-primary mt-3">Go Back Home</a>
</div>
{% endblock %}
@@ -1,26 +0,0 @@
{# app/templates/_form_helpers.html #}
{% macro render_field(field, label_visible=true) %}
<div class="form-group mb-3"> {# Add margin bottom for spacing #}
    {% if label_visible and field.label %}
        {{ field.label(class="form-label") }} {# Render label with Bootstrap class #}
    {% endif %}

    {# Add is-invalid class if errors exist #}
    {% set css_class = 'form-control ' + kwargs.pop('class', '') %}
    {% if field.errors %}
        {% set css_class = css_class + ' is-invalid' %}
    {% endif %}

    {# Render the field itself, passing any extra attributes #}
    {{ field(class=css_class, **kwargs) }}

    {# Display validation errors #}
    {% if field.errors %}
        <div class="invalid-feedback">
            {% for error in field.errors %}
                {{ error }}<br>
            {% endfor %}
        </div>
    {% endif %}
</div>
{% endmacro %}
@@ -1,91 +0,0 @@
<!doctype html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-QWTKZyjpPEjISv5WaRU9OFeRpok6YctnYmDr5pNlyT2bRjXh0JMhjY6hW+ALEwIH" crossorigin="anonymous">
    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.3/font/bootstrap-icons.min.css">
    <link rel="stylesheet" href="{{ url_for('static', filename='css/custom.css') }}">
    <title>{% if title %}{{ title }} - {% endif %}{{ site_name }}</title>
</head>
<body class="d-flex flex-column min-vh-100">
    <nav class="navbar navbar-expand-lg navbar-dark bg-primary mb-4">
        <div class="container-fluid">
            <a class="navbar-brand" href="{{ url_for('core.index') }}">{{ site_name }}</a>
            <button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarNav" aria-controls="navbarNav" aria-expanded="false" aria-label="Toggle navigation">
                <span class="navbar-toggler-icon"></span>
            </button>
            <div class="collapse navbar-collapse" id="navbarNav">
                <ul class="navbar-nav me-auto mb-2 mb-lg-0">
                    <li class="nav-item">
                        {# FIX: Added check for request.endpoint #}
                        <a class="nav-link {{ 'active' if request.endpoint and request.endpoint == 'core.index' else '' }}" aria-current="page" href="{{ url_for('core.index') }}"><i class="bi bi-house-door me-1"></i> Home</a>
                    </li>
                    {% if module_menu_items %}
                        {% for item in module_menu_items %}
                        <li class="nav-item">
                            {# FIX: Added check for request.endpoint #}
                            <a class="nav-link {{ 'active' if request.endpoint and request.endpoint == item.endpoint else '' }}" href="{{ url_for(item.endpoint) }}">
                                <i class="{{ item.icon | default('bi bi-box-seam me-1') }}"></i> {{ item.title }}
                            </a>
                        </li>
                        {% endfor %}
                    {% endif %}
                </ul>
                <ul class="navbar-nav ms-auto">
                    {% if current_user.is_authenticated %}
                        {% if current_user.role == 'admin' %}
                        <li class="nav-item">
                            {# FIX: Added check for request.endpoint #}
                            <a class="nav-link {{ 'active' if request.endpoint and request.endpoint.startswith('control_panel.') else '' }}" href="{{ url_for('control_panel.index') }}">
                                <i class="bi bi-gear me-1"></i> Control Panel
                            </a>
                        </li>
                        {% endif %}
                        <li class="nav-item"><span class="navbar-text me-2"><i class="bi bi-person-circle me-1"></i> {{ current_user.username }}</span></li>
                        <li class="nav-item"><a class="nav-link" href="{{ url_for('auth.logout') }}"><i class="bi bi-box-arrow-right me-1"></i> Logout</a></li>
                    {% else %}
                        <li class="nav-item">
                            {# FIX: Added check for request.endpoint #}
                            <a class="nav-link {{ 'active' if request.endpoint and request.endpoint == 'auth.login' else '' }}" href="{{ url_for('auth.login') }}"><i class="bi bi-box-arrow-in-right me-1"></i> Login</a>
                        </li>
                    {% endif %}
                </ul>
            </div>
        </div>
    </nav>

    <main class="container flex-grow-1">
        {% with messages = get_flashed_messages(with_categories=true) %}
            {% if messages %}
                {% for category, message in messages %}
                <div class="alert alert-{{ category or 'info' }} alert-dismissible fade show" role="alert">
                    {{ message }}
                    <button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
                </div>
                {% endfor %}
            {% endif %}
        {% endwith %}
        {% block content %}{% endblock %}
    </main>

    <footer class="footer mt-auto py-3 bg-light">
        <div class="container text-center">
            {% set current_year = now.year %}
            <span class="text-muted">&copy; {{ current_year }} {{ site_name }}. All rights reserved.</span>
        </div>
    </footer>

    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.bundle.min.js" integrity="sha384-YvpcrYf0tY3lHB60NNkmXc5s9fDVZLESaAA55NDzOxhy9GkcIdslK1eN7N6jIeHz" crossorigin="anonymous"></script>
    <script src="{{ url_for('static', filename='js/main.js') }}"></script>

    {% block scripts %}
    {# Tooltip Initialization Script #}
    <script>
        document.addEventListener('DOMContentLoaded', function () {
            var tooltipTriggerList = [].slice.call(document.querySelectorAll('[data-bs-toggle="tooltip"]'))
            var tooltipList = tooltipTriggerList.map(function (tooltipTriggerEl) { return new bootstrap.Tooltip(tooltipTriggerEl) })
        });
    </script>
    {% endblock %}
</body>
</html>
@@ -1,135 +0,0 @@
{# app/control_panel/templates/cp_downloaded_igs.html #}
{% extends "base.html" %}

{% block content %}
<div class="container mt-4">
    <div class="d-flex justify-content-between align-items-center mb-4">
        <h2><i class="bi bi-journal-arrow-down me-2"></i>Manage FHIR Packages</h2>
        <div>
            <a href="{{ url_for('fhir_ig_importer.import_ig') }}" class="btn btn-success"><i class="bi bi-download me-1"></i> Import More IGs</a>
            <a href="{{ url_for('control_panel.index') }}" class="btn btn-secondary"><i class="bi bi-arrow-left"></i> Back to CP Index</a>
        </div>
    </div>

    {% if error_message %}
    <div class="alert alert-danger" role="alert">
        <h5 class="alert-heading">Error</h5>
        {{ error_message }}
    </div>
    {% endif %}

    {# NOTE: The block calculating the processed_ids set using {% set %} was REMOVED from here #}

    {# --- Start Two Column Layout --- #}
    <div class="row g-4">

        {# --- Left Column: Downloaded Packages (Horizontal Buttons) --- #}
        <div class="col-md-6">
            <div class="card h-100">
                <div class="card-header"><i class="bi bi-folder-symlink me-1"></i> Downloaded Packages ({{ packages|length }})</div>
                <div class="card-body">
                    {% if packages %}
                    <p class="mb-2"><small><span class="badge bg-danger text-dark border me-1">Risk:</span> = duplicate dependency with different versions</small></p>
                    <div class="table-responsive">
                        <table class="table table-sm table-hover">
                            <thead>
                                <tr><th>Package Name</th><th>Version</th><th>Actions</th></tr>
                            </thead>
                            <tbody>
                                {% for pkg in packages %}
                                {% set is_processed = (pkg.name, pkg.version) in processed_ids %}
                                {# --- ADDED: Check for duplicate name --- #}
                                {% set is_duplicate = pkg.name in duplicate_names %}
                                {# --- ADDED: Assign row class based on duplicate group --- #}
                                <tr class="{{ duplicate_groups.get(pkg.name, '') if is_duplicate else '' }}">
                                    <td>
                                        {# --- ADDED: Risk Badge for duplicates --- #}
                                        {% if is_duplicate %}
                                        <span class="badge bg-danger mb-1 d-block">Duplicate</span>
                                        {% endif %}
                                        {# --- End Add --- #}
                                        <code>{{ pkg.name }}</code>
                                    </td>
                                    <td>{{ pkg.version }}</td>
                                    <td> {# Actions #}
                                        <div class="btn-group btn-group-sm" role="group">
                                            {% if is_processed %}
                                            <span class="btn btn-success disabled"><i class="bi bi-check-lg"></i> Processed</span>
                                            {% else %}
                                            <form action="{{ url_for('control_panel.process_ig') }}" method="POST" style="display: inline-block;">
                                                <input type="hidden" name="package_name" value="{{ pkg.name }}">
                                                <input type="hidden" name="package_version" value="{{ pkg.version }}">
                                                <button type="submit" class="btn btn-outline-primary" title="Mark as processed"><i class="bi bi-gear"></i> Process</button>
                                            </form>
                                            {% endif %}
                                            <form action="{{ url_for('control_panel.delete_ig_file') }}" method="POST" style="display: inline-block;" onsubmit="return confirm('Delete file \'{{ pkg.filename }}\'?');">
                                                <input type="hidden" name="filename" value="{{ pkg.filename }}">
                                                <button type="submit" class="btn btn-outline-danger" title="Delete File"><i class="bi bi-trash"></i> Delete</button>
                                            </form>
                                        </div>
                                    </td>
                                </tr>
                                {% endfor %}
                            </tbody>
                        </table>
                    </div>
                    {% elif not error_message %}<p class="text-muted">No downloaded FHIR packages found.</p>{% endif %}
                </div>
            </div>
        </div>{# --- End Left Column --- #}


        {# --- Right Column: Processed Packages (Vertical Buttons) --- #}
        <div class="col-md-6">
            <div class="card h-100">
                <div class="card-header"><i class="bi bi-check-circle me-1"></i> Processed Packages ({{ processed_list|length }})</div>
|
||||
<div class="card-body">
|
||||
{% if processed_list %}
|
||||
<p class="mb-2"><small><span class="badge bg-warning text-dark border me-1">MS</span> = Contains Must Support Elements</small></p>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-sm table-hover">
|
||||
<thead>
|
||||
<tr><th>Package Name</th><th>Version</th><th>Resource Types</th><th>Actions</th></tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
{% for processed_ig in processed_list %}
|
||||
<tr>
|
||||
<td>{# Tooltip for Processed At / Status #}
|
||||
{% set tooltip_title_parts = [] %}
|
||||
{% if processed_ig.processed_at %}{% set _ = tooltip_title_parts.append("Processed: " + processed_ig.processed_at.strftime('%Y-%m-%d %H:%M')) %}{% endif %}
|
||||
{% if processed_ig.status %}{% set _ = tooltip_title_parts.append("Status: " + processed_ig.status) %}{% endif %}
|
||||
{% set tooltip_text = tooltip_title_parts | join('\n') %}
|
||||
<code data-bs-toggle="tooltip" data-bs-placement="top" title="{{ tooltip_text }}">{{ processed_ig.package_name }}</code>
|
||||
</td>
|
||||
<td>{{ processed_ig.package_version }}</td>
|
||||
<td> {# Resource Types Cell w/ Badges #}
|
||||
{% set types_info = processed_ig.resource_types_info %}
|
||||
{% if types_info %}<div class="d-flex flex-wrap gap-1">{% for type_info in types_info %}{% if type_info.must_support %}<span class="badge bg-warning text-dark border" title="Has Must Support">{{ type_info.name }}</span>{% else %}<span class="badge bg-light text-dark border">{{ type_info.name }}</span>{% endif %}{% endfor %}</div>{% else %}<small class="text-muted">N/A</small>{% endif %}
|
||||
</td>
|
||||
<td>
|
||||
{# Vertical Button Group #}
|
||||
<div class="btn-group-vertical btn-group-sm w-100" role="group" aria-label="Processed Package Actions for {{ processed_ig.package_name }}">
|
||||
<a href="{{ url_for('control_panel.view_processed_ig', processed_ig_id=processed_ig.id) }}" class="btn btn-outline-info w-100" title="View Details"><i class="bi bi-search"></i> View</a>
|
||||
<form action="{{ url_for('control_panel.unload_ig') }}" method="POST" onsubmit="return confirm('Unload record for \'{{ processed_ig.package_name }}#{{ processed_ig.package_version }}\'?');">
|
||||
<input type="hidden" name="processed_ig_id" value="{{ processed_ig.id }}">
|
||||
<button type="submit" class="btn btn-outline-warning w-100" title="Unload/Remove Processed Record"><i class="bi bi-x-lg"></i> Unload</button>
|
||||
</form>
|
||||
</div>
|
||||
</td>
|
||||
</tr>
|
||||
{% endfor %}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
{% elif not error_message %}<p class="text-muted">No packages recorded as processed yet.</p>{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
</div>{# --- End Right Column --- #}
|
||||
|
||||
</div>{# --- End Row --- #}
|
||||
|
||||
</div>{# End container #}
|
||||
{% endblock %}
|
||||
|
||||
{# Tooltip JS Initializer should be in base.html #}
|
||||
{% block scripts %}{{ super() }}{% endblock %}
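The template above relies on the view supplying `processed_ids`, `duplicate_names`, and `duplicate_groups` in the render context. A minimal sketch of how a route could derive them from the downloaded package list (the names, shapes, and CSS class scheme are assumptions inferred from the template, not the actual route code):

```python
from collections import Counter

def build_package_context(packages, processed_records):
    """Derive the context variables the template expects.

    packages: list of dicts with 'name' and 'version' keys.
    processed_records: iterable of (name, version) tuples already processed.
    """
    # Set used for the "Processed" badge check: (name, version) in processed_ids
    processed_ids = set(processed_records)

    # Names that appear more than once (same package, different versions)
    counts = Counter(pkg["name"] for pkg in packages)
    duplicate_names = {name for name, n in counts.items() if n > 1}

    # Map each duplicate name to a CSS class so rows in the same group can be
    # highlighted together (class names here are illustrative placeholders).
    duplicate_groups = {
        name: f"duplicate-group-{i}"
        for i, name in enumerate(sorted(duplicate_names))
    }
    return {
        "processed_ids": processed_ids,
        "duplicate_names": duplicate_names,
        "duplicate_groups": duplicate_groups,
    }
```

The template's `{% set is_duplicate = pkg.name in duplicate_names %}` and `duplicate_groups.get(pkg.name, '')` lookups then work directly against this context.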
|
@ -1,15 +0,0 @@
|
||||
{% extends "base.html" %} {% block content %} <div class="px-4 py-5 my-5 text-center">
|
||||
<img class="d-block mx-auto mb-4" src="https://placehold.co/72x57/0d6efd/white?text=PAS" alt="PAS Logo Placeholder" width="72" height="57">
|
||||
|
||||
<h1 class="display-5 fw-bold text-body-emphasis">Welcome to FLARE FHIR IG TOOLKIT</h1>
|
||||
<div class="col-lg-6 mx-auto">
|
||||
<p class="lead mb-4">
|
||||
This is the starting point for a journey into FHIR IGs from an implementer's view!
|
||||
</p>
|
||||
<div class="d-grid gap-2 d-sm-flex justify-content-sm-center">
|
||||
<a href="/import-ig"><button type="button" class="btn btn-primary btn-lg px-4 gap-3">Import FHIR IG</button></a>
|
||||
<a href="/downloaded-igs"><button type="button" class="btn btn-outline-secondary btn-lg px-4">View Downloaded IGs</button></a>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
@ -1,405 +0,0 @@
|
||||
{% extends "base.html" %}
|
||||
|
||||
{% from "_form_helpers.html" import render_field %}
|
||||
|
||||
{% block content %}
|
||||
<div class="container mt-4">
|
||||
<div class="d-flex justify-content-between align-items-center mb-3">
|
||||
<h2><i class="bi bi-info-circle me-2"></i>{{ title }}</h2>
|
||||
<a href="{{ url_for('control_panel.list_downloaded_igs') }}" class="btn btn-secondary"><i class="bi bi-arrow-left"></i> Back to Package List</a>
|
||||
</div>
|
||||
|
||||
{% if processed_ig %}
|
||||
<div class="card">
|
||||
<div class="card-header">Package Details</div>
|
||||
<div class="card-body">
|
||||
<dl class="row">
|
||||
<dt class="col-sm-3">Package Name</dt>
|
||||
<dd class="col-sm-9"><code>{{ processed_ig.package_name }}</code></dd>
|
||||
<dt class="col-sm-3">Package Version</dt>
|
||||
<dd class="col-sm-9">{{ processed_ig.package_version }}</dd>
|
||||
<dt class="col-sm-3">Processed At</dt>
|
||||
<dd class="col-sm-9">{{ processed_ig.processed_at.strftime('%Y-%m-%d %H:%M:%S UTC') if processed_ig.processed_at else 'N/A' }}</dd>
|
||||
<dt class="col-sm-3">Processing Status</dt>
|
||||
<dd class="col-sm-9">
|
||||
<span class="badge rounded-pill text-bg-{{ 'success' if processed_ig.status == 'processed' else ('warning' if processed_ig.status == 'processed_with_errors' else 'secondary') }}">{{ processed_ig.status or 'N/A' }}</span>
|
||||
</dd>
|
||||
</dl>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="card mt-4">
|
||||
<div class="card-header">Resource Types Found / Defined </div>
|
||||
<div class="card-body">
|
||||
{% if profile_list or base_list %}
|
||||
<p class="mb-2"><small><span class="badge bg-warning text-dark border me-1">MS</span> = Contains Must Support Elements</small></p>
|
||||
{% if profile_list %}
|
||||
<h6>Profiles Defined ({{ profile_list|length }}):</h6>
|
||||
<div class="d-flex flex-wrap gap-1 mb-3 resource-type-list">
|
||||
{% for type_info in profile_list %}
|
||||
<a href="#" class="resource-type-link text-decoration-none"
|
||||
data-package-name="{{ processed_ig.package_name }}"
|
||||
data-package-version="{{ processed_ig.package_version }}"
|
||||
data-resource-type="{{ type_info.name }}"
|
||||
aria-label="View structure for {{ type_info.name }}">
|
||||
{% if type_info.must_support %}
|
||||
<span class="badge bg-warning text-dark border" title="Contains Must Support elements">{{ type_info.name }}</span>
|
||||
{% else %}
|
||||
<span class="badge bg-light text-dark border">{{ type_info.name }}</span>
|
||||
{% endif %}
|
||||
</a>
|
||||
{% endfor %}
|
||||
</div>
|
||||
{% endif %}
|
||||
{% if base_list %}
|
||||
<p class="mb-2"><small><span class="badge bg-primary text-light border me-1">Examples</span> = Examples will be displayed when selecting base types, if contained in the IG</small></p>
|
||||
<h6>Base Resource Types Referenced ({{ base_list|length }}):</h6>
|
||||
<div class="d-flex flex-wrap gap-1 resource-type-list">
|
||||
{% for type_info in base_list %}
|
||||
<a href="#" class="resource-type-link text-decoration-none"
|
||||
data-package-name="{{ processed_ig.package_name }}"
|
||||
data-package-version="{{ processed_ig.package_version }}"
|
||||
data-resource-type="{{ type_info.name }}"
|
||||
aria-label="View structure for {{ type_info.name }}">
|
||||
{% if type_info.must_support %}
|
||||
<span class="badge bg-warning text-dark border" title="Contains Must Support elements">{{ type_info.name }}</span>
|
||||
{% else %}
|
||||
<span class="badge bg-light text-dark border">{{ type_info.name }}</span>
|
||||
{% endif %}
|
||||
</a>
|
||||
{% endfor %}
|
||||
</div>
|
||||
{% endif %}
|
||||
{% else %}
|
||||
<p class="text-muted"><em>No resource type information extracted or stored.</em></p>
|
||||
{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div id="structure-display-wrapper" class="mt-4" style="display: none;">
|
||||
<div class="card">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<span>Structure Definition for: <code id="structure-title"></code></span>
|
||||
<button type="button" class="btn-close" aria-label="Close Details View" onclick="document.getElementById('structure-display-wrapper').style.display='none'; document.getElementById('example-display-wrapper').style.display='none'; document.getElementById('raw-structure-wrapper').style.display='none';"></button>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div id="structure-loading" class="text-center py-3" style="display: none;">
|
||||
<div class="spinner-border text-primary" role="status"><span class="visually-hidden">Loading...</span></div>
|
||||
</div>
|
||||
<div id="structure-content"></div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div id="raw-structure-wrapper" class="mt-4" style="display: none;">
|
||||
<div class="card">
|
||||
<div class="card-header">Raw Structure Definition for <code id="raw-structure-title"></code></div>
|
||||
<div class="card-body">
|
||||
<div id="raw-structure-loading" class="text-center py-3" style="display: none;">
|
||||
<div class="spinner-border text-primary" role="status"><span class="visually-hidden">Loading...</span></div>
|
||||
</div>
|
||||
<pre><code id="raw-structure-content" class="p-2 d-block border bg-light" style="max-height: 400px; overflow-y: auto;"></code></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div id="example-display-wrapper" class="mt-4" style="display: none;">
|
||||
<div class="card">
|
||||
<div class="card-header">Examples for <code id="example-resource-type-title"></code></div>
|
||||
<div class="card-body">
|
||||
<div class="mb-3" id="example-selector-wrapper" style="display: none;">
|
||||
<label for="example-select" class="form-label">Select Example:</label>
|
||||
<select class="form-select form-select-sm" id="example-select"><option selected value="">-- Select --</option></select>
|
||||
</div>
|
||||
<div id="example-loading" class="text-center py-3" style="display: none;">
|
||||
<div class="spinner-border spinner-border-sm text-secondary" role="status"><span class="visually-hidden">Loading...</span></div>
|
||||
</div>
|
||||
<div id="example-content-wrapper" style="display: none;">
|
||||
<h6 id="example-filename" class="mt-2 small text-muted"></h6>
|
||||
<div class="row">
|
||||
<div class="col-md-6">
|
||||
<h6>Raw Content</h6>
|
||||
<pre><code id="example-content-raw" class="p-2 d-block border bg-light" style="max-height: 400px; overflow-y: auto;"></code></pre>
|
||||
</div>
|
||||
<div class="col-md-6">
|
||||
<h6>Pretty-Printed JSON</h6>
|
||||
<pre><code id="example-content-json" class="p-2 d-block border bg-light" style="max-height: 400px; overflow-y: auto;"></code></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{% else %}
|
||||
<div class="alert alert-warning" role="alert">Processed IG details not available.</div>
|
||||
{% endif %}
|
||||
</div>
|
||||
|
||||
{% block scripts %}
|
||||
{{ super() }}
|
||||
<script>const examplesData = {{ examples_by_type | tojson | safe }};</script>
|
||||
<script>
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
const structureDisplayWrapper = document.getElementById('structure-display-wrapper');
|
||||
const structureDisplay = document.getElementById('structure-content');
|
||||
const structureTitle = document.getElementById('structure-title');
|
||||
const structureLoading = document.getElementById('structure-loading');
|
||||
const exampleDisplayWrapper = document.getElementById('example-display-wrapper');
|
||||
const exampleSelectorWrapper = document.getElementById('example-selector-wrapper');
|
||||
const exampleSelect = document.getElementById('example-select');
|
||||
const exampleResourceTypeTitle = document.getElementById('example-resource-type-title');
|
||||
const exampleLoading = document.getElementById('example-loading');
|
||||
const exampleContentWrapper = document.getElementById('example-content-wrapper');
|
||||
const exampleFilename = document.getElementById('example-filename');
|
||||
const exampleContentRaw = document.getElementById('example-content-raw');
|
||||
const exampleContentJson = document.getElementById('example-content-json');
|
||||
const rawStructureWrapper = document.getElementById('raw-structure-wrapper');
|
||||
const rawStructureTitle = document.getElementById('raw-structure-title');
|
||||
const rawStructureContent = document.getElementById('raw-structure-content');
|
||||
const rawStructureLoading = document.getElementById('raw-structure-loading');
|
||||
|
||||
const structureBaseUrl = "/control-panel/fhir/get-structure";
|
||||
const exampleBaseUrl = "/control-panel/fhir/get-example";
|
||||
|
||||
let currentPkgName = null;
|
||||
let currentPkgVersion = null;
|
||||
let currentRawStructureData = null;
|
||||
|
||||
document.body.addEventListener('click', function(event) {
|
||||
const link = event.target.closest('.resource-type-link');
|
||||
if (!link) return;
|
||||
event.preventDefault();
|
||||
|
||||
currentPkgName = link.dataset.packageName;
|
||||
currentPkgVersion = link.dataset.packageVersion;
|
||||
const resourceType = link.dataset.resourceType;
|
||||
if (!currentPkgName || !currentPkgVersion || !resourceType) return;
|
||||
|
||||
const structureParams = new URLSearchParams({
|
||||
package_name: currentPkgName,
|
||||
package_version: currentPkgVersion,
|
||||
resource_type: resourceType
|
||||
});
|
||||
const structureFetchUrl = `${structureBaseUrl}?${structureParams.toString()}`;
|
||||
|
||||
structureTitle.textContent = `${resourceType} (${currentPkgName}#${currentPkgVersion})`;
|
||||
rawStructureTitle.textContent = `${resourceType} (${currentPkgName}#${currentPkgVersion})`;
|
||||
structureDisplay.innerHTML = '';
|
||||
rawStructureContent.textContent = '';
|
||||
structureLoading.style.display = 'block';
|
||||
rawStructureLoading.style.display = 'block';
|
||||
structureDisplayWrapper.style.display = 'block';
|
||||
rawStructureWrapper.style.display = 'block';
|
||||
exampleDisplayWrapper.style.display = 'none';
|
||||
|
||||
fetch(structureFetchUrl)
|
||||
.then(response => response.json().then(data => ({ ok: response.ok, status: response.status, data })))
|
||||
.then(result => {
|
||||
if (!result.ok) throw new Error(result.data.error || `HTTP error ${result.status}`);
|
||||
currentRawStructureData = result.data;
|
||||
renderStructureTree(result.data.elements, result.data.must_support_paths || []);
|
||||
rawStructureContent.textContent = JSON.stringify(result.data, null, 2);
|
||||
populateExampleSelector(resourceType);
|
||||
})
|
||||
.catch(error => {
|
||||
console.error('Error fetching structure:', error);
|
||||
structureDisplay.innerHTML = `<div class="alert alert-danger">Error: ${error.message}</div>`;
|
||||
rawStructureContent.textContent = `Error: ${error.message}`;
|
||||
})
|
||||
.finally(() => {
|
||||
structureLoading.style.display = 'none';
|
||||
rawStructureLoading.style.display = 'none';
|
||||
});
|
||||
});
|
||||
|
||||
function populateExampleSelector(resourceOrProfileIdentifier) {
|
||||
exampleResourceTypeTitle.textContent = resourceOrProfileIdentifier;
|
||||
const availableExamples = examplesData[resourceOrProfileIdentifier] || [];
|
||||
exampleSelect.innerHTML = '<option selected value="">-- Select --</option>';
|
||||
exampleContentWrapper.style.display = 'none';
|
||||
exampleFilename.textContent = '';
|
||||
exampleContentRaw.textContent = '';
|
||||
exampleContentJson.textContent = '';
|
||||
if (availableExamples.length > 0) {
|
||||
availableExamples.forEach(filePath => {
|
||||
const filename = filePath.split('/').pop();
|
||||
const option = document.createElement('option');
|
||||
option.value = filePath;
|
||||
option.textContent = filename;
|
||||
exampleSelect.appendChild(option);
|
||||
});
|
||||
exampleSelectorWrapper.style.display = 'block';
|
||||
exampleDisplayWrapper.style.display = 'block';
|
||||
} else {
|
||||
exampleSelectorWrapper.style.display = 'none';
|
||||
exampleDisplayWrapper.style.display = 'none';
|
||||
}
|
||||
}
|
||||
|
||||
if(exampleSelect) {
|
||||
exampleSelect.addEventListener('change', function(event) {
|
||||
const selectedFilePath = this.value;
|
||||
exampleContentRaw.textContent = '';
|
||||
exampleContentJson.textContent = '';
|
||||
exampleFilename.textContent = '';
|
||||
exampleContentWrapper.style.display = 'none';
|
||||
if (!selectedFilePath || !currentPkgName || !currentPkgVersion) return;
|
||||
|
||||
const exampleParams = new URLSearchParams({
|
||||
package_name: currentPkgName,
|
||||
package_version: currentPkgVersion,
|
||||
filename: selectedFilePath
|
||||
});
|
||||
const exampleFetchUrl = `${exampleBaseUrl}?${exampleParams.toString()}`;
|
||||
console.log("Fetching example from:", exampleFetchUrl);
|
||||
exampleLoading.style.display = 'block';
|
||||
fetch(exampleFetchUrl)
|
||||
.then(response => {
|
||||
if (!response.ok) {
|
||||
return response.text().then(errText => { throw new Error(errText || `HTTP error ${response.status}`); });
|
||||
}
|
||||
return response.text();
|
||||
})
|
||||
.then(content => {
|
||||
exampleFilename.textContent = `Source: ${selectedFilePath.split('/').pop()}`;
|
||||
exampleContentRaw.textContent = content;
|
||||
try {
|
||||
const jsonContent = JSON.parse(content);
|
||||
exampleContentJson.textContent = JSON.stringify(jsonContent, null, 2);
|
||||
} catch (e) {
|
||||
exampleContentJson.textContent = 'Not valid JSON';
|
||||
}
|
||||
exampleContentWrapper.style.display = 'block';
|
||||
})
|
||||
.catch(error => {
|
||||
exampleFilename.textContent = 'Error';
|
||||
exampleContentRaw.textContent = `Error: ${error.message}`;
|
||||
exampleContentJson.textContent = `Error: ${error.message}`;
|
||||
exampleContentWrapper.style.display = 'block';
|
||||
})
|
||||
.finally(() => {
|
||||
exampleLoading.style.display = 'none';
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
function buildTreeData(elements) {
|
||||
const treeRoot = { children: {}, element: null, name: 'Root' };
|
||||
const nodeMap = {'Root': treeRoot};
|
||||
elements.forEach(el => {
|
||||
const path = el.path;
|
||||
if (!path) return;
|
||||
const parts = path.split('.');
|
||||
let currentPath = '';
|
||||
let parentNode = treeRoot;
|
||||
for(let i = 0; i < parts.length; i++) {
|
||||
const part = parts[i];
|
||||
const currentPartName = part.includes('[') ? part.substring(0, part.indexOf('[')) : part;
|
||||
currentPath = i === 0 ? currentPartName : `${currentPath}.${part}`;
|
||||
if (!nodeMap[currentPath]) {
|
||||
const newNode = { children: {}, element: null, name: part, path: currentPath };
|
||||
parentNode.children[part] = newNode;
|
||||
nodeMap[currentPath] = newNode;
|
||||
}
|
||||
if (i === parts.length - 1) {
|
||||
nodeMap[currentPath].element = el;
|
||||
}
|
||||
parentNode = nodeMap[currentPath];
|
||||
}
|
||||
});
|
||||
return Object.values(treeRoot.children);
|
||||
}
|
||||
|
||||
function renderNodeAsLi(node, mustSupportPathsSet, level = 0) {
|
||||
if (!node || !node.element) return '';
|
||||
const el = node.element;
|
||||
const path = el.path || 'N/A';
|
||||
const min = el.min !== undefined ? el.min : '';
|
||||
const max = el.max || '';
|
||||
const short = el.short || '';
|
||||
const definition = el.definition || '';
|
||||
const isMustSupport = mustSupportPathsSet.has(path);
|
||||
const mustSupportDisplay = isMustSupport ? '<i class="bi bi-check-circle-fill text-warning"></i>' : '';
|
||||
const hasChildren = Object.keys(node.children).length > 0;
|
||||
const collapseId = `collapse-${path.replace(/[\.\:]/g, '-')}`;
|
||||
const padding = level * 20;
|
||||
const pathStyle = `padding-left: ${padding}px; white-space: nowrap;`;
|
||||
let typeString = 'N/A';
|
||||
if (el.type && el.type.length > 0) {
|
||||
typeString = el.type.map(t => {
|
||||
let s = t.code || '';
|
||||
if (t.targetProfile && t.targetProfile.length > 0) {
|
||||
const targetTypes = t.targetProfile.map(p => p.split('/').pop());
|
||||
s += `(<span class="text-muted">${targetTypes.join('|')}</span>)`;
|
||||
}
|
||||
return s;
|
||||
}).join(' | ');
|
||||
}
|
||||
const liClass = isMustSupport ? 'list-group-item py-1 px-2 list-group-item-warning' : 'list-group-item py-1 px-2';
|
||||
let childrenHtml = '';
|
||||
if (hasChildren) {
|
||||
childrenHtml += `<ul class="collapse list-group list-group-flush" id="${collapseId}">`;
|
||||
Object.values(node.children).sort((a,b) => a.name.localeCompare(b.name)).forEach(childNode => {
|
||||
childrenHtml += renderNodeAsLi(childNode, mustSupportPathsSet, level + 1);
|
||||
});
|
||||
childrenHtml += `</ul>`;
|
||||
}
|
||||
let itemHtml = `<li class="${liClass}">`;
|
||||
itemHtml += `<div class="row gx-2 align-items-center ${hasChildren ? 'collapse-toggle' : ''}" ${hasChildren ? `data-bs-toggle="collapse" href="#${collapseId}" role="button" aria-expanded="false" aria-controls="${collapseId}"` : ''}>`;
|
||||
itemHtml += `<div class="col-lg-4 col-md-3 text-truncate" style="${pathStyle}"><span style="display: inline-block; width: 1.2em; text-align: center;">`;
|
||||
if (hasChildren) {
|
||||
itemHtml += `<i class="bi bi-chevron-right small toggle-icon"></i>`;
|
||||
}
|
||||
itemHtml += `</span><code class="fw-bold ms-1">${node.name}</code></div>`;
|
||||
itemHtml += `<div class="col-lg-1 col-md-1 text-center text-muted small"><code>${min}..${max}</code></div>`;
|
||||
itemHtml += `<div class="col-lg-3 col-md-3 text-truncate small">${typeString}</div>`;
|
||||
itemHtml += `<div class="col-lg-1 col-md-1 text-center">${mustSupportDisplay}</div>`;
|
||||
let descriptionTooltipAttrs = '';
|
||||
if (definition) {
|
||||
const escapedDefinition = definition.replace(/"/g, '&quot;').replace(/'/g, '&#39;');
|
||||
descriptionTooltipAttrs = `data-bs-toggle="tooltip" data-bs-placement="top" title="${escapedDefinition}"`;
|
||||
}
|
||||
itemHtml += `<div class="col-lg-3 col-md-4 text-muted text-truncate small" ${descriptionTooltipAttrs}>${short}</div>`;
|
||||
itemHtml += `</div>`;
|
||||
itemHtml += childrenHtml;
|
||||
itemHtml += `</li>`;
|
||||
return itemHtml;
|
||||
}
|
||||
|
||||
function renderStructureTree(elements, mustSupportPaths) {
|
||||
if (!elements || elements.length === 0) {
|
||||
structureDisplay.innerHTML = '<p class="text-muted"><em>No elements found.</em></p>';
|
||||
return;
|
||||
}
|
||||
const mustSupportPathsSet = new Set(mustSupportPaths || []);
|
||||
console.log("Rendering tree. MS Set:", mustSupportPathsSet);
|
||||
const treeData = buildTreeData(elements);
|
||||
let html = '<p><small><span class="badge bg-warning text-dark border me-1">MS</span> = Must Support (Row Highlighted)</small></p>';
|
||||
html += `<div class="row gx-2 small fw-bold border-bottom mb-1 d-none d-md-flex"><div class="col-lg-4 col-md-3">Path</div><div class="col-lg-1 col-md-1 text-center">Card.</div><div class="col-lg-3 col-md-3">Type(s)</div><div class="col-lg-1 col-md-1 text-center">MS</div><div class="col-lg-3 col-md-4">Description</div></div>`;
|
||||
html += '<ul class="list-group list-group-flush">';
|
||||
treeData.sort((a,b) => a.name.localeCompare(b.name)).forEach(rootNode => {
|
||||
html += renderNodeAsLi(rootNode, mustSupportPathsSet, 0);
|
||||
});
|
||||
html += '</ul>';
|
||||
structureDisplay.innerHTML = html;
|
||||
var tooltipTriggerList = [].slice.call(structureDisplay.querySelectorAll('[data-bs-toggle="tooltip"]'));
|
||||
tooltipTriggerList.forEach(function (tooltipTriggerEl) {
|
||||
new bootstrap.Tooltip(tooltipTriggerEl);
|
||||
});
|
||||
structureDisplay.querySelectorAll('.collapse').forEach(collapseEl => {
|
||||
collapseEl.addEventListener('show.bs.collapse', event => {
|
||||
event.target.previousElementSibling.querySelector('.toggle-icon')?.classList.replace('bi-chevron-right', 'bi-chevron-down');
|
||||
});
|
||||
collapseEl.addEventListener('hide.bs.collapse', event => {
|
||||
event.target.previousElementSibling.querySelector('.toggle-icon')?.classList.replace('bi-chevron-down', 'bi-chevron-right');
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
var tooltipTriggerList = [].slice.call(document.querySelectorAll('[data-bs-toggle="tooltip"]'));
|
||||
var tooltipList = tooltipTriggerList.map(function (tooltipTriggerEl) { return new bootstrap.Tooltip(tooltipTriggerEl) });
|
||||
});
|
||||
</script>
|
||||
{% endblock scripts %}
|
||||
{% endblock content %} {# Main content block END #}
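The `buildTreeData` helper in the script above turns a flat list of dotted ElementDefinition paths into a nested tree keyed by path segment. The same algorithm can be sketched in Python for clarity (a simplified illustration of the approach, not the toolkit's own code):

```python
def build_tree(elements):
    """Group flat ElementDefinition-style dicts into a tree by dotted path.

    elements: list of dicts, each with a 'path' like 'Patient.name.given'.
    Returns the list of root nodes; each node has 'name', 'element', 'children'.
    """
    root = {"children": {}, "element": None, "name": "Root"}
    node_map = {"Root": root}
    for el in elements:
        path = el.get("path")
        if not path:
            continue
        parts = path.split(".")
        current_path = ""
        parent = root
        for i, part in enumerate(parts):
            current_path = part if i == 0 else f"{current_path}.{part}"
            if current_path not in node_map:
                # First time we see this path prefix: create an empty node.
                node = {"children": {}, "element": None, "name": part}
                parent["children"][part] = node
                node_map[current_path] = node
            if i == len(parts) - 1:
                # Leaf of this element's path: attach the element data.
                node_map[current_path]["element"] = el
            parent = node_map[current_path]
    return list(root["children"].values())
```

As in the JS version, intermediate nodes are created on demand, so elements may arrive in any order and ancestors without their own definition still appear in the tree.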
|
52
config.py
@ -1,52 +0,0 @@
|
||||
# config.py
|
||||
# Basic configuration settings for the Flask application
|
||||
|
||||
import os
|
||||
|
||||
# Determine the base directory of the application (where config.py lives)
|
||||
basedir = os.path.abspath(os.path.dirname(__file__))
|
||||
|
||||
# Define the instance path relative to the base directory
|
||||
# This seems correct if instance folder is at the same level as config.py/run.py
|
||||
instance_path = os.path.join(basedir, 'instance')
|
||||
|
||||
class Config:
|
||||
"""Base configuration class."""
|
||||
SECRET_KEY = os.environ.get('SECRET_KEY') or 'you-will-never-guess'
|
||||
|
||||
# Database configuration (Development/Production)
|
||||
# Points to 'instance/app.db' relative to config.py location
|
||||
SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') or \
|
||||
'sqlite:///' + os.path.join(instance_path, 'app.db')
|
||||
SQLALCHEMY_TRACK_MODIFICATIONS = False # Disable modification tracking
|
||||
|
||||
# Add other global configuration variables here
|
||||
SITE_NAME = "Modular PAS Framework"
|
||||
# Add any other default settings your app needs
|
||||
|
||||
|
||||
# --- ADDED Testing Configuration ---
|
||||
class TestingConfig(Config):
|
||||
"""Configuration specific to testing."""
|
||||
TESTING = True
|
||||
|
||||
# Use a separate database file for tests inside the instance folder
|
||||
# Ensures tests don't interfere with development data
|
||||
SQLALCHEMY_DATABASE_URI = 'sqlite:///' + os.path.join(instance_path, 'test.db')
|
||||
|
||||
# Disable CSRF protection during tests for simplicity
|
||||
WTF_CSRF_ENABLED = False
|
||||
|
||||
# Ensure Flask-Login works normally during tests (not disabled)
|
||||
LOGIN_DISABLED = False
|
||||
|
||||
# Use a fixed, predictable secret key for testing sessions
|
||||
SECRET_KEY = 'testing-secret-key'
|
||||
|
||||
# Inside class TestingConfig(Config):
|
||||
SERVER_NAME = 'localhost.test' # Or just 'localhost' is usually fine
|
||||
|
||||
# --- You could add other configurations like ProductionConfig(Config) later ---
|
||||
# class ProductionConfig(Config):
|
||||
# SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') # Should be set in prod env
|
||||
# # etc...
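The database-URI construction in `Config` can be checked in isolation; this snippet mirrors the same `os.path` logic (here anchored at the current working directory for illustration, where `config.py` uses `os.path.dirname(__file__)`):

```python
import os

# Mirrors config.py: the sqlite URI points at instance/app.db next to the
# config module, unless the DATABASE_URL environment variable overrides it.
basedir = os.path.abspath(os.getcwd())  # config.py uses dirname(__file__)
instance_path = os.path.join(basedir, "instance")
database_uri = os.environ.get("DATABASE_URL") or \
    "sqlite:///" + os.path.join(instance_path, "app.db")
```

Note that the three-slash `sqlite:///` prefix plus an absolute path yields the four-slash form SQLAlchemy expects for absolute sqlite paths on POSIX systems.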
|
BIN
instance/app.db
Binary file not shown.
BIN
instance/test.db
Binary file not shown.
@ -1,9 +0,0 @@
|
||||
[pytest]
|
||||
# Look for tests in the tests directory
|
||||
testpaths = tests
|
||||
# Set environment variables for testing
|
||||
# Ensures Flask uses the testing config and finds the app
|
||||
#env =
|
||||
# # Use D: prefix for default values if not set externally
|
||||
# D:FLASK_APP=run.py # Or app:create_app - CHECK THIS
|
||||
# D:FLASK_ENV=testing
|
82
rebuildDB.md
@ -1,82 +0,0 @@
|
||||
Step 2: Stop Container(s)
|
||||
|
||||
Bash
|
||||
|
||||
docker stop <your_container_id_or_name>
|
||||
# or if using docker compose:
|
||||
# docker compose down
|
||||
Step 3: Delete Local Database and Migrations
|
||||
|
||||
On your local Windows machine:
|
||||
|
||||
Delete the database file: C:\GIT\SudoPas_demo\instance\app.db
|
||||
Delete the test database file (if it exists): C:\GIT\SudoPas_demo\instance\test.db
|
||||
Delete the entire migrations folder: C:\GIT\SudoPas_demo\migrations\
|
||||
Step 4: Start a Temporary Container
|
||||
|
||||
Bash
|
||||
|
||||
docker compose up -d
|
||||
# OR if not using compose:
|
||||
# docker run ... your-image-name ...
|
||||
(Get the new container ID/name)
|
||||
|
||||
Step 5: Initialize Migrations Inside Container
|
||||
|
||||
Bash
|
||||
|
||||
docker exec -w /app <temp_container_id_or_name> flask db init
|
||||
Step 6: Copy New migrations Folder Out to Local
|
||||
|
||||
Run this in your local PowerShell or Command Prompt:
|
||||
|
||||
PowerShell
|
||||
|
||||
docker cp <temp_container_id_or_name>:/app/migrations C:\GIT\SudoPas_demo\migrations
|
||||
Verify the migrations folder now exists locally again, containing alembic.ini, env.py, etc., but the versions subfolder should be empty.
|
||||
|
||||
Step 7: Stop Temporary Container

Bash

docker stop <temp_container_id_or_name>
# or, if using docker compose:
# docker compose down

Step 8: Rebuild Docker Image

Crucially, rebuild the image so it includes the latest models.py and the new, empty migrations folder:

Bash

docker compose build
# OR
# docker build -t your-image-name .

Step 9: Start Final Container

Bash

docker compose up -d
# OR
# docker run ... your-image-name ...

(Get the final container ID/name.)

Step 10: Create Initial Migration Script

Now generate the first migration script based on your current models:

Bash

docker exec -w /app <final_container_id_or_name> flask db migrate -m "Initial migration with User, ModuleRegistry, ProcessedIg"

Check that a new script appeared in your local migrations/versions/ folder.

Step 11: Apply Migration (Create DB & Tables)

Run upgrade to create the new app.db and apply the initial schema:

Bash

docker exec -w /app <final_container_id_or_name> flask db upgrade

After this, you should have a completely fresh database matching your latest models. You will then need to:

Create your admin user again via signup or a dedicated command (if you create one).
Re-import any FHIR IGs via the UI if you want data to test with.
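As a rough sanity check of what the upgrade step produces, the sketch below creates an in-memory SQLite table resembling the kind the initial migration would emit for a User model. The exact column names and sizes here are assumptions for illustration — the real schema comes from app/models.py:

```python
import sqlite3

# Hypothetical DDL, similar in shape to what the initial Flask-Migrate
# script generates for a simple User model (columns are assumptions).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user (
        id INTEGER NOT NULL PRIMARY KEY,
        username VARCHAR(64) UNIQUE,
        email VARCHAR(120) UNIQUE,
        password_hash VARCHAR(256),
        role VARCHAR(20)
    )
""")
# sqlite_master lists every table in the database file
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)
```

Against the real app.db you could run the same sqlite_master query (e.g. via `sqlite3 /app/instance/app.db` inside the container) to confirm the user, module_registry, processed_ig, and alembic_version tables exist.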
@@ -1,17 +1,3 @@
-# requirements.txt
-# List of Python packages required for the project
-
-Flask
-Flask-SQLAlchemy
-Flask-Migrate
-python-dotenv
-Flask-WTF
-email-validator
-requests>=2.20 # Or later version
-Flask-Login # <-- ADD THIS LINE
-pytest>=7.0 # Or a specific later version
-# Optional, but helpful for Flask fixtures later:
-# pytest-flask>=1.0
-
-
-# We will add Flask-SQLAlchemy later when we set up the database
+Flask==2.3.3
+Flask-SQLAlchemy==3.0.5
+Werkzeug==2.3.7
run.py
@@ -1,45 +0,0 @@
# run.py
# Main entry point to start the Flask development server.

import os
from dotenv import load_dotenv

# Load environment variables from .env file, if it exists
# Useful for storing sensitive info like SECRET_KEY or DATABASE_URL locally
dotenv_path = os.path.join(os.path.dirname(__file__), '.env')
if os.path.exists(dotenv_path):
    load_dotenv(dotenv_path)
    print("Loaded environment variables from .env file.")
else:
    print(".env file not found, using default config or environment variables.")


# Import the application factory function (from app/__init__.py, which we'll create content for next)
# We assume the 'app' package exists with an __init__.py containing create_app()
try:
    from app import create_app
except ImportError as e:
    # Provide a helpful message if the app structure isn't ready yet
    print(f"Error importing create_app: {e}")
    print("Please ensure the 'app' directory and 'app/__init__.py' exist and define the create_app function.")
    # Exit or raise the error depending on desired behavior during setup
    raise

# Create the application instance using the factory
# This allows for different configurations (e.g., testing) if needed later
# We pass the configuration object from config.py
# from config import Config  # Assuming Config class is defined in config.py
# flask_app = create_app(Config)
# Simpler approach if create_app handles config loading internally:
flask_app = create_app()


if __name__ == '__main__':
    # Run the Flask development server
    # debug=True enables auto-reloading and detailed error pages (DO NOT use in production)
    # host='0.0.0.0' makes the server accessible on your local network
    print("Starting Flask development server...")
    # Port can be configured via environment variable or default to 5000
    port = int(os.environ.get('PORT', 5000))
    flask_app.run(host='0.0.0.0', port=port, debug=True)
@ -1,110 +0,0 @@
|
||||
# tests/conftest.py
|
||||
|
||||
import pytest
|
||||
import os
|
||||
from app import create_app, db, discover_and_register_modules # Keep import
|
||||
from config import TestingConfig
|
||||
from app.models import User
|
||||
from tests.test_control_panel import create_test_user, login # Or move helpers to a shared file
|
||||
|
||||
|
||||
# Determine the instance path for removing the test DB later
|
||||
instance_path = os.path.join(os.path.dirname(os.path.dirname(__file__)), 'instance')
|
||||
TEST_DB_PATH = os.path.join(instance_path, 'test.db')
|
||||
|
||||
# --- Use 'function' scope for better isolation ---
|
||||
@pytest.fixture(scope='function')
|
||||
def app():
|
||||
"""
|
||||
Function-scoped test Flask application. Configured for testing.
|
||||
Handles creation and teardown of the test database FOR EACH TEST.
|
||||
Initializes modules AFTER database creation.
|
||||
"""
|
||||
# Create app with TestingConfig, skip module init initially
|
||||
app = create_app(TestingConfig, init_modules=False)
|
||||
|
||||
# Establish an application context before accessing db
|
||||
with app.app_context():
|
||||
print(f"\n--- Setting up test database for function at {app.config['SQLALCHEMY_DATABASE_URI']} ---")
|
||||
# Ensure instance folder exists
|
||||
try:
|
||||
os.makedirs(app.instance_path)
|
||||
except OSError:
|
||||
pass # Already exists
|
||||
|
||||
# Remove old test database file if it exists (paranoid check)
|
||||
if os.path.exists(TEST_DB_PATH):
|
||||
# print(f"--- Removing old test database file: {TEST_DB_PATH} ---") # Can be noisy
|
||||
os.remove(TEST_DB_PATH)
|
||||
|
||||
# Create tables based on models
|
||||
db.create_all()
|
||||
print("--- Test database tables created ---") # Keep this print for confirmation
|
||||
|
||||
# --- FIX: Run discovery AFTER DB setup ---
|
||||
print("--- Initializing modules after DB setup ---") # Keep this print
|
||||
discover_and_register_modules(app) # <-- UNCOMMENTED / ADDED THIS LINE
|
||||
# --- End Module Discovery ---
|
||||
|
||||
# Yield the app instance for use in the single test function
|
||||
yield app
|
||||
|
||||
# --- Teardown (runs after each test function) ---
|
||||
# print("\n--- Tearing down test database for function ---") # Can be noisy
|
||||
db.session.remove()
|
||||
db.drop_all()
|
||||
# Optional: Remove the test database file after test run
|
||||
# if os.path.exists(TEST_DB_PATH):
|
||||
# os.remove(TEST_DB_PATH)
|
||||
|
||||
# --- Use 'function' scope ---
|
||||
@pytest.fixture(scope='function')
|
||||
def client(app):
|
||||
"""
|
||||
Provides a Flask test client for the function-scoped app.
|
||||
"""
|
||||
return app.test_client()
|
||||
|
||||
# --- Use 'function' scope ---
|
||||
@pytest.fixture(scope='function')
|
||||
def runner(app):
|
||||
"""
|
||||
Provides a Flask test CLI runner for the function-scoped app.
|
||||
"""
|
||||
return app.test_cli_runner()
|
||||
|
||||
# --- ADDED Fixture for Logged-In Admin Client ---
|
||||
@pytest.fixture(scope='function')
|
||||
def admin_client(client, app):
|
||||
"""
|
||||
Provides a test client already logged in as a pre-created admin user.
|
||||
Uses the function-scoped 'client' and 'app' fixtures.
|
||||
"""
|
||||
admin_username = "fixture_admin"
|
||||
admin_email = "fixture_admin@example.com" # Unique email
|
||||
admin_password = "password"
|
||||
|
||||
# Create admin user within the app context provided by the 'app' fixture
|
||||
with app.app_context():
|
||||
create_test_user(
|
||||
username=admin_username,
|
||||
email=admin_email,
|
||||
password=admin_password,
|
||||
role="admin"
|
||||
)
|
||||
|
||||
# Log the admin user in using the 'client' fixture
|
||||
login_res = login(client, admin_username, admin_password)
|
||||
# Basic check to ensure login likely succeeded (redirect expected)
|
||||
if login_res.status_code != 302:
|
||||
pytest.fail("Admin login failed during fixture setup.")
|
||||
|
||||
# Yield the already-logged-in client for the test
|
||||
yield client
|
||||
|
||||
# Teardown (logout) is optional as function scope cleans up,
|
||||
# but can be added for explicit cleanup if needed.
|
||||
# client.get(url_for('auth.logout'))
|
||||
|
||||
# --- Potential Future Fixtures ---
|
||||
# (Keep commented out potential session fixture)
|
@ -1,165 +0,0 @@
|
||||
# tests/test_auth.py
|
||||
|
||||
import pytest
|
||||
from flask import url_for, session, request # Keep request import
|
||||
from app.models import User
|
||||
from app import db
|
||||
from urllib.parse import urlparse, parse_qs # Keep URL parsing tools
|
||||
|
||||
# --- Helper to create a user ---
|
||||
# (Using the version that defaults email based on username)
|
||||
def create_test_user(username="testuser", email=None, password="password", role="user"):
|
||||
"""Helper function to add a user to the test database."""
|
||||
if email is None:
|
||||
email = f"{username}@example.test" # Default email based on username
|
||||
# Check if user already exists by username or email
|
||||
user = User.query.filter((User.username == username) | (User.email == email)).first()
|
||||
if user:
|
||||
print(f"\nDEBUG: Found existing test user '{user.username}' with ID {user.id}")
|
||||
return user
|
||||
user = User(username=username, email=email, role=role)
|
||||
user.set_password(password)
|
||||
db.session.add(user)
|
||||
db.session.commit()
|
||||
print(f"\nDEBUG: Created test user '{username}' (Role: {role}) with ID {user.id}")
|
||||
return user
|
||||
|
||||
# --- Tests ---
|
||||
|
||||
def test_login_page_loads(client):
|
||||
"""Test that the login page loads correctly."""
|
||||
response = client.get(url_for('auth.login'))
|
||||
assert response.status_code == 200
|
||||
assert b"Login" in response.data
|
||||
assert b"Username" in response.data
|
||||
assert b"Password" in response.data
|
||||
|
||||
def test_successful_login_as_admin(client, app):
|
||||
"""Test logging in with correct ADMIN credentials."""
|
||||
with app.app_context():
|
||||
admin_user = create_test_user(
|
||||
username="test_admin",
|
||||
email="admin@example.test",
|
||||
password="password",
|
||||
role="admin"
|
||||
)
|
||||
assert admin_user.id is not None
|
||||
assert admin_user.role == 'admin'
|
||||
|
||||
response = client.post(url_for('auth.login'), data={
|
||||
'username': 'test_admin',
|
||||
'password': 'password'
|
||||
}, follow_redirects=True)
|
||||
|
||||
assert response.status_code == 200
|
||||
assert b"Logout" in response.data
|
||||
assert bytes(f"{admin_user.username}", 'utf-8') in response.data
|
||||
assert b"Control Panel" in response.data
|
||||
# --- CORRECTED ASSERTION ---
|
||||
assert b"Manage Modules" in response.data # Check for the button/link text
|
||||
|
||||
|
||||
def test_login_wrong_password(client, app):
|
||||
"""Test logging in with incorrect password."""
|
||||
with app.app_context():
|
||||
create_test_user(username="wrong_pass_user", password="password")
|
||||
response = client.post(url_for('auth.login'), data={
|
||||
'username': 'wrong_pass_user',
|
||||
'password': 'wrongpassword'
|
||||
}, follow_redirects=True)
|
||||
assert response.status_code == 200
|
||||
assert b"Invalid username or password" in response.data
|
||||
assert b"Logout" not in response.data
|
||||
|
||||
def test_login_wrong_username(client):
|
||||
"""Test logging in with non-existent username."""
|
||||
response = client.post(url_for('auth.login'), data={
|
||||
'username': 'nosuchuser',
|
||||
'password': 'password'
|
||||
}, follow_redirects=True)
|
||||
assert response.status_code == 200
|
||||
assert b"Invalid username or password" in response.data
|
||||
assert b"Logout" not in response.data
|
||||
|
||||
def test_successful_login_as_user(client, app):
|
||||
"""Test logging in with correct USER credentials."""
|
||||
with app.app_context():
|
||||
test_user = create_test_user(
|
||||
username="test_user",
|
||||
email="user@example.test",
|
||||
password="password",
|
||||
role="user"
|
||||
)
|
||||
assert test_user.id is not None
|
||||
assert test_user.role == 'user'
|
||||
response = client.post(url_for('auth.login'), data={
|
||||
'username': 'test_user',
|
||||
'password': 'password'
|
||||
}, follow_redirects=True)
|
||||
assert response.status_code == 200
|
||||
assert b"Logout" in response.data
|
||||
assert bytes(f"{test_user.username}", 'utf-8') in response.data
|
||||
assert b"Control Panel" not in response.data
|
||||
site_name = app.config.get('SITE_NAME', 'PAS Framework')
|
||||
assert bytes(site_name, 'utf-8') in response.data
|
||||
|
||||
|
||||
# --- Replace the existing test_logout function with this: ---
|
||||
def test_logout(client, app):
|
||||
"""Test logging out."""
|
||||
with app.app_context():
|
||||
user = create_test_user(username='logout_user', password='password')
|
||||
login_res = client.post(url_for('auth.login'), data={'username': 'logout_user', 'password': 'password'})
|
||||
assert login_res.status_code == 302
|
||||
|
||||
logout_response = client.get(url_for('auth.logout'), follow_redirects=True)
|
||||
assert logout_response.status_code == 200
|
||||
assert b"You have been logged out." in logout_response.data
|
||||
assert b"Login" in logout_response.data
|
||||
assert b"Logout" not in logout_response.data
|
||||
|
||||
# Assert: Accessing protected page redirects to login
|
||||
protected_response = client.get(url_for('control_panel.index'), follow_redirects=False)
|
||||
assert protected_response.status_code == 302
|
||||
|
||||
# --- Use Manual Path Comparison ---
|
||||
redirect_location = protected_response.headers.get('Location', '')
|
||||
parsed_location = urlparse(redirect_location)
|
||||
query_params = parse_qs(parsed_location.query)
|
||||
|
||||
# Manually define the expected RELATIVE paths
|
||||
expected_login_path_manual = '/auth/login'
|
||||
expected_next_path_manual = '/control-panel/' # Includes trailing slash from previous logs
|
||||
|
||||
# Compare the path from the header with the known relative string
|
||||
assert parsed_location.path == expected_login_path_manual
|
||||
|
||||
# Check the 'next' parameter
|
||||
assert 'next' in query_params
|
||||
assert query_params['next'][0] == expected_next_path_manual
|
||||
|
||||
|
||||
# --- Replace the existing test_login_required_redirect function with this: ---
|
||||
def test_login_required_redirect(client, app):
|
||||
"""Test that accessing a protected page redirects to login when logged out."""
|
||||
# Act: Attempt to access control panel index
|
||||
response = client.get(url_for('control_panel.index'), follow_redirects=False)
|
||||
|
||||
# Assert: Check for redirect status code (302)
|
||||
assert response.status_code == 302
|
||||
|
||||
# --- Use Manual Path Comparison ---
|
||||
redirect_location = response.headers.get('Location', '')
|
||||
parsed_location = urlparse(redirect_location)
|
||||
query_params = parse_qs(parsed_location.query)
|
||||
|
||||
# Manually define the expected RELATIVE paths
|
||||
expected_login_path_manual = '/auth/login'
|
||||
expected_next_path_manual = '/control-panel/' # Includes trailing slash
|
||||
|
||||
# Compare the path from the header with the known relative string
|
||||
assert parsed_location.path == expected_login_path_manual
|
||||
|
||||
# Check the 'next' query parameter exists and has the correct value
|
||||
assert 'next' in query_params
|
||||
assert query_params['next'][0] == expected_next_path_manual
|
@ -1,222 +0,0 @@
|
||||
# tests/test_control_panel.py
|
||||
|
||||
import pytest
|
||||
from flask import url_for
|
||||
from app.models import User, ModuleRegistry # Import models
|
||||
from app import db
|
||||
from urllib.parse import urlparse, parse_qs # Make sure this is imported
|
||||
|
||||
# --- Test Helpers ---
|
||||
def create_test_user(username="testuser", email=None, password="password", role="user"):
|
||||
if email is None: email = f"{username}@example.test"
|
||||
# Use SQLAlchemy 2.0 style query
|
||||
user = db.session.scalar(db.select(User).filter((User.username == username) | (User.email == email)))
|
||||
if user: print(f"\nDEBUG: Found existing test user '{user.username}' with ID {user.id}"); return user
|
||||
user = User(username=username, email=email, role=role); user.set_password(password)
|
||||
db.session.add(user); db.session.commit()
|
||||
print(f"\nDEBUG: Created test user '{username}' (Role: {role}) with ID {user.id}"); return user
|
||||
|
||||
def login(client, username, password):
|
||||
return client.post(url_for('auth.login'), data=dict(username=username, password=password), follow_redirects=False)
|
||||
|
||||
# --- Access Control Tests ---
|
||||
def test_cp_access_admin(client, app): # PASSED
|
||||
with app.app_context(): create_test_user(username="cp_admin", password="password", role="admin")
|
||||
login_res = login(client, "cp_admin", "password"); assert login_res.status_code == 302
|
||||
cp_index_res = client.get(url_for('control_panel.index')); assert cp_index_res.status_code == 200; assert b"Control Panel" in cp_index_res.data
|
||||
module_res = client.get(url_for('control_panel.manage_modules')); assert module_res.status_code == 200; assert b"Module Management" in module_res.data
|
||||
|
||||
def test_cp_access_user(client, app): # PASSED
|
||||
with app.app_context(): create_test_user(username="cp_user", password="password", role="user")
|
||||
login_res = login(client, "cp_user", "password"); assert login_res.status_code == 302
|
||||
cp_index_res = client.get(url_for('control_panel.index')); assert cp_index_res.status_code == 403
|
||||
module_res = client.get(url_for('control_panel.manage_modules')); assert module_res.status_code == 403
|
||||
|
||||
def test_cp_access_logged_out(client, app): # PASSED
|
||||
cp_index_res = client.get(url_for('control_panel.index'), follow_redirects=False); assert cp_index_res.status_code == 302
|
||||
module_res = client.get(url_for('control_panel.manage_modules'), follow_redirects=False); assert module_res.status_code == 302
|
||||
|
||||
|
||||
# --- Module Management Tests ---
|
||||
def test_module_manager_list(client, app): # PASSED
|
||||
with app.app_context(): create_test_user(username="module_admin", password="password", role="admin")
|
||||
login(client, "module_admin", "password")
|
||||
response = client.get(url_for('control_panel.manage_modules'))
|
||||
assert response.status_code == 200; assert b"Module Management" in response.data
|
||||
assert b"example_module" in response.data; assert b"Disabled" in response.data
|
||||
|
||||
def test_module_manager_toggle(client, app): # PASSED
|
||||
with app.app_context():
|
||||
admin = create_test_user(username="toggle_admin", password="password", role="admin")
|
||||
# Use SQLAlchemy 2.0 style query
|
||||
module_entry = db.session.scalar(db.select(ModuleRegistry).filter_by(module_id='example_module'))
|
||||
assert module_entry is not None, "Check conftest.py discovery call."
|
||||
module_entry.is_enabled = False; db.session.commit()
|
||||
login(client, "toggle_admin", "password")
|
||||
# Enable
|
||||
enable_url = url_for('control_panel.toggle_module_status', module_id='example_module')
|
||||
response_enable = client.post(enable_url, follow_redirects=False)
|
||||
assert response_enable.status_code == 302; redirect_location_enable = response_enable.headers.get('Location', ''); parsed_location_enable = urlparse(redirect_location_enable); expected_path_manual = '/control-panel/modules'; assert parsed_location_enable.path == expected_path_manual
|
||||
with client.session_transaction() as sess: assert '_flashes' in sess; assert "has been enabled" in sess['_flashes'][-1][1]
|
||||
with app.app_context(): module_entry_after_enable = db.session.scalar(db.select(ModuleRegistry).filter_by(module_id='example_module')); assert module_entry_after_enable is not None and module_entry_after_enable.is_enabled is True
|
||||
# Disable
|
||||
disable_url = url_for('control_panel.toggle_module_status', module_id='example_module')
|
||||
response_disable = client.post(disable_url, follow_redirects=False)
|
||||
assert response_disable.status_code == 302; redirect_location_disable = response_disable.headers.get('Location', ''); parsed_location_disable = urlparse(redirect_location_disable); assert parsed_location_disable.path == expected_path_manual
|
||||
with client.session_transaction() as sess: assert '_flashes' in sess; assert "has been disabled" in sess['_flashes'][-1][1]
|
||||
with app.app_context(): module_entry_after_disable = db.session.scalar(db.select(ModuleRegistry).filter_by(module_id='example_module')); assert module_entry_after_disable is not None and module_entry_after_disable.is_enabled is False
|
||||
|
||||
|
||||
# --- User CRUD Tests ---
|
||||
|
||||
def test_add_user_page_loads(client, app): # PASSED
|
||||
with app.app_context(): create_test_user(username="crud_admin", password="password", role="admin")
|
||||
login(client, "crud_admin", "password")
|
||||
response = client.get(url_for('control_panel.add_user'))
|
||||
assert response.status_code == 200; assert b"Add New User" in response.data; assert b"Username" in response.data
|
||||
assert b"Email" in response.data; assert b"Password" in response.data; assert b"Repeat Password" in response.data
|
||||
assert b"Role" in response.data; assert b"Add User" in response.data
|
||||
|
||||
def test_add_user_success(client, app): # PASSED
|
||||
with app.app_context(): create_test_user(username="crud_admin_adder", password="password", role="admin")
|
||||
login(client, "crud_admin_adder", "password")
|
||||
new_username = "new_test_user"; new_email = "new@example.com"; new_password = "new_password"; new_role = "user"
|
||||
response = client.post(url_for('control_panel.add_user'), data={'username': new_username, 'email': new_email, 'password': new_password,'password2': new_password,'role': new_role,'submit': 'Add User'}, follow_redirects=True)
|
||||
assert response.status_code == 200; assert b"User new_test_user (user) added successfully!" in response.data
|
||||
assert bytes(new_username, 'utf-8') in response.data; assert bytes(new_email, 'utf-8') in response.data
|
||||
with app.app_context():
|
||||
newly_added_user = db.session.scalar(db.select(User).filter_by(username=new_username)) # Use 2.0 style
|
||||
assert newly_added_user is not None; assert newly_added_user.email == new_email; assert newly_added_user.role == new_role; assert newly_added_user.check_password(new_password) is True
|
||||
|
||||
# --- Edit User Tests ---
|
||||
|
||||
# --- MODIFIED: Use admin_client fixture ---
|
||||
def test_edit_user_page_loads(admin_client, app): # Use admin_client instead of client
|
||||
"""Test the 'Edit User' page loads correctly with user data."""
|
||||
# Arrange: Create ONLY the target user
|
||||
with app.app_context():
|
||||
target_user = create_test_user(username="edit_target", email="edit@target.com", role="user")
|
||||
target_user_id = target_user.id # Get ID
|
||||
target_user_username = target_user.username # Store username if needed for assert
|
||||
target_user_email = target_user.email
|
||||
target_user_role = target_user.role
|
||||
|
||||
# Act: Get the 'Edit User' page using the logged-in admin client
|
||||
# Pass target user's ID
|
||||
response = admin_client.get(url_for('control_panel.edit_user', user_id=target_user_id))
|
||||
|
||||
# Assert: Check page loads and contains correct pre-filled data
|
||||
assert response.status_code == 200
|
||||
assert bytes(f"Edit User", 'utf-8') in response.data # Use simpler title from route
|
||||
# Check if form fields (rendered by template using data from route) are present
|
||||
# These assertions rely on how edit_user.html renders the form passed by the route
|
||||
assert bytes(f'value="{target_user_username}"', 'utf-8') in response.data
|
||||
assert bytes(f'value="{target_user_email}"', 'utf-8') in response.data
|
||||
assert bytes(f'<option selected value="{target_user_role}">', 'utf-8') in response.data \
|
||||
or bytes(f'<option value="{target_user_role}" selected>', 'utf-8') in response.data \
|
||||
or bytes(f'<option value="{target_user_role}" selected="selected">', 'utf-8') in response.data
|
||||
assert b"Save Changes" in response.data
|
||||
|
||||
|
||||
# Keep test_edit_user_success (it passed, but ensure it uses unique users)
|
||||
def test_edit_user_success(client, app): # Keep using regular client for separation
|
||||
"""Test successfully editing a user."""
|
||||
with app.app_context():
|
||||
target_user = create_test_user(username="edit_target_success", email="edit_success@target.com", role="user")
|
||||
target_user_id = target_user.id
|
||||
# Create a distinct admin just for this test's login action
|
||||
admin = create_test_user(username="edit_admin_success", email="edit_admin_success@example.com", role="admin")
|
||||
login(client, "edit_admin_success", "password") # Log in admin for this action
|
||||
|
||||
updated_username = "edited_username"; updated_email = "edited@example.com"; updated_role = "admin"
|
||||
response = client.post(url_for('control_panel.edit_user', user_id=target_user_id), data={'username': updated_username, 'email': updated_email, 'role': updated_role, 'submit': 'Save Changes'}, follow_redirects=True)
|
||||
# ... (assertions as before) ...
|
||||
assert response.status_code == 200
|
||||
assert bytes(f"User {updated_username} updated successfully!", 'utf-8') in response.data
|
||||
assert bytes(updated_username, 'utf-8') in response.data; assert bytes(updated_email, 'utf-8') in response.data
|
||||
with app.app_context(): edited_user = db.session.get(User, target_user_id); assert edited_user is not None; assert edited_user.username == updated_username; assert edited_user.email == updated_email; assert edited_user.role == updated_role
|
||||
|
||||
|
||||
|
||||
# --- Delete User Tests ---
|
||||
|
||||
# --- CORRECTED: test_delete_user_success ---
|
||||
def test_delete_user_success(client, app):
|
||||
"""Test successfully deleting a user."""
|
||||
# Arrange
|
||||
with app.app_context():
|
||||
target_user = create_test_user(username="delete_target", email="delete@target.com", role="user")
|
||||
target_user_id = target_user.id; target_username = target_user.username
|
||||
admin = create_test_user(username="delete_admin", email="delete_admin@example.com", role="admin")
|
||||
admin_username = admin.username
|
||||
assert db.session.get(User, target_user_id) is not None
|
||||
login(client, "delete_admin", "password")
|
||||
|
||||
# Act
|
||||
response = client.post(url_for('control_panel.delete_user', user_id=target_user_id), data={'submit': 'Delete'}, follow_redirects=True)
|
||||
|
||||
# Assert 1: Verify DB deletion (Most Reliable Check)
|
||||
with app.app_context():
|
||||
deleted_user = db.session.get(User, target_user_id)
|
||||
assert deleted_user is None, "User was not deleted from the database."
|
||||
|
||||
# Assert 2: Check page status and flash message
|
||||
assert response.status_code == 200
|
||||
assert bytes(f"User {target_username} deleted successfully!", 'utf-8') in response.data
|
||||
|
||||
# --- FIX: Removed unreliable check of rendered HTML list ---
|
||||
# assert bytes(admin_username, 'utf-8') in response.data
|
||||
# assert bytes(target_username, 'utf-8') not in response.data
|
||||
# --- End Fix ---
|
||||
|
||||
|
||||
# --- Change Password Tests ---
|
||||
|
||||
def test_change_password_page_loads(admin_client, app): # Use admin_client instead of client
|
||||
"""Test the 'Change Password' page loads correctly."""
|
||||
# Arrange: Create ONLY the target user
|
||||
with app.app_context():
|
||||
target_user = create_test_user(username="pw_target", email="pw@target.com", role="user")
|
||||
target_user_id = target_user.id
|
||||
target_user_username = target_user.username # Store username if needed
|
||||
|
||||
# Act: Get the 'Change Password' page using logged-in admin client
|
||||
response = admin_client.get(url_for('control_panel.change_password', user_id=target_user_id))
|
||||
|
||||
# Assert: Check page loads and contains expected fields
|
||||
assert response.status_code == 200
|
||||
assert bytes(f"Change Password", 'utf-8') in response.data # Use simpler title
|
||||
assert b"New Password" in response.data
|
||||
assert b"Repeat New Password" in response.data
|
||||
assert b"Change Password" in response.data
|
||||
|
||||
def test_change_password_success(client, app): # Should Pass Now
|
||||
"""Test successfully changing a user's password."""
|
||||
original_password = "old_password"
|
||||
new_password = "new_secure_password"
|
||||
with app.app_context():
|
||||
target_user = create_test_user(username="pw_target_success", email="pw_success@target.com", password=original_password, role="user")
|
||||
target_username = target_user.username
|
||||
target_user_id = target_user.id
|
||||
admin = create_test_user(username="pw_admin_success", email="pw_admin_success@example.com", role="admin")
|
||||
assert target_user.check_password(original_password) is True
|
||||
login(client, "pw_admin_success", "password")
|
||||
|
||||
response = client.post(url_for('control_panel.change_password', user_id=target_user_id), data={
|
||||
'password': new_password,
|
||||
'password2': new_password,
|
||||
'submit': 'Change Password'
|
||||
}, follow_redirects=True)
|
||||
|
||||
assert response.status_code == 200
|
||||
assert bytes(f"Password for user {target_username} has been updated.", 'utf-8') in response.data
|
||||
|
||||
# Assert: Verify password change in DB
|
||||
with app.app_context():
|
||||
# FIX: Use db.session.get
|
||||
updated_user = db.session.get(User, target_user_id)
|
||||
assert updated_user is not None
|
||||
assert updated_user.check_password(new_password) is True
|
||||
assert updated_user.check_password(original_password) is False
|
||||
|
||||
# --- TODO: Add tests for Edit/Delete/ChangePW errors ---
|
@ -1,31 +0,0 @@
|
||||
# tests/test_core.py
|
||||
|
||||
from flask import url_for
|
||||
|
||||
def test_app_exists(app):
|
||||
""" Test if the Flask app fixture loads correctly. """
|
||||
assert app is not None
|
||||
|
||||
def test_request_index_page(client, app):
|
||||
"""
|
||||
Test if the index page loads successfully (GET request).
|
||||
Uses the 'client' fixture provided by conftest.py.
|
||||
"""
|
||||
# Make a GET request to the root URL ('/')
|
||||
# Note: We use '/' here, assuming your core blueprint maps '/' or '/index'
|
||||
response = client.get('/')
|
||||
|
||||
# Assert that the HTTP status code is 200 (OK)
|
||||
assert response.status_code == 200
|
||||
|
||||
# Optional: Assert that some expected content is in the response HTML
|
||||
# We access response.data, which is bytes, hence the b"..." prefix
|
||||
# Let's check for the site name defined in config.py
|
||||
site_name = app.config.get('SITE_NAME', 'PAS Framework') # Get site name from app config
|
||||
assert bytes(site_name, 'utf-8') in response.data
|
||||
|
||||
# Optional: Test using url_for within the test context
|
||||
# This requires the app context from the fixture
|
||||
# Need to ensure SERVER_NAME is set in TestingConfig if using external=True
|
||||
# response_index = client.get(url_for('core.index'))
|
||||
# assert response_index.status_code == 200
|