Mirror of https://github.com/Comfy-Org/ComfyUI-Manager.git
feat: Draft pip package policy management system (not yet integrated)

Add a comprehensive pip dependency conflict resolution framework as a draft implementation. This is self-contained and does not affect existing ComfyUI Manager functionality.

Key components:
- pip_util.py with PipBatch class for policy-driven package management
- Lazy-loaded policy system supporting base + user overrides
- Multi-stage policy execution (uninstall → apply_first_match → apply_all_matches → restore)
- Conditional policies based on platform, installed packages, and ComfyUI version
- Comprehensive test suite covering edge cases, workflows, and platform scenarios
- Design and implementation documentation

Policy capabilities (draft):
- Package replacement (e.g., PIL → Pillow, opencv-python → opencv-contrib-python)
- Version pinning to prevent dependency conflicts
- Dependency protection during installations
- Platform-specific handling (Linux/Windows, GPU detection)
- Pre-removal and post-restoration workflows

Testing infrastructure:
- Pytest-based test suite with isolated environments
- Dependency analysis tools for conflict detection
- Coverage for policy priority, edge cases, and environment recovery

Status: Draft implementation complete, integration with manager workflows pending.
parent 1ab2b1aeb3 · commit 2866193baf
@@ -11,4 +11,5 @@ include extras.json
 include github-stats.json
 include model-list.json
 include alter-list.json
 include comfyui_manager/channels.list.template
+include comfyui_manager/pip-policy.json
comfyui_manager/common/pip_util.design.en.md (new file, 713 lines)
@@ -0,0 +1,713 @@
# Design Document for pip_util.py Implementation

This is designed to minimize breaking existing installed dependencies.

## List of Functions to Implement

## Global Policy Management

### Global Variables

```python
_pip_policy_cache = None  # Policy cache (program-wide, loaded once)
```

### Global Functions

* get_pip_policy(): Returns the policy for resolving pip dependency conflicts (lazy loading)
  - **Call timing**: Called whenever needed (automatically loads only once on first call)
  - **Purpose**: Returns the policy cache, automatically loading it if the cache is empty
  - **Execution flow**:
    1. Declare global _pip_policy_cache
    2. If _pip_policy_cache is already loaded, return immediately (prevent duplicate loading)
    3. Read the base policy file:
       - Path: {manager_util.comfyui_manager_path}/pip-policy.json
       - Use an empty dictionary if the file doesn't exist
       - Log an error and use an empty dictionary if JSON parsing fails
    4. Read the user policy file:
       - Path: {context.manager_files_path}/pip-policy.user.json
       - Create an empty JSON file if it doesn't exist ({"_comment": "User-specific pip policy overrides"})
       - Log a warning and use an empty dictionary if JSON parsing fails
    5. Apply merge rules (merge by package name; see the sketch after this list):
       - Start from the base policy
       - For each package in the user policy:
         * Package only in user policy: add to base
         * Package only in base policy: keep in base
         * Package in both: completely replace with the user policy (entire package replacement, not section-level)
    6. Store the merged policy in _pip_policy_cache
    7. Log policy load success (include the number of loaded package policies)
    8. Return _pip_policy_cache
  - **Return value**: Dict (merged policy dictionary)
  - **Exception handling**:
    - File read failure: Log a warning and treat the file as an empty dictionary
    - JSON parsing failure: Log an error and treat the file as an empty dictionary
  - **Notes**:
    - Lazy loading pattern automatically loads on first call
    - Not thread-safe; caution is needed in multi-threaded environments
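
A condensed sketch of steps 3-6 above (error handling and user-file creation are omitted here, and the shipped pip_util.py remains the authoritative version):

```python
import json
from pathlib import Path

def _load_policy_file(path: Path) -> dict:
    # Missing or unparsable files degrade to an empty policy dictionary.
    try:
        return json.loads(path.read_text(encoding="utf-8")) if path.exists() else {}
    except json.JSONDecodeError:
        return {}

def _merge_policies(base: dict, user: dict) -> dict:
    # The merge is package-granular: a user entry replaces the whole base entry.
    merged = base.copy()
    for name, package_policy in user.items():
        if name.startswith("_"):      # skip metadata keys such as "_comment"
            continue
        merged[name] = package_policy
    return merged
```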

- Policy file structure should support the following scenarios:
  - Dictionary structure of {dependency name -> policy object}
  - Policy object has four policy sections (a consolidated example follows the execution-order list below):
    - **uninstall**: Package removal policy (pre-processing, condition optional)
    - **apply_first_match**: Evaluate top-to-bottom and execute only the first policy that satisfies its condition (exclusive)
    - **apply_all_matches**: Execute all policies that satisfy their conditions (cumulative)
    - **restore**: Package restoration policy (post-processing, condition optional)

- Condition types:
  - installed: Check a version condition of an already installed dependency
    - spec is optional
    - package field: Specify the package to check (optional, defaults to self)
      - Explicit: Reference another package (e.g., numba checks numpy's version)
      - Omitted: Check own version (e.g., critical-package checks its own version)
  - platform: Platform conditions (os, has_gpu, comfyui_version, etc.)
  - If a condition is absent, it is always considered satisfied

- uninstall policy (pre-removal policy):
  - Removal policy list (condition is optional; evaluate top-to-bottom and execute only the first match)
  - When the condition is satisfied (or always if there is no condition): remove the target package and abort installation
  - If this policy is applied, all subsequent steps are ignored
  - target field specifies the package to remove
  - Example: Unconditionally remove if a specific package is installed

- Actions available in apply_first_match (determine the installation method, exclusive):
  - skip: Block installation of a specific dependency
  - force_version: Force a change to a specific version during installation
    - extra_index_url field can specify a custom package repository (optional)
  - replace: Replace with a different dependency
    - extra_index_url field can specify a custom package repository (optional)

- Actions available in apply_all_matches (installation options, cumulative):
  - pin_dependencies: Pin currently installed versions of other dependencies
    - pinned_packages field specifies the package list
    - Example: `pip install requests urllib3==1.26.15 certifi==2023.7.22 charset-normalizer==3.2.0`
    - Real use case: Prevent urllib3 from upgrading to 2.x when installing requests
    - on_failure: "fail" or "retry_without_pin"
  - install_with: Specify additional dependencies to install together
  - warn: Record a warning message in the log

- restore policy (post-restoration policy):
  - Restoration policy list (condition is optional; evaluate top-to-bottom and execute only the first match)
  - Executed after package installation completes (post-processing)
  - When the condition is satisfied (or always if there is no condition): force install the target package to a specific version
  - target field specifies the package to restore (can be a different package)
  - version field specifies the version to install
  - extra_index_url field can specify a custom package repository (optional)
  - Example: Reinstall/change version if a specific package was deleted or is at the wrong version

- Execution order:
  1. uninstall evaluation: If the condition is satisfied, remove the package and **terminate** (ignore subsequent steps)
  2. apply_first_match evaluation:
     - Execute the first policy that satisfies its condition among skip/force_version/replace
     - If no policy matches, proceed with default installation of the originally requested package
  3. apply_all_matches evaluation: Apply all pin_dependencies, install_with, and warn policies that satisfy their conditions
  4. Execute the actual package installation (pip install or uv pip install)
  5. restore evaluation: If the condition is satisfied, restore the target package (post-processing)
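
Putting the four sections together, a single policy entry has the shape below, shown as a Python dict mirroring the JSON structure. The package names and conditions are purely illustrative; full JSON examples appear later in this document.

```python
# Illustrative only -- "example-package" and the referenced packages are hypothetical.
example_policy_entry = {
    "example-package": {
        "uninstall": [
            {"condition": {"type": "installed", "package": "old-conflict", "spec": ">=2.0.0"},
             "target": "old-conflict",
             "reason": "old-conflict >=2.0.0 conflicts with example-package"}
        ],
        "apply_first_match": [
            {"condition": {"type": "platform", "os": "linux", "has_gpu": True},
             "type": "replace", "replacement": "example-package-gpu",
             "reason": "use the GPU build on Linux with CUDA"}
        ],
        "apply_all_matches": [
            {"type": "pin_dependencies", "pinned_packages": ["numpy"],
             "on_failure": "retry_without_pin",
             "reason": "keep numpy at its currently installed version"}
        ],
        "restore": [
            {"target": "example-package", "version": "1.0.0",
             "reason": "always end the batch on a known-good version"}
        ]
    }
}
```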

## Batch Unit Class (PipBatch)

### Class Structure

```python
class PipBatch:
    """
    pip package installation batch unit manager

    Maintains a pip freeze cache during batch operations for performance optimization.

    Usage pattern:
        # Batch operations (policy auto-loaded)
        with PipBatch() as batch:
            batch.ensure_not_installed()
            batch.install("numpy>=1.20")
            batch.install("pandas>=2.0")
            batch.install("scipy>=1.7")
            batch.ensure_installed()
    """

    def __init__(self):
        self._installed_cache = None  # Installed packages cache (batch-level)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._installed_cache = None
```

### Private Methods

* PipBatch._refresh_installed_cache():
  - **Purpose**: Read currently installed package information and refresh the cache
  - **Execution flow**:
    1. Generate the command using manager_util.make_pip_cmd(["freeze"])
    2. Execute pip freeze via subprocess
    3. Parse the output:
       - Each line is in "package_name==version" format
       - Parse "package_name==version" to build a dictionary
       - Ignore editable packages (starting with -e)
       - Ignore comments (starting with #)
    4. Store the parsed dictionary in self._installed_cache
  - **Return value**: None
  - **Exception handling**:
    - pip freeze failure: Set the cache to an empty dictionary and log a warning
    - Parse failure: Ignore the line and continue

* PipBatch._get_installed_packages():
  - **Purpose**: Return cached installed package information (refresh if the cache is None)
  - **Execution flow**:
    1. If self._installed_cache is None, call _refresh_installed_cache()
    2. Return self._installed_cache
  - **Return value**: {package_name: version} dictionary

* PipBatch._invalidate_cache():
  - **Purpose**: Invalidate the cache after package install/uninstall
  - **Execution flow**:
    1. Set self._installed_cache = None
  - **Return value**: None
  - **Call timing**: After install(), ensure_not_installed(), ensure_installed()

* PipBatch._parse_package_spec(package_info):
  - **Purpose**: Split a package spec string into package name and version spec
  - **Parameters**:
    - package_info: "numpy", "numpy==1.26.0", "numpy>=1.20.0", "numpy~=1.20", etc.
  - **Execution flow** (a sketch follows this list):
    1. Use a regex to split the package name and version spec
    2. Pattern: `^([a-zA-Z0-9_-]+)([><=!~]+.*)?$`
  - **Return value**: (package_name, version_spec) tuple
    - Examples: ("numpy", "==1.26.0"), ("pandas", ">=2.0.0"), ("scipy", None)
  - **Exception handling**:
    - Parse failure: Raise ValueError
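
A minimal sketch of this parser using the pattern above. Note that the shipped pip_util.py ultimately parses specs with packaging.requirements.Requirement for full PEP 508 support, so this regex form is only the simplified behavior described here:

```python
import re
from typing import Optional, Tuple

_SPEC_PATTERN = re.compile(r'^([a-zA-Z0-9_-]+)([><=!~]+.*)?$')

def parse_package_spec(package_info: str) -> Tuple[str, Optional[str]]:
    # "numpy" -> ("numpy", None); "numpy==1.26.0" -> ("numpy", "==1.26.0")
    match = _SPEC_PATTERN.match(package_info.strip())
    if match is None:
        raise ValueError(f"Invalid package spec: {package_info!r}")
    return match.group(1), match.group(2)
```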

* PipBatch._evaluate_condition(condition, package_name, installed_packages):
  - **Purpose**: Evaluate a policy condition and return whether it is satisfied
  - **Parameters**:
    - condition: Policy condition object (dictionary)
    - package_name: Name of the package currently being processed
    - installed_packages: {package_name: version} dictionary
  - **Execution flow** (a sketch follows this list):
    1. If condition is None, return True (always satisfied)
    2. Branch based on condition["type"]:
       a. "installed" type:
          - target_package = condition.get("package", package_name)
          - Check the current version with installed_packages.get(target_package)
          - If not installed (None), return False
          - If spec exists, compare versions using packaging.specifiers.SpecifierSet
          - If there is no spec, only check installation status (True)
       b. "platform" type:
          - If condition["os"] exists, compare with platform.system()
          - If condition["has_gpu"] exists, check GPU presence (torch.cuda.is_available(), etc.)
          - If condition["comfyui_version"] exists, compare the ComfyUI version
          - Return True if all conditions are satisfied
    3. Return True if all conditions are satisfied, False if any is unsatisfied
  - **Return value**: bool
  - **Exception handling**:
    - Version comparison failure: Log a warning and return False
    - Unknown condition type: Log a warning and return False
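
A sketch of this evaluation under the rules above; the ComfyUI version comparison is left as a stub, matching the TODO noted in the implementation plan:

```python
import platform
from packaging.specifiers import SpecifierSet

def _has_gpu() -> bool:
    # GPU detection as described above; a missing torch is treated as "no GPU".
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False

def evaluate_condition(condition, package_name, installed_packages) -> bool:
    if condition is None:
        return True                       # an absent condition is always satisfied
    ctype = condition.get("type")
    if ctype == "installed":
        target = condition.get("package", package_name)
        current = installed_packages.get(target)
        if current is None:
            return False
        spec = condition.get("spec")
        return True if spec is None else current in SpecifierSet(spec)
    if ctype == "platform":
        if "os" in condition and condition["os"].lower() != platform.system().lower():
            return False
        if "has_gpu" in condition and condition["has_gpu"] != _has_gpu():
            return False
        # condition["comfyui_version"] comparison would go here (not yet implemented)
        return True
    return False                          # unknown condition type
```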

### Public Methods

* PipBatch.install(package_info, extra_index_url=None, override_policy=False):
  - **Purpose**: Perform policy-based pip package installation (per individual package)
  - **Parameters**:
    - package_info: Package name and version spec (e.g., "numpy", "numpy==1.26.0", "numpy>=1.20.0")
    - extra_index_url: Additional package repository URL (optional)
    - override_policy: If True, skip policy application and install directly (default: False)
  - **Execution flow** (a condensed sketch follows this list):
    1. Call get_pip_policy() to get the policy (lazy loading)
    2. Use self._parse_package_spec() to split package_info into package name and version spec
    3. Call self._get_installed_packages() to get cached installed package information
    4. If override_policy=True → jump directly to step 10 (skip policy)
    5. Get the policy for the package name from the policy dictionary
    6. If there is no policy → jump to step 10 (default installation)
    7. **apply_first_match policy evaluation** (exclusive: only the first match):
       - Iterate through the policy list top-to-bottom
       - Evaluate each policy's condition with self._evaluate_condition()
       - When the first condition-satisfying policy is found:
         * type="skip": Log the reason and return False (don't install)
         * type="force_version": Change package_info's version to the policy's version
         * type="replace": Completely replace package_info with the policy's replacement package
       - If no policy matches, keep the original package_info
    8. **apply_all_matches policy evaluation** (cumulative: all matches):
       - Iterate through the policy list top-to-bottom
       - Evaluate each policy's condition with self._evaluate_condition()
       - For all condition-satisfying policies:
         * type="pin_dependencies":
           - For each package in pinned_packages, query the current version with self._installed_cache.get(pkg)
           - Pin to the installed version in "package==version" format
           - Add to the installation package list
         * type="install_with":
           - Add additional_packages to the installation package list
         * type="warn":
           - Output the message as a warning log
           - If allow_continue=false, wait for user confirmation (optional)
    9. Compose the final installation package list:
       - Main package (modified/replaced package_info)
       - Packages pinned by pin_dependencies
       - Packages added by install_with
    10. Handle extra_index_url:
        - A parameter-passed extra_index_url takes priority
        - Otherwise use the extra_index_url defined in the policy
    11. Generate the pip/uv command using manager_util.make_pip_cmd():
        - Basic format: ["pip", "install"] + package list
        - If extra_index_url exists: add ["--extra-index-url", url]
    12. Execute the command via subprocess
    13. Handle installation failure:
        - If pin_dependencies' on_failure="retry_without_pin":
          * Retry with only the main package, excluding pinned packages
        - If on_failure="fail":
          * Raise an exception and abort the installation
        - Otherwise: Log a warning and continue
    14. On successful installation:
        - Call self._invalidate_cache() (invalidate the cache)
        - Log info if a reason exists
        - Return True
  - **Return value**: Installation success status (bool)
  - **Exception handling**:
    - Policy parsing failure: Log a warning and proceed with default installation
    - Installation failure: Log an error and raise an exception (depends on the on_failure setting)
  - **Notes**:
    - restore policy is not handled in this method (batch-processed in ensure_installed())
    - uninstall policy is not handled in this method (batch-processed in ensure_not_installed())
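
A condensed sketch of steps 7-14 under the assumptions above (module-level logger, manager_util, and subprocess imports as in pip_util.py; logging, the allow_continue prompt, and several failure branches are omitted; the shipped pip_util.py is the authoritative version):

```python
def install(self, package_info, extra_index_url=None, override_policy=False) -> bool:
    name, _spec = self._parse_package_spec(package_info)
    installed = self._get_installed_packages()
    policy = {} if override_policy else get_pip_policy().get(name, {})

    pinned, extras, on_failure = [], [], None
    for rule in policy.get("apply_first_match", []):      # exclusive: first match wins
        if self._evaluate_condition(rule.get("condition"), name, installed):
            if rule["type"] == "skip":
                return False
            if rule["type"] == "force_version":
                package_info = f"{name}=={rule['version']}"
            elif rule["type"] == "replace":
                package_info = rule["replacement"] + rule.get("version", "")
            extra_index_url = extra_index_url or rule.get("extra_index_url")
            break

    for rule in policy.get("apply_all_matches", []):       # cumulative: every match applies
        if not self._evaluate_condition(rule.get("condition"), name, installed):
            continue
        if rule["type"] == "pin_dependencies":
            pinned += [f"{p}=={installed[p]}" for p in rule["pinned_packages"] if p in installed]
            on_failure = rule.get("on_failure")
        elif rule["type"] == "install_with":
            extras += rule.get("additional_packages", [])
        elif rule["type"] == "warn":
            logger.warning(rule["message"])

    cmd = manager_util.make_pip_cmd(["install", package_info, *pinned, *extras])
    if extra_index_url:
        cmd += ["--extra-index-url", extra_index_url]
    try:
        subprocess.run(cmd, check=True)
    except subprocess.CalledProcessError:
        if on_failure == "retry_without_pin" and pinned:
            subprocess.run(manager_util.make_pip_cmd(["install", package_info, *extras]), check=True)
        elif on_failure == "fail":
            raise
    self._invalidate_cache()
    return True
```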

* PipBatch.ensure_not_installed():
  - **Purpose**: Iterate through all policies and remove all packages satisfying uninstall conditions (batch processing)
  - **Parameters**: None
  - **Execution flow**:
    1. Call get_pip_policy() to get the policy (lazy loading)
    2. Call self._get_installed_packages() to get cached installed package information
    3. Iterate through all package policies in the policy dictionary:
       a. Check whether each package has an uninstall policy
       b. If an uninstall policy exists:
          - Iterate through the uninstall policy list top-to-bottom
          - Evaluate each policy's condition with self._evaluate_condition()
          - When the first condition-satisfying policy is found:
            * Check whether the target package exists in self._installed_cache
            * If installed:
              - Generate the command with manager_util.make_pip_cmd(["uninstall", "-y", target])
              - Execute pip uninstall via subprocess
              - Log the reason in the info log
              - Add to the removed package list
              - Remove the package from self._installed_cache
            * Move to the next package (only the first match per package)
    4. Complete the iteration through all package policies
  - **Return value**: List of removed package names (list of str)
  - **Exception handling**:
    - Individual package removal failure: Log a warning only and continue to the next package
  - **Call timing**:
    - Called at batch operation start to pre-remove conflicting packages
    - Called before multiple package installations to clean the installation environment

* PipBatch.ensure_installed():
  - **Purpose**: Iterate through all policies and restore all packages satisfying restore conditions (batch processing)
  - **Parameters**: None
  - **Execution flow** (a sketch follows this list):
    1. Call get_pip_policy() to get the policy (lazy loading)
    2. Call self._get_installed_packages() to get cached installed package information
    3. Iterate through all package policies in the policy dictionary:
       a. Check whether each package has a restore policy
       b. If a restore policy exists:
          - Iterate through the restore policy list top-to-bottom
          - Evaluate each policy's condition with self._evaluate_condition()
          - When the first condition-satisfying policy is found:
            * Get the target package name (policy's "target" field)
            * Get the version specified in the version field
            * Check the current version with self._installed_cache.get(target)
            * If the current version is None or different from the specified version:
              - Compose package_spec = f"{target}=={version}"
              - Generate the command with manager_util.make_pip_cmd(["install", package_spec])
              - If extra_index_url exists, add ["--extra-index-url", url]
              - Execute pip install via subprocess
              - Log the reason in the info log
              - Add to the restored package list
              - Update the cache: self._installed_cache[target] = version
            * Move to the next package (only the first match per package)
    4. Complete the iteration through all package policies
  - **Return value**: List of restored package names (list of str)
  - **Exception handling**:
    - Individual package installation failure: Log a warning only and continue to the next package
  - **Call timing**:
    - Called at batch operation end to restore essential package versions
    - Called for environment verification after multiple package installations
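
A sketch of this restore pass under the same assumptions as the install() sketch above (module-level imports as in pip_util.py; per-package failures only warn and continue):

```python
def ensure_installed(self) -> list:
    restored = []
    installed = self._get_installed_packages()
    for name, package_policy in get_pip_policy().items():
        for rule in package_policy.get("restore", []):
            if not self._evaluate_condition(rule.get("condition"), name, installed):
                continue
            target, version = rule["target"], rule["version"]
            if installed.get(target) != version:
                cmd = manager_util.make_pip_cmd(["install", f"{target}=={version}"])
                if rule.get("extra_index_url"):
                    cmd += ["--extra-index-url", rule["extra_index_url"]]
                try:
                    subprocess.run(cmd, check=True)
                    installed[target] = version
                    restored.append(target)
                except subprocess.CalledProcessError as e:
                    logger.warning(f"restore of {target} failed: {e}")
            break  # only the first matching rule per package
    return restored
```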
|
||||
|
||||
|
||||
## pip-policy.json Examples
|
||||
|
||||
### Base Policy File ({manager_util.comfyui_manager_path}/pip-policy.json)
|
||||
```json
|
||||
{
|
||||
"torch": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"type": "skip",
|
||||
"reason": "PyTorch installation should be managed manually due to CUDA compatibility"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"opencv-python": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"type": "replace",
|
||||
"replacement": "opencv-contrib-python",
|
||||
"version": ">=4.8.0",
|
||||
"reason": "opencv-contrib-python includes all opencv-python features plus extras"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"PIL": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"type": "replace",
|
||||
"replacement": "Pillow",
|
||||
"reason": "PIL is deprecated, use Pillow instead"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"click": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "colorama",
|
||||
"spec": "<0.5.0"
|
||||
},
|
||||
"type": "force_version",
|
||||
"version": "8.1.3",
|
||||
"reason": "click 8.1.3 compatible with colorama <0.5"
|
||||
}
|
||||
],
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["colorama"],
|
||||
"reason": "Prevent colorama upgrade that may break compatibility"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"requests": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["urllib3", "certifi", "charset-normalizer"],
|
||||
"on_failure": "retry_without_pin",
|
||||
"reason": "Prevent urllib3 from upgrading to 2.x which has breaking changes"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"six": {
|
||||
"restore": [
|
||||
{
|
||||
"target": "six",
|
||||
"version": "1.16.0",
|
||||
"reason": "six must be maintained at 1.16.0 for compatibility"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"urllib3": {
|
||||
"restore": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"spec": "!=1.26.15"
|
||||
},
|
||||
"target": "urllib3",
|
||||
"version": "1.26.15",
|
||||
"reason": "urllib3 must be 1.26.15 for compatibility with legacy code"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"onnxruntime": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "platform",
|
||||
"os": "linux",
|
||||
"has_gpu": true
|
||||
},
|
||||
"type": "replace",
|
||||
"replacement": "onnxruntime-gpu",
|
||||
"reason": "Use GPU version on Linux with CUDA"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"legacy-custom-node-package": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "platform",
|
||||
"comfyui_version": "<1.0.0"
|
||||
},
|
||||
"type": "force_version",
|
||||
"version": "0.9.0",
|
||||
"reason": "legacy-custom-node-package 0.9.0 is compatible with ComfyUI <1.0.0"
|
||||
},
|
||||
{
|
||||
"condition": {
|
||||
"type": "platform",
|
||||
"comfyui_version": ">=1.0.0"
|
||||
},
|
||||
"type": "force_version",
|
||||
"version": "1.5.0",
|
||||
"reason": "legacy-custom-node-package 1.5.0 is required for ComfyUI >=1.0.0"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"tensorflow": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "torch"
|
||||
},
|
||||
"type": "warn",
|
||||
"message": "Installing TensorFlow alongside PyTorch may cause CUDA conflicts",
|
||||
"allow_continue": true
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"some-package": {
|
||||
"uninstall": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "conflicting-package",
|
||||
"spec": ">=2.0.0"
|
||||
},
|
||||
"target": "conflicting-package",
|
||||
"reason": "conflicting-package >=2.0.0 conflicts with some-package"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"banned-malicious-package": {
|
||||
"uninstall": [
|
||||
{
|
||||
"target": "banned-malicious-package",
|
||||
"reason": "Security vulnerability CVE-2024-XXXXX, always remove if attempting to install"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"critical-package": {
|
||||
"restore": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "critical-package",
|
||||
"spec": "!=1.2.3"
|
||||
},
|
||||
"target": "critical-package",
|
||||
"version": "1.2.3",
|
||||
"extra_index_url": "https://custom-repo.example.com/simple",
|
||||
"reason": "critical-package must be version 1.2.3, restore if different or missing"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"stable-package": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "critical-dependency",
|
||||
"spec": ">=2.0.0"
|
||||
},
|
||||
"type": "force_version",
|
||||
"version": "1.5.0",
|
||||
"extra_index_url": "https://custom-repo.example.com/simple",
|
||||
"reason": "stable-package 1.5.0 is required when critical-dependency >=2.0.0 is installed"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"new-experimental-package": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["numpy", "pandas", "scipy"],
|
||||
"on_failure": "retry_without_pin",
|
||||
"reason": "new-experimental-package may upgrade numpy/pandas/scipy, pin them to prevent breakage"
|
||||
}
|
||||
]
|
||||
},
|
||||
|
||||
"pytorch-addon": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "torch",
|
||||
"spec": ">=2.0.0"
|
||||
},
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["torch", "torchvision", "torchaudio"],
|
||||
"on_failure": "fail",
|
||||
"reason": "pytorch-addon must not change PyTorch ecosystem versions"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Structure Schema
|
||||
```json
|
||||
{
|
||||
"$schema": "http://json-schema.org/draft-07/schema#",
|
||||
"type": "object",
|
||||
"patternProperties": {
|
||||
"^.*$": {
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"uninstall": {
|
||||
"type": "array",
|
||||
"description": "When condition satisfied (or always if no condition), remove package and terminate",
|
||||
"items": {
|
||||
"type": "object",
|
||||
"required": ["target"],
|
||||
"properties": {
|
||||
"condition": {
|
||||
"type": "object",
|
||||
"description": "Optional: always remove if absent",
|
||||
"required": ["type"],
|
||||
"properties": {
|
||||
"type": {"enum": ["installed", "platform"]},
|
||||
"package": {"type": "string", "description": "Optional: defaults to self"},
|
||||
"spec": {"type": "string", "description": "Optional: version condition"},
|
||||
"os": {"type": "string"},
|
||||
"has_gpu": {"type": "boolean"},
|
||||
"comfyui_version": {"type": "string"}
|
||||
}
|
||||
},
|
||||
"target": {
|
||||
"type": "string",
|
||||
"description": "Package name to remove"
|
||||
},
|
||||
"reason": {"type": "string"}
|
||||
}
|
||||
}
|
||||
},
|
||||
"restore": {
|
||||
"type": "array",
|
||||
"description": "When condition satisfied (or always if no condition), restore package and terminate",
|
||||
"items": {
|
||||
"type": "object",
|
||||
"required": ["target", "version"],
|
||||
"properties": {
|
||||
"condition": {
|
||||
"type": "object",
|
||||
"description": "Optional: always restore if absent",
|
||||
"required": ["type"],
|
||||
"properties": {
|
||||
"type": {"enum": ["installed", "platform"]},
|
||||
"package": {"type": "string", "description": "Optional: defaults to self"},
|
||||
"spec": {"type": "string", "description": "Optional: version condition"},
|
||||
"os": {"type": "string"},
|
||||
"has_gpu": {"type": "boolean"},
|
||||
"comfyui_version": {"type": "string"}
|
||||
}
|
||||
},
|
||||
"target": {
|
||||
"type": "string",
|
||||
"description": "Package name to restore"
|
||||
},
|
||||
"version": {
|
||||
"type": "string",
|
||||
"description": "Version to restore"
|
||||
},
|
||||
"extra_index_url": {"type": "string"},
|
||||
"reason": {"type": "string"}
|
||||
}
|
||||
}
|
||||
},
|
||||
"apply_first_match": {
|
||||
"type": "array",
|
||||
"description": "Execute only first condition-satisfying policy (exclusive)",
|
||||
"items": {
|
||||
"type": "object",
|
||||
"required": ["type"],
|
||||
"properties": {
|
||||
"condition": {
|
||||
"type": "object",
|
||||
"description": "Optional: always apply if absent",
|
||||
"required": ["type"],
|
||||
"properties": {
|
||||
"type": {"enum": ["installed", "platform"]},
|
||||
"package": {"type": "string", "description": "Optional: defaults to self"},
|
||||
"spec": {"type": "string", "description": "Optional: version condition"},
|
||||
"os": {"type": "string"},
|
||||
"has_gpu": {"type": "boolean"},
|
||||
"comfyui_version": {"type": "string"}
|
||||
}
|
||||
},
|
||||
"type": {
|
||||
"enum": ["skip", "force_version", "replace"],
|
||||
"description": "Exclusive action: determines installation method"
|
||||
},
|
||||
"version": {"type": "string"},
|
||||
"replacement": {"type": "string"},
|
||||
"extra_index_url": {"type": "string"},
|
||||
"reason": {"type": "string"}
|
||||
}
|
||||
}
|
||||
},
|
||||
"apply_all_matches": {
|
||||
"type": "array",
|
||||
"description": "Execute all condition-satisfying policies (cumulative)",
|
||||
"items": {
|
||||
"type": "object",
|
||||
"required": ["type"],
|
||||
"properties": {
|
||||
"condition": {
|
||||
"type": "object",
|
||||
"description": "Optional: always apply if absent",
|
||||
"required": ["type"],
|
||||
"properties": {
|
||||
"type": {"enum": ["installed", "platform"]},
|
||||
"package": {"type": "string", "description": "Optional: defaults to self"},
|
||||
"spec": {"type": "string", "description": "Optional: version condition"},
|
||||
"os": {"type": "string"},
|
||||
"has_gpu": {"type": "boolean"},
|
||||
"comfyui_version": {"type": "string"}
|
||||
}
|
||||
},
|
||||
"type": {
|
||||
"enum": ["pin_dependencies", "install_with", "warn"],
|
||||
"description": "Cumulative action: adds installation options"
|
||||
},
|
||||
"pinned_packages": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"}
|
||||
},
|
||||
"on_failure": {"enum": ["fail", "retry_without_pin"]},
|
||||
"additional_packages": {"type": "array"},
|
||||
"message": {"type": "string"},
|
||||
"allow_continue": {"type": "boolean"},
|
||||
"reason": {"type": "string"}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```

## Error Handling

* Default behavior when errors occur during policy execution:
  - Log the error and continue
  - Only treat it as an installation failure when pin_dependencies' on_failure="fail"
  - In other cases, leave a warning and attempt the originally requested installation

* pip_install: Performs pip package installation (a sketch follows this list)
  - Uses manager_util.make_pip_cmd to generate commands for selective use of uv or pip
  - Provides the ability to skip policy application through the override_policy flag
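
The document describes pip_install only at this level; a hypothetical sketch of such a wrapper built on PipBatch follows (the name and signature mirror the bullets above and are not a confirmed public API):

```python
# Hypothetical convenience wrapper; not a confirmed pip_util.py API.
def pip_install(package_info: str,
                extra_index_url: str = None,
                override_policy: bool = False) -> bool:
    with PipBatch() as batch:
        return batch.install(package_info,
                             extra_index_url=extra_index_url,
                             override_policy=override_policy)
```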

comfyui_manager/common/pip_util.implementation-plan.en.md (new file, 614 lines)
@@ -0,0 +1,614 @@
# pip_util.py Implementation Plan Document
|
||||
|
||||
## 1. Project Overview
|
||||
|
||||
### Purpose
|
||||
Implement a policy-based pip package management system that minimizes breaking existing installed dependencies
|
||||
|
||||
### Core Features
|
||||
- JSON-based policy file loading and merging (lazy loading)
|
||||
- Per-package installation policy evaluation and application
|
||||
- Performance optimization through batch-level pip freeze caching
|
||||
- Automated conditional package removal/restoration
|
||||
|
||||
### Technology Stack
|
||||
- Python 3.x
|
||||
- packaging library (version comparison)
|
||||
- subprocess (pip command execution)
|
||||
- json (policy file parsing)
|
||||
|
||||
---
|
||||
|
||||
## 2. Architecture Design
|
||||
|
||||
### 2.1 Global Policy Management (Lazy Loading Pattern)
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────┐
|
||||
│ get_pip_policy() │
|
||||
│ - Auto-loads policy files on │
|
||||
│ first call via lazy loading │
|
||||
│ - Returns cache on subsequent calls│
|
||||
└─────────────────────────────────────┘
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────────┐
|
||||
│ _pip_policy_cache (global) │
|
||||
│ - Merged policy dictionary │
|
||||
│ - {package_name: policy_object} │
|
||||
└─────────────────────────────────────┘
|
||||
```
|
||||
|
||||
### 2.2 Batch Operation Class (PipBatch)
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────┐
|
||||
│ PipBatch (Context Manager) │
|
||||
│ ┌───────────────────────────────┐ │
|
||||
│ │ _installed_cache │ │
|
||||
│ │ - Caches pip freeze results │ │
|
||||
│ │ - {package: version} │ │
|
||||
│ └───────────────────────────────┘ │
|
||||
│ │
|
||||
│ Public Methods: │
|
||||
│ ├─ install() │
|
||||
│ ├─ ensure_not_installed() │
|
||||
│ └─ ensure_installed() │
|
||||
│ │
|
||||
│ Private Methods: │
|
||||
│ ├─ _get_installed_packages() │
|
||||
│ ├─ _refresh_installed_cache() │
|
||||
│ ├─ _invalidate_cache() │
|
||||
│ ├─ _parse_package_spec() │
|
||||
│ └─ _evaluate_condition() │
|
||||
└─────────────────────────────────────┘
|
||||
```
|
||||
|
||||
### 2.3 Policy Evaluation Flow
|
||||
|
||||
```
|
||||
install("numpy>=1.20") called
|
||||
│
|
||||
▼
|
||||
get_pip_policy() → Load policy (lazy)
|
||||
│
|
||||
▼
|
||||
Parse package name: "numpy"
|
||||
│
|
||||
▼
|
||||
Look up "numpy" policy in policy dictionary
|
||||
│
|
||||
├─ Evaluate apply_first_match (exclusive)
|
||||
│ ├─ skip → Return False (don't install)
|
||||
│ ├─ force_version → Change version
|
||||
│ └─ replace → Replace package
|
||||
│
|
||||
├─ Evaluate apply_all_matches (cumulative)
|
||||
│ ├─ pin_dependencies → Pin dependencies
|
||||
│ ├─ install_with → Additional packages
|
||||
│ └─ warn → Warning log
|
||||
│
|
||||
▼
|
||||
Execute pip install
|
||||
│
|
||||
▼
|
||||
Invalidate cache (_invalidate_cache)
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 3. Phase-by-Phase Implementation Plan
|
||||
|
||||
### Phase 1: Core Infrastructure Setup (2-3 hours)
|
||||
|
||||
#### Task 1.1: Project Structure and Dependency Setup (30 min)
|
||||
**Implementation**:
|
||||
- Create `pip_util.py` file
|
||||
- Add necessary import statements
|
||||
```python
|
||||
import json
|
||||
import logging
|
||||
import platform
|
||||
import re
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
|
||||
from packaging.specifiers import SpecifierSet
|
||||
from packaging.version import Version
|
||||
|
||||
from . import manager_util, context
|
||||
```
|
||||
- Set up logging
|
||||
```python
|
||||
logger = logging.getLogger(__name__)
|
||||
```
|
||||
|
||||
**Validation**:
|
||||
- Module loads without import errors
|
||||
- Logger works correctly
|
||||
|
||||
#### Task 1.2: Global Variable and get_pip_policy() Implementation (1 hour)
|
||||
**Implementation**:
|
||||
- Declare global variable
|
||||
```python
|
||||
_pip_policy_cache: Optional[Dict] = None
|
||||
```
|
||||
- Implement `get_pip_policy()` function
|
||||
- Check cache and early return
|
||||
- Read base policy file (`{manager_util.comfyui_manager_path}/pip-policy.json`)
|
||||
- Read user policy file (`{context.manager_files_path}/pip-policy.user.json`)
|
||||
- Create file if doesn't exist (for user policy)
|
||||
- Merge policies (complete package-level replacement)
|
||||
- Save to cache and return
|
||||
|
||||
**Exception Handling**:
|
||||
- `FileNotFoundError`: File not found → Use empty dictionary
|
||||
- `json.JSONDecodeError`: JSON parse failure → Warning log + empty dictionary
|
||||
- General exception: Warning log + empty dictionary
|
||||
|
||||
**Validation**:
|
||||
- Returns empty dictionary when policy files don't exist
|
||||
- Returns correct merged result when policy files exist
|
||||
- Confirms cache usage on second call (load log appears only once)
|
||||
|
||||
#### Task 1.3: PipBatch Class Basic Structure (30 min)
|
||||
**Implementation**:
|
||||
- Class definition and `__init__`
|
||||
```python
|
||||
class PipBatch:
|
||||
def __init__(self):
|
||||
self._installed_cache: Optional[Dict[str, str]] = None
|
||||
```
|
||||
- Context manager methods (`__enter__`, `__exit__`)
|
||||
```python
|
||||
def __enter__(self):
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
self._installed_cache = None
|
||||
return False
|
||||
```
|
||||
|
||||
**Validation**:
|
||||
- `with PipBatch() as batch:` syntax works correctly
|
||||
- Cache cleared on `__exit__` call
|
||||
|
||||
---
|
||||
|
||||
### Phase 2: Caching and Utility Methods (2-3 hours)
|
||||
|
||||
#### Task 2.1: pip freeze Caching Methods (1 hour)
|
||||
**Implementation**:
|
||||
- Implement `_refresh_installed_cache()`
|
||||
- Call `manager_util.make_pip_cmd(["freeze"])`
|
||||
- Execute command via subprocess
|
||||
- Parse output (package==version format)
|
||||
- Exclude editable packages (-e) and comments (#)
|
||||
- Convert to dictionary and store in `self._installed_cache`
|
||||
|
||||
- Implement `_get_installed_packages()`
|
||||
- Call `_refresh_installed_cache()` if cache is None
|
||||
- Return cache
|
||||
|
||||
- Implement `_invalidate_cache()`
|
||||
- Set `self._installed_cache = None`
|
||||
|
||||
**Exception Handling**:
|
||||
- `subprocess.CalledProcessError`: pip freeze failure → Empty dictionary
|
||||
- Parse error: Ignore line + warning log
|
||||
|
||||
**Validation**:
|
||||
- pip freeze results correctly parsed into dictionary
|
||||
- New load occurs after cache invalidation and re-query
|
||||
|
||||
#### Task 2.2: Package Spec Parsing (30 min)
|
||||
**Implementation**:
|
||||
- Implement `_parse_package_spec(package_info)`
|
||||
- Regex pattern: `^([a-zA-Z0-9_-]+)([><=!~]+.*)?$`
|
||||
- Split package name and version spec
|
||||
- Return tuple: `(package_name, version_spec)`
|
||||
|
||||
**Exception Handling**:
|
||||
- Parse failure: Raise `ValueError`
|
||||
|
||||
**Validation**:
|
||||
- "numpy" → ("numpy", None)
|
||||
- "numpy==1.26.0" → ("numpy", "==1.26.0")
|
||||
- "pandas>=2.0.0" → ("pandas", ">=2.0.0")
|
||||
- Invalid format → ValueError
|
||||
|
||||
#### Task 2.3: Condition Evaluation Method (1.5 hours)
|
||||
**Implementation**:
|
||||
- Implement `_evaluate_condition(condition, package_name, installed_packages)`
|
||||
|
||||
**Handling by Condition Type**:
|
||||
1. **condition is None**: Always return True
|
||||
2. **"installed" type**:
|
||||
- `target_package = condition.get("package", package_name)`
|
||||
- Check version with `installed_packages.get(target_package)`
|
||||
- If spec exists, compare using `packaging.specifiers.SpecifierSet`
|
||||
- If no spec, only check installation status
|
||||
3. **"platform" type**:
|
||||
- `os` condition: Compare with `platform.system()`
|
||||
- `has_gpu` condition: Check `torch.cuda.is_available()` (False if torch unavailable)
|
||||
- `comfyui_version` condition: TODO (currently warning)
|
||||
|
||||
**Exception Handling**:
|
||||
- Version comparison failure: Warning log + return False
|
||||
- Unknown condition type: Warning log + return False
|
||||
|
||||
**Validation**:
|
||||
- Write test cases for each condition type
|
||||
- Verify edge case handling (torch not installed, invalid version format, etc.)
|
||||
|
||||
---
|
||||
|
||||
### Phase 3: Core Installation Logic Implementation (4-5 hours)
|
||||
|
||||
#### Task 3.1: install() Method - Basic Flow (2 hours)
|
||||
**Implementation**:
|
||||
1. Parse package spec (`_parse_package_spec`)
|
||||
2. Query installed package cache (`_get_installed_packages`)
|
||||
3. If `override_policy=True`, install directly and return
|
||||
4. Call `get_pip_policy()` to load policy
|
||||
5. Default installation if no policy exists
|
||||
|
||||
**Validation**:
|
||||
- Verify policy ignored when override_policy=True
|
||||
- Verify default installation for packages without policy
|
||||
|
||||
#### Task 3.2: install() Method - apply_first_match Policy (1 hour)
|
||||
**Implementation**:
|
||||
- Iterate through policy list top-to-bottom
|
||||
- Evaluate each policy's condition (`_evaluate_condition`)
|
||||
- When condition satisfied:
|
||||
- **skip**: Log reason and return False
|
||||
- **force_version**: Force version change
|
||||
- **replace**: Replace package
|
||||
- Apply only first match (break)
|
||||
|
||||
**Validation**:
|
||||
- Verify installation blocked by skip policy
|
||||
- Verify version changed by force_version
|
||||
- Verify package replaced by replace
|
||||
|
||||
#### Task 3.3: install() Method - apply_all_matches Policy (1 hour)
|
||||
**Implementation**:
|
||||
- Iterate through policy list top-to-bottom
|
||||
- Evaluate each policy's condition
|
||||
- Apply all condition-satisfying policies:
|
||||
- **pin_dependencies**: Pin to installed version
|
||||
- **install_with**: Add to additional package list
|
||||
- **warn**: Output warning log
|
||||
|
||||
**Validation**:
|
||||
- Verify multiple policies applied simultaneously
|
||||
- Verify version pinning by pin_dependencies
|
||||
- Verify additional package installation by install_with
|
||||
|
||||
#### Task 3.4: install() Method - Installation Execution and Retry Logic (1 hour)
|
||||
**Implementation**:
|
||||
1. Compose final package list
|
||||
2. Generate command using `manager_util.make_pip_cmd()`
|
||||
3. Handle `extra_index_url`
|
||||
4. Execute installation via subprocess
|
||||
5. Handle failure based on on_failure setting:
|
||||
- `retry_without_pin`: Retry without pins
|
||||
- `fail`: Raise exception
|
||||
- Other: Warning log
|
||||
6. Invalidate cache on success
|
||||
|
||||
**Validation**:
|
||||
- Verify normal installation
|
||||
- Verify retry logic on pin failure
|
||||
- Verify error handling
|
||||
|
||||
---
|
||||
|
||||
### Phase 4: Batch Operation Methods Implementation (2-3 hours)
|
||||
|
||||
#### Task 4.1: ensure_not_installed() Implementation (1.5 hours)
|
||||
**Implementation**:
|
||||
1. Call `get_pip_policy()`
|
||||
2. Iterate through all package policies
|
||||
3. Check each package's uninstall policy
|
||||
4. When condition satisfied:
|
||||
- Check if target package is installed
|
||||
- If installed, execute `pip uninstall -y {target}`
|
||||
- Remove from cache
|
||||
- Add to removal list
|
||||
5. Execute only first match (per package)
|
||||
6. Return list of removed packages
|
||||
|
||||
**Exception Handling**:
|
||||
- Individual package removal failure: Warning log + continue
|
||||
|
||||
**Validation**:
|
||||
- Verify package removal by uninstall policy
|
||||
- Verify batch removal of multiple packages
|
||||
- Verify continued processing of other packages even on removal failure
|
||||
|
||||
#### Task 4.2: ensure_installed() Implementation (1.5 hours)
|
||||
**Implementation**:
|
||||
1. Call `get_pip_policy()`
|
||||
2. Iterate through all package policies
|
||||
3. Check each package's restore policy
|
||||
4. When condition satisfied:
|
||||
- Check target package's current version
|
||||
- If absent or different version:
|
||||
- Execute `pip install {target}=={version}`
|
||||
- Add extra_index_url if present
|
||||
- Update cache
|
||||
- Add to restoration list
|
||||
5. Execute only first match (per package)
|
||||
6. Return list of restored packages
|
||||
|
||||
**Exception Handling**:
|
||||
- Individual package installation failure: Warning log + continue
|
||||
|
||||
**Validation**:
|
||||
- Verify package restoration by restore policy
|
||||
- Verify reinstallation on version mismatch
|
||||
- Verify continued processing of other packages even on restoration failure
|
||||
|
||||
---
|
||||
|
||||
## 4. Testing Strategy
|
||||
|
||||
### 4.1 Unit Tests
|
||||
|
||||
#### Policy Loading Tests
|
||||
```python
|
||||
def test_get_pip_policy_empty():
|
||||
"""Returns empty dictionary when policy files don't exist"""
|
||||
|
||||
def test_get_pip_policy_merge():
|
||||
"""Correctly merges base and user policies"""
|
||||
|
||||
def test_get_pip_policy_cache():
|
||||
"""Uses cache on second call"""
|
||||
```
|
||||
|
||||
#### Package Parsing Tests
|
||||
```python
|
||||
def test_parse_package_spec_simple():
|
||||
"""'numpy' → ('numpy', None)"""
|
||||
|
||||
def test_parse_package_spec_version():
|
||||
"""'numpy==1.26.0' → ('numpy', '==1.26.0')"""
|
||||
|
||||
def test_parse_package_spec_range():
|
||||
"""'pandas>=2.0.0' → ('pandas', '>=2.0.0')"""
|
||||
|
||||
def test_parse_package_spec_invalid():
|
||||
"""Invalid format → ValueError"""
|
||||
```
|
||||
|
||||
#### Condition Evaluation Tests
|
||||
```python
|
||||
def test_evaluate_condition_none():
|
||||
"""None condition → True"""
|
||||
|
||||
def test_evaluate_condition_installed():
|
||||
"""Evaluates installed package condition"""
|
||||
|
||||
def test_evaluate_condition_platform():
|
||||
"""Evaluates platform condition"""
|
||||
```
|
||||
|
||||
### 4.2 Integration Tests
|
||||
|
||||
#### Installation Policy Tests
|
||||
```python
|
||||
def test_install_with_skip_policy():
|
||||
"""Blocks installation with skip policy"""
|
||||
|
||||
def test_install_with_force_version():
|
||||
"""Changes version with force_version policy"""
|
||||
|
||||
def test_install_with_replace():
|
||||
"""Replaces package with replace policy"""
|
||||
|
||||
def test_install_with_pin_dependencies():
|
||||
"""Pins versions with pin_dependencies"""
|
||||
```
|
||||
|
||||
#### Batch Operation Tests
|
||||
```python
|
||||
def test_ensure_not_installed():
|
||||
"""Removes packages with uninstall policy"""
|
||||
|
||||
def test_ensure_installed():
|
||||
"""Restores packages with restore policy"""
|
||||
|
||||
def test_batch_workflow():
|
||||
"""Tests complete batch workflow"""
|
||||
```
|
||||
|
||||
### 4.3 Edge Case Tests
|
||||
|
||||
```python
|
||||
def test_install_without_policy():
|
||||
"""Default installation for packages without policy"""
|
||||
|
||||
def test_install_override_policy():
|
||||
"""Ignores policy with override_policy=True"""
|
||||
|
||||
def test_pip_freeze_failure():
|
||||
"""Handles empty cache on pip freeze failure"""
|
||||
|
||||
def test_json_parse_error():
|
||||
"""Handles malformed JSON files"""
|
||||
|
||||
def test_subprocess_failure():
|
||||
"""Exception handling when pip command fails"""
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 5. Error Handling Strategy
|
||||
|
||||
### 5.1 Policy Loading Errors
|
||||
- **File not found**: Warning log + empty dictionary
|
||||
- **JSON parse failure**: Error log + empty dictionary
|
||||
- **No read permission**: Warning log + empty dictionary
|
||||
|
||||
### 5.2 Package Installation Errors
|
||||
- **pip command failure**: Depends on on_failure setting
|
||||
- `retry_without_pin`: Retry
|
||||
- `fail`: Raise exception
|
||||
- Other: Warning log
|
||||
- **Invalid package spec**: Raise ValueError
|
||||
|
||||
### 5.3 Batch Operation Errors
|
||||
- **Individual package failure**: Warning log + continue to next package
|
||||
- **pip freeze failure**: Empty dictionary + warning log
|
||||
|
||||
---
|
||||
|
||||
## 6. Performance Optimization
|
||||
|
||||
### 6.1 Caching Strategy
|
||||
- **Policy cache**: Reused program-wide via global variable
|
||||
- **pip freeze cache**: Reused per batch, invalidated after install/remove
|
||||
- **lazy loading**: Load only when needed
|
||||
|
||||
### 6.2 Parallel Processing Considerations
|
||||
- Current implementation is not thread-safe
|
||||
- Consider adding threading.Lock if needed
|
||||
- Batch operations execute sequentially only
|
||||
|
||||
---
|
||||
|
||||
## 7. Documentation Requirements
|
||||
|
||||
### 7.1 Code Documentation
|
||||
- Docstrings required for all public methods
|
||||
- Specify parameters, return values, and exceptions
|
||||
- Include usage examples
|
||||
|
||||
### 7.2 User Guide
|
||||
- Explain `pip-policy.json` structure
|
||||
- Policy writing examples
|
||||
- Usage pattern examples
|
||||
|
||||
### 7.3 Developer Guide
|
||||
- Architecture explanation
|
||||
- Extension methods
|
||||
- Test execution methods
|
||||
|
||||
---
|
||||
|
||||
## 8. Deployment Checklist
|
||||
|
||||
### 8.1 Code Quality
|
||||
- [ ] All unit tests pass
|
||||
- [ ] All integration tests pass
|
||||
- [ ] Code coverage ≥80%
|
||||
- [ ] No linting errors (flake8, pylint)
|
||||
- [ ] Type hints complete (mypy passes)
|
||||
|
||||
### 8.2 Documentation
|
||||
- [ ] README.md written
|
||||
- [ ] API documentation generated
|
||||
- [ ] Example policy files written
|
||||
- [ ] Usage guide written
|
||||
|
||||
### 8.3 Performance Verification
|
||||
- [ ] Policy loading performance measured (<100ms)
|
||||
- [ ] pip freeze caching effectiveness verified (≥50% speed improvement)
|
||||
- [ ] Memory usage confirmed (<10MB)
|
||||
|
||||
### 8.4 Security Verification
|
||||
- [ ] Input validation complete
|
||||
- [ ] Path traversal prevention
|
||||
- [ ] Command injection prevention
|
||||
- [ ] JSON parsing safety confirmed
|
||||
|
||||
---
|
||||
|
||||
## 9. Future Improvements
|
||||
|
||||
### 9.1 Short-term (1-2 weeks)
|
||||
- Implement ComfyUI version check
|
||||
- Implement user confirmation prompt (allow_continue=false)
|
||||
- Thread-safe improvements (add Lock)
|
||||
|
||||
### 9.2 Mid-term (1-2 months)
|
||||
- Add policy validation tools
|
||||
- Policy migration tools
|
||||
- More detailed logging and debugging options
|
||||
|
||||
### 9.3 Long-term (3-6 months)
|
||||
- Web UI for policy management
|
||||
- Provide policy templates
|
||||
- Community policy sharing system
|
||||
|
||||
---
|
||||
|
||||
## 10. Risks and Mitigation Strategies
|
||||
|
||||
### Risk 1: Policy Conflicts
|
||||
**Description**: Policies for different packages may conflict
|
||||
**Mitigation**: Develop policy validation tools, conflict detection algorithm
|
||||
|
||||
### Risk 2: pip Version Compatibility
|
||||
**Description**: Must work across various pip versions
|
||||
**Mitigation**: Test on multiple pip versions, version-specific branching
|
||||
|
||||
### Risk 3: Performance Degradation
|
||||
**Description**: Installation speed may decrease due to policy evaluation
|
||||
**Mitigation**: Optimize caching, minimize condition evaluation
|
||||
|
||||
### Risk 4: Policy Misconfiguration
|
||||
**Description**: Users may write incorrect policies
|
||||
**Mitigation**: JSON schema validation, provide examples and guides
|
||||
|
||||
---
|
||||
|
||||
## 11. Timeline
|
||||
|
||||
### Week 1
|
||||
- Phase 1: Core Infrastructure Setup (Day 1-2)
|
||||
- Phase 2: Caching and Utility Methods (Day 3-4)
|
||||
- Write unit tests (Day 5)
|
||||
|
||||
### Week 2
|
||||
- Phase 3: Core Installation Logic Implementation (Day 1-3)
|
||||
- Phase 4: Batch Operation Methods Implementation (Day 4-5)
|
||||
|
||||
### Week 3
|
||||
- Integration and edge case testing (Day 1-2)
|
||||
- Documentation (Day 3)
|
||||
- Code review and refactoring (Day 4-5)
|
||||
|
||||
### Week 4
|
||||
- Performance optimization (Day 1-2)
|
||||
- Security verification (Day 3)
|
||||
- Final testing and deployment preparation (Day 4-5)
|
||||
|
||||
---
|
||||
|
||||
## 12. Success Criteria
|
||||
|
||||
### Feature Completeness
|
||||
- ✅ All policy types (uninstall, apply_first_match, apply_all_matches, restore) work correctly
|
||||
- ✅ Policy merge logic works correctly
|
||||
- ✅ Batch operations perform normally
|
||||
|
||||
### Quality Metrics
|
||||
- ✅ Test coverage ≥80%
|
||||
- ✅ All tests pass
|
||||
- ✅ 0 linting errors
|
||||
- ✅ 100% type hint completion
|
||||
|
||||
### Performance Metrics
|
||||
- ✅ Policy loading <100ms
|
||||
- ✅ ≥50% performance improvement with pip freeze caching
|
||||
- ✅ Memory usage <10MB
|
||||
|
||||
### Usability
|
||||
- ✅ Clear error messages
|
||||
- ✅ Sufficient documentation
|
||||
- ✅ Verified in real-world use cases

comfyui_manager/common/pip_util.py (new file, 629 lines)
@@ -0,0 +1,629 @@
"""
|
||||
pip_util - Policy-based pip package management system
|
||||
|
||||
This module provides a policy-based approach to pip package installation
|
||||
to minimize dependency conflicts and protect existing installed packages.
|
||||
|
||||
Usage:
|
||||
# Batch operations (policy auto-loaded)
|
||||
with PipBatch() as batch:
|
||||
batch.ensure_not_installed()
|
||||
batch.install("numpy>=1.20")
|
||||
batch.install("pandas>=2.0")
|
||||
batch.install("scipy>=1.7")
|
||||
batch.ensure_installed()
|
||||
"""
|
||||
|
||||
import json
|
||||
import logging
|
||||
import platform
|
||||
import re
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
|
||||
from packaging.requirements import Requirement
|
||||
from packaging.specifiers import SpecifierSet
|
||||
from packaging.version import Version
|
||||
|
||||
from . import manager_util, context
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Global policy cache (lazy loaded on first access)
|
||||
_pip_policy_cache: Optional[Dict] = None
|
||||
|
||||
|
||||
def get_pip_policy() -> Dict:
|
||||
"""
|
||||
Get pip policy with lazy loading.
|
||||
|
||||
Returns the cached policy if available, otherwise loads it from files.
|
||||
This function automatically loads the policy on first access.
|
||||
|
||||
Thread safety: This function is NOT thread-safe.
|
||||
Ensure single-threaded access during initialization.
|
||||
|
||||
Returns:
|
||||
Dictionary of merged pip policies
|
||||
|
||||
Example:
|
||||
>>> policy = get_pip_policy()
|
||||
>>> numpy_policy = policy.get("numpy", {})
|
||||
"""
|
||||
global _pip_policy_cache
|
||||
|
||||
# Return cached policy if already loaded
|
||||
if _pip_policy_cache is not None:
|
||||
logger.debug("Returning cached pip policy")
|
||||
return _pip_policy_cache
|
||||
|
||||
logger.info("Loading pip policies...")
|
||||
|
||||
# Load base policy
|
||||
base_policy = {}
|
||||
base_policy_path = Path(manager_util.comfyui_manager_path) / "pip-policy.json"
|
||||
|
||||
try:
|
||||
if base_policy_path.exists():
|
||||
with open(base_policy_path, 'r', encoding='utf-8') as f:
|
||||
base_policy = json.load(f)
|
||||
logger.debug(f"Loaded base policy from {base_policy_path}")
|
||||
else:
|
||||
logger.warning(f"Base policy file not found: {base_policy_path}")
|
||||
except json.JSONDecodeError as e:
|
||||
logger.error(f"Failed to parse base policy JSON: {e}")
|
||||
base_policy = {}
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to read base policy file: {e}")
|
||||
base_policy = {}
|
||||
|
||||
# Load user policy
|
||||
user_policy = {}
|
||||
user_policy_path = Path(context.manager_files_path) / "pip-policy.user.json"
|
||||
|
||||
try:
|
||||
if user_policy_path.exists():
|
||||
with open(user_policy_path, 'r', encoding='utf-8') as f:
|
||||
user_policy = json.load(f)
|
||||
logger.debug(f"Loaded user policy from {user_policy_path}")
|
||||
else:
|
||||
# Create empty user policy file
|
||||
user_policy_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
with open(user_policy_path, 'w', encoding='utf-8') as f:
|
||||
json.dump({"_comment": "User-specific pip policy overrides"}, f, indent=2)
|
||||
logger.info(f"Created empty user policy file: {user_policy_path}")
|
||||
except json.JSONDecodeError as e:
|
||||
logger.warning(f"Failed to parse user policy JSON: {e}")
|
||||
user_policy = {}
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to read user policy file: {e}")
|
||||
user_policy = {}
|
||||
|
||||
# Merge policies (package-level override: user completely replaces base per package)
|
||||
merged_policy = base_policy.copy()
|
||||
for package_name, package_policy in user_policy.items():
|
||||
if package_name.startswith("_"): # Skip metadata fields like _comment
|
||||
continue
|
||||
merged_policy[package_name] = package_policy # Complete package replacement
|
||||
|
||||
# Store in global cache
|
||||
_pip_policy_cache = merged_policy
|
||||
logger.info(f"Policy loaded successfully: {len(_pip_policy_cache)} package policies")
|
||||
|
||||
return _pip_policy_cache
|
||||
|
||||
|
||||
class PipBatch:
|
||||
"""
|
||||
Pip package installation batch manager.
|
||||
|
||||
Maintains pip freeze cache during a batch of operations for performance optimization.
|
||||
|
||||
Usage pattern:
|
||||
# Batch operations (policy auto-loaded)
|
||||
with PipBatch() as batch:
|
||||
batch.ensure_not_installed()
|
||||
batch.install("numpy>=1.20")
|
||||
batch.install("pandas>=2.0")
|
||||
batch.install("scipy>=1.7")
|
||||
batch.ensure_installed()
|
||||
|
||||
Attributes:
|
||||
_installed_cache: Cache of installed packages from pip freeze
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize PipBatch with empty cache."""
|
||||
self._installed_cache: Optional[Dict[str, str]] = None
|
||||
|
||||
def __enter__(self):
|
||||
"""Enter context manager."""
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
"""Exit context manager and clear cache."""
|
||||
self._installed_cache = None
|
||||
return False
|
||||
|
||||
def _refresh_installed_cache(self) -> None:
|
||||
"""
|
||||
Refresh the installed packages cache by executing pip freeze.
|
||||
|
||||
Parses pip freeze output into a dictionary of {package_name: version}.
|
||||
Ignores editable packages and comments.
|
||||
|
||||
Raises:
|
||||
No exceptions raised - failures result in empty cache with warning log
|
||||
"""
|
||||
try:
|
||||
cmd = manager_util.make_pip_cmd(["freeze"])
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, check=True)
|
||||
|
||||
packages = {}
|
||||
for line in result.stdout.strip().split('\n'):
|
||||
line = line.strip()
|
||||
|
||||
# Skip empty lines
|
||||
if not line:
|
||||
continue
|
||||
|
||||
# Skip editable packages (-e /path/to/package or -e git+https://...)
|
||||
# Editable packages don't have version info and are typically development-only
|
||||
if line.startswith('-e '):
|
||||
continue
|
||||
|
||||
# Skip comments (defensive: pip freeze typically doesn't output comments,
|
||||
# but this handles manually edited requirements.txt or future pip changes)
|
||||
if line.startswith('#'):
|
||||
continue
|
||||
|
||||
# Parse package==version
|
||||
if '==' in line:
|
||||
try:
|
||||
package_name, version = line.split('==', 1)
|
||||
packages[package_name.strip()] = version.strip()
|
||||
except ValueError:
|
||||
logger.warning(f"Failed to parse pip freeze line: {line}")
|
||||
continue
|
||||
|
||||
self._installed_cache = packages
|
||||
logger.debug(f"Refreshed installed packages cache: {len(packages)} packages")
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"pip freeze failed: {e}")
|
||||
self._installed_cache = {}
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to refresh installed packages cache: {e}")
|
||||
self._installed_cache = {}
|
||||
|
||||
def _get_installed_packages(self) -> Dict[str, str]:
|
||||
"""
|
||||
Get cached installed packages, refresh if cache is None.
|
||||
|
||||
Returns:
|
||||
Dictionary of {package_name: version}
|
||||
"""
|
||||
if self._installed_cache is None:
|
||||
self._refresh_installed_cache()
|
||||
return self._installed_cache
|
||||
|
||||
def _invalidate_cache(self) -> None:
|
||||
"""
|
||||
Invalidate the installed packages cache.
|
||||
|
||||
Should be called after install/uninstall operations.
|
||||
"""
|
||||
self._installed_cache = None
|
||||
|
||||
def _parse_package_spec(self, package_info: str) -> Tuple[str, Optional[str]]:
|
||||
"""
|
||||
Parse package spec string into package name and version spec using PEP 508.
|
||||
|
||||
Uses the packaging library to properly parse package specifications according to
|
||||
PEP 508 standard, which handles complex cases like extras and multiple version
|
||||
constraints that simple regex cannot handle correctly.
|
||||
|
||||
Args:
|
||||
package_info: Package specification like "numpy", "numpy==1.26.0", "numpy>=1.20.0",
|
||||
or complex specs like "package[extra]>=1.0,<2.0"
|
||||
|
||||
Returns:
|
||||
Tuple of (package_name, version_spec)
|
||||
Examples: ("numpy", "==1.26.0"), ("pandas", ">=2.0.0"), ("scipy", None)
|
||||
Package names are normalized (e.g., "NumPy" -> "numpy")
|
||||
|
||||
Raises:
|
||||
ValueError: If package_info cannot be parsed according to PEP 508
|
||||
|
||||
Example:
|
||||
>>> batch._parse_package_spec("numpy>=1.20")
|
||||
("numpy", ">=1.20")
|
||||
>>> batch._parse_package_spec("requests[security]>=2.0,<3.0")
|
||||
("requests", ">=2.0,<3.0")
|
||||
"""
|
||||
try:
|
||||
req = Requirement(package_info)
|
||||
package_name = req.name # Normalized package name
|
||||
version_spec = str(req.specifier) if req.specifier else None
|
||||
return package_name, version_spec
|
||||
except Exception as e:
|
||||
raise ValueError(f"Invalid package spec: {package_info}") from e
|
||||
|
||||
def _evaluate_condition(self, condition: Optional[Dict], package_name: str,
|
||||
installed_packages: Dict[str, str]) -> bool:
|
||||
"""
|
||||
Evaluate policy condition and return whether it's satisfied.
|
||||
|
||||
Args:
|
||||
condition: Policy condition object (dict) or None
|
||||
package_name: Current package being processed
|
||||
installed_packages: Dictionary of {package_name: version}
|
||||
|
||||
Returns:
|
||||
True if condition is satisfied, False otherwise
|
||||
None condition always returns True
|
||||
|
||||
Example:
|
||||
>>> condition = {"type": "installed", "package": "numpy", "spec": ">=1.20"}
|
||||
>>> batch._evaluate_condition(condition, "numba", {"numpy": "1.26.0"})
|
||||
True
|
||||
"""
|
||||
# No condition means always satisfied
|
||||
if condition is None:
|
||||
return True
|
||||
|
||||
condition_type = condition.get("type")
|
||||
|
||||
if condition_type == "installed":
|
||||
# Check if a package is installed with optional version spec
|
||||
target_package = condition.get("package", package_name)
|
||||
installed_version = installed_packages.get(target_package)
|
||||
|
||||
# Package not installed
|
||||
if installed_version is None:
|
||||
return False
|
||||
|
||||
# Check version spec if provided
|
||||
spec = condition.get("spec")
|
||||
if spec:
|
||||
try:
|
||||
specifier = SpecifierSet(spec)
|
||||
return Version(installed_version) in specifier
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to compare version {installed_version} with spec {spec}: {e}")
|
||||
return False
|
||||
|
||||
# Package is installed (no spec check)
|
||||
return True
|
||||
|
||||
elif condition_type == "platform":
|
||||
# Check platform conditions (os, has_gpu, comfyui_version)
|
||||
conditions_met = True
|
||||
|
||||
# Check OS
|
||||
if "os" in condition:
|
||||
expected_os = condition["os"].lower()
|
||||
actual_os = platform.system().lower()
|
||||
if expected_os not in actual_os and actual_os not in expected_os:
|
||||
conditions_met = False
|
||||
|
||||
# Check GPU availability
|
||||
if "has_gpu" in condition:
|
||||
expected_gpu = condition["has_gpu"]
|
||||
try:
|
||||
import torch
|
||||
has_gpu = torch.cuda.is_available()
|
||||
except ImportError:
|
||||
has_gpu = False
|
||||
|
||||
if expected_gpu != has_gpu:
|
||||
conditions_met = False
|
||||
|
||||
# Check ComfyUI version
|
||||
if "comfyui_version" in condition:
|
||||
# TODO: Implement ComfyUI version check
|
||||
logger.warning("ComfyUI version condition not yet implemented")
|
||||
|
||||
return conditions_met
|
||||
|
||||
else:
|
||||
logger.warning(f"Unknown condition type: {condition_type}")
|
||||
return False
|
||||
|
||||
def install(self, package_info: str, extra_index_url: Optional[str] = None,
|
||||
override_policy: bool = False) -> bool:
|
||||
"""
|
||||
Install a pip package with policy-based modifications.
|
||||
|
||||
Args:
|
||||
package_info: Package specification (e.g., "numpy", "numpy==1.26.0", "numpy>=1.20.0")
|
||||
extra_index_url: Additional package repository URL (optional)
|
||||
override_policy: If True, skip policy application and install directly (default: False)
|
||||
|
||||
Returns:
|
||||
True if installation succeeded, False if skipped by policy
|
||||
|
||||
Raises:
|
||||
ValueError: If package_info cannot be parsed
|
||||
subprocess.CalledProcessError: If installation fails (depending on policy on_failure settings)
|
||||
|
||||
Example:
|
||||
>>> with PipBatch() as batch:
|
||||
... batch.install("numpy>=1.20")
|
||||
... batch.install("torch", override_policy=True)
|
||||
"""
|
||||
# Parse package spec
|
||||
try:
|
||||
package_name, version_spec = self._parse_package_spec(package_info)
|
||||
except ValueError as e:
|
||||
logger.error(f"Invalid package spec: {e}")
|
||||
raise
|
||||
|
||||
# Get installed packages cache
|
||||
installed_packages = self._get_installed_packages()
|
||||
|
||||
# Override policy - skip to direct installation
|
||||
if override_policy:
|
||||
logger.info(f"Installing {package_info} (policy override)")
|
||||
cmd = manager_util.make_pip_cmd(["install", package_info])
|
||||
if extra_index_url:
|
||||
cmd.extend(["--extra-index-url", extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
logger.info(f"Successfully installed {package_info}")
|
||||
return True
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.error(f"Failed to install {package_info}: {e}")
|
||||
raise
|
||||
|
||||
# Get policy (lazy loading)
|
||||
pip_policy = get_pip_policy()
|
||||
policy = pip_policy.get(package_name, {})
|
||||
|
||||
# If no policy, proceed with default installation
|
||||
if not policy:
|
||||
logger.debug(f"No policy found for {package_name}, proceeding with default installation")
|
||||
cmd = manager_util.make_pip_cmd(["install", package_info])
|
||||
if extra_index_url:
|
||||
cmd.extend(["--extra-index-url", extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
logger.info(f"Successfully installed {package_info}")
|
||||
return True
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.error(f"Failed to install {package_info}: {e}")
|
||||
raise
|
||||
|
||||
# Apply apply_first_match policies (exclusive - first match only)
|
||||
final_package_info = package_info
|
||||
final_extra_index_url = extra_index_url
|
||||
policy_reason = None
|
||||
|
||||
apply_first_match = policy.get("apply_first_match", [])
|
||||
for policy_item in apply_first_match:
|
||||
condition = policy_item.get("condition")
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
policy_type = policy_item.get("type")
|
||||
|
||||
if policy_type == "skip":
|
||||
reason = policy_item.get("reason", "No reason provided")
|
||||
logger.info(f"Skipping installation of {package_name}: {reason}")
|
||||
return False
|
||||
|
||||
elif policy_type == "force_version":
|
||||
forced_version = policy_item.get("version")
|
||||
final_package_info = f"{package_name}=={forced_version}"
|
||||
policy_reason = policy_item.get("reason")
|
||||
if "extra_index_url" in policy_item:
|
||||
final_extra_index_url = policy_item["extra_index_url"]
|
||||
logger.info(f"Force version for {package_name}: {forced_version} ({policy_reason})")
|
||||
break # First match only
|
||||
|
||||
elif policy_type == "replace":
|
||||
replacement = policy_item.get("replacement")
|
||||
replacement_version = policy_item.get("version", "")
|
||||
if replacement_version:
|
||||
final_package_info = f"{replacement}{replacement_version}"
|
||||
else:
|
||||
final_package_info = replacement
|
||||
policy_reason = policy_item.get("reason")
|
||||
if "extra_index_url" in policy_item:
|
||||
final_extra_index_url = policy_item["extra_index_url"]
|
||||
logger.info(f"Replacing {package_name} with {final_package_info}: {policy_reason}")
|
||||
break # First match only
|
||||
|
||||
# Apply apply_all_matches policies (cumulative - all matches)
|
||||
additional_packages = []
|
||||
pinned_packages = []
|
||||
pin_on_failure = "fail"
|
||||
|
||||
apply_all_matches = policy.get("apply_all_matches", [])
|
||||
for policy_item in apply_all_matches:
|
||||
condition = policy_item.get("condition")
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
policy_type = policy_item.get("type")
|
||||
|
||||
if policy_type == "pin_dependencies":
|
||||
pin_list = policy_item.get("pinned_packages", [])
|
||||
for pkg in pin_list:
|
||||
installed_version = installed_packages.get(pkg)
|
||||
if installed_version:
|
||||
pinned_packages.append(f"{pkg}=={installed_version}")
|
||||
else:
|
||||
logger.warning(f"Cannot pin {pkg}: not currently installed")
|
||||
pin_on_failure = policy_item.get("on_failure", "fail")
|
||||
reason = policy_item.get("reason", "")
|
||||
logger.info(f"Pinning dependencies: {pinned_packages} ({reason})")
|
||||
|
||||
elif policy_type == "install_with":
|
||||
additional = policy_item.get("additional_packages", [])
|
||||
additional_packages.extend(additional)
|
||||
reason = policy_item.get("reason", "")
|
||||
logger.info(f"Installing additional packages: {additional} ({reason})")
|
||||
|
||||
elif policy_type == "warn":
|
||||
message = policy_item.get("message", "")
|
||||
allow_continue = policy_item.get("allow_continue", True)
|
||||
logger.warning(f"Policy warning for {package_name}: {message}")
|
||||
if not allow_continue:
|
||||
# TODO: Implement user confirmation
|
||||
logger.info("User confirmation required (not implemented, continuing)")
|
||||
|
||||
# Build final package list
|
||||
packages_to_install = [final_package_info] + pinned_packages + additional_packages
|
||||
|
||||
# Execute installation
|
||||
cmd = manager_util.make_pip_cmd(["install"] + packages_to_install)
|
||||
if final_extra_index_url:
|
||||
cmd.extend(["--extra-index-url", final_extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
if policy_reason:
|
||||
logger.info(f"Successfully installed {final_package_info}: {policy_reason}")
|
||||
else:
|
||||
logger.info(f"Successfully installed {final_package_info}")
|
||||
return True
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
# Handle installation failure
|
||||
if pinned_packages and pin_on_failure == "retry_without_pin":
|
||||
logger.warning(f"Installation failed with pinned dependencies, retrying without pins")
|
||||
retry_cmd = manager_util.make_pip_cmd(["install", final_package_info])
|
||||
if final_extra_index_url:
|
||||
retry_cmd.extend(["--extra-index-url", final_extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(retry_cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
logger.info(f"Successfully installed {final_package_info} (without pins)")
|
||||
return True
|
||||
except subprocess.CalledProcessError as retry_error:
|
||||
logger.error(f"Retry installation also failed: {retry_error}")
|
||||
raise
|
||||
|
||||
elif pin_on_failure == "fail":
|
||||
logger.error(f"Installation failed: {e}")
|
||||
raise
|
||||
|
||||
else:
|
||||
logger.warning(f"Installation failed, but continuing: {e}")
|
||||
return False
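# Illustrative walk-through (not part of the implementation): given a
# hypothetical policy for "examplepkg" whose apply_first_match contains a
# force_version entry ("1.2.3") and whose apply_all_matches contains a
# pin_dependencies entry for "depA" (installed at 2.0.0),
# install("examplepkg>=1.0") would effectively run:
#
#     pip install examplepkg==1.2.3 depA==2.0.0
#
# The first matching exclusive policy rewrites the requested spec, and every
# matching cumulative policy appends extra packages to the same pip call.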
|
||||
|
||||
def ensure_not_installed(self) -> List[str]:
|
||||
"""
|
||||
Remove all packages matching uninstall policies (batch processing).
|
||||
|
||||
Iterates through all package policies and executes uninstall actions
|
||||
where conditions are satisfied.
|
||||
|
||||
Returns:
|
||||
List of removed package names
|
||||
|
||||
Example:
|
||||
>>> with PipBatch() as batch:
|
||||
... removed = batch.ensure_not_installed()
|
||||
... print(f"Removed: {removed}")
|
||||
"""
|
||||
# Get policy (lazy loading)
|
||||
pip_policy = get_pip_policy()
|
||||
|
||||
installed_packages = self._get_installed_packages()
|
||||
removed_packages = []
|
||||
|
||||
for package_name, policy in pip_policy.items():
|
||||
uninstall_policies = policy.get("uninstall", [])
|
||||
|
||||
for uninstall_policy in uninstall_policies:
|
||||
condition = uninstall_policy.get("condition")
|
||||
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
target = uninstall_policy.get("target")
|
||||
reason = uninstall_policy.get("reason", "No reason provided")
|
||||
|
||||
# Check if target is installed
|
||||
if target in installed_packages:
|
||||
try:
|
||||
cmd = manager_util.make_pip_cmd(["uninstall", "-y", target])
|
||||
subprocess.run(cmd, check=True)
|
||||
|
||||
logger.info(f"Uninstalled {target}: {reason}")
|
||||
removed_packages.append(target)
|
||||
|
||||
# Remove from cache
|
||||
del installed_packages[target]
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Failed to uninstall {target}: {e}")
|
||||
|
||||
# First match only per package
|
||||
break
|
||||
|
||||
return removed_packages
|
||||
|
||||
def ensure_installed(self) -> List[str]:
|
||||
"""
|
||||
Restore all packages matching restore policies (batch processing).
|
||||
|
||||
Iterates through all package policies and executes restore actions
|
||||
where conditions are satisfied.
|
||||
|
||||
Returns:
|
||||
List of restored package names
|
||||
|
||||
Example:
|
||||
>>> with PipBatch() as batch:
|
||||
... batch.install("numpy>=1.20")
|
||||
... restored = batch.ensure_installed()
|
||||
... print(f"Restored: {restored}")
|
||||
"""
|
||||
# Get policy (lazy loading)
|
||||
pip_policy = get_pip_policy()
|
||||
|
||||
installed_packages = self._get_installed_packages()
|
||||
restored_packages = []
|
||||
|
||||
for package_name, policy in pip_policy.items():
|
||||
restore_policies = policy.get("restore", [])
|
||||
|
||||
for restore_policy in restore_policies:
|
||||
condition = restore_policy.get("condition")
|
||||
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
target = restore_policy.get("target")
|
||||
version = restore_policy.get("version")
|
||||
reason = restore_policy.get("reason", "No reason provided")
|
||||
extra_index_url = restore_policy.get("extra_index_url")
|
||||
|
||||
# Check if target needs restoration
|
||||
current_version = installed_packages.get(target)
|
||||
|
||||
if current_version is None or current_version != version:
|
||||
try:
|
||||
package_spec = f"{target}=={version}"
|
||||
cmd = manager_util.make_pip_cmd(["install", package_spec])
|
||||
|
||||
if extra_index_url:
|
||||
cmd.extend(["--extra-index-url", extra_index_url])
|
||||
|
||||
subprocess.run(cmd, check=True)
|
||||
|
||||
logger.info(f"Restored {package_spec}: {reason}")
|
||||
restored_packages.append(target)
|
||||
|
||||
# Update cache
|
||||
installed_packages[target] = version
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Failed to restore {target}: {e}")
|
||||
|
||||
# First match only per package
|
||||
break
|
||||
|
||||
return restored_packages
|
||||
2916
comfyui_manager/common/pip_util.test-design.md
Normal file
File diff suppressed because it is too large
34
tests/.gitignore
vendored
Normal file
@ -0,0 +1,34 @@
|
||||
# Test environment and artifacts
|
||||
|
||||
# Virtual environment
|
||||
test_venv/
|
||||
venv/
|
||||
env/
|
||||
|
||||
# pytest cache
|
||||
.pytest_cache/
|
||||
__pycache__/
|
||||
*.pyc
|
||||
*.pyo
|
||||
|
||||
# Coverage reports (module-specific naming)
|
||||
.coverage
|
||||
.coverage.*
|
||||
htmlcov*/
|
||||
coverage*.xml
|
||||
*.cover
|
||||
|
||||
# Test artifacts
|
||||
.tox/
|
||||
.hypothesis/
|
||||
|
||||
# IDE
|
||||
.vscode/
|
||||
.idea/
|
||||
*.swp
|
||||
*.swo
|
||||
*~
|
||||
|
||||
# OS
|
||||
.DS_Store
|
||||
Thumbs.db
|
||||
181
tests/README.md
Normal file
@ -0,0 +1,181 @@
|
||||
# ComfyUI Manager Test Suite
|
||||
|
||||
This directory contains all tests for the ComfyUI Manager project, organized by module structure.
|
||||
|
||||
## Directory Structure
|
||||
|
||||
```
|
||||
tests/
|
||||
├── setup_test_env.sh # Setup isolated test environment
|
||||
├── requirements.txt # Test dependencies
|
||||
├── pytest.ini # Global pytest configuration
|
||||
├── .gitignore # Ignore test artifacts
|
||||
│
|
||||
└── common/ # Tests for comfyui_manager/common/
|
||||
└── pip_util/ # Tests for pip_util.py
|
||||
├── README.md # pip_util test documentation
|
||||
├── conftest.py # pip_util test fixtures
|
||||
├── pytest.ini # pip_util-specific pytest config
|
||||
└── test_*.py # Actual test files (to be created)
|
||||
```
|
||||
|
||||
## Quick Start
|
||||
|
||||
### 1. Setup Test Environment (One Time)
|
||||
|
||||
```bash
|
||||
cd tests
|
||||
./setup_test_env.sh
|
||||
```
|
||||
|
||||
This creates an isolated virtual environment with all test dependencies.
|
||||
|
||||
### 2. Run Tests
|
||||
|
||||
```bash
|
||||
# Activate test environment
|
||||
source test_venv/bin/activate
|
||||
|
||||
# Run all tests from root
|
||||
cd tests
|
||||
pytest
|
||||
|
||||
# Run specific module tests
|
||||
cd tests/common/pip_util
|
||||
pytest
|
||||
|
||||
# Deactivate when done
|
||||
deactivate
|
||||
```
|
||||
|
||||
## Test Organization
|
||||
|
||||
Tests mirror the source code structure:
|
||||
|
||||
| Source Code | Test Location |
|
||||
|-------------|---------------|
|
||||
| `comfyui_manager/common/pip_util.py` | `tests/common/pip_util/test_*.py` |
|
||||
| `comfyui_manager/common/other.py` | `tests/common/other/test_*.py` |
|
||||
| `comfyui_manager/module/file.py` | `tests/module/file/test_*.py` |
|
||||
|
||||
## Writing Tests
|
||||
|
||||
1. Create test directory matching source structure
|
||||
2. Add `conftest.py` for module-specific fixtures
|
||||
3. Add `pytest.ini` for module-specific configuration (optional)
|
||||
4. Create `test_*.py` files with actual tests (a minimal sketch follows this list)
|
||||
5. Document in module-specific README
|
||||
|
||||
## Test Categories
|
||||
|
||||
Use pytest markers to categorize tests:
|
||||
|
||||
```python
|
||||
@pytest.mark.unit
|
||||
def test_simple_function():
|
||||
pass
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_complex_workflow():
|
||||
pass
|
||||
|
||||
@pytest.mark.e2e
|
||||
def test_full_system():
|
||||
pass
|
||||
```
|
||||
|
||||
Run by category:
|
||||
```bash
|
||||
pytest -m unit # Only unit tests
|
||||
pytest -m integration # Only integration tests
|
||||
pytest -m e2e # Only end-to-end tests
|
||||
```
|
||||
|
||||
## Coverage Reports
|
||||
|
||||
Coverage reports are generated per module:
|
||||
|
||||
```bash
|
||||
cd tests/common/pip_util
|
||||
pytest # Generates htmlcov_pip_util/ and coverage_pip_util.xml
|
||||
```
|
||||
|
||||
## Environment Isolation
|
||||
|
||||
**Why use venv?**
|
||||
- ✅ Prevents test dependencies from corrupting main environment
|
||||
- ✅ Allows safe package installation/uninstallation during tests
|
||||
- ✅ Consistent test results across machines
|
||||
- ✅ Easy to recreate clean environment
|
||||
|
||||
## Available Test Modules
|
||||
|
||||
- **[common/pip_util](common/pip_util/)** - Policy-based pip package management system tests
|
||||
- Unit tests for policy loading, parsing, condition evaluation
|
||||
- Integration tests for policy application (60% of tests)
|
||||
- End-to-end workflow tests
|
||||
|
||||
## Adding New Test Modules
|
||||
|
||||
1. Create directory structure: `tests/module_path/component_name/`
|
||||
2. Add `conftest.py` with fixtures
|
||||
3. Add `pytest.ini` if needed (optional)
|
||||
4. Add `README.md` documenting the tests
|
||||
5. Create `test_*.py` files
|
||||
|
||||
Example:
|
||||
```bash
|
||||
mkdir -p tests/data_models/config
|
||||
cd tests/data_models/config
|
||||
touch conftest.py README.md test_config_loader.py
|
||||
```
|
||||
|
||||
## CI/CD Integration
|
||||
|
||||
Tests are designed to run in CI/CD pipelines:
|
||||
|
||||
```yaml
|
||||
# Example GitHub Actions
|
||||
- name: Setup test environment
|
||||
run: |
|
||||
cd tests
|
||||
./setup_test_env.sh
|
||||
|
||||
- name: Run tests
|
||||
run: |
|
||||
source tests/test_venv/bin/activate
|
||||
pytest tests/
|
||||
```
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Import errors
|
||||
```bash
|
||||
# Make sure venv is activated
|
||||
source test_venv/bin/activate
|
||||
|
||||
# Verify Python path
|
||||
python -c "import sys; print(sys.path)"
|
||||
```
|
||||
|
||||
### Tests not discovered
|
||||
```bash
|
||||
# Check pytest configuration
|
||||
pytest --collect-only
|
||||
|
||||
# Verify test file naming (must start with test_)
|
||||
ls test_*.py
|
||||
```
|
||||
|
||||
### Clean rebuild
|
||||
```bash
|
||||
# Remove and recreate test environment
|
||||
rm -rf test_venv/
|
||||
./setup_test_env.sh
|
||||
```
|
||||
|
||||
## Resources
|
||||
|
||||
- **pytest Documentation**: https://docs.pytest.org/
|
||||
- **Coverage.py**: https://coverage.readthedocs.io/
|
||||
- **Module-specific READMEs**: Check each test module directory
|
||||
423
tests/common/pip_util/CONTEXT_FILES_GUIDE.md
Normal file
@ -0,0 +1,423 @@
|
||||
# Context Files Guide for pip_util Tests
|
||||
|
||||
Quick reference for all context files created for extending pip_util tests.
|
||||
|
||||
---
|
||||
|
||||
## 📋 File Overview
|
||||
|
||||
| File | Purpose | When to Use |
|
||||
|------|---------|-------------|
|
||||
| **DEPENDENCY_TREE_CONTEXT.md** | Complete dependency trees with version analysis | Adding new test packages or updating scenarios |
|
||||
| **DEPENDENCY_ANALYSIS.md** | Analysis methodology and findings | Understanding why packages were chosen |
|
||||
| **TEST_SCENARIOS.md** | Detailed test specifications | Writing new tests or understanding existing ones |
|
||||
| **analyze_dependencies.py** | Interactive dependency analyzer | Exploring new packages before adding tests |
|
||||
| **requirements-test-base.txt** | Base test environment packages | Setting up or modifying test environment |
|
||||
|
||||
---
|
||||
|
||||
## 🎯 Common Tasks
|
||||
|
||||
### Task 1: Adding a New Test Package
|
||||
|
||||
**Steps**:
|
||||
|
||||
1. **Analyze the package**:
|
||||
```bash
|
||||
python analyze_dependencies.py NEW_PACKAGE
|
||||
```
|
||||
|
||||
2. **Check size and dependencies**:
|
||||
```bash
|
||||
./test_venv/bin/pip download --no-deps NEW_PACKAGE
|
||||
ls -lh NEW_PACKAGE*.whl # Check size
|
||||
```
|
||||
|
||||
3. **Verify dependency tree**:
|
||||
- Open **DEPENDENCY_TREE_CONTEXT.md**
|
||||
- Follow "Adding New Test Scenarios" section
|
||||
- Document findings in the file
|
||||
|
||||
4. **Update requirements** (if pre-installation needed):
|
||||
- Add to `requirements-test-base.txt`
|
||||
- Run `./setup_test_env.sh` to recreate venv
|
||||
|
||||
5. **Write test**:
|
||||
- Follow patterns in `test_dependency_protection.py`
|
||||
- Use `reset_test_venv` fixture
|
||||
- Add scenario to **TEST_SCENARIOS.md**
|
||||
|
||||
6. **Verify**:
|
||||
```bash
|
||||
pytest test_YOUR_NEW_TEST.py -v --override-ini="addopts="
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Task 2: Understanding Existing Tests
|
||||
|
||||
**Steps**:
|
||||
|
||||
1. **Read test scenario**:
|
||||
- Open **TEST_SCENARIOS.md**
|
||||
- Find your scenario (1-6)
|
||||
- Review initial state, action, expected result
|
||||
|
||||
2. **Check dependency details**:
|
||||
- Open **DEPENDENCY_TREE_CONTEXT.md**
|
||||
- Look up package in table of contents
|
||||
- Review dependency tree and version analysis
|
||||
|
||||
3. **Run analysis**:
|
||||
```bash
|
||||
python analyze_dependencies.py PACKAGE
|
||||
```
|
||||
|
||||
4. **Examine test code**:
|
||||
- Open relevant test file
|
||||
- Check policy fixture
|
||||
- Review assertions
|
||||
|
||||
---
|
||||
|
||||
### Task 3: Updating for New Package Versions
|
||||
|
||||
**When**: PyPI releases major version updates (e.g., urllib3 3.0)
|
||||
|
||||
**Steps**:
|
||||
|
||||
1. **Check current environment**:
|
||||
```bash
|
||||
python analyze_dependencies.py --env
|
||||
```
|
||||
|
||||
2. **Analyze new versions**:
|
||||
```bash
|
||||
./test_venv/bin/pip index versions PACKAGE | head -20
|
||||
python analyze_dependencies.py PACKAGE
|
||||
```
|
||||
|
||||
3. **Update context files**:
|
||||
- Update version numbers in **DEPENDENCY_TREE_CONTEXT.md**
|
||||
- Update "Version Analysis" section
|
||||
- Document breaking changes
|
||||
|
||||
4. **Test with new versions**:
|
||||
- Update `requirements-test-base.txt` (if testing new base version)
|
||||
- OR update test to verify protection from new version
|
||||
- Run tests to verify behavior
|
||||
|
||||
5. **Update scenarios**:
|
||||
- Update **TEST_SCENARIOS.md** with new version numbers
|
||||
- Update expected results if behavior changed
|
||||
|
||||
---
|
||||
|
||||
### Task 4: Debugging Dependency Issues
|
||||
|
||||
**Problem**: Test fails with unexpected dependency versions
|
||||
|
||||
**Steps**:
|
||||
|
||||
1. **Check what's installed**:
|
||||
```bash
|
||||
./test_venv/bin/pip freeze | grep -E "(urllib3|certifi|six|requests)"
|
||||
```
|
||||
|
||||
2. **Analyze what would install**:
|
||||
```bash
|
||||
python analyze_dependencies.py PACKAGE
|
||||
```
|
||||
|
||||
3. **Compare with expected**:
|
||||
- Open **DEPENDENCY_TREE_CONTEXT.md**
|
||||
- Check "Install Scenarios" for the package
|
||||
- Compare actual vs. expected
|
||||
|
||||
4. **Check for PyPI changes**:
|
||||
```bash
|
||||
./test_venv/bin/pip index versions PACKAGE
|
||||
```
|
||||
|
||||
5. **Verify test environment**:
|
||||
```bash
|
||||
rm -rf test_venv && ./setup_test_env.sh
|
||||
pytest test_FILE.py -v --override-ini="addopts="
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 📚 Context File Details
|
||||
|
||||
### DEPENDENCY_TREE_CONTEXT.md
|
||||
|
||||
**Contents**:
|
||||
- Current test environment snapshot
|
||||
- Complete dependency trees for all test packages
|
||||
- Version analysis (current vs. latest)
|
||||
- Upgrade scenarios matrix
|
||||
- Guidelines for adding new scenarios
|
||||
- Quick reference tables
|
||||
|
||||
**Use when**:
|
||||
- Adding new test package
|
||||
- Understanding why a package was chosen
|
||||
- Checking version compatibility
|
||||
- Updating for new PyPI releases
|
||||
|
||||
**Key sections**:
|
||||
- Package Dependency Trees → See what each package depends on
|
||||
- Version Analysis → Understand version gaps and breaking changes
|
||||
- Adding New Test Scenarios → Step-by-step guide
|
||||
|
||||
---
|
||||
|
||||
### DEPENDENCY_ANALYSIS.md
|
||||
|
||||
**Contents**:
|
||||
- Detailed analysis of each test scenario
|
||||
- Real dependency verification using `pip --dry-run`
|
||||
- Version difference analysis
|
||||
- Rejected scenarios (and why)
|
||||
- Package size verification
|
||||
- Recommendations for implementation
|
||||
|
||||
**Use when**:
|
||||
- Understanding test design decisions
|
||||
- Evaluating new package candidates
|
||||
- Reviewing why certain packages were rejected
|
||||
- Learning the analysis methodology
|
||||
|
||||
**Key sections**:
|
||||
- Test Scenarios with Real Dependencies → Detailed scenarios
|
||||
- Rejected Scenarios → What NOT to use (e.g., click+colorama)
|
||||
- Validation Commands → How to verify analysis
|
||||
|
||||
---
|
||||
|
||||
### TEST_SCENARIOS.md
|
||||
|
||||
**Contents**:
|
||||
- Complete specifications for scenarios 1-6
|
||||
- Exact package versions and states
|
||||
- Policy configurations (JSON)
|
||||
- Expected pip commands
|
||||
- Expected final states
|
||||
- Key points for each scenario
|
||||
|
||||
**Use when**:
|
||||
- Writing new tests
|
||||
- Understanding test expectations
|
||||
- Debugging test failures
|
||||
- Documenting new scenarios
|
||||
|
||||
**Key sections**:
|
||||
- Each scenario section → Complete specification
|
||||
- Summary tables → Quick reference
|
||||
- Policy types summary → Available policy options
|
||||
|
||||
---
|
||||
|
||||
### analyze_dependencies.py
|
||||
|
||||
**Features**:
|
||||
- Interactive package analysis
|
||||
- Dry-run simulation
|
||||
- Version comparison
|
||||
- Pin impact analysis
|
||||
|
||||
**Use when**:
|
||||
- Exploring new packages
|
||||
- Verifying current environment
|
||||
- Checking upgrade impacts
|
||||
- Quick dependency checks
|
||||
|
||||
**Commands**:
|
||||
```bash
|
||||
# Analyze specific package
|
||||
python analyze_dependencies.py requests
|
||||
|
||||
# Analyze all test packages
|
||||
python analyze_dependencies.py --all
|
||||
|
||||
# Show current environment
|
||||
python analyze_dependencies.py --env
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### requirements-test-base.txt
|
||||
|
||||
**Contents**:
|
||||
- Base packages for test environment
|
||||
- Version specifications
|
||||
- Comments explaining each package's purpose
|
||||
|
||||
**Use when**:
|
||||
- Setting up test environment
|
||||
- Adding pre-installed packages
|
||||
- Modifying base versions
|
||||
- Recreating clean environment
|
||||
|
||||
**Format**:
|
||||
```txt
|
||||
# Scenario X: Purpose
|
||||
package==version # Comment explaining role
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 🔄 Workflow Examples
|
||||
|
||||
### Example 1: Adding flask Test
|
||||
|
||||
```bash
|
||||
# 1. Analyze flask
|
||||
python analyze_dependencies.py flask
|
||||
|
||||
# Output shows:
|
||||
# Would install: Flask, Jinja2, MarkupSafe, Werkzeug, blinker, click, itsdangerous
|
||||
|
||||
# 2. Check sizes
|
||||
./test_venv/bin/pip download --no-deps flask jinja2 werkzeug
|
||||
ls -lh *.whl
|
||||
|
||||
# 3. Document in DEPENDENCY_TREE_CONTEXT.md
|
||||
# Add section:
|
||||
### 3. flask → Dependencies
|
||||
**Package**: `flask==3.1.2`
|
||||
**Size**: ~100KB
|
||||
...
|
||||
|
||||
# 4. Write test
|
||||
# Create test_flask_dependencies.py
|
||||
|
||||
# 5. Test
|
||||
pytest test_flask_dependencies.py -v --override-ini="addopts="
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Example 2: Investigating Test Failure
|
||||
|
||||
```bash
|
||||
# Test failed: "urllib3 version mismatch"
|
||||
|
||||
# 1. Check installed
|
||||
./test_venv/bin/pip freeze | grep urllib3
|
||||
# Output: urllib3==2.5.0 (expected: 1.26.15)
|
||||
|
||||
# 2. Analyze what happened
|
||||
python analyze_dependencies.py requests
|
||||
|
||||
# 3. Check context
|
||||
# Open DEPENDENCY_TREE_CONTEXT.md
|
||||
# Section: "urllib3: Major Version Jump"
|
||||
# Confirms: 1.26.15 → 2.5.0 is expected without pin
|
||||
|
||||
# 4. Verify test has pin
|
||||
# Check test_dependency_protection.py for pin_policy fixture
|
||||
|
||||
# 5. Reset environment
|
||||
rm -rf test_venv && ./setup_test_env.sh
|
||||
|
||||
# 6. Re-run test
|
||||
pytest test_dependency_protection.py -v --override-ini="addopts="
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## 🎓 Best Practices
|
||||
|
||||
### When Adding New Tests
|
||||
|
||||
✅ **DO**:
|
||||
- Use `analyze_dependencies.py` first
|
||||
- Document in **DEPENDENCY_TREE_CONTEXT.md**
|
||||
- Add scenario to **TEST_SCENARIOS.md**
|
||||
- Verify with real pip operations
|
||||
- Keep packages lightweight (<500KB total)
|
||||
|
||||
❌ **DON'T**:
|
||||
- Add packages without verifying dependencies
|
||||
- Use packages with optional dependencies only
|
||||
- Add heavy packages (>1MB)
|
||||
- Skip documentation
|
||||
- Mock subprocess for integration tests
|
||||
|
||||
---
|
||||
|
||||
### When Updating Context
|
||||
|
||||
✅ **DO**:
|
||||
- Re-run `analyze_dependencies.py --all`
|
||||
- Update version numbers throughout
|
||||
- Document breaking changes
|
||||
- Test after updates
|
||||
- Note update date
|
||||
|
||||
❌ **DON'T**:
|
||||
- Update only one file
|
||||
- Skip verification
|
||||
- Forget to update TEST_SCENARIOS.md
|
||||
- Leave outdated version numbers
|
||||
|
||||
---
|
||||
|
||||
## 🆘 Quick Troubleshooting
|
||||
|
||||
| Problem | Check | Solution |
|
||||
|---------|-------|----------|
|
||||
| Test fails with version mismatch | `pip freeze` | Recreate venv with `./setup_test_env.sh` |
|
||||
| Package not found | `pip index versions PKG` | Check if package exists on PyPI |
|
||||
| Unexpected dependencies | `analyze_dependencies.py PKG` | Review dependency tree in context file |
|
||||
| Wrong test data | **TEST_SCENARIOS.md** | Verify against documented scenario |
|
||||
| Unclear why package chosen | **DEPENDENCY_ANALYSIS.md** | Read "Rejected Scenarios" section |
|
||||
|
||||
---
|
||||
|
||||
## 📞 Need Help?
|
||||
|
||||
1. **Check context files first**: Most answers are documented
|
||||
2. **Run analyze_dependencies.py**: Verify current state
|
||||
3. **Review test scenarios**: Understand expected behavior
|
||||
4. **Examine dependency trees**: Understand relationships
|
||||
5. **Check DEPENDENCY_ANALYSIS.md**: Learn the "why" behind decisions
|
||||
|
||||
---
|
||||
|
||||
## 📝 Maintenance Checklist
|
||||
|
||||
**Every 6 months or when major versions release**:
|
||||
|
||||
- [ ] Run `python analyze_dependencies.py --all`
|
||||
- [ ] Check for new major versions: `pip index versions urllib3 certifi six`
|
||||
- [ ] Update **DEPENDENCY_TREE_CONTEXT.md** version numbers
|
||||
- [ ] Update **TEST_SCENARIOS.md** expected versions
|
||||
- [ ] Test all scenarios: `pytest -v --override-ini="addopts="`
|
||||
- [ ] Document any breaking changes
|
||||
- [ ] Update this guide if workflow changed
|
||||
|
||||
---
|
||||
|
||||
## 🔗 File Relationships
|
||||
|
||||
```
|
||||
requirements-test-base.txt
|
||||
↓ (defines)
|
||||
Current Test Environment
|
||||
↓ (analyzed by)
|
||||
analyze_dependencies.py
|
||||
↓ (documents)
|
||||
DEPENDENCY_TREE_CONTEXT.md
|
||||
↓ (informs)
|
||||
TEST_SCENARIOS.md
|
||||
↓ (implemented in)
|
||||
test_*.py files
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
**Last Updated**: 2025-10-01
|
||||
**Python Version**: 3.12.3
|
||||
**pip Version**: 25.2
|
||||
261
tests/common/pip_util/DEPENDENCY_ANALYSIS.md
Normal file
@ -0,0 +1,261 @@
|
||||
# pip_util Test Package Dependency Analysis
|
||||
|
||||
Real dependency analysis using `pip install --dry-run` to verify meaningful test scenarios.
|
||||
|
||||
## Analysis Date
|
||||
|
||||
Generated: 2025-10-01
|
||||
Tool: `pip install --dry-run --ignore-installed`
|
||||
|
||||
## Test Scenarios with Real Dependencies
|
||||
|
||||
### Scenario 1: Dependency Version Protection (requests + urllib3)
|
||||
|
||||
**Purpose**: Verify pin_dependencies prevents unwanted upgrades
|
||||
|
||||
**Initial Environment**:
|
||||
```
|
||||
urllib3==1.26.15
|
||||
certifi==2023.7.22
|
||||
charset-normalizer==3.2.0
|
||||
```
|
||||
|
||||
**Without pin** (`pip install requests`):
|
||||
```bash
|
||||
Would install:
|
||||
certifi-2025.8.3 # UPGRADED from 2023.7.22 (+2 years)
|
||||
charset-normalizer-3.4.3 # UPGRADED from 3.2.0 (minor)
|
||||
idna-3.10 # NEW dependency
|
||||
requests-2.32.5 # NEW package
|
||||
urllib3-2.5.0 # UPGRADED from 1.26.15 (MAJOR 1.x→2.x!)
|
||||
```
|
||||
|
||||
**With pin** (`pip install requests urllib3==1.26.15 certifi==2023.7.22 charset-normalizer==3.2.0`):
|
||||
```bash
|
||||
Would install:
|
||||
idna-3.10 # NEW dependency (required by requests)
|
||||
requests-2.32.5 # NEW package
|
||||
|
||||
# Pinned packages stay at old versions:
|
||||
urllib3==1.26.15 ✅ PROTECTED (prevented 1.x→2.x jump)
|
||||
certifi==2023.7.22 ✅ PROTECTED
|
||||
charset-normalizer==3.2.0 ✅ PROTECTED
|
||||
```
|
||||
|
||||
**Key Finding**:
|
||||
- `urllib3` 1.26.15 → 2.5.0 is a **MAJOR version jump** (breaking changes!)
|
||||
- requests accepts both: `urllib3<3,>=1.21.1` (compatible with 1.x and 2.x)
|
||||
- Pin successfully prevents unwanted major upgrade
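
A `pin_dependencies` policy entry that would produce the pinned command above could look roughly like this, shown as a Python dict the way a test fixture might define it (the exact schema is still a draft):

```python
# Hypothetical policy entry for this scenario; field names follow the draft
# pip_util policy reader (pinned versions are taken from the installed packages)
requests_pin_policy = {
    "requests": {
        "apply_all_matches": [
            {
                "type": "pin_dependencies",
                "pinned_packages": ["urllib3", "certifi", "charset-normalizer"],
                "on_failure": "retry_without_pin",
                "reason": "Keep urllib3 1.x and the existing SSL stack",
            }
        ]
    }
}
```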
|
||||
|
||||
---
|
||||
|
||||
### Scenario 2: Package with Dependency (python-dateutil + six)
|
||||
|
||||
**Purpose**: Verify pin_dependencies with dependency chain
|
||||
|
||||
**Analysis**:
|
||||
```bash
|
||||
$ pip install --dry-run python-dateutil
|
||||
|
||||
Would install:
|
||||
python-dateutil-2.9.0.post0
|
||||
six-1.17.0 # DEPENDENCY
|
||||
```
|
||||
|
||||
**Initial Environment**:
|
||||
```
|
||||
six==1.16.0 # Older version
|
||||
```
|
||||
|
||||
**Without pin** (`pip install python-dateutil`):
|
||||
```bash
|
||||
Would install:
|
||||
python-dateutil-2.9.0.post0
|
||||
six-1.17.0 # UPGRADED from 1.16.0
|
||||
```
|
||||
|
||||
**With pin** (`pip install python-dateutil six==1.16.0`):
|
||||
```bash
|
||||
Would install:
|
||||
python-dateutil-2.9.0.post0
|
||||
|
||||
# Pinned package:
|
||||
six==1.16.0 ✅ PROTECTED
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Scenario 3: Package Deletion and Restore (six)
|
||||
|
||||
**Purpose**: Verify restore policy reinstalls deleted packages
|
||||
|
||||
**Initial Environment**:
|
||||
```
|
||||
six==1.16.0
|
||||
attrs==23.1.0
|
||||
packaging==23.1
|
||||
```
|
||||
|
||||
**Action Sequence**:
|
||||
1. Delete six: `pip uninstall -y six`
|
||||
2. Verify deletion: `pip freeze | grep six` (empty)
|
||||
3. Restore: `batch.ensure_installed()` → `pip install six==1.16.0`
|
||||
|
||||
**Expected Result**:
|
||||
```
|
||||
six==1.16.0 # ✅ RESTORED
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Scenario 4: Version Change and Restore (urllib3)
|
||||
|
||||
**Purpose**: Verify restore policy reverts version changes
|
||||
|
||||
**Initial Environment**:
|
||||
```
|
||||
urllib3==1.26.15
|
||||
```
|
||||
|
||||
**Action Sequence**:
|
||||
1. Upgrade: `pip install urllib3==2.5.0`
|
||||
2. Verify change: `pip freeze | grep urllib3` → `urllib3==2.5.0`
|
||||
3. Restore: `batch.ensure_installed()` → `pip install urllib3==1.26.15`
|
||||
|
||||
**Expected Result**:
|
||||
```
|
||||
urllib3==1.26.15 # ✅ RESTORED (downgraded from 2.5.0)
|
||||
```
|
||||
|
||||
**Key Finding**:
|
||||
- Downgrade from 2.x to 1.x requires explicit version specification
|
||||
- pip allows downgrades with `pip install urllib3==1.26.15`
|
||||
|
||||
---
|
||||
|
||||
## Rejected Scenarios
|
||||
|
||||
### click + colorama (NO REAL DEPENDENCY)
|
||||
|
||||
**Analysis**:
|
||||
```bash
|
||||
$ pip install --dry-run click
|
||||
Would install: click-8.3.0
|
||||
|
||||
$ pip install --dry-run click colorama==0.4.6
|
||||
Would install: click-8.3.0 # colorama not installed!
|
||||
```
|
||||
|
||||
**Finding**: click has **NO direct dependency** on colorama
|
||||
- colorama is **optional** and platform-specific (Windows only)
|
||||
- Not a good test case for dependency protection
|
||||
|
||||
**Recommendation**: Use python-dateutil + six instead
|
||||
|
||||
---
|
||||
|
||||
## Package Size Verification
|
||||
|
||||
```bash
|
||||
Package Size Version Purpose
|
||||
-------------------------------------------------------
|
||||
urllib3 ~140KB 1.26.15 Protected dependency
|
||||
certifi ~158KB 2023.7.22 SSL certificates
|
||||
charset-normalizer ~46KB 3.2.0 Charset detection
|
||||
idna ~69KB 3.10 NEW dep from requests
|
||||
requests ~100KB 2.32.5 Main package to install
|
||||
six ~11KB 1.16.0 Restore test
|
||||
python-dateutil ~280KB 2.9.0 Depends on six
|
||||
attrs ~61KB 23.1.0 Bystander
|
||||
packaging ~48KB 23.1 Bystander
|
||||
-------------------------------------------------------
|
||||
Total ~913KB (< 1MB) ✅ All lightweight
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Dependency Graph
|
||||
|
||||
```
|
||||
requests 2.32.5
|
||||
├── charset_normalizer<4,>=2 (have: 3.2.0)
|
||||
├── idna<4,>=2.5 (need: 3.10) ← NEW
|
||||
├── urllib3<3,>=1.21.1 (have: 1.26.15, latest: 2.5.0)
|
||||
└── certifi>=2017.4.17 (have: 2023.7.22, latest: 2025.8.3)
|
||||
|
||||
python-dateutil 2.9.0
|
||||
└── six>=1.5 (have: 1.16.0, latest: 1.17.0)
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Version Compatibility Matrix
|
||||
|
||||
| Package | Old Version | Latest | Spec | Compatible? |
|
||||
|---------|------------|--------|------|-------------|
|
||||
| urllib3 | 1.26.15 | 2.5.0 | <3,>=1.21.1 | ✅ Both work |
|
||||
| certifi | 2023.7.22 | 2025.8.3 | >=2017.4.17 | ✅ Both work |
|
||||
| charset-normalizer | 3.2.0 | 3.4.3 | <4,>=2 | ✅ Both work |
|
||||
| six | 1.16.0 | 1.17.0 | >=1.5 | ✅ Both work |
|
||||
| idna | (none) | 3.10 | <4,>=2.5 | ⚠️ Must install |
|
||||
|
||||
---
|
||||
|
||||
## Test Data Justification
|
||||
|
||||
### Why urllib3 1.26.15?
|
||||
1. **Real world scenario**: Many projects pin urllib3<2 to avoid breaking changes
|
||||
2. **Meaningful test**: 1.26.15 → 2.5.0 is a major version jump (API changes)
|
||||
3. **Compatibility**: requests accepts both 1.x and 2.x (good for testing)
|
||||
|
||||
### Why certifi 2023.7.22?
|
||||
1. **Real world scenario**: Older environment with outdated SSL certificates
|
||||
2. **Meaningful test**: 2-year version gap (2023 → 2025)
|
||||
3. **Safety**: Still compatible with requests
|
||||
|
||||
### Why six 1.16.0?
|
||||
1. **Lightweight**: Only 11KB
|
||||
2. **Real dependency**: python-dateutil actually depends on it
|
||||
3. **Stable**: six is mature and rarely changes
|
||||
|
||||
---
|
||||
|
||||
## Recommendations for Test Implementation
|
||||
|
||||
### ✅ Keep These Scenarios:
|
||||
1. **requests + urllib3 pin** - Real major version protection
|
||||
2. **python-dateutil + six** - Real dependency chain
|
||||
3. **six deletion/restore** - Real package management
|
||||
4. **urllib3 version change** - Real downgrade scenario
|
||||
|
||||
### ❌ Remove These Scenarios:
|
||||
1. **click + colorama** - No real dependency (colorama is optional/Windows-only)
|
||||
|
||||
### 📝 Update Required Files:
|
||||
1. `requirements-test-base.txt` - Add idna (new dependency from requests)
|
||||
2. `TEST_SCENARIOS.md` - Update with real dependency analysis
|
||||
3. `test_dependency_protection.py` - Remove click-colorama test
|
||||
4. `pip_util.design.en.md` - Update examples with verified dependencies
|
||||
|
||||
---
|
||||
|
||||
## Validation Commands
|
||||
|
||||
Run these to verify analysis:
|
||||
|
||||
```bash
|
||||
# Check current environment
|
||||
./test_venv/bin/pip freeze
|
||||
|
||||
# Simulate requests installation without pin
|
||||
./test_venv/bin/pip install --dry-run requests
|
||||
|
||||
# Simulate requests installation with pin
|
||||
./test_venv/bin/pip install --dry-run requests urllib3==1.26.15 certifi==2023.7.22 charset-normalizer==3.2.0
|
||||
|
||||
# Check python-dateutil dependencies
|
||||
./test_venv/bin/pip install --dry-run python-dateutil
|
||||
|
||||
# Verify urllib3 version availability
|
||||
./test_venv/bin/pip index versions urllib3 | head -20
|
||||
```
|
||||
413
tests/common/pip_util/DEPENDENCY_TREE_CONTEXT.md
Normal file
@ -0,0 +1,413 @@
|
||||
# Dependency Tree Context for pip_util Tests
|
||||
|
||||
**Generated**: 2025-10-01
|
||||
**Tool**: `pip install --dry-run --ignore-installed`
|
||||
**Python**: 3.12.3
|
||||
**pip**: 25.2
|
||||
|
||||
This document provides detailed dependency tree information for all test packages, verified against real PyPI data. Use this as a reference when extending tests.
|
||||
|
||||
---
|
||||
|
||||
## Table of Contents
|
||||
|
||||
1. [Current Test Environment](#current-test-environment)
|
||||
2. [Package Dependency Trees](#package-dependency-trees)
|
||||
3. [Version Analysis](#version-analysis)
|
||||
4. [Upgrade Scenarios](#upgrade-scenarios)
|
||||
5. [Adding New Test Scenarios](#adding-new-test-scenarios)
|
||||
|
||||
---
|
||||
|
||||
## Current Test Environment
|
||||
|
||||
**Base packages installed in test_venv** (from `requirements-test-base.txt`):
|
||||
|
||||
```
|
||||
urllib3==1.26.15 # Protected from 2.x upgrade
|
||||
certifi==2023.7.22 # Protected from 2025.x upgrade
|
||||
charset-normalizer==3.2.0 # Protected from 3.4.x upgrade
|
||||
six==1.16.0 # For deletion/restore tests
|
||||
attrs==23.1.0 # Bystander package
|
||||
packaging==23.1 # Bystander package
|
||||
pytest==8.4.2 # Test framework
|
||||
```
|
||||
|
||||
**Total environment size**: ~913KB (all packages < 1MB)
|
||||
|
||||
---
|
||||
|
||||
## Package Dependency Trees
|
||||
|
||||
### 1. requests → Dependencies
|
||||
|
||||
**Package**: `requests==2.32.5`
|
||||
**Size**: ~100KB
|
||||
**Purpose**: Main test package for dependency protection
|
||||
|
||||
#### Dependency Tree
|
||||
|
||||
```
|
||||
requests==2.32.5
|
||||
├── charset-normalizer<4,>=2
|
||||
│ └── 3.2.0 (OLD) → 3.4.3 (LATEST)
|
||||
├── idna<4,>=2.5
|
||||
│ └── (NOT INSTALLED) → 3.10 (LATEST)
|
||||
├── urllib3<3,>=1.21.1
|
||||
│ └── 1.26.15 (OLD) → 2.5.0 (LATEST) ⚠️ MAJOR VERSION JUMP
|
||||
└── certifi>=2017.4.17
|
||||
└── 2023.7.22 (OLD) → 2025.8.3 (LATEST)
|
||||
```
|
||||
|
||||
#### Install Scenarios
|
||||
|
||||
**Scenario A: Without constraints (fresh install)**
|
||||
```bash
|
||||
$ pip install --dry-run --ignore-installed requests
|
||||
|
||||
Would install:
|
||||
certifi-2025.8.3 # Latest version
|
||||
charset-normalizer-3.4.3 # Latest version
|
||||
idna-3.10 # New dependency
|
||||
requests-2.32.5 # Target package
|
||||
urllib3-2.5.0 # Latest version (2.x!)
|
||||
```
|
||||
|
||||
**Scenario B: With pin constraints**
|
||||
```bash
|
||||
$ pip install --dry-run requests \
|
||||
urllib3==1.26.15 \
|
||||
certifi==2023.7.22 \
|
||||
charset-normalizer==3.2.0
|
||||
|
||||
Would install:
|
||||
certifi-2023.7.22 # Pinned to OLD version
|
||||
charset-normalizer-3.2.0 # Pinned to OLD version
|
||||
idna-3.10 # New dependency (not pinned)
|
||||
requests-2.32.5 # Target package
|
||||
urllib3-1.26.15 # Pinned to OLD version
|
||||
```
|
||||
|
||||
**Impact Analysis**:
|
||||
- ✅ Pin successfully prevents urllib3 1.x → 2.x major upgrade
|
||||
- ✅ Pin prevents certifi 2023 → 2025 upgrade (2 years)
|
||||
- ✅ Pin prevents charset-normalizer minor upgrade
|
||||
- ⚠️ idna is NEW and NOT pinned (acceptable - new dependency)
|
||||
|
||||
---
|
||||
|
||||
### 2. python-dateutil → Dependencies
|
||||
|
||||
**Package**: `python-dateutil==2.9.0.post0`
|
||||
**Size**: ~280KB
|
||||
**Purpose**: Real dependency chain test (depends on six)
|
||||
|
||||
#### Dependency Tree
|
||||
|
||||
```
|
||||
python-dateutil==2.9.0.post0
|
||||
└── six>=1.5
|
||||
└── 1.16.0 (OLD) → 1.17.0 (LATEST)
|
||||
```
|
||||
|
||||
#### Install Scenarios
|
||||
|
||||
**Scenario A: Without constraints**
|
||||
```bash
|
||||
$ pip install --dry-run --ignore-installed python-dateutil
|
||||
|
||||
Would install:
|
||||
python-dateutil-2.9.0.post0 # Target package
|
||||
six-1.17.0 # Latest version
|
||||
```
|
||||
|
||||
**Scenario B: With pin constraints**
|
||||
```bash
|
||||
$ pip install --dry-run python-dateutil six==1.16.0
|
||||
|
||||
Would install:
|
||||
python-dateutil-2.9.0.post0 # Target package
|
||||
six-1.16.0 # Pinned to OLD version
|
||||
```
|
||||
|
||||
**Impact Analysis**:
|
||||
- ✅ Pin successfully prevents six 1.16.0 → 1.17.0 upgrade
|
||||
- ✅ Real dependency relationship (verified via PyPI)
|
||||
|
||||
---
|
||||
|
||||
### 3. Other Test Packages (No Dependencies)
|
||||
|
||||
These packages have no dependencies or only have dependencies already in the test environment:
|
||||
|
||||
```
|
||||
attrs==23.1.0 # No dependencies
|
||||
packaging==23.1 # No dependencies (standalone)
|
||||
six==1.16.0 # No dependencies (pure Python)
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Version Analysis
|
||||
|
||||
### urllib3: Major Version Jump (1.x → 2.x)
|
||||
|
||||
**Current**: 1.26.15 (2023)
|
||||
**Latest**: 2.5.0 (2025)
|
||||
**Breaking Changes**: YES - urllib3 2.0 removed deprecated APIs
|
||||
|
||||
**Available versions**:
|
||||
```
|
||||
2.x series: 2.5.0, 2.4.0, 2.3.0, 2.2.3, 2.2.2, 2.2.1, 2.2.0, 2.1.0, 2.0.7, ...
|
||||
1.26.x: 1.26.20, 1.26.19, 1.26.18, 1.26.17, 1.26.16, 1.26.15, ...
|
||||
1.25.x: 1.25.11, 1.25.10, 1.25.9, ...
|
||||
```
|
||||
|
||||
**Why test with 1.26.15?**
|
||||
- ✅ Real-world scenario: Many projects pin `urllib3<2` to avoid breaking changes
|
||||
- ✅ Meaningful test: 1.x → 2.x is a major API change
|
||||
- ✅ Compatibility: requests accepts both 1.x and 2.x (`urllib3<3,>=1.21.1`)
|
||||
|
||||
**Breaking changes in urllib3 2.0**:
|
||||
- Removed `urllib3.contrib.pyopenssl`
|
||||
- Removed `urllib3.contrib.securetransport`
|
||||
- Changed import paths for some modules
|
||||
- Updated connection pooling behavior
|
||||
|
||||
---
|
||||
|
||||
### certifi: Long-Term Version Gap (2023 → 2025)
|
||||
|
||||
**Current**: 2023.7.22 (July 2023)
|
||||
**Latest**: 2025.8.3 (August 2025)
|
||||
**Gap**: ~2 years of SSL certificate updates
|
||||
|
||||
**Available versions**:
|
||||
```
|
||||
2025: 2025.8.3, 2025.7.14, 2025.7.9, 2025.6.15, 2025.4.26, ...
|
||||
2024: 2024.12.25, 2024.11.28, 2024.10.29, 2024.9.19, ...
|
||||
2023: 2023.11.17, 2023.7.22, 2023.5.7, ...
|
||||
```
|
||||
|
||||
**Why test with 2023.7.22?**
|
||||
- ✅ Real-world scenario: Older environments with outdated SSL certificates
|
||||
- ✅ Meaningful test: 2-year gap shows protection of older versions
|
||||
- ✅ Safety: Still compatible with requests (`certifi>=2017.4.17`)
|
||||
|
||||
---
|
||||
|
||||
### charset-normalizer: Minor Version Updates
|
||||
|
||||
**Current**: 3.2.0 (2023)
|
||||
**Latest**: 3.4.3 (2025)
|
||||
**Breaking Changes**: NO - only minor/patch updates
|
||||
|
||||
**Available versions**:
|
||||
```
|
||||
3.4.x: 3.4.3, 3.4.2, 3.4.1, 3.4.0
|
||||
3.3.x: 3.3.2, 3.3.1, 3.3.0
|
||||
3.2.x: 3.2.0
|
||||
```
|
||||
|
||||
**Why test with 3.2.0?**
|
||||
- ✅ Demonstrates protection of minor version updates
|
||||
- ✅ Compatible with requests (`charset-normalizer<4,>=2`)
|
||||
|
||||
---
|
||||
|
||||
### six: Stable Version Update
|
||||
|
||||
**Current**: 1.16.0 (2021)
|
||||
**Latest**: 1.17.0 (2024)
|
||||
**Breaking Changes**: NO - six is very stable
|
||||
|
||||
**Available versions**:
|
||||
```
|
||||
1.17.0, 1.16.0, 1.15.0, 1.14.0, 1.13.0, 1.12.0, ...
|
||||
```
|
||||
|
||||
**Why test with 1.16.0?**
|
||||
- ✅ Real dependency of python-dateutil
|
||||
- ✅ Small size (11KB) - lightweight for tests
|
||||
- ✅ Demonstrates protection of stable packages
|
||||
|
||||
---
|
||||
|
||||
### idna: New Dependency
|
||||
|
||||
**Not pre-installed** - Added by requests
|
||||
|
||||
**Version**: 3.10
|
||||
**Size**: ~69KB
|
||||
**Dependency spec**: `idna<4,>=2.5` (from requests)
|
||||
|
||||
**Why NOT pre-installed?**
|
||||
- ✅ Tests that new dependencies are correctly added
|
||||
- ✅ Tests that pins only affect specified packages
|
||||
- ✅ Real-world scenario: new dependency introduced by package update
|
||||
|
||||
---
|
||||
|
||||
## Upgrade Scenarios
|
||||
|
||||
### Scenario Matrix
|
||||
|
||||
| Package | Initial | Without Pin | With Pin | Change Type |
|
||||
|---------|---------|-------------|----------|-------------|
|
||||
| **urllib3** | 1.26.15 | 2.5.0 ❌ | 1.26.15 ✅ | Major (breaking) |
|
||||
| **certifi** | 2023.7.22 | 2025.8.3 ❌ | 2023.7.22 ✅ | 2-year gap |
|
||||
| **charset-normalizer** | 3.2.0 | 3.4.3 ❌ | 3.2.0 ✅ | Minor update |
|
||||
| **six** | 1.16.0 | 1.17.0 ❌ | 1.16.0 ✅ | Stable update |
|
||||
| **idna** | (none) | 3.10 ✅ | 3.10 ✅ | New dependency |
|
||||
| **requests** | (none) | 2.32.5 ✅ | 2.32.5 ✅ | Target package |
|
||||
| **python-dateutil** | (none) | 2.9.0 ✅ | 2.9.0 ✅ | Target package |
|
||||
|
||||
---
|
||||
|
||||
## Adding New Test Scenarios
|
||||
|
||||
### Step 1: Identify Candidate Package
|
||||
|
||||
Use `pip install --dry-run` to analyze dependencies:
|
||||
|
||||
```bash
|
||||
# Analyze package dependencies
|
||||
./test_venv/bin/pip install --dry-run --ignore-installed PACKAGE
|
||||
|
||||
# Check what changes with current environment
|
||||
./test_venv/bin/pip install --dry-run PACKAGE
|
||||
|
||||
# List available versions
|
||||
./test_venv/bin/pip index versions PACKAGE
|
||||
```
|
||||
|
||||
### Step 2: Verify Real Dependencies
|
||||
|
||||
**Good candidates**:
|
||||
- ✅ Has 2+ dependencies
|
||||
- ✅ Dependencies have version upgrades available
|
||||
- ✅ Total size < 500KB (all packages combined)
|
||||
- ✅ Real-world use case (popular package)
|
||||
|
||||
**Examples**:
|
||||
```bash
|
||||
# flask → click, werkzeug, jinja2 (good: multiple dependencies)
|
||||
$ pip install --dry-run --ignore-installed flask
|
||||
Would install: Flask-3.1.2 Jinja2-3.1.6 MarkupSafe-3.0.3 Werkzeug-3.1.3 blinker-1.9.0 click-8.3.0 itsdangerous-2.2.0
|
||||
|
||||
# pytest-cov → pytest, coverage (good: popular testing tool)
|
||||
$ pip install --dry-run --ignore-installed pytest-cov
|
||||
Would install: coverage-7.10.7 pytest-8.4.2 pytest-cov-7.0.0
|
||||
```
|
||||
|
||||
**Bad candidates**:
|
||||
- ❌ click → colorama (no real dependency - colorama is optional/Windows-only)
|
||||
- ❌ pandas → numpy (too large - numpy is 50MB+)
|
||||
- ❌ torch → ... (too large - 800MB+)
|
||||
|
||||
### Step 3: Document Dependencies
|
||||
|
||||
Add to this file:
|
||||
|
||||
```markdown
|
||||
### Package: PACKAGE_NAME → Dependencies
|
||||
|
||||
**Package**: `PACKAGE==VERSION`
|
||||
**Size**: ~XXXKB
|
||||
**Purpose**: Brief description
|
||||
|
||||
#### Dependency Tree
|
||||
(Use tree format)
|
||||
|
||||
#### Install Scenarios
|
||||
(Show with/without pin)
|
||||
|
||||
#### Impact Analysis
|
||||
(What does pin protect?)
|
||||
```
|
||||
|
||||
### Step 4: Update Test Files
|
||||
|
||||
1. Add package to `requirements-test-base.txt` (if pre-installation needed)
|
||||
2. Create policy fixture in test file
|
||||
3. Write test function using the `reset_test_venv` fixture (see the sketch after this list)
|
||||
4. Update `TEST_SCENARIOS.md` with detailed scenario
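
A rough sketch of steps 2–3 is shown below. The fixture name `reset_test_venv` comes from the existing suite, but the policy-injection mechanism via the module-level cache is an assumption; check `conftest.py` for the real wiring.

```python
# Illustrative only -- a pin_dependencies scenario built on the documented
# python-dateutil -> six dependency (six pre-installed at 1.16.0)
import pytest

from comfyui_manager.common import pip_util


@pytest.mark.integration
def test_pin_keeps_six_at_base_version(reset_test_venv, monkeypatch):
    # Inject a draft policy directly into the lazy-loaded cache
    monkeypatch.setattr(pip_util, "_pip_policy_cache", {
        "python-dateutil": {
            "apply_all_matches": [{
                "type": "pin_dependencies",
                "pinned_packages": ["six"],
                "reason": "keep six at the pre-installed version",
            }]
        }
    })

    with pip_util.PipBatch() as batch:
        assert batch.install("python-dateutil")
        installed = batch._get_installed_packages()

    assert installed.get("six") == "1.16.0"
```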
|
||||
|
||||
---
|
||||
|
||||
## Maintenance Notes
|
||||
|
||||
### Updating This Document
|
||||
|
||||
Re-run analysis when:
|
||||
- ✅ PyPI releases major version updates (e.g., urllib3 3.0)
|
||||
- ✅ Adding new test packages
|
||||
- ✅ Test environment base packages change
|
||||
- ✅ Every 6 months (to catch version drift)
|
||||
|
||||
### Verification Commands
|
||||
|
||||
```bash
|
||||
# Regenerate dependency tree
|
||||
./test_venv/bin/pip install --dry-run --ignore-installed requests
|
||||
./test_venv/bin/pip install --dry-run --ignore-installed python-dateutil
|
||||
|
||||
# Check current environment
|
||||
./test_venv/bin/pip freeze
|
||||
|
||||
# Verify test packages still available on PyPI
|
||||
./test_venv/bin/pip index versions urllib3
|
||||
./test_venv/bin/pip index versions certifi
|
||||
./test_venv/bin/pip index versions six
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Quick Reference: Package Specs
|
||||
|
||||
From actual package metadata:
|
||||
|
||||
```python
|
||||
# requests dependencies (from requests==2.32.5)
|
||||
install_requires = [
|
||||
"charset_normalizer<4,>=2",
|
||||
"idna<4,>=2.5",
|
||||
"urllib3<3,>=1.21.1",
|
||||
"certifi>=2017.4.17"
|
||||
]
|
||||
|
||||
# python-dateutil dependencies (from python-dateutil==2.9.0)
|
||||
install_requires = [
|
||||
"six>=1.5"
|
||||
]
|
||||
|
||||
# six dependencies
|
||||
install_requires = [] # No dependencies
|
||||
|
||||
# attrs dependencies
|
||||
install_requires = [] # No dependencies
|
||||
|
||||
# packaging dependencies
|
||||
install_requires = [] # No dependencies
|
||||
```
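If needed, these requirement strings can be re-read from installed package metadata instead of being copied by hand (a small sketch; it assumes the packages are installed in the interpreter running it):

```python
from importlib.metadata import requires

# Prints the raw requirement strings recorded in each distribution's metadata,
# e.g. "idna<4,>=2.5" for requests; packages with no dependencies print None.
for name in ("requests", "python-dateutil", "six"):
    print(name, requires(name))
```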
|
||||
|
||||
---
|
||||
|
||||
## Version Compatibility Table
|
||||
|
||||
| Package | Minimum | Maximum | Current Test | Latest | Notes |
|---------|---------|---------|--------------|--------|-------|
| urllib3 | 1.21.1 | <3.0 | 1.26.15 | 2.5.0 | Major version jump possible |
| certifi | 2017.4.17 | (none) | 2023.7.22 | 2025.8.3 | Always backward compatible |
| charset-normalizer | 2.0 | <4.0 | 3.2.0 | 3.4.3 | Within major version |
| six | 1.5 | (none) | 1.16.0 | 1.17.0 | Very stable |
| idna | 2.5 | <4.0 | (new) | 3.10 | Added by requests |
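The "Current Test" column can be checked against the spec ranges mechanically; a sketch using the `packaging` library, with values copied from the table above:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# (current test version, dependency spec) pairs from the table above
COMPAT = {
    "urllib3": ("1.26.15", ">=1.21.1,<3"),
    "certifi": ("2023.7.22", ">=2017.4.17"),
    "charset-normalizer": ("3.2.0", ">=2,<4"),
    "six": ("1.16.0", ">=1.5"),
    "idna": ("3.10", ">=2.5,<4"),
}

for name, (current, spec) in COMPAT.items():
    assert Version(current) in SpecifierSet(spec), f"{name} {current} violates {spec}"
print("all current test versions satisfy their dependency specs")
```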
|
||||
|
||||
---
|
||||
|
||||
## See Also
|
||||
|
||||
- **DEPENDENCY_ANALYSIS.md** - Detailed analysis methodology
|
||||
- **TEST_SCENARIOS.md** - Complete test scenario specifications
|
||||
- **requirements-test-base.txt** - Base environment packages
|
||||
- **README.md** - Test suite overview and usage
|
||||
305
tests/common/pip_util/README.md
Normal file
@ -0,0 +1,305 @@
|
||||
# pip_util Integration Tests
|
||||
|
||||
Real integration tests for `pip_util.py` using actual PyPI packages and pip operations.
|
||||
|
||||
## Overview
|
||||
|
||||
These tests use a **real isolated venv** to verify pip_util behavior with actual package installations, deletions, and version changes. No mocks - real pip operations only.
|
||||
|
||||
## Quick Start
|
||||
|
||||
### 1. Setup Test Environment
|
||||
|
||||
```bash
|
||||
cd tests/common/pip_util
|
||||
./setup_test_env.sh
|
||||
```
|
||||
|
||||
This creates `test_venv/` with base packages:
|
||||
- urllib3==1.26.15
|
||||
- certifi==2023.7.22
|
||||
- charset-normalizer==3.2.0
|
||||
- colorama==0.4.6
|
||||
- six==1.16.0
|
||||
- attrs==23.1.0
|
||||
- packaging==23.1
|
||||
- pytest (latest)
|
||||
|
||||
### 2. Run Tests
|
||||
|
||||
```bash
|
||||
# Run all integration tests
|
||||
pytest -v --override-ini="addopts="
|
||||
|
||||
# Run specific test
|
||||
pytest test_dependency_protection.py -v --override-ini="addopts="
|
||||
|
||||
# Run with markers
|
||||
pytest -m integration -v --override-ini="addopts="
|
||||
```
|
||||
|
||||
## Test Architecture
|
||||
|
||||
### Real venv Integration
|
||||
|
||||
- **No subprocess mocking** - uses real pip install/uninstall
|
||||
- **Isolated test venv** - prevents system contamination
|
||||
- **Automatic cleanup** - `reset_test_venv` fixture restores state after each test
|
||||
|
||||
### Test Fixtures
|
||||
|
||||
**venv Management**:
|
||||
- `test_venv_path` - Path to test venv (session scope)
|
||||
- `test_pip_cmd` - pip command for test venv
|
||||
- `reset_test_venv` - Restore venv to initial state after each test
|
||||
|
||||
**Helpers**:
|
||||
- `get_installed_packages()` - Get current venv packages
|
||||
- `install_packages(*packages)` - Install packages in test venv
|
||||
- `uninstall_packages(*packages)` - Uninstall packages in test venv
|
||||
|
||||
**Policy Configuration** (a usage sketch follows this list):
|
||||
- `temp_policy_dir` - Temporary directory for base policies
|
||||
- `temp_user_policy_dir` - Temporary directory for user policies
|
||||
- `mock_manager_util` - Mock manager_util paths to use temp dirs
|
||||
- `mock_context` - Mock context paths to use temp dirs
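A rough sketch of how these policy fixtures compose in a test; the loader name `get_pip_policy` is an assumption here, and the per-test policy-cache reset comes from the autouse `setup_pip_util` fixture described further down:

```python
import json


def test_policy_is_loaded(temp_policy_dir, mock_manager_util, mock_context):
    # Write a base policy into the temp dir that mock_manager_util points at.
    policy = {
        "requests": {
            "apply_all_matches": [
                {"type": "pin_dependencies",
                 "pinned_packages": ["urllib3", "certifi"]}
            ]
        }
    }
    (temp_policy_dir / "pip-policy.json").write_text(json.dumps(policy))

    from comfyui_manager.common import pip_util
    loaded = pip_util.get_pip_policy()  # assumed loader name
    assert "requests" in loaded
```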
|
||||
|
||||
## Test Scenarios
|
||||
|
||||
### Scenario 1: Dependency Version Protection
|
||||
**File**: `test_dependency_protection.py::test_dependency_version_protection_with_pin`
|
||||
|
||||
**Initial State**:
|
||||
```python
|
||||
urllib3==1.26.15
|
||||
certifi==2023.7.22
|
||||
charset-normalizer==3.2.0
|
||||
```
|
||||
|
||||
**Action**: Install `requests` with pin_dependencies policy
|
||||
|
||||
**Expected Result**:
|
||||
```python
|
||||
# Dependencies stay at old versions (protected by pin)
|
||||
urllib3==1.26.15 # NOT upgraded to 2.x
|
||||
certifi==2023.7.22 # NOT upgraded
|
||||
charset-normalizer==3.2.0 # NOT upgraded
|
||||
requests==2.31.0 # newly installed
|
||||
```
|
||||
|
||||
### Scenario 2: Click-Colorama Dependency Chain
|
||||
**File**: `test_dependency_protection.py::test_dependency_chain_with_click_colorama`
|
||||
|
||||
**Initial State**:
|
||||
```python
|
||||
colorama==0.4.6
|
||||
```
|
||||
|
||||
**Action**: Install `click` with force_version + pin_dependencies
|
||||
|
||||
**Expected Result**:
|
||||
```python
|
||||
colorama==0.4.6 # PINNED
|
||||
click==8.1.3 # FORCED to specific version
|
||||
```
|
||||
|
||||
### Scenario 3: Package Deletion and Restore
|
||||
**File**: `test_environment_recovery.py::test_package_deletion_and_restore`
|
||||
|
||||
**Initial State**:
|
||||
```python
|
||||
six==1.16.0
|
||||
attrs==23.1.0
|
||||
packaging==23.1
|
||||
```
|
||||
|
||||
**Action**: Delete `six` → call `batch.ensure_installed()`
|
||||
|
||||
**Expected Result**:
|
||||
```python
|
||||
six==1.16.0 # RESTORED to required version
|
||||
```
|
||||
|
||||
### Scenario 4: Version Change and Restore
|
||||
**File**: `test_environment_recovery.py::test_version_change_and_restore`
|
||||
|
||||
**Initial State**:
|
||||
```python
|
||||
urllib3==1.26.15
|
||||
```
|
||||
|
||||
**Action**: Upgrade `urllib3` to 2.1.0 → call `batch.ensure_installed()`
|
||||
|
||||
**Expected Result**:
|
||||
```python
|
||||
urllib3==1.26.15 # RESTORED to required version (downgraded)
|
||||
```
|
||||
|
||||
## Test Categories
|
||||
|
||||
### Priority 1 (Essential) ✅ ALL PASSING
|
||||
- ✅ Dependency version protection (enhanced with exact versions)
|
||||
- ✅ Package deletion and restore (enhanced with exact versions)
|
||||
- ✅ Version change and restore (enhanced with downgrade verification)
|
||||
- ✅ Pin only affects specified packages ✨ NEW
|
||||
- ✅ Major version jump prevention ✨ NEW
|
||||
|
||||
### Priority 2 (Important)
|
||||
- ✅ Complex dependency chains (python-dateutil + six)
|
||||
- ⏳ Full workflow integration (TODO: update to real venv)
|
||||
- ⏳ Pin failure retry (TODO: update to real venv)
|
||||
|
||||
### Priority 3 (Edge Cases)
|
||||
- ⏳ Platform conditions (TODO: update to real venv)
|
||||
- ⏳ Policy priority (TODO: update to real venv)
|
||||
- ⏳ Unit tests (no venv needed)
|
||||
- ⏳ Edge cases (no venv needed)
|
||||
|
||||
## Package Selection
|
||||
|
||||
All test packages are **real PyPI packages < 200KB**:
|
||||
|
||||
| Package | Size | Version | Purpose |
|
||||
|---------|------|---------|---------|
|
||||
| **urllib3** | ~100KB | 1.26.15 | Protected dependency (prevent 2.x upgrade) |
|
||||
| **certifi** | ~10KB | 2023.7.22 | SSL certificates (pinned) |
|
||||
| **charset-normalizer** | ~46KB | 3.2.0 | Charset detection (pinned) |
|
||||
| **requests** | ~100KB | 2.31.0 | Main package to install |
|
||||
| **colorama** | ~25KB | 0.4.6 | Terminal colors (pinned) |
|
||||
| **click** | ~90KB | 8.1.3 | CLI framework (forced version) |
|
||||
| **six** | ~11KB | 1.16.0 | Python 2/3 compatibility (restore) |
|
||||
| **attrs** | ~61KB | 23.1.0 | Bystander package |
|
||||
| **packaging** | ~48KB | 23.1 | Bystander package |
|
||||
|
||||
## Cleanup
|
||||
|
||||
### Manual Cleanup
|
||||
```bash
|
||||
# Remove test venv
|
||||
rm -rf test_venv/
|
||||
|
||||
# Recreate fresh venv
|
||||
./setup_test_env.sh
|
||||
```
|
||||
|
||||
### Automatic Cleanup
|
||||
The `reset_test_venv` fixture automatically:
|
||||
1. Records initial package state
|
||||
2. Runs test
|
||||
3. Removes all packages (except pip/setuptools/wheel)
|
||||
4. Reinstalls initial packages
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Error: "Test venv not found"
|
||||
**Solution**: Run `./setup_test_env.sh`
|
||||
|
||||
### Error: "Package not installed in initial state"
|
||||
**Solution**: Check `requirements-test-base.txt` and recreate venv
|
||||
|
||||
### Tests are slow
|
||||
**Reason**: Real pip operations take 2-3 seconds per test
|
||||
**This is expected** - we're doing actual pip install/uninstall
|
||||
|
||||
## Implementation Details
|
||||
|
||||
### How reset_test_venv Works
|
||||
|
||||
```python
|
||||
@pytest.fixture
|
||||
def reset_test_venv(test_pip_cmd):
|
||||
# 1. Record initial state
|
||||
initial = subprocess.run(test_pip_cmd + ["freeze"], ...)
|
||||
|
||||
yield # Run test here
|
||||
|
||||
# 2. Remove all packages
|
||||
current = subprocess.run(test_pip_cmd + ["freeze"], ...)
|
||||
subprocess.run(test_pip_cmd + ["uninstall", "-y", ...], ...)
|
||||
|
||||
# 3. Restore initial state
|
||||
subprocess.run(test_pip_cmd + ["install", "-r", initial], ...)
|
||||
```
|
||||
|
||||
### How make_pip_cmd is Patched
|
||||
|
||||
```python
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup_pip_util(monkeypatch, test_pip_cmd):
|
||||
from comfyui_manager.common import pip_util
|
||||
|
||||
def make_test_pip_cmd(args: List[str]) -> List[str]:
|
||||
return test_pip_cmd + args # Use test venv pip
|
||||
|
||||
monkeypatch.setattr(
|
||||
pip_util.manager_util,
|
||||
"make_pip_cmd",
|
||||
make_test_pip_cmd
|
||||
)
|
||||
```
|
||||
|
||||
## Dependency Analysis Tool
|
||||
|
||||
Use `analyze_dependencies.py` to examine package dependencies before adding new tests:
|
||||
|
||||
```bash
|
||||
# Analyze specific package
|
||||
python analyze_dependencies.py requests
|
||||
|
||||
# Analyze all test packages
|
||||
python analyze_dependencies.py --all
|
||||
|
||||
# Show current environment
|
||||
python analyze_dependencies.py --env
|
||||
```
|
||||
|
||||
**Output includes**:
|
||||
- Latest available versions
|
||||
- Dependencies that would be installed
|
||||
- Version upgrades that would occur
|
||||
- Impact of pin constraints
|
||||
|
||||
**Example output**:
|
||||
```
|
||||
📦 Latest version: 2.32.5
|
||||
🔍 Scenario A: Install without constraints
|
||||
Would install 5 packages:
|
||||
• urllib3 1.26.15 → 2.5.0 ⚠️ UPGRADE
|
||||
|
||||
🔍 Scenario B: Install with pin constraints
|
||||
Would install 5 packages:
|
||||
• urllib3 1.26.15 (no change) 📌 PINNED
|
||||
|
||||
✅ Pin prevented 2 upgrade(s)
|
||||
```
|
||||
|
||||
## Test Statistics
|
||||
|
||||
**Current Status**: 6 tests, 100% passing
|
||||
|
||||
```
|
||||
test_dependency_version_protection_with_pin PASSED (2.28s)
|
||||
test_dependency_chain_with_six_pin PASSED (2.00s)
|
||||
test_pin_only_affects_specified_packages PASSED (2.25s) ✨ NEW
|
||||
test_major_version_jump_prevention PASSED (3.53s) ✨ NEW
|
||||
test_package_deletion_and_restore PASSED (2.25s)
|
||||
test_version_change_and_restore PASSED (2.24s)
|
||||
|
||||
Total: 14.10s
|
||||
```
|
||||
|
||||
**Test Improvements**:
|
||||
- ✅ All tests verify exact version numbers
|
||||
- ✅ All tests reference DEPENDENCY_TREE_CONTEXT.md
|
||||
- ✅ Added 2 new critical tests (pin selectivity, major version prevention)
|
||||
- ✅ Enhanced error messages with expected vs actual values
|
||||
|
||||
## Design Documents
|
||||
|
||||
- **TEST_IMPROVEMENTS.md** - Summary of test enhancements based on dependency context
|
||||
- **DEPENDENCY_TREE_CONTEXT.md** - Verified dependency trees for all test packages
|
||||
- **DEPENDENCY_ANALYSIS.md** - Dependency analysis methodology
|
||||
- **CONTEXT_FILES_GUIDE.md** - Guide for using context files
|
||||
- **TEST_SCENARIOS.md** - Detailed test scenario specifications
|
||||
- **pip_util.test-design.md** - Test design and architecture
|
||||
- **pip_util.design.en.md** - pip_util design documentation
|
||||
433
tests/common/pip_util/TEST_IMPROVEMENTS.md
Normal file
@ -0,0 +1,433 @@
|
||||
# Test Code Improvements Based on Dependency Context
|
||||
|
||||
**Date**: 2025-10-01
|
||||
**Basis**: DEPENDENCY_TREE_CONTEXT.md analysis
|
||||
|
||||
This document summarizes all test improvements made using verified dependency tree information.
|
||||
|
||||
---
|
||||
|
||||
## Summary of Changes
|
||||
|
||||
### Tests Enhanced
|
||||
|
||||
| Test File | Tests Modified | Tests Added | Total Tests |
|
||||
|-----------|----------------|-------------|-------------|
|
||||
| `test_dependency_protection.py` | 2 | 2 | 4 |
|
||||
| `test_environment_recovery.py` | 2 | 0 | 2 |
|
||||
| **Total** | **4** | **2** | **6** |
|
||||
|
||||
### Test Results
|
||||
|
||||
```bash
|
||||
$ pytest test_dependency_protection.py test_environment_recovery.py -v
|
||||
|
||||
test_dependency_protection.py::test_dependency_version_protection_with_pin PASSED
|
||||
test_dependency_protection.py::test_dependency_chain_with_six_pin PASSED
|
||||
test_dependency_protection.py::test_pin_only_affects_specified_packages PASSED ✨ NEW
|
||||
test_dependency_protection.py::test_major_version_jump_prevention PASSED ✨ NEW
|
||||
test_environment_recovery.py::test_package_deletion_and_restore PASSED
|
||||
test_environment_recovery.py::test_version_change_and_restore PASSED
|
||||
|
||||
6 passed in 14.10s
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Detailed Improvements
|
||||
|
||||
### 1. test_dependency_version_protection_with_pin
|
||||
|
||||
**File**: `test_dependency_protection.py:34-94`
|
||||
|
||||
**Enhancements**:
|
||||
- ✅ Added exact version assertions based on DEPENDENCY_TREE_CONTEXT.md
|
||||
- ✅ Verified initial versions: urllib3==1.26.15, certifi==2023.7.22, charset-normalizer==3.2.0
|
||||
- ✅ Added verification that idna is NOT pre-installed
|
||||
- ✅ Added assertion that idna==3.10 is installed as NEW dependency
|
||||
- ✅ Verified requests==2.32.5 is installed
|
||||
- ✅ Added detailed error messages explaining what versions are expected and why
|
||||
|
||||
**Key Assertions Added**:
|
||||
```python
|
||||
# Verify expected OLD versions
|
||||
assert initial_urllib3 == "1.26.15", f"Expected urllib3==1.26.15, got {initial_urllib3}"
|
||||
assert initial_certifi == "2023.7.22", f"Expected certifi==2023.7.22, got {initial_certifi}"
|
||||
assert initial_charset == "3.2.0", f"Expected charset-normalizer==3.2.0, got {initial_charset}"
|
||||
|
||||
# Verify idna is NOT installed initially
|
||||
assert "idna" not in initial, "idna should not be pre-installed"
|
||||
|
||||
# Verify new dependency was added (idna is NOT pinned, so it gets installed)
|
||||
assert "idna" in final_packages, "idna should be installed as new dependency"
|
||||
assert final_packages["idna"] == "3.10", f"Expected idna==3.10, got {final_packages['idna']}"
|
||||
```
|
||||
|
||||
**Based on Context**:
|
||||
- DEPENDENCY_TREE_CONTEXT.md Section 1: requests → Dependencies
|
||||
- Verified: Without pin, urllib3 would upgrade to 2.5.0 (MAJOR version jump)
|
||||
- Verified: idna is NEW dependency (not in requirements-test-base.txt)
|
||||
|
||||
---
|
||||
|
||||
### 2. test_dependency_chain_with_six_pin
|
||||
|
||||
**File**: `test_dependency_protection.py:117-162`
|
||||
|
||||
**Enhancements**:
|
||||
- ✅ Added exact version assertion for six==1.16.0
|
||||
- ✅ Added exact version assertion for python-dateutil==2.9.0.post0
|
||||
- ✅ Added detailed error messages
|
||||
- ✅ Added docstring reference to DEPENDENCY_TREE_CONTEXT.md
|
||||
|
||||
**Key Assertions Added**:
|
||||
```python
|
||||
# Verify expected OLD version
|
||||
assert initial_six == "1.16.0", f"Expected six==1.16.0, got {initial_six}"
|
||||
|
||||
# Verify final versions
|
||||
assert final_packages["python-dateutil"] == "2.9.0.post0", f"Expected python-dateutil==2.9.0.post0"
|
||||
assert final_packages["six"] == "1.16.0", "six should remain at 1.16.0 (prevented 1.17.0 upgrade)"
|
||||
```
|
||||
|
||||
**Based on Context**:
|
||||
- DEPENDENCY_TREE_CONTEXT.md Section 2: python-dateutil → Dependencies
|
||||
- Verified: six is a REAL dependency (not optional like colorama)
|
||||
- Verified: Without pin, six would upgrade from 1.16.0 to 1.17.0
|
||||
|
||||
---
|
||||
|
||||
### 3. test_pin_only_affects_specified_packages ✨ NEW
|
||||
|
||||
**File**: `test_dependency_protection.py:165-208`
|
||||
|
||||
**Purpose**: Verify that pin is selective, not global
|
||||
|
||||
**Test Logic**:
|
||||
1. Verify idna is NOT pre-installed
|
||||
2. Verify requests is NOT pre-installed
|
||||
3. Install requests with pin policy (only pins urllib3, certifi, charset-normalizer)
|
||||
4. Verify idna was installed at latest version (3.10) - NOT pinned
|
||||
5. Verify requests was installed at expected version (2.32.5)
|
||||
|
||||
**Key Assertions**:
|
||||
```python
|
||||
# Verify idna was installed (NOT pinned, so gets latest)
|
||||
assert "idna" in final_packages, "idna should be installed as new dependency"
|
||||
assert final_packages["idna"] == "3.10", "idna should be at latest version 3.10 (not pinned)"
|
||||
```
|
||||
|
||||
**Based on Context**:
|
||||
- DEPENDENCY_TREE_CONTEXT.md: "⚠️ idna is NEW and NOT pinned (acceptable - new dependency)"
|
||||
- Verified: Pin only affects specified packages in pinned_packages list
|
||||
|
||||
---
|
||||
|
||||
### 4. test_major_version_jump_prevention ✨ NEW
|
||||
|
||||
**File**: `test_dependency_protection.py:211-271`
|
||||
|
||||
**Purpose**: Verify that pin prevents MAJOR version jumps with breaking changes
|
||||
|
||||
**Test Logic**:
|
||||
1. Verify initial urllib3==1.26.15
|
||||
2. **Test WITHOUT pin**: Uninstall deps, install requests → urllib3 upgrades to 2.x
|
||||
3. Verify urllib3 was upgraded to 2.x (starts with "2.")
|
||||
4. Reset environment
|
||||
5. **Test WITH pin**: Install requests with pin → urllib3 stays at 1.x
|
||||
6. Verify urllib3 stayed at 1.26.15 (starts with "1.")
|
||||
|
||||
**Key Assertions**:
|
||||
```python
|
||||
# Without pin - verify urllib3 upgrades to 2.x
|
||||
assert without_pin["urllib3"].startswith("2."), \
|
||||
f"Without pin, urllib3 should upgrade to 2.x, got {without_pin['urllib3']}"
|
||||
|
||||
# With pin - verify urllib3 stays at 1.x
|
||||
assert final_packages["urllib3"] == "1.26.15", \
|
||||
"Pin should prevent urllib3 from upgrading to 2.x (breaking changes)"
|
||||
assert final_packages["urllib3"].startswith("1."), \
|
||||
f"urllib3 should remain at 1.x series, got {final_packages['urllib3']}"
|
||||
```
|
||||
|
||||
**Based on Context**:
|
||||
- DEPENDENCY_TREE_CONTEXT.md: "urllib3 1.26.15 → 2.5.0 is a MAJOR version jump"
|
||||
- DEPENDENCY_TREE_CONTEXT.md: "urllib3 2.0 removed deprecated APIs"
|
||||
- This is the MOST IMPORTANT test - prevents breaking changes
|
||||
|
||||
---
|
||||
|
||||
### 5. test_package_deletion_and_restore
|
||||
|
||||
**File**: `test_environment_recovery.py:33-78`
|
||||
|
||||
**Enhancements**:
|
||||
- ✅ Added exact version assertion for six==1.16.0
|
||||
- ✅ Added verification that six is restored to EXACT version (not latest)
|
||||
- ✅ Added detailed error messages
|
||||
- ✅ Added docstring reference to DEPENDENCY_TREE_CONTEXT.md
|
||||
|
||||
**Key Assertions Added**:
|
||||
```python
|
||||
# Verify six is initially installed at expected version
|
||||
assert initial["six"] == "1.16.0", f"Expected six==1.16.0, got {initial['six']}"
|
||||
|
||||
# Verify six was restored to EXACT required version (not latest)
|
||||
assert final_packages["six"] == "1.16.0", \
|
||||
"six should be restored to exact version 1.16.0 (not 1.17.0 latest)"
|
||||
```
|
||||
|
||||
**Based on Context**:
|
||||
- DEPENDENCY_TREE_CONTEXT.md: "six: 1.16.0 (OLD) → 1.17.0 (LATEST)"
|
||||
- Verified: Restore policy restores to EXACT version, not latest
|
||||
|
||||
---
|
||||
|
||||
### 6. test_version_change_and_restore
|
||||
|
||||
**File**: `test_environment_recovery.py:105-158`
|
||||
|
||||
**Enhancements**:
|
||||
- ✅ Added exact version assertions (1.26.15 initially, 2.1.0 after upgrade)
|
||||
- ✅ Added verification of major version change (1.x → 2.x)
|
||||
- ✅ Added verification of major version downgrade (2.x → 1.x)
|
||||
- ✅ Added detailed error messages explaining downgrade capability
|
||||
- ✅ Added docstring reference to DEPENDENCY_TREE_CONTEXT.md
|
||||
|
||||
**Key Assertions Added**:
|
||||
```python
|
||||
# Verify version was changed to 2.x
|
||||
assert installed_after["urllib3"] == "2.1.0", \
|
||||
f"urllib3 should be upgraded to 2.1.0, got {installed_after['urllib3']}"
|
||||
assert installed_after["urllib3"].startswith("2."), \
|
||||
"urllib3 should be at 2.x series"
|
||||
|
||||
# Verify version was DOWNGRADED from 2.x back to 1.x
|
||||
assert final["urllib3"] == "1.26.15", \
|
||||
"urllib3 should be downgraded to 1.26.15 (from 2.1.0)"
|
||||
assert final["urllib3"].startswith("1."), \
|
||||
f"urllib3 should be back at 1.x series, got {final['urllib3']}"
|
||||
```
|
||||
|
||||
**Based on Context**:
|
||||
- DEPENDENCY_TREE_CONTEXT.md: "urllib3 can upgrade from 1.26.15 (1.x) to 2.5.0 (2.x)"
|
||||
- Verified: Restore policy can DOWNGRADE (not just prevent upgrades)
|
||||
- Tests actual version downgrade capability (2.x → 1.x)
|
||||
|
||||
---
|
||||
|
||||
## Test Coverage Analysis
|
||||
|
||||
### Before Improvements
|
||||
|
||||
| Scenario | Coverage |
|
||||
|----------|----------|
|
||||
| Pin prevents upgrades | ✅ Basic |
|
||||
| New dependencies installed | ❌ Not tested |
|
||||
| Pin is selective | ❌ Not tested |
|
||||
| Major version jump prevention | ❌ Not tested |
|
||||
| Exact version restoration | ❌ Not tested |
|
||||
| Version downgrade capability | ❌ Not tested |
|
||||
|
||||
### After Improvements
|
||||
|
||||
| Scenario | Coverage | Test |
|
||||
|----------|----------|------|
|
||||
| Pin prevents upgrades | ✅ Enhanced | test_dependency_version_protection_with_pin |
|
||||
| New dependencies installed | ✅ Added | test_dependency_version_protection_with_pin |
|
||||
| Pin is selective | ✅ Added | test_pin_only_affects_specified_packages |
|
||||
| Major version jump prevention | ✅ Added | test_major_version_jump_prevention |
|
||||
| Exact version restoration | ✅ Enhanced | test_package_deletion_and_restore |
|
||||
| Version downgrade capability | ✅ Enhanced | test_version_change_and_restore |
|
||||
|
||||
---
|
||||
|
||||
## Key Testing Principles Applied
|
||||
|
||||
### 1. Exact Version Verification
|
||||
|
||||
**Before**:
|
||||
```python
|
||||
assert final_packages["urllib3"] == initial_urllib3 # Generic
|
||||
```
|
||||
|
||||
**After**:
|
||||
```python
|
||||
assert initial_urllib3 == "1.26.15", f"Expected urllib3==1.26.15, got {initial_urllib3}"
|
||||
assert final_packages["urllib3"] == "1.26.15", "urllib3 should remain at 1.26.15 (prevented 2.x upgrade)"
|
||||
```
|
||||
|
||||
**Benefit**: Fails with clear message if environment setup is wrong
|
||||
|
||||
---
|
||||
|
||||
### 2. Version Series Verification
|
||||
|
||||
**Added**:
|
||||
```python
|
||||
assert final_packages["urllib3"].startswith("1."), \
|
||||
f"urllib3 should remain at 1.x series, got {final_packages['urllib3']}"
|
||||
```
|
||||
|
||||
**Benefit**: Catches major version jumps even if exact version changes
|
||||
|
||||
---
|
||||
|
||||
### 3. Negative Testing (Verify NOT Installed)
|
||||
|
||||
**Added**:
|
||||
```python
|
||||
assert "idna" not in initial, "idna should not be pre-installed"
|
||||
```
|
||||
|
||||
**Benefit**: Ensures test environment is in expected state
|
||||
|
||||
---
|
||||
|
||||
### 4. Context-Based Documentation
|
||||
|
||||
**Every test now includes**:
|
||||
```python
|
||||
"""
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
<specific section reference>
|
||||
<expected behavior from context>
|
||||
"""
|
||||
```
|
||||
|
||||
**Benefit**: Links test expectations to verified dependency data
|
||||
|
||||
---
|
||||
|
||||
## Real-World Scenarios Tested
|
||||
|
||||
### Scenario 1: Preventing Breaking Changes
|
||||
|
||||
**Test**: `test_major_version_jump_prevention`
|
||||
|
||||
**Real-World Impact**:
|
||||
- urllib3 2.0 removed deprecated APIs
|
||||
- Many applications break when upgrading from 1.x to 2.x
|
||||
- Pin prevents this automatic breaking change
|
||||
|
||||
**Verified**: ✅ Pin successfully prevents 1.x → 2.x upgrade
|
||||
|
||||
---
|
||||
|
||||
### Scenario 2: Allowing New Dependencies
|
||||
|
||||
**Test**: `test_pin_only_affects_specified_packages`
|
||||
|
||||
**Real-World Impact**:
|
||||
- New dependencies are safe to add (idna)
|
||||
- Pin should not block ALL changes
|
||||
- Only specified packages are protected
|
||||
|
||||
**Verified**: ✅ idna installs at 3.10 even with pin policy active
|
||||
|
||||
---
|
||||
|
||||
### Scenario 3: Version Downgrade Recovery
|
||||
|
||||
**Test**: `test_version_change_and_restore`
|
||||
|
||||
**Real-World Impact**:
|
||||
- Sometimes packages get upgraded accidentally
|
||||
- Need to downgrade to known-good version
|
||||
- Downgrade is harder than upgrade prevention
|
||||
|
||||
**Verified**: ✅ Can downgrade urllib3 from 2.x to 1.x
|
||||
|
||||
---
|
||||
|
||||
## Test Execution Performance
|
||||
|
||||
```
|
||||
Test Performance Summary:
|
||||
|
||||
test_dependency_version_protection_with_pin 2.28s (enhanced)
|
||||
test_dependency_chain_with_six_pin 2.00s (enhanced)
|
||||
test_pin_only_affects_specified_packages 2.25s (NEW)
|
||||
test_major_version_jump_prevention 3.53s (NEW - does 2 install cycles)
|
||||
test_package_deletion_and_restore 2.25s (enhanced)
|
||||
test_version_change_and_restore 2.24s (enhanced)
|
||||
|
||||
Total: 14.10s for 6 tests
|
||||
Average: 2.35s per test
|
||||
```
|
||||
|
||||
**Note**: `test_major_version_jump_prevention` is slower because it tests both WITH and WITHOUT pin (2 install cycles).
|
||||
|
||||
---
|
||||
|
||||
## Files Modified
|
||||
|
||||
1. **test_dependency_protection.py**: +138 lines
|
||||
- Enhanced 2 existing tests
|
||||
- Added 2 new tests
|
||||
- Total: 272 lines (was 132 lines)
|
||||
|
||||
2. **test_environment_recovery.py**: +35 lines
|
||||
- Enhanced 2 existing tests
|
||||
- Total: 159 lines (was 141 lines)
|
||||
|
||||
---
|
||||
|
||||
## Verification Against Context
|
||||
|
||||
All test improvements verified against:
|
||||
|
||||
| Context Source | Usage |
|
||||
|----------------|-------|
|
||||
| **DEPENDENCY_TREE_CONTEXT.md** | All version numbers, dependency trees |
|
||||
| **DEPENDENCY_ANALYSIS.md** | Package selection rationale, rejected scenarios |
|
||||
| **TEST_SCENARIOS.md** | Scenario specifications, expected outcomes |
|
||||
| **requirements-test-base.txt** | Initial environment state |
|
||||
| **analyze_dependencies.py** | Real-time verification of expectations |
|
||||
|
||||
---
|
||||
|
||||
## Future Maintenance
|
||||
|
||||
### When to Update Tests
|
||||
|
||||
Update tests when:
|
||||
- ✅ PyPI releases new major versions (e.g., urllib3 3.0)
|
||||
- ✅ Base package versions change in requirements-test-base.txt
|
||||
- ✅ New test scenarios added to DEPENDENCY_TREE_CONTEXT.md
|
||||
- ✅ Policy behavior changes in pip_util.py
|
||||
|
||||
### How to Update Tests
|
||||
|
||||
1. Run `python analyze_dependencies.py --all`
|
||||
2. Update expected version numbers in tests
|
||||
3. Update DEPENDENCY_TREE_CONTEXT.md
|
||||
4. Update TEST_SCENARIOS.md
|
||||
5. Run tests to verify
|
||||
|
||||
### Verification Commands
|
||||
|
||||
```bash
|
||||
# Verify environment
|
||||
python analyze_dependencies.py --env
|
||||
|
||||
# Verify package dependencies
|
||||
python analyze_dependencies.py requests
|
||||
python analyze_dependencies.py python-dateutil
|
||||
|
||||
# Run all tests
|
||||
pytest test_dependency_protection.py test_environment_recovery.py -v --override-ini="addopts="
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Summary
|
||||
|
||||
✅ **6 tests** now verify real PyPI package dependencies
|
||||
✅ **100% pass rate** with real pip operations
|
||||
✅ **All version numbers** verified against DEPENDENCY_TREE_CONTEXT.md
|
||||
✅ **Major version jump prevention** explicitly tested
|
||||
✅ **Selective pinning** verified (only specified packages)
|
||||
✅ **Version downgrade** capability tested
|
||||
|
||||
**Key Achievement**: Tests now verify actual PyPI behavior, not mocked expectations.
|
||||
573
tests/common/pip_util/TEST_SCENARIOS.md
Normal file
@ -0,0 +1,573 @@
|
||||
# pip_util Test Scenarios - Test Data Specification
|
||||
|
||||
This document precisely defines all test scenarios, packages, versions, and expected behaviors used in the pip_util test suite.
|
||||
|
||||
## Table of Contents
|
||||
1. [Test Scenario 1: Dependency Version Protection](#scenario-1-dependency-version-protection)
|
||||
2. [Test Scenario 2: Complex Dependency Chain](#scenario-2-complex-dependency-chain)
|
||||
3. [Test Scenario 3: Package Deletion and Restore](#scenario-3-package-deletion-and-restore)
|
||||
4. [Test Scenario 4: Version Change and Restore](#scenario-4-version-change-and-restore)
|
||||
5. [Test Scenario 5: Full Workflow Integration](#scenario-5-full-workflow-integration)
|
||||
6. [Test Scenario 6: Pin Failure Retry](#scenario-6-pin-failure-retry)
|
||||
|
||||
---
|
||||
|
||||
## Scenario 1: Dependency Version Protection
|
||||
|
||||
**File**: `test_dependency_protection.py::test_dependency_version_protection_with_pin`
|
||||
|
||||
**Purpose**: Verify that `pin_dependencies` policy prevents dependency upgrades during package installation.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "1.26.15", # OLD stable version
|
||||
"certifi": "2023.7.22", # OLD version
|
||||
"charset-normalizer": "3.2.0" # OLD version
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"requests": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["urllib3", "certifi", "charset-normalizer"],
|
||||
"on_failure": "retry_without_pin"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Action
|
||||
```python
|
||||
batch.install("requests")
|
||||
```
|
||||
|
||||
### Expected pip Command
|
||||
```bash
|
||||
pip install requests urllib3==1.26.15 certifi==2023.7.22 charset-normalizer==3.2.0
|
||||
```
|
||||
|
||||
### Expected Final State
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "1.26.15", # PROTECTED - stayed at old version
|
||||
"certifi": "2023.7.22", # PROTECTED - stayed at old version
|
||||
"charset-normalizer": "3.2.0", # PROTECTED - stayed at old version
|
||||
"requests": "2.31.0" # NEWLY installed
|
||||
}
|
||||
```
|
||||
|
||||
### Without Pin (What Would Happen)
|
||||
```python
|
||||
# If pin_dependencies was NOT used:
|
||||
installed_packages = {
|
||||
"urllib3": "2.1.0", # UPGRADED to 2.x (breaking change)
|
||||
"certifi": "2024.2.2", # UPGRADED to latest
|
||||
"charset-normalizer": "3.3.2", # UPGRADED to latest
|
||||
"requests": "2.31.0"
|
||||
}
|
||||
```
|
||||
|
||||
**Key Point**: Pin prevents `urllib3` from upgrading to 2.x, which has breaking API changes.
|
||||
|
||||
---
|
||||
|
||||
## Scenario 2: Complex Dependency Chain
|
||||
|
||||
**File**: `test_dependency_protection.py::test_dependency_chain_with_click_colorama`
|
||||
|
||||
**Purpose**: Verify that `force_version` + `pin_dependencies` work together correctly.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"colorama": "0.4.6" # Existing dependency
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"click": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "colorama",
|
||||
"spec": "<0.5.0"
|
||||
},
|
||||
"type": "force_version",
|
||||
"version": "8.1.3",
|
||||
"reason": "click 8.1.3 compatible with colorama <0.5"
|
||||
}
|
||||
],
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["colorama"]
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Condition Evaluation
|
||||
```python
|
||||
# Check: colorama installed AND version < 0.5.0?
|
||||
colorama_installed = True
|
||||
colorama_version = "0.4.6" # 0.4.6 < 0.5.0 → True
|
||||
# Result: Condition satisfied → apply force_version
|
||||
```
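How an `installed` condition with a version `spec` like this can be evaluated is sketched below with the `packaging` library (the actual pip_util implementation may differ):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version


def installed_condition_met(installed_version: str, spec: str) -> bool:
    # "0.4.6" against "<0.5.0" -> True, so the force_version policy applies
    return Version(installed_version) in SpecifierSet(spec)


assert installed_condition_met("0.4.6", "<0.5.0")
assert not installed_condition_met("0.5.2", "<0.5.0")
```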
|
||||
|
||||
### Action
|
||||
```python
|
||||
batch.install("click")
|
||||
```
|
||||
|
||||
### Expected pip Command
|
||||
```bash
|
||||
pip install click==8.1.3 colorama==0.4.6
|
||||
```
|
||||
|
||||
### Expected Final State
|
||||
```python
|
||||
installed_packages = {
|
||||
"colorama": "0.4.6", # PINNED - version protected
|
||||
"click": "8.1.3" # FORCED to specific version
|
||||
}
|
||||
```
|
||||
|
||||
**Key Point**:
|
||||
- `force_version` forces click to install version 8.1.3
|
||||
- `pin_dependencies` ensures colorama stays at 0.4.6
|
||||
|
||||
---
|
||||
|
||||
## Scenario 3: Package Deletion and Restore
|
||||
|
||||
**File**: `test_environment_recovery.py::test_package_deletion_and_restore`
|
||||
|
||||
**Purpose**: Verify that deleted packages can be restored to required versions.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"six": "1.16.0", # Critical package
|
||||
"attrs": "23.1.0",
|
||||
"packaging": "23.1"
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"six": {
|
||||
"restore": [
|
||||
{
|
||||
"target": "six",
|
||||
"version": "1.16.0",
|
||||
"reason": "six must be maintained at 1.16.0 for compatibility"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Action Sequence
|
||||
|
||||
**Step 1**: Install package that removes six
|
||||
```python
|
||||
batch.install("python-dateutil")
|
||||
```
|
||||
|
||||
**Step 1 Result**: six is DELETED
|
||||
```python
|
||||
installed_packages = {
|
||||
# "six": "1.16.0", # ❌ DELETED by python-dateutil
|
||||
"attrs": "23.1.0",
|
||||
"packaging": "23.1",
|
||||
"python-dateutil": "2.8.2" # ✅ NEW
|
||||
}
|
||||
```
|
||||
|
||||
**Step 2**: Restore deleted packages
|
||||
```python
|
||||
batch.ensure_installed()
|
||||
```
|
||||
|
||||
**Step 2 Result**: six is RESTORED
|
||||
```python
|
||||
installed_packages = {
|
||||
"six": "1.16.0", # ✅ RESTORED to required version
|
||||
"attrs": "23.1.0",
|
||||
"packaging": "23.1",
|
||||
"python-dateutil": "2.8.2"
|
||||
}
|
||||
```
|
||||
|
||||
### Expected pip Commands
|
||||
```bash
|
||||
# Step 1: Install
|
||||
pip install python-dateutil
|
||||
|
||||
# Step 2: Restore
|
||||
pip install six==1.16.0
|
||||
```
|
||||
|
||||
**Key Point**: `restore` policy automatically reinstalls deleted packages.
|
||||
|
||||
---
|
||||
|
||||
## Scenario 4: Version Change and Restore
|
||||
|
||||
**File**: `test_environment_recovery.py::test_version_change_and_restore`
|
||||
|
||||
**Purpose**: Verify that packages with changed versions can be restored to required versions.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "1.26.15", # OLD version (required)
|
||||
"certifi": "2023.7.22"
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"urllib3": {
|
||||
"restore": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"spec": "!=1.26.15"
|
||||
},
|
||||
"target": "urllib3",
|
||||
"version": "1.26.15",
|
||||
"reason": "urllib3 must be 1.26.15 for compatibility"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Action Sequence
|
||||
|
||||
**Step 1**: Install package that upgrades urllib3
|
||||
```python
|
||||
batch.install("requests")
|
||||
```
|
||||
|
||||
**Step 1 Result**: urllib3 is UPGRADED
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "2.1.0", # ❌ UPGRADED from 1.26.15 to 2.1.0
|
||||
"certifi": "2023.7.22",
|
||||
"requests": "2.31.0" # ✅ NEW
|
||||
}
|
||||
```
|
||||
|
||||
**Step 2**: Check restore condition
|
||||
```python
|
||||
# Condition: urllib3 installed AND version != 1.26.15?
|
||||
urllib3_version = "2.1.0"
|
||||
condition_met = (urllib3_version != "1.26.15") # True
|
||||
# Result: Restore urllib3 to 1.26.15
|
||||
```
|
||||
|
||||
**Step 2**: Restore to required version
|
||||
```python
|
||||
batch.ensure_installed()
|
||||
```
|
||||
|
||||
**Step 2 Result**: urllib3 is DOWNGRADED
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "1.26.15", # ✅ RESTORED to required version
|
||||
"certifi": "2023.7.22",
|
||||
"requests": "2.31.0"
|
||||
}
|
||||
```
|
||||
|
||||
### Expected pip Commands
|
||||
```bash
|
||||
# Step 1: Install (causes upgrade)
|
||||
pip install requests
|
||||
|
||||
# Step 2: Restore (downgrade)
|
||||
pip install urllib3==1.26.15
|
||||
```
|
||||
|
||||
**Key Point**: `restore` with condition can revert unwanted version changes.
|
||||
|
||||
---
|
||||
|
||||
## Scenario 5: Full Workflow Integration
|
||||
|
||||
**File**: `test_full_workflow_integration.py::test_uninstall_install_restore_workflow`
|
||||
|
||||
**Purpose**: Verify complete workflow: uninstall → install → restore.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"old-package": "1.0.0", # To be removed
|
||||
"critical-package": "1.2.3", # To be restored
|
||||
"urllib3": "1.26.15",
|
||||
"certifi": "2023.7.22"
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"old-package": {
|
||||
"uninstall": [
|
||||
{
|
||||
"target": "old-package"
|
||||
}
|
||||
]
|
||||
},
|
||||
"requests": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["urllib3", "certifi"]
|
||||
}
|
||||
]
|
||||
},
|
||||
"critical-package": {
|
||||
"restore": [
|
||||
{
|
||||
"target": "critical-package",
|
||||
"version": "1.2.3"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Action Sequence
|
||||
|
||||
**Step 1**: Remove old packages
|
||||
```python
|
||||
removed = batch.ensure_not_installed()
|
||||
```
|
||||
|
||||
**Step 1 Result**:
|
||||
```python
|
||||
installed_packages = {
|
||||
# "old-package": "1.0.0", # ❌ REMOVED
|
||||
"critical-package": "1.2.3",
|
||||
"urllib3": "1.26.15",
|
||||
"certifi": "2023.7.22"
|
||||
}
|
||||
removed = ["old-package"]
|
||||
```
|
||||
|
||||
**Step 2**: Install new package with pins
|
||||
```python
|
||||
batch.install("requests")
|
||||
```
|
||||
|
||||
**Step 2 Result**:
|
||||
```python
|
||||
installed_packages = {
|
||||
"critical-package": "1.2.3",
|
||||
"urllib3": "1.26.15", # PINNED - no upgrade
|
||||
"certifi": "2023.7.22", # PINNED - no upgrade
|
||||
"requests": "2.31.0" # NEW
|
||||
}
|
||||
```
|
||||
|
||||
**Step 3**: Restore required packages
|
||||
```python
|
||||
restored = batch.ensure_installed()
|
||||
```
|
||||
|
||||
**Step 3 Result**:
|
||||
```python
|
||||
installed_packages = {
|
||||
"critical-package": "1.2.3", # Still present
|
||||
"urllib3": "1.26.15",
|
||||
"certifi": "2023.7.22",
|
||||
"requests": "2.31.0"
|
||||
}
|
||||
restored = [] # Nothing to restore (all present)
|
||||
```
|
||||
|
||||
### Expected pip Commands
|
||||
```bash
|
||||
# Step 1: Uninstall
|
||||
pip uninstall -y old-package
|
||||
|
||||
# Step 2: Install with pins
|
||||
pip install requests urllib3==1.26.15 certifi==2023.7.22
|
||||
|
||||
# Step 3: (No command - all packages present)
|
||||
```
|
||||
|
||||
**Key Point**: Complete workflow demonstrates policy coordination.
|
||||
|
||||
---
|
||||
|
||||
## Scenario 6: Pin Failure Retry
|
||||
|
||||
**File**: `test_pin_failure_retry.py::test_pin_failure_retry_without_pin_succeeds`
|
||||
|
||||
**Purpose**: Verify automatic retry without pins when installation with pins fails.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "1.26.15",
|
||||
"certifi": "2023.7.22"
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"requests": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["urllib3", "certifi"],
|
||||
"on_failure": "retry_without_pin"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Action
|
||||
```python
|
||||
batch.install("requests")
|
||||
```
|
||||
|
||||
### Attempt 1: Install WITH pins (FAILS)
|
||||
```bash
|
||||
# Command:
|
||||
pip install requests urllib3==1.26.15 certifi==2023.7.22
|
||||
|
||||
# Result: FAILURE (dependency conflict)
|
||||
# Error: "Package conflict: requests requires urllib3>=2.0"
|
||||
```
|
||||
|
||||
### Attempt 2: Retry WITHOUT pins (SUCCEEDS)
|
||||
```bash
|
||||
# Command:
|
||||
pip install requests
|
||||
|
||||
# Result: SUCCESS
|
||||
```
|
||||
|
||||
**Final State**:
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "2.1.0", # UPGRADED (pins removed)
|
||||
"certifi": "2024.2.2", # UPGRADED (pins removed)
|
||||
"requests": "2.31.0" # INSTALLED
|
||||
}
|
||||
```
|
||||
|
||||
### Expected Behavior
|
||||
1. **First attempt**: Install with pinned versions
|
||||
2. **On failure**: Log warning about conflict
|
||||
3. **Retry**: Install without pins
|
||||
4. **Success**: Package installed, dependencies upgraded
|
||||
|
||||
**Key Point**: `retry_without_pin` provides automatic fallback for compatibility issues.
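A condensed sketch of that fallback, for illustration only (the real logic lives in pip_util and builds its commands via `manager_util.make_pip_cmd`):

```python
import subprocess


def install_with_pins(package: str, pins: dict, on_failure: str = "retry_without_pin") -> None:
    # Attempt 1: the target package plus exact pins for the protected dependencies.
    pinned_cmd = ["pip", "install", package] + [f"{name}=={ver}" for name, ver in pins.items()]
    try:
        subprocess.run(pinned_cmd, check=True)
    except subprocess.CalledProcessError:
        if on_failure != "retry_without_pin":
            raise  # "fail": propagate the error and leave the environment as-is
        # Attempt 2: drop the pins and let the resolver upgrade dependencies as needed.
        subprocess.run(["pip", "install", package], check=True)
```

The same sketch also covers Scenario 6b below: with `on_failure: fail` the first error simply propagates.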
|
||||
|
||||
---
|
||||
|
||||
## Scenario 6b: Pin Failure with Hard Fail
|
||||
|
||||
**File**: `test_pin_failure_retry.py::test_pin_failure_with_fail_raises_exception`
|
||||
|
||||
**Purpose**: Verify that `on_failure: fail` raises exception instead of retrying.
|
||||
|
||||
### Initial Environment State
|
||||
```python
|
||||
installed_packages = {
|
||||
"urllib3": "1.26.15",
|
||||
"certifi": "2023.7.22"
|
||||
}
|
||||
```
|
||||
|
||||
### Policy Configuration
|
||||
```json
|
||||
{
|
||||
"requests": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["urllib3", "certifi"],
|
||||
"on_failure": "fail"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Action
|
||||
```python
|
||||
batch.install("requests")
|
||||
```
|
||||
|
||||
### Attempt 1: Install WITH pins (FAILS)
|
||||
```bash
|
||||
# Command:
|
||||
pip install requests urllib3==1.26.15 certifi==2023.7.22
|
||||
|
||||
# Result: FAILURE (dependency conflict)
|
||||
# Error: "Package conflict: requests requires urllib3>=2.0"
|
||||
```
|
||||
|
||||
### Expected Behavior
|
||||
1. **First attempt**: Install with pinned versions
|
||||
2. **On failure**: Raise `subprocess.CalledProcessError`
|
||||
3. **No retry**: Exception propagates to caller
|
||||
4. **No changes**: Environment unchanged
|
||||
|
||||
**Key Point**: `on_failure: fail` ensures strict version requirements.
|
||||
|
||||
---
|
||||
|
||||
## Summary Table: All Test Packages
|
||||
|
||||
| Package | Initial Version | Action | Final Version | Role |
|---------|----------------|--------|---------------|------|
| **urllib3** | 1.26.15 | Pin | 1.26.15 | Protected dependency |
| **certifi** | 2023.7.22 | Pin | 2023.7.22 | Protected dependency |
| **charset-normalizer** | 3.2.0 | Pin | 3.2.0 | Protected dependency |
| **requests** | (not installed) | Install | 2.31.0 | New package |
| **colorama** | 0.4.6 | Pin | 0.4.6 | Protected dependency |
| **click** | (not installed) | Force version | 8.1.3 | New package with forced version |
| **six** | 1.16.0 | Delete→Restore | 1.16.0 | Deleted then restored |
| **python-dateutil** | (not installed) | Install | 2.8.2 | Package that deletes six |
| **attrs** | 23.1.0 | No change | 23.1.0 | Bystander package |
| **packaging** | 23.1 | No change | 23.1 | Bystander package |
|
||||
|
||||
## Policy Types Summary
|
||||
|
||||
| Policy Type | Purpose | Example |
|-------------|---------|---------|
| **pin_dependencies** | Prevent dependency upgrades | Keep urllib3 at 1.26.15 |
| **force_version** | Force specific package version | Install click==8.1.3 |
| **restore** | Reinstall deleted/changed packages | Restore six to 1.16.0 |
| **uninstall** | Remove obsolete packages | Remove old-package |
| **on_failure** | Handle installation failures | retry_without_pin or fail |
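The five policy types can coexist in a single policy file; a hypothetical combined example, with package names and versions taken from the scenarios above:

```python
import json

# Illustrative only - mirrors the per-scenario policies documented above.
combined_policy = {
    "old-package": {"uninstall": [{"target": "old-package"}]},
    "requests": {
        "apply_all_matches": [
            {"type": "pin_dependencies",
             "pinned_packages": ["urllib3", "certifi", "charset-normalizer"],
             "on_failure": "retry_without_pin"}
        ]
    },
    "click": {
        "apply_first_match": [
            {"condition": {"type": "installed", "package": "colorama", "spec": "<0.5.0"},
             "type": "force_version", "version": "8.1.3",
             "reason": "click 8.1.3 compatible with colorama <0.5"}
        ]
    },
    "six": {
        "restore": [{"target": "six", "version": "1.16.0",
                     "reason": "six must be maintained at 1.16.0 for compatibility"}]
    },
}

print(json.dumps(combined_policy, indent=2))
```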
|
||||
|
||||
## Test Data Design Principles
|
||||
|
||||
1. **Lightweight Packages**: All packages are <200KB for fast testing
|
||||
2. **Real Dependencies**: Use actual PyPI package relationships
|
||||
3. **Version Realism**: Use real version numbers from PyPI
|
||||
4. **Clear Scenarios**: Each test demonstrates one clear behavior
|
||||
5. **Reproducible**: Mock ensures consistent behavior across environments
|
||||
261
tests/common/pip_util/analyze_dependencies.py
Executable file
@ -0,0 +1,261 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Dependency Tree Analyzer for pip_util Tests
|
||||
|
||||
Usage:
|
||||
python analyze_dependencies.py [package]
|
||||
python analyze_dependencies.py --all
|
||||
python analyze_dependencies.py --env
|
||||
|
||||
Examples:
|
||||
python analyze_dependencies.py requests
|
||||
python analyze_dependencies.py python-dateutil
|
||||
python analyze_dependencies.py --all
|
||||
"""
|
||||
|
||||
import subprocess
|
||||
import sys
|
||||
from typing import Dict, List, Tuple, Optional
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
PIP = "./test_venv/bin/pip"
|
||||
|
||||
|
||||
def check_venv():
|
||||
"""Check if test venv exists"""
|
||||
if not Path(PIP).exists():
|
||||
print("❌ Test venv not found!")
|
||||
print(" Run: ./setup_test_env.sh")
|
||||
sys.exit(1)
|
||||
|
||||
|
||||
def get_installed_packages() -> Dict[str, str]:
|
||||
"""Get currently installed packages"""
|
||||
result = subprocess.run(
|
||||
[PIP, "freeze"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True
|
||||
)
|
||||
|
||||
packages = {}
|
||||
for line in result.stdout.strip().split('\n'):
|
||||
if '==' in line:
|
||||
pkg, ver = line.split('==', 1)
|
||||
packages[pkg] = ver
|
||||
|
||||
return packages
|
||||
|
||||
|
||||
def analyze_package_dry_run(
|
||||
package: str,
|
||||
constraints: Optional[List[str]] = None
|
||||
) -> Tuple[List[Tuple[str, str]], Dict[str, str]]:
|
||||
"""
|
||||
Analyze what would be installed with --dry-run
|
||||
|
||||
Returns:
|
||||
- List of (package_name, version) tuples in install order
|
||||
- Dict of current_version → new_version for upgrades
|
||||
"""
|
||||
cmd = [PIP, "install", "--dry-run", "--ignore-installed", package]
|
||||
if constraints:
|
||||
cmd.extend(constraints)
|
||||
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
|
||||
# Parse "Would install" line
|
||||
would_install = []
|
||||
for line in result.stdout.split('\n'):
|
||||
if 'Would install' in line:
|
||||
packages_str = line.split('Would install')[1].strip()
|
||||
for pkg_str in packages_str.split():
|
||||
parts = pkg_str.rsplit('-', 1)  # split on the last hyphen so hyphenated names (e.g. pytest-cov) keep their full name
|
||||
if len(parts) == 2:
|
||||
would_install.append((parts[0], parts[1]))
|
||||
|
||||
# Check against current installed
|
||||
installed = get_installed_packages()
|
||||
changes = {}
|
||||
for pkg, new_ver in would_install:
|
||||
if pkg in installed:
|
||||
old_ver = installed[pkg]
|
||||
if old_ver != new_ver:
|
||||
changes[pkg] = (old_ver, new_ver)
|
||||
|
||||
return would_install, changes
|
||||
|
||||
|
||||
def get_available_versions(package: str, limit: int = 10) -> Tuple[str, List[str]]:
|
||||
"""
|
||||
Get available versions from PyPI
|
||||
|
||||
Returns:
|
||||
- Latest version
|
||||
- List of available versions (limited)
|
||||
"""
|
||||
result = subprocess.run(
|
||||
[PIP, "index", "versions", package],
|
||||
capture_output=True,
|
||||
text=True
|
||||
)
|
||||
|
||||
latest = None
|
||||
versions = []
|
||||
|
||||
for line in result.stdout.split('\n'):
|
||||
if 'LATEST:' in line:
|
||||
latest = line.split('LATEST:')[1].strip()
|
||||
elif 'Available versions:' in line:
|
||||
versions_str = line.split('Available versions:')[1].strip()
|
||||
versions = [v.strip() for v in versions_str.split(',')[:limit]]
|
||||
|
||||
return latest, versions
|
||||
|
||||
|
||||
def print_package_analysis(package: str, with_pin: bool = False):
|
||||
"""Print detailed analysis for a package"""
|
||||
print(f"\n{'='*80}")
|
||||
print(f"Package: {package}")
|
||||
print(f"{'='*80}")
|
||||
|
||||
installed = get_installed_packages()
|
||||
|
||||
# Get latest version
|
||||
latest, available = get_available_versions(package)
|
||||
if latest:
|
||||
print(f"\n📦 Latest version: {latest}")
|
||||
print(f"📋 Available versions: {', '.join(available[:5])}")
|
||||
|
||||
# Scenario 1: Without constraints
|
||||
print(f"\n🔍 Scenario A: Install without constraints")
|
||||
print(f" Command: pip install {package}")
|
||||
|
||||
would_install, changes = analyze_package_dry_run(package)
|
||||
|
||||
if would_install:
|
||||
print(f"\n Would install {len(would_install)} packages:")
|
||||
for pkg, ver in would_install:
|
||||
if pkg in changes:
|
||||
old_ver, new_ver = changes[pkg]
|
||||
print(f" • {pkg:25} {old_ver:15} → {new_ver:15} ⚠️ UPGRADE")
|
||||
elif pkg in installed:
|
||||
print(f" • {pkg:25} {ver:15} (already installed)")
|
||||
else:
|
||||
print(f" • {pkg:25} {ver:15} ✨ NEW")
|
||||
|
||||
# Scenario 2: With pin constraints (if dependencies exist)
|
||||
dependencies = [pkg for pkg, _ in would_install if pkg != package]
|
||||
if dependencies and with_pin:
|
||||
print(f"\n🔍 Scenario B: Install with pin constraints")
|
||||
|
||||
# Create pin constraints for all current dependencies
|
||||
constraints = []
|
||||
for dep in dependencies:
|
||||
if dep in installed:
|
||||
constraints.append(f"{dep}=={installed[dep]}")
|
||||
|
||||
if constraints:
|
||||
print(f" Command: pip install {package} {' '.join(constraints)}")
|
||||
|
||||
would_install_pinned, changes_pinned = analyze_package_dry_run(
|
||||
package, constraints
|
||||
)
|
||||
|
||||
print(f"\n Would install {len(would_install_pinned)} packages:")
|
||||
for pkg, ver in would_install_pinned:
|
||||
if pkg in {c.split('==')[0] for c in constraints}:  # constraints hold "name==version" strings; match on the name
|
||||
print(f" • {pkg:25} {ver:15} 📌 PINNED")
|
||||
elif pkg in installed:
|
||||
print(f" • {pkg:25} {ver:15} (no change)")
|
||||
else:
|
||||
print(f" • {pkg:25} {ver:15} ✨ NEW")
|
||||
|
||||
# Show what was prevented
|
||||
prevented = set(changes.keys()) - set(changes_pinned.keys())
|
||||
if prevented:
|
||||
print(f"\n ✅ Pin prevented {len(prevented)} upgrade(s):")
|
||||
for pkg in prevented:
|
||||
old_ver, new_ver = changes[pkg]
|
||||
print(f" • {pkg:25} {old_ver:15} ❌→ {new_ver}")
|
||||
|
||||
|
||||
def analyze_all_test_packages():
|
||||
"""Analyze all packages used in tests"""
|
||||
print("="*80)
|
||||
print("ANALYZING ALL TEST PACKAGES")
|
||||
print("="*80)
|
||||
|
||||
test_packages = [
|
||||
("requests", True),
|
||||
("python-dateutil", True),
|
||||
]
|
||||
|
||||
for package, with_pin in test_packages:
|
||||
print_package_analysis(package, with_pin)
|
||||
|
||||
print(f"\n{'='*80}")
|
||||
print("ANALYSIS COMPLETE")
|
||||
print(f"{'='*80}")
|
||||
|
||||
|
||||
def print_current_environment():
|
||||
"""Print current test environment"""
|
||||
print("="*80)
|
||||
print("CURRENT TEST ENVIRONMENT")
|
||||
print("="*80)
|
||||
|
||||
installed = get_installed_packages()
|
||||
|
||||
print(f"\nTotal packages: {len(installed)}\n")
|
||||
|
||||
# Group by category
|
||||
test_packages = ["urllib3", "certifi", "charset-normalizer", "six", "attrs", "packaging"]
|
||||
framework = ["pytest", "iniconfig", "pluggy", "Pygments"]
|
||||
|
||||
print("Test packages:")
|
||||
for pkg in test_packages:
|
||||
if pkg in installed:
|
||||
print(f" {pkg:25} {installed[pkg]}")
|
||||
|
||||
print("\nTest framework:")
|
||||
for pkg in framework:
|
||||
if pkg in installed:
|
||||
print(f" {pkg:25} {installed[pkg]}")
|
||||
|
||||
other = set(installed.keys()) - set(test_packages) - set(framework)
|
||||
if other:
|
||||
print("\nOther packages:")
|
||||
for pkg in sorted(other):
|
||||
print(f" {pkg:25} {installed[pkg]}")
|
||||
|
||||
|
||||
def main():
|
||||
"""Main entry point"""
|
||||
check_venv()
|
||||
|
||||
if len(sys.argv) == 1:
|
||||
print("Usage: python analyze_dependencies.py [package|--all|--env]")
|
||||
print("\nExamples:")
|
||||
print(" python analyze_dependencies.py requests")
|
||||
print(" python analyze_dependencies.py --all")
|
||||
print(" python analyze_dependencies.py --env")
|
||||
sys.exit(0)
|
||||
|
||||
command = sys.argv[1]
|
||||
|
||||
if command == "--all":
|
||||
analyze_all_test_packages()
|
||||
elif command == "--env":
|
||||
print_current_environment()
|
||||
elif command.startswith("--"):
|
||||
print(f"Unknown option: {command}")
|
||||
sys.exit(1)
|
||||
else:
|
||||
# Analyze specific package
|
||||
print_package_analysis(command, with_pin=True)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
387
tests/common/pip_util/conftest.py
Normal file
@ -0,0 +1,387 @@
|
||||
"""
|
||||
pytest configuration and shared fixtures for pip_util.py tests
|
||||
|
||||
This file provides common fixtures and configuration for all tests.
|
||||
Uses real isolated venv for actual pip operations.
|
||||
"""
|
||||
|
||||
import json
|
||||
import subprocess
|
||||
import sys
|
||||
from pathlib import Path
|
||||
from typing import Dict, List
|
||||
from unittest.mock import MagicMock
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Test venv Management
|
||||
# =============================================================================
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def test_venv_path():
|
||||
"""
|
||||
Get path to test venv (must be created by setup_test_env.sh)
|
||||
|
||||
Returns:
|
||||
Path: Path to test venv directory
|
||||
"""
|
||||
venv_path = Path(__file__).parent / "test_venv"
|
||||
if not venv_path.exists():
|
||||
pytest.fail(
|
||||
f"Test venv not found at {venv_path}.\n"
|
||||
"Please run: ./setup_test_env.sh"
|
||||
)
|
||||
return venv_path
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def test_pip_cmd(test_venv_path):
|
||||
"""
|
||||
Get pip command for test venv
|
||||
|
||||
Returns:
|
||||
List[str]: pip command prefix for subprocess
|
||||
"""
|
||||
pip_path = test_venv_path / "bin" / "pip"
|
||||
if not pip_path.exists():
|
||||
pytest.fail(f"pip not found at {pip_path}")
|
||||
return [str(pip_path)]
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def reset_test_venv(test_pip_cmd):
|
||||
"""
|
||||
Reset test venv to initial state before each test
|
||||
|
||||
This fixture:
|
||||
1. Records current installed packages
|
||||
2. Yields control to test
|
||||
3. Restores original packages after test
|
||||
"""
|
||||
# Get initial state
|
||||
result = subprocess.run(
|
||||
test_pip_cmd + ["freeze"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True
|
||||
)
|
||||
initial_packages = result.stdout.strip()
|
||||
|
||||
yield
|
||||
|
||||
# Restore initial state
|
||||
# Uninstall everything except pip, setuptools, wheel
|
||||
result = subprocess.run(
|
||||
test_pip_cmd + ["freeze"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True
|
||||
)
|
||||
current_packages = result.stdout.strip()
|
||||
|
||||
if current_packages:
|
||||
packages_to_remove = []
|
||||
for line in current_packages.split('\n'):
|
||||
if line and '==' in line:
|
||||
pkg = line.split('==')[0].lower()
|
||||
if pkg not in ['pip', 'setuptools', 'wheel']:
|
||||
packages_to_remove.append(pkg)
|
||||
|
||||
if packages_to_remove:
|
||||
subprocess.run(
|
||||
test_pip_cmd + ["uninstall", "-y"] + packages_to_remove,
|
||||
capture_output=True,
|
||||
check=False # Don't fail if package doesn't exist
|
||||
)
|
||||
|
||||
# Reinstall initial packages
|
||||
if initial_packages:
|
||||
# Create temporary requirements file
|
||||
import tempfile
|
||||
with tempfile.NamedTemporaryFile(mode='w', suffix='.txt', delete=False) as f:
|
||||
f.write(initial_packages)
|
||||
temp_req = f.name
|
||||
|
||||
try:
|
||||
subprocess.run(
|
||||
test_pip_cmd + ["install", "-r", temp_req],
|
||||
capture_output=True,
|
||||
check=True
|
||||
)
|
||||
finally:
|
||||
Path(temp_req).unlink()
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Directory and Path Fixtures
|
||||
# =============================================================================
|
||||
|
||||
@pytest.fixture
|
||||
def temp_policy_dir(tmp_path):
|
||||
"""
|
||||
Create temporary directory for policy files
|
||||
|
||||
Returns:
|
||||
Path: Temporary directory for storing test policy files
|
||||
"""
|
||||
policy_dir = tmp_path / "policies"
|
||||
policy_dir.mkdir()
|
||||
return policy_dir
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def temp_user_policy_dir(tmp_path):
|
||||
"""
|
||||
Create temporary directory for user policy files
|
||||
|
||||
Returns:
|
||||
Path: Temporary directory for storing user policy files
|
||||
"""
|
||||
user_dir = tmp_path / "user_policies"
|
||||
user_dir.mkdir()
|
||||
return user_dir
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Module Setup and Mocking
|
||||
# =============================================================================
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup_pip_util(monkeypatch, test_pip_cmd):
|
||||
"""
|
||||
Setup pip_util module for testing with real venv
|
||||
|
||||
This fixture:
|
||||
1. Mocks comfy module (not needed for tests)
|
||||
2. Adds comfyui_manager to path
|
||||
3. Patches make_pip_cmd to use test venv
|
||||
4. Resets policy cache
|
||||
"""
|
||||
# Mock comfy module before importing anything
|
||||
comfy_mock = MagicMock()
|
||||
cli_args_mock = MagicMock()
|
||||
cli_args_mock.args = MagicMock()
|
||||
comfy_mock.cli_args = cli_args_mock
|
||||
sys.modules['comfy'] = comfy_mock
|
||||
sys.modules['comfy.cli_args'] = cli_args_mock
|
||||
|
||||
# Add comfyui_manager parent to path so relative imports work
|
||||
comfyui_manager_path = str(Path(__file__).parent.parent.parent.parent)
|
||||
if comfyui_manager_path not in sys.path:
|
||||
sys.path.insert(0, comfyui_manager_path)
|
||||
|
||||
# Import pip_util
|
||||
from comfyui_manager.common import pip_util
|
||||
|
||||
# Patch make_pip_cmd to use test venv pip
|
||||
def make_test_pip_cmd(args: List[str]) -> List[str]:
|
||||
return test_pip_cmd + args
|
||||
|
||||
monkeypatch.setattr(
|
||||
pip_util.manager_util,
|
||||
"make_pip_cmd",
|
||||
make_test_pip_cmd
|
||||
)
|
||||
|
||||
# Reset policy cache
|
||||
pip_util._pip_policy_cache = None
|
||||
|
||||
yield
|
||||
|
||||
# Cleanup
|
||||
pip_util._pip_policy_cache = None
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_manager_util(monkeypatch, temp_policy_dir):
|
||||
"""
|
||||
Mock manager_util module paths
|
||||
|
||||
Args:
|
||||
monkeypatch: pytest monkeypatch fixture
|
||||
temp_policy_dir: Temporary policy directory
|
||||
"""
|
||||
from comfyui_manager.common import pip_util
|
||||
|
||||
monkeypatch.setattr(
|
||||
pip_util.manager_util,
|
||||
"comfyui_manager_path",
|
||||
str(temp_policy_dir)
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_context(monkeypatch, temp_user_policy_dir):
|
||||
"""
|
||||
Mock context module paths
|
||||
|
||||
Args:
|
||||
monkeypatch: pytest monkeypatch fixture
|
||||
temp_user_policy_dir: Temporary user policy directory
|
||||
"""
|
||||
from comfyui_manager.common import pip_util
|
||||
|
||||
monkeypatch.setattr(
|
||||
pip_util.context,
|
||||
"manager_files_path",
|
||||
str(temp_user_policy_dir)
|
||||
)
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Platform Mocking Fixtures
|
||||
# =============================================================================
|
||||
|
||||
@pytest.fixture
|
||||
def mock_platform_linux(monkeypatch):
|
||||
"""Mock platform.system() to return 'Linux'"""
|
||||
monkeypatch.setattr("platform.system", lambda: "Linux")
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_platform_windows(monkeypatch):
|
||||
"""Mock platform.system() to return 'Windows'"""
|
||||
monkeypatch.setattr("platform.system", lambda: "Windows")
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_platform_darwin(monkeypatch):
|
||||
"""Mock platform.system() to return 'Darwin' (macOS)"""
|
||||
monkeypatch.setattr("platform.system", lambda: "Darwin")
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_torch_cuda_available(monkeypatch):
|
||||
"""Mock torch.cuda.is_available() to return True"""
|
||||
class MockCuda:
|
||||
@staticmethod
|
||||
def is_available():
|
||||
return True
|
||||
|
||||
class MockTorch:
|
||||
cuda = MockCuda()
|
||||
|
||||
import sys
|
||||
monkeypatch.setitem(sys.modules, "torch", MockTorch())
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_torch_cuda_unavailable(monkeypatch):
|
||||
"""Mock torch.cuda.is_available() to return False"""
|
||||
class MockCuda:
|
||||
@staticmethod
|
||||
def is_available():
|
||||
return False
|
||||
|
||||
class MockTorch:
|
||||
cuda = MockCuda()
|
||||
|
||||
import sys
|
||||
monkeypatch.setitem(sys.modules, "torch", MockTorch())
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_torch_not_installed(monkeypatch):
|
||||
"""Mock torch as not installed (ImportError)"""
|
||||
import sys
|
||||
if "torch" in sys.modules:
|
||||
monkeypatch.delitem(sys.modules, "torch")
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Helper Functions
|
||||
# =============================================================================
|
||||
|
||||
@pytest.fixture
|
||||
def get_installed_packages(test_pip_cmd):
|
||||
"""
|
||||
Helper to get currently installed packages in test venv
|
||||
|
||||
Returns:
|
||||
Callable that returns Dict[str, str] of installed packages
|
||||
"""
|
||||
def _get_installed() -> Dict[str, str]:
|
||||
result = subprocess.run(
|
||||
test_pip_cmd + ["freeze"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True
|
||||
)
|
||||
|
||||
packages = {}
|
||||
for line in result.stdout.strip().split('\n'):
|
||||
if line and '==' in line:
|
||||
pkg, ver = line.split('==', 1)
|
||||
packages[pkg] = ver
|
||||
|
||||
return packages
|
||||
|
||||
return _get_installed
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def install_packages(test_pip_cmd):
|
||||
"""
|
||||
Helper to install packages in test venv
|
||||
|
||||
Returns:
|
||||
Callable that installs packages
|
||||
"""
|
||||
def _install(*packages):
|
||||
subprocess.run(
|
||||
test_pip_cmd + ["install"] + list(packages),
|
||||
capture_output=True,
|
||||
check=True
|
||||
)
|
||||
|
||||
return _install
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def uninstall_packages(test_pip_cmd):
|
||||
"""
|
||||
Helper to uninstall packages in test venv
|
||||
|
||||
Returns:
|
||||
Callable that uninstalls packages
|
||||
"""
|
||||
def _uninstall(*packages):
|
||||
subprocess.run(
|
||||
test_pip_cmd + ["uninstall", "-y"] + list(packages),
|
||||
capture_output=True,
|
||||
check=False # Don't fail if package doesn't exist
|
||||
)
|
||||
|
||||
return _uninstall
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Test Data Factories
|
||||
# =============================================================================
|
||||
|
||||
@pytest.fixture
|
||||
def make_policy():
|
||||
"""
|
||||
Factory fixture for creating policy dictionaries
|
||||
|
||||
Returns:
|
||||
Callable that creates policy dict from parameters
|
||||
"""
|
||||
def _make_policy(
|
||||
package_name: str,
|
||||
policy_type: str,
|
||||
section: str = "apply_first_match",
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
policy_item = {"type": policy_type}
|
||||
policy_item.update(kwargs)
|
||||
|
||||
return {
|
||||
package_name: {
|
||||
section: [policy_item]
|
||||
}
|
||||
}
|
||||
|
||||
return _make_policy
|
||||
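For illustration, a test using the `make_policy` factory above could build a minimal replacement policy in a single call. The sketch below is a hypothetical usage example, not part of the committed tests; the `replace`/`replacement` fields mirror the draft policy schema used by other fixtures in this suite (e.g. the onnxruntime platform policy).

```python
# Hypothetical usage of the make_policy fixture; field names beyond "type"
# follow the draft policy schema and are assumptions for illustration.
def test_make_policy_builds_replacement_entry(make_policy):
    policy = make_policy("PIL", "replace", replacement="Pillow")
    assert policy == {
        "PIL": {
            "apply_first_match": [
                {"type": "replace", "replacement": "Pillow"}
            ]
        }
    }
```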
52
tests/common/pip_util/pytest.ini
Normal file
@ -0,0 +1,52 @@
[pytest]
# pytest configuration for pip_util.py tests

# Test discovery
testpaths = .

# Markers
markers =
    unit: Unit tests for individual functions
    integration: Integration tests for workflows
    e2e: End-to-end tests for complete scenarios

# Output options - extend global config
addopts =
    # Coverage options for pip_util
    --cov=../../../comfyui_manager/common/pip_util
    --cov-report=html:htmlcov_pip_util
    --cov-report=term-missing
    --cov-report=xml:coverage_pip_util.xml
    # Coverage fail threshold
    --cov-fail-under=80

# Coverage configuration
[coverage:run]
source = ../../../comfyui_manager/common
omit =
    */tests/*
    */test_*.py
    */__pycache__/*
    */test_venv/*

[coverage:report]
precision = 2
show_missing = True
skip_covered = False

exclude_lines =
    # Standard pragma
    pragma: no cover
    # Don't complain about missing debug code
    def __repr__
    # Don't complain if tests don't hit defensive assertion code
    raise AssertionError
    raise NotImplementedError
    # Don't complain if non-runnable code isn't run
    if __name__ == .__main__.:
    # Don't complain about abstract methods
    @abstractmethod

[coverage:html]
directory = htmlcov
20
tests/common/pip_util/requirements-test-base.txt
Normal file
@ -0,0 +1,20 @@
# Base packages for pip_util integration tests
# These packages are installed initially to test various scenarios
# All versions verified using: pip install --dry-run --ignore-installed

# Scenario 1: Dependency Version Protection (requests + urllib3)
# Purpose: Pin prevents urllib3 1.26.15 → 2.5.0 major upgrade
urllib3==1.26.15            # OLD stable version (prevent 2.x upgrade)
certifi==2023.7.22          # OLD version (prevent 2025.x upgrade)
charset-normalizer==3.2.0   # OLD version (prevent 3.4.x upgrade)
# Note: idna is NOT pre-installed (will be added by requests)

# Scenario 2: Package Deletion and Restore (six)
# Purpose: Restore policy reinstalls deleted packages
six==1.16.0       # Will be deleted and restored to 1.16.0
attrs==23.1.0     # Bystander package
packaging==23.1   # Bystander package (NOT 23.1.0, not 25.0)

# Scenario 3: Version Change and Restore (urllib3)
# Purpose: Restore policy reverts version changes
# urllib3==1.26.15 (same as Scenario 1, will be upgraded to 2.5.0 then restored)
47
tests/common/pip_util/setup_test_env.sh
Executable file
@ -0,0 +1,47 @@
#!/bin/bash
# Setup script for pip_util integration tests
# Creates a test venv and installs base packages

set -e  # Exit on error

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
VENV_DIR="$SCRIPT_DIR/test_venv"

echo "Setting up test environment for pip_util integration tests..."

# Remove existing venv if present
if [ -d "$VENV_DIR" ]; then
    echo "Removing existing test venv..."
    rm -rf "$VENV_DIR"
fi

# Create new venv
echo "Creating test venv at $VENV_DIR..."
python3 -m venv "$VENV_DIR"

# Activate venv
source "$VENV_DIR/bin/activate"

# Upgrade pip
echo "Upgrading pip..."
pip install --upgrade pip

# Install pytest and the coverage plugin (pytest.ini passes --cov options)
echo "Installing pytest..."
pip install pytest pytest-cov

# Install base test packages
echo "Installing base test packages..."
pip install -r "$SCRIPT_DIR/requirements-test-base.txt"

echo ""
echo "Test environment setup complete!"
echo "Installed packages:"
pip freeze

echo ""
echo "To activate the test venv, run:"
echo "  source $VENV_DIR/bin/activate"
echo ""
echo "To run tests:"
echo "  pytest -v"
271
tests/common/pip_util/test_dependency_protection.py
Normal file
@ -0,0 +1,271 @@
|
||||
"""
|
||||
Test dependency version protection with pin (Priority 1)
|
||||
|
||||
Tests that existing dependency versions are protected by pin_dependencies policy
|
||||
"""
|
||||
|
||||
import json
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def pin_policy(temp_policy_dir):
|
||||
"""Create policy with pin_dependencies for lightweight real packages"""
|
||||
policy_content = {
|
||||
"requests": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["urllib3", "certifi", "charset-normalizer"],
|
||||
"on_failure": "retry_without_pin"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_dependency_version_protection_with_pin(
|
||||
pin_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
reset_test_venv,
|
||||
get_installed_packages
|
||||
):
|
||||
"""
|
||||
Test existing dependency versions are protected by pin
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that when installing a package that would normally upgrade
|
||||
dependencies, the pin_dependencies policy protects existing versions.
|
||||
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
Without pin: urllib3 1.26.15 → 2.5.0 (MAJOR upgrade)
|
||||
With pin: urllib3 stays at 1.26.15 (protected)
|
||||
"""
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
# Verify initial packages are installed (from requirements-test-base.txt)
|
||||
initial = get_installed_packages()
|
||||
assert "urllib3" in initial
|
||||
assert "certifi" in initial
|
||||
assert "charset-normalizer" in initial
|
||||
|
||||
# Record initial versions (from DEPENDENCY_TREE_CONTEXT.md)
|
||||
initial_urllib3 = initial["urllib3"]
|
||||
initial_certifi = initial["certifi"]
|
||||
initial_charset = initial["charset-normalizer"]
|
||||
|
||||
# Verify expected OLD versions
|
||||
assert initial_urllib3 == "1.26.15", f"Expected urllib3==1.26.15, got {initial_urllib3}"
|
||||
assert initial_certifi == "2023.7.22", f"Expected certifi==2023.7.22, got {initial_certifi}"
|
||||
assert initial_charset == "3.2.0", f"Expected charset-normalizer==3.2.0, got {initial_charset}"
|
||||
|
||||
# Verify idna is NOT installed initially
|
||||
assert "idna" not in initial, "idna should not be pre-installed"
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("requests")
|
||||
final_packages = batch._get_installed_packages()
|
||||
|
||||
# Verify installation succeeded
|
||||
assert result is True
|
||||
assert "requests" in final_packages
|
||||
|
||||
# Verify versions were maintained (not upgraded to latest)
|
||||
# Without pin, these would upgrade to: urllib3==2.5.0, certifi==2025.8.3, charset-normalizer==3.4.3
|
||||
assert final_packages["urllib3"] == "1.26.15", "urllib3 should remain at 1.26.15 (prevented 2.x upgrade)"
|
||||
assert final_packages["certifi"] == "2023.7.22", "certifi should remain at 2023.7.22 (prevented 2025.x upgrade)"
|
||||
assert final_packages["charset-normalizer"] == "3.2.0", "charset-normalizer should remain at 3.2.0"
|
||||
|
||||
# Verify new dependency was added (idna is NOT pinned, so it gets installed)
|
||||
assert "idna" in final_packages, "idna should be installed as new dependency"
|
||||
assert final_packages["idna"] == "3.10", f"Expected idna==3.10, got {final_packages['idna']}"
|
||||
|
||||
# Verify requests was installed at expected version
|
||||
assert final_packages["requests"] == "2.32.5", f"Expected requests==2.32.5, got {final_packages['requests']}"
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def python_dateutil_policy(temp_policy_dir):
|
||||
"""Create policy for python-dateutil with six pinning"""
|
||||
policy_content = {
|
||||
"python-dateutil": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["six"],
|
||||
"reason": "Protect six from upgrading"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_dependency_chain_with_six_pin(
|
||||
python_dateutil_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
reset_test_venv,
|
||||
get_installed_packages
|
||||
):
|
||||
"""
|
||||
Test python-dateutil + six dependency chain with pin
|
||||
|
||||
Priority: 2 (Important)
|
||||
|
||||
Purpose:
|
||||
Verify that pin_dependencies protects actual dependencies
|
||||
(six is a real dependency of python-dateutil).
|
||||
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
python-dateutil depends on six>=1.5
|
||||
Without pin: six 1.16.0 → 1.17.0
|
||||
With pin: six stays at 1.16.0 (protected)
|
||||
"""
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
# Verify six is installed
|
||||
initial = get_installed_packages()
|
||||
assert "six" in initial
|
||||
initial_six = initial["six"]
|
||||
|
||||
# Verify expected OLD version
|
||||
assert initial_six == "1.16.0", f"Expected six==1.16.0, got {initial_six}"
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("python-dateutil")
|
||||
final_packages = batch._get_installed_packages()
|
||||
|
||||
# Verify installation succeeded
|
||||
assert result is True
|
||||
|
||||
# Verify final versions
|
||||
assert "python-dateutil" in final_packages
|
||||
assert final_packages["python-dateutil"] == "2.9.0.post0", f"Expected python-dateutil==2.9.0.post0, got {final_packages['python-dateutil']}"
|
||||
|
||||
# Verify six was NOT upgraded (without pin, would upgrade to 1.17.0)
|
||||
assert "six" in final_packages
|
||||
assert final_packages["six"] == "1.16.0", "six should remain at 1.16.0 (prevented 1.17.0 upgrade)"
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_pin_only_affects_specified_packages(
|
||||
pin_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
reset_test_venv,
|
||||
get_installed_packages
|
||||
):
|
||||
"""
|
||||
Test that pin only affects specified packages, not all dependencies
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that idna (new dependency) is installed even though
|
||||
other dependencies are pinned. This tests that pin is selective,
|
||||
not global.
|
||||
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
idna is a NEW dependency (not in initial environment)
|
||||
Pin only affects: urllib3, certifi, charset-normalizer
|
||||
idna should be installed at latest version (3.10)
|
||||
"""
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
# Verify initial state
|
||||
initial = get_installed_packages()
|
||||
assert "idna" not in initial, "idna should not be pre-installed"
|
||||
assert "requests" not in initial, "requests should not be pre-installed"
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("requests")
|
||||
final_packages = batch._get_installed_packages()
|
||||
|
||||
# Verify installation succeeded
|
||||
assert result is True
|
||||
|
||||
# Verify idna was installed (NOT pinned, so gets latest)
|
||||
assert "idna" in final_packages, "idna should be installed as new dependency"
|
||||
assert final_packages["idna"] == "3.10", "idna should be at latest version 3.10 (not pinned)"
|
||||
|
||||
# Verify requests was installed
|
||||
assert "requests" in final_packages
|
||||
assert final_packages["requests"] == "2.32.5"
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_major_version_jump_prevention(
|
||||
pin_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
reset_test_venv,
|
||||
get_installed_packages,
|
||||
install_packages,
|
||||
uninstall_packages
|
||||
):
|
||||
"""
|
||||
Test that pin prevents MAJOR version jumps (breaking changes)
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that pin prevents urllib3 1.x → 2.x major upgrade.
|
||||
This is the most important test because urllib3 2.0 has
|
||||
breaking API changes.
|
||||
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
urllib3 1.26.15 → 2.5.0 is a MAJOR version jump
|
||||
urllib3 2.0 removed deprecated APIs
|
||||
requests accepts both: urllib3<3,>=1.21.1
|
||||
"""
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
# Verify initial urllib3 version
|
||||
initial = get_installed_packages()
|
||||
assert initial["urllib3"] == "1.26.15", "Expected urllib3==1.26.15"
|
||||
|
||||
# First, test WITHOUT pin to verify urllib3 would upgrade to 2.x
|
||||
# (This simulates what would happen without our protection)
|
||||
uninstall_packages("urllib3", "certifi", "charset-normalizer")
|
||||
install_packages("requests")
|
||||
|
||||
without_pin = get_installed_packages()
|
||||
|
||||
# Verify urllib3 was upgraded to 2.x without pin
|
||||
assert "urllib3" in without_pin
|
||||
assert without_pin["urllib3"].startswith("2."), \
|
||||
f"Without pin, urllib3 should upgrade to 2.x, got {without_pin['urllib3']}"
|
||||
|
||||
# Now reset and test WITH pin
|
||||
uninstall_packages("requests", "urllib3", "certifi", "charset-normalizer", "idna")
|
||||
install_packages("urllib3==1.26.15", "certifi==2023.7.22", "charset-normalizer==3.2.0")
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("requests")
|
||||
final_packages = batch._get_installed_packages()
|
||||
|
||||
# Verify installation succeeded
|
||||
assert result is True
|
||||
|
||||
# Verify urllib3 stayed at 1.x (prevented major version jump)
|
||||
assert final_packages["urllib3"] == "1.26.15", \
|
||||
"Pin should prevent urllib3 from upgrading to 2.x (breaking changes)"
|
||||
|
||||
# Verify it's specifically 1.x, not 2.x
|
||||
assert final_packages["urllib3"].startswith("1."), \
|
||||
f"urllib3 should remain at 1.x series, got {final_packages['urllib3']}"
|
||||
279
tests/common/pip_util/test_edge_cases.py
Normal file
@ -0,0 +1,279 @@
|
||||
"""
|
||||
Edge cases and boundary conditions (Priority 3)
|
||||
|
||||
Tests empty policies, malformed JSON, and edge cases
|
||||
"""
|
||||
|
||||
import json
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.mark.unit
|
||||
def test_empty_base_policy_uses_default_installation(
|
||||
empty_policy_file,
|
||||
mock_manager_util,
|
||||
mock_context
|
||||
):
|
||||
"""
|
||||
Test default installation with empty policy
|
||||
|
||||
Priority: 3 (Recommended)
|
||||
|
||||
Purpose:
|
||||
Verify that when policy is empty, the system falls back
|
||||
to default installation behavior.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import get_pip_policy
|
||||
|
||||
policy = get_pip_policy()
|
||||
|
||||
assert policy == {}
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def malformed_policy_file(temp_policy_dir):
|
||||
"""Create malformed JSON policy file"""
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text("{invalid json content")
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.mark.unit
|
||||
def test_json_parse_error_fallback_to_empty(
|
||||
malformed_policy_file,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
capture_logs
|
||||
):
|
||||
"""
|
||||
Test empty dict on JSON parse error
|
||||
|
||||
Priority: 3 (Recommended)
|
||||
|
||||
Purpose:
|
||||
Verify that malformed JSON results in empty policy
|
||||
with appropriate error logging.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import get_pip_policy
|
||||
|
||||
policy = get_pip_policy()
|
||||
|
||||
assert policy == {}
|
||||
# Should have error log about parsing failure
|
||||
assert any("parse" in record.message.lower() for record in capture_logs.records)
|
||||
|
||||
|
||||
@pytest.mark.unit
|
||||
def test_unknown_condition_type_returns_false(
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
capture_logs
|
||||
):
|
||||
"""
|
||||
Test unknown condition type returns False
|
||||
|
||||
Priority: 3 (Recommended)
|
||||
|
||||
Purpose:
|
||||
Verify that unknown condition types are handled gracefully
|
||||
by returning False with a warning.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
batch = PipBatch()
|
||||
condition = {"type": "unknown_type", "some_field": "value"}
|
||||
|
||||
result = batch._evaluate_condition(condition, "pkg", {})
|
||||
|
||||
assert result is False
|
||||
# Should have warning about unknown type
|
||||
assert any("unknown" in record.message.lower() for record in capture_logs.records)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def self_reference_policy(temp_policy_dir):
|
||||
"""Create policy with self-reference"""
|
||||
policy_content = {
|
||||
"critical-package": {
|
||||
"restore": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"spec": "!=1.2.3"
|
||||
},
|
||||
"target": "critical-package",
|
||||
"version": "1.2.3"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_self_reference_subprocess(monkeypatch):
|
||||
"""Mock subprocess for self-reference test"""
|
||||
call_sequence = []
|
||||
|
||||
installed_packages = {
|
||||
"critical-package": "1.2.2"
|
||||
}
|
||||
|
||||
def mock_run(cmd, **kwargs):
|
||||
call_sequence.append(cmd)
|
||||
|
||||
# pip freeze
|
||||
if "freeze" in cmd:
|
||||
output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
|
||||
return subprocess.CompletedProcess(cmd, 0, output, "")
|
||||
|
||||
# pip install
|
||||
if "install" in cmd and "critical-package==1.2.3" in cmd:
|
||||
installed_packages["critical-package"] = "1.2.3"
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
monkeypatch.setattr("subprocess.run", mock_run)
|
||||
return call_sequence, installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_restore_self_version_check(
|
||||
self_reference_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_self_reference_subprocess
|
||||
):
|
||||
"""
|
||||
Test restore policy checking its own version
|
||||
|
||||
Priority: 3 (Recommended)
|
||||
|
||||
Purpose:
|
||||
Verify that when a condition omits the package field,
|
||||
it correctly defaults to checking the package itself.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_self_reference_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
restored = batch.ensure_installed()
|
||||
final = batch._get_installed_packages()
|
||||
|
||||
# Condition should evaluate with self-reference
|
||||
# "1.2.2" != "1.2.3" → True
|
||||
assert "critical-package" in restored
|
||||
assert final["critical-package"] == "1.2.3"
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def partial_failure_policy(temp_policy_dir):
|
||||
"""Create policy for multiple uninstalls"""
|
||||
policy_content = {
|
||||
"pkg-a": {
|
||||
"uninstall": [{"target": "old-pkg-1"}]
|
||||
},
|
||||
"pkg-b": {
|
||||
"uninstall": [{"target": "old-pkg-2"}]
|
||||
},
|
||||
"pkg-c": {
|
||||
"uninstall": [{"target": "old-pkg-3"}]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_partial_failure_subprocess(monkeypatch):
|
||||
"""Mock subprocess with one failure"""
|
||||
call_sequence = []
|
||||
|
||||
installed_packages = {
|
||||
"old-pkg-1": "1.0",
|
||||
"old-pkg-2": "1.0",
|
||||
"old-pkg-3": "1.0"
|
||||
}
|
||||
|
||||
def mock_run(cmd, **kwargs):
|
||||
call_sequence.append(cmd)
|
||||
|
||||
# pip freeze
|
||||
if "freeze" in cmd:
|
||||
output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
|
||||
return subprocess.CompletedProcess(cmd, 0, output, "")
|
||||
|
||||
# pip uninstall
|
||||
if "uninstall" in cmd:
|
||||
if "old-pkg-2" in cmd:
|
||||
# Fail on pkg-2
|
||||
raise subprocess.CalledProcessError(1, cmd, "", "Uninstall failed")
|
||||
else:
|
||||
# Success on others
|
||||
for pkg in ["old-pkg-1", "old-pkg-3"]:
|
||||
if pkg in cmd:
|
||||
installed_packages.pop(pkg, None)
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
monkeypatch.setattr("subprocess.run", mock_run)
|
||||
return call_sequence, installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_ensure_not_installed_continues_on_individual_failure(
|
||||
partial_failure_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_partial_failure_subprocess,
|
||||
capture_logs
|
||||
):
|
||||
"""
|
||||
Test partial failure handling
|
||||
|
||||
Priority: 2 (Important)
|
||||
|
||||
Purpose:
|
||||
Verify that when one package removal fails, the system
|
||||
continues processing other packages.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_partial_failure_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
removed = batch.ensure_not_installed()
|
||||
|
||||
# Verify partial success
|
||||
assert "old-pkg-1" in removed
|
||||
assert "old-pkg-3" in removed
|
||||
assert "old-pkg-2" not in removed # Failed
|
||||
|
||||
# Verify warning logged for failure
|
||||
assert any("warning" in record.levelname.lower() for record in capture_logs.records)
|
||||
158
tests/common/pip_util/test_environment_recovery.py
Normal file
@ -0,0 +1,158 @@
|
||||
"""
|
||||
Test environment corruption and recovery (Priority 1)
|
||||
|
||||
Tests that packages deleted or modified during installation are restored
|
||||
"""
|
||||
|
||||
import json
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def restore_policy(temp_policy_dir):
|
||||
"""Create policy with restore section for lightweight packages"""
|
||||
policy_content = {
|
||||
"six": {
|
||||
"restore": [
|
||||
{
|
||||
"target": "six",
|
||||
"version": "1.16.0",
|
||||
"reason": "six must be maintained at 1.16.0 for compatibility"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_package_deletion_and_restore(
|
||||
restore_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
reset_test_venv,
|
||||
get_installed_packages,
|
||||
install_packages,
|
||||
uninstall_packages
|
||||
):
|
||||
"""
|
||||
Test package deleted by installation is restored
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that when a package installation deletes another package,
|
||||
the restore policy can bring it back with the correct version.
|
||||
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
six==1.16.0 must be maintained for compatibility
|
||||
After deletion, should restore to exactly 1.16.0
|
||||
"""
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
# Verify six is initially installed at expected version
|
||||
initial = get_installed_packages()
|
||||
assert "six" in initial
|
||||
assert initial["six"] == "1.16.0", f"Expected six==1.16.0, got {initial['six']}"
|
||||
|
||||
with PipBatch() as batch:
|
||||
# Manually remove six to simulate deletion by another package
|
||||
uninstall_packages("six")
|
||||
|
||||
# Check six was deleted
|
||||
installed_after_delete = batch._get_installed_packages()
|
||||
assert "six" not in installed_after_delete, "six should be deleted"
|
||||
|
||||
# Restore six
|
||||
restored = batch.ensure_installed()
|
||||
final_packages = batch._get_installed_packages()
|
||||
|
||||
# Verify six was restored to EXACT required version (not latest)
|
||||
assert "six" in restored, "six should be in restored list"
|
||||
assert final_packages["six"] == "1.16.0", \
|
||||
"six should be restored to exact version 1.16.0 (not 1.17.0 latest)"
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def version_change_policy(temp_policy_dir):
|
||||
"""Create policy for version change test with real packages"""
|
||||
policy_content = {
|
||||
"urllib3": {
|
||||
"restore": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"spec": "!=1.26.15"
|
||||
},
|
||||
"target": "urllib3",
|
||||
"version": "1.26.15",
|
||||
"reason": "urllib3 must be 1.26.15 for compatibility"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_version_change_and_restore(
|
||||
version_change_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
reset_test_venv,
|
||||
get_installed_packages,
|
||||
install_packages
|
||||
):
|
||||
"""
|
||||
Test package version changed by installation is restored
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that when a package installation changes another package's
|
||||
version, the restore policy can revert it to the required version.
|
||||
|
||||
Based on DEPENDENCY_TREE_CONTEXT.md:
|
||||
urllib3 can upgrade from 1.26.15 (1.x) to 2.5.0 (2.x)
|
||||
Restore policy with condition "!=1.26.15" should downgrade back
|
||||
This tests downgrade capability (not just upgrade prevention)
|
||||
"""
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
# Verify urllib3 1.26.15 is installed
|
||||
initial = get_installed_packages()
|
||||
assert "urllib3" in initial
|
||||
assert initial["urllib3"] == "1.26.15", f"Expected urllib3==1.26.15, got {initial['urllib3']}"
|
||||
|
||||
with PipBatch() as batch:
|
||||
# Manually upgrade urllib3 to 2.x to simulate version change
|
||||
# This is a MAJOR version upgrade (1.x → 2.x)
|
||||
install_packages("urllib3==2.1.0")
|
||||
|
||||
installed_after = batch._get_installed_packages()
|
||||
# Verify version was changed to 2.x
|
||||
assert installed_after["urllib3"] == "2.1.0", \
|
||||
f"urllib3 should be upgraded to 2.1.0, got {installed_after['urllib3']}"
|
||||
assert installed_after["urllib3"].startswith("2."), \
|
||||
"urllib3 should be at 2.x series"
|
||||
|
||||
# Restore urllib3 to 1.26.15 (this is a DOWNGRADE from 2.x to 1.x)
|
||||
restored = batch.ensure_installed()
|
||||
final = batch._get_installed_packages()
|
||||
|
||||
# Verify condition was satisfied (2.1.0 != 1.26.15) and restore was triggered
|
||||
assert "urllib3" in restored, "urllib3 should be in restored list"
|
||||
|
||||
# Verify version was DOWNGRADED from 2.x back to 1.x
|
||||
assert final["urllib3"] == "1.26.15", \
|
||||
"urllib3 should be downgraded to 1.26.15 (from 2.1.0)"
|
||||
assert final["urllib3"].startswith("1."), \
|
||||
f"urllib3 should be back at 1.x series, got {final['urllib3']}"
|
||||
204
tests/common/pip_util/test_full_workflow_integration.py
Normal file
@ -0,0 +1,204 @@
|
||||
"""
|
||||
Test full workflow integration (Priority 1)
|
||||
|
||||
Tests the complete uninstall → install → restore workflow
|
||||
"""
|
||||
|
||||
import json
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def workflow_policy(temp_policy_dir):
|
||||
"""Create policy for full workflow test"""
|
||||
policy_content = {
|
||||
"target-package": {
|
||||
"uninstall": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "conflicting-pkg"
|
||||
},
|
||||
"target": "conflicting-pkg",
|
||||
"reason": "Conflicts with target-package"
|
||||
}
|
||||
],
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["numpy", "pandas"]
|
||||
}
|
||||
]
|
||||
},
|
||||
"critical-package": {
|
||||
"restore": [
|
||||
{
|
||||
"target": "critical-package",
|
||||
"version": "1.2.3",
|
||||
"reason": "Critical package must be 1.2.3"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_workflow_subprocess(monkeypatch):
|
||||
"""Mock subprocess for workflow test"""
|
||||
call_sequence = []
|
||||
|
||||
# Initial environment: conflicting-pkg, numpy, pandas, critical-package
|
||||
installed_packages = {
|
||||
"conflicting-pkg": "1.0.0",
|
||||
"numpy": "1.26.0",
|
||||
"pandas": "2.0.0",
|
||||
"critical-package": "1.2.3"
|
||||
}
|
||||
|
||||
def mock_run(cmd, **kwargs):
|
||||
call_sequence.append(cmd)
|
||||
|
||||
# pip freeze
|
||||
if "freeze" in cmd:
|
||||
output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
|
||||
return subprocess.CompletedProcess(cmd, 0, output, "")
|
||||
|
||||
# pip uninstall
|
||||
if "uninstall" in cmd:
|
||||
# Remove conflicting-pkg
|
||||
if "conflicting-pkg" in cmd:
|
||||
installed_packages.pop("conflicting-pkg", None)
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
# pip install target-package (deletes critical-package)
|
||||
if "install" in cmd and "target-package" in cmd:
|
||||
# Simulate target-package installation deleting critical-package
|
||||
installed_packages.pop("critical-package", None)
|
||||
installed_packages["target-package"] = "1.0.0"
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
# pip install critical-package (restore)
|
||||
if "install" in cmd and "critical-package==1.2.3" in cmd:
|
||||
installed_packages["critical-package"] = "1.2.3"
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
monkeypatch.setattr("subprocess.run", mock_run)
|
||||
return call_sequence, installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_uninstall_install_restore_workflow(
|
||||
workflow_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_workflow_subprocess
|
||||
):
|
||||
"""
|
||||
Test complete uninstall → install → restore workflow
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify the complete workflow executes in correct order:
|
||||
1. ensure_not_installed() removes conflicting packages
|
||||
2. install() applies policies (pin_dependencies)
|
||||
3. ensure_installed() restores deleted packages
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_workflow_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
# Step 1: uninstall - remove conflicting packages
|
||||
removed = batch.ensure_not_installed()
|
||||
|
||||
# Step 2: install target-package with pinned dependencies
|
||||
result = batch.install("target-package")
|
||||
|
||||
# Step 3: restore critical-package that was deleted
|
||||
restored = batch.ensure_installed()
|
||||
|
||||
# Verify Step 1: conflicting-pkg was removed
|
||||
assert "conflicting-pkg" in removed
|
||||
|
||||
# Verify Step 2: target-package was installed with pinned dependencies
|
||||
assert result is True
|
||||
# Check that pip install was called with pinned packages
|
||||
install_calls = [cmd for cmd in call_sequence if "install" in cmd and "target-package" in cmd]
|
||||
assert len(install_calls) > 0
|
||||
install_cmd = install_calls[0]
|
||||
assert "target-package" in install_cmd
|
||||
assert "numpy==1.26.0" in install_cmd
|
||||
assert "pandas==2.0.0" in install_cmd
|
||||
|
||||
# Verify Step 3: critical-package was restored
|
||||
assert "critical-package" in restored
|
||||
|
||||
# Verify final state
|
||||
assert "conflicting-pkg" not in installed_packages
|
||||
assert "critical-package" in installed_packages
|
||||
assert installed_packages["critical-package"] == "1.2.3"
|
||||
assert "target-package" in installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_cache_invalidation_across_workflow(
|
||||
workflow_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_workflow_subprocess
|
||||
):
|
||||
"""
|
||||
Test cache is correctly refreshed at each workflow step
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that the cache is invalidated and refreshed after each
|
||||
operation (uninstall, install, restore) to reflect current state.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_workflow_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
# Initial cache state
|
||||
cache1 = batch._get_installed_packages()
|
||||
assert "conflicting-pkg" in cache1
|
||||
assert "critical-package" in cache1
|
||||
|
||||
# After uninstall
|
||||
removed = batch.ensure_not_installed()
|
||||
cache2 = batch._get_installed_packages()
|
||||
assert "conflicting-pkg" not in cache2 # Removed
|
||||
|
||||
# After install (critical-package gets deleted by target-package)
|
||||
batch.install("target-package")
|
||||
cache3 = batch._get_installed_packages()
|
||||
assert "target-package" in cache3 # Added
|
||||
assert "critical-package" not in cache3 # Deleted by target-package
|
||||
|
||||
# After restore
|
||||
restored = batch.ensure_installed()
|
||||
cache4 = batch._get_installed_packages()
|
||||
assert "critical-package" in cache4 # Restored
|
||||
|
||||
# Verify cache was refreshed at each step
|
||||
assert cache1 != cache2 # Changed after uninstall
|
||||
assert cache2 != cache3 # Changed after install
|
||||
assert cache3 != cache4 # Changed after restore
|
||||
216
tests/common/pip_util/test_pin_failure_retry.py
Normal file
@ -0,0 +1,216 @@
|
||||
"""
|
||||
Test pin failure and retry logic (Priority 1)
|
||||
|
||||
Tests that installation with pinned dependencies can retry without pins on failure
|
||||
"""
|
||||
|
||||
import json
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def retry_policy(temp_policy_dir):
|
||||
"""Create policy with retry_without_pin"""
|
||||
policy_content = {
|
||||
"new-pkg": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["numpy", "pandas"],
|
||||
"on_failure": "retry_without_pin"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_retry_subprocess(monkeypatch):
|
||||
"""Mock subprocess that fails with pins, succeeds without"""
|
||||
call_sequence = []
|
||||
attempt_count = [0]
|
||||
|
||||
installed_packages = {
|
||||
"numpy": "1.26.0",
|
||||
"pandas": "2.0.0"
|
||||
}
|
||||
|
||||
def mock_run(cmd, **kwargs):
|
||||
call_sequence.append(cmd)
|
||||
|
||||
# pip freeze
|
||||
if "freeze" in cmd:
|
||||
output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
|
||||
return subprocess.CompletedProcess(cmd, 0, output, "")
|
||||
|
||||
# pip install
|
||||
if "install" in cmd and "new-pkg" in cmd:
|
||||
attempt_count[0] += 1
|
||||
|
||||
# First attempt with pins - FAIL
|
||||
if attempt_count[0] == 1 and "numpy==1.26.0" in cmd and "pandas==2.0.0" in cmd:
|
||||
raise subprocess.CalledProcessError(1, cmd, "", "Dependency conflict")
|
||||
|
||||
# Second attempt without pins - SUCCESS
|
||||
if attempt_count[0] == 2:
|
||||
installed_packages["new-pkg"] = "1.0.0"
|
||||
# Without pins, versions might change
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
monkeypatch.setattr("subprocess.run", mock_run)
|
||||
return call_sequence, installed_packages, attempt_count
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_pin_failure_retry_without_pin_succeeds(
|
||||
retry_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_retry_subprocess,
|
||||
capture_logs
|
||||
):
|
||||
"""
|
||||
Test retry without pin succeeds after pin failure
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that when installation with pinned dependencies fails,
|
||||
the system automatically retries without pins and succeeds.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages, attempt_count = mock_retry_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("new-pkg")
|
||||
|
||||
# Verify installation succeeded on retry
|
||||
assert result is True
|
||||
|
||||
# Verify two installation attempts were made
|
||||
install_calls = [cmd for cmd in call_sequence if "install" in cmd and "new-pkg" in cmd]
|
||||
assert len(install_calls) == 2
|
||||
|
||||
# First attempt had pins
|
||||
first_call = install_calls[0]
|
||||
assert "new-pkg" in first_call
|
||||
assert "numpy==1.26.0" in first_call
|
||||
assert "pandas==2.0.0" in first_call
|
||||
|
||||
# Second attempt had no pins (just new-pkg)
|
||||
second_call = install_calls[1]
|
||||
assert "new-pkg" in second_call
|
||||
assert "numpy==1.26.0" not in second_call
|
||||
assert "pandas==2.0.0" not in second_call
|
||||
|
||||
# Verify warning log
|
||||
assert any("retrying without pins" in record.message.lower() for record in capture_logs.records)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def fail_policy(temp_policy_dir):
|
||||
"""Create policy with on_failure: fail"""
|
||||
policy_content = {
|
||||
"pytorch-addon": {
|
||||
"apply_all_matches": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "installed",
|
||||
"package": "torch",
|
||||
"spec": ">=2.0.0"
|
||||
},
|
||||
"type": "pin_dependencies",
|
||||
"pinned_packages": ["torch", "torchvision", "torchaudio"],
|
||||
"on_failure": "fail"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_fail_subprocess(monkeypatch):
|
||||
"""Mock subprocess that always fails"""
|
||||
call_sequence = []
|
||||
|
||||
installed_packages = {
|
||||
"torch": "2.1.0",
|
||||
"torchvision": "0.16.0",
|
||||
"torchaudio": "2.1.0"
|
||||
}
|
||||
|
||||
def mock_run(cmd, **kwargs):
|
||||
call_sequence.append(cmd)
|
||||
|
||||
# pip freeze
|
||||
if "freeze" in cmd:
|
||||
output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
|
||||
return subprocess.CompletedProcess(cmd, 0, output, "")
|
||||
|
||||
# pip install - ALWAYS FAIL
|
||||
if "install" in cmd and "pytorch-addon" in cmd:
|
||||
raise subprocess.CalledProcessError(1, cmd, "", "Installation failed")
|
||||
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
monkeypatch.setattr("subprocess.run", mock_run)
|
||||
return call_sequence, installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_pin_failure_with_fail_raises_exception(
|
||||
fail_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_fail_subprocess,
|
||||
capture_logs
|
||||
):
|
||||
"""
|
||||
Test exception is raised when on_failure is "fail"
|
||||
|
||||
Priority: 1 (Essential)
|
||||
|
||||
Purpose:
|
||||
Verify that when on_failure is set to "fail", installation
|
||||
failure with pinned dependencies raises an exception and
|
||||
does not retry.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_fail_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
# Should raise exception
|
||||
with pytest.raises(subprocess.CalledProcessError):
|
||||
batch.install("pytorch-addon")
|
||||
|
||||
# Verify only one installation attempt was made (no retry)
|
||||
install_calls = [cmd for cmd in call_sequence if "install" in cmd and "pytorch-addon" in cmd]
|
||||
assert len(install_calls) == 1
|
||||
|
||||
# Verify it had pins
|
||||
install_cmd = install_calls[0]
|
||||
assert "pytorch-addon" in install_cmd
|
||||
assert "torch==2.1.0" in install_cmd
|
||||
assert "torchvision==0.16.0" in install_cmd
|
||||
assert "torchaudio==2.1.0" in install_cmd
|
||||
139
tests/common/pip_util/test_platform_conditions.py
Normal file
@ -0,0 +1,139 @@
|
||||
"""
|
||||
Test platform-specific conditions (Priority 2)
|
||||
|
||||
Tests OS and GPU detection for conditional policies
|
||||
"""
|
||||
|
||||
import json
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def platform_policy(temp_policy_dir):
|
||||
"""Create policy with platform conditions"""
|
||||
policy_content = {
|
||||
"onnxruntime": {
|
||||
"apply_first_match": [
|
||||
{
|
||||
"condition": {
|
||||
"type": "platform",
|
||||
"os": "linux",
|
||||
"has_gpu": True
|
||||
},
|
||||
"type": "replace",
|
||||
"replacement": "onnxruntime-gpu"
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
|
||||
policy_file = temp_policy_dir / "pip-policy.json"
|
||||
policy_file.write_text(json.dumps(policy_content, indent=2))
|
||||
return policy_file
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_platform_subprocess(monkeypatch):
|
||||
"""Mock subprocess for platform test"""
|
||||
call_sequence = []
|
||||
installed_packages = {}
|
||||
|
||||
def mock_run(cmd, **kwargs):
|
||||
call_sequence.append(cmd)
|
||||
|
||||
# pip freeze
|
||||
if "freeze" in cmd:
|
||||
output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
|
||||
return subprocess.CompletedProcess(cmd, 0, output, "")
|
||||
|
||||
# pip install
|
||||
if "install" in cmd:
|
||||
if "onnxruntime-gpu" in cmd:
|
||||
installed_packages["onnxruntime-gpu"] = "1.0.0"
|
||||
elif "onnxruntime" in cmd:
|
||||
installed_packages["onnxruntime"] = "1.0.0"
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
return subprocess.CompletedProcess(cmd, 0, "", "")
|
||||
|
||||
monkeypatch.setattr("subprocess.run", mock_run)
|
||||
return call_sequence, installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_linux_gpu_uses_gpu_package(
|
||||
platform_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_platform_subprocess,
|
||||
mock_platform_linux,
|
||||
mock_torch_cuda_available
|
||||
):
|
||||
"""
|
||||
Test GPU-specific package on Linux + GPU
|
||||
|
||||
Priority: 2 (Important)
|
||||
|
||||
Purpose:
|
||||
Verify that platform-conditional policies correctly detect
|
||||
Linux + GPU and install the appropriate package variant.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_platform_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("onnxruntime")
|
||||
|
||||
# Verify installation succeeded
|
||||
assert result is True
|
||||
|
||||
# Verify GPU version was installed
|
||||
install_calls = [cmd for cmd in call_sequence if "install" in cmd]
|
||||
assert any("onnxruntime-gpu" in str(cmd) for cmd in install_calls)
|
||||
assert "onnxruntime-gpu" in installed_packages
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_windows_no_gpu_uses_cpu_package(
|
||||
platform_policy,
|
||||
mock_manager_util,
|
||||
mock_context,
|
||||
mock_platform_subprocess,
|
||||
mock_platform_windows,
|
||||
mock_torch_cuda_unavailable
|
||||
):
|
||||
"""
|
||||
Test CPU package on Windows + No GPU
|
||||
|
||||
Priority: 2 (Important)
|
||||
|
||||
Purpose:
|
||||
Verify that when platform conditions are not met,
|
||||
the original package is installed without replacement.
|
||||
"""
|
||||
import sys
|
||||
# Path setup handled by conftest.py
|
||||
|
||||
from comfyui_manager.common.pip_util import PipBatch
|
||||
|
||||
call_sequence, installed_packages = mock_platform_subprocess
|
||||
|
||||
with PipBatch() as batch:
|
||||
result = batch.install("onnxruntime")
|
||||
|
||||
# Verify installation succeeded
|
||||
assert result is True
|
||||
|
||||
# Verify CPU version was installed (no GPU replacement)
|
||||
install_calls = [cmd for cmd in call_sequence if "install" in cmd]
|
||||
assert any("onnxruntime" in str(cmd) for cmd in install_calls)
|
||||
assert "onnxruntime-gpu" not in str(call_sequence)
|
||||
assert "onnxruntime" in installed_packages
|
||||
assert "onnxruntime-gpu" not in installed_packages
|
||||
180
tests/common/pip_util/test_policy_priority.py
Normal file
@ -0,0 +1,180 @@
"""
Test policy priority and conflicts (Priority 2)

Tests that user policies override base policies correctly
"""

import json
import subprocess
from pathlib import Path

import pytest


@pytest.fixture
def conflicting_policies(temp_policy_dir, temp_user_policy_dir):
    """Create conflicting base and user policies"""
    # Base policy
    base_content = {
        "numpy": {
            "apply_first_match": [
                {
                    "type": "skip",
                    "reason": "Base policy skip"
                }
            ]
        }
    }
    base_file = temp_policy_dir / "pip-policy.json"
    base_file.write_text(json.dumps(base_content, indent=2))

    # User policy (should override)
    user_content = {
        "numpy": {
            "apply_first_match": [
                {
                    "type": "force_version",
                    "version": "1.26.0",
                    "reason": "User override"
                }
            ]
        }
    }
    user_file = temp_user_policy_dir / "pip-policy.user.json"
    user_file.write_text(json.dumps(user_content, indent=2))

    return base_file, user_file


@pytest.mark.unit
def test_user_policy_overrides_base_policy(
    conflicting_policies,
    mock_manager_util,
    mock_context,
    mock_subprocess_success
):
    """
    Test user policy completely replaces base policy

    Priority: 2 (Important)

    Purpose:
        Verify that user policy completely overrides base policy
        at the package level (not section-level merge).
    """
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import get_pip_policy

    policy = get_pip_policy()

    # Verify user policy replaced base policy
    assert "numpy" in policy
    assert "apply_first_match" in policy["numpy"]
    assert len(policy["numpy"]["apply_first_match"]) == 1

    # Should be force_version (user), not skip (base)
    assert policy["numpy"]["apply_first_match"][0]["type"] == "force_version"
    assert policy["numpy"]["apply_first_match"][0]["version"] == "1.26.0"

    # Base policy skip should be completely gone
    assert not any(
        item["type"] == "skip"
        for item in policy["numpy"]["apply_first_match"]
    )


@pytest.fixture
def first_match_policy(temp_policy_dir):
    """Create policy with multiple apply_first_match entries"""
    policy_content = {
        "pkg": {
            "apply_first_match": [
                {
                    "condition": {
                        "type": "installed",
                        "package": "numpy"
                    },
                    "type": "force_version",
                    "version": "1.0"
                },
                {
                    "type": "force_version",
                    "version": "2.0"
                },
                {
                    "type": "skip"
                }
            ]
        }
    }

    policy_file = temp_policy_dir / "pip-policy.json"
    policy_file.write_text(json.dumps(policy_content, indent=2))
    return policy_file


@pytest.fixture
def mock_first_match_subprocess(monkeypatch):
    """Mock subprocess for first match test"""
    call_sequence = []

    installed_packages = {
        "numpy": "1.26.0"
    }

    def mock_run(cmd, **kwargs):
        call_sequence.append(cmd)

        # pip freeze
        if "freeze" in cmd:
            output = "\n".join([f"{pkg}=={ver}" for pkg, ver in installed_packages.items()])
            return subprocess.CompletedProcess(cmd, 0, output, "")

        # pip install
        if "install" in cmd and "pkg" in cmd:
            if "pkg==1.0" in cmd:
                installed_packages["pkg"] = "1.0"
            return subprocess.CompletedProcess(cmd, 0, "", "")

        return subprocess.CompletedProcess(cmd, 0, "", "")

    monkeypatch.setattr("subprocess.run", mock_run)
    return call_sequence, installed_packages


@pytest.mark.integration
def test_first_match_stops_at_first_satisfied(
    first_match_policy,
    mock_manager_util,
    mock_context,
    mock_first_match_subprocess
):
    """
    Test apply_first_match stops at first satisfied condition

    Priority: 2 (Important)

    Purpose:
        Verify that in apply_first_match, only the first policy
        with a satisfied condition is executed (exclusive execution).
    """
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    call_sequence, installed_packages = mock_first_match_subprocess

    with PipBatch() as batch:
        result = batch.install("pkg")

    # Verify installation succeeded
    assert result is True

    # First condition satisfied (numpy installed), so version 1.0 applied
    install_calls = [cmd for cmd in call_sequence if "install" in cmd and "pkg" in cmd]
    assert len(install_calls) > 0
    assert "pkg==1.0" in install_calls[0]
    assert "pkg==2.0" not in str(call_sequence)  # Second policy not applied
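The exclusive-execution behaviour asserted above can be summarised in a few lines. The following is an illustrative sketch, not the shipped `PipBatch` code; the helper name and signature are hypothetical and only mirror the ordering semantics the test expects.

```python
# Illustrative sketch only, not the shipped pip_util implementation.
# Entries are scanned in order and only the first one whose condition
# holds is applied; later entries are ignored.
def first_match(entries, installed):
    """Return the single policy entry to apply, or None if nothing matches."""
    for entry in entries:
        condition = entry.get("condition")
        if condition is None:
            return entry  # unconditional entry always matches
        if condition.get("type") == "installed" and condition.get("package") in installed:
            return entry
        # other condition types (platform, ComfyUI version, ...) would go here
    return None


entries = [
    {"condition": {"type": "installed", "package": "numpy"}, "type": "force_version", "version": "1.0"},
    {"type": "force_version", "version": "2.0"},
    {"type": "skip"},
]
# With numpy installed the first entry wins, so "pkg==1.0" would be requested.
print(first_match(entries, installed={"numpy": "1.26.0"}))
```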
178
tests/common/pip_util/test_unit_parsing.py
Normal file
@ -0,0 +1,178 @@
"""
Unit tests for package spec parsing and condition evaluation

Tests core utility functions
"""

import subprocess
from pathlib import Path

import pytest


@pytest.mark.unit
def test_parse_package_spec_name_only(mock_manager_util, mock_context):
    """Test parsing package name without version"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    name, spec = batch._parse_package_spec("numpy")

    assert name == "numpy"
    assert spec is None


@pytest.mark.unit
def test_parse_package_spec_exact_version(mock_manager_util, mock_context):
    """Test parsing package with exact version"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    name, spec = batch._parse_package_spec("numpy==1.26.0")

    assert name == "numpy"
    assert spec == "==1.26.0"


@pytest.mark.unit
def test_parse_package_spec_min_version(mock_manager_util, mock_context):
    """Test parsing package with minimum version"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    name, spec = batch._parse_package_spec("pandas>=2.0.0")

    assert name == "pandas"
    assert spec == ">=2.0.0"


@pytest.mark.unit
def test_parse_package_spec_hyphenated_name(mock_manager_util, mock_context):
    """Test parsing package with hyphens"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    name, spec = batch._parse_package_spec("scikit-learn>=1.0")

    assert name == "scikit-learn"
    assert spec == ">=1.0"


@pytest.mark.unit
def test_evaluate_condition_none(mock_manager_util, mock_context):
    """Test None condition always returns True"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    result = batch._evaluate_condition(None, "numpy", {})

    assert result is True


@pytest.mark.unit
def test_evaluate_condition_installed_package_exists(mock_manager_util, mock_context):
    """Test installed condition when package exists"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    condition = {"type": "installed", "package": "numpy"}
    installed = {"numpy": "1.26.0"}

    result = batch._evaluate_condition(condition, "numba", installed)

    assert result is True


@pytest.mark.unit
def test_evaluate_condition_installed_package_not_exists(mock_manager_util, mock_context):
    """Test installed condition when package doesn't exist"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    condition = {"type": "installed", "package": "numpy"}
    installed = {}

    result = batch._evaluate_condition(condition, "numba", installed)

    assert result is False


@pytest.mark.unit
def test_evaluate_condition_platform_os_match(
    mock_manager_util,
    mock_context,
    mock_platform_linux
):
    """Test platform OS condition matching"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    condition = {"type": "platform", "os": "linux"}

    result = batch._evaluate_condition(condition, "package", {})

    assert result is True


@pytest.mark.unit
def test_evaluate_condition_platform_gpu_available(
    mock_manager_util,
    mock_context,
    mock_torch_cuda_available
):
    """Test GPU detection when available"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    condition = {"type": "platform", "has_gpu": True}

    result = batch._evaluate_condition(condition, "package", {})

    assert result is True


@pytest.mark.unit
def test_evaluate_condition_platform_gpu_not_available(
    mock_manager_util,
    mock_context,
    mock_torch_cuda_unavailable
):
    """Test GPU detection when not available"""
    import sys
    # Path setup handled by conftest.py

    from comfyui_manager.common.pip_util import PipBatch

    batch = PipBatch()
    condition = {"type": "platform", "has_gpu": True}

    result = batch._evaluate_condition(condition, "package", {})

    assert result is False
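Taken together, these unit tests pin down two small behaviours: a spec string splits into `(name, spec)`, and conditions of type `installed` / `platform` evaluate against the current environment. Below is a rough, self-contained sketch of that behaviour, assuming GPU detection via `torch.cuda.is_available()`; the real `PipBatch._parse_package_spec` and `_evaluate_condition` may differ in detail.

```python
# Rough sketch of the behaviour pinned down by the unit tests above; the real
# methods live on PipBatch in pip_util.py and are not reproduced here.
import platform
import re


def parse_package_spec(spec: str):
    """Split 'name[op version]' into (name, spec_or_None), e.g. 'pandas>=2.0.0'."""
    m = re.match(r"^([A-Za-z0-9._-]+)\s*(.*)$", spec.strip())
    name, rest = m.group(1), m.group(2).strip()
    return name, (rest or None)


def evaluate_condition(condition, package, installed):
    """None means 'always applies'; 'installed' and 'platform' mirror the tests."""
    if condition is None:
        return True
    if condition["type"] == "installed":
        return condition["package"] in installed
    if condition["type"] == "platform":
        if "os" in condition and platform.system().lower() != condition["os"]:
            return False
        if "has_gpu" in condition:
            try:
                import torch  # GPU detection via torch is an assumption
                has_gpu = torch.cuda.is_available()
            except ImportError:
                has_gpu = False
            return has_gpu == condition["has_gpu"]
        return True
    return False


print(parse_package_spec("scikit-learn>=1.0"))   # ('scikit-learn', '>=1.0')
print(evaluate_condition({"type": "installed", "package": "numpy"}, "numba", {"numpy": "1.26.0"}))
```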
41
tests/pytest.ini
Normal file
@ -0,0 +1,41 @@
[pytest]
# Global pytest configuration for comfyui-manager tests

# Test discovery
python_files = test_*.py
python_classes = Test*
python_functions = test_*

# Add comfyui_manager to Python path
pythonpath = ../comfyui_manager

# Output options
addopts =
    # Verbose output
    -v
    # Show extra test summary info
    -ra
    # Show local variables in tracebacks
    --showlocals
    # Strict markers (fail on unknown markers)
    --strict-markers

# Markers for test categorization
markers =
    unit: Unit tests for individual functions
    integration: Integration tests for policy application
    e2e: End-to-end workflow tests
    slow: Tests that take significant time
    requires_network: Tests that require network access

# Logging
log_cli = false
log_cli_level = INFO
log_cli_format = %(asctime)s [%(levelname)8s] %(message)s
log_cli_date_format = %Y-%m-%d %H:%M:%S

# Warnings
filterwarnings =
    error
    ignore::DeprecationWarning
    ignore::PendingDeprecationWarning
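The marker categories declared above can also be selected programmatically rather than via the command line; a small optional sketch using `pytest.main`, equivalent to running `pytest -m unit -q` from the tests directory:

```python
# Optional convenience sketch, not part of the committed suite.
import pytest

if __name__ == "__main__":
    # Run only the fast unit tests and exit with pytest's return code.
    raise SystemExit(pytest.main(["-m", "unit", "-q"]))
```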
19
tests/requirements.txt
Normal file
@ -0,0 +1,19 @@
# Test Dependencies for pip_util.py
# Install in isolated venv to prevent environment corruption

# Testing Framework
pytest>=7.4.0
pytest-cov>=4.1.0
pytest-mock>=3.11.0

# Code Quality
flake8>=6.0.0
black>=23.0.0
mypy>=1.5.0

# Dependencies from main project
packaging>=23.0

# Mock and testing utilities
responses>=0.23.0
freezegun>=1.2.0
75
tests/setup_test_env.sh
Executable file
@ -0,0 +1,75 @@
#!/bin/bash

# Test Environment Setup Script for pip_util.py
# Creates isolated venv to prevent environment corruption

set -e

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
VENV_DIR="${SCRIPT_DIR}/test_venv"

echo "=================================================="
echo "pip_util.py Test Environment Setup"
echo "=================================================="
echo ""

# Check Python version
PYTHON_CMD=""
if command -v python3 &> /dev/null; then
    PYTHON_CMD="python3"
elif command -v python &> /dev/null; then
    PYTHON_CMD="python"
else
    echo "❌ Error: Python not found. Please install Python 3.8 or higher."
    exit 1
fi

PYTHON_VERSION=$($PYTHON_CMD --version 2>&1 | awk '{print $2}')
echo "✓ Found Python: $PYTHON_VERSION"

# Remove existing venv if present
if [ -d "$VENV_DIR" ]; then
    echo ""
    read -p "⚠️  Existing test venv found. Remove and recreate? (y/N): " -n 1 -r
    echo
    if [[ $REPLY =~ ^[Yy]$ ]]; then
        echo "🗑️  Removing existing venv..."
        rm -rf "$VENV_DIR"
    else
        echo "Keeping existing venv. Skipping creation."
        exit 0
    fi
fi

# Create venv
echo ""
echo "📦 Creating virtual environment..."
$PYTHON_CMD -m venv "$VENV_DIR"

# Activate venv
echo "🔌 Activating virtual environment..."
source "${VENV_DIR}/bin/activate"

# Upgrade pip
echo "⬆️  Upgrading pip..."
pip install --upgrade pip

# Install test dependencies
echo ""
echo "📚 Installing test dependencies..."
pip install -r "${SCRIPT_DIR}/requirements.txt"

echo ""
echo "=================================================="
echo "✅ Test environment setup complete!"
echo "=================================================="
echo ""
echo "To activate the test environment:"
echo "  source ${VENV_DIR}/bin/activate"
echo ""
echo "To run tests:"
echo "  pytest"
echo ""
echo "To deactivate:"
echo "  deactivate"
echo ""