System Utils
`detect_model_type(model: str) -> Optional[str]`
Detect the model file type from a Hugging Face model repository.
This function attempts to determine whether a Hugging Face model repository contains GGUF or safetensors files by querying the repository file list.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model` | `str` | Model identifier, typically in the format `"owner/repo"` for Hugging Face repos. | required |
Returns:

| Type | Description |
|---|---|
| `Optional[str]` | A string indicating the detected model type (`"gguf"` or `"safetensors"`), or `None` if the type could not be determined or if the `huggingface_hub` library is not available. |
Source code in pita/utils/system_utils.py
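The core of such detection is classifying a repository's file list by extension. The sketch below shows that step as a pure helper; `detect_type_from_files` is a hypothetical name, and the actual library function first fetches the file list from the Hugging Face Hub (e.g. via `huggingface_hub.list_repo_files`), which is omitted here to keep the example self-contained:

```python
from typing import List, Optional

def detect_type_from_files(files: List[str]) -> Optional[str]:
    """Classify a repo as "gguf" or "safetensors" from its file names.

    Hypothetical helper illustrating the extension check; not the
    library's actual implementation.
    """
    if any(name.endswith(".gguf") for name in files):
        return "gguf"
    if any(name.endswith(".safetensors") for name in files):
        return "safetensors"
    # Neither format found: type cannot be determined.
    return None
```

GGUF is checked first here; a repository shipping both formats would need an explicit precedence rule, which the sketch resolves arbitrarily in favor of GGUF.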
`get_gpu_vram_usage_mb() -> Optional[int]`
Get the current VRAM usage (in MiB) across all NVIDIA GPUs.
This function uses nvidia-smi to query current GPU memory usage and returns the sum across all GPUs if multiple are present.
Returns:

| Type | Description |
|---|---|
| `Optional[int]` | Total current VRAM usage in MiB across all GPUs, or `None` if `nvidia-smi` is not available or an error occurs. |
Source code in pita/utils/system_utils.py
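A minimal sketch of the approach described above, assuming `nvidia-smi` supports the standard `--query-gpu=memory.used` CSV query (it does on current drivers); the parsing is split into its own function so it can be exercised without a GPU. This is an illustration, not the library's exact source:

```python
import subprocess
from typing import Optional

def parse_vram_usage(csv_output: str) -> int:
    """Sum per-GPU memory.used values (MiB) from nvidia-smi CSV output,
    one integer per line."""
    return sum(int(line.strip()) for line in csv_output.splitlines() if line.strip())

def get_gpu_vram_usage_mb() -> Optional[int]:
    """Query nvidia-smi and sum current memory usage across all GPUs."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        return parse_vram_usage(out)
    except (FileNotFoundError, subprocess.CalledProcessError, ValueError):
        # nvidia-smi missing, failed, or produced unparseable output.
        return None
```

With two GPUs reporting `1024` and `2048` MiB, `parse_vram_usage("1024\n2048\n")` returns `3072`.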
`get_total_vram() -> Union[int, str]`
Get the total VRAM (in MiB) of the primary GPU on the system.
This function attempts to detect VRAM across different platforms and GPU types:

- NVIDIA GPUs: Uses nvidia-smi (Windows & Linux)
- AMD GPUs: Uses ROCm-smi (Linux)
- Windows Generic: Uses PowerShell WMI queries
Returns:

| Type | Description |
|---|---|
| `Union[int, str]` | Total VRAM in MiB (int) if successfully detected, or an error message (str) if detection fails or drivers are not installed. |
Source code in pita/utils/system_utils.py
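The NVIDIA path above can be sketched as follows; the ROCm and PowerShell WMI fallbacks are only indicated in a comment, and the error-message strings are placeholders rather than the library's actual wording. Note the mixed return type: callers must check `isinstance(result, int)` before doing arithmetic.

```python
import subprocess
from typing import Union

def parse_total_vram(csv_output: str) -> int:
    """Return memory.total (MiB) of the first listed GPU, treated as
    the primary GPU."""
    return int(csv_output.splitlines()[0].strip())

def get_total_vram() -> Union[int, str]:
    """Detect total VRAM of the primary GPU via nvidia-smi."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        return parse_total_vram(out)
    except FileNotFoundError:
        # nvidia-smi not installed: a full implementation would fall
        # back to rocm-smi (Linux AMD) or a PowerShell WMI query
        # (generic Windows) here.
        return "Could not detect VRAM: no supported GPU tool found."
    except (subprocess.CalledProcessError, ValueError, IndexError):
        return "Could not detect VRAM: GPU query failed."
```

For a 24 GB card, `parse_total_vram("24576\n")` returns `24576`.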