## HWM - Console Commands (Scripts)

This folder contains Laravel Artisan console commands used by the HWM addon.
This README documents the principal operational commands and the import/sync helpers.

Files documented here (path: `Http/V2/Addons/HWM/Scripts`):

- `ArchiveAdmissions.php` — archive admissions with finalised cycles (CLOSED aged, CANCELLED, REJECTED).
- `CleanInvalidAdmissions.php` — mark PENDING admissions as EXPIRED (no-show or expired token), then archive them.
- `SyncPhysiciansFromCredencializa.php` — synchronise physicians catalog from the external Credencializa service into the local DB.
- `ImportPhysiciansFromSpreadsheet.php` — import physicians from an XLSX into directory_users, physicians, and physician_hospitals.
- `ImportPromoterHospitalsFromSpreadsheet.php` — import promoter ↔ hospital mappings from XLSX into `pro_promoter_hospitals`.
- `ImportPromotersUsersFromSpreadsheet.php` — create PROMOTOR users from XLSX and link them to `pro_promoters`.

---

## Common notes / environment

- These commands are Laravel console commands (classes extend `Illuminate\Console\Command`).
- Register a command in `app/Console/Kernel.php` by adding the class to the `$commands` array, e.g.:

  ```php
  protected $commands = [
      \App\Console\Commands\ArchiveAdmissions::class,
      // ...
  ];
  ```

- Many commands use `HwmConfig` constants (for timezone, archive delay, system email). Example config keys used by these scripts:
  - `HwmConfig::$HWM_TIMEZONE` — timezone used to compute cutoffs.
  - `HwmConfig::$HWM_ARCHIVE_AFTER_DAYS` — days after CLOSED to archive.
  - `HwmConfig::$HWM_SYSTEM_EMAIL` — system email used for archived/emitted events.

- Several scripts call `@ini_set('memory_limit','1024M')` and `@set_time_limit(0)` because they may process large files or many rows.

- Some scripts contain a `DB_IS_UTC` constant. If your DB stores DATETIME in UTC, set it to `true` in the command class so cutoffs convert from local TZ to UTC before querying.
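A minimal sketch of that cutoff conversion, using plain `DateTime` (the actual commands may use Carbon; the function name here is illustrative):

```php
<?php
// Illustrative cutoff computation: "archive everything finalised more than
// $afterDays days ago", expressed in the local timezone and converted to
// UTC when DB_IS_UTC is true (i.e. DATETIME columns are stored in UTC).
function computeCutoff(string $tz, int $afterDays, bool $dbIsUtc): string
{
    $cutoff = new DateTime('now', new DateTimeZone($tz));
    $cutoff->modify("-{$afterDays} days");
    if ($dbIsUtc) {
        $cutoff->setTimezone(new DateTimeZone('UTC'));
    }
    return $cutoff->format('Y-m-d H:i:s');
}
```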

- Most import/sync commands support `--dry-run` to preview changes without committing to DB. Prefer running with `--dry-run` first.

---

## 1) ArchiveAdmissions

- File: `ArchiveAdmissions.php`
- Signature: `hwm:archive-admissions {--dry-run}`

Purpose
- Archive admissions that have reached an end state or are otherwise finalised.
- Targets:
  - CLOSED admissions whose billing close event is older than the configured cutoff (prefers `BILLING_CLOSED` event_time; falls back to `admissions.updated_at`).
  - CANCELLED and REJECTED admissions (archived immediately).

How it works
- Computes cutoff using `HwmConfig::$HWM_TIMEZONE` (default `UTC`) and `HwmConfig::$HWM_ARCHIVE_AFTER_DAYS` (default 30).
- Optionally converts the cutoff to DB timezone if `DB_IS_UTC` is `true`.
- For each candidate admission it calls `EventService::archiveNow(...)` with a reason (CANCELLED, REJECTED, or CLOSED_AGED) and the configured `HWM_SYSTEM_EMAIL` as actor.

Installation / register
- Add `\App\Console\Commands\ArchiveAdmissions::class` to `app/Console/Kernel.php` `$commands` array.

Run examples
- Dry-run (no DB changes):
  ```bash
  php artisan hwm:archive-admissions --dry-run
  ```
- Execute (perform archiving):
  ```bash
  php artisan hwm:archive-admissions
  ```

Scheduling recommendation
- Run daily in an off-peak hour (e.g., 02:00). Example crontab entry:
  ```cron
  0 2 * * * cd /path/to/project && php artisan hwm:archive-admissions >> storage/logs/hwm-archive.log 2>&1
  ```
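If you prefer Laravel's scheduler over a raw crontab, an equivalent entry in `app/Console/Kernel.php` would look roughly like this (a sketch, assuming the command is registered):

```php
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    $schedule->command('hwm:archive-admissions')
        ->dailyAt('02:00')
        ->appendOutputTo(storage_path('logs/hwm-archive.log'));
}
```

With this approach you only need a single crontab entry for `php artisan schedule:run` instead of one per command.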

Notes & Troubleshooting
- Ensure `HwmConfig::$HWM_SYSTEM_EMAIL` is set; otherwise events/archives may be recorded without an attributable actor.
- If many rows are processed, check DB load and consider running during low-traffic windows.
- Errors are logged to console; run with `--dry-run` first to inspect candidate counts.

---

## 2) CleanInvalidAdmissions

- File: `CleanInvalidAdmissions.php`
- Signature: `hwm:clean-invalid-admissions {--dry-run}`

Purpose
- Detect invalid PENDING admissions, transition them to `EXPIRED`, and archive them.
- Two categories are handled:
  - No-show: admissions with `admission_datetime` within today's window (local timezone) whose visit never took place.
  - Token expired: admissions whose QR token `expires_at` falls within today's window.

How it works
- Uses `HwmConfig::$HWM_TIMEZONE` to compute the current day window (start-of-day to end-of-day).
- Emits an `ADMISSION_EXPIRED` event (via `EventService`), transitions state to `EXPIRED` (via `StateService`) and archives the admission.
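The day window can be sketched with plain `DateTime` (the command may use Carbon internally; this is illustrative):

```php
<?php
// Compute today's [start-of-day, end-of-day] window in the configured
// timezone, e.g. HwmConfig::$HWM_TIMEZONE.
function dayWindow(string $tz): array
{
    $start = new DateTime('today', new DateTimeZone($tz)); // 00:00:00 local
    $end   = (clone $start)->modify('+1 day -1 second');   // 23:59:59 local
    return [$start->format('Y-m-d H:i:s'), $end->format('Y-m-d H:i:s')];
}
```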

Installation / register
- Add `\App\Console\Commands\CleanInvalidAdmissions::class` to `app/Console/Kernel.php` `$commands`.

Run examples
- Dry-run only reporting:
  ```bash
  php artisan hwm:clean-invalid-admissions --dry-run
  ```
- Execute:
  ```bash
  php artisan hwm:clean-invalid-admissions
  ```

Scheduling recommendation
- Run daily shortly after the day boundary (e.g., 00:10) so the day window is stable:
  ```cron
  10 0 * * * cd /path/to/project && php artisan hwm:clean-invalid-admissions >> storage/logs/hwm-clean-invalid.log 2>&1
  ```

Notes & Troubleshooting
- The command checks that the admission is still `PENDING` and not already archived before applying changes.
- Because the command emits events and updates state tables, test in a staging environment first.

---

## 3) SyncPhysiciansFromCredencializa

- File: `SyncPhysiciansFromCredencializa.php`
- Signature:
  ```text
  hwm:sync-physicians-credencializa {--dry-run} {--chunk=500} {--override-professional-id}
  ```

Purpose
- Synchronise the physicians catalog from the external Credencializa service into local reference tables (`ref_directory_users`, `ref_physicians`, `ref_physician_hospitals`).
- Source of truth is the external list; the command creates/updates records accordingly and can deactivate physicians no longer in the source.

Key options
- `--dry-run` — simulate changes without committing.
- `--chunk` — number of rows processed per DB transaction (default 500).
- `--override-professional-id` — if present, overwrite existing `professional_id` (cédula) for physicians.

How it works
- Loads hospitals and specialties into caches.
- Fetches external JSON via `CredencializaService::getPhysiciansDecoded()`.
- For each row it resolves or creates a directory user, matches or creates specialty, upserts physician, and ensures physician↔hospital links.
- Runs post-pass cleanup: deactivates physicians missing from the external feed and soft-deletes hospital links no longer present in the source.
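The chunked pass can be pictured roughly as follows (a sketch, not the actual implementation; `$rows` stands for the decoded external feed and `$chunkSize` for the `--chunk` option):

```php
use Illuminate\Support\Facades\DB;

foreach (array_chunk($rows, $chunkSize) as $chunk) {
    DB::transaction(function () use ($chunk) {
        foreach ($chunk as $row) {
            // 1. resolve or create the directory user
            // 2. match or create the specialty
            // 3. upsert the physician record
            // 4. ensure the physician↔hospital link exists
        }
    });
}
// post-pass: deactivate physicians absent from the feed,
// soft-delete hospital links not present in the source
```

One transaction per chunk keeps individual transactions short while still making each chunk atomic.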

Installation / register
- Add `\App\Console\Commands\SyncPhysiciansFromCredencializa::class` to `app/Console/Kernel.php` `$commands`.

Run examples
- Dry-run to preview:
  ```bash
  php artisan hwm:sync-physicians-credencializa --dry-run
  ```
- Run for real in chunks of 1000:
  ```bash
  php artisan hwm:sync-physicians-credencializa --chunk=1000
  ```

Notes & Troubleshooting
- Ensure the `CredencializaService` is correctly configured and reachable from the app (API credentials, base URL, etc.).
- The command prints a summary to the console and attempts to save a detailed log file in `storage/logs/` describing the operations performed; inspect it after a run.
- If you use `--override-professional-id`, expect updates to the `professional_id` field for existing physicians; this is irreversible without backups.

---

## 4) ImportPhysiciansFromSpreadsheet

- File: `ImportPhysiciansFromSpreadsheet.php`
- Signature: `hwm:import-physicians-xls {path?} {--dry-run} {--chunk=500}`

Purpose
- Import physicians data from an XLSX file into `directory_users`, `physicians`, and `physician_hospitals`.

Important details
- The class reads Excel files with PhpSpreadsheet and expects specific columns (e.g., `NumeroGASS`, `NumeroSAP`, `Nombres`, `ApellidoPaterno`, `ApellidoMaterno`, `UnidadPrincipal`, `CsUnidadPrincipal`, `TituloProfesional`, `Especialidad1`). If a required column is missing, the import aborts.
- It keeps in-memory caches for hospital, specialty, and directory-user lookups, and prefers identifiers (SAP or GASS) when matching existing records.
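A hedged sketch of the kind of header validation described above, using PhpSpreadsheet (the exact required-column list and abort behaviour live in the command class):

```php
use PhpOffice\PhpSpreadsheet\IOFactory;

$required = ['NumeroGASS', 'NumeroSAP', 'Nombres', 'ApellidoPaterno'];

$sheet   = IOFactory::load($path)->getActiveSheet();
$headers = array_map('trim', $sheet->rangeToArray('A1:Z1')[0] ?? []);
$missing = array_diff($required, $headers);

if ($missing !== []) {
    // abort the import, naming the missing columns
    throw new RuntimeException('Missing columns: ' . implode(', ', $missing));
}
```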

Run examples
- Provide a local path to the XLSX or place `Catalogo de Medicos.xlsx` next to the script and run:
  ```bash
  php artisan hwm:import-physicians-xls /path/to/file.xlsx --chunk=500 --dry-run
  ```

Notes
- The default filename attempted is `Catalogo de Medicos.xlsx` located next to the command class. If your spreadsheet uses a different layout, inspect its header row first and adjust it to the expected columns.

---

## 5) ImportPromoterHospitalsFromSpreadsheet

- File: `ImportPromoterHospitalsFromSpreadsheet.php`
- Signature: `hwm:import-promoter-hospitals {path?} {--sheet=Relación Promotor-Hospital} {--dry-run} {--chunk=500}`

Purpose
- Import mappings between promoters and hospitals from a spreadsheet sheet (default "Relación Promotor-Hospital") into `hwm.pro_promoter_hospitals`.

Important details
- Header normalization is flexible (the command maps variations like `código`, `codigo`, `id promotor`, `hospital`, etc.).
- It preloads hospitals and promoters and supports reviving soft-deleted mappings.
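As an illustration of that flexible matching, a hypothetical normalizer (the real alias table lives in the command class):

```php
<?php
// Hypothetical header normalizer: lowercase, strip common accents,
// then map known aliases to canonical field names.
function normalizeHeader(string $header): string
{
    $h = mb_strtolower(trim($header));
    $h = strtr($h, ['á' => 'a', 'é' => 'e', 'í' => 'i', 'ó' => 'o', 'ú' => 'u']);

    $aliases = [
        'codigo'      => 'promoter_code',
        'id promotor' => 'promoter_code',
        'hospital'    => 'hospital_name',
    ];

    return $aliases[$h] ?? $h;
}
```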

Run example
  ```bash
  php artisan hwm:import-promoter-hospitals /path/to/Layout\ Catálogos.xlsx --sheet="Relación Promotor-Hospital" --dry-run
  ```

---

## 6) ImportPromotersUsersFromSpreadsheet

- File: `ImportPromotersUsersFromSpreadsheet.php`
- Signature: `hwm:import-promoters-users-xls {path?} {--sheet=Promotor} {--dry-run} {--chunk=500} {--domain=hospicore.com.mx} {--pwd-suffix=HOSPICORE25} {--saas=0}`

Purpose
- Create or find PROMOTOR users from a spreadsheet, link them to `pro_promoters` by promoter code, optionally assign RBAC roles, and optionally register the users in SaaS.

Important details
- Uses a streaming read filter to process large files in chunks.
- Builds canonical emails from commercial name/name if missing and dedupes email addresses.
- Optionally assigns the `PROM_HOSPITAL_LIAISON` RBAC role if that role exists in DB.
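A hypothetical sketch of the canonical-email derivation (the real slug rules and dedupe logic live in the command; `--domain` supplies the domain):

```php
<?php
// Hypothetical: build a canonical email from a commercial name or full
// name: lowercase, strip common Spanish accents, collapse everything
// non-alphanumeric into dots.
function canonicalEmail(string $name, string $domain): string
{
    $slug = mb_strtolower(trim($name));
    $slug = strtr($slug, ['á'=>'a', 'é'=>'e', 'í'=>'i', 'ó'=>'o', 'ú'=>'u', 'ñ'=>'n']);
    $slug = preg_replace('/[^a-z0-9]+/', '.', $slug); // collapse separators
    return trim($slug, '.') . '@' . $domain;
}
```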

Run example
  ```bash
  php artisan hwm:import-promoters-users-xls /path/to/promoterUsers.xlsx --sheet=Promotor --domain=example.com --dry-run
  ```

---

## Best practices and safety

- Always try `--dry-run` first on import/sync commands to validate candidate counts and see warnings.
- Run heavy imports during maintenance windows or off-peak hours.
- Backup relevant DB tables before running mass-updates or enabling `--override-professional-id`.
- Monitor `storage/logs/` after runs for summary logs produced by the commands.

## Troubleshooting tips

- Missing columns / malformed XLSX: Open the file in Excel and verify header names — the import scripts expect specific column names (see the top of each command file for the required headers).
- Timezone/cutoff issues: Verify `HwmConfig::$HWM_TIMEZONE` and the `DB_IS_UTC` flag inside the command if DB datetimes are stored in UTC.
- External service issues (Credencializa): Ensure service credentials and connectivity are configured; test the service class in tinker or a tiny script to confirm API access.
- Database deadlocks/timeouts: reduce chunk size (e.g., `--chunk=100`) and re-run.

## Contributing small improvements

- Add unit tests for helper functions (e.g., `matchSpecialtyId`, name normalization). Keep the imports idempotent.
- For large imports, consider queueing per-chunk jobs and using job workers instead of long-running Artisan commands.

