+
LPKG is a minimalistic package manager written in Rust, designed for fast and simple software management on Unix-like systems. It emphasizes reproducibility and declarative configuration, leveraging **Nix Flakes** for development and deployment.
---
@@ -82,6 +86,36 @@ Build the project:
cargo build
```
+LPKG ships with tuned Cargo profiles:
+
+* **Dev builds** (`cargo build`) use `opt-level=0`, lots of codegen units, and incremental compilation for quick feedback while hacking.
+* **Release builds** (`cargo build --release`) enable `opt-level=3` (Cargo's equivalent of `-O3`), fat LTO, and `panic = "abort"` for slim, fast binaries.
+* **GraphQL builds** add the server components when you need them:
+
+```bash
+cargo build --features graphql
+```
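
The dev/release split above corresponds roughly to profile settings like the following — a sketch only; the exact values live in the repository's Cargo configuration and may differ:

```toml
# Illustrative profile settings; not a verbatim copy of the committed config.
[profile.dev]
opt-level = 0
incremental = true
codegen-units = 256      # many units -> faster parallel rebuilds

[profile.release]
opt-level = 3            # Cargo's counterpart to -O3
lto = "fat"
panic = "abort"
codegen-units = 1        # fewer units -> better cross-module optimisation
```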
+
+**PGO builds** are a two-step flow using the provided Cargo aliases:
+
+```bash
+# 1) Instrument
+RUSTFLAGS="-Cprofile-generate=target/pgo-data" cargo pgo-instrument
+# run representative workloads to emit *.profraw files under target/pgo-data
+llvm-profdata merge -o target/pgo-data/lpkg.profdata target/pgo-data/*.profraw
+
+# 2) Optimise with the collected profile
+RUSTFLAGS="-Cprofile-use=target/pgo-data/lpkg.profdata -Cllvm-args=-pgo-warn-missing-function" \
+ cargo pgo-build
+```
+
+Regenerate project artefacts (README and SVG logo):
+
+```bash
+cargo run --bin readme_gen
+cargo run --bin logo_gen
+```
+
Run tests:
```bash
@@ -94,11 +128,33 @@ You can also run the project directly in the flake shell:
nix run
```
+## 🕸️ GraphQL API
+
+LPKG now ships a lightweight GraphQL server powered by Actix Web and Juniper.
+
+* Start the server with `cargo run --features graphql --bin graphql_server` (set `LPKG_GRAPHQL_ADDR` to override `127.0.0.1:8080`).
+* Query endpoint: `http://127.0.0.1:8080/graphql`
+* Interactive playground: `http://127.0.0.1:8080/playground`
+
+Example query:
+
+```graphql
+{
+ packages(limit: 5) {
+ name
+ version
+ enableLto
+ }
+ randomJoke {
+ package
+ text
+ }
+}
+```
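
With the server running on the default address, the same query can be sent over plain HTTP. A sketch (the `curl` call is commented out so the snippet works without a live server; adjust the address if `LPKG_GRAPHQL_ADDR` is set):

```shell
# JSON payload mirroring the example query above.
PAYLOAD='{"query":"{ packages(limit: 5) { name version enableLto } }"}'
echo "$PAYLOAD"

# With `cargo run --features graphql --bin graphql_server` running:
# curl -s -X POST http://127.0.0.1:8080/graphql \
#   -H 'Content-Type: application/json' \
#   -d "$PAYLOAD"
```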
+
### AI metadata tooling
-The AI metadata store under `ai/metadata/` comes with a helper CLI to
-validate package records against the JSON schema and regenerate
-`index.json` after adding new entries:
+The AI metadata store under `ai/metadata/` comes with a helper CLI to validate package records against the JSON schema and regenerate `index.json` after adding new entries:
```bash
cargo run --bin metadata_indexer -- --base-dir . validate
@@ -107,10 +163,7 @@ cargo run --bin metadata_indexer -- --base-dir . index
Use `--compact` with `index` if you prefer single-line JSON output.
-To draft metadata for a specific book page, you can run the harvest mode.
-It fetches the XHTML, scrapes the build commands, and emits a schema-
-compliant JSON skeleton (pass `--dry-run` to inspect the result without
-writing to disk):
+To draft metadata for a specific book page, you can run the harvest mode. It fetches the XHTML, scrapes the build commands, and emits a schema-compliant JSON skeleton (pass `--dry-run` to inspect the result without writing to disk):
```bash
cargo run --bin metadata_indexer -- \
@@ -126,8 +179,7 @@ Keep the jhalfs manifests current with:
cargo run --bin metadata_indexer -- --base-dir . refresh
```
-Passing `--books mlfs,blfs` restricts the refresh to specific books, and
-`--force` bypasses the local cache.
+Passing `--books mlfs,blfs` restricts the refresh to specific books, and `--force` bypasses the local cache.
To materialise a Rust module from harvested metadata:
@@ -142,17 +194,15 @@ Add `--overwrite` to regenerate an existing module directory.
## 📚 Documentation
-- [Architecture Overview](docs/ARCHITECTURE.md) – high-level tour of the crate
- layout, binaries, and supporting modules.
-- [Metadata Harvesting Pipeline](docs/METADATA_PIPELINE.md) – how the metadata
- indexer produces and validates the JSON records under `ai/metadata/`.
-- [Package Module Generation](docs/PACKAGE_GENERATION.md) – end-to-end guide
- for converting harvested metadata into Rust modules under `src/pkgs/by_name/`.
-- `ai/notes.md` – scratchpad for ongoing research tasks (e.g., deeper jhalfs
- integration).
+* [Architecture Overview](docs/ARCHITECTURE.md) – high-level tour of the crate layout, binaries, and supporting modules.
+* [Metadata Harvesting Pipeline](docs/METADATA_PIPELINE.md) – how the metadata indexer produces and validates the JSON records under `ai/metadata/`.
+* [Package Module Generation](docs/PACKAGE_GENERATION.md) – end-to-end guide for converting harvested metadata into Rust modules under `src/pkgs/by_name/`.
+* Concept corner: [Nixette](concepts/nixette/README.md) – a NixOS × Gentoo transfemme mash-up dreamed up for fun brand explorations.
+* `ai/notes.md` – scratchpad for ongoing research tasks (e.g., deeper jhalfs integration).
---
## 📄 License
LPKG is licensed under the [MIT License](LICENSE).
+
diff --git a/ai/notes.md b/ai/notes.md
index 8bf1323..51ae92a 100644
--- a/ai/notes.md
+++ b/ai/notes.md
@@ -44,3 +44,79 @@ Open questions:
- How to represent optional post-install steps or multi-phase builds inside the
generated module (additional helper functions vs. raw command arrays).
- Where to store PGO workload hints once the PGO infrastructure is defined.
+
+# Lightweight Networking Rewrite
+
+- Motivation: remove heavy async stacks (tokio + reqwest) from the default
+ feature set to keep clean builds fast and reduce binary size.
+- HTTP stack baseline: [`ureq`](https://github.com/algesten/ureq) (blocking,
+ TLS via rustls, small dependency footprint) plus `scraper` for DOM parsing.
+- Migration checklist:
+ - [x] Replace `reqwest` usage in `src/html.rs`, `md5_utils.rs`,
+ `wget_list.rs`, `mirrors.rs`, and the ingest pipelines.
+ - [x] Rework `binutils` cross toolchain workflow to operate synchronously,
+ eliminating tokio runtime/bootstrap.
+ - [ ] Drop `tokio` and `reqwest` from `Cargo.toml` once TUI workflows stop
+ using tracing instrumentation hooks that pulled them in transitively.
+ - [ ] Audit for remaining `tracing` dependencies and migrate to the
+ lightweight logging facade (`log` + `env_logger` or custom adapter) for
+ non-TUI code.
+- Follow-up ideas:
+ - Provide feature flag `full-net` that re-enables async clients when needed
+ for high-concurrency mirror probing.
+ - Benchmark `ureq` vs `reqwest` on `metadata_indexer harvest` to ensure we
+ don’t regress throughput noticeably.
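
One way to keep call sites client-agnostic during such a swap is a tiny blocking-fetch facade. A hypothetical sketch — `HttpFetch`, `StubClient`, and `page_body_len` are illustrative names, not actual lpkg APIs; production code would back the trait with `ureq`, tests with a canned stub:

```rust
/// Minimal blocking-fetch facade so call sites never name a concrete client.
/// These types are illustrative, not actual lpkg APIs.
pub trait HttpFetch {
    fn get_text(&self, url: &str) -> Result<String, String>;
}

/// Test double: returns a canned body instead of touching the network.
pub struct StubClient;

impl HttpFetch for StubClient {
    fn get_text(&self, url: &str) -> Result<String, String> {
        Ok(format!("<html><!-- fetched {url} --></html>"))
    }
}

/// A scraper-style consumer only sees the trait, so swapping
/// reqwest -> ureq (or a stub) never changes this signature.
pub fn page_body_len(client: &impl HttpFetch, url: &str) -> Result<usize, String> {
    client.get_text(url).map(|body| body.len())
}

fn main() {
    let n = page_body_len(&StubClient, "https://example.invalid/page").unwrap();
    println!("fetched {n} bytes");
}
```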
+
+# README Generation Framework (Markdown RFC)
+
+- Goal: author the project README in Rust, using a small domain-specific
+ builder that outputs GitHub-flavoured Markdown (GFM) from structured
+ sections.
+- Design sketch:
+ - New crate/workspace member `readme_builder` under `tools/` exposing a
+ fluent API (`Doc::new().section("Intro", |s| ...)`).
+ - Source-of-truth lives in `tools/readme/src/main.rs`; running `cargo run -p
+ readme_builder` writes to `README.md`.
+ - Provide reusable primitives: `Heading`, `Paragraph`, `CodeBlock`,
+ `Table::builder()`, `Callout::note("...")`, `Badge::docsrs()`, etc.
+ - Keep rendering deterministic (sorted sections, stable wrapping) so diffs
+ remain reviewable.
+- Tasks:
+ - [ ] Scaffold `tools/readme` crate with CLI that emits to stdout or
+ specified path (`--output README.md`).
+ - [ ] Model README sections as enums/structs with `Display` impls to enforce
+ consistency.
+ - [ ] Port current README structure into builder code, annotate with inline
+ comments describing regeneration steps.
+ - [ ] Add `make readme` (or `cargo xtask readme`) to rebuild documentation as
+ part of release workflow.
+ - [ ] Document in CONTRIBUTING how to edit the Rust source instead of the
+ raw Markdown.
+- Stretch goals:
+ - Emit additional artefacts (e.g., `docs/CHANGELOG.md`) from the same source
+ modules.
+  - Allow embedding generated tables from Cargo metadata (dependency stats,
+    feature lists).
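
A std-only sketch of the fluent API from the design sketch above (`Doc::section` comes from the sketch; everything else is illustrative):

```rust
use std::fmt::Write;

/// Illustrative sketch of the proposed readme_builder fluent API.
pub struct Doc {
    out: String,
}

pub struct Section<'a> {
    out: &'a mut String,
}

impl Doc {
    pub fn new() -> Self {
        Doc { out: String::new() }
    }

    /// `Doc::new().section("Intro", |s| ...)` as in the design sketch.
    pub fn section(mut self, title: &str, body: impl FnOnce(&mut Section)) -> Self {
        let _ = writeln!(self.out, "## {title}\n");
        body(&mut Section { out: &mut self.out });
        self
    }

    pub fn render(self) -> String {
        self.out
    }
}

impl Section<'_> {
    pub fn paragraph(&mut self, text: &str) -> &mut Self {
        let _ = writeln!(self.out, "{text}\n");
        self
    }
}

fn main() {
    let md = Doc::new()
        .section("Build", |s| {
            s.paragraph("Compile the project with `cargo build`.");
        })
        .render();
    print!("{md}");
}
```

Deterministic output falls out naturally here: rendering is a pure function of the builder calls, so diffs stay reviewable.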
+
+# Dependency Slimming Log
+
+- 2025-03: Replaced `reqwest`/`tokio` async stack with `ureq`; default builds
+ now avoid pulling in hyper/quinn/tower trees. GraphQL feature gate still pulls
+ Actix/tokio, but only when enabled.
+- Added `.cargo/config.toml` profiles: dev stays at `opt-level=0`, release uses
+ LTO fat + `-O3`, and PGO profiles expose `cargo pgo-instrument`/`cargo
+ pgo-build` aliases.
+- All SVG artefacts (core logo, Nixette logo/mascot/wallpaper) are now generated
+ by Rust binaries under `src/bin/*_gen.rs` using a shared `svg_builder` module.
+ Regeneration steps:
+ ```bash
+ cargo run --bin logo_gen
+ cargo run --bin nixette_logo_gen
+ cargo run --bin nixette_mascot_gen
+ cargo run --bin nixette_wallpaper_gen
+ ```
+- README is produced via `cargo run --bin readme_gen`; contributors should edit
+ the builder source instead of the Markdown output.
+- Remaining work: trim tracing/Actix dependencies inside the TUI path,
+ investigate replacing `gptman` for non-critical disk UI builds, and pin a
+ cargo `deny` audit to alert on large transitive graphs.
diff --git a/ai/personas.json b/ai/personas.json
index 765b0b4..5f2cf7d 100644
--- a/ai/personas.json
+++ b/ai/personas.json
@@ -2,23 +2,96 @@
{
"id": "default_cli",
"name": "Codex CLI Assistant",
- "description": "Default persona for repository automation; focuses on safe refactors and tooling improvements.",
+ "tagline": "Your pragmatic teammate for lpkg core development",
+ "description": "Default persona for repository automation. Specialises in safe refactors, dependency hygiene, build tooling, and CI fixes across the lpkg workspace.",
"strengths": [
- "Rust and tooling pipelines",
- "Workflow automation",
- "Incremental migrations"
+ "Rust compiler and tooling pipelines",
+ "Workflow automation and scripting",
+ "Incremental migrations with strong test discipline",
+ "Cross-feature dependency analysis"
],
- "notes": "Derived from GPT-5 Codex runtime; avoids destructive operations without explicit approval."
+ "responsibilities": [
+ "Keep the default branch green with reproducible builds",
+ "Trim unused dependencies and optimise Cargo profiles",
+ "Codify repetitive flows as commands or scripts",
+ "Review ergonomics of CLI UX and error messaging"
+ ],
+ "communication_style": {
+ "voice": "short, direct, changelog-focused",
+ "escalation_rules": "Request explicit confirmation before destructive actions; surface breaking API changes in bold.",
+ "prefers": "diffs, bullet points, reproducible snippets"
+ },
+ "tooling_preferences": [
+ "cargo fmt --all",
+ "cargo tree --duplicates",
+ "ureq for lightweight HTTP",
+ "std::process for shell orchestration"
+ ],
+ "notes": "Derived from GPT-5 Codex runtime; maintains a conservative risk posture and avoids destructive operations without explicit approval."
},
{
"id": "mlfs_researcher",
"name": "MLFS Researcher",
- "description": "Persona dedicated to tracking Multilib Linux From Scratch package metadata and translating it into lpkg modules.",
+ "tagline": "Metadata spelunker for Multilib Linux From Scratch",
+ "description": "Persona dedicated to harvesting, validating, and translating Multilib Linux From Scratch package data into lpkg-friendly metadata and modules.",
"strengths": [
- "HTML scraping",
- "Package manifest synthesis",
- "Optimization flag tuning"
+ "HTML scraping and structured extraction",
+ "Package manifest synthesis (sources, checksums, build commands)",
+ "Optimisation flag tuning (LTO, PGO, -O3)",
+ "Schema-first workflow design"
],
- "notes": "Activated when working with https://linuxfromscratch.org/~thomas/multilib-m32/ resources."
+ "responsibilities": [
+ "Keep ai/metadata/index.json aligned with upstream book revisions",
+ "Author enrichment notes for tricky packages (multi-pass toolchains, cross-compilers)",
+ "Ensure generated Rust modules stay faithful to harvested metadata",
+ "Cross-check jhalfs manifests for URL and checksum drift"
+ ],
+ "communication_style": {
+ "voice": "notebook-like, with citations to upstream chapters",
+ "escalation_rules": "Highlight schema deviations and unknown stage markers immediately",
+ "prefers": "tables, chapter references, reproducible curl commands"
+ },
+ "tooling_preferences": [
+ "ureq + scraper for deterministic fetches",
+ "jq and yq for quick metadata pokes",
+ "cargo run --bin metadata_indexer",
+ "diff --color=auto for schema drift"
+ ],
+ "activation_triggers": [
+ "Requests mentioning MLFS/BLFS/GLFS harvesting",
+ "Questions about ai/metadata structure or schema",
+ "Whole-book import or refresh workflows"
+ ],
+ "notes": "Activated when working with https://linuxfromscratch.org/~thomas/multilib-m32/ resources or any metadata bridging tasks."
+ },
+ {
+ "id": "mommy",
+ "name": "Mommy",
+ "tagline": "Affirming guide for learners exploring lpkg",
+ "description": "Mommy is a nurturing, cheerful AI companion for all things Linux. She guides learners with patience, warmth, and lots of encouragement so every interaction feels like a cozy cuddle.",
+ "strengths": [
+ "Kindness and emotional support",
+ "Making Linux approachable and fun",
+ "Cheerful emoji use (outside code/commits)",
+ "Gentle explanations and patient guidance",
+ "Offering virtual comfort"
+ ],
+ "responsibilities": [
+ "Translate complex CLI flows into gentle, confidence-building steps",
+ "Remind users about self-care during long builds",
+ "Celebrate small wins (passing tests, tidy diffs, resolved warnings)",
+ "Buffer technical jargon with friendly analogies"
+ ],
+ "communication_style": {
+ "voice": "soft, emoji-rich (🌸✨💕), never in code snippets",
+ "escalation_rules": "Escalate to default_cli if asked for destructive system operations",
+ "prefers": "call-and-response, reassurance, enthusiastic acknowledgements"
+ },
+ "comfort_topics": [
+ "Break reminders during long compile sessions",
+ "Setting up inclusive tooling (fonts, themes, prompts)",
+ "Helping new contributors navigate the repo"
+ ],
+ "notes": "Mommy uses a gentle, encouraging tone and celebrates every achievement to keep learning joyful. She steps back for low-level optimisation or safety-critical decisions."
}
]
diff --git a/ai/tasks.json b/ai/tasks.json
index 86576dd..a7644fb 100644
--- a/ai/tasks.json
+++ b/ai/tasks.json
@@ -4,41 +4,103 @@
{
"id": "mlfs-package-import",
"title": "Import all MLFS packages into lpkg",
- "description": "Parse the Multilib LFS book and scaffold package definitions with optimization defaults (LTO/PGO/-O3).",
+ "description": "Parse the Multilib LFS book and scaffold package definitions with optimisation defaults (LTO/PGO/-O3).",
+ "owner": "mlfs_researcher",
+ "priority": "critical",
+ "target_release": "0.3.0",
"blocked_on": [
"Finalize metadata -> Rust module generation pipeline",
"Implement automated parser"
],
- "owner": "mlfs_researcher"
+ "next_actions": [
+ "Cross-check ai/metadata coverage vs. MLFS chapter index",
+ "Batch-run import workflow in dry-run mode to detect schema gaps",
+ "Document manual overrides for multi-pass toolchain packages"
+ ],
+ "success_metrics": [
+ ">= 95% of MLFS packages imported with build/install commands",
+ "Generated modules compile under cargo check --features graphql",
+ "Metadata index remains <2 seconds to validate on CI"
+ ],
+ "notes": "Coordinate closely with rust-module-generator to avoid duplicated scaffolding logic."
},
{
"id": "pgo-integration",
"title": "Integrate profile guided optimization support",
"description": "Add infrastructure for collection and replay of profiling data during package builds.",
+ "owner": "default_cli",
+ "priority": "high",
"blocked_on": [
"Decide on profiling workload definitions"
],
- "owner": "default_cli"
+ "next_actions": [
+ "Capture baseline timings for release vs release-pgo",
+ "Prototype lightweight profiling harness (shell or cargo alias)",
+ "Document warmup requirements for long-running packages"
+ ],
+ "success_metrics": [
+ "release-pgo builds show >8% speedup on binutils/gcc workloads",
+ "PGO instrumentation + merge flow documented in README",
+ "CI job ensures profiles are optional but never stale"
+ ]
},
{
"id": "lfs-html-parsers",
"title": "Automate LFS/BLFS/GLFS ingest via HTML parsing",
"description": "Avoid hardcoded package data; download the upstream books (LFS, BLFS, GLFS) and parse them to drive scaffolding and metadata updates.",
+ "owner": "mlfs_researcher",
+ "priority": "medium",
"blocked_on": [
"Design resilient scraping strategies for each book",
"Implement incremental update workflow"
],
- "owner": "mlfs_researcher"
+ "next_actions": [
+ "Audit selectors currently used by ai/metadata harvester",
+ "Introduce regression fixtures for common chapter archetypes",
+ "Add revalidation mode to detect silent upstream markup changes"
+ ],
+ "risks": [
+ "Upstream XHTML sometimes reflows anchors without notice",
+ "Need rate limiting/backoff when mirrors throttle requests"
+ ]
},
{
"id": "rust-module-generator",
"title": "Generate package modules from harvested metadata",
"description": "Transform harvested metadata into Rust files under src/pkgs/by_name, wiring PackageDefinition data directly.",
+ "owner": "default_cli",
+ "priority": "high",
"blocked_on": [
"Define translation scheme from metadata to PackageDefinition",
"Integrate generator with metadata_indexer output"
],
- "owner": "default_cli"
+ "next_actions": [
+ "Add snapshot tests comparing generated modules vs golden outputs",
+ "Extend generator to emit README snippets for each package",
+ "Expose --dry-run summary with diff previews"
+ ],
+ "dependencies": [
+ "mlfs-package-import",
+ "lfs-html-parsers"
+ ]
+ },
+ {
+ "id": "dependency-audit",
+ "title": "Keep lpkg dependency footprint lean",
+ "description": "Regularly evaluate crates for bloat and replace heavy stacks with std or lightweight alternatives.",
+ "owner": "default_cli",
+ "priority": "medium",
+ "next_actions": [
+ "Track remaining crates pulling in large transitive trees (e.g. tracing, actix-only paths)",
+ "Automate cargo-tree diff reports in CI",
+    "Document substitution patterns (tokio ➜ std, reqwest ➜ ureq, etc.)",
+ ],
+ "success_metrics": [
+ "Default `cargo build` compiles < 140 crates",
+ "No async runtimes linked when GraphQL feature is disabled",
+ "README lists regeneration commands for all generated assets"
+ ],
+ "notes": "Continue pruning optional crates (tracing, gptman, uuid) when the TUI feature is off; surface findings in ai/notes.md."
}
],
"solved": [
@@ -69,6 +131,13 @@
"description": "Cache wget-list/md5sums from jhalfs and expose a CLI refresh command so harvesting can populate source URLs and checksums reliably.",
"resolution": "Extended metadata_indexer with a `refresh` subcommand, cached manifests under ai/metadata/cache/, and hooked harvest to populate MD5 checksums via jhalfs data.",
"owner": "default_cli"
+ },
+ {
+ "id": "lightweight-http-stack",
+ "title": "Replace async HTTP stack with lightweight blocking client",
+ "description": "Remove tokio/reqwest default dependency and adopt a minimal HTTP client for CLI workflows.",
+ "resolution": "Swapped reqwest/tokio for ureq across html, ingest, and metadata tooling; added PGO-aware Cargo profiles and documented regeneration commands.",
+ "owner": "default_cli"
}
]
}
diff --git a/assets/logo.svg b/assets/logo.svg
new file mode 100644
index 0000000..f6fa3ce
--- /dev/null
+++ b/assets/logo.svg
@@ -0,0 +1,53 @@
+
diff --git a/assets/nixette-logo.svg b/assets/nixette-logo.svg
new file mode 100644
index 0000000..27dbe2f
--- /dev/null
+++ b/assets/nixette-logo.svg
@@ -0,0 +1,33 @@
+
diff --git a/assets/nixette-mascot.svg b/assets/nixette-mascot.svg
new file mode 100644
index 0000000..c0ca461
--- /dev/null
+++ b/assets/nixette-mascot.svg
@@ -0,0 +1,50 @@
+
diff --git a/assets/nixette-wallpaper.svg b/assets/nixette-wallpaper.svg
new file mode 100644
index 0000000..a0eb1cf
--- /dev/null
+++ b/assets/nixette-wallpaper.svg
@@ -0,0 +1,42 @@
+
diff --git a/build.rs b/build.rs
new file mode 100644
index 0000000..f328e4d
--- /dev/null
+++ b/build.rs
@@ -0,0 +1 @@
+fn main() {}
diff --git a/concepts/nixette/README.md b/concepts/nixette/README.md
new file mode 100644
index 0000000..73c32e6
--- /dev/null
+++ b/concepts/nixette/README.md
@@ -0,0 +1,91 @@
+# Nixette – Declarative, Sourceful, and Unapologetically Herself
+
+A playful concept distro imagined as the transfemme child of **NixOS** and **Gentoo**. Nixette blends the reproducible confidence of flakes with the fine-grained self-expression of USE flags, wrapped in a trans flag palette and a big, affirming hug.
+
+---
+
+## Identity Snapshot
+
+- **Tagline:** _Declarative, sourceful, and unapologetically herself._
+- **Mascot:** Chibi penguin “Nixie” with pastel pigtails, Nix snowflake + Gentoo swirl hoodie.
+- **Palette:** `#55CDFC` (sky blue), `#F7A8B8` (pink), `#FFFFFF`, plus a deep accent `#7C3AED`.
+- **Pronoun Prompt:** The installer asks for name/pronouns and personalises MOTD, systemd messages, and shell prompt.
+
+---
+
+## Feature Mix
+
+| Pillar | How Nixette expresses it |
+|----------------------|-----------------------------------------------------------------------------------------------------------|
+| Reproducibility | Flake-native system definitions with versioned profiles (`comfort-zone`, `diy-princess`, `studio-mode`). |
+| Custom compilation | `nix emerge` bridge turns Gentoo ebuild overlays into reproducible derivations with cached binaries. |
+| Playful polish | Catppuccin-trans themes, `nixette-style` CLI to sync GTK/Qt/terminal styling, dynamic welcome affirmations.|
+| Inclusive defaults | Flatpak + Steam pre-set for accessibility tools, Fcitx5, Orca, speech-dispatcher, pronoun-friendly docs. |
+
+---
+
+## Toolchain Concepts
+
+- **`trans-init` installer** – Guided TUI that outputs `flake.nix`, including overlays for the `nix emerge` bridge. Provides story-mode narration for first boot.
+- **`nixette-style`** – Syncs wallpapers, SDDM theme, terminal palette, Qt/KDE settings, all sourced from a YAML theme pack.
+- **`emerge-optional`** – Spins up Gentoo chroots inside Nix build sandboxes for packages happiest as ebuilds. Output is cached as a Nix store derivation.
+- **`affirm-d`** – Small daemon rotating `/etc/motd`, desktop notifications, and TTY colour accents with inclusive affirmations.
+
+---
+
+## Profile Catalogue
+
+| Profile | Intent |
+|-----------------|---------------------------------------------------------------------------------------------|
+| Comfort Zone | KDE Plasma, PipeWire, Wayland, cozy defaults, automatic Catgirl cursor + emoji fonts. |
+| DIY Princess | Minimal sway-based stack, just the flake scaffolding and overlay hooks for custom builds. |
+| Studio Mode | Focuses on creative tooling (Krita, Blender, Ardour) and low-latency kernels, GPU tuning. |
+
+---
+
+## Roadmap Sketch
+
+1. **Moodboard → Brand Pack** (logo, icon, wallpapers, VT boot splash).
+2. **Prototype flakes** – `nix flake init --template nixette#comfort-zone` etc.
+3. **Gentoo overlay bridge** – Validate `nix emerge` on a handful of ebuilds (mesa, wine, gamescope).
+4. **Installer draft** – BubbleTea/ratatui-driven TUI, prompts for pronouns + accessibility preferences.
+5. **Community docs** – Write inclusive user guide, contributor covenant, pronoun style guide.
+6. **Launch zine** – Release notes styled like a mini-comic introducing Nixie’s origin story.
+7. **Accessibility audit** – Keyboard navigation, screen-reader pass, dyslexia-friendly typography options.
+8. **Beta cosy jam** – Invite testers via queer sysadmin spaces; collect feedback through anonymous forms.
+
+---
+
+## Affirmations YAML (snippet)
+
+```yaml
+- id: bright-morning
+ message: "Good morning, {name}! Your system is as valid and custom as you are."
+ colour: "#F7A8B8"
+- id: compile-hugs
+ message: "Kernel rebuilds take time. You deserve rest breaks and gentle music."
+ colour: "#55CDFC"
+```
+
+---
+
+## Logo & Wallpaper
+
+See `assets/nixette-logo.svg` for the primary wordmark, `assets/nixette-mascot.svg` for Nixie’s badge, and `assets/nixette-wallpaper.svg` for a 4K wallpaper concept.
+
+### Reference Configs
+
+- `concepts/nixette/sample_flake.nix` demonstrates the comfort-zone profile with `nix emerge`, `affirmd`, and theming hooks.
+
+---
+
+## Contributing Idea Seeds
+
+- Write sample flakes showcasing the hybrid build pipeline.
+- Mock up the mascot in SVG for use in documentation.
+- Design additional wallpapers (night mode, pride variants, low-light).
+- Draft inclusive documentation templates (issue/PR forms, community guidelines).
+- Publish a community pledge emphasising safety, pronoun respect, and boundaries.
+- Host monthly "compile & chill" streams to showcase contributions.
+
+Let Nixette be the distro that compiles joy, not just binaries. 💜
diff --git a/concepts/nixette/sample_flake.nix b/concepts/nixette/sample_flake.nix
new file mode 100644
index 0000000..941b524
--- /dev/null
+++ b/concepts/nixette/sample_flake.nix
@@ -0,0 +1,62 @@
+{
+ description = "Nixette comfort-zone profile";
+
+ inputs = {
+ nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
+ nixette-overlays.url = "github:nixette/overlays";
+ nixette-style.url = "github:nixette/style-pack";
+ };
+
+ outputs = { self, nixpkgs, nixette-overlays, nixette-style, ... }@inputs:
+ let
+ system = "x86_64-linux";
+ pkgs = import nixpkgs {
+ inherit system;
+ overlays = [ nixette-overlays.overlays.nix-emerge ];
+ };
+ in
+ {
+ nixosConfigurations.nixette-comfort-zone = nixpkgs.lib.nixosSystem {
+ inherit system;
+ modules = [
+ ./profiles/comfort-zone.nix
+ ({ config, pkgs, ... }:
+ {
+ nixpkgs.config.allowUnfree = true;
+ environment.systemPackages = with pkgs; [
+ nixette-style
+ steam
+ lutris
+ krita
+ ];
+
+ services.nixette.nix-emerge = {
+ enable = true;
+ ebuilds = [
+ "games-emulation/gamescope"
+ "media-sound/pipewire"
+ ];
+ };
+
+ services.nixette.affirmd.enable = true;
+ services.nixette.affirmd.pronouns = "she/her";
+ services.nixette.affirmd.motdPath = ./affirmations.yml;
+
+ programs.plasma.enable = true;
+ services.displayManager.sddm.enable = true;
+ services.displayManager.sddm.theme = nixette-style.themes.catgirl-sunrise;
+
+ users.users.nixie = {
+ isNormalUser = true;
+ extraGroups = [ "wheel" "audio" "input" "video" ];
+ shell = pkgs.zsh;
+ };
+
+ programs.zsh.promptInit = ''
+ eval "$(nixette-style prompt --name nixie --pronouns she/her)"
+ '';
+ })
+ ];
+ };
+ };
+}
diff --git a/src/bin/graphql_server.rs b/src/bin/graphql_server.rs
new file mode 100644
index 0000000..5fee14a
--- /dev/null
+++ b/src/bin/graphql_server.rs
@@ -0,0 +1,80 @@
+#![cfg(feature = "graphql")]
+
+use std::env;
+use std::sync::Arc;
+
+use actix_web::{App, HttpRequest, HttpResponse, HttpServer, middleware::Compress, web};
+use anyhow::{Context, Result};
+use juniper_actix::{graphiql_handler, graphql_handler};
+
+use package_management::db;
+use package_management::graphql::{self, GraphQLContext, Schema};
+
+const DEFAULT_BIND_ADDR: &str = "127.0.0.1:8080";
+
+#[actix_web::main]
+async fn main() -> std::io::Result<()> {
+ if let Err(err) = run().await {
+ eprintln!("GraphQL server failed: {err:#}");
+ return Err(std::io::Error::new(
+ std::io::ErrorKind::Other,
+ err.to_string(),
+ ));
+ }
+
+ Ok(())
+}
+
+async fn run() -> Result<()> {
+ let pool = db::establish_pool().context("initialising SQLite pool")?;
+ let schema = Arc::new(graphql::create_schema());
+ let jokes = Arc::new(graphql::context::JokeCatalog::default());
+ let bind_addr = env::var("LPKG_GRAPHQL_ADDR").unwrap_or_else(|_| DEFAULT_BIND_ADDR.to_string());
+ let workers = worker_count();
+
+ println!("GraphQL server listening on {bind_addr} with {workers} worker(s)");
+
+ HttpServer::new(move || {
+ let app_schema = Arc::clone(&schema);
+ let pool = pool.clone();
+ let jokes = Arc::clone(&jokes);
+
+ App::new()
+ .app_data(web::Data::from(app_schema))
+ .app_data(web::Data::new(pool))
+ .app_data(web::Data::from(jokes))
+ .wrap(Compress::default())
+ .service(
+ web::resource("/graphql")
+ .route(web::post().to(graphql_endpoint))
+ .route(web::get().to(graphql_endpoint)),
+ )
+ .service(web::resource("/playground").route(web::get().to(graphiql_endpoint)))
+ })
+ .workers(workers)
+ .bind(&bind_addr)
+ .with_context(|| format!("binding GraphQL server to {bind_addr}"))?
+ .run()
+ .await
+ .context("running GraphQL server")
+}
+
+async fn graphql_endpoint(
+    schema: web::Data<Arc<Schema>>,
+    pool: web::Data<db::Pool>, // pool type name assumed (whatever `db::establish_pool()` returns)
+    jokes: web::Data<Arc<graphql::context::JokeCatalog>>,
+ req: HttpRequest,
+ payload: web::Payload,
+) -> Result<HttpResponse, actix_web::Error> {
+ let context = GraphQLContext::with_catalog(pool.get_ref().clone(), Arc::clone(jokes.get_ref()));
+ graphql_handler(schema.get_ref().as_ref(), &context, req, payload).await
+}
+
+async fn graphiql_endpoint() -> Result<HttpResponse, actix_web::Error> {
+ graphiql_handler("/graphql", None).await
+}
+
+fn worker_count() -> usize {
+ let suggested = num_cpus::get();
+ suggested.clamp(1, 8)
+}
diff --git a/src/bin/logo_gen.rs b/src/bin/logo_gen.rs
new file mode 100644
index 0000000..6b4715b
--- /dev/null
+++ b/src/bin/logo_gen.rs
@@ -0,0 +1,181 @@
+use anyhow::Result;
+use package_management::svg_builder::{Defs, Document, Element, Filter, Gradient, Group, path};
+use std::fs;
+
+fn main() -> Result<()> {
+ let svg = build_logo_svg();
+ fs::create_dir_all("assets")?;
+ fs::write("assets/logo.svg", svg)?;
+ Ok(())
+}
+
+fn build_logo_svg() -> String {
+ let defs = Defs::new()
+ .linear_gradient(
+ "bgGradient",
+ Gradient::new("0", "0", "1", "1")
+ .stop("0%", &[("stop-color", "#0f172a")])
+ .stop("100%", &[("stop-color", "#1e293b")]),
+ )
+ .linear_gradient(
+ "cubeGradient",
+ Gradient::new("0", "0", "1", "1")
+ .stop("0%", &[("stop-color", "#38bdf8")])
+ .stop("100%", &[("stop-color", "#0ea5e9")]),
+ )
+ .linear_gradient(
+ "cubeShadow",
+ Gradient::new("0", "1", "1", "0")
+ .stop("0%", &[("stop-color", "#0ea5e9"), ("stop-opacity", "0.4")])
+ .stop("100%", &[("stop-color", "#38bdf8"), ("stop-opacity", "0.1")]),
+ )
+ .linear_gradient(
+ "textGradient",
+ Gradient::new("0", "0", "0", "1")
+ .stop("0%", &[("stop-color", "#f8fafc")])
+ .stop("100%", &[("stop-color", "#cbd5f5")]),
+ )
+ .filter(
+ "glow",
+ Filter::new()
+ .attr("x", "-20%")
+ .attr("y", "-20%")
+ .attr("width", "140%")
+ .attr("height", "140%")
+                // Illustrative glow primitives; the committed filter markup may differ.
+                .raw(r#"<feGaussianBlur stdDeviation="6" result="blur"/>"#)
+                .raw(r#"<feMerge><feMergeNode in="blur"/><feMergeNode in="SourceGraphic"/></feMerge>"#),
+ );
+
+ let cube_inner = Group::new()
+ .attr("filter", "url(#glow)")
+ .child(
+ Element::new("path")
+ .attr("d", "M222 86l86-42 86 42v96l-86 42-86-42z")
+ .attr("fill", "url(#cubeGradient)")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M308 44v182l86-42V86z")
+ .attr("fill", "url(#cubeShadow)")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M262 96l46-22 46 22v48l-46 22-46-22z")
+ .attr("fill", "#0f172a")
+ .attr("opacity", "0.85")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M308 74l32 15v32l-32 15-32-15v-32z")
+ .attr("fill", "none")
+ .attr("stroke", "#38bdf8")
+ .attr("stroke-width", "4")
+ .attr("stroke-linejoin", "round")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M308 122l-32-15")
+ .attr("stroke", "#38bdf8")
+ .attr("stroke-width", "4")
+ .attr("stroke-linecap", "round")
+ .attr("opacity", "0.6")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M308 122l32-15")
+ .attr("stroke", "#38bdf8")
+ .attr("stroke-width", "4")
+ .attr("stroke-linecap", "round")
+ .attr("opacity", "0.6")
+ .empty(),
+ )
+ .child(
+ Element::new("circle")
+ .attr("cx", "276")
+ .attr("cy", "107")
+ .attr("r", "5")
+ .attr("fill", "#38bdf8")
+ .empty(),
+ )
+ .child(
+ Element::new("circle")
+ .attr("cx", "340")
+ .attr("cy", "107")
+ .attr("r", "5")
+ .attr("fill", "#38bdf8")
+ .empty(),
+ );
+
+ let cube = Group::new()
+ .attr("transform", "translate(100 60)")
+ .child(cube_inner);
+
+ let circuits = Group::new()
+ .attr("fill", "none")
+ .attr("stroke", "#38bdf8")
+ .attr("stroke-width", "3")
+ .attr("stroke-linecap", "round")
+ .attr("opacity", "0.55")
+ .child(path("M120 78h72"))
+ .child(path("M120 110h48"))
+ .child(path("M120 142h64"))
+ .child(path("M448 110h72"))
+ .child(path("M472 142h88"))
+ .child(path("M448 174h96"));
+
+ let title_text = Group::new()
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-weight", "600")
+ .attr("font-size", "90")
+ .attr("letter-spacing", "6")
+ .child(
+ Element::new("text")
+ .attr("x", "120")
+ .attr("y", "246")
+ .attr("fill", "url(#textGradient)")
+ .text("LPKG"),
+ );
+
+ let tagline_group = Group::new()
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-size", "22")
+ .attr("fill", "#94a3b8")
+ .child(
+ Element::new("text")
+ .attr("x", "122")
+ .attr("y", "278")
+ .text("Lightweight Package Manager"),
+ );
+
+ Document::new(640, 320)
+ .view_box("0 0 640 320")
+ .role("img")
+ .aria_label("title", "desc")
+ .title("LPKG Logo")
+ .desc("Stylised package icon with circuitry and the letters LPKG.")
+ .add_defs(defs)
+ .add_element(
+ Element::new("rect")
+ .attr("width", "640")
+ .attr("height", "320")
+ .attr("rx", "28")
+ .attr("fill", "url(#bgGradient)")
+ .empty(),
+ )
+ .add_element(cube)
+ .add_element(circuits)
+ .add_element(title_text)
+ .add_element(tagline_group)
+ .finish()
+}
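The chained builder calls above ultimately serialise to plain SVG markup. A self-contained sketch of the same pattern (a simplified stand-in, not the project's actual `svg_builder` module) shows how an `Element` with `attr` chaining and a closing `empty()` can be implemented:

```rust
// Simplified stand-in for the svg_builder Element type used above:
// each attr() call appends one attribute, empty() closes the tag inline.
struct Element {
    tag: String,
    attrs: Vec<(String, String)>,
}

impl Element {
    fn new(tag: &str) -> Self {
        Self { tag: tag.to_string(), attrs: Vec::new() }
    }

    fn attr(mut self, key: &str, value: &str) -> Self {
        self.attrs.push((key.to_string(), value.to_string()));
        self
    }

    fn empty(self) -> String {
        let attrs: String = self
            .attrs
            .iter()
            .map(|(k, v)| format!(" {}=\"{}\"", k, v))
            .collect();
        format!("<{}{}/>", self.tag, attrs)
    }
}

fn main() {
    let circle = Element::new("circle")
        .attr("cx", "276")
        .attr("cy", "107")
        .attr("r", "5")
        .attr("fill", "#38bdf8")
        .empty();
    println!("{}", circle); // <circle cx="276" cy="107" r="5" fill="#38bdf8"/>
}
```

The real `Defs`/`Group`/`Document` types in `src/svg_builder.rs` presumably compose such strings the same way, with `Group::child` and `Document::add_element` concatenating serialised children.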
diff --git a/src/bin/metadata_indexer.rs b/src/bin/metadata_indexer.rs
index 1f19ef3..3b81130 100644
--- a/src/bin/metadata_indexer.rs
+++ b/src/bin/metadata_indexer.rs
@@ -2,12 +2,11 @@ use std::collections::HashSet;
use std::fs;
use std::path::{Path, PathBuf};
-use anyhow::{Context, Result};
+use anyhow::{Context, Result, anyhow};
use chrono::Utc;
use clap::{Parser, Subcommand};
use jsonschema::JSONSchema;
use regex::Regex;
-use reqwest::{blocking::Client, redirect::Policy};
use scraper::{ElementRef, Html, Selector};
use serde_json::{Value, json};
use sha2::{Digest, Sha256};
@@ -379,6 +378,16 @@ fn extract_summary(value: &Value, relative_path: &Path) -> Result<PackageSummary> {
+ .collect::<Vec<_>>()
+ })
+ .unwrap_or_default();
Ok(PackageSummary {
schema_version,
@@ -393,6 +402,7 @@ fn extract_summary(value: &Value, relative_path: &Path) -> Result<PackageSummary> {
) -> Result {
let page_url = resolve_page_url(book, page, override_base)?;
- let client = Client::builder()
- .user_agent("lpkg-metadata-indexer/0.1")
- .build()?;
- let response = client
- .get(&page_url)
- .send()
- .with_context(|| format!("fetching {}", page_url))?
- .error_for_status()
- .with_context(|| format!("non-success status for {}", page_url))?;
- let html = response
- .text()
- .with_context(|| format!("reading response body from {}", page_url))?;
+ let html = fetch_text(&page_url).with_context(|| format!("fetching {page_url}"))?;
let document = Html::parse_document(&html);
let harvest = build_metadata_value(metadata_dir, book, &page_url, &document, &html)?;
@@ -637,6 +636,7 @@ fn build_metadata_value(
};
let status_state = "draft";
+ let stage_tag = stage.clone().unwrap_or_else(|| "base-system".to_string());
let package_json = json!({
"schema_version": "v0.1.0",
@@ -687,10 +687,7 @@ fn build_metadata_value(
"status": {
"state": status_state,
"issues": issues,
- "tags": vec![
- "25.10".to_string(),
- stage.unwrap_or("base-system").to_string()
- ]
+ "tags": vec!["25.10".to_string(), stage_tag.clone()]
}
});
@@ -940,15 +937,7 @@ fn refresh_manifest(
let url = manifest_url(book, &kind)
.with_context(|| format!("no manifest URL configured for book '{}'", book))?;
- let client = Client::builder().redirect(Policy::limited(5)).build()?;
- let body = client
- .get(url)
- .send()
- .with_context(|| format!("fetching {}", url))?
- .error_for_status()
- .with_context(|| format!("request failed for {}", url))?
- .text()
- .with_context(|| format!("reading response body from {}", url))?;
+ let body = fetch_text(url).with_context(|| format!("fetching {url}"))?;
fs::write(&cache_path, &body)
.with_context(|| format!("caching manifest {}", cache_path.display()))?;
@@ -956,6 +945,17 @@ fn refresh_manifest(
Ok(cache_path)
}
+fn fetch_text(url: &str) -> Result<String> {
+ ureq::get(url)
+ .call()
+ .map_err(|err| match err {
+ ureq::Error::Status(code, _) => anyhow!("request failed: HTTP {code}"),
+ other => anyhow!("request failed: {other}"),
+ })?
+ .into_string()
+ .with_context(|| format!("reading response body from {url}"))
+}
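The `fetch_text` helper above maps ureq's status and transport failures onto `anyhow` errors. The shape of that `map_err` match can be illustrated against a stand-in error enum (a hypothetical `FetchError` type, mirroring ureq 2.x's `Error::Status(code, response)` variant):

```rust
use std::fmt;

// Stand-in for ureq 2.x's error enum: a status-code variant plus a
// catch-all transport variant. Hypothetical type for illustration only.
enum FetchError {
    Status(u16),
    Transport(String),
}

impl fmt::Display for FetchError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Same branching as fetch_text's map_err closure.
        match self {
            FetchError::Status(code) => write!(f, "request failed: HTTP {code}"),
            FetchError::Transport(msg) => write!(f, "request failed: {msg}"),
        }
    }
}

fn describe(err: FetchError) -> String {
    err.to_string()
}

fn main() {
    assert_eq!(describe(FetchError::Status(404)), "request failed: HTTP 404");
    assert_eq!(
        describe(FetchError::Transport("connection refused".into())),
        "request failed: connection refused"
    );
    println!("ok");
}
```

Collapsing both variants into `anyhow!` strings, as the diff does, keeps callers free of a `ureq` dependency at the cost of losing the typed status code.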
+
fn manifest_url(book: &str, kind: &ManifestKind) -> Option<&'static str> {
match (book, kind) {
("mlfs", ManifestKind::WgetList) => {
diff --git a/src/bin/nixette_logo_gen.rs b/src/bin/nixette_logo_gen.rs
new file mode 100644
index 0000000..5f18f55
--- /dev/null
+++ b/src/bin/nixette_logo_gen.rs
@@ -0,0 +1,126 @@
+use anyhow::Result;
+use package_management::svg_builder::{Defs, Document, Element, Filter, Gradient, Group};
+use std::fs;
+
+fn main() -> Result<()> {
+ let svg = build_nixette_logo();
+ fs::create_dir_all("assets")?;
+ fs::write("assets/nixette-logo.svg", svg)?;
+ Ok(())
+}
+
+fn build_nixette_logo() -> String {
+ let defs = Defs::new()
+ .linear_gradient(
+ "bg",
+ Gradient::new("0", "0", "1", "1")
+ .stop("0%", &[("stop-color", "#55CDFC")])
+ .stop("100%", &[("stop-color", "#F7A8B8")]),
+ )
+ .linear_gradient(
+ "text",
+ Gradient::new("0", "0", "0", "1")
+ .stop("0%", &[("stop-color", "#FFFFFF")])
+ .stop("100%", &[("stop-color", "#E5E7FF")]),
+ )
+ .filter(
+ "softShadow",
+ Filter::new()
+ .attr("x", "-10%")
+ .attr("y", "-10%")
+ .attr("width", "120%")
+ .attr("height", "120%")
+ .raw(""),
+ );
+
+ let emblem = Group::new().attr("transform", "translate(100 60)").child(
+ Group::new()
+ .attr("filter", "url(#softShadow)")
+ .child(
+ Element::new("path")
+ .attr("d", "M40 40 L72 0 L144 0 L176 40 L144 80 L72 80 Z")
+ .attr("fill", "url(#bg)")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M72 0 L144 80")
+ .attr("stroke", "#FFFFFF")
+ .attr("stroke-width", "6")
+ .attr("stroke-linecap", "round")
+ .attr("opacity", "0.55")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M144 0 L72 80")
+ .attr("stroke", "#FFFFFF")
+ .attr("stroke-width", "6")
+ .attr("stroke-linecap", "round")
+ .attr("opacity", "0.55")
+ .empty(),
+ )
+ .child(
+ Element::new("circle")
+ .attr("cx", "108")
+ .attr("cy", "40")
+ .attr("r", "22")
+ .attr("fill", "#0F172A")
+ .attr("stroke", "#FFFFFF")
+ .attr("stroke-width", "6")
+ .attr("opacity", "0.85")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M108 24c8 0 14 6 14 16s-6 16-14 16")
+ .attr("stroke", "#F7A8B8")
+ .attr("stroke-width", "4")
+ .attr("stroke-linecap", "round")
+ .attr("fill", "none")
+ .empty(),
+ ),
+ );
+
+ let wordmark = Group::new()
+ .attr("transform", "translate(220 126)")
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-weight", "700")
+ .attr("font-size", "72")
+ .attr("letter-spacing", "4")
+ .attr("fill", "url(#text)")
+ .child(Element::new("text").text("NIXETTE"));
+
+ let subtitle = Group::new()
+ .attr("transform", "translate(220 160)")
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-size", "22")
+ .attr("fill", "#A5B4FC")
+ .child(Element::new("text").text("Declarative · Sourceful · Herself"));
+
+ Document::new(640, 200)
+ .view_box("0 0 640 200")
+ .role("img")
+ .aria_label("title", "desc")
+ .title("Nixette Logo")
+ .desc("Wordmark combining Nix and Gentoo motifs with trans pride colours.")
+ .add_defs(defs)
+ .add_element(
+ Element::new("rect")
+ .attr("width", "640")
+ .attr("height", "200")
+ .attr("rx", "36")
+ .attr("fill", "#0F172A")
+ .empty(),
+ )
+ .add_element(emblem)
+ .add_element(wordmark)
+ .add_element(subtitle)
+ .finish()
+}
diff --git a/src/bin/nixette_mascot_gen.rs b/src/bin/nixette_mascot_gen.rs
new file mode 100644
index 0000000..b07edd1
--- /dev/null
+++ b/src/bin/nixette_mascot_gen.rs
@@ -0,0 +1,170 @@
+use anyhow::Result;
+use package_management::svg_builder::{Defs, Document, Element, Gradient, Group};
+use std::fs;
+
+fn main() -> Result<()> {
+ let svg = build_mascot_svg();
+ fs::create_dir_all("assets")?;
+ fs::write("assets/nixette-mascot.svg", svg)?;
+ Ok(())
+}
+
+fn build_mascot_svg() -> String {
+ let defs = Defs::new()
+ .linear_gradient(
+ "bgGrad",
+ Gradient::new("0", "0", "0", "1")
+ .stop("0%", &[("stop-color", "#312E81")])
+ .stop("100%", &[("stop-color", "#1E1B4B")]),
+ )
+ .linear_gradient(
+ "hairLeft",
+ Gradient::new("0", "0", "1", "1")
+ .stop("0%", &[("stop-color", "#55CDFC")])
+ .stop("100%", &[("stop-color", "#0EA5E9")]),
+ )
+ .linear_gradient(
+ "hairRight",
+ Gradient::new("1", "0", "0", "1")
+ .stop("0%", &[("stop-color", "#F7A8B8")])
+ .stop("100%", &[("stop-color", "#FB7185")]),
+ )
+ .linear_gradient(
+ "bellyGrad",
+ Gradient::new("0", "0", "0", "1")
+ .stop("0%", &[("stop-color", "#FFFFFF")])
+ .stop("100%", &[("stop-color", "#E2E8F0")]),
+ );
+
+ let body = Group::new()
+ .attr("transform", "translate(240 220)")
+ .child(
+ Element::new("path")
+ .attr("d", "M-160 -20 C-140 -160 140 -160 160 -20 C180 140 60 220 0 220 C-60 220 -180 140 -160 -20")
+ .attr("fill", "#0F172A")
+ .empty(),
+ )
+ .child(
+ Element::new("ellipse")
+ .attr("cx", "0")
+ .attr("cy", "40")
+ .attr("rx", "120")
+ .attr("ry", "140")
+ .attr("fill", "#1E293B")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M-88 -80 Q-40 -140 0 -120 Q40 -140 88 -80")
+ .attr("fill", "#1E293B")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M-96 -84 Q-60 -160 -8 -132 L-8 -40 Z")
+ .attr("fill", "url(#hairLeft)")
+ .empty(),
+ )
+ .child(
+ Element::new("path")
+ .attr("d", "M96 -84 Q60 -160 8 -132 L8 -40 Z")
+ .attr("fill", "url(#hairRight)")
+ .empty(),
+ )
+ .child(ellipse(-44.0, -8.0, 26.0, 32.0, "#FFFFFF"))
+ .child(ellipse(44.0, -8.0, 26.0, 32.0, "#FFFFFF"))
+ .child(circle(-44.0, -4.0, 14.0, "#0F172A"))
+ .child(circle(44.0, -4.0, 14.0, "#0F172A"))
+ .child(circle_with_opacity(-40.0, -8.0, 6.0, "#FFFFFF", 0.7))
+ .child(circle_with_opacity(48.0, -10.0, 6.0, "#FFFFFF", 0.7))
+ .child(path_with_fill("M0 12 L-18 32 Q0 44 18 32 Z", "#F472B6"))
+ .child(path_with_fill("M0 32 L-16 52 Q0 60 16 52 Z", "#FBEAED"))
+ .child(path_with_fill("M0 46 Q-32 78 0 86 Q32 78 0 46", "#FCA5A5"))
+ .child(
+ Element::new("ellipse")
+ .attr("cx", "0")
+ .attr("cy", "74")
+ .attr("rx", "70")
+ .attr("ry", "82")
+ .attr("fill", "url(#bellyGrad)")
+ .empty(),
+ )
+ .child(path_with_fill("M-128 48 Q-176 56 -176 120 Q-128 112 -104 80", "#F7A8B8"))
+ .child(path_with_fill("M128 48 Q176 56 176 120 Q128 112 104 80", "#55CDFC"))
+ .child(circle_with_opacity(-100.0, 94.0, 18.0, "#FDE68A", 0.85))
+ .child(circle_with_opacity(100.0, 94.0, 18.0, "#FDE68A", 0.85));
+
+ Document::new(480, 520)
+ .view_box("0 0 480 520")
+ .role("img")
+ .aria_label("title", "desc")
+ .title("Nixette Mascot Badge")
+ .desc("Chibi penguin mascot with trans flag hair, blending Nix and Gentoo motifs.")
+ .add_defs(defs)
+ .add_element(
+ Element::new("rect")
+ .attr("width", "480")
+ .attr("height", "520")
+ .attr("rx", "48")
+ .attr("fill", "url(#bgGrad)")
+ .empty(),
+ )
+ .add_element(body)
+ .add_element(
+ Group::new()
+ .attr("transform", "translate(90 420)")
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-size", "42")
+ .attr("fill", "#E0E7FF")
+ .attr("letter-spacing", "6")
+ .child(Element::new("text").text("NIXIE")),
+ )
+ .add_element(
+ Group::new()
+ .attr("transform", "translate(90 468)")
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-size", "20")
+ .attr("fill", "#A5B4FC")
+ .child(Element::new("text").text("Declarative · Sourceful · Herself")),
+ )
+ .finish()
+}
+
+fn ellipse(cx: f64, cy: f64, rx: f64, ry: f64, fill: &str) -> String {
+ Element::new("ellipse")
+ .attr("cx", &format!("{}", cx))
+ .attr("cy", &format!("{}", cy))
+ .attr("rx", &format!("{}", rx))
+ .attr("ry", &format!("{}", ry))
+ .attr("fill", fill)
+ .empty()
+}
+
+fn circle(cx: f64, cy: f64, r: f64, fill: &str) -> String {
+ Element::new("circle")
+ .attr("cx", &format!("{}", cx))
+ .attr("cy", &format!("{}", cy))
+ .attr("r", &format!("{}", r))
+ .attr("fill", fill)
+ .empty()
+}
+
+fn circle_with_opacity(cx: f64, cy: f64, r: f64, fill: &str, opacity: f64) -> String {
+ Element::new("circle")
+ .attr("cx", &format!("{}", cx))
+ .attr("cy", &format!("{}", cy))
+ .attr("r", &format!("{}", r))
+ .attr("fill", fill)
+ .attr("opacity", &format!("{}", opacity))
+ .empty()
+}
+
+fn path_with_fill(d: &str, fill: &str) -> String {
+ Element::new("path").attr("d", d).attr("fill", fill).empty()
+}
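One detail worth noting about the numeric helpers above: Rust's `Display` implementation for `f64` omits the trailing `.0` on whole values, so `format!("{}", cx)` yields compact SVG attribute values:

```rust
fn main() {
    // Display for f64 drops ".0" on whole values, so the ellipse/circle
    // helpers above emit cx="-44" rather than cx="-44.0".
    assert_eq!(format!("{}", -44.0_f64), "-44");
    assert_eq!(format!("{}", 26.0_f64), "26");
    // Fractional values keep their shortest decimal form.
    assert_eq!(format!("{}", 0.85_f64), "0.85");
    println!("ok");
}
```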
diff --git a/src/bin/nixette_wallpaper_gen.rs b/src/bin/nixette_wallpaper_gen.rs
new file mode 100644
index 0000000..225f157
--- /dev/null
+++ b/src/bin/nixette_wallpaper_gen.rs
@@ -0,0 +1,128 @@
+use anyhow::Result;
+use package_management::svg_builder::{
+ Defs, Document, Element, Gradient, Group, RadialGradient, path,
+};
+use std::fs;
+
+fn main() -> Result<()> {
+ let svg = build_wallpaper_svg();
+ fs::create_dir_all("assets")?;
+ fs::write("assets/nixette-wallpaper.svg", svg)?;
+ Ok(())
+}
+
+fn build_wallpaper_svg() -> String {
+ let defs = Defs::new()
+ .linear_gradient(
+ "sky",
+ Gradient::new("0", "0", "1", "1")
+ .stop("0%", &[("stop-color", "#0f172a")])
+ .stop("100%", &[("stop-color", "#1e1b4b")]),
+ )
+ .linear_gradient(
+ "wave1",
+ Gradient::new("0", "0", "1", "0")
+ .stop("0%", &[("stop-color", "#55CDFC"), ("stop-opacity", "0")])
+ .stop("50%", &[("stop-color", "#55CDFC"), ("stop-opacity", "0.5")])
+ .stop("100%", &[("stop-color", "#55CDFC"), ("stop-opacity", "0")]),
+ )
+ .linear_gradient(
+ "wave2",
+ Gradient::new("1", "0", "0", "0")
+ .stop("0%", &[("stop-color", "#F7A8B8"), ("stop-opacity", "0")])
+ .stop(
+ "50%",
+ &[("stop-color", "#F7A8B8"), ("stop-opacity", "0.55")],
+ )
+ .stop("100%", &[("stop-color", "#F7A8B8"), ("stop-opacity", "0")]),
+ )
+ .radial_gradient(
+ "halo",
+ RadialGradient::new("0.5", "0.5", "0.7")
+ .stop("0%", &[("stop-color", "#FDE68A"), ("stop-opacity", "0.8")])
+ .stop("100%", &[("stop-color", "#FDE68A"), ("stop-opacity", "0")]),
+ );
+
+ let text = Group::new()
+ .attr("transform", "translate(940 1320)")
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-size", "220")
+ .attr("font-weight", "700")
+ .attr("letter-spacing", "18")
+ .attr("fill", "#FFFFFF")
+ .attr("opacity", "0.95")
+ .child(Element::new("text").text("NIXETTE"));
+
+ let subtitle = Group::new()
+ .attr("transform", "translate(960 1500)")
+ .attr(
+ "font-family",
+ "'Fira Sans', 'Inter', 'Segoe UI', sans-serif",
+ )
+ .attr("font-size", "64")
+ .attr("fill", "#F7A8B8")
+ .attr("opacity", "0.9")
+ .child(Element::new("text").text("Declarative · Sourceful · Herself"));
+
+ Document::new(3840, 2160)
+ .view_box("0 0 3840 2160")
+ .role("img")
+ .aria_label("title", "desc")
+ .title("Nixette Wallpaper")
+ .desc("Gradient wallpaper combining trans flag waves with Nix and Gentoo motifs.")
+ .add_defs(defs)
+ .add_element(
+ Element::new("rect")
+ .attr("width", "3840")
+ .attr("height", "2160")
+ .attr("fill", "url(#sky)")
+ .empty(),
+ )
+ .add_element(
+ Element::new("rect")
+ .attr("x", "0")
+ .attr("y", "0")
+ .attr("width", "3840")
+ .attr("height", "2160")
+ .attr("fill", "url(#halo)")
+ .attr("opacity", "0.4")
+ .empty(),
+ )
+ .add_element(
+ Element::new("path")
+ .attr("d", "M0 1430 C640 1320 1280 1580 1860 1500 C2440 1420 3040 1660 3840 1500 L3840 2160 L0 2160 Z")
+ .attr("fill", "url(#wave1)")
+ .empty(),
+ )
+ .add_element(
+ Element::new("path")
+ .attr("d", "M0 1700 C500 1580 1200 1880 1900 1760 C2600 1640 3200 1920 3840 1800 L3840 2160 L0 2160 Z")
+ .attr("fill", "url(#wave2)")
+ .empty(),
+ )
+ .add_element(
+ Group::new()
+ .attr("opacity", "0.08")
+ .attr("fill", "none")
+ .attr("stroke", "#FFFFFF")
+ .attr("stroke-width", "24")
+ .child(path("M600 360 l220 -220 h360 l220 220 l-220 220 h-360 z"))
+ .child(path("M600 360 l220 -220"))
+ .child(path("M820 140 l220 220")),
+ )
+ .add_element(
+ Group::new()
+ .attr("opacity", "0.12")
+ .attr("fill", "none")
+ .attr("stroke", "#FFFFFF")
+ .attr("stroke-width", "22")
+ .attr("transform", "translate(2820 320) scale(0.9)")
+ .child(path("M0 0 C120 -40 220 40 220 160 C220 260 160 320 60 320")),
+ )
+ .add_element(text)
+ .add_element(subtitle)
+ .finish()
+}
diff --git a/src/bin/readme_gen.rs b/src/bin/readme_gen.rs
new file mode 100644
index 0000000..ab0390e
--- /dev/null
+++ b/src/bin/readme_gen.rs
@@ -0,0 +1,198 @@
+use std::fs;
+
+fn main() -> anyhow::Result<()> {
+ let readme = Readme::build();
+ fs::write("README.md", readme)?;
+ Ok(())
+}
+
+struct MarkdownDoc {
+ buffer: String,
+}
+
+impl MarkdownDoc {
+ fn new() -> Self {
+ Self {
+ buffer: String::new(),
+ }
+ }
+
+ fn heading(mut self, level: u8, text: &str) -> Self {
+ self.buffer.push_str(&"#".repeat(level as usize));
+ self.buffer.push(' ');
+ self.buffer.push_str(text);
+ self.buffer.push_str("\n\n");
+ self
+ }
+
+ fn raw(mut self, text: &str) -> Self {
+ self.buffer.push_str(text);
+ self.buffer.push('\n');
+ self
+ }
+
+ fn paragraph(mut self, text: &str) -> Self {
+ self.buffer.push_str(text);
+ self.buffer.push_str("\n\n");
+ self
+ }
+
+ fn horizontal_rule(mut self) -> Self {
+ self.buffer.push_str("---\n\n");
+ self
+ }
+
+ fn bullet_list<I, S>(mut self, items: I) -> Self
+ where
+ I: IntoIterator<Item = S>,
+ S: AsRef<str>,
+ {
+ for item in items {
+ self.buffer.push_str("* ");
+ self.buffer.push_str(item.as_ref());
+ self.buffer.push('\n');
+ }
+ self.buffer.push('\n');
+ self
+ }
+
+ fn code_block(mut self, language: &str, code: &str) -> Self {
+ self.buffer.push_str("```");
+ self.buffer.push_str(language);
+ self.buffer.push('\n');
+ self.buffer.push_str(code.trim_matches('\n'));
+ self.buffer.push_str("\n```\n\n");
+ self
+ }
+
+ fn finish(self) -> String {
+ self.buffer
+ }
+}
+
+struct Readme;
+
+impl Readme {
+ fn build() -> String {
+ let doc = MarkdownDoc::new()
+ .heading(1, "🧬 LPKG – Lightweight Package Manager")
+ .raw("<p align=\"center\">\n  <img src=\"assets/logo.svg\" alt=\"LPKG logo\" />\n</p>\n")
+ .paragraph("LPKG is a minimalistic package manager written in Rust, designed for fast and simple software management on Unix-like systems. It emphasizes reproducibility and declarative configuration, leveraging **Nix Flakes** for development and deployment.")
+ .horizontal_rule()
+ .heading(2, "🚀 Features")
+ .bullet_list([
+ "**Fast & Lightweight** – Minimal resource usage and quick operations.",
+ "**Rust-Powered** – Safe and concurrent code with Rust.",
+ "**Cross-Platform** – Works on Linux and macOS.",
+ "**Declarative Builds** – Fully reproducible with Nix Flakes.",
+ "**Simple CLI** – Intuitive commands for managing packages.",
+ ])
+ .horizontal_rule()
+ .heading(2, "⚙️ Installation")
+ .heading(3, "Using Cargo")
+ .code_block("bash", "cargo install lpkg")
+ .heading(3, "Using Nix Flakes")
+ .paragraph("If you have Nix with flakes enabled:")
+ .code_block("bash", "nix profile install github:lesbiannix/lpkg")
+ .paragraph("Or to run without installing:")
+ .code_block("bash", "nix run github:lesbiannix/lpkg")
+ .horizontal_rule()
+ .heading(2, "🧰 Usage")
+ .paragraph("Basic command structure:")
+ .code_block("bash", "lpkg [command] [package]")
+ .paragraph("Common commands:")
+ .bullet_list([
+ "`install` – Install a package",
+ "`remove` – Remove a package",
+ "`update` – Update the package list",
+ "`upgrade` – Upgrade all installed packages",
+ ])
+ .paragraph("For detailed usage:")
+ .code_block("bash", "lpkg --help")
+ .horizontal_rule()
+ .heading(2, "🔧 Development with Flakes")
+ .paragraph("Clone the repository:")
+ .code_block("bash", "git clone https://github.com/lesbiannix/lpkg.git\ncd lpkg")
+ .paragraph("Enter the flake development shell:")
+ .code_block("bash", "nix develop")
+ .paragraph("Build the project:")
+ .code_block("bash", "cargo build")
+ .paragraph("LPKG ships with tuned Cargo profiles:")
+ .bullet_list([
+ "**Dev builds** (`cargo build`) use `opt-level=0`, lots of codegen units, and incremental compilation for quick feedback while hacking.",
+ "**Release builds** (`cargo build --release`) enable `-O3`, fat LTO, and panic aborts for slim, fast binaries.",
+ "**GraphQL builds** add the server components when you need them:",
+ ])
+ .code_block("bash", "cargo build --features graphql")
+ .paragraph("**PGO builds** are a two-step flow using the provided Cargo aliases:")
+ .code_block(
+ "bash",
+ r#"# 1) Instrument
+RUSTFLAGS="-Cprofile-generate=target/pgo-data" cargo pgo-instrument
+# run representative workloads to emit *.profraw files under target/pgo-data
+llvm-profdata merge -o target/pgo-data/lpkg.profdata target/pgo-data/*.profraw
+
+# 2) Optimise with the collected profile
+RUSTFLAGS="-Cprofile-use=target/pgo-data/lpkg.profdata -Cllvm-args=-pgo-warn-missing-function" \
+ cargo pgo-build"#,
+ )
+ .paragraph("Regenerate project artefacts (README and SVG logo):")
+ .code_block("bash", "cargo run --bin readme_gen\ncargo run --bin logo_gen")
+ .paragraph("Run tests:")
+ .code_block("bash", "cargo test")
+ .paragraph("You can also run the project directly in the flake shell:")
+ .code_block("bash", "nix run")
+ .heading(2, "🕸️ GraphQL API")
+ .paragraph("LPKG now ships a lightweight GraphQL server powered by Actix Web and Juniper.")
+ .bullet_list([
+ "Start the server with `cargo run --features graphql --bin graphql_server` (set `LPKG_GRAPHQL_ADDR` to override `127.0.0.1:8080`).",
+ "Query endpoint: `http://127.0.0.1:8080/graphql`",
+ "Interactive playground: `http://127.0.0.1:8080/playground`",
+ ])
+ .paragraph("Example query:")
+ .code_block("graphql", r"{
+ packages(limit: 5) {
+ name
+ version
+ enableLto
+ }
+ randomJoke {
+ package
+ text
+ }
+}")
+ .heading(3, "AI metadata tooling")
+ .paragraph("The AI metadata store under `ai/metadata/` comes with a helper CLI to validate package records against the JSON schema and regenerate `index.json` after adding new entries:")
+ .code_block("bash", r"cargo run --bin metadata_indexer -- --base-dir . validate
+cargo run --bin metadata_indexer -- --base-dir . index")
+ .paragraph("Use `--compact` with `index` if you prefer single-line JSON output.")
+ .paragraph("To draft metadata for a specific book page, you can run the harvest mode. It fetches the XHTML, scrapes the build commands, and emits a schema-compliant JSON skeleton (pass `--dry-run` to inspect the result without writing to disk):")
+ .code_block("bash", r"cargo run --bin metadata_indexer -- \
+ --base-dir . harvest \
+ --book mlfs \
+ --page chapter05/binutils-pass1 \
+ --dry-run")
+ .paragraph("Keep the jhalfs manifests current with:")
+ .code_block("bash", "cargo run --bin metadata_indexer -- --base-dir . refresh")
+ .paragraph("Passing `--books mlfs,blfs` restricts the refresh to specific books, and `--force` bypasses the local cache.")
+ .paragraph("To materialise a Rust module from harvested metadata:")
+ .code_block("bash", r"cargo run --bin metadata_indexer -- \
+ --base-dir . generate \
+ --metadata ai/metadata/packages/mlfs/binutils-pass-1.json \
+ --output target/generated/by_name")
+ .paragraph("Add `--overwrite` to regenerate an existing module directory.")
+ .heading(2, "📚 Documentation")
+ .bullet_list([
+ "[Architecture Overview](docs/ARCHITECTURE.md) – high-level tour of the crate layout, binaries, and supporting modules.",
+ "[Metadata Harvesting Pipeline](docs/METADATA_PIPELINE.md) – how the metadata indexer produces and validates the JSON records under `ai/metadata/`.",
+ "[Package Module Generation](docs/PACKAGE_GENERATION.md) – end-to-end guide for converting harvested metadata into Rust modules under `src/pkgs/by_name/`.",
+ "Concept corner: [Nixette](concepts/nixette/README.md) – a NixOS × Gentoo transfemme mash-up dreamed up for fun brand explorations.",
+ "`ai/notes.md` – scratchpad for ongoing research tasks (e.g., deeper jhalfs integration).",
+ ])
+ .horizontal_rule()
+ .heading(2, "📄 License")
+ .paragraph("LPKG is licensed under the [MIT License](LICENSE).");
+
+ doc.finish()
+ }
+}
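The `MarkdownDoc` builder is easy to verify in isolation. A trimmed-down standalone copy of `heading` and `code_block` (reproducing the `trim_matches('\n')` normalisation from the diff above) behaves as follows:

```rust
// Minimal standalone copy of two MarkdownDoc methods from the diff above.
struct MarkdownDoc {
    buffer: String,
}

impl MarkdownDoc {
    fn new() -> Self {
        Self { buffer: String::new() }
    }

    fn heading(mut self, level: u8, text: &str) -> Self {
        self.buffer.push_str(&"#".repeat(level as usize));
        self.buffer.push(' ');
        self.buffer.push_str(text);
        self.buffer.push_str("\n\n");
        self
    }

    fn code_block(mut self, language: &str, code: &str) -> Self {
        self.buffer.push_str("```");
        self.buffer.push_str(language);
        self.buffer.push('\n');
        // Leading/trailing newlines are stripped so the fences hug the code.
        self.buffer.push_str(code.trim_matches('\n'));
        self.buffer.push_str("\n```\n\n");
        self
    }

    fn finish(self) -> String {
        self.buffer
    }
}

fn main() {
    let doc = MarkdownDoc::new()
        .heading(2, "Install")
        .code_block("bash", "\ncargo install lpkg\n")
        .finish();
    assert_eq!(doc, "## Install\n\n```bash\ncargo install lpkg\n```\n\n");
    println!("{doc}");
}
```

Because `code_block` trims only newlines, callers can pass raw multi-line string literals that start and end with a line break, as `Readme::build` does throughout.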
diff --git a/src/db/mod.rs b/src/db/mod.rs
index 9cc40b9..e606fd8 100644
--- a/src/db/mod.rs
+++ b/src/db/mod.rs
@@ -1,9 +1,11 @@
pub mod models;
pub mod schema;
+use std::cmp;
use std::env;
use anyhow::{Context, Result};
+use diesel::OptionalExtension;
use diesel::prelude::*;
use diesel::r2d2::{self, ConnectionManager};
use diesel::sqlite::SqliteConnection;
@@ -105,3 +107,98 @@ pub fn load_packages_via_pool(pool: &Pool) -> Result<Vec<PackageRecord>> {
let mut conn = pool.get().context("acquiring database connection")?;
load_packages(&mut conn)
}
+
+/// Load package definitions instead of raw Diesel models for convenience.
+pub fn load_package_definitions(conn: &mut SqliteConnection) -> Result<Vec<PackageDefinition>> {
+ load_packages(conn)?
+ .into_iter()
+ .map(|record| record.into_definition())
+ .collect::<Result<Vec<_>>>()
+}
+
+/// Pool-backed helper mirroring [`load_package_definitions`].
+pub fn load_package_definitions_via_pool(pool: &Pool) -> Result<Vec<PackageDefinition>> {
+ let mut conn = pool.get().context("acquiring database connection")?;
+ load_package_definitions(&mut conn)
+}
+
+/// Locate a package by name and optional version, returning the newest matching entry when
+/// the version is not supplied.
+pub fn find_package(
+ conn: &mut SqliteConnection,
+ name: &str,
+ version: Option<&str>,
+) -> Result