chore: stage remaining scaffolding
parent 5a8823fddb
commit 4d0aa78dbd
11 changed files with 696 additions and 0 deletions
Makefile.toml (new file, 36 lines added)
@@ -0,0 +1,36 @@
[tasks.format]
description = "Format Rust code using rustfmt"
install_crate = "rustfmt"
command = "cargo"
args = ["fmt", "--", "--emit=files"]

[tasks.clean]
description = "Clean build artifacts"
command = "cargo"
args = ["clean"]

[tasks.build]
description = "Build the project"
command = "cargo"
args = ["build"]
dependencies = ["clean"]

[tasks.test]
description = "Run tests"
command = "cargo"
args = ["test"]
dependencies = ["clean"]

[tasks.my-flow]
description = "Run full workflow: format, build, test"
dependencies = ["format", "build", "test"]

[tasks.dev-flow]
description = "Full developer workflow: format, lint, build, test"
dependencies = ["format", "clippy", "build", "test"]

[tasks.release-build]
description = "Build the project in release mode"
command = "cargo"
args = ["build", "--release", "--all-features"]
dependencies = ["clean"]
build.rs (new file, 1 line added)
@@ -0,0 +1 @@
fn main() {}
concepts/nixette/README.md (new file, 91 lines added)
@@ -0,0 +1,91 @@
# Nixette – Declarative, Sourceful, and Unapologetically Herself

A playful concept distro imagined as the transfemme child of **NixOS** and **Gentoo**. Nixette blends the reproducible confidence of flakes with the fine-grained self-expression of USE flags, wrapped in a trans flag palette and a big, affirming hug.

---

## Identity Snapshot

- **Tagline:** _Declarative, sourceful, and unapologetically herself._
- **Mascot:** Chibi penguin “Nixie” with pastel pigtails, Nix snowflake + Gentoo swirl hoodie.
- **Palette:** `#55CDFC` (sky blue), `#F7A8B8` (pink), `#FFFFFF`, plus a deep accent `#7C3AED`.
- **Pronoun Prompt:** The installer asks for name/pronouns and personalises the MOTD, systemd messages, and shell prompt.

---

## Feature Mix

| Pillar | How Nixette expresses it |
|--------------------|----------------------------------------------------------------------------------------------------------|
| Reproducibility | Flake-native system definitions with versioned profiles (`comfort-zone`, `diy-princess`, `studio-mode`). |
| Custom compilation | `nix emerge` bridge turns Gentoo ebuild overlays into reproducible derivations with cached binaries. |
| Playful polish | Catppuccin-trans themes, `nixette-style` CLI to sync GTK/Qt/terminal styling, dynamic welcome affirmations. |
| Inclusive defaults | Flatpak + Steam pre-set, accessibility tools (Fcitx5, Orca, speech-dispatcher), pronoun-friendly docs. |

---

## Toolchain Concepts

- **`trans-init` installer** – Guided TUI that outputs `flake.nix`, including overlays for the `nix emerge` bridge. Provides story-mode narration for first boot.
- **`nixette-style`** – Syncs wallpapers, SDDM theme, terminal palette, and Qt/KDE settings, all sourced from a YAML theme pack.
- **`emerge-optional`** – Spins up Gentoo chroots inside Nix build sandboxes for packages happiest as ebuilds. Output is cached as a Nix store derivation.
- **`affirm-d`** – Small daemon rotating `/etc/motd`, desktop notifications, and TTY colour accents with inclusive affirmations.

---

## Profile Catalogue

| Profile | Intent |
|--------------|--------------------------------------------------------------------------------------------|
| Comfort Zone | KDE Plasma, PipeWire, Wayland, cozy defaults, automatic Catgirl cursor + emoji fonts. |
| DIY Princess | Minimal sway-based stack, just the flake scaffolding and overlay hooks for custom builds. |
| Studio Mode | Creative tooling (Krita, Blender, Ardour), low-latency kernels, and GPU tuning. |

---

## Roadmap Sketch

1. **Moodboard → Brand Pack** (logo, icon, wallpapers, VT boot splash).
2. **Prototype flakes** – `nix flake init --template nixette#comfort-zone` etc.
3. **Gentoo overlay bridge** – Validate `nix emerge` on a handful of ebuilds (mesa, wine, gamescope).
4. **Installer draft** – BubbleTea/ratatui-driven TUI, prompts for pronouns + accessibility preferences.
5. **Community docs** – Write an inclusive user guide, contributor covenant, and pronoun style guide.
6. **Launch zine** – Release notes styled like a mini-comic introducing Nixie’s origin story.
7. **Accessibility audit** – Keyboard navigation, screen-reader pass, dyslexia-friendly typography options.
8. **Beta cosy jam** – Invite testers via queer sysadmin spaces; collect feedback through anonymous forms.

---

## Affirmations YAML (snippet)

```yaml
- id: bright-morning
  message: "Good morning, {name}! Your system is as valid and custom as you are."
  colour: "#F7A8B8"
- id: compile-hugs
  message: "Kernel rebuilds take time. You deserve rest breaks and gentle music."
  colour: "#55CDFC"
```

---

## Logo & Wallpaper

See `assets/nixette-logo.svg` for the primary wordmark, `assets/nixette-mascot.svg` for Nixie’s badge, and `assets/nixette-wallpaper.svg` for a 4K wallpaper concept.

### Reference Configs

- `concepts/nixette/sample_flake.nix` demonstrates the comfort-zone profile with `nix emerge`, `affirmd`, and theming hooks.

---

## Contributing Idea Seeds

- Write sample flakes showcasing the hybrid build pipeline.
- Mock up the mascot in SVG for use in documentation.
- Design additional wallpapers (night mode, pride variants, low-light).
- Draft inclusive documentation templates (issue/PR forms, community guidelines).
- Publish a community pledge emphasising safety, pronoun respect, and boundaries.
- Host monthly "compile & chill" streams to showcase contributions.

Let Nixette be the distro that compiles joy, not just binaries. 💜
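The affirmations snippet in the README implies a small, flat schema. As a rough sketch (not part of this commit; `serde`, `serde_yaml`, and `rand` are assumed dependencies, and the struct and function names are invented here), an `affirm-d`-style rotator could load the file and pick an entry like this:

```rust
use rand::rng;
use rand::seq::IteratorRandom;
use serde::Deserialize;

// Mirrors the fields of the YAML snippet in the README.
#[derive(Debug, Deserialize)]
struct Affirmation {
    id: String,
    message: String,
    colour: String,
}

/// Parse the affirmations file and pick one entry at random.
fn pick_affirmation(yaml: &str) -> Option<Affirmation> {
    let entries: Vec<Affirmation> = serde_yaml::from_str(yaml).ok()?;
    entries.into_iter().choose(&mut rng())
}

fn main() {
    let yaml = r##"
- id: bright-morning
  message: "Good morning, {name}! Your system is as valid and custom as you are."
  colour: "#F7A8B8"
- id: compile-hugs
  message: "Kernel rebuilds take time. You deserve rest breaks and gentle music."
  colour: "#55CDFC"
"##;

    if let Some(a) = pick_affirmation(yaml) {
        // Substitute the `{name}` placeholder the installer would have recorded.
        println!("{} [{}] {}", a.id, a.colour, a.message.replace("{name}", "Nixie"));
    }
}
```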
concepts/nixette/sample_flake.nix (new file, 62 lines added)
@@ -0,0 +1,62 @@
{
  description = "Nixette comfort-zone profile";

  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    nixette-overlays.url = "github:nixette/overlays";
    nixette-style.url = "github:nixette/style-pack";
  };

  outputs = { self, nixpkgs, nixette-overlays, nixette-style, ... }@inputs:
    let
      system = "x86_64-linux";
      pkgs = import nixpkgs {
        inherit system;
        overlays = [ nixette-overlays.overlays.nix-emerge ];
      };
    in
    {
      nixosConfigurations.nixette-comfort-zone = nixpkgs.lib.nixosSystem {
        inherit system;
        modules = [
          ./profiles/comfort-zone.nix
          ({ config, pkgs, ... }:
            {
              nixpkgs.config.allowUnfree = true;
              environment.systemPackages = with pkgs; [
                nixette-style
                steam
                lutris
                krita
              ];

              services.nixette.nix-emerge = {
                enable = true;
                ebuilds = [
                  "games-emulation/gamescope"
                  "media-sound/pipewire"
                ];
              };

              services.nixette.affirmd.enable = true;
              services.nixette.affirmd.pronouns = "she/her";
              services.nixette.affirmd.motdPath = ./affirmations.yml;

              programs.plasma.enable = true;
              services.displayManager.sddm.enable = true;
              services.displayManager.sddm.theme = nixette-style.themes.catgirl-sunrise;

              users.users.nixie = {
                isNormalUser = true;
                extraGroups = [ "wheel" "audio" "input" "video" ];
                shell = pkgs.zsh;
              };

              programs.zsh.promptInit = ''
                eval "$(nixette-style prompt --name nixie --pronouns she/her)"
              '';
            })
        ];
      };
    };
}
src/bin/graphql_server.rs (new file, 80 lines added)
@@ -0,0 +1,80 @@
#![cfg(feature = "graphql")]

use std::env;
use std::sync::Arc;

use actix_web::{App, HttpRequest, HttpResponse, HttpServer, middleware::Compress, web};
use anyhow::{Context, Result};
use juniper_actix::{graphiql_handler, graphql_handler};

use package_management::db;
use package_management::graphql::{self, GraphQLContext, Schema};

const DEFAULT_BIND_ADDR: &str = "127.0.0.1:8080";

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    if let Err(err) = run().await {
        eprintln!("GraphQL server failed: {err:#}");
        return Err(std::io::Error::new(
            std::io::ErrorKind::Other,
            err.to_string(),
        ));
    }

    Ok(())
}

async fn run() -> Result<()> {
    let pool = db::establish_pool().context("initialising SQLite pool")?;
    let schema = Arc::new(graphql::create_schema());
    let jokes = Arc::new(graphql::context::JokeCatalog::default());
    let bind_addr = env::var("LPKG_GRAPHQL_ADDR").unwrap_or_else(|_| DEFAULT_BIND_ADDR.to_string());
    let workers = worker_count();

    println!("GraphQL server listening on {bind_addr} with {workers} worker(s)");

    HttpServer::new(move || {
        let app_schema = Arc::clone(&schema);
        let pool = pool.clone();
        let jokes = Arc::clone(&jokes);

        App::new()
            // Register as `Data<Arc<_>>` so the extractor types in the handlers below match;
            // `Data::from(Arc<T>)` would register `Data<T>` instead and fail at request time.
            .app_data(web::Data::new(app_schema))
            .app_data(web::Data::new(pool))
            .app_data(web::Data::new(jokes))
            .wrap(Compress::default())
            .service(
                web::resource("/graphql")
                    .route(web::post().to(graphql_endpoint))
                    .route(web::get().to(graphql_endpoint)),
            )
            .service(web::resource("/playground").route(web::get().to(graphiql_endpoint)))
    })
    .workers(workers)
    .bind(&bind_addr)
    .with_context(|| format!("binding GraphQL server to {bind_addr}"))?
    .run()
    .await
    .context("running GraphQL server")
}

async fn graphql_endpoint(
    schema: web::Data<Arc<Schema>>,
    pool: web::Data<db::Pool>,
    jokes: web::Data<Arc<graphql::context::JokeCatalog>>,
    req: HttpRequest,
    payload: web::Payload,
) -> Result<HttpResponse, actix_web::Error> {
    let context = GraphQLContext::with_catalog(pool.get_ref().clone(), Arc::clone(jokes.get_ref()));
    graphql_handler(schema.get_ref().as_ref(), &context, req, payload).await
}

async fn graphiql_endpoint() -> Result<HttpResponse, actix_web::Error> {
    graphiql_handler("/graphql", None).await
}

fn worker_count() -> usize {
    let suggested = num_cpus::get();
    suggested.clamp(1, 8)
}
src/graphql/context.rs (new file, 138 lines added)
@@ -0,0 +1,138 @@
use std::sync::Arc;

use rand::rng;
use rand::seq::IteratorRandom;

use crate::db;

#[derive(Clone)]
pub struct GraphQLContext {
    pub db_pool: db::Pool,
    jokes: Arc<JokeCatalog>,
}

impl GraphQLContext {
    pub fn new(db_pool: db::Pool) -> Self {
        Self {
            db_pool,
            jokes: Arc::new(JokeCatalog::default()),
        }
    }

    pub fn with_jokes(db_pool: db::Pool, jokes: Vec<Joke>) -> Self {
        Self {
            db_pool,
            jokes: Arc::new(JokeCatalog::new(jokes)),
        }
    }

    pub fn with_catalog(db_pool: db::Pool, catalog: Arc<JokeCatalog>) -> Self {
        Self {
            db_pool,
            jokes: catalog,
        }
    }

    pub fn joke_catalog(&self) -> Arc<JokeCatalog> {
        Arc::clone(&self.jokes)
    }
}

impl juniper::Context for GraphQLContext {}

#[derive(Clone, Debug)]
pub struct Joke {
    pub id: String,
    pub package: Option<String>,
    pub text: String,
}

impl Joke {
    pub fn new(id: impl Into<String>, package: Option<&str>, text: impl Into<String>) -> Self {
        Self {
            id: id.into(),
            package: package.map(|pkg| pkg.to_string()),
            text: text.into(),
        }
    }
}

#[derive(Clone)]
pub struct JokeCatalog {
    entries: Arc<Vec<Joke>>,
}

impl JokeCatalog {
    fn new(entries: Vec<Joke>) -> Self {
        Self {
            entries: Arc::new(entries),
        }
    }

    pub fn random(&self, package: Option<&str>) -> Option<Joke> {
        let mut rng = rng();

        if let Some(package) = package {
            if let Some(chosen) = self
                .entries
                .iter()
                .filter(|joke| matches_package(joke, package))
                .choose(&mut rng)
            {
                return Some(chosen.clone());
            }
        }

        self.entries.iter().choose(&mut rng).cloned()
    }

    pub fn all(&self, package: Option<&str>) -> Vec<Joke> {
        match package {
            Some(package) => self
                .entries
                .iter()
                .filter(|joke| matches_package(joke, package))
                .cloned()
                .collect(),
            None => self.entries.as_ref().clone(),
        }
    }
}

impl Default for JokeCatalog {
    fn default() -> Self {
        Self::new(default_jokes())
    }
}

fn matches_package(joke: &Joke, package: &str) -> bool {
    joke.package
        .as_deref()
        .map(|pkg| pkg.eq_ignore_ascii_case(package))
        .unwrap_or(false)
}

fn default_jokes() -> Vec<Joke> {
    vec![
        Joke::new(
            "optimizer-overdrive",
            Some("gcc"),
            "The GCC optimizer walked into a bar, reordered everyone’s drinks, and they still tasted the same—just faster.",
        ),
        Joke::new(
            "linker-chuckle",
            Some("binutils"),
            "Our linker refuses to go on vacation; it can’t handle unresolved references to the beach.",
        ),
        Joke::new(
            "glibc-giggle",
            Some("glibc"),
            "The C library tried stand-up comedy but segfaulted halfway through the punchline.",
        ),
        Joke::new(
            "pkg-general",
            None,
            "LPKG packages never get lost—they always follow the dependency graph back home.",
        ),
    ]
}
src/graphql/mod.rs (new file, 14 lines added)
@@ -0,0 +1,14 @@
pub mod context;
pub mod schema;

pub use context::{GraphQLContext, Joke};
pub use schema::QueryRoot;

use juniper::{EmptyMutation, EmptySubscription, RootNode};

pub type Schema =
    RootNode<QueryRoot, EmptyMutation<GraphQLContext>, EmptySubscription<GraphQLContext>>;

pub fn create_schema() -> Schema {
    Schema::new(QueryRoot {}, EmptyMutation::new(), EmptySubscription::new())
}
src/graphql/schema.rs (new file, 133 lines added)
@@ -0,0 +1,133 @@
use anyhow::{Error as AnyhowError, Result as AnyhowResult};
use juniper::{FieldResult, GraphQLObject, Value, graphql_object};

use crate::{db, pkgs::package::PackageDefinition};

use super::context::{GraphQLContext, Joke};

#[derive(Clone, GraphQLObject)]
#[graphql(description = "Package metadata exposed via the GraphQL API")]
pub struct PackageType {
    pub name: String,
    pub version: String,
    pub source: Option<String>,
    pub md5: Option<String>,
    pub configure_args: Vec<String>,
    pub build_commands: Vec<String>,
    pub install_commands: Vec<String>,
    pub dependencies: Vec<String>,
    pub enable_lto: bool,
    pub enable_pgo: bool,
    pub cflags: Vec<String>,
    pub ldflags: Vec<String>,
    pub profdata: Option<String>,
}

impl From<PackageDefinition> for PackageType {
    fn from(pkg: PackageDefinition) -> Self {
        let optimizations = pkg.optimizations;

        Self {
            name: pkg.name,
            version: pkg.version,
            source: pkg.source,
            md5: pkg.md5,
            configure_args: pkg.configure_args,
            build_commands: pkg.build_commands,
            install_commands: pkg.install_commands,
            dependencies: pkg.dependencies,
            enable_lto: optimizations.enable_lto,
            enable_pgo: optimizations.enable_pgo,
            cflags: optimizations.cflags,
            ldflags: optimizations.ldflags,
            profdata: optimizations.profdata,
        }
    }
}

#[derive(Clone, GraphQLObject)]
#[graphql(description = "A light-hearted package-related joke")]
pub struct JokeType {
    pub id: String,
    pub package: Option<String>,
    pub text: String,
}

impl From<Joke> for JokeType {
    fn from(joke: Joke) -> Self {
        Self {
            id: joke.id,
            package: joke.package,
            text: joke.text,
        }
    }
}

#[derive(Default)]
pub struct QueryRoot;

#[graphql_object(context = GraphQLContext)]
impl QueryRoot {
    fn packages(context: &GraphQLContext, limit: Option<i32>) -> FieldResult<Vec<PackageType>> {
        let limit = limit.unwrap_or(50).clamp(1, 200) as usize;
        let definitions =
            db::load_package_definitions_via_pool(&context.db_pool).map_err(field_error)?;

        Ok(definitions
            .into_iter()
            .take(limit)
            .map(PackageType::from)
            .collect())
    }

    fn package(
        context: &GraphQLContext,
        name: String,
        version: Option<String>,
    ) -> FieldResult<Option<PackageType>> {
        let definition =
            db::find_package_definition_via_pool(&context.db_pool, &name, version.as_deref())
                .map_err(field_error)?;

        Ok(definition.map(PackageType::from))
    }

    fn search(
        context: &GraphQLContext,
        query: String,
        limit: Option<i32>,
    ) -> FieldResult<Vec<PackageType>> {
        let limit = limit.map(|value| i64::from(value.clamp(1, 200)));
        let results =
            db::search_packages_via_pool(&context.db_pool, &query, limit).map_err(field_error)?;

        let packages = results
            .into_iter()
            .map(|pkg| pkg.into_definition().map(PackageType::from))
            .collect::<AnyhowResult<Vec<_>>>()
            .map_err(field_error)?;

        Ok(packages)
    }

    fn jokes(context: &GraphQLContext, package: Option<String>) -> FieldResult<Vec<JokeType>> {
        let catalog = context.joke_catalog();
        Ok(catalog
            .all(package.as_deref())
            .into_iter()
            .map(JokeType::from)
            .collect())
    }

    fn random_joke(
        context: &GraphQLContext,
        package: Option<String>,
    ) -> FieldResult<Option<JokeType>> {
        let catalog = context.joke_catalog();
        Ok(catalog.random(package.as_deref()).map(JokeType::from))
    }
}

fn field_error(err: AnyhowError) -> juniper::FieldError {
    juniper::FieldError::new(err.to_string(), Value::null())
}
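For a quick sanity check of the schema wiring, the query root can also be exercised in-process with `juniper::execute_sync`, without starting the Actix server. A hypothetical snippet, not part of this commit; it assumes `db::establish_pool()` is usable in the current environment and that the resolvers stay synchronous:

```rust
use anyhow::Context as _;
use juniper::Variables;

use package_management::db;
use package_management::graphql::{self, GraphQLContext};

fn main() -> anyhow::Result<()> {
    let schema = graphql::create_schema();
    let context = GraphQLContext::new(db::establish_pool().context("initialising SQLite pool")?);

    // Juniper exposes resolver names in camelCase, so `random_joke` becomes `randomJoke`.
    let (value, errors) = juniper::execute_sync(
        "{ randomJoke { id text } }",
        None,
        &schema,
        &Variables::new(),
        &context,
    )
    .map_err(|err| anyhow::anyhow!("GraphQL execution failed: {err:?}"))?;

    assert!(errors.is_empty(), "unexpected field errors: {errors:?}");
    println!("{value:?}");
    Ok(())
}
```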
src/tui/animations/donut.rs (new file, 80 lines added)
@@ -0,0 +1,80 @@
use std::time::Duration;
use rsille::canvas::Canvas;
use super::Animation;

const THETA_SPACING: f64 = 0.07;
const PHI_SPACING: f64 = 0.02;

pub struct DonutAnimation {
    a: f64, // rotation around X
    b: f64, // rotation around Z
    size: (u16, u16),
}

impl DonutAnimation {
    pub fn new(width: u16, height: u16) -> Self {
        Self {
            a: 0.0,
            b: 0.0,
            size: (width, height),
        }
    }
}

impl Animation for DonutAnimation {
    fn update(&mut self, delta: Duration) {
        let delta_secs = delta.as_secs_f64();
        self.a += delta_secs;
        self.b += delta_secs * 0.5;
    }

    fn render(&self, canvas: &mut Canvas) {
        let (width, height) = self.size;
        let (width_f, height_f) = (width as f64, height as f64);
        let k2 = 5.0;
        let k1 = width_f * k2 * 3.0 / (8.0 * (height_f + width_f));

        for theta in 0..((2.0 * std::f64::consts::PI / THETA_SPACING) as i32) {
            let theta_f = theta as f64 * THETA_SPACING;
            let cos_theta = theta_f.cos();
            let sin_theta = theta_f.sin();

            for phi in 0..((2.0 * std::f64::consts::PI / PHI_SPACING) as i32) {
                let phi_f = phi as f64 * PHI_SPACING;
                let cos_phi = phi_f.cos();
                let sin_phi = phi_f.sin();

                let cos_a = self.a.cos();
                let sin_a = self.a.sin();
                let cos_b = self.b.cos();
                let sin_b = self.b.sin();

                let h = cos_theta + 2.0;
                let d = 1.0 / (sin_phi * h * sin_a + sin_theta * cos_a + 5.0);
                let t = sin_phi * h * cos_a - sin_theta * sin_a;

                let x = (width_f / 2.0 + 30.0 * d * (cos_phi * h * cos_b - t * sin_b)) as i32;
                let y = (height_f / 2.0 + 15.0 * d * (cos_phi * h * sin_b + t * cos_b)) as i32;
                let z = (1.0 / d) as u8;

                if x >= 0 && x < width as i32 && y >= 0 && y < height as i32 {
                    let luminance = if z > 0 { z } else { 1 };
                    let c = match luminance {
                        0..=31 => '.',
                        32..=63 => '*',
                        64..=95 => 'o',
                        96..=127 => '&',
                        128..=159 => '8',
                        160..=191 => '#',
                        _ => '@',
                    };
                    canvas.put_char(x as u16, y as u16, c);
                }
            }
        }
    }

    fn is_finished(&self) -> bool {
        false // continuous animation
    }
}
src/tui/animations/mod.rs (new file, 13 lines added)
@@ -0,0 +1,13 @@
use rsille::canvas::Canvas;
use std::time::Duration;

pub trait Animation {
    fn update(&mut self, delta: Duration);
    fn render(&self, canvas: &mut Canvas);
    fn is_finished(&self) -> bool;
}

pub trait ProgressAnimation: Animation {
    fn set_progress(&mut self, progress: f64);
    fn get_progress(&self) -> f64;
}
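A hypothetical driver sketch, not part of this commit, showing how these traits are meant to be consumed: step the animation with the elapsed time, redraw, and repeat until it reports completion. The module path, function name, and 16 ms frame budget are assumptions, and flushing the canvas to the terminal is left out:

```rust
use std::time::{Duration, Instant};

use rsille::canvas::Canvas;

use crate::tui::animations::Animation;

/// Step an animation at roughly 60 FPS until it reports completion.
pub fn drive(animation: &mut dyn Animation, canvas: &mut Canvas) {
    let frame = Duration::from_millis(16);
    let mut last = Instant::now();

    while !animation.is_finished() {
        let now = Instant::now();
        animation.update(now - last);
        last = now;

        animation.render(canvas);
        // A real TUI loop would flush `canvas` to the terminal here.
        std::thread::sleep(frame);
    }
}
```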
src/tui/animations/progress.rs (new file, 48 lines added)
@@ -0,0 +1,48 @@
use std::time::Duration;
use rsille::canvas::Canvas;
use super::{Animation, ProgressAnimation};

pub struct ProgressBarAnimation {
    progress: f64,
    width: u16,
    height: u16,
    animation_offset: f64,
}

impl ProgressBarAnimation {
    pub fn new(width: u16, height: u16) -> Self {
        Self {
            progress: 0.0,
            width,
            height,
            animation_offset: 0.0,
        }
    }
}

impl Animation for ProgressBarAnimation {
    fn update(&mut self, delta: Duration) {
        self.animation_offset += delta.as_secs_f64() * 2.0;
        if self.animation_offset >= 1.0 {
            self.animation_offset -= 1.0;
        }
    }

    fn render(&self, canvas: &mut Canvas) {
        // Animated progress bar rendering will be implemented here
    }

    fn is_finished(&self) -> bool {
        self.progress >= 1.0
    }
}

impl ProgressAnimation for ProgressBarAnimation {
    fn set_progress(&mut self, progress: f64) {
        self.progress = progress.clamp(0.0, 1.0);
    }

    fn get_progress(&self) -> f64 {
        self.progress
    }
}