Compare commits

...

5 Commits

Author SHA1 Message Date
b3796bde79
feat: build backend with Nix and add CI
All checks were successful
Publish Docker Images / build-and-publish (push) Successful in 7m54s
2025-11-05 15:13:34 +01:00
245e256015
feat: redact email password in logs 2025-11-05 12:54:12 +01:00
11509a51a1
feat: add rate limiting to the backend’s API 2025-11-05 12:54:12 +01:00
91f6603f5f
feat: relay contact requests to SMTP server 2025-11-05 12:54:12 +01:00
105efdecb4
feat: initialization migration to Nuxt
This commit initializes both the Nuxt frontend and the Rust backend of
the new version of phundrak.com
2025-11-05 12:54:12 +01:00
141 changed files with 19521 additions and 8335 deletions

1
.devenv-root Normal file
View File

@@ -0,0 +1 @@
/home/phundrak/code/web/phundrak.com

View File

@@ -7,6 +7,10 @@ insert_final_newline = true
 charset = utf-8
 trim_trailing_whitespace = true
+[*.{rs, toml}]
+indent_style = space
+indent_size = 4
 [*.{json,ts,css}]
 indent_style = space
 indent_size = 2

3
.env.example Normal file
View File

@@ -0,0 +1,3 @@
NUXT_PUBLIC_TURNSTILE_SITE_KEY="changeme"
NUXT_TURNSTILE_SECRET_KEY="changeme"
NUXT_BACKEND=http://localhost:3001

55
.envrc
View File

@@ -1 +1,54 @@
-use nix
+#!/usr/bin/env bash
if ! has nix_direnv_version || ! nix_direnv_version 3.1.0; then
source_url "https://raw.githubusercontent.com/nix-community/nix-direnv/3.1.0/direnvrc" "sha256-yMJ2OVMzrFaDPn7q8nCBZFRYpL/f0RcHzhmw/i6btJM="
fi
export DEVENV_IN_DIRENV_SHELL=true
# Load .env file if present
dotenv_if_exists
watch_file flake.nix
watch_file flake.lock
watch_file .envrc.local
watch_file backend/shell.nix
watch_file frontend/shell.nix
# Check if .envrc.local exists and contains a shell preference
if [[ -f .envrc.local ]]; then
source .envrc.local
fi
# If no shell is specified, prompt the user interactively
if [[ -z "$NIX_SHELL_NAME" ]]; then
echo ""
echo "🔧 Available development shells:"
echo " 1) frontend - Nuxt.js/Vue development environment"
echo " 2) backend - Rust backend development environment"
echo ""
echo "💡 Tip: Create a .envrc.local file with 'export NIX_SHELL_NAME=frontend' to skip this prompt"
echo ""
# Read user input
read -p "Select shell (1 or 2): " choice
case $choice in
1|frontend)
NIX_SHELL_NAME=frontend
;;
2|backend)
NIX_SHELL_NAME=backend
;;
*)
echo "❌ Invalid choice. Please select 1 or 2."
return 1
;;
esac
echo "✅ Loading ${NIX_SHELL_NAME} environment..."
fi
if ! use flake ".#${NIX_SHELL_NAME}" --no-pure-eval; then
echo "❌ devenv could not be built. The devenv environment was not loaded. Make the necessary changes to flake.nix and hit enter to try again." >&2
fi

217
.github/workflows/README.md vendored Normal file
View File

@@ -0,0 +1,217 @@
# GitHub Actions Workflows
## Docker Image Publishing
The `publish-docker.yml` workflow automatically builds and publishes Docker images for the backend service using Nix.
### Triggers and Tagging Strategy
| Event | Condition | Published Tags | Example |
|--------------|-----------------------------|------------------------|-------------------|
| Tag push | Tag pushed to `main` branch | `latest` + version tag | `latest`, `1.0.0` |
| Branch push | Push to `develop` branch | `develop` | `develop` |
| Pull request | PR opened or updated | `pr<number>` | `pr12` |
| Branch push | Push to `main` (no tag) | `latest` | `latest` |
### Required Secrets
Configure these secrets in your repository settings (`Settings` → `Secrets and variables` → `Actions`):
| Secret Name | Description | Example Value |
|---------------------|---------------------------------------------|-----------------------------------------|
| `DOCKER_USERNAME` | Username for Docker registry authentication | `phundrak` |
| `DOCKER_PASSWORD` | Password or token for Docker registry | Personal Access Token (PAT) or password |
| `CACHIX_AUTH_TOKEN` | (Optional) Token for Cachix caching | Your Cachix auth token |
#### For GitHub Container Registry (ghcr.io)
1. Create a Personal Access Token (PAT):
- Go to GitHub Settings → Developer settings → Personal access tokens → Tokens (classic)
- Click "Generate new token (classic)"
- Select scopes: `write:packages`, `read:packages`, `delete:packages`
- Copy the generated token
2. Add secrets:
- `DOCKER_USERNAME`: Your GitHub username
- `DOCKER_PASSWORD`: The PAT you just created
#### For Docker Hub
1. Create an access token:
- Go to Docker Hub → Account Settings → Security → Access Tokens
- Click "New Access Token"
- Set permissions to "Read, Write, Delete"
- Copy the generated token
2. Add secrets:
- `DOCKER_USERNAME`: Your Docker Hub username
- `DOCKER_PASSWORD`: The access token you just created
#### For Gitea Registry (e.g., labs.phundrak.com)
1. Create an access token in Gitea:
- Log in to your Gitea instance
- Go to Settings (click your avatar → Settings)
- Navigate to Applications → Manage Access Tokens
- Click "Generate New Token"
- Give it a descriptive name (e.g., "Phundrak Labs Docker Registry")
- Select the required permissions:
- `write:package` - Required to publish packages
- `read:package` - Required to pull packages
- Click "Generate Token"
- Copy the generated token immediately (it won't be shown again)
2. Add secrets:
- `DOCKER_USERNAME`: Your Gitea username
- `DOCKER_PASSWORD`: The access token you just created
Note: Gitea's container registry is accessed at `https://your-gitea-instance/username/-/packages`
#### For Other Custom Registries
1. Obtain credentials from your registry administrator
2. Add secrets:
- `DOCKER_USERNAME`: Your registry username
- `DOCKER_PASSWORD`: Your registry password or token
### Configuring Cachix (Build Caching)
Cachix is a Nix binary cache that dramatically speeds up builds by caching build artifacts. The workflow supports configurable Cachix settings.
#### Environment Variables
Configure these in the workflow's `env` section or as repository variables:
| Variable | Description | Default Value | Example |
|--------------------|------------------------------------------------|---------------|--------------------|
| `CACHIX_NAME` | Name of the Cachix cache to use | `devenv` | `phundrak-dot-com` |
| `CACHIX_SKIP_PUSH` | Whether to skip pushing artifacts to the cache | `true` | `false` |
#### Option 1: Pull from Public Cache Only
If you only want to pull from a public cache (no pushing):
1. Set environment variables in the workflow:
```yaml
env:
CACHIX_NAME: devenv # or any public cache name
CACHIX_SKIP_PUSH: true
```
2. No `CACHIX_AUTH_TOKEN` secret is needed
This is useful when using public caches like `devenv` or `nix-community`.
#### Option 2: Use Your Own Cache (Recommended for Faster Builds)
To cache your own build artifacts for faster subsequent builds:
1. Create a Cachix cache:
- Go to https://app.cachix.org
- Sign up and create a new cache (e.g., `your-project-name`)
- Free for public/open-source projects
2. Get your auth token:
- In Cachix, go to your cache settings
- Find your auth token under "Auth tokens"
- Copy the token
3. Add your cache configuration to `flake.nix`:
```nix
nixConfig = {
extra-trusted-public-keys = [
"devenv.cachix.org-1:w1cLUi8dv3hnoSPGAuibQv+f9TZLr6cv/Hm9XgU50cw="
"your-cache-name.cachix.org-1:YOUR_PUBLIC_KEY_HERE"
];
extra-substituters = [
"https://devenv.cachix.org"
"https://your-cache-name.cachix.org"
];
};
```
4. Configure the workflow:
- Edit `.github/workflows/publish-docker.yml`:
```yaml
env:
CACHIX_NAME: your-cache-name
CACHIX_SKIP_PUSH: false
```
- Or set as repository variables in GitHub/Gitea
5. Add your auth token as a secret:
- Go to repository `Settings` → `Secrets and variables` → `Actions`
- Add secret `CACHIX_AUTH_TOKEN` with your token
#### Benefits of Using Your Own Cache
- **Faster builds**: Subsequent builds reuse cached artifacts (Rust dependencies, compiled binaries)
- **Reduced CI time**: Can reduce build time from 10+ minutes to under 1 minute
- **Cost savings**: Less compute time means lower CI costs
- **Shared across branches**: All branches benefit from the same cache
### Configuring the Docker Registry
The target registry is set via the `DOCKER_REGISTRY` environment variable in the workflow file. To change it:
1. Edit `.github/workflows/publish-docker.yml`
2. Modify the `env` section:
```yaml
env:
DOCKER_REGISTRY: ghcr.io # Change to your registry (e.g., docker.io, labs.phundrak.com)
IMAGE_NAME: phundrak/phundrak-dot-com-backend
```
Or set it as a repository variable:
- Go to `Settings` → `Secrets and variables` → `Actions` → `Variables` tab
- Add `DOCKER_REGISTRY` with your desired registry URL
### Image Naming
Images are published with the name: `${DOCKER_REGISTRY}/${IMAGE_NAME}:${TAG}`
For example:
- `labs.phundrak.com/phundrak/phundrak-dot-com-backend:latest`
- `labs.phundrak.com/phundrak/phundrak-dot-com-backend:1.0.0`
- `labs.phundrak.com/phundrak/phundrak-dot-com-backend:develop`
- `labs.phundrak.com/phundrak/phundrak-dot-com-backend:pr12`
### Local Testing
To test the Docker image build locally:
```bash
# Build the image with Nix
nix build .#backendDockerLatest
# Load it into Docker
docker load < result
# Run the container (image name comes from Cargo.toml package.name)
docker run -p 3100:3100 phundrak/phundrak-dot-com-backend:latest
```
### Troubleshooting
#### Authentication Failures
If you see authentication errors:
1. Verify your `DOCKER_USERNAME` and `DOCKER_PASSWORD` secrets are correct
2. For ghcr.io, ensure your PAT has the correct permissions
3. Check that the `DOCKER_REGISTRY` matches your credentials
#### Build Failures
If the Nix build fails:
1. Test the build locally first: `nix build .#backendDockerLatest`
2. Check the GitHub Actions logs for specific error messages
3. Ensure all dependencies in `flake.nix` are correctly specified
#### Image Not Appearing in Registry
1. Verify the workflow completed successfully in the Actions tab
2. Check that the registry URL is correct
3. For ghcr.io, images appear at: `https://github.com/users/USERNAME/packages/container/IMAGE_NAME`
4. Ensure your token has write permissions

View File

@@ -1,50 +0,0 @@
name: deploy
on:
push:
branches:
- main
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v4
with:
node-version: 22.x
- run: npm ci
- uses: purcell/setup-emacs@master
with:
version: 29.1
- name: "Export org to md"
run: emacs -Q --script export.el
- run: npm run build
- name: "Deploy to Cloudflare Pages"
uses: cloudflare/pages-action@v1
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.ACCOUNT_ID }}
projectName: phundrak-com
directory: content/.vuepress/dist/
githubToken: ${{ secrets.TOKEN }}
# - name: "Deploy on the Web"
# uses: appleboy/scp-action@v0.1.7
# with:
# host: ${{ secrets.HOST }}
# username: ${{ secrets.USERNAME }}
# key: ${{ secrets.KEY }}
# port: ${{ secrets.PORT }}
# source: content/.vuepress/dist/*
# target: ${{ secrets.DESTPATH }}
# strip_components: 3
# - name: "Deploy on Gemini"
# uses: appleboy/scp-action@v0.1.7
# with:
# host: ${{ secrets.HOST }}
# username: ${{ secrets.USERNAME }}
# key: ${{ secrets.KEY }}
# port: ${{ secrets.PORT }}
# source: gemini/*
# target: ${{ secrets.DESTPATH_GMI }}
# strip_components: 1

123
.github/workflows/publish-docker.yml vendored Normal file
View File

@@ -0,0 +1,123 @@
name: Publish Docker Images
on:
push:
branches:
- main
- develop
tags:
- 'v*.*.*'
pull_request:
types: [opened, synchronize, reopened]
env:
CACHIX_NAME: devenv
CACHIX_SKIP_PUSH: true
DOCKER_REGISTRY: labs.phundrak.com # Override in repository settings if needed
IMAGE_NAME: phundrak/phundrak-dot-com-backend
jobs:
build-and-publish:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write # Required for pushing to Phundrak Labs registry
pull-requests: read
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Install Nix
uses: cachix/install-nix-action@v27
with:
nix_path: nixpkgs=channel:nixos-unstable
- name: Setup Cachix
uses: cachix/cachix-action@v15
with:
name: '${{ env.CACHIX_NAME }}'
authToken: '${{ secrets.CACHIX_AUTH_TOKEN }}'
skipPush: ${{ env.CACHIX_SKIP_PUSH }}
- name: Build Docker image with Nix
run: |
echo "Building Docker image..."
nix build .#backendDockerLatest --accept-flake-config
- name: Load Docker image
run: |
echo "Loading Docker image into Docker daemon..."
docker load < result
- name: Log in to Docker Registry
run: |
echo "${{ secrets.DOCKER_PASSWORD }}" | docker login ${{ env.DOCKER_REGISTRY }} -u ${{ secrets.DOCKER_USERNAME }} --password-stdin
- name: Determine tags and push images
run: |
set -euo pipefail
REGISTRY="${{ env.DOCKER_REGISTRY }}"
IMAGE_NAME="${{ env.IMAGE_NAME }}"
# The locally built image from Nix (name comes from Cargo.toml package.name)
LOCAL_IMAGE="phundrak/phundrak-dot-com-backend:latest"
echo "Event: ${{ github.event_name }}"
echo "Ref: ${{ github.ref }}"
echo "Ref type: ${{ github.ref_type }}"
# Determine which tags to push based on the event
if [[ "${{ github.event_name }}" == "push" && "${{ github.ref_type }}" == "tag" ]]; then
# Tag push on main branch → publish 'latest' and versioned tag
echo "Tag push detected"
TAG_VERSION="${{ github.ref_name }}"
# Remove 'v' prefix if present (v1.0.0 → 1.0.0)
TAG_VERSION="${TAG_VERSION#v}"
echo "Tagging and pushing: ${REGISTRY}/${IMAGE_NAME}:latest"
docker tag "${LOCAL_IMAGE}" "${REGISTRY}/${IMAGE_NAME}:latest"
docker push "${REGISTRY}/${IMAGE_NAME}:latest"
echo "Tagging and pushing: ${REGISTRY}/${IMAGE_NAME}:${TAG_VERSION}"
docker tag "${LOCAL_IMAGE}" "${REGISTRY}/${IMAGE_NAME}:${TAG_VERSION}"
docker push "${REGISTRY}/${IMAGE_NAME}:${TAG_VERSION}"
elif [[ "${{ github.event_name }}" == "push" && "${{ github.ref }}" == "refs/heads/develop" ]]; then
# Push on develop branch → publish 'develop' tag
echo "Push to develop branch detected"
echo "Tagging and pushing: ${REGISTRY}/${IMAGE_NAME}:develop"
docker tag "${LOCAL_IMAGE}" "${REGISTRY}/${IMAGE_NAME}:develop"
docker push "${REGISTRY}/${IMAGE_NAME}:develop"
elif [[ "${{ github.event_name }}" == "pull_request" ]]; then
# Pull request → publish 'pr<number>' tag
echo "Pull request detected"
PR_NUMBER="${{ github.event.pull_request.number }}"
echo "Tagging and pushing: ${REGISTRY}/${IMAGE_NAME}:pr${PR_NUMBER}"
docker tag "${LOCAL_IMAGE}" "${REGISTRY}/${IMAGE_NAME}:pr${PR_NUMBER}"
docker push "${REGISTRY}/${IMAGE_NAME}:pr${PR_NUMBER}"
elif [[ "${{ github.event_name }}" == "push" && "${{ github.ref }}" == "refs/heads/main" ]]; then
# Push to main branch (not a tag) → publish 'latest'
echo "Push to main branch detected"
echo "Tagging and pushing: ${REGISTRY}/${IMAGE_NAME}:latest"
docker tag "${LOCAL_IMAGE}" "${REGISTRY}/${IMAGE_NAME}:latest"
docker push "${REGISTRY}/${IMAGE_NAME}:latest"
else
echo "Unknown event or ref, skipping push"
exit 1
fi
- name: Log out from Docker Registry
if: always()
run: docker logout ${{ env.DOCKER_REGISTRY }}
- name: Image published successfully
run: |
echo "✅ Docker image(s) published successfully to ${{ env.DOCKER_REGISTRY }}/${{ env.IMAGE_NAME }}"

38
.gitignore vendored
View File

@@ -1,6 +1,36 @@
-node_modules
 .temp
 .cache
-/content/.vuepress/dist/*
-*.md
-/.yarn/
+.devenv
+# Logs
logs
*.log
# Misc
.DS_Store
.fleet
.idea
# Local env files
.env
.env.*
!.env.example
# Backend
target/
coverage/
# Frontend
## Nuxt dev/build outputs
.output
.data
.nuxt
.nitro
.cache
dist
## Node dependencies
node_modules
# Nix
result

View File

@@ -1,3 +0,0 @@
enableMessageNames: false
nodeLinker: node-modules

View File

@@ -1,51 +1,76 @@
 #+title: phundrak.com
-#+html: <a href="https://www.gnu.org/software/emacs/"><img src="https://img.shields.io/badge/Emacs-29.1-blueviolet.svg?style=flat-square&logo=GNU%20Emacs&logoColor=white" /></a>
-#+html: <a href="https://orgmode.org/"><img src="https://img.shields.io/badge/Written%20with-Org%20mode-success?logo=Org&logoColor=white&style=flat-square"/></a>
-#+html: <a href="https://v2.vuepress.vuejs.org/"><img src="https://img.shields.io/badge/Framework-Vuepress-42D392?logo=Vue.js&logoColor=white&style=flat-square"/></a>
-#+html: <a href="https://phundrak.com"><img src="https://img.shields.io/badge/dynamic/json?label=Website&query=%24%5B%3A1%5D.status&url=https%3A%2F%2Fdrone.phundrak.com%2Fapi%2Frepos%2Fphundrak%2Fphundrak.com%2Fbuilds&style=flat-square&logo=buffer" /></a>
+#+html: <a href="https://www.rust-lang.org/"><img src="https://img.shields.io/badge/Rust-Backend-orange.svg?style=flat-square&logo=Rust&logoColor=white" /></a>
+#+html: <a href="https://nuxt.com/"><img src="https://img.shields.io/badge/Frontend-Nuxt%204-00DC82?logo=Nuxt.js&logoColor=white&style=flat-square"/></a>
+#+html: <a href="https://vuejs.org/"><img src="https://img.shields.io/badge/Vue-3-42B883?logo=Vue.js&logoColor=white&style=flat-square"/></a>
+#+html: <a href="https://phundrak.com"><img src="https://img.shields.io/badge/Website-phundrak.com-blue?style=flat-square&logo=buffer" /></a>
 * Introduction
 This is the repository for my website [[https://phundrak.com][phundrak.com]] which contains the
 code available on the =main= branch. Code available on the =develop=
 branch is available at [[https://beta.phundrak.com][beta.phundrak.com]].
-* Structure of the project
-This website is made with [[https://v2.vuepress.vuejs.org/][VuePress]], a Vue-powered static site
-generator. You can find its Node.JS configuration in the [[file:package.json][package.json]]
-file as well as its content and general configuration in the directory
-[[file:content/][content]].
-** Installing and running
-In order to run the website, you firts need to export all the orgmode
-files to Markdown files. I recommend using =ox-gfm= to do so. If you
-dont mind =package.el= installing it as well as =f.el=, you can run the
-following command:
-#+begin_src shell
-emacs -Q --script export.el
-#+end_src
-To install the NPM dependencies for the project, run one of the
-following commands:
-#+begin_src shell
-yarn
-# or
-npm install # delete the yarn.lock file before
-#+end_src
-To run the project, run one of the following commands using the same
-package manager as above:
-#+begin_src shell
-yarn dev
-# or
-npm run dev
-#+end_src
-You can compile the website to a static website by running
-#+begin_src shell
-yarn build
-# or
-npm run build
-#+end_src
-The compiled version of the website can then be found in =content/.vuepress/dist=.
+* Architecture
+The website follows a modern full-stack architecture:
+- *Backend*: Rust using the [[https://github.com/poem-web/poem][Poem]] web framework (located in [[file:backend/][backend/]])
+- *Frontend*: Nuxt 4 + Vue 3 + TypeScript (located in [[file:frontend/][frontend/]])
+** Backend
+The backend is written in Rust and provides a RESTful API using the
+Poem framework with OpenAPI support.
+*** Running the Backend
+To run the backend in development mode:
+#+begin_src shell
+cd backend
+cargo run
+#+end_src
+To run tests:
+#+begin_src shell
+cd backend
+cargo test
+#+end_src
+For continuous testing and linting during development, use [[https://dystroy.org/bacon/][bacon]]:
+#+begin_src shell
+cd backend
+bacon
+#+end_src
+*** Building the Backend
+To build the backend for production:
+#+begin_src shell
+cd backend
+cargo build --release
+#+end_src
+The compiled binary will be available at =backend/target/release/backend=.
+** Frontend
+The frontend is built with Nuxt 4, Vue 3, and TypeScript, providing a
+modern single-page application experience.
+*** Installing Dependencies
+First, install the required dependencies using =pnpm=:
+#+begin_src shell
+cd frontend
+pnpm install
+#+end_src
+*** Running the Frontend
+To run the frontend in development mode:
+#+begin_src shell
+cd frontend
+pnpm dev
+#+end_src
+*** Building the Frontend
+To build the frontend for production:
+#+begin_src shell
+cd frontend
+pnpm build
+#+end_src
+The compiled version of the website can then be found in =frontend/.output=.

View File

@@ -0,0 +1,6 @@
[all]
out = ["Xml"]
target-dir = "coverage"
output-dir = "coverage"
fail-under = 60
exclude-files = ["target/*"]

View File

@@ -0,0 +1,7 @@
[all]
out = ["Html", "Lcov"]
skip-clean = true
target-dir = "coverage"
output-dir = "coverage"
fail-under = 60
exclude-files = ["target/*"]

3249
backend/Cargo.lock generated Normal file

File diff suppressed because it is too large Load Diff

33
backend/Cargo.toml Normal file
View File

@@ -0,0 +1,33 @@
[package]
name = "phundrak-dot-com-backend"
version = "0.1.0"
edition = "2024"
publish = false
authors = ["Lucien Cartier-Tilet <lucien@phundrak.com>"]
license = "AGPL-3.0-only"
[lib]
path = "src/lib.rs"
[[bin]]
path = "src/main.rs"
name = "phundrak-dot-com-backend"
[dependencies]
chrono = { version = "0.4.42", features = ["serde"] }
config = { version = "0.15.18", features = ["yaml"] }
dotenvy = "0.15.7"
governor = "0.8.0"
lettre = { version = "0.11.19", default-features = false, features = ["builder", "hostname", "pool", "rustls-tls", "tokio1", "tokio1-rustls-tls", "smtp-transport"] }
poem = { version = "3.1.12", default-features = false, features = ["csrf", "rustls", "test"] }
poem-openapi = { version = "5.1.16", features = ["chrono", "swagger-ui"] }
serde = "1.0.228"
serde_json = "1.0.145"
thiserror = "2.0.17"
tokio = { version = "1.48.0", features = ["macros", "rt-multi-thread"] }
tracing = "0.1.41"
tracing-subscriber = { version = "0.3.20", features = ["fmt", "std", "env-filter", "registry", "json", "tracing-log"] }
validator = { version = "0.20.0", features = ["derive"] }
[lints.rust]
unexpected_cfgs = { level = "warn", check-cfg = ['cfg(tarpaulin_include)'] }

424
backend/README.md Normal file
View File

@@ -0,0 +1,424 @@
# phundrak.com Backend
The backend for [phundrak.com](https://phundrak.com), built with Rust and the [Poem](https://github.com/poem-web/poem) web framework.
## Features
- **RESTful API** with automatic OpenAPI/Swagger documentation
- **Rate limiting** with configurable per-second limits using the
Generic Cell Rate Algorithm (thanks to
[`governor`](https://github.com/boinkor-net/governor))
- **Contact form** with SMTP email relay (supports TLS, STARTTLS, and
unencrypted)
- **Type-safe routing** using Poem's declarative API
- **Hierarchical configuration** with YAML files and environment
variable overrides
- **Structured logging** with `tracing` and `tracing-subscriber`
- **Strict linting** for code quality and safety
- **Comprehensive testing** with integration test support
## API Endpoints
The application provides the following endpoints:
- **Swagger UI**: `/` - Interactive API documentation
- **OpenAPI Spec**: `/specs` - OpenAPI specification in YAML format
- **Health Check**: `GET /api/health` - Returns server health status
- **Application Metadata**: `GET /api/meta` - Returns version and build info
- **Contact Form**: `POST /api/contact` - Submit contact form (relays to SMTP)
## Configuration
Configuration is loaded from multiple sources in order of precedence:
1. `settings/base.yaml` - Base configuration
2. `settings/{environment}.yaml` - Environment-specific (development/production)
3. Environment variables prefixed with `APP__` (e.g., `APP__APPLICATION__PORT=8080`)
The environment is determined by the `APP_ENVIRONMENT` variable (defaults to "development").
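As a rough sketch, this layering can be expressed with the [`config`](https://crates.io/crates/config) crate roughly as follows; the struct and field shown here are illustrative, not the project's actual `settings.rs`:
```rust
use config::{Config, ConfigError, Environment, File};
use serde::Deserialize;

/// Illustrative subset of the settings tree (not the real `Settings` struct).
#[derive(Debug, Deserialize)]
struct Settings {
    debug: bool,
}

fn load_settings() -> Result<Settings, ConfigError> {
    // APP_ENVIRONMENT selects the override file; "development" is the default.
    let environment =
        std::env::var("APP_ENVIRONMENT").unwrap_or_else(|_| "development".into());

    Config::builder()
        // 1. Base configuration
        .add_source(File::with_name("settings/base.yaml"))
        // 2. Environment-specific overrides
        .add_source(File::with_name(&format!("settings/{environment}.yaml")))
        // 3. APP__-prefixed environment variables, e.g. APP__APPLICATION__PORT=8080
        .add_source(
            Environment::with_prefix("APP")
                .prefix_separator("__")
                .separator("__"),
        )
        .build()?
        .try_deserialize()
}
```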
### Configuration Example
```yaml
application:
port: 3100
version: "0.1.0"
email:
host: smtp.example.com
port: 587
user: user@example.com
from: Contact Form <noreply@example.com>
password: your_password
recipient: Admin <admin@example.com>
starttls: true # Use STARTTLS (typically port 587)
tls: false # Use implicit TLS (typically port 465)
rate_limit:
enabled: true # Enable/disable rate limiting
burst_size: 10 # Maximum requests allowed in time window
per_seconds: 60 # Time window in seconds (100 req/60s = ~1.67 req/s)
```
You can also use a `.env` file for local development settings.
### Rate Limiting
The application includes built-in rate limiting to protect against abuse:
- Uses the **Generic Cell Rate Algorithm (GCRA)** via the `governor` crate
- **In-memory rate limiting** - no external dependencies like Redis required
- **Configurable limits** via YAML configuration or environment variables
- **Per-second rate limiting** with burst support
- Returns `429 Too Many Requests` when limits are exceeded
Default configuration (from `settings/base.yaml`): a burst of 10 requests per 60-second window.
To disable rate limiting, set `rate_limit.enabled: false` in your configuration.
## Development
### Prerequisites
**Option 1: Native Development**
- Rust (latest stable version recommended)
- Cargo (comes with Rust)
**Option 2: Nix Development (Recommended)**
- [Nix](https://nixos.org/download) with flakes enabled
- All dependencies managed automatically
### Running the Server
**With Cargo:**
```bash
cargo run
```
**With Nix development shell:**
```bash
nix develop .#backend
cargo run
```
The server will start on the configured port (default: 3100).
### Building
**With Cargo:**
For development builds:
```bash
cargo build
```
For optimized production builds:
```bash
cargo build --release
```
The compiled binary will be at `target/release/phundrak-dot-com-backend`.
**With Nix:**
Build the backend binary:
```bash
nix build .#backend
# Binary available at: ./result/bin/phundrak-dot-com-backend
```
Build Docker images:
```bash
# Build versioned Docker image (e.g., 0.1.0)
nix build .#backendDocker
# Build latest Docker image
nix build .#backendDockerLatest
# Load into Docker
docker load < result
# Image will be available as: phundrak/phundrak-dot-com-backend:latest
```
The Nix build ensures reproducible builds with all dependencies pinned.
## Testing
Run all tests:
```bash
cargo test
# or
just test
```
Run a specific test:
```bash
cargo test <test_name>
```
Run tests with output:
```bash
cargo test -- --nocapture
```
Run tests with coverage:
```bash
cargo tarpaulin --config .tarpaulin.local.toml
# or
just coverage
```
### Testing Notes
- Integration tests use random TCP ports to avoid conflicts
- Tests use `get_test_app()` helper for consistent test setup
- Telemetry is automatically disabled during tests
- Tests are organized in `#[cfg(test)]` modules within each file
## Code Quality
### Linting
This project uses extremely strict Clippy linting rules:
- `#![deny(clippy::all)]`
- `#![deny(clippy::pedantic)]`
- `#![deny(clippy::nursery)]`
- `#![warn(missing_docs)]`
Run Clippy to check for issues:
```bash
cargo clippy --all-targets
# or
just lint
```
All code must pass these checks before committing.
### Continuous Checking with Bacon
For continuous testing and linting during development, use [bacon](https://dystroy.org/bacon/):
```bash
bacon # Runs clippy-all by default
bacon test # Runs tests continuously
bacon clippy # Runs clippy on default target only
```
Press 'c' in bacon to run clippy-all.
## Code Style
### Error Handling
- Use `thiserror` for custom error types
- Always return `Result` types for fallible operations
- Use descriptive error messages
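A minimal sketch of this pattern (this exact error type is not part of the codebase):
```rust
use thiserror::Error;

/// Illustrative error type for a contact-form flow.
#[derive(Debug, Error)]
pub enum ContactError {
    /// Building the SMTP transport or sending the message failed.
    #[error("failed to send email: {0}")]
    Smtp(#[from] lettre::transport::smtp::Error),
    /// The submitted form data failed validation.
    #[error("invalid contact request: {0}")]
    Validation(String),
}

fn check_name(name: &str) -> Result<(), ContactError> {
    if name.trim().is_empty() {
        return Err(ContactError::Validation("name must not be empty".into()));
    }
    Ok(())
}
```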
### Logging
Always use `tracing::event!` with proper target and level:
```rust
tracing::event!(
target: "backend", // or "backend::module_name"
tracing::Level::INFO,
"Message here"
);
```
### Imports
Organize imports in three groups:
1. Standard library (`std::*`)
2. External crates (poem, serde, etc.)
3. Local modules (`crate::*`)
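For example, an import block following this grouping (paths are illustrative):
```rust
// 1. Standard library
use std::{net::IpAddr, sync::Arc, time::Duration};

// 2. External crates
use poem::{Endpoint, Request, Response};
use serde::Deserialize;

// 3. Local modules
use crate::settings::Settings;
use crate::telemetry;
```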
### Testing Conventions
- Use `#[tokio::test]` for async tests
- Use descriptive test names that explain what is being tested
- Test both success and error cases
- For endpoint tests, verify both status codes and response bodies
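A sketch following these conventions, modeled on the existing endpoint tests (only the status code is asserted here, since the `/api/health` response body is not described above):
```rust
#[cfg(test)]
mod tests {
    #[tokio::test]
    async fn health_endpoint_returns_ok() {
        // get_test_app() binds the application to a random local port.
        let app = crate::get_test_app();
        let cli = poem::test::TestClient::new(app);

        let resp = cli.get("/api/health").send().await;

        // For endpoints with a defined body, JSON assertions would follow,
        // as in the contact form tests.
        resp.assert_status_is_ok();
    }
}
```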
## Project Structure
```
backend/
├── src/
│ ├── main.rs # Application entry point
│ ├── lib.rs # Library root with run() and prepare()
│ ├── startup.rs # Application builder, server setup
│ ├── settings.rs # Configuration management
│ ├── telemetry.rs # Logging and tracing setup
│ ├── middleware/ # Custom middleware
│ │ ├── mod.rs # Middleware module
│ │ └── rate_limit.rs # Rate limiting middleware
│ └── route/ # API route handlers
│ ├── mod.rs # Route organization
│ ├── contact.rs # Contact form endpoint
│ ├── health.rs # Health check endpoint
│ └── meta.rs # Metadata endpoint
├── settings/ # Configuration files
│ ├── base.yaml # Base configuration
│ ├── development.yaml # Development overrides
│ └── production.yaml # Production overrides
├── Cargo.toml # Dependencies and metadata
└── README.md # This file
```
## Architecture
### Application Initialization Flow
1. `main.rs` calls `run()` from `lib.rs`
2. `run()` calls `prepare()` which:
- Loads environment variables from `.env` file
- Initializes `Settings` from YAML files and environment variables
- Sets up telemetry/logging (unless in test mode)
- Builds the `Application` with optional TCP listener
3. `Application::build()`:
- Sets up OpenAPI service with all API endpoints
- Configures Swagger UI at the root path (`/`)
- Configures API routes under `/api` prefix
- Creates server with TCP listener
4. Application runs with CORS middleware and settings injected as data
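A condensed, illustrative sketch of the wiring in steps 3 and 4, using a single stand-in endpoint (the real `Application::build` also injects the settings and supports an optional pre-bound test listener):
```rust
use poem::{listener::TcpListener, middleware::Cors, EndpointExt, Route, Server};
use poem_openapi::{payload::PlainText, OpenApi, OpenApiService};

struct HealthApi;

#[OpenApi]
impl HealthApi {
    /// Stand-in health check endpoint.
    #[oai(path = "/health", method = "get")]
    async fn health(&self) -> PlainText<&'static str> {
        PlainText("ok")
    }
}

#[tokio::main]
async fn main() -> Result<(), std::io::Error> {
    let api_service =
        OpenApiService::new(HealthApi, "backend", "0.1.0").server("http://127.0.0.1:3100/api");
    let ui = api_service.swagger_ui();           // Swagger UI served at `/`
    let spec = api_service.spec_endpoint_yaml(); // OpenAPI spec served at `/specs`

    let app = Route::new()
        .nest("/api", api_service)
        .nest("/specs", spec)
        .nest("/", ui)
        .with(Cors::new());

    Server::new(TcpListener::bind("127.0.0.1:3100"))
        .run(app)
        .await
}
```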
### Email Handling
The contact form supports multiple SMTP configurations:
- **Implicit TLS (SMTPS)** - typically port 465
- **STARTTLS (Always/Opportunistic)** - typically port 587
- **Unencrypted** (for local dev) - with or without authentication
The `SmtpTransport` is built dynamically from `EmailSettings` based on
TLS/STARTTLS configuration.
## Docker Deployment
### Using Pre-built Images
Docker images are automatically built and published via GitHub Actions to the configured container registry.
Pull and run the latest image:
```bash
# Pull from Phundrak Labs (labs.phundrak.com)
docker pull labs.phundrak.com/phundrak/phundrak-dot-com-backend:latest
# Run the container
docker run -d \
--name phundrak-backend \
-p 3100:3100 \
-e APP__APPLICATION__PORT=3100 \
-e APP__EMAIL__HOST=smtp.example.com \
-e APP__EMAIL__PORT=587 \
-e APP__EMAIL__USER=user@example.com \
-e APP__EMAIL__PASSWORD=your_password \
-e APP__EMAIL__FROM="Contact Form <noreply@example.com>" \
-e APP__EMAIL__RECIPIENT="Admin <admin@example.com>" \
labs.phundrak.com/phundrak/phundrak-dot-com-backend:latest
```
### Available Image Tags
The following tags are automatically published:
- `latest` - Latest stable release (from tagged commits on `main`)
- `<version>` - Specific version (e.g., `1.0.0`, from tagged commits like `v1.0.0`)
- `develop` - Latest development build (from `develop` branch)
- `pr<number>` - Pull request preview builds (e.g., `pr12`)
### Building Images Locally
Build with Nix (recommended for reproducibility):
```bash
nix build .#backendDockerLatest
docker load < result
docker run -p 3100:3100 phundrak/phundrak-dot-com-backend:latest
```
Build with Docker directly:
```bash
# Note: This requires a Dockerfile (not included in this project)
# Use Nix builds for containerization
```
### Docker Compose Example
```yaml
version: '3.8'
services:
backend:
image: labs.phundrak.com/phundrak/phundrak-dot-com-backend:latest
ports:
- "3100:3100"
environment:
APP__APPLICATION__PORT: 3100
APP__EMAIL__HOST: smtp.example.com
APP__EMAIL__PORT: 587
APP__EMAIL__USER: ${SMTP_USER}
APP__EMAIL__PASSWORD: ${SMTP_PASSWORD}
APP__EMAIL__FROM: "Contact Form <noreply@example.com>"
APP__EMAIL__RECIPIENT: "Admin <admin@example.com>"
APP__EMAIL__STARTTLS: true
APP__RATE_LIMIT__ENABLED: true
APP__RATE_LIMIT__BURST_SIZE: 10
APP__RATE_LIMIT__PER_SECONDS: 60
restart: unless-stopped
```
## CI/CD Pipeline
### Automated Docker Publishing
GitHub Actions automatically builds and publishes Docker images based on repository events:
| Event Type | Trigger | Published Tags |
|-----------------|------------------------------|-------------------------------|
| Tag push | `v*.*.*` tag on `main` | `latest`, `<version>` |
| Branch push | Push to `develop` | `develop` |
| Pull request | PR opened/updated | `pr<number>` |
| Branch push | Push to `main` (no tag) | `latest` |
### Workflow Details
The CI/CD pipeline (`.github/workflows/publish-docker.yml`):
1. **Checks out the repository**
2. **Installs Nix** with flakes enabled
3. **Builds the Docker image** using Nix for reproducibility
4. **Authenticates** with the configured Docker registry
5. **Tags and pushes** images based on the event type
### Registry Configuration
Images are published to the registry specified by the `DOCKER_REGISTRY` environment variable in the workflow (default: `labs.phundrak.com`).
To use the published images, authenticate with the registry:
```bash
# For Phundrak Labs (labs.phundrak.com)
echo $GITHUB_TOKEN | docker login labs.phundrak.com -u USERNAME --password-stdin
# Pull the image
docker pull labs.phundrak.com/phundrak/phundrak-dot-com-backend:latest
```
### Required Secrets
The workflow requires these GitHub secrets:
- `DOCKER_USERNAME` - Registry username
- `DOCKER_PASSWORD` - Registry password or token
- `CACHIX_AUTH_TOKEN` - (Optional) For Nix build caching
See [.github/workflows/README.md](../.github/workflows/README.md) for detailed setup instructions.
## License
AGPL-3.0-only - See the root repository for full license information.

84
backend/bacon.toml Normal file
View File

@@ -0,0 +1,84 @@
# This is a configuration file for the bacon tool
#
# Bacon repository: https://github.com/Canop/bacon
# Complete help on configuration: https://dystroy.org/bacon/config/
# You can also check bacon's own bacon.toml file
# as an example: https://github.com/Canop/bacon/blob/main/bacon.toml
default_job = "clippy-all"
[jobs.check]
command = ["cargo", "check", "--color", "always"]
need_stdout = false
[jobs.check-all]
command = ["cargo", "check", "--all-targets", "--color", "always"]
need_stdout = false
# Run clippy on the default target
[jobs.clippy]
command = [
"cargo", "clippy",
"--color", "always",
]
need_stdout = false
[jobs.clippy-all]
command = [
"cargo", "clippy",
"--all-targets",
"--color", "always",
]
need_stdout = false
[jobs.test]
command = [
"cargo", "test", "--color", "always",
"--", "--color", "always", # see https://github.com/Canop/bacon/issues/124
]
need_stdout = true
[jobs.doc]
command = ["cargo", "doc", "--color", "always", "--no-deps"]
need_stdout = false
# If the doc compiles, then it opens in your browser and bacon switches
# to the previous job
[jobs.doc-open]
command = ["cargo", "doc", "--color", "always", "--no-deps", "--open"]
need_stdout = false
on_success = "back" # so that we don't open the browser at each change
# You can run your application and have the result displayed in bacon,
# *if* it makes sense for this crate.
# Don't forget the `--color always` part or the errors won't be
# properly parsed.
# If your program never stops (eg a server), you may set `background`
# to false to have the cargo run output immediately displayed instead
# of waiting for program's end.
[jobs.run]
command = [
"cargo", "run",
"--color", "always",
# put launch parameters for your program behind a `--` separator
]
need_stdout = true
allow_warnings = true
background = true
# This parameterized job runs the example of your choice, as soon
# as the code compiles.
# Call it as
# bacon ex -- my-example
[jobs.ex]
command = ["cargo", "run", "--color", "always", "--example"]
need_stdout = true
allow_warnings = true
# You may define here keybindings that would be specific to
# a project, for example a shortcut to launch a specific job.
# Shortcuts to internal functions (scrolling, toggling, etc.)
# should go in your personal global prefs.toml file instead.
[keybindings]
# alt-m = "job:my-job"
c = "job:clippy-all" # comment this to have 'c' run clippy on only the default target

51
backend/deny.toml Normal file
View File

@@ -0,0 +1,51 @@
[output]
feature-depth = 1
[advisories]
ignore = []
[licenses]
# List of explicitly allowed licenses
# See https://spdx.org/licenses/ for list of possible licenses
allow = [
"0BSD",
"AGPL-3.0-only",
"Apache-2.0 WITH LLVM-exception",
"Apache-2.0",
"BSD-3-Clause",
"CDLA-Permissive-2.0",
"ISC",
"MIT",
"MPL-2.0",
"OpenSSL",
"Unicode-3.0",
"Zlib",
]
confidence-threshold = 0.8
exceptions = []
[licenses.private]
ignore = false
registries = []
[bans]
multiple-versions = "allow"
wildcards = "allow"
highlight = "all"
workspace-default-features = "allow"
external-default-features = "allow"
allow = []
deny = []
skip = []
skip-tree = []
[sources]
unknown-registry = "deny"
unknown-git = "deny"
allow-registry = ["https://github.com/rust-lang/crates.io-index"]
allow-git = []
[sources.allow-org]
github = []
gitlab = []
bitbucket = []

48
backend/justfile Normal file
View File

@@ -0,0 +1,48 @@
default: run
run:
cargo run
run-release:
cargo run --release
format:
cargo fmt --all
format-check:
cargo fmt --check --all
audit:
cargo deny
build:
cargo build
build-release:
cargo build --release
lint:
cargo clippy --all-targets
release-build:
cargo build --release
release-run:
cargo run --release
test:
cargo test
coverage:
mkdir -p coverage
cargo tarpaulin --config .tarpaulin.local.toml
coverage-ci:
mkdir -p coverage
cargo tarpaulin --config .tarpaulin.ci.toml
check-all: format-check lint coverage audit
## Local Variables:
## mode: makefile
## End:

60
backend/nix/package.nix Normal file
View File

@@ -0,0 +1,60 @@
{
rust-overlay,
inputs,
system,
...
}: let
rust = import ./rust-version.nix { inherit rust-overlay inputs system; };
pkgs = rust.pkgs;
rustPlatform = pkgs.makeRustPlatform {
cargo = rust.version;
rustc = rust.version;
};
cargoToml = builtins.fromTOML (builtins.readFile ../Cargo.toml);
name = cargoToml.package.name;
version = cargoToml.package.version;
rustBuild = rustPlatform.buildRustPackage {
pname = name;
inherit version;
src = ../.;
cargoLock.lockFile = ../Cargo.lock;
};
settingsDir = pkgs.runCommand "settings" {} ''
mkdir -p $out/settings
cp ${../settings}/*.yaml $out/settings/
'';
makeDockerImage = tag:
pkgs.dockerTools.buildLayeredImage {
name = "phundrak/${name}";
inherit tag;
created = "now";
config = {
Entrypoint = ["${rustBuild}/bin/${name}"];
WorkingDir = "/";
Env = [
"SSL_CERT_FILE=${pkgs.cacert}/etc/ssl/certs/ca-bundle.crt"
];
ExposedPorts = {
"3100/tcp" = {};
};
Labels = {
"org.opencontainers.image.title" = name;
"org.opencontainers.image.version" = version;
"org.opencontainers.image.description" = "REST API backend for phundrak.com";
"org.opencontainers.image.authors" = "Lucien Cartier-Tilet <lucien@phundrak.com>";
"org.opencontainers.image.licenses" = "AGPL-3.0-only";
"org.opencontainers.image.source" = "https://labs.phundrak.com/phundrak/phundrak.com";
"org.opencontainers.image.url" = "https://labs.phundrak.com/phundrak/phundrak.com";
"org.opencontainers.image.documentation" = "https://labs.phundrak.com/phundrak/phundrak.com";
"org.opencontainers.image.vendor" = "Phundrak";
};
};
contents = [rustBuild pkgs.cacert settingsDir];
};
dockerImageLatest = makeDockerImage "latest";
dockerImageVersioned = makeDockerImage version;
in {
backend = rustBuild;
backendDocker = dockerImageVersioned;
backendDockerLatest = dockerImageLatest;
}

View File

@@ -0,0 +1,6 @@
{rust-overlay, inputs, system, ...}: let
overlays = [(import rust-overlay)];
in rec {
pkgs = import inputs.nixpkgs {inherit system overlays;};
version = pkgs.rust-bin.stable.latest.default;
}

75
backend/nix/shell.nix Normal file
View File

@@ -0,0 +1,75 @@
{
inputs,
pkgs,
system,
self,
rust-overlay,
...
}: let
rustPlatform = import ./rust-version.nix { inherit rust-overlay inputs system; };
in
inputs.devenv.lib.mkShell {
inherit inputs pkgs;
modules = [
{
devenv.root = let
devenvRootFileContent = builtins.readFile "${self}/.devenv-root";
in
pkgs.lib.mkIf (devenvRootFileContent != "") devenvRootFileContent;
}
{
packages = with rustPlatform.pkgs; [
(rustPlatform.version.override {
extensions = [
"clippy"
"rust-src"
"rust-analyzer"
"rustfmt"
];
})
bacon
cargo-deny
cargo-shuttle
cargo-tarpaulin
cargo-watch
flyctl
just
marksman
tombi # TOML lsp server
];
services.mailpit = {
enable = true;
# HTTP interface for viewing emails
uiListenAddress = "127.0.0.1:8025";
# SMTP server for receiving emails
smtpListenAddress = "127.0.0.1:1025";
};
processes.run.exec = "cargo watch -x run";
enterShell = ''
echo "🦀 Rust backend development environment loaded!"
echo "📦 Rust version: $(rustc --version)"
echo "📦 Cargo version: $(cargo --version)"
echo ""
echo "Available tools:"
echo " - rust-analyzer (LSP)"
echo " - clippy (linter)"
echo " - rustfmt (formatter)"
echo " - bacon (continuous testing/linting)"
echo " - cargo-deny (dependency checker)"
echo " - cargo-tarpaulin (code coverage)"
echo ""
echo "📧 Mailpit service:"
echo " - SMTP server: 127.0.0.1:1025"
echo " - Web UI: http://127.0.0.1:8025"
echo ""
echo "🚀 Quick start:"
echo " Run 'devenv up' to launch:"
echo " - Mailpit service (email testing)"
echo " - Backend with 'cargo watch -x run' (auto-reload)"
'';
}
];
}

View File

@@ -0,0 +1,8 @@
application:
port: 3100
version: "0.1.0"
rate_limit:
enabled: true
burst_size: 10
per_seconds: 60

View File

@@ -0,0 +1,18 @@
frontend_url: http://localhost:3000
debug: true
application:
protocol: http
host: 127.0.0.1
base_url: http://127.0.0.1:3100
name: "com.phundrak.backend.dev"
email:
host: localhost
port: 1025
user: ""
password: ""
from: Contact Form <noreply@example.com>
recipient: Admin <user@example.com>
tls: false
starttls: false

View File

@@ -0,0 +1,18 @@
debug: false
frontend_url: ""
application:
name: "com.phundrak.backend.prod"
protocol: https
host: 0.0.0.0
base_url: ""
email:
host: ""
port: 0
user: ""
password: ""
from: ""
recipient: ""
tls: false
starttls: false

82
backend/src/lib.rs Normal file
View File

@@ -0,0 +1,82 @@
//! Backend API server for phundrak.com
//!
//! This is a REST API built with the Poem framework that provides:
//! - Health check endpoints
//! - Application metadata endpoints
//! - Contact form submission with email integration
#![deny(clippy::all)]
#![deny(clippy::pedantic)]
#![deny(clippy::nursery)]
#![warn(missing_docs)]
#![allow(clippy::unused_async)]
/// Custom middleware implementations
pub mod middleware;
/// API route handlers and endpoints
pub mod route;
/// Application configuration settings
pub mod settings;
/// Application startup and server configuration
pub mod startup;
/// Logging and tracing setup
pub mod telemetry;
type MaybeListener = Option<poem::listener::TcpListener<String>>;
fn prepare(listener: MaybeListener) -> startup::Application {
dotenvy::dotenv().ok();
let settings = settings::Settings::new().expect("Failed to read settings");
if !cfg!(test) {
let subscriber = telemetry::get_subscriber(settings.debug);
telemetry::init_subscriber(subscriber);
}
tracing::event!(
target: "backend",
tracing::Level::DEBUG,
"Using these settings: {:?}",
settings
);
let application = startup::Application::build(settings, listener);
tracing::event!(
target: "backend",
tracing::Level::INFO,
"Listening on http://{}:{}/",
application.host(),
application.port()
);
tracing::event!(
target: "backend",
tracing::Level::INFO,
"Documentation available at http://{}:{}/",
application.host(),
application.port()
);
application
}
/// Runs the application with the specified TCP listener.
///
/// # Errors
///
/// Returns a `std::io::Error` if the server fails to start or encounters
/// an I/O error during runtime (e.g., port already in use, network issues).
#[cfg(not(tarpaulin_include))]
pub async fn run(listener: MaybeListener) -> Result<(), std::io::Error> {
let application = prepare(listener);
application.make_app().run().await
}
#[cfg(test)]
fn make_random_tcp_listener() -> poem::listener::TcpListener<String> {
let tcp_listener =
std::net::TcpListener::bind("127.0.0.1:0").expect("Failed to bind a random TCP listener");
let port = tcp_listener.local_addr().unwrap().port();
poem::listener::TcpListener::bind(format!("127.0.0.1:{port}"))
}
#[cfg(test)]
fn get_test_app() -> startup::App {
let tcp_listener = make_random_tcp_listener();
prepare(Some(tcp_listener)).make_app().into()
}

7
backend/src/main.rs Normal file
View File

@@ -0,0 +1,7 @@
//! Backend server entry point.
#[cfg(not(tarpaulin_include))]
#[tokio::main]
async fn main() -> Result<(), std::io::Error> {
phundrak_dot_com_backend::run(None).await
}

View File

@@ -0,0 +1,5 @@
//! Custom middleware for the application.
//!
//! This module contains custom middleware implementations including rate limiting.
pub mod rate_limit;

View File

@@ -0,0 +1,211 @@
//! Rate limiting middleware using the governor crate.
//!
//! This middleware implements per-IP rate limiting using the Generic Cell Rate
//! Algorithm (GCRA) via the governor crate. It stores rate limiters in memory
//! without requiring external dependencies like Redis.
use std::{
net::IpAddr,
num::NonZeroU32,
sync::Arc,
time::Duration,
};
use governor::{
clock::DefaultClock,
state::{InMemoryState, NotKeyed},
Quota, RateLimiter,
};
use poem::{
Endpoint, Error, IntoResponse, Middleware, Request, Response, Result,
};
/// Rate limiting configuration.
#[derive(Debug, Clone)]
pub struct RateLimitConfig {
/// Maximum number of requests allowed in the time window (burst size).
pub burst_size: u32,
/// Time window in seconds for rate limiting.
pub per_seconds: u64,
}
impl RateLimitConfig {
/// Creates a new rate limit configuration.
///
/// # Arguments
///
/// * `burst_size` - Maximum number of requests allowed in the time window
/// * `per_seconds` - Time window in seconds
#[must_use]
pub const fn new(burst_size: u32, per_seconds: u64) -> Self {
Self {
burst_size,
per_seconds,
}
}
/// Creates a rate limiter from this configuration.
///
/// # Panics
///
/// Panics if `burst_size` is zero.
#[must_use]
pub fn create_limiter(&self) -> RateLimiter<NotKeyed, InMemoryState, DefaultClock> {
let quota = Quota::with_period(Duration::from_secs(self.per_seconds))
.expect("Failed to create quota")
.allow_burst(NonZeroU32::new(self.burst_size).expect("Burst size must be non-zero"));
RateLimiter::direct(quota)
}
}
impl Default for RateLimitConfig {
fn default() -> Self {
// Default: a burst of 20 requests per 1-second window
Self::new(20, 1)
}
}
/// Middleware for rate limiting based on IP address.
pub struct RateLimit {
limiter: Arc<RateLimiter<NotKeyed, InMemoryState, DefaultClock>>,
}
impl RateLimit {
/// Creates a new rate limiting middleware with the given configuration.
#[must_use]
pub fn new(config: &RateLimitConfig) -> Self {
Self {
limiter: Arc::new(config.create_limiter()),
}
}
}
impl<E: Endpoint> Middleware<E> for RateLimit {
type Output = RateLimitEndpoint<E>;
fn transform(&self, ep: E) -> Self::Output {
RateLimitEndpoint {
endpoint: ep,
limiter: self.limiter.clone(),
}
}
}
/// The endpoint wrapper that performs rate limiting checks.
pub struct RateLimitEndpoint<E> {
endpoint: E,
limiter: Arc<RateLimiter<NotKeyed, InMemoryState, DefaultClock>>,
}
impl<E: Endpoint> Endpoint for RateLimitEndpoint<E> {
type Output = Response;
async fn call(&self, req: Request) -> Result<Self::Output> {
// Check rate limit
if self.limiter.check().is_err() {
let client_ip = Self::get_client_ip(&req)
.map_or_else(|| "unknown".to_string(), |ip| ip.to_string());
tracing::event!(
target: "backend::middleware::rate_limit",
tracing::Level::WARN,
client_ip = %client_ip,
"Rate limit exceeded"
);
return Err(Error::from_status(poem::http::StatusCode::TOO_MANY_REQUESTS));
}
// Process the request
let response = self.endpoint.call(req).await;
response.map(IntoResponse::into_response)
}
}
impl<E> RateLimitEndpoint<E> {
/// Extracts the client IP address from the request.
fn get_client_ip(req: &Request) -> Option<IpAddr> {
req.remote_addr().as_socket_addr().map(std::net::SocketAddr::ip)
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn rate_limit_config_new() {
let config = RateLimitConfig::new(10, 60);
assert_eq!(config.burst_size, 10);
assert_eq!(config.per_seconds, 60);
}
#[test]
fn rate_limit_config_default() {
let config = RateLimitConfig::default();
assert_eq!(config.burst_size, 20);
assert_eq!(config.per_seconds, 1);
}
#[test]
fn rate_limit_config_creates_limiter() {
let config = RateLimitConfig::new(5, 1);
let limiter = config.create_limiter();
// First 5 requests should succeed
for _ in 0..5 {
assert!(limiter.check().is_ok());
}
// 6th request should fail
assert!(limiter.check().is_err());
}
#[tokio::test]
async fn rate_limit_middleware_allows_within_limit() {
use poem::{handler, test::TestClient, EndpointExt, Route};
#[handler]
async fn index() -> String {
"Hello".to_string()
}
let config = RateLimitConfig::new(5, 60);
let app = Route::new()
.at("/", poem::get(index))
.with(RateLimit::new(&config));
let cli = TestClient::new(app);
// First 5 requests should succeed
for _ in 0..5 {
let response = cli.get("/").send().await;
response.assert_status_is_ok();
}
}
#[tokio::test]
async fn rate_limit_middleware_blocks_over_limit() {
use poem::{handler, test::TestClient, EndpointExt, Route};
#[handler]
async fn index() -> String {
"Hello".to_string()
}
let config = RateLimitConfig::new(3, 60);
let app = Route::new()
.at("/", poem::get(index))
.with(RateLimit::new(&config));
let cli = TestClient::new(app);
// First 3 requests should succeed
for _ in 0..3 {
let response = cli.get("/").send().await;
response.assert_status_is_ok();
}
// 4th request should be rate limited
let response = cli.get("/").send().await;
response.assert_status(poem::http::StatusCode::TOO_MANY_REQUESTS);
}
}

View File

@@ -0,0 +1,514 @@
//! Contact form endpoint for handling user submissions and sending emails.
//!
//! This module provides functionality to:
//! - Validate contact form submissions
//! - Detect spam using honeypot fields
//! - Send emails via SMTP with various TLS configurations
use lettre::{
Message, SmtpTransport, Transport, message::header::ContentType,
transport::smtp::authentication::Credentials,
};
use poem_openapi::{ApiResponse, Object, OpenApi, payload::Json};
use validator::Validate;
use super::ApiCategory;
use crate::settings::{EmailSettings, Starttls};
impl TryFrom<&EmailSettings> for SmtpTransport {
type Error = lettre::transport::smtp::Error;
fn try_from(settings: &EmailSettings) -> Result<Self, Self::Error> {
if settings.tls {
// Implicit TLS (SMTPS) - typically port 465
tracing::event!(target: "backend::contact", tracing::Level::DEBUG, "Using implicit TLS (SMTPS)");
let creds = Credentials::new(settings.user.clone(), settings.password.clone());
Ok(Self::relay(&settings.host)?
.port(settings.port)
.credentials(creds)
.build())
} else {
// STARTTLS or no encryption
match settings.starttls {
Starttls::Never => {
// For local development without TLS
tracing::event!(target: "backend::contact", tracing::Level::DEBUG, "Using unencrypted connection");
let builder = Self::builder_dangerous(&settings.host).port(settings.port);
if settings.user.is_empty() {
Ok(builder.build())
} else {
let creds =
Credentials::new(settings.user.clone(), settings.password.clone());
Ok(builder.credentials(creds).build())
}
}
Starttls::Opportunistic | Starttls::Always => {
// STARTTLS - typically port 587
tracing::event!(target: "backend::contact", tracing::Level::DEBUG, "Using STARTTLS");
let creds = Credentials::new(settings.user.clone(), settings.password.clone());
Ok(Self::starttls_relay(&settings.host)?
.port(settings.port)
.credentials(creds)
.build())
}
}
}
}
}
#[derive(Debug, Object, Validate)]
struct ContactRequest {
#[validate(length(
min = 1,
max = "100",
message = "Name must be between 1 and 100 characters"
))]
name: String,
#[validate(email(message = "Invalid email address"))]
email: String,
#[validate(length(
min = 10,
max = 5000,
message = "Message must be between 10 and 5000 characters"
))]
message: String,
/// Honeypot field - should always be empty
#[oai(rename = "website")]
honeypot: Option<String>,
}
#[derive(Debug, Object, serde::Deserialize)]
struct ContactResponse {
success: bool,
message: String,
}
impl From<ContactResponse> for Json<ContactResponse> {
fn from(value: ContactResponse) -> Self {
Self(value)
}
}
#[derive(ApiResponse)]
enum ContactApiResponse {
/// Success
#[oai(status = 200)]
Ok(Json<ContactResponse>),
/// Bad Request - validation failed
#[oai(status = 400)]
BadRequest(Json<ContactResponse>),
/// Too Many Requests - rate limit exceeded
#[oai(status = 429)]
#[allow(dead_code)]
TooManyRequests,
/// Internal Server Error
#[oai(status = 500)]
InternalServerError(Json<ContactResponse>),
}
/// API for handling contact form submissions and sending emails.
#[derive(Clone)]
pub struct ContactApi {
settings: EmailSettings,
}
impl From<EmailSettings> for ContactApi {
fn from(settings: EmailSettings) -> Self {
Self { settings }
}
}
#[OpenApi(tag = "ApiCategory::Contact")]
impl ContactApi {
/// Submit a contact form
///
/// Send a message through the contact form. Rate limited to prevent spam.
#[oai(path = "/contact", method = "post")]
async fn submit_contact(
&self,
body: Json<ContactRequest>,
remote_addr: Option<poem::web::Data<&poem::web::RemoteAddr>>,
) -> ContactApiResponse {
let body = body.0;
if body.honeypot.is_some() {
tracing::event!(target: "backend::contact", tracing::Level::INFO, "Honeypot triggered, rejecting request silently. IP: {}", remote_addr.map_or_else(|| "No remote address found".to_owned(), |ip| ip.0.to_string()));
return ContactApiResponse::Ok(
ContactResponse {
success: true,
message: "Message sent successfully, but not really, you bot".to_owned(),
}
.into(),
);
}
if let Err(e) = body.validate() {
return ContactApiResponse::BadRequest(
ContactResponse {
success: false,
message: format!("Validation error: {e}"),
}
.into(),
);
}
match self.send_email(&body).await {
Ok(()) => {
tracing::event!(target: "backend::contact", tracing::Level::INFO, "Message sent successfully from: {}", body.email);
ContactApiResponse::Ok(
ContactResponse {
success: true,
message: "Message sent successfully".to_owned(),
}
.into(),
)
}
Err(e) => {
tracing::event!(target: "backend::contact", tracing::Level::ERROR, "Failed to send email: {}", e);
ContactApiResponse::InternalServerError(
ContactResponse {
success: false,
message: "Failed to send message. Please try again later.".to_owned(),
}
.into(),
)
}
}
}
async fn send_email(&self, request: &ContactRequest) -> Result<(), Box<dyn std::error::Error>> {
let email_body = format!(
r"New contact form submission:
Name: {}
Email: {},
Message:
{}",
request.name, request.email, request.message
);
tracing::event!(target: "email", tracing::Level::DEBUG, "Sending email content: {}", email_body);
let email = Message::builder()
.from(self.settings.from.parse()?)
.reply_to(format!("{} <{}>", request.name, request.email).parse()?)
.to(self.settings.recipient.parse()?)
.subject(format!("Contact Form: {}", request.name))
.header(ContentType::TEXT_PLAIN)
.body(email_body)?;
tracing::event!(target: "email", tracing::Level::DEBUG, "Email to be sent: {}", format!("{email:?}"));
let mailer = SmtpTransport::try_from(&self.settings)?;
mailer.send(&email)?;
Ok(())
}
}
#[cfg(test)]
mod tests {
use super::*;
// Tests for ContactRequest validation
#[test]
fn contact_request_valid() {
let request = ContactRequest {
name: "John Doe".to_string(),
email: "john@example.com".to_string(),
message: "This is a test message that is long enough.".to_string(),
honeypot: None,
};
assert!(request.validate().is_ok());
}
#[test]
fn contact_request_name_too_short() {
let request = ContactRequest {
name: String::new(),
email: "john@example.com".to_string(),
message: "This is a test message that is long enough.".to_string(),
honeypot: None,
};
assert!(request.validate().is_err());
}
#[test]
fn contact_request_name_too_long() {
let request = ContactRequest {
name: "a".repeat(101),
email: "john@example.com".to_string(),
message: "This is a test message that is long enough.".to_string(),
honeypot: None,
};
assert!(request.validate().is_err());
}
#[test]
fn contact_request_name_at_max_length() {
let request = ContactRequest {
name: "a".repeat(100),
email: "john@example.com".to_string(),
message: "This is a test message that is long enough.".to_string(),
honeypot: None,
};
assert!(request.validate().is_ok());
}
#[test]
fn contact_request_invalid_email() {
let request = ContactRequest {
name: "John Doe".to_string(),
email: "not-an-email".to_string(),
message: "This is a test message that is long enough.".to_string(),
honeypot: None,
};
assert!(request.validate().is_err());
}
#[test]
fn contact_request_message_too_short() {
let request = ContactRequest {
name: "John Doe".to_string(),
email: "john@example.com".to_string(),
message: "Short".to_string(),
honeypot: None,
};
assert!(request.validate().is_err());
}
#[test]
fn contact_request_message_too_long() {
let request = ContactRequest {
name: "John Doe".to_string(),
email: "john@example.com".to_string(),
message: "a".repeat(5001),
honeypot: None,
};
assert!(request.validate().is_err());
}
#[test]
fn contact_request_message_at_min_length() {
let request = ContactRequest {
name: "John Doe".to_string(),
email: "john@example.com".to_string(),
message: "a".repeat(10),
honeypot: None,
};
assert!(request.validate().is_ok());
}
#[test]
fn contact_request_message_at_max_length() {
let request = ContactRequest {
name: "John Doe".to_string(),
email: "john@example.com".to_string(),
message: "a".repeat(5000),
honeypot: None,
};
assert!(request.validate().is_ok());
}
// Tests for SmtpTransport TryFrom implementation
#[test]
fn smtp_transport_implicit_tls() {
let settings = EmailSettings {
host: "smtp.example.com".to_string(),
port: 465,
user: "user@example.com".to_string(),
password: "password".to_string(),
from: "from@example.com".to_string(),
recipient: "to@example.com".to_string(),
tls: true,
starttls: Starttls::Never,
};
let result = SmtpTransport::try_from(&settings);
assert!(result.is_ok());
}
#[test]
fn smtp_transport_starttls_always() {
let settings = EmailSettings {
host: "smtp.example.com".to_string(),
port: 587,
user: "user@example.com".to_string(),
password: "password".to_string(),
from: "from@example.com".to_string(),
recipient: "to@example.com".to_string(),
tls: false,
starttls: Starttls::Always,
};
let result = SmtpTransport::try_from(&settings);
assert!(result.is_ok());
}
#[test]
fn smtp_transport_starttls_opportunistic() {
let settings = EmailSettings {
host: "smtp.example.com".to_string(),
port: 587,
user: "user@example.com".to_string(),
password: "password".to_string(),
from: "from@example.com".to_string(),
recipient: "to@example.com".to_string(),
tls: false,
starttls: Starttls::Opportunistic,
};
let result = SmtpTransport::try_from(&settings);
assert!(result.is_ok());
}
#[test]
fn smtp_transport_no_encryption_with_credentials() {
let settings = EmailSettings {
host: "localhost".to_string(),
port: 1025,
user: "user@example.com".to_string(),
password: "password".to_string(),
from: "from@example.com".to_string(),
recipient: "to@example.com".to_string(),
tls: false,
starttls: Starttls::Never,
};
let result = SmtpTransport::try_from(&settings);
assert!(result.is_ok());
}
#[test]
fn smtp_transport_no_encryption_no_credentials() {
let settings = EmailSettings {
host: "localhost".to_string(),
port: 1025,
user: String::new(),
password: String::new(),
from: "from@example.com".to_string(),
recipient: "to@example.com".to_string(),
tls: false,
starttls: Starttls::Never,
};
let result = SmtpTransport::try_from(&settings);
assert!(result.is_ok());
}
// Integration tests for contact API endpoint
#[tokio::test]
async fn contact_endpoint_honeypot_triggered() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let body = serde_json::json!({
"name": "Bot Name",
"email": "bot@example.com",
"message": "This is a spam message from a bot.",
"website": "http://spam.com"
});
let resp = cli.post("/api/contact").body_json(&body).send().await;
resp.assert_status_is_ok();
let json_text = resp.0.into_body().into_string().await.unwrap();
let json: ContactResponse = serde_json::from_str(&json_text).unwrap();
assert!(json.success);
assert!(json.message.contains("not really"));
}
#[tokio::test]
async fn contact_endpoint_validation_error_empty_name() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let body = serde_json::json!({
"name": "",
"email": "test@example.com",
"message": "This is a valid message that is long enough."
});
let resp = cli.post("/api/contact").body_json(&body).send().await;
resp.assert_status(poem::http::StatusCode::BAD_REQUEST);
let json_text = resp.0.into_body().into_string().await.unwrap();
let json: ContactResponse = serde_json::from_str(&json_text).unwrap();
assert!(!json.success);
assert!(json.message.contains("Validation error"));
}
#[tokio::test]
async fn contact_endpoint_validation_error_invalid_email() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let body = serde_json::json!({
"name": "Test User",
"email": "not-an-email",
"message": "This is a valid message that is long enough."
});
let resp = cli.post("/api/contact").body_json(&body).send().await;
resp.assert_status(poem::http::StatusCode::BAD_REQUEST);
let json_text = resp.0.into_body().into_string().await.unwrap();
let json: ContactResponse = serde_json::from_str(&json_text).unwrap();
assert!(!json.success);
assert!(json.message.contains("Validation error"));
}
#[tokio::test]
async fn contact_endpoint_validation_error_message_too_short() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let body = serde_json::json!({
"name": "Test User",
"email": "test@example.com",
"message": "Short"
});
let resp = cli.post("/api/contact").body_json(&body).send().await;
resp.assert_status(poem::http::StatusCode::BAD_REQUEST);
let json_text = resp.0.into_body().into_string().await.unwrap();
let json: ContactResponse = serde_json::from_str(&json_text).unwrap();
assert!(!json.success);
assert!(json.message.contains("Validation error"));
}
#[tokio::test]
async fn contact_endpoint_validation_error_name_too_long() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let body = serde_json::json!({
"name": "a".repeat(101),
"email": "test@example.com",
"message": "This is a valid message that is long enough."
});
let resp = cli.post("/api/contact").body_json(&body).send().await;
resp.assert_status(poem::http::StatusCode::BAD_REQUEST);
let json_text = resp.0.into_body().into_string().await.unwrap();
let json: ContactResponse = serde_json::from_str(&json_text).unwrap();
assert!(!json.success);
assert!(json.message.contains("Validation error"));
}
#[tokio::test]
async fn contact_endpoint_validation_error_message_too_long() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let body = serde_json::json!({
"name": "Test User",
"email": "test@example.com",
"message": "a".repeat(5001)
});
let resp = cli.post("/api/contact").body_json(&body).send().await;
resp.assert_status(poem::http::StatusCode::BAD_REQUEST);
let json_text = resp.0.into_body().into_string().await.unwrap();
let json: ContactResponse = serde_json::from_str(&json_text).unwrap();
assert!(!json.success);
assert!(json.message.contains("Validation error"));
}
}

View File

@ -0,0 +1,38 @@
//! Health check endpoint for monitoring service availability.
use poem_openapi::{ApiResponse, OpenApi};
use super::ApiCategory;
#[derive(ApiResponse)]
enum HealthResponse {
/// Success
#[oai(status = 200)]
Ok,
/// Too Many Requests - rate limit exceeded
#[oai(status = 429)]
#[allow(dead_code)]
TooManyRequests,
}
/// Health check API for monitoring service availability.
#[derive(Default, Clone)]
pub struct HealthApi;
#[OpenApi(tag = "ApiCategory::Health")]
impl HealthApi {
#[oai(path = "/health", method = "get")]
async fn ping(&self) -> HealthResponse {
tracing::event!(target: "backend::health", tracing::Level::DEBUG, "Accessing health-check endpoint");
HealthResponse::Ok
}
}
#[tokio::test]
async fn health_check_works() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let resp = cli.get("/api/health").send().await;
resp.assert_status_is_ok();
resp.assert_text("").await;
}

86
backend/src/route/meta.rs Normal file
View File

@ -0,0 +1,86 @@
//! Application metadata endpoint for retrieving version and name information.
use poem::Result;
use poem_openapi::{ApiResponse, Object, OpenApi, payload::Json};
use super::ApiCategory;
use crate::settings::ApplicationSettings;
#[derive(Object, Debug, Clone, serde::Serialize, serde::Deserialize)]
struct Meta {
version: String,
name: String,
}
impl From<&MetaApi> for Meta {
fn from(value: &MetaApi) -> Self {
let version = value.version.clone();
let name = value.name.clone();
Self { version, name }
}
}
#[derive(ApiResponse)]
enum MetaResponse {
/// Success
#[oai(status = 200)]
Meta(Json<Meta>),
/// Too Many Requests - rate limit exceeded
#[oai(status = 429)]
#[allow(dead_code)]
TooManyRequests,
}
/// API for retrieving application metadata (name and version).
#[derive(Clone)]
pub struct MetaApi {
name: String,
version: String,
}
impl From<&ApplicationSettings> for MetaApi {
fn from(value: &ApplicationSettings) -> Self {
let name = value.name.clone();
let version = value.version.clone();
Self { name, version }
}
}
#[OpenApi(tag = "ApiCategory::Meta")]
impl MetaApi {
#[oai(path = "/meta", method = "get")]
async fn meta(&self) -> Result<MetaResponse> {
tracing::event!(target: "backend::meta", tracing::Level::DEBUG, "Accessing meta endpoint");
Ok(MetaResponse::Meta(Json(self.into())))
}
}
#[cfg(test)]
mod tests {
#[tokio::test]
async fn meta_endpoint_returns_correct_data() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let resp = cli.get("/api/meta").send().await;
resp.assert_status_is_ok();
let json_value: serde_json::Value = resp.json().await.value().deserialize();
assert!(
json_value.get("version").is_some(),
"Response should have version field"
);
assert!(
json_value.get("name").is_some(),
"Response should have name field"
);
}
#[tokio::test]
async fn meta_endpoint_returns_200_status() {
let app = crate::get_test_app();
let cli = poem::test::TestClient::new(app);
let resp = cli.get("/api/meta").send().await;
resp.assert_status_is_ok();
}
}

46
backend/src/route/mod.rs Normal file
View File

@ -0,0 +1,46 @@
//! API route handlers for the backend server.
//!
//! This module contains all the HTTP endpoint handlers organized by functionality:
//! - Contact form handling
//! - Health checks
//! - Application metadata
use poem_openapi::Tags;
mod contact;
mod health;
mod meta;
use crate::settings::Settings;
#[derive(Tags)]
enum ApiCategory {
Contact,
Health,
Meta,
}
pub(crate) struct Api {
contact: contact::ContactApi,
health: health::HealthApi,
meta: meta::MetaApi,
}
impl From<&Settings> for Api {
fn from(value: &Settings) -> Self {
let contact = contact::ContactApi::from(value.clone().email);
let health = health::HealthApi;
let meta = meta::MetaApi::from(&value.application);
Self {
contact,
health,
meta,
}
}
}
impl Api {
pub fn apis(self) -> (contact::ContactApi, health::HealthApi, meta::MetaApi) {
(self.contact, self.health, self.meta)
}
}

619
backend/src/settings.rs Normal file
View File

@ -0,0 +1,619 @@
//! Application configuration settings.
//!
//! This module provides configuration structures that can be loaded from:
//! - YAML configuration files (base.yaml and environment-specific files)
//! - Environment variables (prefixed with APP__)
//!
//! Settings include application details, email server configuration, and environment settings.
/// Application configuration settings.
///
/// Loads configuration from YAML files and environment variables.
#[derive(Debug, serde::Deserialize, Clone, Default)]
pub struct Settings {
/// Application-specific settings (name, version, host, port, etc.)
pub application: ApplicationSettings,
/// Debug mode flag
pub debug: bool,
/// Email server configuration for contact form
pub email: EmailSettings,
/// Frontend URL for CORS configuration
pub frontend_url: String,
/// Rate limiting configuration
#[serde(default)]
pub rate_limit: RateLimitSettings,
}
impl Settings {
/// Creates a new `Settings` instance by loading configuration from files and environment variables.
///
/// # Errors
///
/// Returns a `config::ConfigError` if:
/// - Configuration files cannot be read or parsed
/// - Required configuration values are missing
/// - Configuration values cannot be deserialized into the expected types
///
/// # Panics
///
/// Panics if:
/// - The current directory cannot be determined
/// - The `APP_ENVIRONMENT` variable contains an invalid value (not "dev", "development", "prod", or "production")
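///
/// # Example
///
/// A minimal sketch of how the loader is typically driven; the port value and
/// the environment variable shown are illustrative, not defaults of this crate:
///
/// ```ignore
/// // settings/base.yaml and settings/dev.yaml are read first, then
/// // APP__-prefixed variables such as APP__APPLICATION__PORT=3001
/// // override individual keys.
/// let settings = Settings::new().expect("failed to load settings");
/// println!("binding to {}:{}", settings.application.host, settings.application.port);
/// ```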
pub fn new() -> Result<Self, config::ConfigError> {
let base_path = std::env::current_dir().expect("Failed to determine the current directory");
let settings_directory = base_path.join("settings");
let environment: Environment = std::env::var("APP_ENVIRONMENT")
.unwrap_or_else(|_| "dev".into())
.try_into()
.expect("Failed to parse APP_ENVIRONMENT");
let environment_filename = format!("{environment}.yaml");
// Sources added later in this chain override earlier ones: the
// environment-specific YAML overrides base.yaml, and APP__ variables
// override both.
let settings = config::Config::builder()
.add_source(config::File::from(settings_directory.join("base.yaml")))
.add_source(config::File::from(
settings_directory.join(environment_filename),
))
.add_source(
config::Environment::with_prefix("APP")
.prefix_separator("__")
.separator("__"),
)
.build()?;
settings.try_deserialize()
}
}
/// Application-specific configuration settings.
#[derive(Debug, serde::Deserialize, Clone, Default)]
pub struct ApplicationSettings {
/// Application name
pub name: String,
/// Application version
pub version: String,
/// Port to bind to
pub port: u16,
/// Host address to bind to
pub host: String,
/// Base URL of the application
pub base_url: String,
/// Protocol (http or https)
pub protocol: String,
}
/// Application environment.
#[derive(Debug, PartialEq, Eq, Default)]
pub enum Environment {
/// Development environment
#[default]
Development,
/// Production environment
Production,
}
impl std::fmt::Display for Environment {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let self_str = match self {
Self::Development => "development",
Self::Production => "production",
};
write!(f, "{self_str}")
}
}
impl TryFrom<String> for Environment {
type Error = String;
fn try_from(value: String) -> Result<Self, Self::Error> {
Self::try_from(value.as_str())
}
}
impl TryFrom<&str> for Environment {
type Error = String;
fn try_from(value: &str) -> Result<Self, Self::Error> {
match value.to_lowercase().as_str() {
"development" | "dev" => Ok(Self::Development),
"production" | "prod" => Ok(Self::Production),
other => Err(format!(
"{other} is not a supported environment. Use either `development` or `production`"
)),
}
}
}
/// Email server configuration for the contact form.
#[derive(serde::Deserialize, Clone, Default)]
pub struct EmailSettings {
/// SMTP server hostname
pub host: String,
/// SMTP server port
pub port: u16,
/// SMTP authentication username
pub user: String,
/// Email address to send from
pub from: String,
/// SMTP authentication password
pub password: String,
/// Email address to send contact form submissions to
pub recipient: String,
/// STARTTLS configuration
pub starttls: Starttls,
/// Whether to use implicit TLS (SMTPS)
pub tls: bool,
}
impl std::fmt::Debug for EmailSettings {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("EmailSettings")
.field("host", &self.host)
.field("port", &self.port)
.field("user", &self.user)
.field("from", &self.from)
.field("password", &"[REDACTED]")
.field("recipient", &self.recipient)
.field("starttls", &self.starttls)
.field("tls", &self.tls)
.finish()
}
}
/// STARTTLS configuration for SMTP connections.
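///
/// In the settings files this value deserializes either from a string
/// (`never`/`no`/`off`, `opportunistic`, `always`/`yes`) or from a boolean,
/// where `true` maps to [`Starttls::Always`] and `false` to [`Starttls::Never`].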
#[derive(Debug, PartialEq, Eq, Default, Clone)]
pub enum Starttls {
/// Never use STARTTLS (unencrypted connection)
#[default]
Never,
/// Use STARTTLS if available (opportunistic encryption)
Opportunistic,
/// Always use STARTTLS (required encryption)
Always,
}
impl TryFrom<&str> for Starttls {
type Error = String;
fn try_from(value: &str) -> Result<Self, Self::Error> {
match value.to_lowercase().as_str() {
"off" | "no" | "never" => Ok(Self::Never),
"opportunistic" => Ok(Self::Opportunistic),
"yes" | "always" => Ok(Self::Always),
other => Err(format!(
"{other} is not a supported option. Use either `yes`, `no`, or `opportunistic`"
)),
}
}
}
impl TryFrom<String> for Starttls {
type Error = String;
fn try_from(value: String) -> Result<Self, Self::Error> {
value.as_str().try_into()
}
}
impl From<bool> for Starttls {
fn from(value: bool) -> Self {
if value { Self::Always } else { Self::Never }
}
}
impl std::fmt::Display for Starttls {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let self_str = match self {
Self::Never => "never",
Self::Opportunistic => "opportunistic",
Self::Always => "always",
};
write!(f, "{self_str}")
}
}
impl<'de> serde::Deserialize<'de> for Starttls {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
struct StarttlsVisitor;
impl serde::de::Visitor<'_> for StarttlsVisitor {
type Value = Starttls;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
formatter.write_str("a string or boolean representing STARTTLS setting (e.g., 'yes', 'no', 'opportunistic', true, false)")
}
fn visit_str<E>(self, value: &str) -> Result<Starttls, E>
where
E: serde::de::Error,
{
Starttls::try_from(value).map_err(E::custom)
}
fn visit_string<E>(self, value: String) -> Result<Starttls, E>
where
E: serde::de::Error,
{
Starttls::try_from(value.as_str()).map_err(E::custom)
}
fn visit_bool<E>(self, value: bool) -> Result<Starttls, E>
where
E: serde::de::Error,
{
Ok(Starttls::from(value))
}
}
deserializer.deserialize_any(StarttlsVisitor)
}
}
/// Rate limiting configuration.
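///
/// Values may come from the YAML settings files or, following the `APP__` prefix
/// and `__` separator used by [`Settings::new`], from environment variables such
/// as `APP__RATE_LIMIT__BURST_SIZE=50` (an illustrative value, not a default).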
#[derive(Debug, serde::Deserialize, Clone)]
pub struct RateLimitSettings {
/// Whether rate limiting is enabled
#[serde(default = "default_rate_limit_enabled")]
pub enabled: bool,
/// Maximum number of requests allowed in the time window (burst size)
#[serde(default = "default_burst_size")]
pub burst_size: u32,
/// Time window in seconds for rate limiting
#[serde(default = "default_per_seconds")]
pub per_seconds: u64,
}
impl Default for RateLimitSettings {
fn default() -> Self {
Self {
enabled: default_rate_limit_enabled(),
burst_size: default_burst_size(),
per_seconds: default_per_seconds(),
}
}
}
const fn default_rate_limit_enabled() -> bool {
true
}
const fn default_burst_size() -> u32 {
100
}
const fn default_per_seconds() -> u64 {
60
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn environment_display_development() {
let env = Environment::Development;
assert_eq!(env.to_string(), "development");
}
#[test]
fn environment_display_production() {
let env = Environment::Production;
assert_eq!(env.to_string(), "production");
}
#[test]
fn environment_from_str_development() {
assert_eq!(
Environment::try_from("development").unwrap(),
Environment::Development
);
assert_eq!(
Environment::try_from("dev").unwrap(),
Environment::Development
);
assert_eq!(
Environment::try_from("Development").unwrap(),
Environment::Development
);
assert_eq!(
Environment::try_from("DEV").unwrap(),
Environment::Development
);
}
#[test]
fn environment_from_str_production() {
assert_eq!(
Environment::try_from("production").unwrap(),
Environment::Production
);
assert_eq!(
Environment::try_from("prod").unwrap(),
Environment::Production
);
assert_eq!(
Environment::try_from("Production").unwrap(),
Environment::Production
);
assert_eq!(
Environment::try_from("PROD").unwrap(),
Environment::Production
);
}
#[test]
fn environment_from_str_invalid() {
let result = Environment::try_from("invalid");
assert!(result.is_err());
assert!(result.unwrap_err().contains("not a supported environment"));
}
#[test]
fn environment_from_string_development() {
assert_eq!(
Environment::try_from("development".to_string()).unwrap(),
Environment::Development
);
}
#[test]
fn environment_from_string_production() {
assert_eq!(
Environment::try_from("production".to_string()).unwrap(),
Environment::Production
);
}
#[test]
fn environment_from_string_invalid() {
let result = Environment::try_from("invalid".to_string());
assert!(result.is_err());
}
#[test]
fn environment_default_is_development() {
let env = Environment::default();
assert_eq!(env, Environment::Development);
}
#[test]
fn startls_deserialize_from_string_never() {
let json = r#""never""#;
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Never);
let json = r#""no""#;
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Never);
let json = r#""off""#;
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Never);
}
#[test]
fn startls_deserialize_from_string_always() {
let json = r#""always""#;
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Always);
let json = r#""yes""#;
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Always);
}
#[test]
fn startls_deserialize_from_string_opportunistic() {
let json = r#""opportunistic""#;
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Opportunistic);
}
#[test]
fn startls_deserialize_from_bool() {
let json = "true";
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Always);
let json = "false";
let result: Starttls = serde_json::from_str(json).unwrap();
assert_eq!(result, Starttls::Never);
}
#[test]
fn startls_deserialize_from_string_invalid() {
let json = r#""invalid""#;
let result: Result<Starttls, _> = serde_json::from_str(json);
assert!(result.is_err());
}
#[test]
fn startls_default_is_never() {
let startls = Starttls::default();
assert_eq!(startls, Starttls::Never);
}
#[test]
fn startls_try_from_str_never() {
assert_eq!(Starttls::try_from("never").unwrap(), Starttls::Never);
assert_eq!(Starttls::try_from("no").unwrap(), Starttls::Never);
assert_eq!(Starttls::try_from("off").unwrap(), Starttls::Never);
assert_eq!(Starttls::try_from("NEVER").unwrap(), Starttls::Never);
assert_eq!(Starttls::try_from("No").unwrap(), Starttls::Never);
}
#[test]
fn startls_try_from_str_always() {
assert_eq!(Starttls::try_from("always").unwrap(), Starttls::Always);
assert_eq!(Starttls::try_from("yes").unwrap(), Starttls::Always);
assert_eq!(Starttls::try_from("ALWAYS").unwrap(), Starttls::Always);
assert_eq!(Starttls::try_from("Yes").unwrap(), Starttls::Always);
}
#[test]
fn startls_try_from_str_opportunistic() {
assert_eq!(
Starttls::try_from("opportunistic").unwrap(),
Starttls::Opportunistic
);
assert_eq!(
Starttls::try_from("OPPORTUNISTIC").unwrap(),
Starttls::Opportunistic
);
}
#[test]
fn startls_try_from_str_invalid() {
let result = Starttls::try_from("invalid");
assert!(result.is_err());
assert!(result
.unwrap_err()
.contains("not a supported option"));
}
#[test]
fn startls_try_from_string_never() {
assert_eq!(
Starttls::try_from("never".to_string()).unwrap(),
Starttls::Never
);
}
#[test]
fn startls_try_from_string_always() {
assert_eq!(
Starttls::try_from("yes".to_string()).unwrap(),
Starttls::Always
);
}
#[test]
fn startls_try_from_string_opportunistic() {
assert_eq!(
Starttls::try_from("opportunistic".to_string()).unwrap(),
Starttls::Opportunistic
);
}
#[test]
fn startls_try_from_string_invalid() {
let result = Starttls::try_from("invalid".to_string());
assert!(result.is_err());
}
#[test]
fn startls_from_bool_true() {
assert_eq!(Starttls::from(true), Starttls::Always);
}
#[test]
fn startls_from_bool_false() {
assert_eq!(Starttls::from(false), Starttls::Never);
}
#[test]
fn startls_display_never() {
let startls = Starttls::Never;
assert_eq!(startls.to_string(), "never");
}
#[test]
fn startls_display_always() {
let startls = Starttls::Always;
assert_eq!(startls.to_string(), "always");
}
#[test]
fn startls_display_opportunistic() {
let startls = Starttls::Opportunistic;
assert_eq!(startls.to_string(), "opportunistic");
}
#[test]
fn rate_limit_settings_default() {
let settings = RateLimitSettings::default();
assert!(settings.enabled);
assert_eq!(settings.burst_size, 100);
assert_eq!(settings.per_seconds, 60);
}
#[test]
fn rate_limit_settings_deserialize_full() {
let json = r#"{"enabled": true, "burst_size": 50, "per_seconds": 30}"#;
let settings: RateLimitSettings = serde_json::from_str(json).unwrap();
assert!(settings.enabled);
assert_eq!(settings.burst_size, 50);
assert_eq!(settings.per_seconds, 30);
}
#[test]
fn rate_limit_settings_deserialize_partial() {
let json = r#"{"enabled": false}"#;
let settings: RateLimitSettings = serde_json::from_str(json).unwrap();
assert!(!settings.enabled);
assert_eq!(settings.burst_size, 100); // default
assert_eq!(settings.per_seconds, 60); // default
}
#[test]
fn rate_limit_settings_deserialize_empty() {
let json = "{}";
let settings: RateLimitSettings = serde_json::from_str(json).unwrap();
assert!(settings.enabled); // default
assert_eq!(settings.burst_size, 100); // default
assert_eq!(settings.per_seconds, 60); // default
}
#[test]
fn startls_deserialize_from_incompatible_type() {
// Test that deserialization from an array fails with expected error message
let json = "[1, 2, 3]";
let result: Result<Starttls, _> = serde_json::from_str(json);
assert!(result.is_err());
let error = result.unwrap_err().to_string();
// The error should mention what was expected
assert!(
error.contains("STARTTLS") || error.contains("string") || error.contains("boolean")
);
}
#[test]
fn startls_deserialize_from_number() {
// Test that deserialization from a number fails
let json = "42";
let result: Result<Starttls, _> = serde_json::from_str(json);
assert!(result.is_err());
}
#[test]
fn startls_deserialize_from_object() {
// Test that deserialization from an object fails
let json = r#"{"foo": "bar"}"#;
let result: Result<Starttls, _> = serde_json::from_str(json);
assert!(result.is_err());
}
#[test]
fn email_settings_debug_redacts_password() {
let settings = EmailSettings {
host: "smtp.example.com".to_string(),
port: 587,
user: "user@example.com".to_string(),
from: "noreply@example.com".to_string(),
password: "super_secret_password".to_string(),
recipient: "admin@example.com".to_string(),
starttls: Starttls::Always,
tls: false,
};
let debug_output = format!("{settings:?}");
// Password should be redacted
assert!(debug_output.contains("[REDACTED]"));
// Password should not appear in output
assert!(!debug_output.contains("super_secret_password"));
// Other fields should still be present
assert!(debug_output.contains("smtp.example.com"));
assert!(debug_output.contains("user@example.com"));
}
}

228
backend/src/startup.rs Normal file
View File

@ -0,0 +1,228 @@
//! Application startup and server configuration.
//!
//! This module handles:
//! - Building the application with routes and middleware
//! - Setting up the OpenAPI service and Swagger UI
//! - Configuring CORS
//! - Starting the HTTP server
use poem::middleware::{AddDataEndpoint, Cors, CorsEndpoint};
use poem::{EndpointExt, Route};
use poem_openapi::OpenApiService;
use crate::{
middleware::rate_limit::{RateLimit, RateLimitConfig},
route::Api,
settings::Settings,
};
use crate::middleware::rate_limit::RateLimitEndpoint;
type Server = poem::Server<poem::listener::TcpListener<String>, std::convert::Infallible>;
/// The configured application with rate limiting, CORS, and settings data.
pub type App = AddDataEndpoint<CorsEndpoint<RateLimitEndpoint<Route>>, Settings>;
/// Application builder that holds the server configuration before running.
pub struct Application {
server: Server,
app: poem::Route,
host: String,
port: u16,
settings: Settings,
}
/// A fully configured application ready to run.
pub struct RunnableApplication {
server: Server,
app: App,
}
impl RunnableApplication {
/// Runs the application server.
///
/// # Errors
///
/// Returns a `std::io::Error` if the server fails to start or encounters
/// an I/O error during runtime (e.g., port already in use, network issues).
pub async fn run(self) -> Result<(), std::io::Error> {
self.server.run(self.app).await
}
}
impl From<RunnableApplication> for App {
fn from(value: RunnableApplication) -> Self {
value.app
}
}
impl From<Application> for RunnableApplication {
fn from(value: Application) -> Self {
// Configure rate limiting based on settings
let rate_limit_config = if value.settings.rate_limit.enabled {
tracing::event!(
target: "backend::startup",
tracing::Level::INFO,
burst_size = value.settings.rate_limit.burst_size,
per_seconds = value.settings.rate_limit.per_seconds,
"Rate limiting enabled"
);
RateLimitConfig::new(
value.settings.rate_limit.burst_size,
value.settings.rate_limit.per_seconds,
)
} else {
tracing::event!(
target: "backend::startup",
tracing::Level::INFO,
"Rate limiting disabled (using very high limits)"
);
// Use very high limits to effectively disable rate limiting
RateLimitConfig::new(u32::MAX, 1)
};
let app = value
.app
.with(RateLimit::new(&rate_limit_config))
.with(Cors::new())
.data(value.settings);
let server = value.server;
Self { server, app }
}
}
impl Application {
fn setup_app(settings: &Settings) -> poem::Route {
let api_service = OpenApiService::new(
Api::from(settings).apis(),
settings.application.clone().name,
settings.application.clone().version,
)
.url_prefix("/api");
let ui = api_service.swagger_ui();
poem::Route::new()
.nest("/api", api_service.clone())
.nest("/specs", api_service.spec_endpoint_yaml())
.nest("/", ui)
}
fn setup_server(
settings: &Settings,
tcp_listener: Option<poem::listener::TcpListener<String>>,
) -> Server {
let tcp_listener = tcp_listener.unwrap_or_else(|| {
let address = format!(
"{}:{}",
settings.application.host, settings.application.port
);
poem::listener::TcpListener::bind(address)
});
poem::Server::new(tcp_listener)
}
/// Builds a new application with the given settings and optional TCP listener.
///
/// If no listener is provided, one will be created based on the settings.
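///
/// A minimal sketch of the intended call sequence (error handling elided,
/// settings assumed to be loaded beforehand):
///
/// ```ignore
/// let settings = Settings::new().expect("failed to load settings");
/// let app = Application::build(settings, None);
/// // Convert into a runnable application and start serving.
/// app.make_app().run().await.expect("server error");
/// ```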
#[must_use]
pub fn build(
settings: Settings,
tcp_listener: Option<poem::listener::TcpListener<String>>,
) -> Self {
let port = settings.application.port;
let host = settings.application.clone().host;
let app = Self::setup_app(&settings);
let server = Self::setup_server(&settings, tcp_listener);
Self {
server,
app,
host,
port,
settings,
}
}
/// Converts the application into a runnable application.
#[must_use]
pub fn make_app(self) -> RunnableApplication {
self.into()
}
/// Returns the host address the application is configured to bind to.
#[must_use]
pub fn host(&self) -> String {
self.host.clone()
}
/// Returns the port the application is configured to bind to.
#[must_use]
pub const fn port(&self) -> u16 {
self.port
}
}
#[cfg(test)]
mod tests {
use super::*;
fn create_test_settings() -> Settings {
Settings {
application: crate::settings::ApplicationSettings {
name: "test-app".to_string(),
version: "1.0.0".to_string(),
port: 8080,
host: "127.0.0.1".to_string(),
base_url: "http://localhost:8080".to_string(),
protocol: "http".to_string(),
},
debug: false,
email: crate::settings::EmailSettings::default(),
frontend_url: "http://localhost:3000".to_string(),
rate_limit: crate::settings::RateLimitSettings {
enabled: false,
burst_size: 100,
per_seconds: 60,
},
}
}
#[test]
fn application_build_and_host() {
let settings = create_test_settings();
let app = Application::build(settings.clone(), None);
assert_eq!(app.host(), settings.application.host);
}
#[test]
fn application_build_and_port() {
let settings = create_test_settings();
let app = Application::build(settings, None);
assert_eq!(app.port(), 8080);
}
#[test]
fn application_host_returns_correct_value() {
let settings = create_test_settings();
let app = Application::build(settings, None);
assert_eq!(app.host(), "127.0.0.1");
}
#[test]
fn application_port_returns_correct_value() {
let settings = create_test_settings();
let app = Application::build(settings, None);
assert_eq!(app.port(), 8080);
}
#[test]
fn application_with_custom_listener() {
let settings = create_test_settings();
let tcp_listener =
std::net::TcpListener::bind("127.0.0.1:0").expect("Failed to bind random port");
let port = tcp_listener.local_addr().unwrap().port();
let listener = poem::listener::TcpListener::bind(format!("127.0.0.1:{port}"));
let app = Application::build(settings, Some(listener));
assert_eq!(app.host(), "127.0.0.1");
assert_eq!(app.port(), 8080);
}
}

69
backend/src/telemetry.rs Normal file
View File

@ -0,0 +1,69 @@
//! Logging and tracing configuration.
//!
//! This module provides utilities for setting up structured logging using the tracing crate.
//! Supports both pretty-printed logs for development and JSON logs for production.
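//!
//! A minimal sketch of the intended wiring (the `debug` flag would normally
//! come from the application settings):
//!
//! ```ignore
//! let subscriber = get_subscriber(true);
//! init_subscriber(subscriber);
//! tracing::info!("telemetry initialised");
//! ```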
use tracing_subscriber::layer::SubscriberExt;
/// Creates a tracing subscriber configured for the given debug mode.
///
/// In debug mode, logs are pretty-printed to stdout.
/// In production mode, logs are output as JSON.
#[must_use]
pub fn get_subscriber(debug: bool) -> impl tracing::Subscriber + Send + Sync {
let env_filter = if debug { "debug" } else { "info" }.to_string();
let env_filter = tracing_subscriber::EnvFilter::try_from_default_env()
.unwrap_or_else(|_| tracing_subscriber::EnvFilter::new(env_filter));
let stdout_log = tracing_subscriber::fmt::layer().pretty();
let subscriber = tracing_subscriber::Registry::default()
.with(env_filter)
.with(stdout_log);
let json_log = if debug {
None
} else {
Some(tracing_subscriber::fmt::layer().json())
};
subscriber.with(json_log)
}
/// Initializes the global tracing subscriber.
///
/// # Panics
///
/// Panics if:
/// - A global subscriber has already been set
/// - The subscriber cannot be set as the global default
pub fn init_subscriber(subscriber: impl tracing::Subscriber + Send + Sync) {
tracing::subscriber::set_global_default(subscriber).expect("Failed to set subscriber");
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn get_subscriber_debug_mode() {
let subscriber = get_subscriber(true);
// If we can create the subscriber without panicking, the test passes
// We can't easily inspect the subscriber's internals, but we can verify it's created
let _ = subscriber;
}
#[test]
fn get_subscriber_production_mode() {
let subscriber = get_subscriber(false);
// If we can create the subscriber without panicking, the test passes
let _ = subscriber;
}
#[test]
fn get_subscriber_creates_valid_subscriber() {
// Test both debug and non-debug modes create valid subscribers
let debug_subscriber = get_subscriber(true);
let prod_subscriber = get_subscriber(false);
// Basic smoke test - if these are created without panicking, they're valid
let _ = debug_subscriber;
let _ = prod_subscriber;
}
}

View File

@ -1,4 +0,0 @@
;;; Directory Local Variables -*- no-byte-compile: t -*-
;;; For more information see (info "(emacs) Directory Variables")
((typescript-mode . ((typescript-indent-level . 2))))

View File

@ -1,25 +0,0 @@
import { defineClientConfig } from '@vuepress/client';
import ResponsiveImage from './components/ResponsiveImage.vue';
import ListRepositories from './components/GitHub/ListRepositories.vue';
import FetchRepositories from './components/GitHub/FetchRepositories.vue';
import GithubRepository from './components/GitHub/GithubRepository.vue';
import ApiLoader from './components/ApiLoader.vue';
import LoaderAnimation from './components/LoaderAnimation.vue';
import FetchError from './components/FetchError.vue';
import Icon from './components/Icon.vue';
export default defineClientConfig({
enhance({ app }) {
app.component('ResponsiveImage', ResponsiveImage);
app.component('ListRepositories', ListRepositories);
app.component('FetchRepositories', FetchRepositories);
app.component('GithubRepository', GithubRepository);
app.component('ApiLoader', ApiLoader);
app.component('LoaderAnimation', LoaderAnimation);
app.component('FetchError', FetchError);
app.component('Icon', Icon);
},
setup() {},
layouts: {},
rootComponents: [],
});

View File

@ -1,36 +0,0 @@
<template>
<slot v-if="loading" name="loader">
<LoaderAnimation />
</slot>
<slot v-else-if="error" name="error">
<FetchError :url="props.url" />
</slot>
<slot v-else> </slot>
</template>
<script setup lang="ts">
import LoaderAnimation from './LoaderAnimation.vue';
import FetchError from './FetchError.vue';
import { useFetchAndCache } from '../composables/fetchAndCache';
const props = defineProps({
url: {
default: '',
required: true,
type: String,
},
cacheName: {
required: true,
type: String,
},
alreadyKnownData: Object,
});
const emits = defineEmits(['loaded', 'error', 'loading']);
const { loading, error } = useFetchAndCache(props.url, {
emits: emits,
cacheName: props.cacheName,
});
</script>

View File

@ -1,26 +0,0 @@
<template>
<div class="error rounded-corners card-width">
<p>API call to {{ props.url }} failed</p>
</div>
</template>
<script setup lang="ts">
const props = defineProps({
url: {
required: true,
type: String,
},
});
</script>
<style lang="less">
@import 'node_modules/nord/src/lesscss/nord.less';
@import '../styles/classes.less';
.error {
display: inline-block;
padding: 2rem;
text-align: center;
background: @nord11;
}
</style>

View File

@ -1,46 +0,0 @@
<template>
<ApiLoader :url="fetchUrl" @loaded="filterRepos" cache-name="repos" />
<slot />
</template>
<script setup lang="ts">
import { PropType, ref } from 'vue';
import { GithubRepo } from '../../types/github';
const props = defineProps({
sortBy: {
default: 'none',
required: false,
type: String as PropType<'stars' | 'forks' | 'pushed_at'>,
},
user: {
default: '',
required: true,
type: String,
},
limit: {
default: 5,
required: false,
type: Number,
},
});
const emits = defineEmits(['loaded']);
const fetchUrl = `https://api.github.com/users/${props.user}/repos?per_page=100`;
const repos = ref<GithubRepo[]>([]);
const filterRepos = (response: GithubRepo[]) => {
repos.value = response
.sort((a, b) => {
if (props.sortBy === 'stars') {
return b.stargazers_count - a.stargazers_count;
}
if (props.sortBy === 'pushed_at') {
const dateA = new Date(a.pushed_at);
const dateB = new Date(b.pushed_at);
return dateB.getTime() - dateA.getTime();
}
return b.forks_count - a.forks_count;
})
.slice(0, +props.limit);
emits('loaded', repos.value);
};
</script>

View File

@ -1,89 +0,0 @@
<template>
<div
class="githubRepo flex-col flex-space-between gap-1rem rounded-corners card-width"
>
<ApiLoader
:cache-name="repoName()"
:url="fetchUrl"
:already-known-data="props.data"
@loaded="(repo: GithubRepo) => (repository = repo)"
>
<h3>{{ repository?.name }}</h3>
<div>
<p>
{{ repository?.description }}
</p>
</div>
<div class="flex-row flex-start gap-1rem stats">
<div class="stars">
<Icon name="star" /> {{ repository?.stargazers_count }}
</div>
<div class="forks">
<Icon name="fork" /> {{ repository?.forks_count }}
</div>
<div class="link">
<a :href="repository?.html_url"><i class="icon phunic-link" /></a>
</div>
</div>
</ApiLoader>
</div>
</template>
<script setup lang="ts">
import ApiLoader from '../ApiLoader.vue';
import { GithubRepo } from '../../types/github';
import { PropType, Ref, ref } from 'vue';
const props = defineProps({
data: Object as PropType<GithubRepo>,
repoName: String,
});
const repoName = (): string => {
return props.data ? props.data.full_name : props.repoName;
};
const fetchUrl = `https://api.github.com/repos/${repoName()}`;
const repository: Ref<GithubRepo | null> = ref(null);
</script>
<style lang="less">
@import 'node_modules/nord/src/lesscss/nord.less';
@import '../../styles/classes.less';
.githubRepo {
padding: 2rem;
background-color: @nord4;
align-self: auto;
h3,
h3:first-child {
margin: 0;
padding: 0;
}
html.dark & {
background-color: @nord3;
}
.info {
max-width: 30rem;
}
.stats {
width: 4rem;
div {
.flex-row();
gap: 0.3rem;
}
}
.link {
a {
display: flex;
align-items: center;
}
}
}
</style>

View File

@ -1,54 +0,0 @@
<template>
<div class="list-repos flex-col gap-1rem">
<FetchRepositories
v-if="props.user !== ''"
:sort-by="props.sortBy"
:user="props.user"
:limit="props.limit"
@loaded="(response: GithubRepo[]) => (repos = response)"
>
<GithubRepository
:data="repo"
type="repositories"
v-for="repo in repos"
/>
</FetchRepositories>
<slot v-else />
</div>
</template>
<script setup lang="ts">
import FetchRepositories from './FetchRepositories.vue';
import GithubRepository from './GithubRepository.vue';
import { PropType, Ref, ref } from 'vue';
import { GithubRepo } from '../../types/github';
const props = defineProps({
sortBy: {
default: 'none',
required: false,
type: String as PropType<'stars' | 'forks' | 'pushed_at'>,
},
user: {
default: '',
required: false,
type: String,
},
limit: {
default: 5,
required: false,
type: Number,
},
});
const repos: Ref<GithubRepo[]> = ref(null);
</script>
<style lang="less">
@import '../../styles/classes.less';
.list-repos {
margin: 2rem auto;
}
</style>

View File

@ -1,17 +0,0 @@
<template>
<i :class="`icon phunic-${props.name}`" />
</template>
<script setup lang="ts">
const props = defineProps({
name: {
default: '',
required: true,
type: String,
},
});
</script>
<style lang="less">
@import '../styles/fonts.less';
</style>

View File

@ -1,47 +0,0 @@
<template>
<svg
class="circle-loader"
width="40"
height="40"
version="1.1"
xmlns="http://www.w3.org/2000/svg"
>
<circle cx="20" cy="20" r="15" />
</svg>
</template>
<style lang="less" scoped>
@import 'node_modules/nord/src/lesscss/nord.less';
.circle-loader {
margin-left: 48%;
fill: transparent;
stroke: @nord7;
stroke-width: 5;
animation: dash 1.5s ease infinite, rotate 2s linear infinite;
}
@keyframes dash {
0% {
stroke-dasharray: 1, 95;
stroke-dashoffset: 0;
}
50% {
stroke-dasharray: 85, 95;
stroke-dashoffset: -25;
}
100% {
stroke-dasharray: 85, 95;
stroke-dashoffset: -90;
}
}
@keyframes rotate {
0% {
transform: rotate(0deg);
}
100% {
transform: rotate(360deg);
}
}
</style>

View File

@ -1,25 +0,0 @@
<template>
<img :srcset="srcset" :sizes="sizes" :alt="props.alt" :src="props.src" />
</template>
<script setup lang="ts">
const props = defineProps<{
src: string;
width: number;
preview: string;
previewWidth: number;
previewThreshold?: number;
alt?: string;
}>();
const srcset = [
`${props.preview} ${props.previewWidth}w`,
`${props.src} ${props.width}w`,
].join(', ');
const sizes = [
`(max-width: ${props.previewThreshold || props.previewWidth}px) ${
props.previewWidth
}px`,
`${props.width}px`,
].join(', ');
</script>

View File

@ -1,62 +0,0 @@
import { Ref, computed, ref, watchEffect } from 'vue';
interface CacheOptions {
lifetime?: number;
timestampSuffix?: string;
forceUpdate?: boolean;
}
/**
* Cache data in local storage.
*
* The cache is updated if:
* - cache data does not exist
* - cached data is outdated and `data` is not null
* - or `options.forceUpdate` is true, regardless of the value of `data`
*
* Otherwise, data is retrieved from cache.
*
* @param {string} name Name of the cached value in local storage
* @param {Ref<T>} data Data to cache
* @param {CacheOptions} options Tweaks to the behaviour of the function
*/
export const useCache = <T>(
name: string,
data: Ref<T>,
options: CacheOptions,
) => {
const error = ref<string>(null);
const timestampName = name + (options?.timestampSuffix || '-timestamp');
const lifetime = options?.lifetime || 1000 * 60 * 60; // one hour in milliseconds
const lastUpdated: number = +localStorage.getItem(timestampName);
const cacheAge: number = Date.now() - lastUpdated;
const isDataOutdated = computed(() => {
return cacheAge > lifetime;
});
const shouldUpdate = computed(
() => options?.forceUpdate || (isDataOutdated.value && data.value != null),
);
const setData = () => {
console.log('Setting data in cache with name', name);
localStorage.setItem(name, JSON.stringify(data.value));
localStorage.setItem(timestampName, `${Date.now()}`);
};
const getData = () => {
console.log('Getting data from cache with name', name);
const cached = localStorage.getItem(name);
console.log('Value from storage:', cached);
try {
data.value = JSON.parse(cached);
} catch (err) {
console.error('Failed to parse cached data:', err);
data.value = null;
error.value = err;
}
};
getData();
watchEffect(() => (shouldUpdate.value ? setData() : getData()));
return { error, isDataOutdated };
};

View File

@ -1,72 +0,0 @@
import { ref, Ref } from 'vue';
import { useCache } from './cache';
type FetchAndCacheEmitter = (
event: 'loaded' | 'error' | 'loading',
...args: any[]
) => void;
interface UseFetchAndCacheOptions {
cacheLifetime?: number;
cacheName?: string;
emits?: FetchAndCacheEmitter;
}
const dummyEmits = (
_event: 'loaded' | 'error' | 'loading',
..._args: any[]
) => {};
export const useFetchAndCache = <T, E>(
url: string,
options?: UseFetchAndCacheOptions,
) => {
const data = ref<T | null>(null) as Ref<T>;
const error = ref<E | null>(null) as Ref<E>;
const loading = ref<boolean>(true);
const cacheLifetime: number = options?.cacheLifetime || 1000 * 60 * 60; // one hour
const cacheName: string = options?.cacheName || url;
const { isDataOutdated: isCacheOutdated, error: cacheError } = useCache(
cacheName,
data,
{
lifetime: cacheLifetime,
},
);
const emits: FetchAndCacheEmitter = options?.emits || dummyEmits;
const loaded = () => {
loading.value = false;
emits('loaded', data.value);
};
const fetchData = () => {
loading.value = true;
emits('loading');
console.log('Fetching from URL', url);
fetch(url)
.then((response) => {
if (!response.ok) {
throw new Error(response.statusText);
}
return response.json() as Promise<T>;
})
.then((responseData) => {
data.value = responseData;
loaded();
})
.catch((e) => {
console.warn('Caught error!', e);
error.value = e;
emits('error', e);
})
.finally(() => (loading.value = false));
};
if (isCacheOutdated.value || cacheError.value != null) {
fetchData();
} else {
loaded();
}
return { data, loading, error };
};

View File

@ -1,56 +0,0 @@
import { defaultTheme } from '@vuepress/theme-default';
import { viteBundler } from '@vuepress/bundler-vite';
import { defineUserConfig } from 'vuepress';
import { slimsearchPlugin } from '@vuepress/plugin-slimsearch';
import { umamiAnalyticsPlugin } from '@vuepress/plugin-umami-analytics';
import { head } from './head';
import { locales, searchLocaleLfn } from './locales';
import { themeLocales } from './themeLocales';
const isProd = process.env.NODE_ENV === 'production';
export default defineUserConfig({
lang: 'fr-FR',
title: 'Lucien Cartier-Tilet',
description: 'Site web personnel de Lucien Cartier-Tilet',
head: head,
bundler: isProd
? viteBundler({})
: viteBundler({
viteOptions: {
server: {
allowedHosts: true,
},
},
}),
markdown: {
html: true,
linkify: true,
typographer: true,
},
plugins: [
slimsearchPlugin({
indexContent: true,
indexLocaleOptions: {
'/lfn': searchLocaleLfn,
},
}),
isProd
? umamiAnalyticsPlugin({
id: '67166941-8c83-4a19-bc8c-139e44b7f7aa',
link: 'https://umami.phundrak.com/script.js',
})
: [],
],
locales: locales,
theme: defaultTheme({
contributors: false,
locales: themeLocales,
repo: 'https://labs.phundrak.com/phundrak/phundrak.com',
themePlugins: {
copyCode: false,
prismjs: false,
},
}),
});

View File

@ -1,142 +0,0 @@
import { HeadAttrsConfig } from 'vuepress';
interface SimplifiedHeader {
tag: string;
content: HeadAttrsConfig[];
}
const simplifiedHead: SimplifiedHeader[] = [
{
tag: 'meta',
content: [
{
name: 'author',
content: 'Lucien Cartier-Tilet',
},
{
name: 'fediverse:creator',
content: '@phundrak@mastodon.phundrak.com',
},
{
property: 'og:image',
content: 'https://cdn.phundrak.com/img/rich_preview.png',
},
{
property: 'org:title',
content: 'Lucien Cartier-Tilet',
},
{
property: 'og:description',
content: 'Site web personnel de Lucien Cartier-Tilet',
},
{
name: 'twitter:card',
content: 'summary',
},
{
name: 'twitter:site',
content: '@phundrak',
},
{
name: 'twitter:creator',
content: '@phundrak',
},
{
name: 'build-status',
content: `value: ${process.env.NODE_ENV}`,
},
{ name: 'msapplication-TileColor', content: '#3b4252' },
{ name: 'msapplication-TileImage', content: '/ms-icon-144x144.png' },
{ name: 'theme-color', content: '#3b4252' },
],
},
{
tag: 'link',
content: [
{
rel: 'apple-touch-icon',
sizes: '57x57',
href: '/apple-icon-57x57.png',
},
{
rel: 'apple-touch-icon',
sizes: '60x60',
href: '/apple-icon-60x60.png',
},
{
rel: 'apple-touch-icon',
sizes: '72x72',
href: '/apple-icon-72x72.png',
},
{
rel: 'apple-touch-icon',
sizes: '76x76',
href: '/apple-icon-76x76.png',
},
{
rel: 'apple-touch-icon',
sizes: '114x114',
href: '/apple-icon-114x114.png',
},
{
rel: 'apple-touch-icon',
sizes: '120x120',
href: '/apple-icon-120x120.png',
},
{
rel: 'apple-touch-icon',
sizes: '144x144',
href: '/apple-icon-144x144.png',
},
{
rel: 'apple-touch-icon',
sizes: '152x152',
href: '/apple-icon-152x152.png',
},
{
rel: 'apple-touch-icon',
sizes: '180x180',
href: '/apple-icon-180x180.png',
},
{
rel: 'icon',
type: 'image/png',
sizes: '192x192',
href: '/android-icon-192x192.png',
},
{
rel: 'icon',
type: 'image/png',
sizes: '32x32',
href: '/favicon-32x32.png',
},
{
rel: 'icon',
type: 'image/png',
sizes: '96x96',
href: '/favicon-96x96.png',
},
{
rel: 'icon',
type: 'image/png',
sizes: '16x16',
href: '/favicon-16x16.png',
},
{ rel: 'manifest', href: '/manifest.json' },
],
},
];
const headBuilder = [];
simplifiedHead.forEach((tag) => {
tag.content.forEach((element) => {
headBuilder.push([tag.tag, element]);
});
});
headBuilder.push([
'a',
{ rel: 'me', href: 'https://mastodon.phundrak.com/@phundrak' },
'Mastodon',
]);
export const head = headBuilder;

View File

@ -1,36 +0,0 @@
import SlimSarchLocaleData from '@vuepress/plugin-slimsearch';
export const locales = {
'/': {
lang: 'fr-FR',
title: 'Lucien Cartier-Tilet',
description: 'Site web personnel de Lucien Cartier-Tilet',
},
'/en/': {
lang: 'en-US',
title: 'Lucien Cartier-Tilet',
description: 'Personal website of Lucien Cartier-Tilet',
},
'/lfn/': {
lang: 'lfn',
title: 'Lucien Cartier-Tilet',
description: 'loca ueb de Lucien Cartier-Tilet',
},
};
export const searchLocaleLfn: SlimSarchLocaleData = {
cancel: 'Cansela',
placeholder: 'Xerca',
search: 'Xerca',
searching: 'Xercante',
defaultTitle: 'Documentos',
select: 'eleje',
navigate: 'naviga',
autocomplete: 'auto-completi',
exit: 'sorti',
queryHistory: 'Historia de xerca',
resultHistory: 'Historia de resultas',
emptyHistory: 'Historia vacua',
emptyResult: 'Resultas vacua',
loading: 'Cargante la indise de xerca...',
};

View File

@ -1,24 +0,0 @@
{
"subject": "acct:phundrak@emacs.ch",
"aliases": ["https://emacs.ch/@phundrak", "https://emacs.ch/users/phundrak"],
"links": [
{
"rel": "http://webfinger.net/rel/profile-page",
"type": "text/html",
"href": "https://emacs.ch/@phundrak"
},
{
"rel": "self",
"type": "application/activity+json",
"href": "https://emacs.ch/users/phundrak"
},
{
"rel": "http://ostatus.org/schema/1.0/subscribe",
"template": "https://emacs.ch/authorize_interaction?uri={uri}"
},
{
"rel": "http://openid.net/specs/connect/1.0/issuer",
"href": "https://auth.phundrak.com"
}
]
}

[17 binary image files deleted — previews not shown]

View File

@ -1,2 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<browserconfig><msapplication><tile><square70x70logo src="/ms-icon-70x70.png"/><square150x150logo src="/ms-icon-150x150.png"/><square310x310logo src="/ms-icon-310x310.png"/><TileColor>#eceff4</TileColor></tile></msapplication></browserconfig>

View File

@ -1,18 +0,0 @@
body {
margin: 3em;
}
body, code,p {
background: #e5e9f0 !important;
line-height: 1.4 !important;
color: #2E3440;
font-size: 16px !important;
}
blockquote, blockquote p {
border-left-color: #3b4252;
}
pre {
padding: 10px;
}

[4 binary image files deleted — previews not shown]

View File

@ -1,36 +0,0 @@
<?xml version="1.0" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd" >
<svg xmlns="http://www.w3.org/2000/svg">
<metadata>Generated by IcoMoon</metadata>
<defs>
<font id="phunic" horiz-adv-x="1024">
<font-face units-per-em="1024" ascent="960" descent="-64" />
<missing-glyph horiz-adv-x="1024" />
<glyph unicode="&#x20;" horiz-adv-x="512" d="" />
<glyph unicode="&#xe900;" glyph-name="star" horiz-adv-x="1152" d="M575.8 960c18.4 0 35.2-10.4 43.2-27l137.2-282.6 306.4-45.2c18-2.6 33-15.2 38.6-32.6s1-36.2-11.8-49l-222.2-220.4 52.4-311.2c3-18-4.4-36.2-19.2-47s-34.6-12-50.6-3.4l-274 146.4-273.8-146.2c-16.2-8.6-35.8-7.4-50.6 3.4s-22.4 29-19.4 47l52.4 311.2-222.2 220.2c-13 12.8-17.4 31.8-11.8 49s20.6 29.8 38.6 32.6l306.4 45.2 137.2 282.6c8.2 16.6 24.8 27 43.2 27zM575.8 802l-105-216.4c-7-14.2-20.4-24.2-36.2-26.6l-236.6-34.8 171.8-170.2c11-11 16.2-26.6 13.6-42l-40.6-239.4 210.4 112.4c14.2 7.6 31.2 7.6 45.2 0l210.4-112.4-40.4 239.2c-2.6 15.4 2.4 31 13.6 42l171.8 170.2-236.6 35c-15.6 2.4-29.2 12.2-36.2 26.6l-105.2 216.4z" />
<glyph unicode="&#xe901;" glyph-name="envelope" d="M96 832c-53 0-96-43-96-96 0-30.2 14.2-58.6 38.4-76.8l435.2-326.4c22.8-17 54-17 76.8 0l435.2 326.4c24.2 18.2 38.4 46.6 38.4 76.8 0 53-43 96-96 96h-832zM0 608v-416c0-70.6 57.4-128 128-128h768c70.6 0 128 57.4 128 128v416l-435.2-326.4c-45.6-34.2-108-34.2-153.6 0l-435.2 326.4z" />
<glyph unicode="&#xe902;" glyph-name="emacs" horiz-adv-x="953" d="M783.986 963.865h-275.721c83.997-19.531 275.721-63.47 275.721-98.442 0-74.39-354.471 19.638-433.184 19.638-39.366 0-118.026-6.581-118.115-59.058-0.338-137.717 185.123-255.869 314.998-334.671-90.454 26.238-173.296 39.348-252.045 39.348-102.407 0-299.236-39.348-299.236-177.28 0-121.709 246.779-233.919 314.998-255.834 118.507-38.156 314.998-58.844 314.998-78.749 0-19.798-157.464-39.348-354.471-39.348h-118.098c78.821-39.561 196.918-39.561 315.104-39.561l118.098 0.089c78.732 0.089 275.561 2.935 275.561 70.905 0 59.253-236.194 109.524-354.381 133.912-164.739 33.975-284.367 100.096-275.561 149.475 20.955 118.667 314.998 118.187 551.299 118.187-103.047 82.876-354.471 236.285-354.471 303.31 0 28.551 39.455 35.238 78.732 35.346 157.553 0.907 275.721-33.318 315.158-33.318 78.66 0 118.045 49.095 118.045 108.172 0.035 98.53-117.99 137.877-157.427 137.877z" />
<glyph unicode="&#xe903;" glyph-name="gitea" d="M179.534 736.179c-10.527 0-22.386-0.823-35.824-3.754-14.172-2.932-54.552-12.077-87.618-43.842-73.302-65.32-54.584-169.208-52.304-184.844 2.769-19.058 11.207-71.999 51.604-118.097 74.605-91.383 235.239-89.273 235.239-89.273s19.688-47.089 49.823-90.418c40.723-53.917 82.592-95.932 123.315-100.981 102.622 0 307.714 0.127 307.714 0.127s19.581-0.143 46.131 16.798c22.805 13.846 43.141 38.115 43.141 38.114s21.011 22.5 50.332 73.811c8.959 15.801 16.455 31.062 22.97 45.559 0 0 89.91 190.801 89.91 376.498-1.792 56.198-15.641 66.099-18.899 69.357-6.679 6.679-15.653 6.554-15.653 6.554s-190.895-10.763-289.771-13.044c-21.601-0.488-43.054-0.981-64.33-1.145l0.127-190.572c0 0 93.541-39.449 135.404-65.347 6.027-3.747 16.6-11.1 20.998-23.479 3.421-9.936 3.232-21.334-1.654-31.433l-99.326-206.67c-10.099-20.687-34.869-29.534-55.231-19.598l-206.733 99.39c-20.362 9.774-29.153 34.516-19.216 55.039l99.39 206.733c9.774 20.362 34.516 29.153 55.040 19.216 27.903-13.465 43.837-21.063 43.969-21.125 0 59.356-0.127 177.655-0.127 177.655-47.239-0.652-145.331 3.627-145.331 3.627s-230.325 11.527-255.41 13.807c-7.982 0.489-17.153 1.336-27.679 1.336zM199.387 657.979c0 0 11.57-96.789 25.579-153.475 11.728-47.565 40.406-126.559 40.406-126.559s-42.528 5.053-70.056 14.825c-42.189 13.846-60.067 30.479-60.067 30.479s-31.131 21.836-46.768 64.839c-26.877 71.999-2.291 115.934-2.291 115.934s13.708 36.651 62.739 48.868c22.479 6.027 50.459 5.090 50.459 5.090zM523.077 350.102c-13.357-0.163-25.085-9.448-28.18-22.479s3.258-26.551 14.823-32.579c12.543-6.516 28.506-2.932 36.977 8.796 8.308 11.565 7.004 27.529-2.932 37.628l39.094 79.98c2.443-0.163 6.027-0.326 10.099 0.814 6.679 1.466 11.565 5.864 11.565 5.864 6.841-2.932 14.009-6.19 21.502-9.936 7.819-3.909 15.149-7.982 21.828-11.891 1.466-0.814 2.932-1.792 4.561-3.095 2.606-2.118 5.538-5.050 7.656-8.959 3.095-8.959-3.095-24.271-3.095-24.271-3.747-12.38-29.972-66.134-29.972-66.134-13.194 0.326-24.923-8.145-28.832-20.362-4.235-13.194 1.792-28.18 14.497-34.696s28.343-2.769 36.651 8.633c8.145 11.077 7.493 26.551-1.792 36.814 3.095 6.027 6.027 12.054 9.122 18.407 8.145 16.941 21.99 49.519 21.99 49.519 1.466 2.769 9.285 16.778 4.398 34.696-4.072 18.57-20.524 27.203-20.524 27.203-19.873 12.869-47.565 24.76-47.565 24.76s0 6.679-1.792 11.565c-1.792 5.050-4.561 8.308-6.353 10.262 7.656 15.801 15.312 31.438 22.968 47.239-6.679 3.258-13.194 6.516-19.873 9.936-7.819-15.963-15.801-32.090-23.619-48.053-10.914 0.163-21.013-5.701-26.226-15.312-5.538-10.262-4.398-22.968 3.095-32.253l-40.072-82.098z" />
<glyph unicode="&#xe904;" glyph-name="share" horiz-adv-x="896" d="M570.8 565.8l-188.2-94c1-7.8-0.4-14-0.4-23.8 0-8 1.4-14.2 0.4-23.8l188.2-94c34.4 33.4 81.4 53.8 133.2 53.8 106 0 192-84.2 192-192 0-106-86-192-192-192-107.8 0-192 86-192 192 0 9.8 0.4 16 1.4 23.8l-188.2 94c-34.4-33.4-81.4-53.8-133.2-53.8-106.040 0-192 86-192 192 0 107.8 85.96 192 192 192 51.8 0 98.8-20.4 133.2-53.8l188.2 94c-1 9.6-1.4 15.8-1.4 23.8 0 106.040 84.2 192 192 192 106 0 192-85.96 192-192 0-106-86-192-192-192-51.8 0-98.8 20.4-133.2 53.8v0z" />
<glyph unicode="&#xe905;" glyph-name="terminal" horiz-adv-x="1152" d="M18.744 786.74c-24.992 25-24.992 65.52 0 90.52 24.996 24.98 65.516 24.98 90.516 0l383.94-384.060c25-25 25-65.4 0-90.4l-383.94-384c-25-25-65.52-25-90.516 0-24.992 25-24.992 65.4 0 90.4l338.656 338.8-338.656 338.74zM1088 128c35.4 0 64-28.6 64-64s-28.6-64-64-64h-576c-35.4 0-64 28.6-64 64s28.6 64 64 64h576z" />
<glyph unicode="&#xe906;" glyph-name="at" d="M415.6 918.54c-186.9-36.64-337.4-187.32-374-374.2-55.28-281.8 137.3-532.4 398.2-570.2 38.020-5.776 72.34 24.52 72.34 62.98v1.326c0 31.48-22.88 57.76-53.68 62.48-168.7 25.96-298.4 172.26-298.4 348.4 0 205.8 177.22 371 386.8 350.8 183.080-17.738 317.2-182.5 317.2-366.4v-32.32c0-44.18-35.88-80.1-80-80.1s-80.020 35.92-80.020 80.1v240.2c0 17.694-14.322 32.040-32.020 32.040l-63.96-0.007c-14.598 0-26.4-9.984-30.24-23.36-49.7 24.3-108.48 32.76-172.12 10.212-77.5-27.46-136.24-97.82-147.44-179.28-18.966-138.020 87.62-256 221.8-256 52.88 0 100.86 19.088 139.18 49.76 48-62.6 130.46-97.38 218.8-74.98 92.36 21.409 153.96 111.809 152.16 205.609v41.8c0 298.4-267.8 531.264-574.6 471.14zM478.2 351.4c-52.94 0-96 43.12-96 96.1s43.060 96.1 96 96.1 96-43.12 96-96.1-41.2-96.1-96-96.1z" />
<glyph unicode="&#xe907;" glyph-name="mastodon" horiz-adv-x="896" d="M866 601.78c0 194.4-127.42 251.4-127.42 251.4-125.040 57.4-457.12 56.8-580.96 0 0 0-127.44-57-127.44-251.4 0-231.4-13.2-518.8 211.26-578.2 81.020-21.4 150.64-26 206.66-22.8 101.62 5.6 158.64 36.2 158.64 36.2l-3.4 73.8s-72.62-22.8-154.24-20.2c-80.82 2.8-166 8.8-179.26 108-1.147 8.146-1.801 17.557-1.801 27.12 0 0.239 0 0.478 0.001 0.717v-0.037c171.26-41.8 317.3-18.2 357.5-13.4 112.24 13.4 210 82.6 222.46 145.8 19.6 99.6 18 243 18 243zM715.76 351.38h-93.26v228.4c0 99.4-128 103.2-128-13.8v-125h-92.66v125.020c0 117-128 113.2-128 13.8v-228.4h-93.46c0 244.2-10.4 295.8 36.82 350 51.8 57.8 159.64 61.6 207.66-12.2l23.2-39 23.2 39c48.22 74.2 156.24 69.6 207.66 12.2 47.42-54.6 36.8-106 36.8-350z" />
<glyph unicode="&#xe908;" glyph-name="conlang" d="M512 800c-385.364 0-621.988-380.55-460.664-693.875l106.664 131.875 654 66 84-130-842.73-71.551c2.159-4.106 4.349-8.208 6.645-12.289l892.086 73.84 20.246-53.512c157.457 279.979-52.304 689.512-460.246 689.512zM410 490h204l54-66-328-30 70 96zM694 412l74-98-560-56 90 118 396 36z" />
<glyph unicode="&#xe909;" glyph-name="link" horiz-adv-x="1280" d="M1159.6 424.6c113 113 113 296 0 409-100 100-257.6 113-372.6 30.8l-3.2-2.2c-28.8-20.6-35.4-60.6-14.8-89.2s60.6-35.4 89.2-14.8l3.2 2.2c64.2 45.8 152 38.6 207.6-17.2 63-63 63-165 0-228l-224.4-224.8c-63-63-165-63-228 0-55.8 55.8-63 143.6-17.2 207.6l2.2 3.2c20.6 28.8 13.8 68.8-14.8 89.2s-68.8 13.8-89.2-14.8l-2.2-3.2c-82.4-114.8-69.4-272.4 30.6-372.4 113-113 296-113 409 0l224.6 224.6zM120.4 471.4c-113-113-113-296 0-409 100-100 257.6-113 372.6-30.8l3.2 2.2c28.8 20.6 35.4 60.6 14.8 89.2s-60.6 35.4-89.2 14.8l-3.2-2.2c-64.2-45.8-152-38.6-207.6 17.2-63 63.2-63 165.2 0 228.2l224.4 224.6c63 63 165 63 228 0 55.8-55.8 63-143.6 17.2-207.8l-2.2-3.2c-20.6-28.8-13.8-68.8 14.8-89.2s68.8-13.8 89.2 14.8l2.2 3.2c82.4 115 69.4 272.6-30.6 372.6-113 113-296 113-409 0l-224.6-224.6z" />
<glyph unicode="&#xe90a;" glyph-name="code" horiz-adv-x="1280" d="M829.6 878.42l-256-896.020c-9.8-34-45.2-53.6-79.2-44-34 9.8-53.6 45.2-44 79.2l256 895.98c9.8 33.988 45.2 53.668 79.2 43.956 34-9.71 53.6-45.136 44-79.116v0zM1037.2 717.2l224-224c25-25 25-65.4 0-90.4l-224-224c-25-25-65.4-25-90.4 0s-25 65.4 0 90.4l178.6 178.8-178.6 178.8c-25 25-25 65.4 0 90.4s65.4 25 90.4 0v0zM333.2 626.8l-178.7-178.8 178.7-178.8c25-25 25-65.4 0-90.4s-65.4-25-90.4 0l-224.056 224c-24.992 25-24.992 65.4 0 90.4l224.056 224c25 25 65.4 25 90.4 0s25-65.4 0-90.4v0z" />
<glyph unicode="&#xe90b;" glyph-name="fork" horiz-adv-x="896" d="M320 800c0-65.6-39.4-120.2-96-146.6v-175.6c37.6 21.8 81.4 34.2 128 34.2h192c70.6 0 128 57.4 128 128v13.4c-56.6 26.4-96 81-96 146.6 0 88.36 71.6 160 160 160s160-71.64 160-160c0-65.6-39.4-120.2-96-146.6v-13.4c0-141.4-114.6-256-256-256h-192c-70.6 0-128-57.4-128-128v-13.4c56.6-24.6 96-81 96-146.6 0-88.4-71.6-160-160-160-88.36 0-160 71.6-160 160 0 65.6 39.5 122 96 146.6v410.8c-56.5 26.4-96 81-96 146.6 0 88.36 71.64 160 160 160 88.4 0 160-71.64 160-160v0zM160 752c26.5 0 48 21.5 48 48s-21.5 48-48 48c-26.5 0-48-21.5-48-48s21.5-48 48-48zM736 848c-26.6 0-48-21.5-48-48s21.4-48 48-48c26.6 0 48 21.5 48 48s-21.4 48-48 48zM160 48c26.5 0 48 21.4 48 48s-21.5 48-48 48c-26.5 0-48-21.4-48-48s21.5-48 48-48z" />
<glyph unicode="&#xe90c;" glyph-name="house" horiz-adv-x="1152" d="M1151.6 449c0-36-30-64.2-64-64.2h-64l1.4-320.2c0-5.6-0.4-10.8-1-16.2v-32.4c0-44.2-35.8-80-80-80h-32c-2.2 0-4.4 1.8-6.6 0.2-2.8 1.6-5.6-0.2-8.4-0.2h-113c-44.2 0-80 35.8-80 80v176c0 35.4-28.6 64-64 64h-128c-35.4 0-64-28.6-64-64v-176c0-44.2-35.8-80-80-80h-111.8c-3 0-6 0.2-9 0.4-2.4-0.2-4.8-0.4-7.2-0.4h-32c-44.18 0-80 35.8-80 80v224c0 1.8 0.060 3.8 0.18 5.6v139.2h-64.080c-36.060 0-64.1 28.2-64.1 64.2 0 18 6.008 34 20.020 48l512.78 446.968c14 14.028 30 16.032 44 16.032s30-4.008 42.2-14.028l510.6-448.972c16-14 24.2-30 22-48v0z" />
<glyph unicode="&#xe90d;" glyph-name="language" horiz-adv-x="1280" d="M896 632c22 0 40-16.2 40-40v-8h120c22 0 40-16.2 40-40 0-22-18-40-40-40h-4l-3.2-9c-17.8-47.2-45-93.2-79.4-130.8 1.8-1 3.6-0.4 5.4-3.2l37.8-22.6c19-11.4 25-36 13.6-55-11.2-19-35.8-25-54.8-13.6l-37.8 22.6c-8.8 5.4-19.4 11-26.2 17-21-15-43.8-28-67.8-38.8l-7.4-3.2c-20.2-9-43.8 0.2-52.8 20.4s0.2 43.8 20.4 52.8l7.2 3.2c12.8 5.8 25.2 14 37 19.6l-24.2 24.4c-15.8 15.6-15.8 40.8 0 56.4 15.6 15.8 40.8 15.8 56.4 0l29.2-29 1.2 0.6c24.8 24.4 45 54.8 59.6 90h-214.2c-23.8 0-40 16.2-40 40 0 22 16.2 40 40 40h104v8c0 22 16.2 40 40 40v-1.8zM320 493.6l38-85.6h-77.8l39.8 85.6zM0 704c0 70.7 57.3 128 128 128h1024c70.6 0 128-57.3 128-128v-512c0-70.6-57.4-128-128-128h-1024c-70.7 0-128 57.4-128 128v512zM640 192h512v512h-512v-512zM356.6 608.2c-6.4 14.4-20.8 23.8-36.6 23.8s-30.2-9.4-36.6-23.8l-127.96-288c-8.96-18.4 0.12-43.8 20.32-52.8 20.18-9 43.84 0.2 52.84 20.4l17.8 42h147.2l17.8-42c9-20.2 32.6-29.4 52.8-20.4s29.4 34.4 20.4 52.8l-128 288z" />
<glyph unicode="&#xe90e;" glyph-name="mic-lines" horiz-adv-x="768" d="M384 256c106.060 0 192 85.94 192 192h-160c-17.6 0-32 14.4-32 32s14.4 32 32 32h160v64h-160c-17.6 0-32 14.4-32 32s14.4 32 32 32h160v65.8h-160c-17.672 0-32 14.328-32 32s14.328 32 32 32l160-1.8c0 106.060-85.94 192-192 192s-192-85.94-192-192v-320c0-106 84.2-192 192-192zM688 576c-26.6 0-48-21.4-48-46.2v-81.8c0-146.66-123.94-264.8-272.6-255.4-132.16 8.338-239.4 133.18-239.4 265.6v71.6c0 24.8-21.5 46.2-48 46.2s-48-21.4-48-46.2v-64.3c0-179.32 127.94-339.2 304-363.4v-70.1h-80c-36.38 0-65.68-30.36-63.92-67.14 0.78-16.46 15.52-28.86 31.92-28.86h320c16.444 0 31.14 12.432 31.92 28.86 1.68 36.74-27.52 67.14-63.92 67.14h-80v67.54c171.4 23.46 304 170.66 304 348.46v81.8c0 24.8-21.4 46.2-48 46.2z" />
<glyph unicode="&#xe90f;" glyph-name="question" horiz-adv-x="640" d="M408.6 895.98h-216.6c-105.88 0-192-86.12-192-192 0-35.34 28.62-62.2 64-62.2s64 28.64 64 62.2c0 35.28 28.68 64 64 64h216.6c57 0 103.4-46.38 103.4-103.58 0-39.44-21.94-74.94-61-94.66l-195.4-114.54c-21.4-11.6-31.6-32.6-31.6-55.2v-80c0-35.34 28.62-63.98 64-63.98s64 28.64 64 63.98v43.4l160 94c78.94 39.5 128 118.84 128 207 0 127.7-103.8 231.58-231.4 231.58zM288 160c-44.18 0-80-35.82-80-80s35.82-78.2 80-78.2 80 35.8 80 78.2-35.8 80-80 80z" />
<glyph unicode="&#xe910;" glyph-name="discord" horiz-adv-x="1280" d="M1049.062 820.328c-0.331 0.632-0.862 1.122-1.508 1.393l-0.020 0.007c-69.126 32.613-149.446 58.394-233.51 73.348l-5.862 0.864c-0.2 0.039-0.429 0.061-0.664 0.061-1.364 0-2.552-0.751-3.173-1.863l-0.009-0.018c-9.162-16.095-19.138-36.2-28.112-56.841l-1.688-4.359c-40.401 6.456-86.983 10.145-134.426 10.145s-94.023-3.689-139.472-10.794l5.046 0.65c-10.583 24.679-20.712 44.78-31.866 64.218l1.596-3.018c-0.669 1.124-1.878 1.866-3.26 1.866-0.208 0-0.412-0.017-0.61-0.049l0.022 0.003c-89.917-15.782-170.24-41.566-245.309-76.709l5.933 2.495c-0.662-0.286-1.201-0.752-1.568-1.338l-0.008-0.014c-152.458-227.676-194.222-449.754-173.734-669.082 0.124-1.122 0.692-2.092 1.521-2.743l0.009-0.007c83.919-62.742 181.476-113.306 286.742-146.499l6.908-1.879c0.327-0.102 0.702-0.16 1.092-0.16 1.236 0 2.333 0.59 3.027 1.503l0.007 0.009c20.992 28.236 40.943 60.215 58.246 93.782l1.828 3.902c0.254 0.49 0.402 1.069 0.402 1.683 0 1.595-1.004 2.956-2.415 3.485l-0.026 0.008c-35.78 13.792-65.987 28.482-94.765 45.347l3.029-1.641c-1.12 0.667-1.859 1.872-1.859 3.25 0 1.221 0.58 2.306 1.48 2.995l0.009 0.007c6.164 4.618 12.332 9.422 18.218 14.274 0.623 0.516 1.432 0.83 2.313 0.83 0.539 0 1.050-0.117 1.51-0.327l-0.023 0.009c192.458-87.834 400.82-87.834 591 0 0.455 0.221 0.99 0.351 1.556 0.351 0.873 0 1.673-0.308 2.298-0.822l-0.006 0.005c5.888-4.852 12.054-9.702 18.264-14.32 0.922-0.695 1.511-1.787 1.511-3.017 0-1.367-0.728-2.565-1.818-3.225l-0.017-0.009c-25.909-15.466-56.144-30.151-87.654-42.266l-4.126-1.394c-1.425-0.553-2.416-1.913-2.416-3.505 0-0.627 0.154-1.218 0.426-1.738l-0.010 0.021c19.628-37.639 39.545-69.579 61.585-99.876l-1.557 2.246c0.684-0.951 1.788-1.563 3.035-1.563 0.389 0 0.765 0.060 1.117 0.17l-0.026-0.007c112.357 34.95 210.088 85.528 296.679 150.197l-2.555-1.825c0.853 0.627 1.427 1.59 1.529 2.689l0.001 0.015c24.528 253.566-41.064 473.824-173.868 669.082zM444.982 284.84c-57.944 0-105.688 53.174-105.688 118.478s46.818 118.482 105.688 118.482c59.33 0 106.612-53.64 105.686-118.478 0-65.308-46.82-118.482-105.686-118.482zM835.742 284.84c-57.942 0-105.686 53.174-105.686 118.478s46.818 118.482 105.686 118.482c59.334 0 106.614-53.64 105.688-118.478 0-65.308-46.354-118.482-105.688-118.482z" />
<glyph unicode="&#xe911;" glyph-name="writefreely" horiz-adv-x="1494" d="M326.398 822.626c-67.928-27.714-164.043-198.071-111.689-197.965 11.689 0.026 27.462 19.506 44.156 54.547 30.149 63.267 71.106 92.993 97.712 70.916 21.255-17.641 21.513-89.775 0.91-254.979-39.539-316.996-6.135-421.177 135.046-421.177 82.517 0 169.528 63.124 226.677 164.445l25.359 44.964 4.578-63.285c20.127-278.251 408.917-148.508 518.99 173.19 86.012 251.376-51.838 536.687-202.038 418.17-64.428-50.843-4.336-141.484 66.875-100.878 111.357 63.494 127.429-242.113 21.46-408.032-84.677-132.585-214.644-193.552-277.853-130.344-32.561 32.562-33.85 102.731-5.484 298.609 18.047 124.625 25.9 232.713 22.694 312.261-1.122 27.769-2.859 28.573-61.669 28.573h-60.521l4.592-111.689c12.99-316.177-121.112-587.149-271.854-549.316-49.7 12.471-53.49 49.311-28.803 279.964 27.078 253.024 27.557 292.562 4.095 339.336-27.547 54.927-92.775 77.354-153.232 52.689z" />
<glyph unicode="&#xea96;" glyph-name="twitter" d="M1024 733.6c-37.6-16.8-78.2-28-120.6-33 43.4 26 76.6 67.2 92.4 116.2-40.6-24-85.6-41.6-133.4-51-38.4 40.8-93 66.2-153.4 66.2-116 0-210-94-210-210 0-16.4 1.8-32.4 5.4-47.8-174.6 8.8-329.4 92.4-433 219.6-18-31-28.4-67.2-28.4-105.6 0-72.8 37-137.2 93.4-174.8-34.4 1-66.8 10.6-95.2 26.2 0-0.8 0-1.8 0-2.6 0-101.8 72.4-186.8 168.6-206-17.6-4.8-36.2-7.4-55.4-7.4-13.6 0-26.6 1.4-39.6 3.8 26.8-83.4 104.4-144.2 196.2-146-72-56.4-162.4-90-261-90-17 0-33.6 1-50.2 3 93.2-59.8 203.6-94.4 322.2-94.4 386.4 0 597.8 320.2 597.8 597.8 0 9.2-0.2 18.2-0.6 27.2 41 29.4 76.6 66.4 104.8 108.6z" />
<glyph unicode="&#xea9b;" glyph-name="rss" d="M136.294 209.070c-75.196 0-136.292-61.334-136.292-136.076 0-75.154 61.1-135.802 136.292-135.802 75.466 0 136.494 60.648 136.494 135.802-0.002 74.742-61.024 136.076-136.494 136.076zM0.156 612.070v-196.258c127.784 0 247.958-49.972 338.458-140.512 90.384-90.318 140.282-211.036 140.282-339.3h197.122c-0.002 372.82-303.282 676.070-675.862 676.070zM0.388 960v-196.356c455.782 0 826.756-371.334 826.756-827.644h196.856c0 564.47-459.254 1024-1023.612 1024z" />
<glyph unicode="&#xea9d;" glyph-name="youtube" d="M1013.8 652.8c0 0-10 70.6-40.8 101.6-39 40.8-82.6 41-102.6 43.4-143.2 10.4-358.2 10.4-358.2 10.4h-0.4c0 0-215 0-358.2-10.4-20-2.4-63.6-2.6-102.6-43.4-30.8-31-40.6-101.6-40.6-101.6s-10.2-82.8-10.2-165.8v-77.6c0-82.8 10.2-165.8 10.2-165.8s10-70.6 40.6-101.6c39-40.8 90.2-39.4 113-43.8 82-7.8 348.2-10.2 348.2-10.2s215.2 0.4 358.4 10.6c20 2.4 63.6 2.6 102.6 43.4 30.8 31 40.8 101.6 40.8 101.6s10.2 82.8 10.2 165.8v77.6c-0.2 82.8-10.4 165.8-10.4 165.8zM406.2 315.2v287.8l276.6-144.4-276.6-143.4z" />
<glyph unicode="&#xea9f;" glyph-name="twitch" d="M96 960l-96-160v-736h256v-128h128l128 128h160l288 288v608h-864zM832 416l-160-160h-160l-128-128v128h-192v576h640v-416zM608 704h96v-256h-96v256zM416 704h96v-256h-96v256z" />
<glyph unicode="&#xeab0;" glyph-name="github" d="M512.008 947.358c-282.738 0-512.008-229.218-512.008-511.998 0-226.214 146.704-418.132 350.136-485.836 25.586-4.738 34.992 11.11 34.992 24.632 0 12.204-0.48 52.542-0.696 95.324-142.448-30.976-172.504 60.41-172.504 60.41-23.282 59.176-56.848 74.916-56.848 74.916-46.452 31.778 3.51 31.124 3.51 31.124 51.4-3.61 78.476-52.766 78.476-52.766 45.672-78.27 119.776-55.64 149.004-42.558 4.588 33.086 17.852 55.68 32.506 68.464-113.73 12.942-233.276 56.85-233.276 253.032 0 55.898 20.004 101.574 52.76 137.428-5.316 12.9-22.854 64.972 4.952 135.5 0 0 43.006 13.752 140.84-52.49 40.836 11.348 84.636 17.036 128.154 17.234 43.502-0.198 87.336-5.886 128.256-17.234 97.734 66.244 140.656 52.49 140.656 52.49 27.872-70.528 10.35-122.6 5.036-135.5 32.82-35.856 52.694-81.532 52.694-137.428 0-196.654-119.778-239.95-233.79-252.624 18.364-15.89 34.724-47.046 34.724-94.812 0-68.508-0.596-123.644-0.596-140.508 0-13.628 9.222-29.594 35.172-24.566 203.322 67.776 349.842 259.626 349.842 485.768 0 282.78-229.234 511.998-511.992 511.998z" />
<glyph unicode="&#xeac6;" glyph-name="reddit" d="M256 320c0 35.346 28.654 64 64 64s64-28.654 64-64c0-35.346-28.654-64-64-64s-64 28.654-64 64zM640 320c0 35.346 28.654 64 64 64s64-28.654 64-64c0-35.346-28.654-64-64-64s-64 28.654-64 64zM643.112 183.222c16.482 12.986 40.376 10.154 53.364-6.332s10.152-40.378-6.334-53.366c-45.896-36.158-115.822-59.524-178.142-59.524-62.322 0-132.248 23.366-178.144 59.522-16.486 12.99-19.32 36.882-6.332 53.368 12.99 16.482 36.882 19.318 53.366 6.332 26.422-20.818 78.722-43.222 131.11-43.222s104.688 22.404 131.112 43.222zM1024 448c0 70.692-57.308 128-128 128-48.116 0-89.992-26.57-111.852-65.82-65.792 35.994-145.952 59.246-233.28 64.608l76.382 171.526 146.194-42.2c13.152-37.342 48.718-64.114 90.556-64.114 53.020 0 96 42.98 96 96s-42.98 96-96 96c-36.56 0-68.342-20.442-84.554-50.514l-162.906 47.024c-18.224 5.258-37.538-3.722-45.252-21.052l-103.77-233.026c-85.138-5.996-163.262-29.022-227.636-64.236-21.864 39.25-63.766 65.804-111.882 65.804-70.692 0-128-57.308-128-128 0-52.312 31.402-97.254 76.372-117.102-8.070-24.028-12.372-49.104-12.372-74.898 0-176.73 200.576-320 448-320 247.422 0 448 143.27 448 320 0 25.792-4.3 50.862-12.368 74.886 44.97 19.85 76.368 64.802 76.368 117.114zM864 772c19.882 0 36-16.118 36-36s-16.118-36-36-36-36 16.118-36 36 16.118 36 36 36zM64 448c0 35.29 28.71 64 64 64 25.508 0 47.572-15.004 57.846-36.646-33.448-25.366-61.166-54.626-81.666-86.738-23.524 9.47-40.18 32.512-40.18 59.384zM512 12c-205.45 0-372 109.242-372 244s166.55 244 372 244c205.45 0 372-109.242 372-244s-166.55-244-372-244zM919.82 388.616c-20.5 32.112-48.218 61.372-81.666 86.738 10.276 21.642 32.338 36.646 57.846 36.646 35.29 0 64-28.71 64-64 0-26.872-16.656-49.914-40.18-59.384z" />
<glyph unicode="&#xeac9;" glyph-name="linkedin" d="M928 960h-832c-52.8 0-96-43.2-96-96v-832c0-52.8 43.2-96 96-96h832c52.8 0 96 43.2 96 96v832c0 52.8-43.2 96-96 96zM384 128h-128v448h128v-448zM320 640c-35.4 0-64 28.6-64 64s28.6 64 64 64c35.4 0 64-28.6 64-64s-28.6-64-64-64zM832 128h-128v256c0 35.4-28.6 64-64 64s-64-28.6-64-64v-256h-128v448h128v-79.4c26.4 36.2 66.8 79.4 112 79.4 79.6 0 144-71.6 144-160v-288z" />
<glyph unicode="&#xeae7;" glyph-name="git" d="M1004.692 493.606l-447.096 447.080c-25.738 25.754-67.496 25.754-93.268 0l-103.882-103.876 78.17-78.17c12.532 5.996 26.564 9.36 41.384 9.36 53.020 0 96-42.98 96-96 0-14.82-3.364-28.854-9.362-41.386l127.976-127.974c12.532 5.996 26.566 9.36 41.386 9.36 53.020 0 96-42.98 96-96s-42.98-96-96-96-96 42.98-96 96c0 14.82 3.364 28.854 9.362 41.386l-127.976 127.974c-3.042-1.456-6.176-2.742-9.384-3.876v-266.968c37.282-13.182 64-48.718 64-90.516 0-53.020-42.98-96-96-96s-96 42.98-96 96c0 41.796 26.718 77.334 64 90.516v266.968c-37.282 13.18-64 48.72-64 90.516 0 14.82 3.364 28.852 9.36 41.384l-78.17 78.17-295.892-295.876c-25.75-25.776-25.75-67.534 0-93.288l447.12-447.080c25.738-25.75 67.484-25.75 93.268 0l445.006 445.006c25.758 25.762 25.758 67.54-0.002 93.29z" />
</font></defs></svg>

Before  |  Size: 21 KiB

Binary files not shown (four images removed; previous sizes: 28 KiB, 30 KiB, 94 KiB, 9.4 KiB).
View File

@ -1,56 +0,0 @@
each(range(5), {
.gap-@{value}rem {
gap: @value * 1rem;
}
});
.flex {
display: flex;
}
.flex-inline {
display: inline-flex;
}
.flex-inline-col {
.flex-inline();
flex-direction: column;
}
.flex-col {
.flex();
flex-direction: column;
}
.flex-row {
.flex();
flex-direction: row;
}
@flex-justifications-prefixed: flex-start, flex-end;
each(@flex-justifications-prefixed, {
.@{value} {
.flex();
justify-content: @value;
}
});
@flex-justifications: center, space-between, space-around, space-evenly;
each(@flex-justifications, {
.flex-@{value} {
.flex();
justify-content: @value;
}
});
.rounded-corners {
border-radius: 0.3rem;
}
.center {
margin: 0 auto;
}
.card-width {
max-width: 35rem;
}

View File

@ -1,112 +0,0 @@
@font-face {
font-family: "phunic";
src: url("/fonts/phunic.eot");
src: url("/fonts/phunic.eot#iefix") format("embedded-opentype"),
url("/fonts/phunic.ttf") format("truetype"),
url("/fonts/phunic.woff") format("woff"),
url("/fonts/phunic.svg#phunic") format("svg");
font-weight: normal;
font-style: normal;
font-display: block;
}
i.icon {
/* use !important to prevent issues with browser extensions that change fonts */
font-family: "phunic" !important;
speak: never;
font-style: normal;
font-weight: normal;
font-variant: normal;
text-transform: none;
line-height: 1;
/* Better Font Rendering =========== */
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
&::before {
width: 1.4rem;
display: inline-block;
align-content: center;
text-align: center;
}
}
.phunic-envelope:before {
content: "\e901";
}
.phunic-discord:before {
content: "\e910";
}
.phunic-writefreely:before {
content: "\e911";
}
.phunic-mastodon:before {
content: "\e907";
}
.phunic-link:before {
content: "\e909";
}
.phunic-star:before {
content: "\e900";
}
.phunic-share:before {
content: "\e904";
}
.phunic-terminal:before {
content: "\e905";
}
.phunic-at:before {
content: "\e906";
}
.phunic-conlang:before {
content: "\e908";
}
.phunic-code:before {
content: "\e90a";
}
.phunic-fork:before {
content: "\e90b";
}
.phunic-house:before {
content: "\e90c";
}
.phunic-language:before {
content: "\e90d";
}
.phunic-mic-lines:before {
content: "\e90e";
}
.phunic-question:before {
content: "\e90f";
}
.phunic-emacs:before {
content: "\e902";
}
.phunic-gitea:before {
content: "\e903";
}
.phunic-twitter:before {
content: "\ea96";
}
.phunic-rss:before {
content: "\ea9b";
}
.phunic-youtube:before {
content: "\ea9d";
}
.phunic-twitch:before {
content: "\ea9f";
}
.phunic-github:before {
content: "\eab0";
}
.phunic-reddit:before {
content: "\eac6";
}
.phunic-linkedin:before {
content: "\eac9";
}
.phunic-git:before {
content: "\eae7";
}

View File

@ -1,174 +0,0 @@
/*
* Nord Theme:
* - Copyright (c) 2016-present Arctic Ice Studio <development@arcticicestudio.com>
* - Copyright (c) 2016-present Sven Greb <development@svengreb.de>
*/
:root {
--nord0: #2e3440;
--nord1: #3b4252;
--nord2: #434c5e;
--nord3: #4c566a;
--nord4: #d8dee9;
--nord5: #e5e9f0;
--nord6: #eceff4;
--nord7: #8fbcbb;
--nord8: #88c0d0;
--nord9: #81a1c1;
--nord10: #5e81ac;
--nord11: #bf616a;
--nord12: #d08770;
--nord13: #ebcb8b;
--nord14: #a3be8c;
--nord15: #b48ead;
scroll-behavior: smooth;
// brand colors
--c-brand: var(--nord10);
--c-brand-light: var(--nord9);
// background colors
--c-bg: var(--nord6);
--c-bg-light: var(--nord6);
--c-bg-lighter: var(--nord5);
--c-bg-dark: var(--nord5);
--c-bg-darker: var(--nord4);
--c-bg-navbar: var(--c-bg);
--c-bg-sidebar: var(--c-bg);
--c-bg-arrow: var(--nord4);
// text colors
--c-text: var(--nord1);
--c-text-accent: var(--c-brand);
--c-text-light: var(--nord2);
--c-text-lighter: var(--nord3);
--c-text-lightest: var(--nord4);
--c-text-quote: var(--nord2);
// border colors
--c-border: var(--nord4);
--c-border-dark: var(--nord4);
// custom container colors
--c-tip: var(--nord14);
--c-tip-bg: var(--c-bg);
--c-tip-title: var(--c-text);
--c-tip-text: var(--c-text);
--c-tip-text-accent: var(--c-text-accent);
--c-warning: var(--nord13);
--c-warning-bg: var(--c-bg);
--c-warning-bg-light: var(--c-bg-light);
--c-warning-bg-lighter: var(--c-bg-lighter);
--c-warning-border-dark: var(--nord3);
--c-warning-details-bg: var(--c-bg);
--c-warning-title: var(--nord12);
--c-warning-text: var(--nord12);
--c-warning-text-accent: var(--nord12);
--c-warning-text-light: var(--nord12);
--c-warning-text-quote: var(--nord12);
--c-danger: var(--nord11);
--c-danger-bg: var(--c-bg);
--c-danger-bg-light: var(--c-bg-light);
--c-danger-bg-lighter: var(--c-bg-light);
--c-danger-border-dark: var(--nord11);
--c-danger-details-bg: var(--nord2);
--c-danger-title: var(--nord11);
--c-danger-text: var(--nord11);
--c-danger-text-accent: var(--nord11);
--c-danger-text-light: var(--nord11);
--c-danger-text-quote: var(--nord11);
--c-details-bg: var(--c-bg-lighter);
// badge component colors
--c-badge-tip: var(--c-tip);
--c-badge-warning: var(--c-warning);
--c-badge-warning-text: var(--c-bg);
--c-badge-danger: var(--c-danger);
--c-badge-danger-text: var(--c-bg);
// transition vars
--t-color: 0.3s ease;
--t-transform: 0.3s ease;
// code blocks vars
--code-bg-color: var(--nord0);
--code-hl-bg-color: var(--nord1);
--code-ln-color: #9e9e9e;
--code-ln-wrapper-width: 3.5rem;
// font vars
--font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen,
Ubuntu, Cantarell, "Fira Sans", "Droid Sans", "Helvetica Neue", sans-serif;
--font-family-code: Consolas, Monaco, "Andale Mono", "Ubuntu Mono", monospace;
// layout vars
--navbar-height: 3.6rem;
--navbar-padding-v: 0.7rem;
--navbar-padding-h: 1.5rem;
--sidebar-width: 20rem;
--sidebar-width-mobile: calc(var(--sidebar-width) * 0.82);
--content-width: 740px;
--homepage-width: 960px;
}
html.dark {
// brand colors
--c-brand: var(--nord14);
--c-brand-light: var(--nord14);
// background colors
--c-bg: var(--nord1);
--c-bg-light: var(--nord2);
--c-bg-lighter: var(--nord2);
--c-bg-dark: var(--nord3);
--c-bg-darker: var(--nord3);
// text colors
--c-text: var(--nord4);
--c-text-light: var(--nord5);
--c-text-lighter: var(--nord5);
--c-text-lightest: var(--nord6);
--c-text-quote: var(--c-text);
// border colors
--c-border: var(--nord3);
--c-border-dark: var(--nord3);
// custom container colors
--c-tip: var(--nord14);
--c-warning: var(--nord13);
--c-warning-bg: var(--c-bg);
--c-warning-bg-light: var(--c-bg-light);
--c-warning-bg-lighter: var(--c-bg-lighter);
--c-warning-border-dark: var(--nord3);
--c-warning-details-bg: var(--c-bg);
--c-warning-title: var(--nord13);
--c-warning-text: var(--nord13);
--c-warning-text-accent: var(--nord13);
--c-warning-text-light: var(--nord13);
--c-warning-text-quote: var(--nord13);
--c-danger: var(--nord11);
--c-danger-bg: var(--c-bg);
--c-danger-bg-light: var(--c-bg-light);
--c-danger-bg-lighter: var(--c-bg-light);
--c-danger-border-dark: var(--nord11);
--c-danger-details-bg: var(--nord2);
--c-danger-title: var(--nord11);
--c-danger-text: var(--nord11);
--c-danger-text-accent: var(--nord11);
--c-danger-text-light: var(--nord11);
--c-danger-text-quote: var(--nord11);
--c-details-bg: var(--c-bg-light);
// badge component colors
--c-badge-warning-text: var(--nord0);
--c-badge-danger-text: var(--nord0);
// code blocks vars
--code-hl-bg-color: var(--nord2);
}

View File

@ -1,56 +0,0 @@
const pages: string[] = [
'/index.md',
'/find-me.md',
'/resume.md',
'/projects.md',
'/conlanging.md',
'/vocal-synthesis.md',
'/about.md',
'/privacy.md',
];
const localePages = (languagePrefix: string) => {
return pages.map((page: string) => `/${languagePrefix}${page}`);
};
export const themeLocales = {
'/': {
selectLanguageName: 'Français',
tip: 'nota bene',
warning: 'attention',
sidebar: pages,
notFound: [
'C'est bien vide ici',
'Pourquoi sommes-nous ici?',
'Erreur 404',
'Le lien ne semble pas être correct',
],
backToHome: 'Retour accueil',
openInNewWindow: 'Ouvrir dans une nouvelle fenêtre',
toggleColorMode: 'Changer de thème',
toggleSidebar: 'Barre latérale',
lastUpdatedText: 'Dernière mise à jour',
},
'/lfn/': {
selectLanguageName: 'Elefen',
tip: 'avisa',
warning: 'averti',
danger: 'peril',
sidebar: localePages('lfn'),
notFound: [
'Ce? Se no ave no cosa asi',
'A do vade tu?',
'Era 404',
'La lia no es coreta',
],
backToHome: 'reversa a la paja prima',
openInNewWindow: 'abri en un nova fenetra',
toggleColorMode: 'cambia la colores',
toggleSidebar: 'bara ladal',
lastUpdatedText: 'Ultima refresci',
},
'/en/': {
selectLanguageName: 'English',
sidebar: localePages('en'),
},
};

View File

@ -1,106 +0,0 @@
export interface GithubRepo {
id: number;
node_id: string;
name: string;
full_name: string;
private: boolean;
owner: Owner;
html_url: string;
description: string;
fork: boolean;
url: string;
forks_url: string;
keys_url: string;
collaborators_url: string;
teams_url: string;
hooks_url: string;
issue_events_url: string;
events_url: string;
assignees_url: string;
branches_url: string;
tags_url: string;
blobs_url: string;
git_tags_url: string;
git_refs_url: string;
trees_url: string;
statuses_url: string;
languages_url: string;
stargazers_url: string;
contributors_url: string;
subscribers_url: string;
subscription_url: string;
commits_url: string;
git_commits_url: string;
comments_url: string;
issue_comment_url: string;
contents_url: string;
compare_url: string;
merges_url: string;
archive_url: string;
downloads_url: string;
issues_url: string;
pulls_url: string;
milestones_url: string;
notifications_url: string;
labels_url: string;
releases_url: string;
deployments_url: string;
created_at: string;
updated_at: string;
pushed_at: string;
git_url: string;
ssh_url: string;
clone_url: string;
svn_url: string;
homepage: string;
size: number;
stargazers_count: number;
watchers_count: number;
language: string;
has_issues: boolean;
has_projects: boolean;
has_downloads: boolean;
has_wiki: boolean;
has_pages: boolean;
forks_count: number;
mirror_url: null;
archived: boolean;
disabled: boolean;
open_issues_count: number;
license: null;
allow_forking: boolean;
is_template: boolean;
web_commit_signoff_required: boolean;
topics: any[];
visibility: string;
forks: number;
open_issues: number;
watchers: number;
default_branch: string;
}
export interface Owner {
login: string;
id: number;
node_id: string;
avatar_url: string;
gravatar_id: string;
url: string;
html_url: string;
followers_url: string;
following_url: string;
gists_url: string;
starred_url: string;
subscriptions_url: string;
organizations_url: string;
repos_url: string;
events_url: string;
received_events_url: string;
type: string;
site_admin: boolean;
}
export interface GithubError {
message: string;
documentation_url: string;
}
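// A minimal usage sketch, not part of the original file: GithubRepo and
// GithubError above match the shape returned by the GitHub REST API endpoint
// GET /repos/{owner}/{repo}. The fetchRepo helper below is hypothetical and
// only illustrates how these types might be consumed.
export async function fetchRepo(fullName: string): Promise<GithubRepo> {
  const response = await fetch(`https://api.github.com/repos/${fullName}`);
  const payload = await response.json();
  if (!response.ok) {
    // On errors, GitHub returns a body matching GithubError.
    throw new Error((payload as GithubError).message);
  }
  return payload as GithubRepo;
}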

View File

@ -1,24 +0,0 @@
#+setupfile: ./headers
#+language: fr
* À Propos
** Introduction
Ceci est le site web personnel de Lucien Cartier-Tilet, aussi connu
sous le nom de « Pundrak » ou « Phundrak ».
Il est écrit avec [[https://v2.vuepress.vuejs.org/][Vuepress]] et est entièrement open-source. Vous pouvez
trouver son code source sur [[https://labs.phundrak.com/phundrak/phundrak.com][mon instance personnelle Gitea]]. Les icônes
utilisées sur ce site proviennent de plusieurs sources différentes :
- [[https://icomoon.io][IcoMoon]], que j'utilise pour consolider toutes les icônes dans une
même fonte, y compris des icônes de leur pack par défaut,
- [[https://fontawesome.com/][FontAwesome]] d'où viennent la majorité des icônes (leur
implémentation de leur paquet pour Vue laisse à mon avis plus qu'à
désirer),
- La {{{icon(conlang)}}} [[https://conlang.org/][Société de Création de Langues]] dont j'ai modifié
le logo afin de créer l'icône pour mes langues construites,
- {{{icon(emacs)}}} [[https://www.gnu.org/software/emacs/][Emacs]] et {{{icon(writefreely)}}} [[https://writefreely.org/][WriteFreely]] dont j'ai recréé
une partie de leur logo respectif en SVG afin d'en créer une icône,
- {{{icon(gitea)}}} [[https://gitea.io/][Gitea]] dont j'ai modifié le logo en SVG pour l'avoir en
monochrome.
#+include: other-links

View File

@ -1,39 +0,0 @@
#+setupfile: ./headers
#+language: fr
* Création de langues
Les /idéolangues/, ou /langues construites/ (en anglais /conlang/), sont des
langues construites et artificielles, nées de l'esprit d'une ou
parfois quelques personnes. Elles se distinguent ainsi des /langues
naturelles/ qui sont des langues ayant naturellement évolué depuis
d'autres langues plus anciennes, comme le Français, l'Anglais, le
Mandarin, le Japonais, le bahasa ou le !xhosa (oui, le point
d'exclamation fait partie de l'orthographe du nom de la langue).
Les idéolangues peuvent avoir différents buts lors de leur création,
par exemple :
- être parlées comme des langues naturelles par des individus afin de
servir de /lingua franca/ entre plusieurs communautés linguistiques,
comme le célèbre [[https://en.wikipedia.org/wiki/Esperanto][espéranto]] ou bien la [[https://elefen.org][lingua franca nova]]
- être une langue secrète que seules quelques personnes connaissent
afin de communiquer entre eux sans que d'autres personnes puissent
comprendre, un peu comme un argot, mais plus poussé encore
- être une expérience concrète de linguistique, comme le [[https://en.wikipedia.org/wiki/Lojban][lojban]] qui
essaie d'être la langue la plus logique qui soit
- complémenter un univers littéraire, comme les langues elfiques de
Tolkien ou le klingon de Star Trek
- juste être une forme dart, comme la peinture ou la poésie
Dans mon cas, les deux dernières justifications sont celles qui me
poussent à créer de nouvelles langues. Mes deux projets principaux
actuellement sont le [[https://conlang.phundrak.com/proto-nyqy][proto-ñyqy]] et l'[[https://conlang.phundrak.com/eittlandic][éittlandais]]. La première est une
langue racine qui me permettra de développer toute une famille de
langues dans mon univers littéraire, tandis que la seconde s'inscrit
dans un exercice créatif de création d'un pays fictif présent dans
notre monde.
Plus d'informations peuvent être trouvées sur [[https://conlang.phundrak.com/][mon site
d'idéolinguistique]] (en anglais)
#+include: other-links

View File

@ -1,22 +0,0 @@
#+setupfile: ../headers
#+language: en
* About
** Introduction
This is the personal website of Lucien “Phundrak” Cartier-Tilet.
It is written with [[https://v2.vuepress.vuejs.org/][Vuepress]] and is completely open-source. You can
find the source code on my [[https://labs.phundrak.com/phundrak/phundrak.com][personal Gitea instance]]. Icons used on this
website come from different sources:
- [[https://icomoon.io/][IcoMoon]] which I use to consolidate all the icons used in one font,
including some icons from their default pack
- [[https://fontawesome.com/][FontAwesome]] from which most icons come --- their Vue package
is, in my opinion, really not usable
- The {{{icon(conlang)}}} [[https://conlang.org/][Language Creation Society]] whose logo I modified in
order to create the icon used here when referring to my constructed
languages
- {{{icon(emacs)}}} [[https://www.gnu.org/software/emacs/][Emacs]] and {{{icon(writefreely)}}} [[https://writefreely.org/][WriteFreely]] whose respective
logo I partially remade as an SVG file in order to create an icon.
- {{{icon(gitea)}}} [[https://gitea.io][Gitea]] whose logo I modified to be monochromatic
#+include: other-links

View File

@ -1,28 +0,0 @@
#+setupfile: ../headers
#+language: en
* Conlanging
/Conlangs/, short for /constructed languages/, are artificial
languages born out of the mind of a single individual (sometimes a
couple of them), unlike natural languages born through countless
iterations by their native speakers, slowly evolving over time like
English, French, Mandarin, Japanese, Bahasa, or !Xhosa did.
They can serve various goals for their creators:
- be spoken by as many people as possible as a neutral language, like
[[https://en.wikipedia.org/wiki/Esperanto][Esperanto]] and [[https://elefen.org][Lingua Franca Nova]]
- be a secret language between a couple of people
- as a thought experiment, like [[https://en.wikipedia.org/wiki/Lojban][Lojban]]
- fill a literary universe, like Tolkien's elvish languages or Star
Trek's Klingon
- for the sake of art itself
In my case, the last two reasons are the main ones driving me to
create languages. My two main projects at the time of writing this
page are [[https://conlang.phundrak.com/proto-nyqy][Proto-Ñyqy]] and [[https://conlang.phundrak.com/eittlandic][Eittlandic]]. Both are accompanied by their own
worldbuilding project, although Proto-Ñyqy's worldbuilding is still
largely secret while Eittland's worldbuilding is mostly public.
More information can be found on my [[https://conlang.phundrak.com/][conlanging website]].
#+include: other-links

View File

@ -1,31 +0,0 @@
#+setupfile: ../headers
#+language: en
* Where to find me
I am on various websites and some social networks where you can follow
me.
** Social Networks
- {{{icon(mastodon)}}} *Mastodon* :: [[https://mastodon.phundrak.com/@phundrak][@phundrak@mastodon.phundrak.com]]
- {{{icon(twitter)}}} *Twitter* :: [[https://twitter.com/phundrak][@phundrak]], though I hardly use it anymore
and mostly reshare my Mastodon messages when I think to, and
sometimes they get truncated
- {{{icon(writefreely)}}} *Writefreely* ::
- [[https://write.phundrak.com/phundrak][@phundrak@write.phundrak.com]] :: alternative blog
- [[https://write.phundrak.com/phundraks-short-stories][@phundraks-short-stories@write.phundrak.com]] :: short stories,
mainly in French for now
- {{{icon(discord)}}} *Discord* :: =@phundrak= (tell me you come from here,
otherwise there's a chance I'll consider your message as spam)
** Other Websites
- {{{icon(envelope)}}} *Email* :: [[mailto:lucien@phundrak.com][lucien@phundrak.com]]
- {{{icon(rss)}}} *Blog* :: [[https://blog.phundrak.com][blog.phundrak.com]]
- {{{icon(gitea)}}} *Gitea* :: [[https://labs.phundrak.com/phundrak][@phundrak@labs.phundrak.com]]
- {{{icon(github)}}} *GitHub* :: [[https://github.com/Phundrak][Phundrak]]
- {{{icon(youtube)}}} *YouTube* :: [[https://www.youtube.com/@phundrak][@phundrak]]
- {{{icon(reddit)}}} *Reddit* :: [[https://www.reddit.com/user/phundrak][/u/phundrak]]
- {{{icon(linkedin)}}} *LinkedIn* :: [[https://www.linkedin.com/in/lucien-cartier-tilet/][Lucien Cartier-Tilet]]
- {{{icon(twitch)}}} *Twitch* :: [[https://www.twitch.tv/phundrak][phundrak]]
#+include: other-links

View File

@ -1,29 +0,0 @@
#+setupfile: ../headers
#+language: en
* Home
Hi, I'm Lucien Cartier-Tilet, a consultant working at [[https://aubay.com][Aubay]].
I studied for my Master's 2 degree in THYP (in French: /Technologies de
l'Hypermédia/, in English: /Hypermedia Technologies/) at the Université
Vincennes Saint-Denis (Paris 8).
I worked at VoxWave from 2012 to 2018 as its co-founder and CTO.
During that time, I developed French singing vocal libraries for vocal
synthesizers, known as ALYS and LEORA.
I'm a free software enthusiast, using GNU/Linux since 2008 and Emacs
since 2016.
I spend my free time on my personal programming projects as well as on
my constructed worlds and languages. I also like to go climbing and
hiking whenever I have the opportunity.
I speak French natively, and English at a native level. I also speak
some Japanese, [[https://elefen.org][Lingua Franca Nova]], and Norwegian Bokmål.
#+begin_export html
This website is also available on Gemini as [gmi.phundrak.com/en](gemini://gmi.phundrak.com/en)!
#+end_export
#+include: other-links

View File

@ -1,190 +0,0 @@
#+setupfile: ../headers
#+language: en
* BSUP01 Keine Tashi
** Introduction
KEINE Tashi is a character and set of vocal libraries developed for
the shareware [[http://utau2008.web.fc2.com/][UTAU]], a singing voice synthesizer. I developed KEINE
Tashi over the course of several years, from 2012 to 2015. Three vocal
libraries have been released to the public, the most used one being
his *JPN Extend Power* one. On March 10th, 2017, I announced I would
cease any kind of activity related to UTAU.
#+begin_export html
<blockquote class="twitter-tweet" data-dnt="true" data-theme="dark"><p
lang="en" dir="ltr">Id like to also announce that from now on I am
dropping my previous UTAU projects other than covers and wont develop
any new UTAU library</p>— Pundrak (@Phundrak) <a
href="https://twitter.com/Phundrak/status/840174634377105408?ref_src=twsrc%5Etfw">March
10, 2017</a></blockquote> <component is="script" async
src="https://platform.twitter.com/widgets.js"
charset="utf-8"></component>
#+end_export
** Character and vocal libraries
Here's a copy and paste of some old pages describing KEINE Tashi:
*** Presentation
#+begin_export html
<ResponsiveImage
src="https://cdn.phundrak.com/img/UTAU/KEINE_Tashi_1024.webp"
:width="1024"
preview="https://cdn.phundrak.com/img/UTAU/KEINE_Tashi_512.webp"
:previewWidth="512">
Illustration of KEINE Tashi by Umi
</ResponsiveImage>
#+end_export
- Codename :: BSUP01 恵音བཀྲ་ཤིས་ KEINE Tashi
- First name :: Tashi (བཀྲ་ཤིས་), Tibetan name meaning “auspicious”
- Last name :: Keine (恵音), Japanese name meaning “Blessing sound”.
It reads as “keine”, although its regular reading should be
“megumine”.
- Model :: BSUP (Bödkay Shetang UTAU Project)
- Number :: 01
- Gender :: male
- Birthday (lore) :: June 28th, 1991
- Birthday (first release) :: October 14th, 2012
- Weight :: 154 lb / 70 kg
- Height :: 6′0″ / 182 cm (very tall for a Tibetan)
- Hair color :: black
- Eyes color :: brown~black
- Appearance :: Tashi wears a modernized Tibetan suit from the Amdo
Region (Chinese: 安多 Ānduō), colored in blue. He also wears some
turquoise jewellery.
- Favorite food :: meat momo (Tibetan ravioli)
- Character item :: a Tibetan manuscript
- Voice and creator :: [[https://phundrak.com][Phundrak]] (me)
- Likes :: to meditate, calligraphy, old books, manuscripts (is that
a self-insert?)
- Dislikes :: selfishness, lies, arrogance
- Personality :: Tashi is somebody very calm and sweet. He really enjoys
old books and manuscripts, and he LOVES meditating! He's never hungry,
so he can stay meditating for 2~3 days, just like that,
until he realizes that he should eat something. And he always keeps
quiet; it's really hard to make him angry.
But when he is, his anger becomes wrath. Anyone who experienced it
can attest to how complex and difficult it is to calm him down.
Strangely enough, shortly after being confronted by Tashi, the
victims of this wrath see their quality of life greatly improve.
Maybe these people needed to hear some truths they refused to face
before?
*** Vocal libraries
**** JPN VCV
- Download link ::
| Extension | Size | Link |
|-----------+----------+------|
| 7z | 25.7 MiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_VCV.7z][DL]] |
| tar.xz | 32.5 MiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_VCV.tar.xz][DL]] |
| zip | 38.0 MiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_VCV.zip][DL]] |
- File size :: 60.7 MB
- Total uncompressed size :: 94.4 MB
- Number of voice phonemes :: 1264 (253 audio files)
- Average frequency :: G#2
- Vocal range :: C2~D3
- FRQ file presence :: partial
- Release date :: October, 14th 2012
- Phoneme encoding :: Romaji with hiragana and CV romaji aliases
- Supported languages :: Japanese
- oto.ini :: Tuned myself
- Recommended engines :: TIPS, VS4U
**** JPN Extend Power
- Download link ::
| Extension | Size | Link |
|-----------+--------+------|
| 7z | 1.1 GiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_Extend_Power.7z][DL]] |
| tar.xz | 1.1 GiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_Extend_Power.tar.xz][DL]] |
| zip | 1.2 GiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_Extend_Power.zip][DL]] |
- File size :: 114 MB
- Total uncompressed size :: 155 MB
- Number of voice phonemes :: 3020 (546 audio files)
- Average frequency :: C3
- Vocal range :: B1~D4
- FRQ file presence :: partial
- Release date :: June 28th, 2013
- Phoneme encoding :: Romaji (hiragana aliases)
- Supported languages :: Japanese
- oto.ini :: Tuned myself
- Recommended engines :: VS4U, world4utau
**** JPN Extend Youth
- Download link ::
| Extension | Size | Link |
|-----------+----------+------|
| 7z | 237.7 MiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_Extend_Youth.7z][DL]] |
| tar.xz | 243.5 MiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_Extend_Youth.tar.xz][DL]] |
| zip | 268.7 MiB | [[https://cdn.phundrak.com/files/KeineTashi/BSUP01_KEINE_Tashi_JPN_Extend_Youth.zip][DL]] |
- File size :: 36.9 MB
- Total uncompressed size :: 42.0 MB
- Number of voice phonemes :: 1954 (182 audio files)
- Average frequency :: C4
- Vocal range :: F#3~A#4
- FRQ file presence :: partial
- Release date :: June 28th, 2013
- Phoneme encoding :: Romaji (hiragana aliases, romaji added with the
oto.ini update)
- Supported languages :: Japanese
- oto.ini :: Tuned myself
- Recommended engines :: fresamp, VS4U, world4utau
**** JPN Extend Native
- Status :: abandoned
**** TIB CVVC
- Status :: abandoned
**** ENG
#+begin_export html
<ResponsiveImage
src="https://cdn.phundrak.com/img/UTAU/KEINE_Tashi_EN_673.webp"
:width="673"
preview="https://cdn.phundrak.com/img/UTAU/KEINE_Tashi_EN_246.webp"
:previewWidth="300">
Illustration of KEINE Tashi EN
</ResponsiveImage>
#+end_export
- Status :: abandoned
** Usage clause and license
KEINE Tashi is released under the [[https://creativecommons.org/licenses/by-nc-sa/4.0/][CC BY-NC-SA 4.0 license]], meaning you
are free to:
- use :: make use of the vocal libraries in UTAU or any other singing
vocal synthesizer software.
- adapt :: remix, transform, and build upon the material
- share :: copy and redistribute the material in any medium or format
my work, under the following conditions:
- Attribution :: You must give appropriate credit, provide a link to
the license, and indicate if changes were made. You may do so in any
reasonable manner, but not in any way that suggests the licensor
endorses you or your use.
- NonCommercial :: You may not use the material for commercial
purposes.
- ShareAlike :: If you remix, transform, or build upon the material,
you must distribute your contributions under the same license as the
original.
Although I cannot add anything to this legal notice, I would also
appreciate it if you followed these rules of thumb regarding this character:
any religious use of this character and its vocal libraries is
forbidden, except for folk music, and Buddhist and Bön songs. However,
due to the current controversy, any song linked to His Holiness the
Gyalwa Karmapa is strictly forbidden until said controversy has been
officially resolved. This is also applicable to His Holiness the Dalai
Lama, the Venerable Shamar Rinpoche, and Tai Situ Rinpoche. If you
have any questions or if you are unsure, please email me.
#+include: other-links

View File

@ -1,14 +0,0 @@
# -*- mode: org -*-
#+begin_export gmi
# Other Web Pages
=> ./index.gmi Home
=> ./find-me.gmi Where to find me
=> ./resume.gmi Resume
=> ./projets.gmi Programming Projects
=> ./conlanging.gmi Conlanging
=> ./vocal-synthesis.gmi Vocal Synthesis
=> ./about.gmi About
=> ./privacy.gmi Privacy
#+end_export

View File

@ -1,76 +0,0 @@
#+setupfile: ../headers
#+language: en
* Privacy
** Where is the website hosted?
This website is hosted on my private physical server, located in the
town of Bron in France, near Lyon. All of my websites are also hosted
on this server, except for [[https://labs.phundrak.com][=labs.phundrak.com=]] and =mail.phundrak.com=
which are hosted on servers rented from Scaleway and OVH France
respectively. These servers are also located in France.
** Cookies
*** What are cookies?
Cookies are small files a website saves on your computer or mobile
phone when you visit it. Although not all sites make use of
them, they are nevertheless extremely common in order to allow
websites to function properly or more efficiently.
This website uses some functional cookies in order to remember your
preferences, such as your preferred language or colour theme.
These cookies are not and cannot be used to track you.
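As a purely illustrative sketch (not a description of this site's actual
code), such a functional preference cookie could be written and read from the
browser along these lines; the =theme= cookie name is an assumed example:
#+begin_src typescript
// Illustrative only: store and read a colour-theme preference in a
// first-party cookie, with no identifying information. The cookie name
// "theme" is an assumption, not necessarily what this site uses.
function savePreferredTheme(theme: 'light' | 'dark'): void {
  const oneYear = 60 * 60 * 24 * 365; // expiry in seconds
  document.cookie = `theme=${theme}; max-age=${oneYear}; path=/; SameSite=Lax`;
}

function readPreferredTheme(): string | undefined {
  return document.cookie
    .split('; ')
    .find((entry) => entry.startsWith('theme='))
    ?.split('=')[1];
}
#+end_src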
However, as this site is protected by Cloudflare, they may also set
some cookies to remember, for example, that your browser is safe or to
record traffic to the site.
*** How can I control cookies on my computer?
If you don't want Cloudflare to record your browsing activity on my
website, a good ad blocker should do the trick. I personally recommend
[[https://ublockorigin.com/][uBlock Origin]], one of the most effective ad blockers I know of, if not
the most effective one.
You can also manually delete cookies from your browser, but given the
number of browsers out there, it might be quicker for you to look up
how to do this for your current browser on DuckDuckGo, Qwant, or
Startpage (if you're worried about cookie usage, I guess you'll want to
avoid Google).
*** What about other methods of tracking users?
There are other more subtle methods of tracking someone on the
internet, or even via emails or any web content rendered on the
screen, such as web beacons (minuscule, invisible images). It is also
possible to store Flash cookies or local shared objects.
*** But is there any tracking at all on this website?
Well, there is, but it absolutely respects your privacy. I use my own
instance of [[https://umami.is][Umami]] which is an analytics service that is fully GDPR and
CCPA compliant. In short, when you visit a web page, some data get
sent to my service, but nothing that can identify you. If you come
back an hour later, I won't have any indication that you are the same
person.
If you still worry about your privacy, you have two options:
- Activate the Do Not Track setting of your browser (which Umami will
honour)
- Block the domain =umami.phundrak.com= in uBlock Origin (the only
ad blocker I will ever trust)
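For illustration only (this is not the site's actual code), an analytics
script that honours the Do Not Track setting mentioned above could be loaded
along these lines; the script path and =data-website-id= attribute follow
Umami's usual tag but should be treated as assumptions, and the website id is
a placeholder:
#+begin_src typescript
// Illustrative sketch: only inject the analytics script when the visitor
// has not enabled Do Not Track.
function loadAnalytics(): void {
  const dnt = (navigator as Navigator & { doNotTrack?: string }).doNotTrack;
  if (dnt === '1') {
    return; // honour the browser's Do Not Track setting
  }
  const script = document.createElement('script');
  script.async = true;
  script.src = 'https://umami.phundrak.com/script.js'; // assumed path
  script.dataset.websiteId = '00000000-0000-0000-0000-000000000000'; // placeholder
  document.head.appendChild(script);
}
#+end_src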
** Is there targeted advertisement on this website?
There's no advertisement to begin with, and there never will be. If you see
any, check your computer and browser for viruses; that is not normal. If
it indeed comes from my website, it means it has been hacked. If you
can see in this website's repository that I myself added ads, it means
that I have either lost my morals, or that I have been kidnapped and this
is a cry for help.
** How often is this page updated?
It is updated from time to time to reflect any changes in how my
website behaves, or if I notice errors on this page (such as typos).
** I have other questions
And I have the answers! I'll be more than happy to chat with you by
email, feel free to send me one at [[mailto:lucien@phundrak.com][lucien@phundrak.com]].
#+include: other-links

View File

@ -1,43 +0,0 @@
#+setupfile: ../headers
#+language: en
* Programming Projects
** Pinned GitHub Projects
#+begin_export gemini
Unfortunately, this content is not available on Gemini. I'm working on it.
#+end_export
#+begin_export html
<ClientOnly>
<ListRepositories>
<GithubRepository repoName="rejeep/f.el" />
<GithubRepository repoName="Phundrak/eshell-info-banner.el" />
<GithubRepository repoName="Phundrak/dotfiles" />
<GithubRepository repoName="Phundrak/conlang.phundrak.com" />
</ListRepositories>
</ClientOnly>
#+end_export
** Most Starred Projects on GitHub
#+begin_export gemini
Unfortunately, this content is not available on Gemini. I'm working on it.
#+end_export
#+begin_export html
<ClientOnly>
<ListRepositories sortBy='stars' user='phundrak' :limit='5' />
</ClientOnly>
#+end_export
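The =ListRepositories= component itself is not shown in this diff; as a rough
sketch under that caveat, a =sortBy='stars'= / =:limit='5'= pair like the one
above could boil down to something like the following, using the repository
fields from the GitHub API types removed earlier in this diff. The =RepoLike=
interface and =sortRepositories= helper are hypothetical names.
#+begin_src typescript
// Hypothetical sketch of what a sortBy/limit pair could do. RepoLike only
// declares the two fields the sort needs, mirroring GithubRepo.
interface RepoLike {
  stargazers_count: number;
  pushed_at: string; // ISO 8601 timestamp
}

function sortRepositories<T extends RepoLike>(
  repos: T[],
  sortBy: 'stars' | 'pushed_at',
  limit: number,
): T[] {
  const sorted = [...repos].sort((a, b) =>
    sortBy === 'stars'
      ? b.stargazers_count - a.stargazers_count
      : Date.parse(b.pushed_at) - Date.parse(a.pushed_at),
  );
  return sorted.slice(0, limit);
}
#+end_src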
** Latest Active Repositories on GitHub
#+begin_export gemini
Unfortunately, this content is not available on Gemini. I'm working on it.
#+end_export
#+begin_export html
<ClientOnly>
<ListRepositories sortBy='pushed_at' user='phundrak' :limit='5' />
</ClientOnly>
#+end_export
#+include: other-links

View File

@ -1,84 +0,0 @@
#+setupfile: ../headers
#+language: en
* Resume
** Professional Experience
*** Aubay (2023 - )
- Consultant since September 2023
- Internship from early February to early August 2023
- Web app development
- Usage of Angular, Java Spring Boot, Spring Batch, and PostgreSQL
*** VoxWave (2014 - 2018)
Startup specialized in the creation of French virtual singers using
vocal synthesis. Its best known product is ALYS. [[./vocal-synthesis.md][More here]].
- Co-founder, CTO
- Development of singing synthesis vocal libraries
- Linguistic research
- User support
- Training of new recruits in vocal library development
** Education
*** 2nd Year Master's Degree (University of Paris 8)
Year repeated due to health issues with no long-lasting consequences.
*** 1st Year Master's Degree (University of Paris 8)
*** Computer Science Bachelor Degree (University of Paris 8)
*** English Literature (University of Lyon 2)
Studied for a year and a half until the creation of [[./resume.md#voxwave-2014-2018][VoxWave]].
** Web Programming
*** Front-end
- Professional use of Angular and TypeScript
- Personal use of Vue (including Nuxt)
*** Back-end
- Professional use of Java Spring Boot and Spring Batch
- Professional and personal use of PostgreSQL
- Personal use of Rust ([[https://github.com/poem-web/poem/][poem]], [[https://actix.rs/][actix-web]] and [[https://rocket.rs/][Rocket]])
- Some experience in back-end development with Django (Python)
- Personal use of MySQL and SQLite
** System Programming
- Frequent usage of Rust, C, Emacs Lisp, and UNIX shells (bash, fish, Eshell)
- Occasional use of C++, Python, and Common Lisp
** Development Tools
*** IDEs and Text Editors
- Professional use of VS Code, Eclipse, and Git
- Advanced user of Emacs, including its LSP and Git integrations
- Basic knowledge of Vim, CLion, PyCharm, and WebStorm
*** CI/CD and Deploying to the Web
- Experienced with web servers such as Nginx and Caddyserver
- Good knowledge of virtualization and deployment: Docker and Docker
Compose for virtualization, Drone.io and GitHub Actions for
deployment.
** Operating Systems
- Usage and administration of Linux (Arch Linux, Void Linux, Debian,
Ubuntu, Alpine Linux, NixOS)
- Administration of web servers and storage servers (Arch Linux,
Debian, Raspbian, Alpine Linux, NixOS)
- Basic knowledge of Guix System and Windows XP through 10 (except
Vista)
** Office Applications
- Good knowledge of [[https://orgmode.org/][org-mode]] (my main tool) and LaTeX
- I know my way around LibreOffice, Microsoft Office, OnlyOffice, and
WPS Office
** Audio
*** Singing Vocal Synthesis
- Development and creation of vocal libraries for VOCALOID3,
Alter/Ego, Chipspeech, and UTAU
- Usage of VOCALOID 2 through 4, Alter/Ego, Chipspeech, UTAU, CeVIO
Creative Studio
*** Audio Engineering
- Music writing and mixing software: FL Studio
- Audio repair and cleaning: iZotope RX
- Mastering: T-RackS CS
#+include: other-links

View File

@ -1,58 +0,0 @@
#+setupfile: ../headers
#+language: en
#+begin_export html
---
title: Vocal Synthesis
---
#+end_export
* My works in vocal synthesis
From 2011 to 2018, I worked as an amateur and then a professional in singing
vocal synthesis. More precisely, I was creating vocal libraries used
by various synthesizers, mainly UTAU and Alter/Ego.
** UTAU
I first began working with UTAU towards the end of 2011 on an unnamed and
since-deleted Japanese vocal library. While I didn't maintain it for long,
mainly due to its bad recording quality (I recorded it with a low-end
desktop microphone) and configuration, it did teach me the basics of
creating vocal libraries and working with audio files.
On October 14th, 2012, I released my second vocal library, named
/BSUP01 KEINE Tashi JPN VCV/, which was of higher quality due to the
recording equipment, manner of recording, and configuration, though
still relatively average for the time. My best work with this series
of vocal libraries was /BSUP01 KEINE Tashi JPN Extend Power/, a
high-energy voice made in similar circumstances but with yet again
better know-how.
This series of vocal libraries also featured /BSUP01 KEINE Tashi TIB
CVVC/ and /BSUP02 Drolma TIB/, the first two Tibetan vocal libraries for
singing vocal synthesis worldwide.
While working at VoxWave, I later created in UTAU /ALYS 001 JPN/, /ALYS
001 FRA/, and /ALYS 002 FRA/, known collectively as /ALYS4UTAU/, as
prototypes for our upcoming product.
While all these vocal libraries have been discontinued, vocal
libraries for /BSUP01 KEINE Tashi/ and /ALYS/ are available for download.
Please refer to the following pages:
- BSUP01 KEINE Tashi :: [[file:./keine-tashi.org][BSUP01 KEINE Tashi]]
- ALYS :: [[https://labs.phundrak.com/ALYS/ALYS][ALYS for Alter/Ego download]]
** Alter/Ego
[[https://www.plogue.com/products/alter-ego.html][Alter/Ego]] is a singing vocal synthesis engine made by [[https://www.plogue.com/][Plogue Inc.]].
ALYS was its first commercial vocal library as well as the first
professional singing vocal library available in French.
Due to the architecture and behaviour of Alter/Ego, important changes
had to be made to the recording script for ALYS (later re-used for
LEORA). Including the development of the new recording scripts, the
initial development period for ALYS spanned well over a year, with
some additional eight to nine months for its first major update.
ALYS for Alter/Ego, also known as /ALYS4AE/, is available free of charge
as a module for Alter/Ego.
#+include: other-links

View File

@ -1,31 +0,0 @@
#+setupfile: ./headers
#+language: fr
* Où me trouver
Je suis présent sur différentes plateformes et quelques réseaux
sociaux où vous pouvez me suivre.
** Réseaux sociaux
- {{{icon(mastodon)}}} *Mastodon* :: [[https://mastodon.phundrak.com/@phundrak][@phundrak@mastodon.phundrak.com]]
- {{{icon(twitter)}}} *Twitter* :: [[https://twitter.com/phundrak][@phundrak]], cependant je n'y suis plus très
actif et j'y repartage principalement mes messages Mastodon qui
parfois se font tronquer
- {{{icon(writefreely)}}} *Writefreely* ::
- [[https://write.phundrak.com/phundrak][*@phundrak@write.phundrak.com*]] : billets personnels
- [[https://write.phundrak.com/phundraks-short-stories][*@phundraks-short-stories@write.phundrak.com*]] : histoires courtes
- {{{icon(discord)}}} *Discord* :: =@phundrak= (dites-moi que vous venez
d'ici, autrement il est possible que je considère le message comme
du pourriel)
** Autres plateformes
- {{{icon(envelope)}}} *Courriel* :: [[mailto:lucien@phundrak.com][lucien@phundrak.com]]
- {{{icon(rss)}}} *Blog* :: [[https://blog.phundrak.com][blog.phundrak.com]]
- {{{icon(gitea)}}} *Gitea* :: [[https://labs.phundrak.com/phundrak][@phundrak@labs.phundrak.com]]
- {{{icon(github)}}} *GitHub* :: [[https://github.com/Phundrak][Phundrak]]
- {{{icon(youtube)}}} *YouTube* :: [[https://www.youtube.com/@phundrak][@phundrak]]
- {{{icon(reddit)}}} *Reddit* :: [[https://www.reddit.com/user/phundrak][/u/phundrak]]
- {{{icon(linkedin)}}} *LinkedIn* :: [[https://www.linkedin.com/in/lucien-cartier-tilet/][Lucien Cartier-Tilet]]
- {{{icon(twitch)}}} *Twitch* :: [[https://www.twitch.tv/phundrak][phundrak]]
#+include: other-links

Some files were not shown because too many files have changed in this diff.