Compare commits
39 commits: main...feature/ar

SHA1:
4f7a12547e
1eb4a508c1
3f13c00f12
33375c9221
c40d7d056e
9f0e1f8b61
9601e179bc
40c14ece94
99b92fe4ce
1b5f85627c
e9afa065b5
bf46bb5beb
1c7755f455
6ed633f84b
e941353b64
59872f579a
d93797172a
38efae02d4
b9324cf11c
72b52463a2
eea56dd50b
e8f2cc4c50
ac42d7df78
87bb05fdd5
3d8f794cab
2db6bc5047
57d7173485
8688efc5af
d6dab5fb1f
7a6e7d0748
57f3ccbaeb
e202eeb9d2
e952973f7f
3b91c98d5e
a882c9ecff
69394c813d
0b2ea19ebc
f059e75acd
d4ce94ddf8
522 changed files with 10693 additions and 27892 deletions
.claude/skills/zammad-compat/SKILL.md (new file, 197 lines)

@@ -0,0 +1,197 @@
---
name: zammad-compat
description: Check upstream Zammad for breaking changes before upgrading the addon
disable-model-invocation: true
argument-hint: "[target-version]"
allowed-tools: Bash(git clone *), Bash(git -C /tmp/zammad-upstream *)
---

# Zammad Upstream Compatibility Check

Check the upstream zammad/zammad repository for changes that could break or require updates to our Zammad addon (`packages/zammad-addon-link`).

## Arguments

- `$ARGUMENTS` - Optional: target Zammad version/tag/branch to compare against (e.g. `6.6.0`, `stable`). If not provided, ask the user what version to compare against. The current version is in `docker/zammad/Dockerfile` as the `ZAMMAD_VERSION` ARG.

## Setup

1. Read the current Zammad version from `docker/zammad/Dockerfile` (the `ARG ZAMMAD_VERSION=` line).
2. Clone or update the upstream Zammad repository:
   - If `/tmp/zammad-upstream` does not exist, clone it: `git clone --bare https://github.com/zammad/zammad.git /tmp/zammad-upstream`
   - If it exists, update it: `git -C /tmp/zammad-upstream fetch --all --tags`
3. Determine the version range. The current version is the `ZAMMAD_VERSION` from step 1. The target version is the argument or user-provided version. Both versions should be used as git refs (tags are typically in the format `X.Y.Z`). A shell sketch of steps 1-3 follows.
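
For reference, a minimal shell sketch of the setup steps, assuming the command runs from the repository root and that both versions resolve as refs in the bare clone:

```bash
# Read the pinned version from the Dockerfile
CURRENT_VERSION=$(sed -n 's/^ARG ZAMMAD_VERSION=//p' docker/zammad/Dockerfile)
TARGET_VERSION="$ARGUMENTS"   # the skill argument; ask the user if empty

# Clone once, then only fetch on later runs
if [ ! -d /tmp/zammad-upstream ]; then
  git clone --bare https://github.com/zammad/zammad.git /tmp/zammad-upstream
else
  git -C /tmp/zammad-upstream fetch --all --tags
fi

# Fail early if either ref does not resolve
git -C /tmp/zammad-upstream rev-parse --verify "$CURRENT_VERSION^{commit}" >/dev/null
git -C /tmp/zammad-upstream rev-parse --verify "$TARGET_VERSION^{commit}" >/dev/null
```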

## Checks to Perform

Run ALL of these checks and compile results into a single report.

### 1. Replaced Stock Files

These are stock Zammad files that our addon REPLACES with modified copies. Changes upstream mean we need to port those changes into our modified versions.

For each file below, diff the upstream version between the current and target version. Report any changes found.

**Vue/TypeScript (Desktop UI):**
- `app/frontend/apps/desktop/pages/ticket/components/TicketDetailView/ArticleReply.vue`
- `app/frontend/apps/desktop/pages/personal-setting/views/PersonalSettingNotifications.vue`
- `app/frontend/apps/desktop/components/Form/fields/FieldNotifications/FieldNotificationsInput.vue`
- `app/frontend/apps/desktop/components/Form/fields/FieldNotifications/types.ts`

**CoffeeScript (Legacy UI):**
- `app/assets/javascripts/app/controllers/_profile/notification.coffee`
- `app/assets/javascripts/app/controllers/_ui_element/notification_matrix.coffee`
- `app/assets/javascripts/app/lib/mixins/ticket_notification_matrix.coffee`
- `app/assets/javascripts/app/views/generic/notification_matrix.jst.eco`
- `app/assets/javascripts/app/views/profile/notification.jst.eco`

Command pattern for each file:
```bash
git -C /tmp/zammad-upstream diff <current-version> <target-version> -- <file-path>
```

If a file does not exist at either version, note that (it may have been added, removed, or renamed).
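
One way to check for that case, sketched here with `git cat-file -e` (this helper is not part of the command pattern above; `<file-path>` stays a placeholder):

```bash
# Report whether <file-path> exists at each of the two refs
for ref in <current-version> <target-version>; do
  if git -C /tmp/zammad-upstream cat-file -e "$ref:<file-path>" 2>/dev/null; then
    echo "$ref: file present"
  else
    echo "$ref: file missing (added, removed, or renamed)"
  fi
done
```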

### 2. Monkey-Patched Files

These are files our addon patches at runtime via Ruby `prepend`, `include`, or `after_initialize` hooks. Changes to these files could break our patches.

**Search Backend (OpenSearch compatibility patch):**
- `lib/search_index_backend.rb` - We prepend `SearchIndexBackendOpenSearchPatch` to override `_mapping_item_type_es`. Check if this method signature or the `'flattened'` string usage has changed.

**Core Models (callback injection targets):**
- `app/models/ticket/article.rb` - We inject `after_create` callbacks via `include` for Signal and WhatsApp message delivery. Check for changes to the callback chain, model structure, or the `Sender`/`Type` lookup patterns.
- `app/models/link.rb` - We inject an `after_create` callback for Signal group setup on ticket split. Check for structural changes.

**Transaction System:**
- `app/models/transaction/` directory - We register `Transaction::SignalNotification` as backend `0105_signal_notification`. Check if the transaction backend system has been refactored.

**Icons:**
- `public/assets/images/icons.svg` - Our initializers append SVG icons at boot time. Check if the SVG structure or the icon injection mechanism has changed.

Command pattern:
```bash
git -C /tmp/zammad-upstream diff <current-version> <target-version> -- <file-path>
```

For the search backend specifically, also check if `_mapping_item_type_es` still exists and still returns `'flattened'`:
```bash
git -C /tmp/zammad-upstream show <target-version>:lib/search_index_backend.rb | grep -n -A5 '_mapping_item_type_es\|flattened'
```

### 3. API Surface Dependencies

These are Zammad APIs/interfaces/mixins our addon relies on. Changes could cause runtime failures.

**Channel Driver Interface:**
- `app/models/channel/driver/` - Check if the driver base class or interface expectations have changed (methods: `fetchable?`, `disconnect`, `deliver`, `streamable?`).

**Controller Concerns:**
- `app/controllers/concerns/creates_ticket_articles.rb` - Used by our webhook controllers. Check for interface changes.

**Ticket Article Types & Senders:**
- `app/models/ticket/article/type.rb` and `app/models/ticket/article/sender.rb` - We look up types by name (`'signal message'`, `'whatsapp message'`). Check for changes in how types are registered or looked up.

**Authentication/Authorization:**
- `app/policies/` directory structure - We create policies matching `controllers/` names. Check if the policy naming convention or base class has changed.

**Package System:**
- `lib/package.rb` or the package install/uninstall API - We use `Package.install(file:)` and `Package.uninstall(name:, version:)` in setup.rb.

**Scheduler/Job System:**
- `app/jobs/` base class patterns - Our jobs inherit from ApplicationJob. Check for changes.

Command pattern:
```bash
git -C /tmp/zammad-upstream diff --stat <current-version> <target-version> -- <path>
git -C /tmp/zammad-upstream diff <current-version> <target-version> -- <specific-file>
```

### 4. Path Collision Detection

Check if the target Zammad version has added any NEW files at paths that collide with our addon files. Our addon installs files at these paths:

**Controllers:** `app/controllers/channels_cdr_signal_controller.rb`, `channels_cdr_voice_controller.rb`, `channels_cdr_whatsapp_controller.rb`, `cdr_signal_channels_controller.rb`, `cdr_ticket_article_types_controller.rb`, `formstack_controller.rb`, `opensearch_controller.rb`

**Models:** `app/models/channel/driver/cdr_signal.rb`, `cdr_whatsapp.rb`, `app/models/ticket/article/enqueue_communicate_cdr_signal_job.rb`, `enqueue_communicate_cdr_whatsapp_job.rb`, `app/models/link/setup_split_signal_group.rb`, `app/models/transaction/signal_notification.rb`

**Jobs:** `app/jobs/communicate_cdr_signal_job.rb`, `communicate_cdr_whatsapp_job.rb`, `signal_notification_job.rb`, `create_ticket_from_form_job.rb`

**Libraries:** `lib/cdr_signal.rb`, `cdr_signal_api.rb`, `cdr_signal_poller.rb`, `cdr_whatsapp.rb`, `cdr_whatsapp_api.rb`, `signal_notification_sender.rb`

**Routes:** `config/routes/cdr_signal_channels.rb`, `channel_cdr_signal.rb`, `channel_cdr_voice.rb`, `channel_cdr_whatsapp.rb`, `cdr_ticket_article_types.rb`, `formstack.rb`, `opensearch.rb`

**Frontend Plugins:** `app/frontend/shared/entities/ticket-article/action/plugins/cdr_signal.ts`, `cdr_whatsapp.ts`, `app/frontend/apps/desktop/pages/ticket/components/TicketDetailView/article-type/plugins/signalMessage.ts`, `cdrWhatsappMessage.ts`

Check if any of these paths exist in the target version:
```bash
for path in <list-of-paths>; do
  git -C /tmp/zammad-upstream show <target-version>:$path >/dev/null 2>&1 && echo "COLLISION: $path exists upstream"
done
```

### 5. Dockerfile Patch Targets

Check files that are patched at Docker build time via `sed`:

- `lib/search_index_backend.rb` - `sed` replaces `'flattened'` with `'flat_object'`. Verify the string still exists in the target version.
- `contrib/nginx/zammad.conf` - Structure modified for embedded mode. Check for format changes.
- `docker-entrypoint.sh` - We inject addon install commands after the `# es config` comment. Verify this comment/anchor still exists.

Check the upstream Docker entrypoint:
```bash
git -C /tmp/zammad-upstream show <target-version>:contrib/docker/docker-entrypoint.sh 2>/dev/null | grep -n 'es config' || echo "Anchor comment not found - check entrypoint structure"
```

Also check the Zammad Docker Compose repo if relevant (the base image may come from `zammad/zammad-docker-compose`).

### 6. Database Schema Conflicts

Check if the target Zammad version adds any columns or tables that could conflict with our migrations:
- Column names: `whatsapp_uid`, `signal_uid`, `signal_username` on the users table
- Setting names containing: `signal_notification`, `cdr_link`, `formstack`, `opensearch_dashboard`

```bash
git -C /tmp/zammad-upstream diff <current-version> <target-version> -- db/migrate/ | grep -i 'signal\|whatsapp\|formstack\|opensearch'
```

### 7. Frontend Build System

Check if the Vite/asset pipeline configuration has changed significantly, since our addon relies on being compiled into the Zammad frontend:

```bash
git -C /tmp/zammad-upstream diff --stat <current-version> <target-version> -- vite.config.ts app/frontend/vite.config.ts config/initializers/assets.rb Gemfile
```

Also check if CoffeeScript/Sprockets support has been removed (would break our legacy UI files):
```bash
git -C /tmp/zammad-upstream show <target-version>:Gemfile 2>/dev/null | grep -i 'coffee\|sprockets'
```

## Report Format

Compile all findings into a structured report:

```
## Zammad Compatibility Report: <current-version> -> <target-version>

### CRITICAL (Action Required Before Upgrade)
- [List files that changed upstream AND are replaced by our addon - these need manual merging]
- [List any broken monkey-patch targets]
- [List any path collisions]

### WARNING (Review Needed)
- [List API surface changes that could affect our code]
- [List Dockerfile patch targets that changed]
- [List build system changes]

### INFO (No Action Needed)
- [List files checked with no changes]
- [List confirmed-safe paths]

### Recommended Actions
- For each CRITICAL item, describe what needs to be done
- Note any files that should be re-copied from upstream and re-patched
```

For each changed file in CRITICAL, show the upstream diff so the user can see what changed and decide how to integrate it.

@@ -3,3 +3,4 @@ out
signald
docker-compose.yml
README.md
.git

.gitignore (vendored, 13 changes)

@@ -7,6 +7,8 @@ build/**
.next/**
docker/zammad/addons/**
!docker/zammad/addons/.gitkeep
docker/zammad/gems/**
!docker/zammad/gems/.gitkeep
.npmrc
coverage/
build/

@@ -28,6 +30,15 @@ baileys-state
signald-state
project.org
**/.openapi-generator/
apps/bridge-worker/scripts/*
ENVIRONMENT_VARIABLES_MIGRATION.md
local-scripts/*
docs/
packages/zammad-addon-link/test/

# Allow Claude Code project config (overrides global gitignore)
!CLAUDE.md
!.claude/
.claude/**
!.claude/skills/
!.claude/skills/**
.claude/settings.local.json

@@ -69,39 +69,6 @@ buildx-docker-release:
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/buildx

link-docker-build:
  extends: .docker-build
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/link
    DOCKERFILE_PATH: ./apps/link/Dockerfile

link-docker-release:
  extends: .docker-release
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/link

bridge-frontend-docker-build:
  extends: .docker-build
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/bridge-frontend
    DOCKERFILE_PATH: ./apps/bridge-frontend/Dockerfile

bridge-frontend-docker-release:
  extends: .docker-release
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/bridge-frontend

bridge-worker-docker-build:
  extends: .docker-build
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/bridge-worker
    DOCKERFILE_PATH: ./apps/bridge-worker/Dockerfile

bridge-worker-docker-release:
  extends: .docker-release
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/bridge-worker

bridge-whatsapp-docker-build:
  extends: .docker-build
  variables:

@@ -205,33 +172,10 @@ zammad-docker-build:
    - pnpm install --frozen-lockfile
    - turbo build --force --filter @link-stack/zammad-addon-*
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - DOCKER_BUILDKIT=1 docker build --build-arg EMBEDDED=true --pull --no-cache -t ${DOCKER_NS}:${DOCKER_TAG} -f ${DOCKERFILE_PATH} ${BUILD_CONTEXT}
    - DOCKER_BUILDKIT=1 docker build --pull --no-cache -t ${DOCKER_NS}:${DOCKER_TAG} -f ${DOCKERFILE_PATH} ${BUILD_CONTEXT}
    - docker push ${DOCKER_NS}:${DOCKER_TAG}

zammad-docker-release:
  extends: .docker-release
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/zammad

zammad-standalone-docker-build:
  extends: .docker-build
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/zammad-standalone
    DOCKERFILE_PATH: ./docker/zammad/Dockerfile
    BUILD_CONTEXT: ./docker/zammad
    PNPM_HOME: "/pnpm"
  before_script:
    - export PATH="$PNPM_HOME:$PATH"
    - corepack enable && corepack prepare pnpm@9.15.4 --activate
  script:
    - pnpm add -g turbo
    - pnpm install --frozen-lockfile
    - turbo build --force --filter @link-stack/zammad-addon-*
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - DOCKER_BUILDKIT=1 docker build --pull --no-cache -t ${DOCKER_NS}:${DOCKER_TAG} -f ${DOCKERFILE_PATH} ${BUILD_CONTEXT}
    - docker push ${DOCKER_NS}:${DOCKER_TAG}

zammad-standalone-docker-release:
  extends: .docker-release
  variables:
    DOCKER_NS: ${CI_REGISTRY}/digiresilience/link/link-stack/zammad-standalone

.nvmrc (2 changes)

@@ -1 +1 @@
v22.18.0
v24

CLAUDE.md (new file, 114 lines)

@@ -0,0 +1,114 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Repository Overview

This is a monorepo for CDR Link - a Zammad addon and supporting services built by the Center for Digital Resilience. It adds Signal, WhatsApp, and voice channel support to Zammad via a custom `.zpm` addon package, along with a standalone WhatsApp bridge service. It uses pnpm workspaces and Turborepo for orchestration.

**Tech Stack:**
- Zammad 6.5.x as the core helpdesk platform
- Ruby (Rails initializers, controllers, models, jobs) for the Zammad addon
- TypeScript/Node.js for build tooling and the WhatsApp bridge
- CoffeeScript for Zammad legacy UI extensions
- Vue 3 for Zammad desktop UI extensions
- Docker for containerization
- PostgreSQL, Redis, Memcached as backing services

## Project Structure

```
apps/
  bridge-whatsapp/        # Standalone WhatsApp bridge (Hapi.js + Baileys)
packages/
  zammad-addon-link/      # Zammad addon source (Ruby, CoffeeScript, Vue, TS)
    src/                  # Addon source files (installed into /opt/zammad/)
    scripts/build.ts      # Builds .zpm package from src/
    scripts/migrate.ts    # Generates new migration stubs
docker/
  zammad/                 # Custom Zammad Docker image
    Dockerfile            # Extends zammad/zammad-docker-compose base image
    install.rb            # Extracts addon files from .zpm at build time
    setup.rb              # Registers addon packages at container startup
    addons/               # Built .zpm files (gitignored, generated by turbo build)
  compose/                # Docker Compose service definitions
```

## Common Development Commands

```bash
pnpm install                     # Install all dependencies
turbo build                      # Build all packages (generates .zpm files)
npm run docker:zammad:build      # Build custom Zammad Docker image
npm run docker:all:up            # Start all Docker services
npm run docker:all:down          # Stop all Docker services
npm run docker:zammad:restart    # Restart railsserver + scheduler (after Ruby changes)
npm run update-version <version> # Update version across all packages
npm run clean                    # Remove all build artifacts and dependencies
```

## Zammad Addon Architecture

### Addon Build & Deploy Pipeline

1. `turbo build` runs `tsx scripts/build.ts` in `packages/zammad-addon-link/`
2. Build script base64-encodes all files under `src/`, produces `docker/zammad/addons/zammad-addon-link-v{version}.zpm`
3. `docker/zammad/Dockerfile` builds a custom image:
   - Copies `.zpm` files and runs `install.rb` to extract addon files into the Zammad directory tree
   - Rebuilds Vite frontend (`bundle exec vite build`) to include addon Vue components
   - Precompiles assets (`rake assets:precompile`) to include addon CoffeeScript
   - Applies `sed` patches (OpenSearch compatibility, entrypoint injection)
4. At container startup, `setup.rb` registers the addon via `Package.install()` and runs migrations (a local sketch of this flow follows)
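
For local development, a rough end-to-end sketch of that pipeline, using the CI build filter and the npm scripts listed above (nothing here beyond those commands is implied):

```bash
# Rebuild the addon and bake it into the local Zammad image
pnpm install --frozen-lockfile
turbo build --force --filter @link-stack/zammad-addon-*   # steps 1-2: emits docker/zammad/addons/*.zpm
npm run docker:zammad:build                               # step 3: install.rb, vite build, asset precompile, sed patches
npm run docker:all:up                                     # step 4: setup.rb registers the addon at container startup
```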

### How the Addon Extends Zammad

**New files (no upstream conflict risk):** Controllers, channel drivers, jobs, routes, policies, library classes, views, CSS, SVG icons, frontend plugins. These add Signal/WhatsApp/voice channel support.

**Replaced stock files (HIGH conflict risk - must be manually merged on Zammad upgrades):**
- `app/frontend/apps/desktop/pages/ticket/components/TicketDetailView/ArticleReply.vue` - Adds channel whitelist filtering via `cdr_link_allowed_channels` setting
- `app/frontend/apps/desktop/pages/personal-setting/views/PersonalSettingNotifications.vue` - Adds Signal notification recipient field
- `app/frontend/apps/desktop/components/Form/fields/FieldNotifications/FieldNotificationsInput.vue` - Adds Signal column to notification matrix
- `app/frontend/apps/desktop/components/Form/fields/FieldNotifications/types.ts` - Extended notification types
- `app/assets/javascripts/app/controllers/_profile/notification.coffee` - Signal notification prefs (legacy UI)
- `app/assets/javascripts/app/controllers/_ui_element/notification_matrix.coffee` - Signal column (legacy UI)
- `app/assets/javascripts/app/lib/mixins/ticket_notification_matrix.coffee` - Notification matrix mixin
- `app/assets/javascripts/app/views/generic/notification_matrix.jst.eco` - Notification matrix template
- `app/assets/javascripts/app/views/profile/notification.jst.eco` - Notification profile template

**Runtime monkey-patches (HIGH conflict risk):**
- `config/initializers/opensearch_compatibility.rb` - Prepends to `SearchIndexBackend._mapping_item_type_es()` to replace `'flattened'` with `'flat_object'` for OpenSearch
- `config/initializers/cdr_signal.rb` - Injects `after_create` callbacks into `Ticket::Article` and `Link` models
- `config/initializers/cdr_whatsapp.rb` - Injects `after_create` callback into `Ticket::Article`

**Dockerfile-level patches:**
- `lib/search_index_backend.rb` - `sed` replaces `'flattened'` with `'flat_object'`
- `/docker-entrypoint.sh` - `sed` injects addon install commands after `# es config` anchor
- `contrib/nginx/zammad.conf` - Adds `/link` proxy location in embedded mode

### Key Zammad API Dependencies

The addon depends on these Zammad interfaces remaining stable:
- `Channel::Driver` interface (`fetchable?`, `disconnect`, `deliver`, `streamable?`)
- `Ticket::Article` model callbacks and `Sender`/`Type` lookup by name
- `Link` model and `Link::Type`/`Link::Object`
- `SearchIndexBackend._mapping_item_type_es` method
- `Transaction` backend registration system
- `Package.install(file:)` / `Package.uninstall(name:, version:)` API
- `CreatesTicketArticles` controller concern
- Policy naming convention (`controllers/<name>_controller_policy.rb`)

## Zammad Development Notes

- After changing any Ruby files, restart railsserver and scheduler: `npm run docker:zammad:restart`
- The addon must be rebuilt (`turbo build`) and the Docker image rebuilt (`npm run docker:zammad:build`) for changes to take effect in Docker
- Use `/zammad-compat <version>` to check upstream Zammad for breaking changes before upgrading
- The current Zammad base version is set in `docker/zammad/Dockerfile` as `ARG ZAMMAD_VERSION`

## Docker Services

Defined in `docker/compose/`:
- **zammad.yml**: zammad-init, zammad-railsserver, zammad-nginx, zammad-scheduler, zammad-websocket, zammad-memcached, zammad-redis
- **bridge-whatsapp.yml**: bridge-whatsapp
- **postgresql.yml**: postgresql
- **signal-cli-rest-api.yml**: signal-cli-rest-api
- **opensearch.yml**: opensearch + dashboards

@@ -1,7 +1,7 @@
FROM node:22-bookworm-slim AS base

FROM base AS builder
ARG APP_DIR=/opt/bridge-worker
ARG APP_DIR=/opt/bridge-deltachat
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
RUN mkdir -p ${APP_DIR}/

@@ -9,10 +9,10 @@ RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
RUN pnpm add -g turbo
WORKDIR ${APP_DIR}
COPY . .
RUN turbo prune --scope=@link-stack/bridge-worker --docker
RUN turbo prune --scope=@link-stack/bridge-deltachat --docker

FROM base AS installer
ARG APP_DIR=/opt/bridge-worker
ARG APP_DIR=/opt/bridge-deltachat
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
WORKDIR ${APP_DIR}

@@ -22,24 +22,28 @@ COPY --from=builder ${APP_DIR}/out/full/ .
COPY --from=builder ${APP_DIR}/out/pnpm-lock.yaml ./pnpm-lock.yaml
RUN pnpm install --frozen-lockfile
RUN pnpm add -g turbo
RUN turbo run build --filter=@link-stack/bridge-worker
RUN turbo run build --filter=@link-stack/bridge-deltachat

FROM base as runner
ARG BUILD_DATE
ARG VERSION
ARG APP_DIR=/opt/bridge-worker
ARG APP_DIR=/opt/bridge-deltachat
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
RUN mkdir -p ${APP_DIR}/
RUN DEBIAN_FRONTEND=noninteractive apt-get update && \
    apt-get install -y --no-install-recommends \
    dumb-init
RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
WORKDIR ${APP_DIR}
COPY --from=installer ${APP_DIR} ./
RUN chown -R node:node ${APP_DIR}
WORKDIR ${APP_DIR}/apps/bridge-worker/
WORKDIR ${APP_DIR}/apps/bridge-deltachat/
RUN chmod +x docker-entrypoint.sh
USER node
RUN mkdir /home/node/deltachat-data
EXPOSE 5001
ENV PORT 5001
ENV NODE_ENV production
ENTRYPOINT ["/opt/bridge-worker/apps/bridge-worker/docker-entrypoint.sh"]
ENV COREPACK_ENABLE_NETWORK=0
ENTRYPOINT ["/opt/bridge-deltachat/apps/bridge-deltachat/docker-entrypoint.sh"]

apps/bridge-deltachat/docker-entrypoint.sh (new executable file, 5 lines)

@@ -0,0 +1,5 @@
#!/bin/bash

set -e
echo "starting bridge-deltachat"
exec dumb-init pnpm run start

apps/bridge-deltachat/eslint.config.mjs (new file, 3 lines)

@@ -0,0 +1,3 @@
import config from "@link-stack/eslint-config/node";

export default config;

apps/bridge-deltachat/package.json (new file, 34 lines)

@@ -0,0 +1,34 @@
{
  "name": "@link-stack/bridge-deltachat",
  "version": "3.5.0-beta.1",
  "main": "build/main/index.js",
  "author": "Darren Clarke <darren@redaranj.com>",
  "license": "AGPL-3.0-or-later",
  "prettier": "@link-stack/prettier-config",
  "dependencies": {
    "@deltachat/jsonrpc-client": "^1.151.1",
    "@deltachat/stdio-rpc-server": "^1.151.1",
    "@hono/node-server": "^1.13.8",
    "hono": "^4.7.4",
    "@link-stack/logger": "workspace:*"
  },
  "devDependencies": {
    "@link-stack/eslint-config": "workspace:*",
    "@link-stack/prettier-config": "workspace:*",
    "@link-stack/typescript-config": "workspace:*",
    "@types/node": "*",
    "dotenv-cli": "^10.0.0",
    "eslint": "^9.23.0",
    "prettier": "^3.5.3",
    "tsx": "^4.20.6",
    "typescript": "^5.9.3"
  },
  "scripts": {
    "build": "tsc -p tsconfig.json",
    "dev": "dotenv -- tsx src/index.ts",
    "start": "node build/main/index.js",
    "lint": "eslint src/",
    "format": "prettier --write src/",
    "format:check": "prettier --check src/"
  }
}

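For reference, the scripts above can be exercised locally like this (a sketch; it assumes a `.env` file in `apps/bridge-deltachat/` for `dotenv` to load):

```bash
cd apps/bridge-deltachat
pnpm install
pnpm run dev                         # tsx + dotenv for local development
pnpm run build && pnpm run start     # compiled output from build/main/
```
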
apps/bridge-deltachat/src/attachments.ts (new file, 35 lines)

@@ -0,0 +1,35 @@
/**
 * Attachment size configuration for messaging channels
 *
 * Environment variables:
 * - BRIDGE_MAX_ATTACHMENT_SIZE_MB: Maximum size for a single attachment in MB (default: 50)
 */

/**
 * Get the maximum attachment size in bytes from environment variable
 * Defaults to 50MB if not set
 */
export function getMaxAttachmentSize(): number {
  const envValue = process.env.BRIDGE_MAX_ATTACHMENT_SIZE_MB;
  const sizeInMB = envValue ? Number.parseInt(envValue, 10) : 50;

  if (Number.isNaN(sizeInMB) || sizeInMB <= 0) {
    console.warn(`Invalid BRIDGE_MAX_ATTACHMENT_SIZE_MB value: ${envValue}, using default 50MB`);
    return 50 * 1024 * 1024;
  }

  return sizeInMB * 1024 * 1024;
}

/**
 * Get the maximum total size for all attachments in a message
 * This is 4x the single attachment size
 */
export function getMaxTotalAttachmentSize(): number {
  return getMaxAttachmentSize() * 4;
}

/**
 * Maximum number of attachments per message
 */
export const MAX_ATTACHMENTS = 10;

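A usage note (sketch only; the variable name comes from the module above, the start command from package.json, and the 20 MB value is just an example):

```bash
# Cap single attachments at 20 MB for this bridge instance (run after `pnpm run build`)
BRIDGE_MAX_ATTACHMENT_SIZE_MB=20 node build/main/index.js
```
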
apps/bridge-deltachat/src/index.ts (new file, 33 lines)

@@ -0,0 +1,33 @@
import { serve } from "@hono/node-server";
import { createLogger } from "@link-stack/logger";

import { createRoutes } from "./routes.ts";
import DeltaChatService from "./service.ts";

const logger = createLogger("bridge-deltachat-index");

const main = async () => {
  const service = new DeltaChatService();
  await service.initialize();

  const app = createRoutes(service);
  const port = Number.parseInt(process.env.PORT || "5001", 10);

  serve({ fetch: app.fetch, port }, (info) => {
    logger.info({ port: info.port }, "bridge-deltachat listening");
  });

  const shutdown = async () => {
    logger.info("Shutting down...");
    await service.teardown();
    process.exit(0);
  };

  process.on("SIGTERM", shutdown);
  process.on("SIGINT", shutdown);
};

main().catch((error) => {
  logger.error(error);
  process.exit(1);
});

apps/bridge-deltachat/src/routes.ts (new file, 62 lines)

@@ -0,0 +1,62 @@
import { createLogger } from "@link-stack/logger";
import { Hono } from "hono";

import type DeltaChatService from "./service.ts";

const logger = createLogger("bridge-deltachat-routes");

const errorMessage = (error: unknown): string => (error instanceof Error ? error.message : String(error));

export function createRoutes(service: DeltaChatService): Hono {
  const app = new Hono();

  app.post("/api/bots/:id/configure", async (c) => {
    const id = c.req.param("id");
    const { email, password } = await c.req.json<{ email: string; password: string }>();

    try {
      const result = await service.configure(id, email, password);
      logger.info({ id, email }, "Bot configured");
      return c.json(result);
    } catch (error) {
      logger.error({ id, error: errorMessage(error) }, "Failed to configure bot");
      return c.json({ error: errorMessage(error) }, 500);
    }
  });

  app.get("/api/bots/:id", async (c) => {
    const id = c.req.param("id");
    return c.json(await service.getBot(id));
  });

  app.post("/api/bots/:id/send", async (c) => {
    const id = c.req.param("id");
    const { email, message, attachments } = await c.req.json<{
      email: string;
      message: string;
      attachments?: Array<{ data: string; filename: string; mime_type: string }>;
    }>();

    try {
      const result = await service.send(id, email, message, attachments);
      logger.info({ id, attachmentCount: attachments?.length || 0 }, "Sent message");
      return c.json({ result });
    } catch (error) {
      logger.error({ id, error: errorMessage(error) }, "Failed to send message");
      return c.json({ error: errorMessage(error) }, 500);
    }
  });

  app.post("/api/bots/:id/unconfigure", async (c) => {
    const id = c.req.param("id");
    await service.unconfigure(id);
    logger.info({ id }, "Bot unconfigured");
    return c.body(null, 200);
  });

  app.get("/api/health", (c) => {
    return c.json({ status: "ok" });
  });

  return app;
}

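For a quick smoke test of these routes once the service is running (values are placeholders; port 5001 is the default from the Dockerfile and index.ts):

```bash
curl http://localhost:5001/api/health

curl -X POST http://localhost:5001/api/bots/bot-1/configure \
  -H 'Content-Type: application/json' \
  -d '{"email":"bot@example.org","password":"app-password"}'

curl -X POST http://localhost:5001/api/bots/bot-1/send \
  -H 'Content-Type: application/json' \
  -d '{"email":"user@example.org","message":"hello from the bridge"}'
```
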
apps/bridge-deltachat/src/service.ts (new file, 365 lines)

@@ -0,0 +1,365 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

import { startDeltaChat, type DeltaChatOverJsonRpcServer } from "@deltachat/stdio-rpc-server";
import { createLogger } from "@link-stack/logger";

import { getMaxAttachmentSize, getMaxTotalAttachmentSize, MAX_ATTACHMENTS } from "./attachments";

const logger = createLogger("bridge-deltachat-service");

interface BotMapping {
  [botId: string]: number;
}

export default class DeltaChatService {
  private dc: DeltaChatOverJsonRpcServer | null = null;
  private botMapping: BotMapping = {};
  private dataDir: string;
  private mappingFile: string;

  constructor() {
    this.dataDir = process.env.DELTACHAT_DATA_DIR || "/home/node/deltachat-data";
    this.mappingFile = path.join(this.dataDir, "bot-mapping.json");
  }

  async initialize(): Promise<void> {
    if (!fs.existsSync(this.dataDir)) {
      fs.mkdirSync(this.dataDir, { recursive: true });
    }

    logger.info({ dataDir: this.dataDir }, "Starting deltachat-rpc-server");
    this.dc = await startDeltaChat(this.dataDir);
    logger.info("deltachat-rpc-server started");

    this.loadBotMapping();

    for (const [botId, accountId] of Object.entries(this.botMapping)) {
      try {
        const isConfigured = await this.dc.rpc.isConfigured(accountId);
        if (isConfigured) {
          await this.dc.rpc.startIo(accountId);
          logger.info({ botId, accountId }, "Resumed IO for existing bot");
        } else {
          logger.warn({ botId, accountId }, "Account not configured, removing from mapping");
          delete this.botMapping[botId];
        }
      } catch (error) {
        logger.error({ botId, accountId, err: error }, "Failed to resume bot, removing from mapping");
        delete this.botMapping[botId];
      }
    }

    this.saveBotMapping();
    this.registerEventListeners();
  }

  async teardown(): Promise<void> {
    if (this.dc) {
      for (const [botId, accountId] of Object.entries(this.botMapping)) {
        try {
          await this.dc.rpc.stopIo(accountId);
          logger.info({ botId, accountId }, "Stopped IO for bot");
        } catch (error) {
          logger.error({ botId, accountId, err: error }, "Error stopping IO");
        }
      }
      this.dc.close();
      this.dc = null;
    }
  }

  private loadBotMapping(): void {
    if (fs.existsSync(this.mappingFile)) {
      try {
        const data = fs.readFileSync(this.mappingFile, "utf8");
        this.botMapping = JSON.parse(data);
        logger.info({ botCount: Object.keys(this.botMapping).length }, "Loaded bot mapping");
      } catch (error) {
        logger.error({ err: error }, "Failed to load bot mapping, starting fresh");
        this.botMapping = {};
      }
    }
  }

  private saveBotMapping(): void {
    fs.writeFileSync(this.mappingFile, JSON.stringify(this.botMapping, null, 2), "utf8");
  }

  private validateBotId(id: string): void {
    if (!/^[a-zA-Z0-9_-]+$/.test(id)) {
      throw new Error(`Invalid bot ID format: ${id}`);
    }
  }

  private getBotIdForAccount(accountId: number): string | undefined {
    return Object.entries(this.botMapping).find(([, aid]) => aid === accountId)?.[0];
  }

  private registerEventListeners(): void {
    if (!this.dc) return;

    this.dc.on("IncomingMsg", (accountId, event) => {
      this.handleIncomingMessage(accountId, event.chatId, event.msgId).catch((error) => {
        logger.error({ err: error, accountId }, "Error handling incoming message");
      });
    });
  }

  private async handleIncomingMessage(accountId: number, chatId: number, msgId: number): Promise<void> {
    if (!this.dc) return;

    const botId = this.getBotIdForAccount(accountId);
    if (!botId) {
      logger.warn({ accountId }, "Received message for unknown account");
      return;
    }

    const msg = await this.dc.rpc.getMessage(accountId, msgId);

    // Incoming states: 10=fresh, 13=noticed, 16=seen
    const isIncoming = msg.state === 10 || msg.state === 13 || msg.state === 16;
    if (msg.isBot || !isIncoming) {
      logger.debug({ msgId, isBot: msg.isBot, state: msg.state }, "Skipping message");
      return;
    }

    const contact = await this.dc.rpc.getContact(accountId, msg.fromId);
    const senderEmail = contact.address;
    const botConfig = await this.dc.rpc.getConfig(accountId, "configured_addr");
    const botEmail = botConfig || "";

    logger.info({ botId, senderEmail, msgId }, "Processing incoming message");

    let attachment: string | undefined;
    let filename: string | undefined;
    let mimeType: string | undefined;

    if (msg.file) {
      try {
        const fileData = fs.readFileSync(msg.file);
        attachment = fileData.toString("base64");
        filename = msg.fileName || path.basename(msg.file);
        mimeType = msg.fileMime || "application/octet-stream";
        logger.info({ filename, mimeType, size: fileData.length }, "Attachment found");
      } catch (error) {
        logger.error({ err: error, file: msg.file }, "Failed to read attachment file");
      }
    }

    const payload: Record<string, unknown> = {
      from: senderEmail,
      to: botEmail,
      message: msg.text || "",
      message_id: String(msgId),
      sent_at: new Date(msg.timestamp * 1000).toISOString(),
    };

    if (attachment) {
      payload.attachment = attachment;
      payload.filename = filename;
      payload.mime_type = mimeType;
    }

    const zammadUrl = process.env.ZAMMAD_URL || "http://zammad-nginx:8080";
    try {
      const response = await fetch(`${zammadUrl}/api/v1/channels_cdr_deltachat_bot_webhook/${botId}`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(payload),
      });

      if (response.ok) {
        logger.info({ botId, msgId }, "Message forwarded to Zammad");
      } else {
        const errorText = await response.text();
        logger.error({ status: response.status, error: errorText, botId }, "Failed to send message to Zammad");
      }
    } catch (error) {
      logger.error({ err: error, botId }, "Failed to POST to Zammad webhook");
    }

    try {
      await this.dc.rpc.markseenMsgs(accountId, [msgId]);
    } catch (error) {
      logger.error({ err: error, msgId }, "Failed to mark message as seen");
    }
  }

  async configure(botId: string, email: string, password: string): Promise<{ accountId: number; email: string }> {
    this.validateBotId(botId);
    if (!this.dc) throw new Error("DeltaChat not initialized");

    if (this.botMapping[botId] !== undefined) {
      throw new Error(`Bot ${botId} is already configured`);
    }

    const accountId = await this.dc.rpc.addAccount();
    logger.info({ botId, accountId, email }, "Created new account");

    try {
      await this.dc.rpc.batchSetConfig(accountId, {
        addr: email,
        mail_pw: password,
        bot: "1",
        e2ee_enabled: "1",
      });

      logger.info({ botId, accountId }, "Configuring account (verifying credentials)...");
      await this.dc.rpc.configure(accountId);
      logger.info({ botId, accountId }, "Account configured successfully");

      await this.dc.rpc.startIo(accountId);
      logger.info({ botId, accountId }, "IO started");

      this.botMapping[botId] = accountId;
      this.saveBotMapping();

      return { accountId, email };
    } catch (error) {
      logger.error({ botId, accountId, err: error }, "Configuration failed, removing account");
      try {
        await this.dc.rpc.removeAccount(accountId);
      } catch (error_) {
        logger.error({ removeErr: error_ }, "Failed to clean up account after configuration failure");
      }
      throw error;
    }
  }

  async getBot(botId: string): Promise<{ configured: boolean; email: string | null }> {
    this.validateBotId(botId);

    const accountId = this.botMapping[botId];
    if (accountId === undefined || !this.dc) {
      return { configured: false, email: null };
    }

    try {
      const isConfigured = await this.dc.rpc.isConfigured(accountId);
      const email = await this.dc.rpc.getConfig(accountId, "configured_addr");
      return { configured: isConfigured, email: email || null };
    } catch {
      return { configured: false, email: null };
    }
  }

  async unconfigure(botId: string): Promise<void> {
    this.validateBotId(botId);
    if (!this.dc) throw new Error("DeltaChat not initialized");

    const accountId = this.botMapping[botId];
    if (accountId === undefined) {
      logger.warn({ botId }, "Bot not found for unconfigure");
      return;
    }

    try {
      await this.dc.rpc.stopIo(accountId);
    } catch (error) {
      logger.warn({ botId, accountId, err: error }, "Error stopping IO during unconfigure");
    }

    try {
      await this.dc.rpc.removeAccount(accountId);
    } catch (error) {
      logger.warn({ botId, accountId, err: error }, "Error removing account during unconfigure");
    }

    delete this.botMapping[botId];
    this.saveBotMapping();
    logger.info({ botId, accountId }, "Bot unconfigured and removed");
  }

  async send(
    botId: string,
    email: string,
    message: string,
    attachments?: Array<{ data: string; filename: string; mime_type: string }>
  ): Promise<{ recipient: string; timestamp: string; source: string }> {
    this.validateBotId(botId);
    if (!this.dc) throw new Error("DeltaChat not initialized");

    const accountId = this.botMapping[botId];
    if (accountId === undefined) {
      throw new Error(`Bot ${botId} is not configured`);
    }

    const contactId = await this.dc.rpc.createContact(accountId, email, "");
    const chatId = await this.dc.rpc.createChatByContactId(accountId, contactId);

    if (attachments && attachments.length > 0) {
      const MAX_ATTACHMENT_SIZE = getMaxAttachmentSize();
      const MAX_TOTAL_SIZE = getMaxTotalAttachmentSize();

      if (attachments.length > MAX_ATTACHMENTS) {
        throw new Error(`Too many attachments: ${attachments.length} (max ${MAX_ATTACHMENTS})`);
      }

      let totalSize = 0;

      for (const att of attachments) {
        const estimatedSize = (att.data.length * 3) / 4;

        if (estimatedSize > MAX_ATTACHMENT_SIZE) {
          logger.warn(
            { filename: att.filename, size: estimatedSize, maxSize: MAX_ATTACHMENT_SIZE },
            "Attachment exceeds size limit, skipping"
          );
          continue;
        }

        totalSize += estimatedSize;
        if (totalSize > MAX_TOTAL_SIZE) {
          logger.warn(
            { totalSize, maxTotalSize: MAX_TOTAL_SIZE },
            "Total attachment size exceeds limit, skipping remaining"
          );
          break;
        }

        const buffer = Buffer.from(att.data, "base64");
        const tmpFile = path.join(os.tmpdir(), `dc-${Date.now()}-${att.filename}`);
        fs.writeFileSync(tmpFile, buffer);

        try {
          await this.dc.rpc.sendMsg(accountId, chatId, {
            text: message,
            html: null,
            viewtype: null,
            file: tmpFile,
            filename: att.filename,
            location: null,
            overrideSenderName: null,
            quotedMessageId: null,
            quotedText: null,
          });
          // Only include text with the first attachment; clear for subsequent
          message = "";
        } finally {
          try {
            fs.unlinkSync(tmpFile);
          } catch {
            // ignore cleanup errors
          }
        }
      }

      // If we had message text but all attachments were skipped, send text only
      if (message) {
        await this.dc.rpc.miscSendTextMessage(accountId, chatId, message);
      }
    } else {
      await this.dc.rpc.miscSendTextMessage(accountId, chatId, message);
    }

    const botEmail = (await this.dc.rpc.getConfig(accountId, "configured_addr")) || botId;

    return {
      recipient: email,
      timestamp: new Date().toISOString(),
      source: botEmail,
    };
  }
}

apps/bridge-deltachat/tsconfig.json (new file, 9 lines)

@@ -0,0 +1,9 @@
{
  "extends": "@link-stack/typescript-config/tsconfig.node.json",
  "compilerOptions": {
    "outDir": "build/main",
    "rootDir": "src"
  },
  "include": ["src/**/*.ts", "src/**/.*.ts"],
  "exclude": ["node_modules/**"]
}

@@ -1,3 +0,0 @@
{
  "extends": "next/core-web-vitals"
}

36
apps/bridge-frontend/.gitignore
vendored
36
apps/bridge-frontend/.gitignore
vendored
|
|
@ -1,36 +0,0 @@
|
|||
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
|
||||
|
||||
# dependencies
|
||||
/node_modules
|
||||
/.pnp
|
||||
.pnp.js
|
||||
.yarn/install-state.gz
|
||||
|
||||
# testing
|
||||
/coverage
|
||||
|
||||
# next.js
|
||||
/.next/
|
||||
/out/
|
||||
|
||||
# production
|
||||
/build
|
||||
|
||||
# misc
|
||||
.DS_Store
|
||||
*.pem
|
||||
|
||||
# debug
|
||||
npm-debug.log*
|
||||
yarn-debug.log*
|
||||
yarn-error.log*
|
||||
|
||||
# local env files
|
||||
.env*.local
|
||||
|
||||
# vercel
|
||||
.vercel
|
||||
|
||||
# typescript
|
||||
*.tsbuildinfo
|
||||
next-env.d.ts
|
||||
|
|
@ -1,133 +0,0 @@
|
|||
# Bridge Frontend
|
||||
|
||||
Frontend application for managing communication bridges between various messaging platforms and the CDR Link system.
|
||||
|
||||
## Overview
|
||||
|
||||
Bridge Frontend provides a web interface for configuring and managing communication channels including Signal, WhatsApp, Facebook, and Voice integrations. It handles bot registration, webhook configuration, and channel settings.
|
||||
|
||||
## Features
|
||||
|
||||
- **Channel Management**: Configure Signal, WhatsApp, Facebook, and Voice channels
|
||||
- **Bot Registration**: Register and manage bots for each communication platform
|
||||
- **Webhook Configuration**: Set up webhooks for message routing
|
||||
- **Settings Management**: Configure channel-specific settings and behaviors
|
||||
- **User Authentication**: Secure access with NextAuth.js
|
||||
|
||||
## Development
|
||||
|
||||
### Prerequisites
|
||||
|
||||
- Node.js >= 20
|
||||
- npm >= 10
|
||||
- PostgreSQL database
|
||||
- Running bridge-worker service
|
||||
|
||||
### Setup
|
||||
|
||||
```bash
|
||||
# Install dependencies
|
||||
npm install
|
||||
|
||||
# Run database migrations
|
||||
npm run migrate:latest
|
||||
|
||||
# Run development server
|
||||
npm run dev
|
||||
|
||||
# Build for production
|
||||
npm run build
|
||||
|
||||
# Start production server
|
||||
npm run start
|
||||
```
|
||||
|
||||
### Environment Variables
|
||||
|
||||
Required environment variables:
|
||||
|
||||
- `DATABASE_URL` - PostgreSQL connection string
|
||||
- `DATABASE_HOST` - Database host
|
||||
- `DATABASE_NAME` - Database name
|
||||
- `DATABASE_USER` - Database username
|
||||
- `DATABASE_PASSWORD` - Database password
|
||||
- `NEXTAUTH_URL` - Application URL
|
||||
- `NEXTAUTH_SECRET` - NextAuth.js secret
|
||||
- `GOOGLE_CLIENT_ID` - Google OAuth client ID
|
||||
- `GOOGLE_CLIENT_SECRET` - Google OAuth client secret
|
||||
|
||||
### Available Scripts
|
||||
|
||||
- `npm run dev` - Start development server
|
||||
- `npm run build` - Build for production
|
||||
- `npm run start` - Start production server
|
||||
- `npm run lint` - Run ESLint
|
||||
- `npm run migrate:latest` - Run all pending migrations
|
||||
- `npm run migrate:down` - Rollback last migration
|
||||
- `npm run migrate:up` - Run next migration
|
||||
- `npm run migrate:make` - Create new migration
|
||||
|
||||
## Architecture
|
||||
|
||||
### Database Schema
|
||||
|
||||
The application manages the following main entities:
|
||||
|
||||
- **Bots**: Communication channel bot configurations
|
||||
- **Webhooks**: Webhook endpoints for external integrations
|
||||
- **Settings**: Channel-specific configuration settings
|
||||
- **Users**: User accounts with role-based permissions
|
||||
|
||||
### API Routes
|
||||
|
||||
- `/api/auth` - Authentication endpoints
|
||||
- `/api/[service]/bots` - Bot management for each service
|
||||
- `/api/[service]/webhooks` - Webhook configuration
|
||||
|
||||
### Page Structure
|
||||
|
||||
- `/` - Dashboard/home page
|
||||
- `/login` - Authentication page
|
||||
- `/[...segment]` - Dynamic routing for CRUD operations
|
||||
- `@create` - Create new entities
|
||||
- `@detail` - View entity details
|
||||
- `@edit` - Edit existing entities
|
||||
|
||||
## Integration
|
||||
|
||||
### Database Access
|
||||
|
||||
Uses Kysely ORM for type-safe database queries:
|
||||
|
||||
```typescript
|
||||
import { db } from '@link-stack/database'
|
||||
|
||||
const bots = await db
|
||||
.selectFrom('bots')
|
||||
.selectAll()
|
||||
.execute()
|
||||
```
|
||||
|
||||
### Authentication
|
||||
|
||||
Integrated with NextAuth.js using database adapter:
|
||||
|
||||
```typescript
|
||||
import { authOptions } from '@link-stack/auth'
|
||||
```
|
||||
|
||||
## Docker Support
|
||||
|
||||
```bash
|
||||
# Build image
|
||||
docker build -t link-stack/bridge-frontend .
|
||||
|
||||
# Run with docker-compose
|
||||
docker-compose -f docker/compose/bridge.yml up
|
||||
```
|
||||
|
||||
## Related Services
|
||||
|
||||
- **bridge-worker**: Processes messages from configured channels
|
||||
- **bridge-whatsapp**: WhatsApp-specific integration service
|
||||
- **bridge-migrations**: Database schema management
|
||||
|
|
@ -1,14 +0,0 @@
|
|||
import { Metadata } from "next";
|
||||
import { getSession } from "next-auth/react";
|
||||
import { Login } from "@/app/_components/Login";
|
||||
|
||||
export const dynamic = "force-dynamic";
|
||||
|
||||
export const metadata: Metadata = {
|
||||
title: "Login",
|
||||
};
|
||||
|
||||
export default async function Page() {
|
||||
const session = await getSession();
|
||||
return <Login session={session} />;
|
||||
}
|
||||
|
|
@ -1,12 +0,0 @@
|
|||
import { Create } from "@link-stack/bridge-ui";
|
||||
|
||||
type PageProps = {
|
||||
params: Promise<{ segment: string[] }>;
|
||||
};
|
||||
|
||||
export default async function Page({ params }: PageProps) {
|
||||
const { segment } = await params;
|
||||
const service = segment[0];
|
||||
|
||||
return <Create service={service} />;
|
||||
}
|
||||
|
|
@ -1,28 +0,0 @@
|
|||
import { db } from "@link-stack/bridge-common";
|
||||
import { serviceConfig, Detail } from "@link-stack/bridge-ui";
|
||||
|
||||
type PageProps = {
|
||||
params: Promise<{ segment: string[] }>;
|
||||
};
|
||||
|
||||
export default async function Page({ params }: PageProps) {
|
||||
const { segment } = await params;
|
||||
const service = segment[0];
|
||||
const id = segment?.[1];
|
||||
|
||||
if (!id) return null;
|
||||
|
||||
const {
|
||||
[service]: { table },
|
||||
} = serviceConfig;
|
||||
|
||||
const row = await db
|
||||
.selectFrom(table)
|
||||
.selectAll()
|
||||
.where("id", "=", id)
|
||||
.executeTakeFirst();
|
||||
|
||||
if (!row) return null;
|
||||
|
||||
return <Detail service={service} row={row} />;
|
||||
}
|
||||
|
|
@ -1,28 +0,0 @@
|
|||
import { db } from "@link-stack/bridge-common";
|
||||
import { serviceConfig, Edit } from "@link-stack/bridge-ui";
|
||||
|
||||
type PageProps = {
|
||||
params: Promise<{ segment: string[] }>;
|
||||
};
|
||||
|
||||
export default async function Page({ params }: PageProps) {
|
||||
const { segment } = await params;
|
||||
const service = segment[0];
|
||||
const id = segment?.[1];
|
||||
|
||||
if (!id) return null;
|
||||
|
||||
const {
|
||||
[service]: { table },
|
||||
} = serviceConfig;
|
||||
|
||||
const row = await db
|
||||
.selectFrom(table)
|
||||
.selectAll()
|
||||
.where("id", "=", id)
|
||||
.executeTakeFirst();
|
||||
|
||||
if (!row) return null;
|
||||
|
||||
return <Edit service={service} row={row} />;
|
||||
}
|
||||
|
|
@ -1,3 +0,0 @@
|
|||
import { ServiceLayout } from "@link-stack/bridge-ui";
|
||||
|
||||
export default ServiceLayout;
|
||||
|
|
@ -1,23 +0,0 @@
|
|||
import { db } from "@link-stack/bridge-common";
|
||||
import { serviceConfig, List } from "@link-stack/bridge-ui";
|
||||
|
||||
type PageProps = {
|
||||
params: Promise<{
|
||||
segment: string[];
|
||||
}>;
|
||||
};
|
||||
|
||||
export default async function Page({ params }: PageProps) {
|
||||
const { segment } = await params;
|
||||
const service = segment[0];
|
||||
|
||||
if (!service) return null;
|
||||
|
||||
const config = serviceConfig[service];
|
||||
|
||||
if (!config) return null;
|
||||
|
||||
const rows = await db.selectFrom(config.table).selectAll().execute();
|
||||
|
||||
return <List service={service} rows={rows} />;
|
||||
}
|
||||
|
|
@ -1,9 +0,0 @@
|
|||
import { InternalLayout } from "@/app/_components/InternalLayout";
|
||||
|
||||
export default function Layout({
|
||||
children,
|
||||
}: Readonly<{
|
||||
children: React.ReactNode;
|
||||
}>) {
|
||||
return <InternalLayout>{children}</InternalLayout>;
|
||||
}
|
||||
|
|
@ -1,5 +0,0 @@
|
|||
import { Home } from "@link-stack/bridge-ui";
|
||||
|
||||
export default function Page() {
|
||||
return <Home />;
|
||||
}
|
||||
|
|
@ -1,29 +0,0 @@
|
|||
"use client";
|
||||
|
||||
import { FC, PropsWithChildren, useState } from "react";
|
||||
import { Grid } from "@mui/material";
|
||||
import { CssBaseline } from "@mui/material";
|
||||
import { AppRouterCacheProvider } from "@mui/material-nextjs/v14-appRouter";
|
||||
import { SessionProvider } from "next-auth/react";
|
||||
import { Sidebar } from "./Sidebar";
|
||||
|
||||
export const InternalLayout: FC<PropsWithChildren> = ({ children }) => {
|
||||
const [open, setOpen] = useState(true);
|
||||
|
||||
return (
|
||||
<AppRouterCacheProvider>
|
||||
<SessionProvider>
|
||||
<CssBaseline />
|
||||
<Grid container direction="row">
|
||||
<Sidebar open={open} setOpen={setOpen} />
|
||||
<Grid
|
||||
item
|
||||
sx={{ ml: open ? "270px" : "70px", width: "100%", height: "100vh" }}
|
||||
>
|
||||
{children as any}
|
||||
</Grid>
|
||||
</Grid>
|
||||
</SessionProvider>
|
||||
</AppRouterCacheProvider>
|
||||
);
|
||||
};
|
||||
|
|
@ -1,185 +0,0 @@
|
|||
"use client";
|
||||
|
||||
import { FC, useState } from "react";
|
||||
import {
|
||||
Box,
|
||||
Grid,
|
||||
Container,
|
||||
IconButton,
|
||||
Typography,
|
||||
TextField,
|
||||
} from "@mui/material";
|
||||
import {
|
||||
Apple as AppleIcon,
|
||||
Google as GoogleIcon,
|
||||
Key as KeyIcon,
|
||||
} from "@mui/icons-material";
|
||||
import { signIn } from "next-auth/react";
|
||||
import Image from "next/image";
|
||||
import LinkLogo from "@/app/_images/link-logo-small.png";
|
||||
import { colors, fonts } from "@link-stack/ui";
|
||||
import { useSearchParams } from "next/navigation";
|
||||
|
||||
type LoginProps = {
|
||||
session: any;
|
||||
};
|
||||
|
||||
export const Login: FC<LoginProps> = ({ session }) => {
|
||||
const origin =
|
||||
typeof window !== "undefined" && window.location.origin
|
||||
? window.location.origin
|
||||
: "";
|
||||
const [email, setEmail] = useState("");
|
||||
const [password, setPassword] = useState("");
|
||||
const params = useSearchParams();
|
||||
const error = params.get("error");
|
||||
const { darkGray, cdrLinkOrange, white } = colors;
|
||||
const { poppins } = fonts;
|
||||
const buttonStyles = {
|
||||
borderRadius: 500,
|
||||
width: "100%",
|
||||
fontSize: "16px",
|
||||
fontWeight: "bold",
|
||||
backgroundColor: white,
|
||||
"&:hover": {
|
||||
color: white,
|
||||
backgroundColor: cdrLinkOrange,
|
||||
},
|
||||
};
|
||||
const fieldStyles = {
|
||||
"& label.Mui-focused": {
|
||||
color: cdrLinkOrange,
|
||||
},
|
||||
"& .MuiInput-underline:after": {
|
||||
borderBottomColor: cdrLinkOrange,
|
||||
},
|
||||
"& .MuiFilledInput-underline:after": {
|
||||
borderBottomColor: cdrLinkOrange,
|
||||
},
|
||||
"& .MuiOutlinedInput-root": {
|
||||
"&.Mui-focused fieldset": {
|
||||
borderColor: cdrLinkOrange,
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
return (
|
||||
<Box sx={{ backgroundColor: darkGray, height: "100vh" }}>
|
||||
<Container maxWidth="md" sx={{ p: 10 }}>
|
||||
<Grid container spacing={2} direction="column" alignItems="center">
|
||||
<Grid
|
||||
item
|
||||
container
|
||||
direction="row"
|
||||
justifyContent="center"
|
||||
alignItems="center"
|
||||
>
|
||||
<Grid item>
|
||||
<Box
|
||||
sx={{
|
||||
width: "70px",
|
||||
height: "70px",
|
||||
margin: "0 auto",
|
||||
}}
|
||||
>
|
||||
<Image
|
||||
src={LinkLogo}
|
||||
alt="Link logo"
|
||||
width={70}
|
||||
height={70}
|
||||
style={{
|
||||
objectFit: "cover",
|
||||
filter: "grayscale(100) brightness(100)",
|
||||
}}
|
||||
/>
|
||||
</Box>
|
||||
</Grid>
|
||||
<Grid item>
|
||||
<Typography
|
||||
variant="h2"
|
||||
sx={{
|
||||
fontSize: 36,
|
||||
color: "white",
|
||||
fontWeight: 700,
|
||||
mt: 1,
|
||||
ml: 0.5,
|
||||
fontFamily: poppins.style.fontFamily,
|
||||
}}
|
||||
>
|
||||
CDR Bridge
|
||||
</Typography>
|
||||
</Grid>
|
||||
</Grid>
|
||||
|
||||
<Grid item sx={{ width: "100%" }}>
|
||||
{!session ? (
|
||||
<Container
|
||||
maxWidth="xs"
|
||||
sx={{
|
||||
p: 3,
|
||||
mt: 3,
|
||||
}}
|
||||
>
|
||||
<Grid
|
||||
container
|
||||
spacing={3}
|
||||
direction="column"
|
||||
alignItems="center"
|
||||
>
|
||||
{error ? (
|
||||
<Grid item sx={{ width: "100%" }}>
|
||||
<Box sx={{ backgroundColor: "red", p: 3 }}>
|
||||
<Typography
|
||||
variant="body1"
|
||||
sx={{
|
||||
fontSize: 18,
|
||||
color: "white",
|
||||
textAlign: "center",
|
||||
}}
|
||||
>
|
||||
{`${error} error`}
|
||||
</Typography>
|
||||
</Box>
|
||||
</Grid>
|
||||
) : null}
|
||||
<Grid item sx={{ width: "100%" }}>
|
||||
<IconButton
|
||||
sx={buttonStyles}
|
||||
onClick={() =>
|
||||
signIn("google", {
|
||||
callbackUrl: `${origin}`,
|
||||
})
|
||||
}
|
||||
>
|
||||
<GoogleIcon sx={{ mr: 1 }} />
|
||||
Sign in with Google
|
||||
</IconButton>
|
||||
</Grid>
|
||||
<Grid item sx={{ width: "100%" }}>
|
||||
<IconButton
|
||||
aria-label="Sign in with Apple"
|
||||
sx={buttonStyles}
|
||||
onClick={() =>
|
||||
signIn("apple", {
|
||||
callbackUrl: `${window.location.origin}`,
|
||||
})
|
||||
}
|
||||
>
|
||||
<AppleIcon sx={{ mr: 1 }} />
|
||||
Sign in with Apple
|
||||
</IconButton>
|
||||
</Grid>
|
||||
</Grid>
|
||||
</Container>
|
||||
) : null}
|
||||
{session ? (
|
||||
<Box component="h4">
|
||||
{` ${session.user.name ?? session.user.email}.`}
|
||||
</Box>
|
||||
) : null}
|
||||
</Grid>
|
||||
</Grid>
|
||||
</Container>
|
||||
</Box>
|
||||
);
|
||||
};
|
||||
|
|
@ -1,399 +0,0 @@
|
|||
"use client";
|
||||
|
||||
import { FC } from "react";
|
||||
import {
|
||||
Box,
|
||||
Grid,
|
||||
Typography,
|
||||
List,
|
||||
ListItemButton,
|
||||
ListItemIcon,
|
||||
ListItemText,
|
||||
ListItemSecondaryAction,
|
||||
Drawer,
|
||||
} from "@mui/material";
|
||||
import {
|
||||
ExpandCircleDown as ExpandCircleDownIcon,
|
||||
AccountCircle as AccountCircleIcon,
|
||||
Chat as ChatIcon,
|
||||
PermPhoneMsg as PhoneIcon,
|
||||
WhatsApp as WhatsAppIcon,
|
||||
Facebook as FacebookIcon,
|
||||
AirlineStops as AirlineStopsIcon,
|
||||
Logout as LogoutIcon,
|
||||
} from "@mui/icons-material";
|
||||
import { usePathname } from "next/navigation";
|
||||
import Link from "next/link";
|
||||
import Image from "next/image";
|
||||
import { typography, fonts, Button } from "@link-stack/ui";
|
||||
import LinkLogo from "@/app/_images/link-logo-small.png";
|
||||
import { useSession, signOut } from "next-auth/react";
|
||||
|
||||
const openWidth = 270;
|
||||
const closedWidth = 70;
|
||||
|
||||
const MenuItem = ({
|
||||
name,
|
||||
href,
|
||||
Icon,
|
||||
iconSize,
|
||||
inset = false,
|
||||
selected = false,
|
||||
open = true,
|
||||
badge,
|
||||
target = "_self",
|
||||
}: any) => (
|
||||
<Link href={href} target={target}>
|
||||
<ListItemButton
|
||||
sx={{
|
||||
p: 0,
|
||||
mb: 1,
|
||||
bl: iconSize === 0 ? "1px solid white" : "inherit",
|
||||
}}
|
||||
selected={selected}
|
||||
>
|
||||
{iconSize > 0 ? (
|
||||
<ListItemIcon
|
||||
sx={{
|
||||
color: `white`,
|
||||
minWidth: 0,
|
||||
mr: 2,
|
||||
textAlign: "center",
|
||||
margin: open ? "0 8 0 0" : "0 auto",
|
||||
}}
|
||||
>
|
||||
<Box
|
||||
sx={{
|
||||
width: iconSize,
|
||||
height: iconSize,
|
||||
mr: 0.5,
|
||||
mt: "-4px",
|
||||
}}
|
||||
>
|
||||
<Icon />
|
||||
</Box>
|
||||
</ListItemIcon>
|
||||
) : (
|
||||
<Box
|
||||
sx={{
|
||||
width: 30,
|
||||
height: "28px",
|
||||
position: "relative",
|
||||
ml: "9px",
|
||||
mr: "1px",
|
||||
}}
|
||||
>
|
||||
<Box
|
||||
sx={{
|
||||
width: "1px",
|
||||
height: "56px",
|
||||
backgroundColor: "white",
|
||||
position: "absolute",
|
||||
left: "3px",
|
||||
top: "-10px",
|
||||
}}
|
||||
/>
|
||||
<Box
|
||||
sx={{
|
||||
width: "42px",
|
||||
height: "42px",
|
||||
position: "absolute",
|
||||
top: "-27px",
|
||||
left: "3px",
|
||||
border: "1px solid #fff",
|
||||
borderColor: "transparent transparent transparent #fff",
|
||||
borderRadius: "60px",
|
||||
rotate: "-35deg",
|
||||
}}
|
||||
/>
|
||||
</Box>
|
||||
)}
|
||||
{open && (
|
||||
<ListItemText
|
||||
inset={inset}
|
||||
primary={
|
||||
<Typography
|
||||
variant="body1"
|
||||
sx={{
|
||||
fontSize: 16,
|
||||
fontWeight: "bold",
|
||||
border: 0,
|
||||
textAlign: "left",
|
||||
color: "white",
|
||||
}}
|
||||
>
|
||||
{name}
|
||||
</Typography>
|
||||
}
|
||||
/>
|
||||
)}
|
||||
{badge && badge > 0 ? (
|
||||
<ListItemSecondaryAction>
|
||||
<Typography
|
||||
color="textSecondary"
|
||||
variant="body1"
|
||||
className="badge"
|
||||
sx={{
|
||||
backgroundColor: "#FFB620",
|
||||
color: "black !important",
|
||||
borderRadius: 10,
|
||||
px: 1,
|
||||
fontSize: 12,
|
||||
fontWeight: "bold",
|
||||
}}
|
||||
>
|
||||
{badge}
|
||||
</Typography>
|
||||
</ListItemSecondaryAction>
|
||||
) : null}
|
||||
</ListItemButton>
|
||||
</Link>
|
||||
);
|
||||
|
||||
interface SidebarProps {
|
||||
open: boolean;
|
||||
setOpen: (open: boolean) => void;
|
||||
}
|
||||
|
||||
export const Sidebar: FC<SidebarProps> = ({ open, setOpen }) => {
|
||||
const pathname = usePathname();
|
||||
const { poppins } = fonts;
|
||||
const { bodyLarge } = typography;
|
||||
const { data: session } = useSession();
|
||||
const user = session?.user;
|
||||
|
||||
const logout = () => {
|
||||
signOut({ callbackUrl: "/login" });
|
||||
};
|
||||
|
||||
return (
|
||||
<Drawer
|
||||
sx={{ width: open ? openWidth : closedWidth, flexShrink: 0 }}
|
||||
variant="permanent"
|
||||
anchor="left"
|
||||
open={open}
|
||||
PaperProps={{
|
||||
sx: {
|
||||
width: open ? openWidth : closedWidth,
|
||||
border: 0,
|
||||
overflow: "visible",
|
||||
},
|
||||
}}
|
||||
>
|
||||
<Box
|
||||
sx={{
|
||||
position: "absolute",
|
||||
top: 24,
|
||||
right: open ? -8 : -16,
|
||||
color: "#1C75FD",
|
||||
rotate: open ? "90deg" : "-90deg",
|
||||
}}
|
||||
onClick={() => {
|
||||
setOpen!(!open);
|
||||
}}
|
||||
>
|
||||
<ExpandCircleDownIcon
|
||||
sx={{
|
||||
width: 24,
|
||||
height: 24,
|
||||
background: "white",
|
||||
borderRadius: 500,
|
||||
}}
|
||||
/>
|
||||
</Box>
|
||||
<Grid
|
||||
container
|
||||
direction="column"
|
||||
justifyContent="space-between"
|
||||
wrap="nowrap"
|
||||
spacing={0}
|
||||
sx={{ backgroundColor: "#25272A", height: "100%", p: 2 }}
|
||||
>
|
||||
<Grid item container>
|
||||
<Grid item sx={{ width: open ? "40px" : "100%" }}>
|
||||
<Box
|
||||
sx={{
|
||||
width: "40px",
|
||||
height: "40px",
|
||||
margin: open ? "0" : "0 auto",
|
||||
}}
|
||||
>
|
||||
<Image
|
||||
src={LinkLogo}
|
||||
alt="Link logo"
|
||||
width={40}
|
||||
height={40}
|
||||
style={{
|
||||
objectFit: "cover",
|
||||
filter: "grayscale(100) brightness(100)",
|
||||
}}
|
||||
/>
|
||||
</Box>
|
||||
.
|
||||
</Grid>
|
||||
{open && (
|
||||
<Grid item>
|
||||
<Typography
|
||||
variant="h2"
|
||||
sx={{
|
||||
fontSize: 26,
|
||||
color: "white",
|
||||
fontWeight: 700,
|
||||
mt: 1,
|
||||
ml: 0.5,
|
||||
fontFamily: poppins.style.fontFamily,
|
||||
}}
|
||||
>
|
||||
CDR Bridge
|
||||
</Typography>
|
||||
</Grid>
|
||||
)}
|
||||
</Grid>
|
||||
<Grid item>
|
||||
<Box
|
||||
sx={{
|
||||
height: "0.5px",
|
||||
width: "100%",
|
||||
backgroundColor: "#666",
|
||||
mb: 1,
|
||||
}}
|
||||
/>
|
||||
</Grid>
|
||||
<Grid
|
||||
item
|
||||
container
|
||||
direction="column"
|
||||
sx={{
|
||||
mt: "6px",
|
||||
overflow: "scroll",
|
||||
scrollbarWidth: "none",
|
||||
msOverflowStyle: "none",
|
||||
"&::-webkit-scrollbar": { display: "none" },
|
||||
}}
|
||||
flexGrow={1}
|
||||
>
|
||||
<List
|
||||
component="nav"
|
||||
sx={{
|
||||
a: {
|
||||
textDecoration: "none",
|
||||
|
||||
".MuiListItemButton-root": {
|
||||
p: 1,
|
||||
borderRadius: 2,
|
||||
"&:hover": {
|
||||
background: "#555",
|
||||
},
|
||||
".MuiTypography-root": {
|
||||
p: {
|
||||
color: "#999 !important",
|
||||
fontSize: 16,
|
||||
},
|
||||
},
|
||||
".badge": {
|
||||
p: { fontSize: 12, color: "black !important" },
|
||||
},
|
||||
},
|
||||
".Mui-selected": {
|
||||
background: "#444",
|
||||
color: "#fff !important",
|
||||
".MuiTypography-root": {
|
||||
p: {
|
||||
color: "#fff !important",
|
||||
fontSize: 16,
|
||||
},
|
||||
},
|
||||
".badge": {
|
||||
p: { fontSize: 12, color: "black !important" },
|
||||
},
|
||||
},
|
||||
},
|
||||
}}
|
||||
>
|
||||
<MenuItem
|
||||
name="WhatsApp"
|
||||
href="/whatsapp"
|
||||
selected={pathname.endsWith("/whatsapp")}
|
||||
Icon={WhatsAppIcon}
|
||||
iconSize={20}
|
||||
/>
|
||||
<MenuItem
|
||||
name="Signal"
|
||||
href="/signal"
|
||||
selected={pathname.startsWith("/signal")}
|
||||
Icon={ChatIcon}
|
||||
iconSize={20}
|
||||
/>
|
||||
<MenuItem
|
||||
name="Facebook"
|
||||
href="/facebook"
|
||||
selected={pathname.startsWith("/facebook")}
|
||||
Icon={FacebookIcon}
|
||||
iconSize={20}
|
||||
/>
|
||||
<MenuItem
|
||||
name="Voice"
|
||||
href="/voice"
|
||||
selected={pathname.startsWith("/voice")}
|
||||
Icon={PhoneIcon}
|
||||
iconSize={20}
|
||||
/>
|
||||
<MenuItem
|
||||
name="Webhooks"
|
||||
href="/webhooks"
|
||||
selected={pathname.startsWith("/webhooks")}
|
||||
Icon={AirlineStopsIcon}
|
||||
iconSize={20}
|
||||
/>
|
||||
<MenuItem
|
||||
name="Users"
|
||||
href="/users"
|
||||
selected={pathname.startsWith("/users")}
|
||||
Icon={AccountCircleIcon}
|
||||
iconSize={20}
|
||||
/>
|
||||
</List>
|
||||
</Grid>
|
||||
<Grid
|
||||
item
|
||||
container
|
||||
direction="row"
|
||||
alignItems="center"
|
||||
spacing={1}
|
||||
sx={{
|
||||
borderTop: "1px solid #ffffff33",
|
||||
pt: 0.5,
|
||||
}}
|
||||
>
|
||||
{user?.image && (
|
||||
<Grid item>
|
||||
<Box sx={{ width: 20, height: 20 }}>
|
||||
<Image
|
||||
src={user?.image ?? ""}
|
||||
alt="Profile image"
|
||||
width={20}
|
||||
height={20}
|
||||
unoptimized
|
||||
/>
|
||||
</Box>
|
||||
</Grid>
|
||||
)}
|
||||
|
||||
<Grid item>
|
||||
<Box
|
||||
sx={{
|
||||
...bodyLarge,
|
||||
color: "white",
|
||||
}}
|
||||
>
|
||||
{user?.email}
|
||||
</Box>
|
||||
</Grid>
|
||||
<Grid item>
|
||||
<Button text="Logout" kind="secondary" onClick={logout} />
|
||||
</Grid>
|
||||
</Grid>
|
||||
</Grid>
|
||||
</Drawer>
|
||||
);
|
||||
};
|
||||
Binary file not shown. (Before: 4.6 KiB)

@ -1,13 +0,0 @@
import GoogleProvider from "next-auth/providers/google";

export const authOptions = {
  providers: [
    GoogleProvider({
      clientId: process.env.GOOGLE_CLIENT_ID!,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET!,
    }),
  ],
  session: {
    strategy: "jwt" as any,
  },
};

@ -1 +0,0 @@
export { receiveMessage as POST } from "@link-stack/bridge-ui";

@ -1 +0,0 @@
export { relinkBot as POST } from "@link-stack/bridge-ui";

@ -1 +0,0 @@
export { getBot as GET } from "@link-stack/bridge-ui";

@ -1 +0,0 @@
export { sendMessage as POST } from "@link-stack/bridge-ui";

@ -1,3 +0,0 @@
import { handleWebhook } from "@link-stack/bridge-ui";

export { handleWebhook as GET, handleWebhook as POST };

@ -1,9 +0,0 @@
import NextAuth from "next-auth";
import { authOptions } from "@/app/_lib/authentication";

// Force this route to be dynamic (not statically generated at build time)
export const dynamic = 'force-dynamic';

const handler = NextAuth(authOptions);

export { handler as GET, handler as POST };

@ -1,23 +0,0 @@
import type { Metadata } from "next";
import { LicenseInfo } from "@mui/x-license";

LicenseInfo.setLicenseKey(
  "2a7dd73ee59e3e028b96b0d2adee1ad8Tz0xMTMwOTUsRT0xNzc5MDYyMzk5MDAwLFM9cHJvLExNPXN1YnNjcmlwdGlvbixQVj1pbml0aWFsLEtWPTI=",
);

export const metadata: Metadata = {
  title: "CDR Bridge",
  description: "",
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  );
}

@ -1,7 +0,0 @@
#!/bin/bash

set -e
echo "running migrations"
(cd ../bridge-migrations/ && pnpm run migrate:up:all)
echo "starting bridge-frontend"
exec dumb-init pnpm run start

@ -1,82 +0,0 @@
|
|||
import { withAuth } from "next-auth/middleware";
|
||||
import { NextResponse } from "next/server";
|
||||
|
||||
export default withAuth(
|
||||
function middleware(req) {
|
||||
const isDev = process.env.NODE_ENV === "development";
|
||||
const nonce = Buffer.from(crypto.randomUUID()).toString("base64");
|
||||
|
||||
// Allow digiresilience.org for embedding documentation
|
||||
const frameSrcDirective = `frame-src 'self' https://digiresilience.org;`;
|
||||
|
||||
const cspHeader = `
|
||||
default-src 'self';
|
||||
${frameSrcDirective}
|
||||
connect-src 'self';
|
||||
script-src 'self' 'nonce-${nonce}' 'strict-dynamic' ${isDev ? "'unsafe-eval'" : ""};
|
||||
style-src 'self' 'unsafe-inline';
|
||||
img-src 'self' blob: data:;
|
||||
font-src 'self';
|
||||
object-src 'none';
|
||||
base-uri 'self';
|
||||
form-action 'self';
|
||||
frame-ancestors 'self';
|
||||
upgrade-insecure-requests;
|
||||
`;
|
||||
const contentSecurityPolicyHeaderValue = cspHeader
|
||||
.replace(/\s{2,}/g, " ")
|
||||
.trim();
|
||||
|
||||
const requestHeaders = new Headers(req.headers);
|
||||
requestHeaders.set("x-nonce", nonce);
|
||||
requestHeaders.set(
|
||||
"Content-Security-Policy",
|
||||
contentSecurityPolicyHeaderValue,
|
||||
);
|
||||
|
||||
const response = NextResponse.next({
|
||||
request: {
|
||||
headers: requestHeaders,
|
||||
},
|
||||
});
|
||||
|
||||
response.headers.set(
|
||||
"Content-Security-Policy",
|
||||
contentSecurityPolicyHeaderValue,
|
||||
);
|
||||
|
||||
// Additional security headers
|
||||
response.headers.set("X-Frame-Options", "SAMEORIGIN");
|
||||
response.headers.set("X-Content-Type-Options", "nosniff");
|
||||
response.headers.set("Referrer-Policy", "strict-origin-when-cross-origin");
|
||||
response.headers.set("X-XSS-Protection", "1; mode=block");
|
||||
response.headers.set(
|
||||
"Permissions-Policy",
|
||||
"camera=(), microphone=(), geolocation=()"
|
||||
);
|
||||
|
||||
return response;
|
||||
},
|
||||
{
|
||||
pages: {
|
||||
signIn: `/login`,
|
||||
},
|
||||
callbacks: {
|
||||
authorized: ({ token }) => {
|
||||
if (process.env.SETUP_MODE === "true") {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (token?.email) {
|
||||
return true;
|
||||
}
|
||||
|
||||
return false;
|
||||
},
|
||||
},
|
||||
}
|
||||
);
|
||||
|
||||
export const config = {
|
||||
matcher: ["/((?!ws|wss|api|_next/static|_next/image|favicon.ico).*)"],
|
||||
};
|
||||
|
|
@ -1,7 +0,0 @@
/** @type {import('next').NextConfig} */
const nextConfig = {
  transpilePackages: ["@link-stack/ui", "@link-stack/bridge-common", "@link-stack/bridge-ui"],
  poweredByHeader: false,
};

export default nextConfig;

@ -1,40 +0,0 @@
|
|||
{
|
||||
"name": "@link-stack/bridge-frontend",
|
||||
"version": "3.3.5",
|
||||
"type": "module",
|
||||
"scripts": {
|
||||
"dev": "next dev",
|
||||
"build": "next build",
|
||||
"start": "next start",
|
||||
"lint": "next lint",
|
||||
"migrate:up:all": "tsx database/migrate.ts up:all",
|
||||
"migrate:up:one": "tsx database/migrate.ts up:one",
|
||||
"migrate:down:all": "tsx database/migrate.ts down:all",
|
||||
"migrate:down:one": "tsx database/migrate.ts down:one"
|
||||
},
|
||||
"dependencies": {
|
||||
"@auth/kysely-adapter": "^1.10.0",
|
||||
"@mui/icons-material": "^6",
|
||||
"@mui/material": "^6",
|
||||
"@mui/material-nextjs": "^6",
|
||||
"@mui/x-license": "^7",
|
||||
"@link-stack/bridge-common": "workspace:*",
|
||||
"@link-stack/bridge-ui": "workspace:*",
|
||||
"next": "15.5.9",
|
||||
"next-auth": "^4.24.11",
|
||||
"react": "19.2.0",
|
||||
"react-dom": "19.2.0",
|
||||
"sharp": "^0.34.4",
|
||||
"tsx": "^4.20.6",
|
||||
"@link-stack/ui": "workspace:*"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@link-stack/eslint-config": "workspace:*",
|
||||
"@link-stack/typescript-config": "workspace:*",
|
||||
"@types/node": "^24",
|
||||
"@types/pg": "^8.15.5",
|
||||
"@types/react": "^19",
|
||||
"@types/react-dom": "^19",
|
||||
"typescript": "^5"
|
||||
}
|
||||
}
|
||||
|
|
@ -1,2 +0,0 @@
User-agent: *
Disallow: /

@ -1,41 +0,0 @@
|
|||
{
|
||||
"compilerOptions": {
|
||||
"lib": [
|
||||
"dom",
|
||||
"dom.iterable",
|
||||
"esnext"
|
||||
],
|
||||
"allowJs": true,
|
||||
"skipLibCheck": true,
|
||||
"strict": true,
|
||||
"noEmit": true,
|
||||
"forceConsistentCasingInFileNames": true,
|
||||
"esModuleInterop": true,
|
||||
"module": "esnext",
|
||||
"moduleResolution": "bundler",
|
||||
"resolveJsonModule": true,
|
||||
"isolatedModules": true,
|
||||
"jsx": "preserve",
|
||||
"incremental": true,
|
||||
"paths": {
|
||||
"@/*": [
|
||||
"./*"
|
||||
]
|
||||
},
|
||||
"plugins": [
|
||||
{
|
||||
"name": "next"
|
||||
}
|
||||
],
|
||||
"target": "ES2017"
|
||||
},
|
||||
"include": [
|
||||
"next-env.d.ts",
|
||||
"**/*.ts",
|
||||
"**/*.tsx",
|
||||
".next/types/**/*.ts"
|
||||
],
|
||||
"exclude": [
|
||||
"node_modules"
|
||||
]
|
||||
}
|
||||
|
|
@ -1,158 +0,0 @@
# Bridge Migrations

Database migration management for the CDR Link bridge system.

## Overview

Bridge Migrations handles database schema versioning and migrations for all bridge-related tables using the Kysely migration framework. It manages the database structure for authentication, messaging channels, webhooks, and settings.

## Features

- **Schema Versioning**: Track and apply database schema changes
- **Up/Down Migrations**: Support for rolling forward and backward
- **Type-Safe Migrations**: TypeScript-based migration files
- **Migration History**: Track applied migrations in the database
- **Multiple Migration Strategies**: Run all, run one, or rollback migrations

## Migration Files

Current migrations in order:

1. **0001-add-next-auth.ts** - NextAuth.js authentication tables
2. **0002-add-signal.ts** - Signal messenger integration
3. **0003-add-whatsapp.ts** - WhatsApp integration
4. **0004-add-voice.ts** - Voice/Twilio integration
5. **0005-add-facebook.ts** - Facebook Messenger integration
6. **0006-add-webhooks.ts** - Webhook configuration
7. **0007-add-settings.ts** - Application settings
8. **0008-add-user-role.ts** - User role management

## Development

### Prerequisites

- Node.js >= 20
- npm >= 10
- PostgreSQL database
- Database connection credentials

### Setup

```bash
# Install dependencies
npm install

# Run all pending migrations
npm run migrate:latest

# Check migration status
npm run migrate:list
```

### Environment Variables

Required environment variables:

- `DATABASE_URL` - PostgreSQL connection string
- `DATABASE_HOST` - Database host
- `DATABASE_NAME` - Database name
- `DATABASE_USER` - Database username
- `DATABASE_PASSWORD` - Database password

### Available Scripts

- `npm run migrate:latest` - Run all pending migrations
- `npm run migrate:up` - Run next pending migration
- `npm run migrate:down` - Rollback last migration
- `npm run migrate:up:all` - Run all migrations (alias)
- `npm run migrate:up:one` - Run one migration
- `npm run migrate:down:all` - Rollback all migrations
- `npm run migrate:down:one` - Rollback one migration
- `npm run migrate:list` - List migration status
- `npm run migrate:make <name>` - Create new migration file

## Creating New Migrations

To create a new migration:

```bash
npm run migrate:make add-new-feature
```

This creates a new timestamped migration file in the `migrations/` directory.

Example migration structure:

```typescript
import { Kysely } from 'kysely'

export async function up(db: Kysely<any>): Promise<void> {
  await db.schema
    .createTable('new_table')
    .addColumn('id', 'serial', (col) => col.primaryKey())
    .addColumn('name', 'varchar', (col) => col.notNull())
    .addColumn('created_at', 'timestamp', (col) =>
      col.defaultTo('now()').notNull()
    )
    .execute()
}

export async function down(db: Kysely<any>): Promise<void> {
  await db.schema.dropTable('new_table').execute()
}
```

## Database Schema

### Core Tables

- **users** - User accounts with roles
- **accounts** - OAuth account connections
- **sessions** - User sessions
- **verification_tokens** - Email verification

### Communication Tables

- **bots** - Bot configurations for each service
- **signal_messages** - Signal message history
- **whatsapp_messages** - WhatsApp message history
- **voice_messages** - Voice/call records
- **facebook_messages** - Facebook message history

### Configuration Tables

- **webhooks** - External webhook endpoints
- **settings** - Application settings

## Best Practices

1. **Test Migrations**: Always test migrations in development first
2. **Backup Database**: Create backups before running migrations in production
3. **Review Changes**: Review migration files before applying
4. **Atomic Operations**: Keep migrations focused and atomic
5. **Rollback Plan**: Ensure down() methods properly reverse changes

## Troubleshooting

### Common Issues

1. **Migration Failed**: Check error logs and database permissions
2. **Locked Migrations**: Check for concurrent migration processes
3. **Missing Tables**: Ensure all previous migrations have run
4. **Connection Issues**: Verify DATABASE_URL and network access

### Recovery

If migrations fail:

1. Check migration history table (see the sketch after this file)
2. Manually verify database state
3. Run specific migrations as needed
4. Use rollback if necessary

## Integration

Migrations are used by:
- **bridge-frontend** - Requires migrated schema
- **bridge-worker** - Depends on message tables
- **bridge-whatsapp** - Uses bot configuration tables

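As a companion to the Recovery checklist in the README above, here is a minimal sketch of step 1 ("check migration history table"). It assumes Kysely's default history table name (`kysely_migration`, with `name` and `timestamp` columns) and reuses the same database environment variables that `migrate.ts` below reads; the script itself is illustrative and not part of this diff.

```typescript
// Sketch of Recovery step 1 (not shipped code): list the migrations Kysely has
// already applied by reading its default history table, "kysely_migration".
import { Kysely, PostgresDialect } from "kysely";
import pkg from "pg";

const { Pool } = pkg;

interface MigrationRow {
  name: string;
  timestamp: string;
}

interface Database {
  kysely_migration: MigrationRow;
}

const listAppliedMigrations = async () => {
  const db = new Kysely<Database>({
    dialect: new PostgresDialect({
      pool: new Pool({
        host: process.env.DATABASE_HOST,
        database: process.env.DATABASE_NAME,
        port: Number.parseInt(process.env.DATABASE_PORT ?? "5432", 10),
        user: process.env.DATABASE_USER,
        password: process.env.DATABASE_PASSWORD,
      }),
    }),
  });

  // One row per applied migration, in the order they were run.
  const applied = await db
    .selectFrom("kysely_migration")
    .select(["name", "timestamp"])
    .orderBy("timestamp")
    .execute();

  console.table(applied);
  await db.destroy();
};

listAppliedMigrations();
```
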
@ -1,96 +0,0 @@
|
|||
import * as path from "path";
|
||||
import { fileURLToPath } from "url";
|
||||
import { promises as fs } from "fs";
|
||||
import {
|
||||
Kysely,
|
||||
Migrator,
|
||||
MigrationResult,
|
||||
FileMigrationProvider,
|
||||
PostgresDialect,
|
||||
CamelCasePlugin,
|
||||
} from "kysely";
|
||||
import pkg from "pg";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger('bridge-migrations-migrate');
|
||||
const { Pool } = pkg;
|
||||
import * as dotenv from "dotenv";
|
||||
|
||||
interface Database {}
|
||||
|
||||
export const migrate = async (arg: string) => {
|
||||
const __filename = fileURLToPath(import.meta.url);
|
||||
const __dirname = path.dirname(__filename);
|
||||
if (process.env.NODE_ENV !== "production") {
|
||||
dotenv.config({ path: path.join(__dirname, "../.env.local") });
|
||||
}
|
||||
const db = new Kysely<Database>({
|
||||
dialect: new PostgresDialect({
|
||||
pool: new Pool({
|
||||
host: process.env.DATABASE_HOST,
|
||||
database: process.env.DATABASE_NAME,
|
||||
port: parseInt(process.env.DATABASE_PORT!),
|
||||
user: process.env.DATABASE_USER,
|
||||
password: process.env.DATABASE_PASSWORD,
|
||||
}),
|
||||
}),
|
||||
plugins: [new CamelCasePlugin()],
|
||||
});
|
||||
const migrator = new Migrator({
|
||||
db,
|
||||
provider: new FileMigrationProvider({
|
||||
fs,
|
||||
path,
|
||||
migrationFolder: path.join(__dirname, "migrations"),
|
||||
}),
|
||||
});
|
||||
|
||||
let error: any = null;
|
||||
let results: MigrationResult[] = [];
|
||||
|
||||
if (arg === "up:all") {
|
||||
const out = await migrator.migrateToLatest();
|
||||
results = out.results ?? [];
|
||||
error = out.error;
|
||||
} else if (arg === "up:one") {
|
||||
const out = await migrator.migrateUp();
|
||||
results = out.results ?? [];
|
||||
error = out.error;
|
||||
} else if (arg === "down:all") {
|
||||
const migrations = await migrator.getMigrations();
|
||||
for (const _ of migrations) {
|
||||
const out = await migrator.migrateDown();
|
||||
if (out.results) {
|
||||
results = results.concat(out.results);
|
||||
error = out.error;
|
||||
}
|
||||
}
|
||||
} else if (arg === "down:one") {
|
||||
const out = await migrator.migrateDown();
|
||||
if (out.results) {
|
||||
results = out.results ?? [];
|
||||
error = out.error;
|
||||
}
|
||||
}
|
||||
|
||||
results?.forEach((it) => {
|
||||
if (it.status === "Success") {
|
||||
logger.info(
|
||||
`Migration "${it.migrationName} ${it.direction.toLowerCase()}" was executed successfully`,
|
||||
);
|
||||
} else if (it.status === "Error") {
|
||||
logger.error(`Failed to execute migration "${it.migrationName}"`);
|
||||
}
|
||||
});
|
||||
|
||||
if (error) {
|
||||
logger.error("Failed to migrate");
|
||||
logger.error(error);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
await db.destroy();
|
||||
};
|
||||
|
||||
const arg = process.argv.slice(2).pop();
|
||||
migrate(arg as string);
|
||||
|
|
@ -1,72 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("User")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("name", "text")
|
||||
.addColumn("email", "text", (col) => col.unique().notNull())
|
||||
.addColumn("emailVerified", "timestamptz")
|
||||
.addColumn("image", "text")
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createTable("Account")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("userId", "uuid", (col) =>
|
||||
col.references("User.id").onDelete("cascade").notNull(),
|
||||
)
|
||||
.addColumn("type", "text", (col) => col.notNull())
|
||||
.addColumn("provider", "text", (col) => col.notNull())
|
||||
.addColumn("providerAccountId", "text", (col) => col.notNull())
|
||||
.addColumn("refresh_token", "text")
|
||||
.addColumn("access_token", "text")
|
||||
.addColumn("expires_at", "bigint")
|
||||
.addColumn("token_type", "text")
|
||||
.addColumn("scope", "text")
|
||||
.addColumn("id_token", "text")
|
||||
.addColumn("session_state", "text")
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createTable("Session")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("userId", "uuid", (col) =>
|
||||
col.references("User.id").onDelete("cascade").notNull(),
|
||||
)
|
||||
.addColumn("sessionToken", "text", (col) => col.notNull().unique())
|
||||
.addColumn("expires", "timestamptz", (col) => col.notNull())
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createTable("VerificationToken")
|
||||
.addColumn("identifier", "text", (col) => col.notNull())
|
||||
.addColumn("token", "text", (col) => col.notNull().unique())
|
||||
.addColumn("expires", "timestamptz", (col) => col.notNull())
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("Account_userId_index")
|
||||
.on("Account")
|
||||
.column("userId")
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("Session_userId_index")
|
||||
.on("Session")
|
||||
.column("userId")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("Account").ifExists().execute();
|
||||
await db.schema.dropTable("Session").ifExists().execute();
|
||||
await db.schema.dropTable("User").ifExists().execute();
|
||||
await db.schema.dropTable("VerificationToken").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,33 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("SignalBot")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("phone_number", "text")
|
||||
.addColumn("token", "text", (col) => col.unique().notNull())
|
||||
.addColumn("user_id", "uuid")
|
||||
.addColumn("name", "text")
|
||||
.addColumn("description", "text")
|
||||
.addColumn("qr_code", "text")
|
||||
.addColumn("verified", "boolean", (col) => col.notNull().defaultTo(false))
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("SignalBotToken")
|
||||
.on("SignalBot")
|
||||
.column("token")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("SignalBot").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,33 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("WhatsappBot")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("phone_number", "text")
|
||||
.addColumn("token", "text", (col) => col.unique().notNull())
|
||||
.addColumn("user_id", "uuid")
|
||||
.addColumn("name", "text")
|
||||
.addColumn("description", "text")
|
||||
.addColumn("qr_code", "text")
|
||||
.addColumn("verified", "boolean", (col) => col.notNull().defaultTo(false))
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("WhatsappBotToken")
|
||||
.on("WhatsappBot")
|
||||
.column("token")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("WhatsappBot").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,77 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("VoiceProvider")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("kind", "text", (col) => col.notNull())
|
||||
.addColumn("name", "text", (col) => col.notNull())
|
||||
.addColumn("description", "text")
|
||||
.addColumn("credentials", "jsonb", (col) => col.notNull())
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("VoiceProviderName")
|
||||
.on("VoiceProvider")
|
||||
.column("name")
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createTable("VoiceLine")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("provider_id", "uuid", (col) =>
|
||||
col.notNull().references("VoiceProvider.id").onDelete("cascade"),
|
||||
)
|
||||
.addColumn("provider_line_sid", "text", (col) => col.notNull())
|
||||
.addColumn("number", "text", (col) => col.notNull())
|
||||
.addColumn("name", "text", (col) => col.notNull())
|
||||
.addColumn("description", "text")
|
||||
.addColumn("language", "text", (col) => col.notNull())
|
||||
.addColumn("voice", "text", (col) => col.notNull())
|
||||
.addColumn("prompt_text", "text")
|
||||
.addColumn("prompt_audio", "jsonb")
|
||||
.addColumn("audio_prompt_enabled", "boolean", (col) =>
|
||||
col.notNull().defaultTo(false),
|
||||
)
|
||||
.addColumn("audio_converted_at", "timestamptz")
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("VoiceLineProviderId")
|
||||
.on("VoiceLine")
|
||||
.column("provider_id")
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("VoiceLineProviderLineSid")
|
||||
.on("VoiceLine")
|
||||
.column("provider_line_sid")
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("VoiceLineNumber")
|
||||
.on("VoiceLine")
|
||||
.column("number")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("VoiceLine").ifExists().execute();
|
||||
await db.schema.dropTable("VoiceProvider").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,36 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("FacebookBot")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("name", "text")
|
||||
.addColumn("description", "text")
|
||||
.addColumn("token", "text")
|
||||
.addColumn("page_access_token", "text")
|
||||
.addColumn("app_secret", "text")
|
||||
.addColumn("verify_token", "text")
|
||||
.addColumn("page_id", "text")
|
||||
.addColumn("app_id", "text")
|
||||
.addColumn("user_id", "uuid")
|
||||
.addColumn("verified", "boolean", (col) => col.notNull().defaultTo(false))
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("FacebookBotToken")
|
||||
.on("FacebookBot")
|
||||
.column("token")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("FacebookBot").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,41 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("Webhook")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("name", "text", (col) => col.notNull())
|
||||
.addColumn("description", "text")
|
||||
.addColumn("backend_type", "text", (col) => col.notNull())
|
||||
.addColumn("backend_id", "uuid", (col) => col.notNull())
|
||||
.addColumn("endpoint_url", "text", (col) =>
|
||||
col.notNull().check(sql`endpoint_url ~ '^https?://[^/]+'`),
|
||||
)
|
||||
.addColumn("http_method", "text", (col) =>
|
||||
col
|
||||
.notNull()
|
||||
.defaultTo("post")
|
||||
.check(sql`http_method in ('post', 'put')`),
|
||||
)
|
||||
.addColumn("headers", "jsonb")
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("WebhookBackendTypeBackendId")
|
||||
.on("Webhook")
|
||||
.column("backend_type")
|
||||
.column("backend_id")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("Webhook").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,28 +0,0 @@
|
|||
import { Kysely, sql } from "kysely";
|
||||
|
||||
export async function up(db: Kysely<any>): Promise<void> {
|
||||
await db.schema
|
||||
.createTable("Setting")
|
||||
.addColumn("id", "uuid", (col) =>
|
||||
col.primaryKey().defaultTo(sql`gen_random_uuid()`),
|
||||
)
|
||||
.addColumn("name", "text")
|
||||
.addColumn("value", "jsonb")
|
||||
.addColumn("created_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.addColumn("updated_at", "timestamptz", (col) =>
|
||||
col.notNull().defaultTo(sql`now()`),
|
||||
)
|
||||
.execute();
|
||||
|
||||
await db.schema
|
||||
.createIndex("SettingName")
|
||||
.on("Setting")
|
||||
.column("name")
|
||||
.execute();
|
||||
}
|
||||
|
||||
export async function down(db: Kysely<any>): Promise<void> {
|
||||
await db.schema.dropTable("Setting").ifExists().execute();
|
||||
}
|
||||
|
|
@ -1,9 +0,0 @@
import { Kysely } from "kysely";

export async function up(db: Kysely<any>): Promise<void> {
  await db.schema.alterTable("User").addColumn("role", "text").execute();
}

export async function down(db: Kysely<any>): Promise<void> {
  await db.schema.alterTable("User").dropColumn("role").execute();
}

@ -1,25 +0,0 @@
{
  "name": "@link-stack/bridge-migrations",
  "version": "3.3.5",
  "type": "module",
  "scripts": {
    "migrate:up:all": "tsx migrate.ts up:all",
    "migrate:up:one": "tsx migrate.ts up:one",
    "migrate:down:all": "tsx migrate.ts down:all",
    "migrate:down:one": "tsx migrate.ts down:one"
  },
  "dependencies": {
    "@link-stack/logger": "workspace:*",
    "dotenv": "^17.2.3",
    "kysely": "0.27.5",
    "pg": "^8.16.3",
    "tsx": "^4.20.6"
  },
  "devDependencies": {
    "@types/node": "^24",
    "@types/pg": "^8.15.5",
    "@link-stack/eslint-config": "workspace:*",
    "@link-stack/typescript-config": "workspace:*",
    "typescript": "^5"
  }
}

@ -1,7 +1,7 @@
 FROM node:22-bookworm-slim AS base

 FROM base AS builder
-ARG APP_DIR=/opt/bridge-frontend
+ARG APP_DIR=/opt/bridge-signal
 ENV PNPM_HOME="/pnpm"
 ENV PATH="$PNPM_HOME:$PATH"
 RUN mkdir -p ${APP_DIR}/

@ -9,46 +9,47 @@ RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
|
|||
RUN pnpm add -g turbo
|
||||
WORKDIR ${APP_DIR}
|
||||
COPY . .
|
||||
RUN turbo prune --scope=@link-stack/bridge-frontend --scope=@link-stack/bridge-migrations --docker
|
||||
RUN turbo prune --scope=@link-stack/bridge-signal --docker
|
||||
|
||||
FROM base AS installer
|
||||
ARG APP_DIR=/opt/bridge-frontend
|
||||
ARG APP_DIR=/opt/bridge-signal
|
||||
ENV PNPM_HOME="/pnpm"
|
||||
ENV PATH="$PNPM_HOME:$PATH"
|
||||
WORKDIR ${APP_DIR}
|
||||
RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
|
||||
COPY --from=builder ${APP_DIR}/.gitignore .gitignore
|
||||
COPY --from=builder ${APP_DIR}/out/json/ .
|
||||
COPY --from=builder ${APP_DIR}/out/full/ .
|
||||
COPY --from=builder ${APP_DIR}/out/pnpm-lock.yaml ./pnpm-lock.yaml
|
||||
RUN pnpm install --frozen-lockfile
|
||||
|
||||
COPY --from=builder ${APP_DIR}/out/full/ .
|
||||
RUN pnpm add -g turbo
|
||||
RUN turbo run build --filter=@link-stack/bridge-frontend --filter=@link-stack/bridge-migrations
|
||||
RUN turbo run build --filter=@link-stack/bridge-signal
|
||||
|
||||
FROM base AS runner
|
||||
ARG APP_DIR=/opt/bridge-frontend
|
||||
WORKDIR ${APP_DIR}/
|
||||
FROM base as runner
|
||||
ARG BUILD_DATE
|
||||
ARG VERSION
|
||||
LABEL maintainer="Darren Clarke <darren@redaranj.com>"
|
||||
LABEL org.label-schema.build-date=$BUILD_DATE
|
||||
LABEL org.label-schema.version=$VERSION
|
||||
ENV APP_DIR ${APP_DIR}
|
||||
ARG APP_DIR=/opt/bridge-signal
|
||||
ARG SIGNAL_CLI_VERSION=0.13.12
|
||||
ENV PNPM_HOME="/pnpm"
|
||||
ENV PATH="$PNPM_HOME:$PATH"
|
||||
RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
|
||||
RUN mkdir -p ${APP_DIR}/
|
||||
RUN DEBIAN_FRONTEND=noninteractive apt-get update && \
|
||||
apt-get install -y --no-install-recommends \
|
||||
dumb-init
|
||||
RUN mkdir -p ${APP_DIR}
|
||||
dumb-init curl ca-certificates && \
|
||||
curl -L "https://github.com/AsamK/signal-cli/releases/download/v${SIGNAL_CLI_VERSION}/signal-cli-${SIGNAL_CLI_VERSION}-Linux-native.tar.gz" \
|
||||
| tar xz -C /usr/local/bin && \
|
||||
chmod +x /usr/local/bin/signal-cli && \
|
||||
apt-get remove -y curl && apt-get autoremove -y && rm -rf /var/lib/apt/lists/*
|
||||
RUN corepack enable && corepack prepare pnpm@9.15.4 --activate
|
||||
WORKDIR ${APP_DIR}
|
||||
COPY --from=installer ${APP_DIR} ./
|
||||
RUN chown -R node:node ${APP_DIR}/
|
||||
WORKDIR ${APP_DIR}/apps/bridge-frontend/
|
||||
RUN chown -R node:node ${APP_DIR}
|
||||
WORKDIR ${APP_DIR}/apps/bridge-signal/
|
||||
RUN chmod +x docker-entrypoint.sh
|
||||
USER node
|
||||
EXPOSE 3000
|
||||
ENV PORT 3000
|
||||
RUN mkdir /home/node/signal-data
|
||||
EXPOSE 5002
|
||||
ENV PORT 5002
|
||||
ENV NODE_ENV production
|
||||
ENTRYPOINT ["/opt/bridge-frontend/apps/bridge-frontend/docker-entrypoint.sh"]
|
||||
ENV SIGNAL_DATA_DIR /home/node/signal-data
|
||||
ENV COREPACK_ENABLE_NETWORK=0
|
||||
ENTRYPOINT ["/opt/bridge-signal/apps/bridge-signal/docker-entrypoint.sh"]
|
||||
|
|
@ -1,5 +1,5 @@
 #!/bin/bash

 set -e
-echo "starting bridge-worker"
+echo "starting bridge-signal"
 exec dumb-init pnpm run start

3
apps/bridge-signal/eslint.config.mjs
Normal file

@ -0,0 +1,3 @@
import config from "@link-stack/eslint-config/node";

export default config;

32
apps/bridge-signal/package.json
Normal file
|
|
@ -0,0 +1,32 @@
|
|||
{
|
||||
"name": "@link-stack/bridge-signal",
|
||||
"version": "3.5.0-beta.1",
|
||||
"main": "build/main/index.js",
|
||||
"author": "Darren Clarke <darren@redaranj.com>",
|
||||
"license": "AGPL-3.0-or-later",
|
||||
"prettier": "@link-stack/prettier-config",
|
||||
"dependencies": {
|
||||
"@hono/node-server": "^1.13.8",
|
||||
"hono": "^4.7.4",
|
||||
"@link-stack/logger": "workspace:*"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@link-stack/eslint-config": "workspace:*",
|
||||
"@link-stack/prettier-config": "workspace:*",
|
||||
"@link-stack/typescript-config": "workspace:*",
|
||||
"@types/node": "*",
|
||||
"dotenv-cli": "^10.0.0",
|
||||
"eslint": "^9.23.0",
|
||||
"prettier": "^3.5.3",
|
||||
"tsx": "^4.20.6",
|
||||
"typescript": "^5.9.3"
|
||||
},
|
||||
"scripts": {
|
||||
"build": "tsc -p tsconfig.json",
|
||||
"dev": "dotenv -- tsx src/index.ts",
|
||||
"start": "node build/main/index.js",
|
||||
"lint": "eslint src/",
|
||||
"format": "prettier --write src/",
|
||||
"format:check": "prettier --check src/"
|
||||
}
|
||||
}
|
||||
35
apps/bridge-signal/src/attachments.ts
Normal file
|
|
@ -0,0 +1,35 @@
|
|||
/**
|
||||
* Attachment size configuration for messaging channels
|
||||
*
|
||||
* Environment variables:
|
||||
* - BRIDGE_MAX_ATTACHMENT_SIZE_MB: Maximum size for a single attachment in MB (default: 50)
|
||||
*/
|
||||
|
||||
/**
|
||||
* Get the maximum attachment size in bytes from environment variable
|
||||
* Defaults to 50MB if not set
|
||||
*/
|
||||
export function getMaxAttachmentSize(): number {
|
||||
const envValue = process.env.BRIDGE_MAX_ATTACHMENT_SIZE_MB;
|
||||
const sizeInMB = envValue ? Number.parseInt(envValue, 10) : 50;
|
||||
|
||||
if (Number.isNaN(sizeInMB) || sizeInMB <= 0) {
|
||||
console.warn(`Invalid BRIDGE_MAX_ATTACHMENT_SIZE_MB value: ${envValue}, using default 50MB`);
|
||||
return 50 * 1024 * 1024;
|
||||
}
|
||||
|
||||
return sizeInMB * 1024 * 1024;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the maximum total size for all attachments in a message
|
||||
* This is 4x the single attachment size
|
||||
*/
|
||||
export function getMaxTotalAttachmentSize(): number {
|
||||
return getMaxAttachmentSize() * 4;
|
||||
}
|
||||
|
||||
/**
|
||||
* Maximum number of attachments per message
|
||||
*/
|
||||
export const MAX_ATTACHMENTS = 10;
|
||||
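The doc comments in attachments.ts describe the limits individually but never show them combined. The following is a minimal sketch, mirroring the filtering loop in `service.ts` further below, of how the helpers are intended to be used together: drop single files over the per-attachment limit, stop once the running total exceeds four times that limit, and never accept more than `MAX_ATTACHMENTS` files. The `Attachment` shape is copied from `service.ts`; `filterAttachments` itself is hypothetical and not part of this diff.

```typescript
// Sketch only: filter a list of base64 attachments using the limits above.
import { getMaxAttachmentSize, getMaxTotalAttachmentSize, MAX_ATTACHMENTS } from "./attachments.ts";

interface Attachment {
  data: string; // base64
  filename: string;
  mime_type: string;
}

export function filterAttachments(attachments: Attachment[]): Attachment[] {
  const maxSingle = getMaxAttachmentSize();
  const maxTotal = getMaxTotalAttachmentSize();
  const kept: Attachment[] = [];
  let total = 0;

  for (const att of attachments.slice(0, MAX_ATTACHMENTS)) {
    // Rough decoded size of a base64 payload: 3 bytes per 4 characters.
    const estimatedSize = (att.data.length * 3) / 4;
    if (estimatedSize > maxSingle) continue; // skip a single oversized file
    if (total + estimatedSize > maxTotal) break; // stop once the budget is spent
    total += estimatedSize;
    kept.push(att);
  }

  return kept;
}
```
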
33
apps/bridge-signal/src/index.ts
Normal file
|
|
@ -0,0 +1,33 @@
|
|||
import { serve } from "@hono/node-server";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
import { createRoutes } from "./routes.ts";
|
||||
import SignalService from "./service.ts";
|
||||
|
||||
const logger = createLogger("bridge-signal-index");
|
||||
|
||||
const main = async () => {
|
||||
const service = new SignalService();
|
||||
await service.initialize();
|
||||
|
||||
const app = createRoutes(service);
|
||||
const port = Number.parseInt(process.env.PORT || "5002", 10);
|
||||
|
||||
serve({ fetch: app.fetch, port }, (info) => {
|
||||
logger.info({ port: info.port }, "bridge-signal listening");
|
||||
});
|
||||
|
||||
const shutdown = async () => {
|
||||
logger.info("Shutting down...");
|
||||
await service.teardown();
|
||||
process.exit(0);
|
||||
};
|
||||
|
||||
process.on("SIGTERM", shutdown);
|
||||
process.on("SIGINT", shutdown);
|
||||
};
|
||||
|
||||
main().catch((error) => {
|
||||
logger.error(error);
|
||||
process.exit(1);
|
||||
});
|
||||
130
apps/bridge-signal/src/routes.ts
Normal file
|
|
@ -0,0 +1,130 @@
|
|||
import { createLogger } from "@link-stack/logger";
|
||||
import { Hono } from "hono";
|
||||
|
||||
import type SignalService from "./service.ts";
|
||||
|
||||
const logger = createLogger("bridge-signal-routes");
|
||||
|
||||
const errorMessage = (error: unknown): string => (error instanceof Error ? error.message : String(error));
|
||||
|
||||
export function createRoutes(service: SignalService): Hono {
|
||||
const app = new Hono();
|
||||
|
||||
// Start device linking
|
||||
app.post("/api/bots/:id/register", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
const { phoneNumber, deviceName } = await c.req.json<{
|
||||
phoneNumber: string;
|
||||
deviceName?: string;
|
||||
}>();
|
||||
|
||||
try {
|
||||
const result = await service.register(id, phoneNumber, deviceName);
|
||||
logger.info({ id }, "Device linking started");
|
||||
return c.json(result);
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to start device linking");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Bot status
|
||||
app.get("/api/bots/:id", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
try {
|
||||
return c.json(await service.getBot(id));
|
||||
} catch (error) {
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Send message
|
||||
app.post("/api/bots/:id/send", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
const { recipient, message, attachments, autoGroup } = await c.req.json<{
|
||||
recipient: string;
|
||||
message: string;
|
||||
attachments?: Array<{ data: string; filename: string; mime_type: string }>;
|
||||
autoGroup?: { ticketNumber: string };
|
||||
}>();
|
||||
|
||||
try {
|
||||
const result = await service.send(id, recipient, message, attachments, autoGroup);
|
||||
logger.info({ id, recipient: result.recipient, attachmentCount: attachments?.length || 0 }, "Sent message");
|
||||
return c.json({ result });
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to send message");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Unregister bot
|
||||
app.post("/api/bots/:id/unregister", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
try {
|
||||
await service.unregister(id);
|
||||
logger.info({ id }, "Bot unregistered");
|
||||
return c.body(null, 200);
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to unregister bot");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Create group
|
||||
app.post("/api/bots/:id/groups", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
const { name, members, description } = await c.req.json<{
|
||||
name: string;
|
||||
members: string[];
|
||||
description?: string;
|
||||
}>();
|
||||
|
||||
try {
|
||||
const result = await service.createGroup(id, name, members, description);
|
||||
logger.info({ id, groupId: result.groupId }, "Group created");
|
||||
return c.json(result);
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to create group");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Update group
|
||||
app.put("/api/bots/:id/groups/:groupId", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
const groupId = c.req.param("groupId");
|
||||
const { name, description } = await c.req.json<{
|
||||
name?: string;
|
||||
description?: string;
|
||||
}>();
|
||||
|
||||
try {
|
||||
await service.updateGroup(id, groupId, name, description);
|
||||
logger.info({ id, groupId }, "Group updated");
|
||||
return c.json({ success: true });
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to update group");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// List groups
|
||||
app.get("/api/bots/:id/groups", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
try {
|
||||
const groups = await service.listGroups(id);
|
||||
return c.json(groups);
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to list groups");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
// Health check
|
||||
app.get("/api/health", (c) => {
|
||||
return c.json({ status: "ok" });
|
||||
});
|
||||
|
||||
return app;
|
||||
}
|
||||
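For reference, a minimal sketch of how a caller could exercise the send endpoint defined above. The base URL and bot id are placeholders (the port assumes the service default of 5002 from `index.ts`), and `BRIDGE_SIGNAL_URL` is an assumed convenience variable; the request and response shapes follow what the route destructures and returns.

```typescript
// Hypothetical client call against POST /api/bots/:id/send (sketch, not shipped code).
const BASE_URL = process.env.BRIDGE_SIGNAL_URL ?? "http://localhost:5002"; // assumption
const botId = "00000000-0000-0000-0000-000000000000"; // placeholder bot id

const response = await fetch(`${BASE_URL}/api/bots/${botId}/send`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    recipient: "+15555550123", // placeholder phone number
    message: "Hello from the bridge",
    autoGroup: { ticketNumber: "20240001" }, // optional, enables auto-group creation
  }),
});

if (!response.ok) {
  // Error responses carry { error } as returned by the route handler.
  const { error } = (await response.json()) as { error: string };
  throw new Error(`send failed: ${error}`);
}

// Success responses wrap the SendResult in { result }.
const { result } = (await response.json()) as {
  result: { recipient: string; timestamp: number; source: string; groupId?: string };
};
console.log("delivered to", result.recipient, "at", result.timestamp);
```
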
462
apps/bridge-signal/src/service.ts
Normal file
|
|
@ -0,0 +1,462 @@
|
|||
import fs from "node:fs";
|
||||
import path from "node:path";
|
||||
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
import { getMaxAttachmentSize, getMaxTotalAttachmentSize, MAX_ATTACHMENTS } from "./attachments.ts";
|
||||
import { SignalCli, SignalEnvelope } from "./signal-cli.ts";
|
||||
|
||||
const logger = createLogger("bridge-signal-service");
|
||||
|
||||
interface BotMapping {
|
||||
[botId: string]: {
|
||||
phoneNumber: string;
|
||||
webhookToken?: string;
|
||||
};
|
||||
}
|
||||
|
||||
interface Attachment {
|
||||
data: string; // base64
|
||||
filename: string;
|
||||
mime_type: string;
|
||||
}
|
||||
|
||||
interface SendResult {
|
||||
recipient: string;
|
||||
timestamp: number;
|
||||
source: string;
|
||||
groupId?: string;
|
||||
}
|
||||
|
||||
export default class SignalService {
|
||||
private cli: SignalCli | null = null;
|
||||
private botMapping: BotMapping = {};
|
||||
private dataDir: string;
|
||||
private mappingFile: string;
|
||||
private autoGroupsEnabled: boolean;
|
||||
|
||||
constructor() {
|
||||
this.dataDir = process.env.SIGNAL_DATA_DIR || "/home/node/signal-data";
|
||||
this.mappingFile = path.join(this.dataDir, "bot-mapping.json");
|
||||
this.autoGroupsEnabled = process.env.BRIDGE_SIGNAL_AUTO_GROUPS?.toLowerCase() === "true";
|
||||
}
|
||||
|
||||
async initialize(): Promise<void> {
|
||||
// Ensure data directory exists
|
||||
if (!fs.existsSync(this.dataDir)) {
|
||||
fs.mkdirSync(this.dataDir, { recursive: true });
|
||||
}
|
||||
|
||||
// Load bot mapping
|
||||
this.loadBotMapping();
|
||||
|
||||
// Start signal-cli
|
||||
this.cli = new SignalCli(this.dataDir);
|
||||
await this.cli.start();
|
||||
|
||||
// Register message listener
|
||||
this.cli.on("message", ({ account, envelope }: { account: string; envelope: SignalEnvelope }) => {
|
||||
this.handleIncomingMessage(account, envelope).catch((error) => {
|
||||
logger.error({ err: error, account }, "Error handling incoming message");
|
||||
});
|
||||
});
|
||||
|
||||
this.cli.on("close", (code: number | null) => {
|
||||
logger.warn({ code }, "signal-cli process closed unexpectedly");
|
||||
});
|
||||
|
||||
this.cli.on("error", (err: Error) => {
|
||||
logger.error({ err }, "signal-cli process error");
|
||||
});
|
||||
|
||||
logger.info({ dataDir: this.dataDir, botCount: Object.keys(this.botMapping).length }, "SignalService initialized");
|
||||
}
|
||||
|
||||
async teardown(): Promise<void> {
|
||||
if (this.cli) {
|
||||
this.cli.close();
|
||||
this.cli = null;
|
||||
}
|
||||
}
|
||||
|
||||
// --- Bot management ---
|
||||
|
||||
async register(botId: string, phoneNumber: string, deviceName = "Zammad"): Promise<{ linkUri: string }> {
|
||||
this.validateBotId(botId);
|
||||
if (!this.cli) throw new Error("SignalService not initialized");
|
||||
|
||||
logger.info({ botId, phoneNumber }, "Starting device linking");
|
||||
|
||||
const result = (await this.cli.call("startLink")) as Record<string, unknown> | undefined;
|
||||
const linkUri = result?.deviceLinkUri as string;
|
||||
if (!linkUri) {
|
||||
throw new Error("signal-cli startLink did not return a deviceLinkUri");
|
||||
}
|
||||
|
||||
// Finish linking in the background
|
||||
(async () => {
|
||||
try {
|
||||
const finishResult = await this.cli!.call("finishLink", {
|
||||
deviceLinkUri: linkUri,
|
||||
deviceName,
|
||||
});
|
||||
const linkedNumber = (finishResult as string) || phoneNumber;
|
||||
|
||||
this.botMapping[botId] = { phoneNumber: linkedNumber };
|
||||
this.saveBotMapping();
|
||||
logger.info({ botId, phoneNumber: linkedNumber }, "Device linking completed");
|
||||
} catch (error) {
|
||||
logger.error({ err: error, botId }, "Device linking failed");
|
||||
}
|
||||
})();
|
||||
|
||||
return { linkUri };
|
||||
}
|
||||
|
||||
async getBot(botId: string): Promise<{ registered: boolean; phoneNumber: string | null }> {
|
||||
this.validateBotId(botId);
|
||||
|
||||
const mapping = this.botMapping[botId];
|
||||
if (!mapping) {
|
||||
return { registered: false, phoneNumber: null };
|
||||
}
|
||||
|
||||
return { registered: true, phoneNumber: mapping.phoneNumber };
|
||||
}
|
||||
|
||||
async unregister(botId: string): Promise<void> {
|
||||
this.validateBotId(botId);
|
||||
|
||||
const mapping = this.botMapping[botId];
|
||||
if (!mapping) {
|
||||
logger.warn({ botId }, "Bot not found for unregister");
|
||||
return;
|
||||
}
|
||||
|
||||
delete this.botMapping[botId];
|
||||
this.saveBotMapping();
|
||||
logger.info({ botId }, "Bot unregistered");
|
||||
}
|
||||
|
||||
// --- Messaging ---
|
||||
|
||||
async send(
|
||||
botId: string,
|
||||
recipient: string,
|
||||
message: string,
|
||||
attachments?: Attachment[],
|
||||
autoGroup?: { ticketNumber: string }
|
||||
): Promise<SendResult> {
|
||||
this.validateBotId(botId);
|
||||
if (!this.cli) throw new Error("SignalService not initialized");
|
||||
|
||||
const mapping = this.botMapping[botId];
|
||||
if (!mapping) throw new Error(`Bot ${botId} is not registered`);
|
||||
|
||||
const account = mapping.phoneNumber;
|
||||
let finalRecipient = recipient;
|
||||
let groupId: string | undefined;
|
||||
|
||||
// Auto-group: create a group if enabled and recipient is a phone number
|
||||
if (this.autoGroupsEnabled && autoGroup && !recipient.startsWith("group.")) {
|
||||
try {
|
||||
const groupName = `Support Request: ${autoGroup.ticketNumber}`;
|
||||
logger.info({ botId, groupName, recipient }, "Creating auto-group");
|
||||
|
||||
const createResult = (await this.cli.call("updateGroup", {
|
||||
account,
|
||||
name: groupName,
|
||||
members: [recipient],
|
||||
description: "Private support conversation",
|
||||
})) as Record<string, unknown> | undefined;
|
||||
|
||||
if (createResult?.groupId) {
|
||||
groupId = createResult.groupId as string;
|
||||
finalRecipient = groupId;
|
||||
logger.info({ botId, groupId, groupName }, "Auto-group created");
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error({ err: error, botId }, "Failed to create auto-group, sending to original recipient");
|
||||
}
|
||||
}
|
||||
|
||||
// Build base64 attachments
|
||||
const base64Attachments: string[] = [];
|
||||
if (attachments && attachments.length > 0) {
|
||||
const MAX_SIZE = getMaxAttachmentSize();
|
||||
const MAX_TOTAL = getMaxTotalAttachmentSize();
|
||||
|
||||
if (attachments.length > MAX_ATTACHMENTS) {
|
||||
logger.warn({ count: attachments.length, max: MAX_ATTACHMENTS }, "Too many attachments, truncating");
|
||||
attachments = attachments.slice(0, MAX_ATTACHMENTS);
|
||||
}
|
||||
|
||||
let totalSize = 0;
|
||||
for (const att of attachments) {
|
||||
const estimatedSize = (att.data.length * 3) / 4;
|
||||
if (estimatedSize > MAX_SIZE) {
|
||||
logger.warn({ filename: att.filename, size: estimatedSize }, "Attachment too large, skipping");
|
||||
continue;
|
||||
}
|
||||
totalSize += estimatedSize;
|
||||
if (totalSize > MAX_TOTAL) {
|
||||
logger.warn({ totalSize }, "Total attachment size exceeded, skipping remaining");
|
||||
break;
|
||||
}
|
||||
base64Attachments.push(att.data);
|
||||
}
|
||||
}
|
||||
|
||||
// Send the message
|
||||
const isGroup = finalRecipient.startsWith("group.");
|
||||
const sendParams: Record<string, unknown> = {
|
||||
account,
|
||||
message,
|
||||
};
|
||||
|
||||
if (isGroup) {
|
||||
sendParams.groupId = finalRecipient;
|
||||
} else {
|
||||
sendParams.recipients = [finalRecipient];
|
||||
}
|
||||
|
||||
if (base64Attachments.length > 0) {
|
||||
sendParams.base64Attachments = base64Attachments;
|
||||
}
|
||||
|
||||
const result = (await this.cli.call("send", sendParams)) as Record<string, unknown> | undefined;
|
||||
const timestamp = (result?.timestamp as number) || Date.now();
|
||||
|
||||
return {
|
||||
recipient: finalRecipient,
|
||||
timestamp,
|
||||
source: account,
|
||||
groupId,
|
||||
};
|
||||
}
|
||||
|
||||
// --- Groups ---
|
||||
|
||||
async createGroup(
|
||||
botId: string,
|
||||
name: string,
|
||||
members: string[],
|
||||
description?: string
|
||||
): Promise<{ groupId: string }> {
|
||||
this.validateBotId(botId);
|
||||
if (!this.cli) throw new Error("SignalService not initialized");
|
||||
|
||||
const mapping = this.botMapping[botId];
|
||||
if (!mapping) throw new Error(`Bot ${botId} is not registered`);
|
||||
|
||||
const params: Record<string, unknown> = {
|
||||
account: mapping.phoneNumber,
|
||||
name,
|
||||
members,
|
||||
};
|
||||
if (description) params.description = description;
|
||||
|
||||
const result = (await this.cli.call("updateGroup", params)) as Record<string, unknown> | undefined;
|
||||
return { groupId: (result?.groupId as string) || String(result) };
|
||||
}
|
||||
|
||||
async updateGroup(botId: string, groupId: string, name?: string, description?: string): Promise<void> {
|
||||
this.validateBotId(botId);
|
||||
if (!this.cli) throw new Error("SignalService not initialized");
|
||||
|
||||
const mapping = this.botMapping[botId];
|
||||
if (!mapping) throw new Error(`Bot ${botId} is not registered`);
|
||||
|
||||
const params: Record<string, unknown> = {
|
||||
account: mapping.phoneNumber,
|
||||
groupId,
|
||||
};
|
||||
if (name) params.name = name;
|
||||
if (description) params.description = description;
|
||||
|
||||
await this.cli.call("updateGroup", params);
|
||||
}
|
||||
|
||||
async listGroups(botId: string): Promise<unknown[]> {
|
||||
this.validateBotId(botId);
|
||||
if (!this.cli) throw new Error("SignalService not initialized");
|
||||
|
||||
const mapping = this.botMapping[botId];
|
||||
if (!mapping) throw new Error(`Bot ${botId} is not registered`);
|
||||
|
||||
const result = await this.cli.call("listGroups", { account: mapping.phoneNumber });
|
||||
return Array.isArray(result) ? result : [];
|
||||
}
|
||||
|
||||
// --- Incoming message handler ---
|
||||
|
||||
private async handleIncomingMessage(account: string, envelope: SignalEnvelope): Promise<void> {
|
||||
// Find botId for this account
|
||||
const botId = this.findBotIdByAccount(account);
|
||||
if (!botId) {
|
||||
logger.debug({ account }, "No bot mapping for account, ignoring message");
|
||||
return;
|
||||
}
|
||||
|
||||
const source = envelope.sourceNumber || envelope.source;
|
||||
const sourceUuid = envelope.sourceUuid;
|
||||
|
||||
// Skip messages from self
|
||||
if (source === account) {
|
||||
return;
|
||||
}
|
||||
|
||||
const dataMessage = envelope.dataMessage;
|
||||
if (!dataMessage) {
|
||||
// Could be typing indicator, receipt, etc. -- ignore
|
||||
return;
|
||||
}
|
||||
|
||||
// Check for group info
|
||||
const isGroup = !!dataMessage.groupInfo?.groupId;
|
||||
const groupId = dataMessage.groupInfo?.groupId;
|
||||
const groupType = dataMessage.groupInfo?.type;
|
||||
|
||||
// Detect group join events
|
||||
if (
|
||||
isGroup &&
|
||||
groupType &&
|
||||
["DELIVER", "UPDATE"].includes(groupType) && // Group update events (member joins) -- forward to Zammad
|
||||
groupType === "UPDATE" &&
|
||||
source
|
||||
) {
|
||||
await this.postGroupMemberJoined(botId, groupId!, source);
|
||||
}
|
||||
|
||||
// Process data messages with content
|
||||
const messageText = dataMessage.message;
|
||||
if (!messageText && (!dataMessage.attachments || dataMessage.attachments.length === 0)) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Handle attachments
|
||||
let attachment: string | undefined;
|
||||
let filename: string | undefined;
|
||||
let mimeType: string | undefined;
|
||||
|
||||
if (dataMessage.attachments && dataMessage.attachments.length > 0) {
|
||||
const att = dataMessage.attachments[0];
|
||||
const storedFile = att.storedFilename;
|
||||
if (storedFile) {
|
||||
const filePath = path.join(this.dataDir, "attachments", storedFile);
|
||||
try {
|
||||
if (fs.existsSync(filePath)) {
|
||||
const fileData = fs.readFileSync(filePath);
|
||||
attachment = fileData.toString("base64");
|
||||
filename = att.filename || storedFile;
|
||||
mimeType = att.contentType || "application/octet-stream";
|
||||
logger.info({ filename, mimeType, size: fileData.length }, "Attachment found");
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error({ err: error, filePath }, "Failed to read attachment file");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const messageId = `${source}@${dataMessage.timestamp || envelope.timestamp}`;
|
||||
const sentAt = dataMessage.timestamp || envelope.timestamp;
|
||||
|
||||
const payload: Record<string, unknown> = {
|
||||
from: source,
|
||||
to: isGroup ? groupId : account,
|
||||
user_id: sourceUuid,
|
||||
message: messageText || "",
|
||||
message_id: messageId,
|
||||
sent_at: sentAt ? new Date(sentAt).toISOString() : new Date().toISOString(),
|
||||
is_group: isGroup,
|
||||
};
|
||||
|
||||
if (attachment) {
|
||||
payload.attachment = attachment;
|
||||
payload.filename = filename;
|
||||
payload.mime_type = mimeType;
|
||||
}
|
||||
|
||||
// POST to Zammad webhook
|
||||
const zammadUrl = process.env.ZAMMAD_URL || "http://zammad-nginx:8080";
|
||||
try {
|
||||
const response = await fetch(`${zammadUrl}/api/v1/channels_cdr_signal_bot_webhook/${botId}`, {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify(payload),
|
||||
});
|
||||
|
||||
if (response.ok) {
|
||||
logger.info({ botId, messageId }, "Message forwarded to Zammad");
|
||||
} else {
|
||||
const errorText = await response.text();
|
||||
logger.error({ status: response.status, error: errorText, botId }, "Failed to send message to Zammad");
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error({ err: error, botId }, "Failed to POST to Zammad webhook");
|
||||
}
|
||||
}
|
||||
|
||||
private async postGroupMemberJoined(botId: string, groupId: string, memberPhone: string): Promise<void> {
|
||||
const zammadUrl = process.env.ZAMMAD_URL || "http://zammad-nginx:8080";
|
||||
const payload = {
|
||||
event: "group_member_joined",
|
||||
group_id: groupId,
|
||||
member_phone: memberPhone,
|
||||
timestamp: new Date().toISOString(),
|
||||
};
|
||||
|
||||
try {
|
||||
const response = await fetch(`${zammadUrl}/api/v1/channels_cdr_signal_bot_webhook/${botId}`, {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify(payload),
|
||||
});
|
||||
|
||||
if (response.ok) {
|
||||
logger.info({ botId, groupId, memberPhone }, "Group member join notification sent to Zammad");
|
||||
} else {
|
||||
logger.error({ status: response.status, botId, groupId }, "Failed to notify Zammad of group member join");
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error({ err: error, botId }, "Failed to POST group_member_joined to Zammad");
|
||||
}
|
||||
}
|
||||
|
||||
private findBotIdByAccount(account: string): string | undefined {
|
||||
for (const [botId, mapping] of Object.entries(this.botMapping)) {
|
||||
if (mapping.phoneNumber === account) {
|
||||
return botId;
|
||||
}
|
||||
}
|
||||
return undefined;
|
||||
}
|
||||
|
||||
private validateBotId(botId: string): void {
|
||||
if (!botId || !/^[a-zA-Z0-9_-]+$/.test(botId)) {
|
||||
throw new Error(`Invalid bot ID: ${botId}`);
|
||||
}
|
||||
}
|
||||
|
||||
private loadBotMapping(): void {
|
||||
try {
|
||||
if (fs.existsSync(this.mappingFile)) {
|
||||
const data = fs.readFileSync(this.mappingFile, "utf8");
|
||||
this.botMapping = JSON.parse(data);
|
||||
logger.info({ count: Object.keys(this.botMapping).length }, "Loaded bot mapping");
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error({ err: error }, "Failed to load bot mapping, starting fresh");
|
||||
this.botMapping = {};
|
||||
}
|
||||
}
|
||||
|
||||
private saveBotMapping(): void {
|
||||
try {
|
||||
fs.writeFileSync(this.mappingFile, JSON.stringify(this.botMapping, null, 2));
|
||||
logger.debug("Saved bot mapping");
|
||||
} catch (error) {
|
||||
logger.error({ err: error }, "Failed to save bot mapping");
|
||||
}
|
||||
}
|
||||
}
|
||||
247
apps/bridge-signal/src/signal-cli.ts
Normal file
247
apps/bridge-signal/src/signal-cli.ts
Normal file
|
|
@ -0,0 +1,247 @@
|
|||
import { ChildProcess, spawn } from "node:child_process";
|
||||
import { EventEmitter } from "node:events";
|
||||
import { createInterface, Interface } from "node:readline";
|
||||
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger("bridge-signal-cli");
|
||||
|
||||
interface PendingRequest {
|
||||
resolve: (value: unknown) => void;
|
||||
reject: (reason: unknown) => void;
|
||||
method: string;
|
||||
timer: ReturnType<typeof setTimeout>;
|
||||
}
|
||||
|
||||
interface JsonRpcRequest {
|
||||
jsonrpc: "2.0";
|
||||
method: string;
|
||||
params?: Record<string, unknown>;
|
||||
id: string;
|
||||
}
|
||||
|
||||
interface JsonRpcResponse {
|
||||
jsonrpc: "2.0";
|
||||
id?: string;
|
||||
result?: unknown;
|
||||
error?: { code: number; message: string; data?: unknown };
|
||||
method?: string;
|
||||
params?: Record<string, unknown>;
|
||||
}
|
||||
|
||||
export interface SignalEnvelope {
|
||||
source?: string;
|
||||
sourceNumber?: string;
|
||||
sourceUuid?: string;
|
||||
sourceName?: string;
|
||||
timestamp?: number;
|
||||
dataMessage?: {
|
||||
timestamp?: number;
|
||||
message?: string;
|
||||
groupInfo?: {
|
||||
groupId?: string;
|
||||
type?: string;
|
||||
};
|
||||
attachments?: Array<{
|
||||
id?: string;
|
||||
contentType?: string;
|
||||
filename?: string;
|
||||
size?: number;
|
||||
storedFilename?: string;
|
||||
}>;
|
||||
};
|
||||
syncMessage?: {
|
||||
sentMessage?: {
|
||||
destination?: string;
|
||||
destinationNumber?: string;
|
||||
timestamp?: number;
|
||||
message?: string;
|
||||
groupInfo?: { groupId?: string };
|
||||
};
|
||||
};
|
||||
typingMessage?: unknown;
|
||||
receiptMessage?: unknown;
|
||||
}
|
||||
|
||||
const REQUEST_TIMEOUT_MS = 60_000;
|
||||
|
||||
// eslint-disable-next-line unicorn/prefer-event-target
|
||||
export class SignalCli extends EventEmitter {
|
||||
private process: ChildProcess | null = null;
|
||||
private readline: Interface | null = null;
|
||||
private pending: Map<string, PendingRequest> = new Map();
|
||||
private nextId = 1;
|
||||
private configDir: string;
|
||||
private closed = false;
|
||||
|
||||
constructor(configDir: string) {
|
||||
super();
|
||||
this.configDir = configDir;
|
||||
}
|
||||
|
||||
async start(): Promise<void> {
|
||||
const args = ["--config", this.configDir, "--output=json", "jsonRpc"];
|
||||
|
||||
logger.info({ configDir: this.configDir, args }, "Starting signal-cli subprocess");
|
||||
|
||||
this.process = spawn("signal-cli", args, {
|
||||
stdio: ["pipe", "pipe", "pipe"],
|
||||
});
|
||||
|
||||
if (!this.process.stdout || !this.process.stdin) {
|
||||
throw new Error("Failed to open signal-cli stdio pipes");
|
||||
}
|
||||
|
||||
this.readline = createInterface({
|
||||
input: this.process.stdout,
|
||||
crlfDelay: Infinity,
|
||||
});
|
||||
|
||||
this.readline.on("line", (line: string) => {
|
||||
this.handleLine(line);
|
||||
});
|
||||
|
||||
this.process.stderr?.on("data", (data: Buffer) => {
|
||||
const msg = data.toString().trim();
|
||||
if (msg) {
|
||||
logger.warn({ stderr: msg }, "signal-cli stderr");
|
||||
}
|
||||
});
|
||||
|
||||
this.process.on("close", (code: number | null) => {
|
||||
this.closed = true;
|
||||
logger.info({ code }, "signal-cli process exited");
|
||||
this.rejectAllPending("signal-cli process exited");
|
||||
this.emit("close", code);
|
||||
});
|
||||
|
||||
this.process.on("error", (err: Error) => {
|
||||
logger.error({ err }, "signal-cli process error");
|
||||
this.emit("error", err);
|
||||
});
|
||||
|
||||
// Wait briefly for the process to start (or fail immediately)
|
||||
await new Promise<void>((resolve, reject) => {
|
||||
const timer = setTimeout(() => {
|
||||
resolve();
|
||||
}, 1000);
|
||||
|
||||
this.process!.on("error", (err) => {
|
||||
clearTimeout(timer);
|
||||
reject(err);
|
||||
});
|
||||
|
||||
this.process!.on("close", (code) => {
|
||||
if (code !== null && code !== 0) {
|
||||
clearTimeout(timer);
|
||||
reject(new Error(`signal-cli exited with code ${code}`));
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
logger.info("signal-cli subprocess started");
|
||||
}
|
||||
|
||||
async call(method: string, params: Record<string, unknown> = {}): Promise<unknown> {
|
||||
if (this.closed || !this.process?.stdin) {
|
||||
throw new Error("signal-cli is not running");
|
||||
}
|
||||
|
||||
const id = String(this.nextId++);
|
||||
const request: JsonRpcRequest = {
|
||||
jsonrpc: "2.0",
|
||||
method,
|
||||
params,
|
||||
id,
|
||||
};
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const timer = setTimeout(() => {
|
||||
this.pending.delete(id);
|
||||
reject(new Error(`signal-cli request timed out: ${method} (id=${id})`));
|
||||
}, REQUEST_TIMEOUT_MS);
|
||||
|
||||
this.pending.set(id, { resolve, reject, method, timer });
|
||||
|
||||
const line = JSON.stringify(request) + "\n";
|
||||
logger.debug({ method, id, params: Object.keys(params) }, "Sending JSON-RPC request");
|
||||
|
||||
this.process!.stdin!.write(line, (err) => {
|
||||
if (err) {
|
||||
clearTimeout(timer);
|
||||
this.pending.delete(id);
|
||||
reject(new Error(`Failed to write to signal-cli stdin: ${err.message}`));
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
close(): void {
|
||||
this.closed = true;
|
||||
this.rejectAllPending("signal-cli closing");
|
||||
if (this.readline) {
|
||||
this.readline.close();
|
||||
this.readline = null;
|
||||
}
|
||||
if (this.process) {
|
||||
this.process.kill("SIGTERM");
|
||||
// Force kill after 5 seconds
|
||||
setTimeout(() => {
|
||||
if (this.process && !this.process.killed) {
|
||||
this.process.kill("SIGKILL");
|
||||
}
|
||||
}, 5000);
|
||||
this.process = null;
|
||||
}
|
||||
}
|
||||
|
||||
private handleLine(line: string): void {
|
||||
if (!line.trim()) return;
|
||||
|
||||
let msg: JsonRpcResponse;
|
||||
try {
|
||||
msg = JSON.parse(line);
|
||||
} catch {
|
||||
logger.warn({ line: line.slice(0, 200) }, "Non-JSON output from signal-cli");
|
||||
return;
|
||||
}
|
||||
|
||||
// Response to a pending request
|
||||
if (msg.id !== undefined) {
|
||||
const pending = this.pending.get(String(msg.id));
|
||||
if (pending) {
|
||||
this.pending.delete(String(msg.id));
|
||||
clearTimeout(pending.timer);
|
||||
|
||||
if (msg.error) {
|
||||
logger.warn({ method: pending.method, error: msg.error }, "JSON-RPC error response");
|
||||
pending.reject(new Error(`signal-cli ${pending.method}: ${msg.error.message}`));
|
||||
} else {
|
||||
logger.debug({ method: pending.method, id: msg.id }, "JSON-RPC response received");
|
||||
pending.resolve(msg.result);
|
||||
}
|
||||
} else {
|
||||
logger.warn({ id: msg.id }, "Received response for unknown request id");
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Notification (no id field)
|
||||
if (msg.method === "receive") {
|
||||
const envelope = msg.params?.envelope as SignalEnvelope | undefined;
|
||||
const account = msg.params?.account as string | undefined;
|
||||
if (envelope) {
|
||||
logger.debug({ account, source: envelope.source || envelope.sourceNumber }, "Received message notification");
|
||||
this.emit("message", { account, envelope });
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private rejectAllPending(reason: string): void {
|
||||
for (const [_id, pending] of this.pending) {
|
||||
clearTimeout(pending.timer);
|
||||
pending.reject(new Error(reason));
|
||||
}
|
||||
this.pending.clear();
|
||||
}
|
||||
}
|
||||
9  apps/bridge-signal/tsconfig.json  Normal file

@@ -0,0 +1,9 @@
{
  "extends": "@link-stack/typescript-config/tsconfig.node.json",
  "compilerOptions": {
    "outDir": "build/main",
    "rootDir": "src"
  },
  "include": ["src/**/*.ts", "src/**/.*.ts"],
  "exclude": ["node_modules/**"]
}

@@ -45,4 +45,5 @@ RUN mkdir /home/node/baileys
EXPOSE 5000
ENV PORT 5000
ENV NODE_ENV production
ENV COREPACK_ENABLE_NETWORK=0
ENTRYPOINT ["/opt/bridge-whatsapp/apps/bridge-whatsapp/docker-entrypoint.sh"]

3  apps/bridge-whatsapp/eslint.config.mjs  Normal file

@@ -0,0 +1,3 @@
import config from "@link-stack/eslint-config/node";

export default config;
|
@ -1,33 +1,36 @@
|
|||
{
|
||||
"name": "@link-stack/bridge-whatsapp",
|
||||
"version": "3.3.5",
|
||||
"version": "3.5.0-beta.1",
|
||||
"main": "build/main/index.js",
|
||||
"author": "Darren Clarke <darren@redaranj.com>",
|
||||
"license": "AGPL-3.0-or-later",
|
||||
"prettier": "@link-stack/prettier-config",
|
||||
"dependencies": {
|
||||
"@adiwajshing/keyed-db": "0.2.4",
|
||||
"@hapi/hapi": "^21.4.3",
|
||||
"@hapipal/schmervice": "^3.0.0",
|
||||
"@hapipal/toys": "^4.0.0",
|
||||
"@link-stack/bridge-common": "workspace:*",
|
||||
"@link-stack/logger": "workspace:*",
|
||||
"@hono/node-server": "^1.13.8",
|
||||
"@whiskeysockets/baileys": "6.7.21",
|
||||
"hapi-pino": "^13.0.0",
|
||||
"link-preview-js": "^3.1.0"
|
||||
"hono": "^4.7.4",
|
||||
"link-preview-js": "^3.1.0",
|
||||
"@link-stack/logger": "workspace:*"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@link-stack/eslint-config": "workspace:*",
|
||||
"@link-stack/jest-config": "workspace:*",
|
||||
"@link-stack/prettier-config": "workspace:*",
|
||||
"@link-stack/typescript-config": "workspace:*",
|
||||
"@types/long": "^5",
|
||||
"@types/node": "*",
|
||||
"dotenv-cli": "^10.0.0",
|
||||
"eslint": "^9.23.0",
|
||||
"prettier": "^3.5.3",
|
||||
"tsx": "^4.20.6",
|
||||
"typescript": "^5.9.3"
|
||||
},
|
||||
"scripts": {
|
||||
"build": "tsc -p tsconfig.json",
|
||||
"dev": "dotenv -- tsx src/index.ts",
|
||||
"start": "node build/main/index.js"
|
||||
"start": "node build/main/index.js",
|
||||
"lint": "eslint src/",
|
||||
"format": "prettier --write src/",
|
||||
"format:check": "prettier --check src/"
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -11,10 +11,10 @@
|
|||
*/
|
||||
export function getMaxAttachmentSize(): number {
|
||||
const envValue = process.env.BRIDGE_MAX_ATTACHMENT_SIZE_MB;
|
||||
const sizeInMB = envValue ? parseInt(envValue, 10) : 50;
|
||||
const sizeInMB = envValue ? Number.parseInt(envValue, 10) : 50;
|
||||
|
||||
// Validate the value
|
||||
if (isNaN(sizeInMB) || sizeInMB <= 0) {
|
||||
if (Number.isNaN(sizeInMB) || sizeInMB <= 0) {
|
||||
console.warn(`Invalid BRIDGE_MAX_ATTACHMENT_SIZE_MB value: ${envValue}, using default 50MB`);
|
||||
return 50 * 1024 * 1024;
|
||||
}
|
||||
|
|
@ -1,42 +1,33 @@
|
|||
import * as Hapi from "@hapi/hapi";
|
||||
import hapiPino from "hapi-pino";
|
||||
import Schmervice from "@hapipal/schmervice";
|
||||
import WhatsappService from "./service.js";
|
||||
import {
|
||||
RegisterBotRoute,
|
||||
UnverifyBotRoute,
|
||||
GetBotRoute,
|
||||
SendMessageRoute,
|
||||
ReceiveMessageRoute,
|
||||
} from "./routes.js";
|
||||
import { serve } from "@hono/node-server";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger('bridge-whatsapp-index');
|
||||
import { createRoutes } from "./routes.ts";
|
||||
import WhatsappService from "./service.ts";
|
||||
|
||||
const server = Hapi.server({ port: 5000 });
|
||||
|
||||
const startServer = async () => {
|
||||
await server.register({ plugin: hapiPino });
|
||||
|
||||
server.route(RegisterBotRoute);
|
||||
server.route(UnverifyBotRoute);
|
||||
server.route(GetBotRoute);
|
||||
server.route(SendMessageRoute);
|
||||
server.route(ReceiveMessageRoute);
|
||||
|
||||
await server.register(Schmervice);
|
||||
server.registerService(WhatsappService);
|
||||
|
||||
await server.start();
|
||||
|
||||
return server;
|
||||
};
|
||||
const logger = createLogger("bridge-whatsapp-index");
|
||||
|
||||
const main = async () => {
|
||||
await startServer();
|
||||
const service = new WhatsappService();
|
||||
await service.initialize();
|
||||
|
||||
const app = createRoutes(service);
|
||||
const port = Number.parseInt(process.env.PORT || "5000", 10);
|
||||
|
||||
serve({ fetch: app.fetch, port }, (info) => {
|
||||
logger.info({ port: info.port }, "bridge-whatsapp listening");
|
||||
});
|
||||
|
||||
const shutdown = async () => {
|
||||
logger.info("Shutting down...");
|
||||
await service.teardown();
|
||||
process.exit(0);
|
||||
};
|
||||
|
||||
process.on("SIGTERM", shutdown);
|
||||
process.on("SIGINT", shutdown);
|
||||
};
|
||||
|
||||
main().catch((err) => {
|
||||
logger.error(err);
|
||||
main().catch((error) => {
|
||||
logger.error(error);
|
||||
process.exit(1);
|
||||
});
|
||||
|
|
|
|||
|
|
@ -1,131 +1,69 @@
|
|||
import * as Hapi from "@hapi/hapi";
|
||||
import Toys from "@hapipal/toys";
|
||||
import WhatsappService from "./service";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import { Hono } from "hono";
|
||||
|
||||
const withDefaults = Toys.withRouteDefaults({
|
||||
options: {
|
||||
cors: true,
|
||||
},
|
||||
});
|
||||
import type WhatsappService from "./service.ts";
|
||||
|
||||
const getService = (request: Hapi.Request): WhatsappService => {
|
||||
const { whatsappService } = request.services();
|
||||
const logger = createLogger("bridge-whatsapp-routes");
|
||||
|
||||
return whatsappService as WhatsappService;
|
||||
};
|
||||
const errorMessage = (error: unknown): string => (error instanceof Error ? error.message : String(error));
|
||||
|
||||
interface MessageRequest {
|
||||
phoneNumber: string;
|
||||
message: string;
|
||||
attachments?: Array<{ data: string; filename: string; mime_type: string }>;
|
||||
export function createRoutes(service: WhatsappService): Hono {
|
||||
const app = new Hono();
|
||||
|
||||
app.post("/api/bots/:id/register", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
try {
|
||||
await service.register(id);
|
||||
logger.info({ id }, "Bot registered");
|
||||
return c.body(null, 200);
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to register bot");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.get("/api/bots/:id", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
try {
|
||||
return c.json(service.getBot(id));
|
||||
} catch (error) {
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.post("/api/bots/:id/send", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
const { phoneNumber, message, attachments } = await c.req.json<{
|
||||
phoneNumber: string;
|
||||
message: string;
|
||||
attachments?: Array<{ data: string; filename: string; mime_type: string }>;
|
||||
}>();
|
||||
|
||||
try {
|
||||
const result = await service.send(id, phoneNumber, message, attachments);
|
||||
logger.info({ id, attachmentCount: attachments?.length || 0 }, "Sent message");
|
||||
return c.json({ result });
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to send message");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.post("/api/bots/:id/unverify", async (c) => {
|
||||
const id = c.req.param("id");
|
||||
try {
|
||||
await service.unverify(id);
|
||||
logger.info({ id }, "Bot unverified");
|
||||
return c.body(null, 200);
|
||||
} catch (error) {
|
||||
logger.error({ id, error: errorMessage(error) }, "Failed to unverify bot");
|
||||
return c.json({ error: errorMessage(error) }, 500);
|
||||
}
|
||||
});
|
||||
|
||||
app.get("/api/health", (c) => {
|
||||
return c.json({ status: "ok" });
|
||||
});
|
||||
|
||||
return app;
|
||||
}
|
||||
|
||||
export const SendMessageRoute = withDefaults({
|
||||
method: "post",
|
||||
path: "/api/bots/{id}/send",
|
||||
options: {
|
||||
description: "Send a message",
|
||||
async handler(request: Hapi.Request, _h: Hapi.ResponseToolkit) {
|
||||
const { id } = request.params;
|
||||
const { phoneNumber, message, attachments } =
|
||||
request.payload as MessageRequest;
|
||||
const whatsappService = getService(request);
|
||||
await whatsappService.send(
|
||||
id,
|
||||
phoneNumber,
|
||||
message as string,
|
||||
attachments,
|
||||
);
|
||||
request.logger.info(
|
||||
{
|
||||
id,
|
||||
attachmentCount: attachments?.length || 0,
|
||||
},
|
||||
"Sent a message at %s",
|
||||
new Date().toISOString(),
|
||||
);
|
||||
|
||||
return _h
|
||||
.response({
|
||||
result: {
|
||||
recipient: phoneNumber,
|
||||
timestamp: new Date().toISOString(),
|
||||
source: id,
|
||||
},
|
||||
})
|
||||
.code(200);
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
export const ReceiveMessageRoute = withDefaults({
|
||||
method: "get",
|
||||
path: "/api/bots/{id}/receive",
|
||||
options: {
|
||||
description: "Receive messages",
|
||||
async handler(request: Hapi.Request, _h: Hapi.ResponseToolkit) {
|
||||
const { id } = request.params;
|
||||
const whatsappService = getService(request);
|
||||
const date = new Date();
|
||||
const twoDaysAgo = new Date(date.getTime());
|
||||
twoDaysAgo.setDate(date.getDate() - 2);
|
||||
request.logger.info({ id }, "Received messages at %s", new Date().toISOString());
|
||||
|
||||
return whatsappService.receive(id, twoDaysAgo);
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
export const RegisterBotRoute = withDefaults({
|
||||
method: "post",
|
||||
path: "/api/bots/{id}/register",
|
||||
options: {
|
||||
description: "Register a bot",
|
||||
async handler(request: Hapi.Request, _h: Hapi.ResponseToolkit) {
|
||||
const { id } = request.params;
|
||||
const whatsappService = getService(request);
|
||||
|
||||
await whatsappService.register(id);
|
||||
/*
|
||||
, (error: string) => {
|
||||
if (error) {
|
||||
return _h.response(error).code(500);
|
||||
}
|
||||
request.logger.info({ id }, "Register bot at %s", new Date());
|
||||
|
||||
return _h.response().code(200);
|
||||
});
|
||||
*/
|
||||
|
||||
return _h.response().code(200);
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
export const UnverifyBotRoute = withDefaults({
|
||||
method: "post",
|
||||
path: "/api/bots/{id}/unverify",
|
||||
options: {
|
||||
description: "Unverify bot",
|
||||
async handler(request: Hapi.Request, _h: Hapi.ResponseToolkit) {
|
||||
const { id } = request.params;
|
||||
const whatsappService = getService(request);
|
||||
|
||||
return whatsappService.unverify(id);
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
export const GetBotRoute = withDefaults({
|
||||
method: "get",
|
||||
path: "/api/bots/{id}",
|
||||
options: {
|
||||
description: "Get bot info",
|
||||
async handler(request: Hapi.Request, _h: Hapi.ResponseToolkit) {
|
||||
const { id } = request.params;
|
||||
const whatsappService = getService(request);
|
||||
|
||||
return whatsappService.getBot(id);
|
||||
},
|
||||
},
|
||||
});
|
||||
|
|
|
|||
|
|
@ -1,37 +1,36 @@
|
|||
import { Server } from "@hapi/hapi";
|
||||
import { Service } from "@hapipal/schmervice";
|
||||
import fs from "node:fs";
|
||||
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import makeWASocket, {
|
||||
type WASocket,
|
||||
type SocketConfig,
|
||||
DisconnectReason,
|
||||
proto,
|
||||
downloadContentFromMessage,
|
||||
MediaType,
|
||||
fetchLatestBaileysVersion,
|
||||
isJidBroadcast,
|
||||
isJidStatusBroadcast,
|
||||
useMultiFileAuthState,
|
||||
} from "@whiskeysockets/baileys";
|
||||
import fs from "fs";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import {
|
||||
getMaxAttachmentSize,
|
||||
getMaxTotalAttachmentSize,
|
||||
MAX_ATTACHMENTS,
|
||||
} from "@link-stack/bridge-common";
|
||||
|
||||
import { getMaxAttachmentSize, getMaxTotalAttachmentSize, MAX_ATTACHMENTS } from "./attachments";
|
||||
|
||||
type MediaType = "audio" | "document" | "image" | "video" | "sticker";
|
||||
|
||||
const logger = createLogger("bridge-whatsapp-service");
|
||||
|
||||
export type AuthCompleteCallback = (error?: string) => void;
|
||||
|
||||
export default class WhatsappService extends Service {
|
||||
connections: { [key: string]: any } = {};
|
||||
loginConnections: { [key: string]: any } = {};
|
||||
interface BotConnection {
|
||||
socket: WASocket;
|
||||
}
|
||||
|
||||
export default class WhatsappService {
|
||||
connections: Record<string, BotConnection> = {};
|
||||
loginConnections: Record<string, BotConnection> = {};
|
||||
|
||||
static browserDescription: [string, string, string] = ["Bridge", "Chrome", "2.0"];
|
||||
|
||||
constructor(server: Server, options: never) {
|
||||
super(server, options);
|
||||
}
|
||||
|
||||
getBaseDirectory(): string {
|
||||
return `/home/node/baileys`;
|
||||
}
|
||||
|
|
@ -76,7 +75,7 @@ export default class WhatsappService extends Service {
|
|||
private async resetConnections() {
|
||||
for (const connection of Object.values(this.connections)) {
|
||||
try {
|
||||
connection.end(null);
|
||||
connection.socket.end(new Error("Connection reset"));
|
||||
} catch (error) {
|
||||
logger.error({ error }, "Connection reset error");
|
||||
}
|
||||
|
|
@ -86,18 +85,16 @@ export default class WhatsappService extends Service {
|
|||
|
||||
private async createConnection(
|
||||
botID: string,
|
||||
server: Server,
|
||||
options: any,
|
||||
authCompleteCallback?: any,
|
||||
options: Partial<SocketConfig>,
|
||||
authCompleteCallback?: AuthCompleteCallback
|
||||
) {
|
||||
const authDirectory = this.getAuthDirectory(botID);
|
||||
const { state, saveCreds } = await useMultiFileAuthState(authDirectory);
|
||||
const msgRetryCounterMap: any = {};
|
||||
const socket = makeWASocket({
|
||||
...options,
|
||||
auth: state,
|
||||
generateHighQualityLinkPreview: false,
|
||||
msgRetryCounterMap,
|
||||
syncFullHistory: true,
|
||||
shouldIgnoreJid: (jid) => isJidBroadcast(jid) || isJidStatusBroadcast(jid),
|
||||
});
|
||||
let pause = 5000;
|
||||
|
|
@ -120,16 +117,17 @@ export default class WhatsappService extends Service {
|
|||
logger.info("opened connection");
|
||||
} else if (connectionState === "close") {
|
||||
logger.info({ lastDisconnect }, "connection closed");
|
||||
const disconnectStatusCode = (lastDisconnect?.error as any)?.output?.statusCode;
|
||||
const disconnectStatusCode = (lastDisconnect?.error as { output?: { statusCode?: number } } | undefined)
|
||||
?.output?.statusCode;
|
||||
if (disconnectStatusCode === DisconnectReason.restartRequired) {
|
||||
logger.info("reconnecting after got new login");
|
||||
await this.createConnection(botID, server, options);
|
||||
await this.createConnection(botID, options);
|
||||
authCompleteCallback?.();
|
||||
} else if (disconnectStatusCode !== DisconnectReason.loggedOut) {
|
||||
logger.info("reconnecting");
|
||||
await this.sleep(pause);
|
||||
pause *= 2;
|
||||
this.createConnection(botID, server, options);
|
||||
this.createConnection(botID, options);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -147,9 +145,17 @@ export default class WhatsappService extends Service {
|
|||
await this.queueUnreadMessages(botID, messages);
|
||||
}
|
||||
}
|
||||
|
||||
if (events["messaging-history.set"]) {
|
||||
const { messages, isLatest } = events["messaging-history.set"];
|
||||
logger.info({ messageCount: messages.length, isLatest }, "received message history on connection");
|
||||
if (messages.length > 0) {
|
||||
await this.queueUnreadMessages(botID, messages);
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
this.connections[botID] = { socket, msgRetryCounterMap };
|
||||
this.connections[botID] = { socket };
|
||||
}
|
||||
|
||||
private async updateConnections() {
|
||||
|
|
@ -165,9 +171,8 @@ export default class WhatsappService extends Service {
|
|||
const { version, isLatest } = await fetchLatestBaileysVersion();
|
||||
logger.info({ version: version.join("."), isLatest }, "using WA version");
|
||||
|
||||
await this.createConnection(botID, this.server, {
|
||||
await this.createConnection(botID, {
|
||||
browser: WhatsappService.browserDescription,
|
||||
printQRInTerminal: true,
|
||||
version,
|
||||
});
|
||||
}
|
||||
|
|
@ -175,23 +180,21 @@ export default class WhatsappService extends Service {
|
|||
}
|
||||
|
||||
private async queueMessage(botID: string, webMessageInfo: proto.IWebMessageInfo) {
|
||||
const {
|
||||
key: { id, fromMe, remoteJid },
|
||||
message,
|
||||
messageTimestamp,
|
||||
} = webMessageInfo;
|
||||
logger.info("Message type debug");
|
||||
for (const key in message) {
|
||||
logger.info(
|
||||
{ key, exists: !!message[key as keyof proto.IMessage] },
|
||||
"Message field",
|
||||
);
|
||||
const { key, message, messageTimestamp } = webMessageInfo;
|
||||
if (!key) {
|
||||
logger.warn("Message missing key, skipping");
|
||||
return;
|
||||
}
|
||||
const { id, fromMe, remoteJid } = key;
|
||||
// Baileys 7 uses LIDs (Linked IDs) instead of phone numbers in some cases.
|
||||
// senderPn contains the actual phone number when available.
|
||||
const senderPn = (key as { senderPn?: string }).senderPn;
|
||||
const participantPn = (key as { participantPn?: string }).participantPn;
|
||||
logger.info({ remoteJid, senderPn, participantPn, fromMe }, "Processing incoming message");
|
||||
const isValidMessage = message && remoteJid !== "status@broadcast" && !fromMe;
|
||||
if (isValidMessage) {
|
||||
const { audioMessage, documentMessage, imageMessage, videoMessage } = message;
|
||||
const isMediaMessage =
|
||||
audioMessage || documentMessage || imageMessage || videoMessage;
|
||||
const isMediaMessage = audioMessage || documentMessage || imageMessage || videoMessage;
|
||||
|
||||
const messageContent = Object.values(message)[0];
|
||||
let messageType: MediaType;
|
||||
|
|
@ -222,8 +225,8 @@ export default class WhatsappService extends Service {
|
|||
|
||||
const stream = await downloadContentFromMessage(
|
||||
messageContent,
|
||||
// @ts-ignore
|
||||
messageType,
|
||||
// @ts-expect-error messageType is dynamically resolved
|
||||
messageType
|
||||
);
|
||||
let buffer = Buffer.from([]);
|
||||
for await (const chunk of stream) {
|
||||
|
|
@ -237,34 +240,54 @@ export default class WhatsappService extends Service {
|
|||
const extendedTextMessage = message?.extendedTextMessage?.text;
|
||||
const imageMessage = message?.imageMessage?.caption;
|
||||
const videoMessage = message?.videoMessage?.caption;
|
||||
const messageText = [
|
||||
conversation,
|
||||
extendedTextMessage,
|
||||
imageMessage,
|
||||
videoMessage,
|
||||
].find((text) => text && text !== "");
|
||||
const messageText = [conversation, extendedTextMessage, imageMessage, videoMessage].find(
|
||||
(text) => text && text !== ""
|
||||
);
|
||||
|
||||
// Extract phone number and user ID (LID) separately
|
||||
// remoteJid may contain LIDs (Baileys 7+) which are not phone numbers
|
||||
const jidValue = remoteJid?.split("@")[0];
|
||||
const isLidJid = remoteJid?.endsWith("@lid");
|
||||
|
||||
// Phone number: prefer senderPn/participantPn, fall back to remoteJid only if it's not a LID
|
||||
const senderPhone =
|
||||
senderPn?.split("@")[0] || participantPn?.split("@")[0] || (isLidJid ? undefined : jidValue);
|
||||
|
||||
// User ID (LID): extract from remoteJid if it's a LID format
|
||||
const senderUserId = isLidJid ? jidValue : undefined;
|
||||
|
||||
// Must have at least one identifier
|
||||
if (!senderPhone && !senderUserId) {
|
||||
logger.warn({ remoteJid, senderPn, participantPn }, "Could not determine sender identity, skipping message");
|
||||
return;
|
||||
}
|
||||
|
||||
const payload = {
|
||||
to: botID,
|
||||
from: remoteJid?.split("@")[0],
|
||||
messageId: id,
|
||||
sentAt: new Date((messageTimestamp as number) * 1000).toISOString(),
|
||||
from: senderPhone,
|
||||
user_id: senderUserId,
|
||||
message_id: id,
|
||||
sent_at: new Date((messageTimestamp as number) * 1000).toISOString(),
|
||||
message: messageText,
|
||||
attachment,
|
||||
filename,
|
||||
mimeType,
|
||||
mime_type: mimeType,
|
||||
};
|
||||
|
||||
await fetch(
|
||||
`${process.env.BRIDGE_FRONTEND_URL}/api/whatsapp/bots/${botID}/receive`,
|
||||
{
|
||||
method: "POST",
|
||||
headers: {
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
body: JSON.stringify(payload),
|
||||
// Send directly to Zammad's WhatsApp webhook
|
||||
const zammadUrl = process.env.ZAMMAD_URL || "http://zammad-nginx:8080";
|
||||
const response = await fetch(`${zammadUrl}/api/v1/channels_cdr_whatsapp_bot_webhook/${botID}`, {
|
||||
method: "POST",
|
||||
headers: {
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
);
|
||||
body: JSON.stringify(payload),
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorText = await response.text();
|
||||
logger.error({ status: response.status, error: errorText, botID }, "Failed to send message to Zammad");
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -275,7 +298,7 @@ export default class WhatsappService extends Service {
|
|||
}
|
||||
}
|
||||
|
||||
getBot(botID: string): Record<string, any> {
|
||||
getBot(botID: string): Record<string, unknown> {
|
||||
const botDirectory = this.getBotDirectory(botID);
|
||||
const qrPath = `${botDirectory}/qr.txt`;
|
||||
const verifiedFile = `${botDirectory}/verified`;
|
||||
|
|
@ -295,7 +318,7 @@ export default class WhatsappService extends Service {
|
|||
} catch (error) {
|
||||
logger.warn({ botID, error }, "Error during logout, forcing disconnect");
|
||||
try {
|
||||
connection.socket.end(undefined);
|
||||
connection.socket.end(new Error("Forced disconnect"));
|
||||
} catch (endError) {
|
||||
logger.warn({ botID, endError }, "Error ending socket connection");
|
||||
}
|
||||
|
|
@ -314,12 +337,7 @@ export default class WhatsappService extends Service {
|
|||
|
||||
async register(botID: string, callback?: AuthCompleteCallback): Promise<void> {
|
||||
const { version } = await fetchLatestBaileysVersion();
|
||||
await this.createConnection(
|
||||
botID,
|
||||
this.server,
|
||||
{ version, browser: WhatsappService.browserDescription },
|
||||
callback,
|
||||
);
|
||||
await this.createConnection(botID, { version, browser: WhatsappService.browserDescription }, callback);
|
||||
callback?.();
|
||||
}
|
||||
|
||||
|
|
@ -327,10 +345,10 @@ export default class WhatsappService extends Service {
|
|||
botID: string,
|
||||
phoneNumber: string,
|
||||
message: string,
|
||||
attachments?: Array<{ data: string; filename: string; mime_type: string }>,
|
||||
): Promise<void> {
|
||||
attachments?: Array<{ data: string; filename: string; mime_type: string }>
|
||||
): Promise<{ recipient: string; timestamp: string; source: string }> {
|
||||
const connection = this.connections[botID]?.socket;
|
||||
const digits = phoneNumber.replace(/\D+/g, "");
|
||||
const digits = phoneNumber.replaceAll(/\D+/g, "");
|
||||
// LIDs are 15+ digits, phone numbers with country code are typically 10-14 digits
|
||||
const suffix = digits.length > 14 ? "@lid" : "@s.whatsapp.net";
|
||||
const recipient = `${digits}${suffix}`;
|
||||
|
|
@ -346,9 +364,7 @@ export default class WhatsappService extends Service {
|
|||
const MAX_TOTAL_SIZE = getMaxTotalAttachmentSize();
|
||||
|
||||
if (attachments.length > MAX_ATTACHMENTS) {
|
||||
throw new Error(
|
||||
`Too many attachments: ${attachments.length} (max ${MAX_ATTACHMENTS})`,
|
||||
);
|
||||
throw new Error(`Too many attachments: ${attachments.length} (max ${MAX_ATTACHMENTS})`);
|
||||
}
|
||||
|
||||
let totalSize = 0;
|
||||
|
|
@ -364,7 +380,7 @@ export default class WhatsappService extends Service {
|
|||
size: estimatedSize,
|
||||
maxSize: MAX_ATTACHMENT_SIZE,
|
||||
},
|
||||
"Attachment exceeds size limit, skipping",
|
||||
"Attachment exceeds size limit, skipping"
|
||||
);
|
||||
continue;
|
||||
}
|
||||
|
|
@ -376,7 +392,7 @@ export default class WhatsappService extends Service {
|
|||
totalSize,
|
||||
maxTotalSize: MAX_TOTAL_SIZE,
|
||||
},
|
||||
"Total attachment size exceeds limit, skipping remaining",
|
||||
"Total attachment size exceeds limit, skipping remaining"
|
||||
);
|
||||
break;
|
||||
}
|
||||
|
|
@ -407,15 +423,11 @@ export default class WhatsappService extends Service {
|
|||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async receive(
|
||||
botID: string,
|
||||
_lastReceivedDate: Date,
|
||||
): Promise<proto.IWebMessageInfo[]> {
|
||||
const connection = this.connections[botID]?.socket;
|
||||
const messages = await connection.loadAllUnreadMessages();
|
||||
|
||||
return messages;
|
||||
return {
|
||||
recipient: phoneNumber,
|
||||
timestamp: new Date().toISOString(),
|
||||
source: botID,
|
||||
};
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,8 +0,0 @@
|
|||
import type WhatsappService from "./service.js";
|
||||
|
||||
declare module "@hapipal/schmervice" {
|
||||
interface SchmerviceDecorator {
|
||||
(namespace: "whatsapp"): WhatsappService;
|
||||
}
|
||||
type ServiceFunctionalInterface = { name: string };
|
||||
}
|
||||
|
|
@ -1,16 +1,8 @@
|
|||
{
|
||||
"extends": "@link-stack/typescript-config/tsconfig.node.json",
|
||||
"compilerOptions": {
|
||||
"module": "commonjs",
|
||||
"target": "es2018",
|
||||
"esModuleInterop": true,
|
||||
"moduleResolution": "node",
|
||||
"outDir": "build/main",
|
||||
"rootDir": "src",
|
||||
"skipLibCheck": true,
|
||||
"types": ["node"],
|
||||
"lib": ["es2020", "DOM"],
|
||||
"composite": true
|
||||
"rootDir": "src"
|
||||
},
|
||||
"include": ["src/**/*.ts", "src/**/.*.ts"],
|
||||
"exclude": ["node_modules/**"]
|
||||
|
|
|
|||
|
|
@ -1,12 +0,0 @@
|
|||
.git
|
||||
.idea
|
||||
**/node_modules
|
||||
!/node_modules
|
||||
**/build
|
||||
**/dist
|
||||
**/tmp
|
||||
**/.env*
|
||||
**/coverage
|
||||
**/.next
|
||||
**/cypress/videos
|
||||
**/cypress/screenshots
|
||||
|
|
@@ -1,144 +0,0 @@

# Bridge Worker

Background job processor for handling asynchronous tasks in the CDR Link communication bridge system.

## Overview

Bridge Worker uses Graphile Worker to process queued jobs for message handling, media conversion, webhook notifications, and scheduled tasks. It manages the flow of messages between various communication channels (Signal, WhatsApp, Facebook, Voice) and the Zammad ticketing system.

## Features

- **Message Processing**: Handle incoming/outgoing messages for all supported channels
- **Media Conversion**: Convert audio/video files between formats
- **Webhook Notifications**: Notify external systems of events
- **Scheduled Tasks**: Cron-based job scheduling
- **Job Queue Management**: Reliable job processing with retries
- **Multi-Channel Support**: Signal, WhatsApp, Facebook, Voice (Twilio)

## Development

### Prerequisites

- Node.js >= 20
- npm >= 10
- PostgreSQL database
- Redis (for caching)
- FFmpeg (for media conversion)

### Setup

```bash
# Install dependencies
npm install

# Build TypeScript
npm run build

# Run development server with auto-reload
npm run dev

# Start production worker
npm run start
```

### Environment Variables

Required environment variables:

- `DATABASE_URL` - PostgreSQL connection string
- `GRAPHILE_WORKER_CONCURRENCY` - Number of concurrent jobs (default: 10)
- `GRAPHILE_WORKER_POLL_INTERVAL` - Job poll interval in ms (default: 1000)
- `ZAMMAD_URL` - Zammad instance URL
- `ZAMMAD_API_TOKEN` - Zammad API token
- `TWILIO_ACCOUNT_SID` - Twilio account SID
- `TWILIO_AUTH_TOKEN` - Twilio auth token
- `SIGNAL_CLI_URL` - Signal CLI REST API URL
- `WHATSAPP_SERVICE_URL` - WhatsApp bridge service URL
- `FACEBOOK_APP_SECRET` - Facebook app secret
- `FACEBOOK_PAGE_ACCESS_TOKEN` - Facebook page token

### Available Scripts

- `npm run build` - Compile TypeScript
- `npm run dev` - Development mode with watch
- `npm run start` - Start production worker

## Task Types

### Signal Tasks
- `receive-signal-message` - Process incoming Signal messages
- `send-signal-message` - Send outgoing Signal messages
- `fetch-signal-messages` - Fetch messages from Signal CLI

### WhatsApp Tasks
- `receive-whatsapp-message` - Process incoming WhatsApp messages
- `send-whatsapp-message` - Send outgoing WhatsApp messages

### Facebook Tasks
- `receive-facebook-message` - Process incoming Facebook messages
- `send-facebook-message` - Send outgoing Facebook messages

### Voice Tasks
- `receive-voice-message` - Process incoming voice calls/messages
- `send-voice-message` - Send voice messages via Twilio
- `twilio-recording` - Handle Twilio call recordings
- `voice-line-audio-update` - Update voice line audio
- `voice-line-delete` - Delete voice line
- `voice-line-provider-update` - Update voice provider settings

### Common Tasks
- `notify-webhooks` - Send webhook notifications
- `import-label-studio` - Import Label Studio annotations
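Each task name above corresponds to a handler module that Graphile Worker loads from the worker's task directory. The following is a minimal sketch of what such a handler could look like, assuming graphile-worker's standard `Task` signature; the payload shape and the Signal CLI endpoint are illustrative placeholders, not the actual implementation:

```typescript
// src/tasks/send-signal-message.ts -- hypothetical sketch, not the real task
import type { Task } from "graphile-worker";

// Assumed payload shape, for illustration only.
interface SendSignalMessagePayload {
  to: string;
  message: string;
}

const sendSignalMessage: Task = async (payload, helpers) => {
  const { to, message } = payload as SendSignalMessagePayload;
  helpers.logger.info(`Sending Signal message to ${to}`);

  // Hypothetical call against the service configured via SIGNAL_CLI_URL.
  const response = await fetch(`${process.env.SIGNAL_CLI_URL}/v2/send`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ recipients: [to], message }),
  });

  // Throwing marks the job as failed so Graphile Worker retries it.
  if (!response.ok) {
    throw new Error(`Signal CLI responded with ${response.status}`);
  }
};

export default sendSignalMessage;
```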
## Architecture

### Job Processing

Jobs are queued in PostgreSQL using Graphile Worker:

```typescript
await addJob('send-signal-message', {
  to: '+1234567890',
  message: 'Hello world'
})
```

### Cron Schedule

Scheduled tasks are configured in `crontab`:
- Periodic message fetching
- Cleanup tasks
- Health checks

### Error Handling

- Automatic retries with exponential backoff
- Dead letter queue for failed jobs
- Comprehensive logging with winston
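The retry behavior comes from Graphile Worker itself: a handler that throws is re-queued and retried with exponential backoff until its attempt limit is reached, after which the job stays in the jobs table as permanently failed. A hedged sketch of bounding that limit when enqueueing (the task name is taken from the list above; the payload is a placeholder):

```typescript
import { quickAddJob } from "graphile-worker";

// Hypothetical enqueue with an explicit retry cap.
await quickAddJob(
  { connectionString: process.env.DATABASE_URL },
  "notify-webhooks",
  { event: "ticket.updated", url: "https://example.org/hook" },
  { maxAttempts: 5 } // give up after five failed attempts
);
```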
## Media Handling

Supports conversion between formats:
- Audio: MP3, OGG, WAV, M4A
- Uses fluent-ffmpeg for processing
- Automatic format detection
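A minimal sketch of the kind of conversion this implies, assuming fluent-ffmpeg's stock API (the paths and target format are placeholders; the worker's actual conversion code may differ):

```typescript
import ffmpeg from "fluent-ffmpeg";

// Convert e.g. an incoming OGG voice note to MP3.
function convertToMp3(inputPath: string, outputPath: string): Promise<void> {
  return new Promise((resolve, reject) => {
    ffmpeg(inputPath)
      .toFormat("mp3")
      .on("error", (err) => reject(err))
      .on("end", () => resolve())
      .save(outputPath);
  });
}
```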
## Integration Points

- **Zammad**: Creates/updates tickets via API
- **Signal CLI**: REST API for Signal messaging
- **WhatsApp Bridge**: HTTP API for WhatsApp
- **Twilio**: Voice and SMS capabilities
- **Facebook**: Graph API for Messenger

## Docker Support

```bash
# Build image
docker build -t link-stack/bridge-worker .

# Run with docker-compose
docker-compose -f docker/compose/bridge.yml up
```

The worker includes cron support via built-in crontab.

@@ -1,2 +0,0 @@
*/1 * * * * fetch-signal-messages ?max=1&id=fetchSignalMessagesCron {"scheduleTasks": "true"}
*/2 * * * * check-group-membership ?max=1&id=checkGroupMembershipCron {}

@@ -1,16 +0,0 @@
import type {} from "graphile-worker";

const preset: any = {
  worker: {
    connectionString: process.env.DATABASE_URL,
    maxPoolSize: process.env.BRIDGE_WORKER_POOL_SIZE
      ? parseInt(process.env.BRIDGE_WORKER_POOL_SIZE, 10)
      : 10,
    pollInterval: process.env.BRIDGE_WORKER_POLL_INTERVAL
      ? parseInt(process.env.BRIDGE_WORKER_POLL_INTERVAL, 10)
      : 2000,
    fileExtensions: [".ts"],
  },
};

export default preset;
|
|
@ -1,46 +0,0 @@
|
|||
import { run } from "graphile-worker";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import * as path from "path";
|
||||
import { fileURLToPath } from "url";
|
||||
|
||||
const logger = createLogger("bridge-worker");
|
||||
const __filename = fileURLToPath(import.meta.url);
|
||||
const __dirname = path.dirname(__filename);
|
||||
|
||||
const startWorker = async () => {
|
||||
logger.info("Starting worker...");
|
||||
|
||||
await run({
|
||||
connectionString: process.env.DATABASE_URL,
|
||||
noHandleSignals: false,
|
||||
concurrency: process.env.BRIDGE_WORKER_CONCURRENCY
|
||||
? parseInt(process.env.BRIDGE_WORKER_CONCURRENCY, 10)
|
||||
: 10,
|
||||
maxPoolSize: process.env.BRIDGE_WORKER_POOL_SIZE
|
||||
? parseInt(process.env.BRIDGE_WORKER_POOL_SIZE, 10)
|
||||
: 10,
|
||||
pollInterval: process.env.BRIDGE_WORKER_POLL_INTERVAL
|
||||
? parseInt(process.env.BRIDGE_WORKER_POLL_INTERVAL, 10)
|
||||
: 1000,
|
||||
taskDirectory: `${__dirname}/tasks`,
|
||||
crontabFile: `${__dirname}/crontab`,
|
||||
});
|
||||
};
|
||||
|
||||
const main = async () => {
|
||||
await startWorker();
|
||||
};
|
||||
|
||||
main().catch((err) => {
|
||||
logger.error(
|
||||
{
|
||||
error: err,
|
||||
message: err.message,
|
||||
stack: err.stack,
|
||||
name: err.name,
|
||||
},
|
||||
"Worker failed to start",
|
||||
);
|
||||
console.error("Full error:", err);
|
||||
process.exit(1);
|
||||
});
|
||||
|
|
@ -1,20 +0,0 @@
|
|||
/* eslint-disable camelcase */
|
||||
// import { SavedVoiceProvider } from "@digiresilience/bridge-db";
|
||||
import Twilio from "twilio";
|
||||
|
||||
type SavedVoiceProvider = any;
|
||||
|
||||
export const twilioClientFor = (
|
||||
provider: SavedVoiceProvider,
|
||||
): Twilio.Twilio => {
|
||||
const { accountSid, apiKeySid, apiKeySecret } = provider.credentials;
|
||||
if (!accountSid || !apiKeySid || !apiKeySecret)
|
||||
throw new Error(
|
||||
`twilio provider ${provider.name} does not have credentials`,
|
||||
);
|
||||
|
||||
return Twilio(apiKeySid, apiKeySecret, {
|
||||
accountSid,
|
||||
});
|
||||
};
|
||||
|
||||
|
|
@ -1,56 +0,0 @@
|
|||
/*
|
||||
import pgPromise from "pg-promise";
|
||||
import * as pgMonitor from "pg-monitor";
|
||||
import {
|
||||
dbInitOptions,
|
||||
IRepositories,
|
||||
AppDatabase,
|
||||
} from "@digiresilience/bridge-db";
|
||||
import config from "@digiresilience/bridge-config";
|
||||
import type { IInitOptions } from "pg-promise";
|
||||
|
||||
export const initDiagnostics = (
|
||||
logSql: boolean,
|
||||
initOpts: IInitOptions<IRepositories>,
|
||||
): void => {
|
||||
if (logSql) {
|
||||
pgMonitor.attach(initOpts);
|
||||
} else {
|
||||
pgMonitor.attach(initOpts, ["error"]);
|
||||
}
|
||||
};
|
||||
|
||||
export const stopDiagnostics = (): void => pgMonitor.detach();
|
||||
|
||||
let pgp: any;
|
||||
let pgpInitOptions: any;
|
||||
|
||||
export const initPgp = (): void => {
|
||||
pgpInitOptions = dbInitOptions(config);
|
||||
pgp = pgPromise(pgpInitOptions);
|
||||
};
|
||||
|
||||
const initDb = (): AppDatabase => {
|
||||
const db = pgp(config.db.connection);
|
||||
return db;
|
||||
};
|
||||
|
||||
export const stopDb = async (db: AppDatabase): Promise<void> => {
|
||||
return db.$pool.end();
|
||||
};
|
||||
*/
|
||||
|
||||
export type AppDatabase = any;
|
||||
|
||||
export const withDb = <T>(f: (db: AppDatabase) => Promise<T>): Promise<T> => {
|
||||
/*
|
||||
const db = initDb();
|
||||
initDiagnostics(config.logging.sql, pgpInitOptions);
|
||||
try {
|
||||
return f(db);
|
||||
} finally {
|
||||
stopDiagnostics();
|
||||
}
|
||||
*/
|
||||
return f(null);
|
||||
};
|
||||
|
|
@ -1,272 +0,0 @@
|
|||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger('formstack-field-mapping');
|
||||
|
||||
/**
|
||||
* Field mapping configuration for Formstack to Zammad integration
|
||||
*
|
||||
* This configuration is completely flexible - you define your own internal field names
|
||||
* and map them to both Formstack source fields and Zammad custom fields.
|
||||
*/
|
||||
export interface FieldMappingConfig {
|
||||
/**
|
||||
* Map internal field keys to Formstack field names
|
||||
*
|
||||
* Required keys (system):
|
||||
* - formId: The Formstack Form ID field
|
||||
* - uniqueId: The Formstack submission unique ID field
|
||||
*
|
||||
* Optional keys with special behavior:
|
||||
* - email: Used for user lookup/creation (if provided)
|
||||
* - phone: Used for user lookup/creation (if provided)
|
||||
* - signalAccount: Used for Signal-based user lookup (tried first before phone)
|
||||
* - name: User's full name (can be nested object with first/last, used in user creation)
|
||||
* - organization: Used in ticket title template placeholder {organization}
|
||||
* - typeOfSupport: Used in ticket title template placeholder {typeOfSupport}
|
||||
* - descriptionOfIssue: Used as article subject (defaults to "Support Request" if not provided)
|
||||
*
|
||||
* All other keys are completely arbitrary and defined by your form.
|
||||
*/
|
||||
sourceFields: Record<string, string>;
|
||||
|
||||
/**
|
||||
* Map Zammad custom field names to internal field keys (from sourceFields)
|
||||
*
|
||||
* Example:
|
||||
* {
|
||||
* "us_state": "state", // Zammad field "us_state" gets value from sourceFields["state"]
|
||||
* "zip_code": "zipCode", // Zammad field "zip_code" gets value from sourceFields["zipCode"]
|
||||
* "custom_field": "myField" // Any custom field mapping
|
||||
* }
|
||||
*
|
||||
* The values in this object must correspond to keys in sourceFields.
|
||||
*/
|
||||
zammadFields: Record<string, string>;
|
||||
|
||||
/**
|
||||
* Configuration for ticket creation
|
||||
*/
|
||||
ticket: {
|
||||
/** Zammad group name to assign tickets to */
|
||||
group: string;
|
||||
|
||||
/** Article type name (e.g., "note", "cdr_signal", "email") */
|
||||
defaultArticleType: string;
|
||||
|
||||
/**
|
||||
* Template for ticket title
|
||||
* Supports placeholders: {name}, {organization}, {typeOfSupport}
|
||||
* Placeholders reference internal field keys from sourceFields
|
||||
*/
|
||||
titleTemplate?: string;
|
||||
};
|
||||
|
||||
/**
|
||||
* Configuration for extracting nested field values
|
||||
*/
|
||||
nestedFields?: {
|
||||
/**
|
||||
* How to extract first/last name from a nested Name field
|
||||
* Example: { firstNamePath: "first", lastNamePath: "last" }
|
||||
* for a field like { "Name": { "first": "John", "last": "Doe" } }
|
||||
*/
|
||||
name?: {
|
||||
firstNamePath?: string;
|
||||
lastNamePath?: string;
|
||||
};
|
||||
};
|
||||
}
|
||||
|
||||
let cachedMapping: FieldMappingConfig | null = null;
|
||||
|
||||
/**
|
||||
* Load field mapping configuration from environment variable (REQUIRED)
|
||||
*/
|
||||
export function loadFieldMapping(): FieldMappingConfig {
|
||||
if (cachedMapping) {
|
||||
return cachedMapping;
|
||||
}
|
||||
|
||||
const configJson = process.env.FORMSTACK_FIELD_MAPPING;
|
||||
|
||||
if (!configJson) {
|
||||
throw new Error(
|
||||
'FORMSTACK_FIELD_MAPPING environment variable is required. ' +
|
||||
'Please set it to a JSON string containing your field mapping configuration.'
|
||||
);
|
||||
}
|
||||
|
||||
logger.info('Loading Formstack field mapping from environment variable');
|
||||
|
||||
try {
|
||||
const config = JSON.parse(configJson) as FieldMappingConfig;
|
||||
|
||||
// Validate required sections exist
|
||||
if (!config.sourceFields || typeof config.sourceFields !== 'object') {
|
||||
throw new Error('Invalid field mapping configuration: sourceFields must be an object');
|
||||
}
|
||||
|
||||
if (!config.zammadFields || typeof config.zammadFields !== 'object') {
|
||||
throw new Error('Invalid field mapping configuration: zammadFields must be an object');
|
||||
}
|
||||
|
||||
if (!config.ticket || typeof config.ticket !== 'object') {
|
||||
throw new Error('Invalid field mapping configuration: ticket must be an object');
|
||||
}
|
||||
|
||||
// Validate required ticket fields
|
||||
if (!config.ticket.group) {
|
||||
throw new Error('Invalid field mapping configuration: ticket.group is required');
|
||||
}
|
||||
|
||||
if (!config.ticket.defaultArticleType) {
|
||||
throw new Error('Invalid field mapping configuration: ticket.defaultArticleType is required');
|
||||
}
|
||||
|
||||
// Validate required source fields
|
||||
const systemRequiredFields = ['formId', 'uniqueId'];
|
||||
for (const field of systemRequiredFields) {
|
||||
if (!config.sourceFields[field]) {
|
||||
throw new Error(`Invalid field mapping configuration: sourceFields.${field} is required (system field)`);
|
||||
}
|
||||
}
|
||||
|
||||
// Validate zammadFields reference valid sourceFields
|
||||
for (const [zammadField, sourceKey] of Object.entries(config.zammadFields)) {
|
||||
if (!config.sourceFields[sourceKey]) {
|
||||
logger.warn(
|
||||
{ zammadField, sourceKey },
|
||||
'Zammad field maps to non-existent source field key'
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
logger.info('Successfully loaded Formstack field mapping configuration');
|
||||
cachedMapping = config;
|
||||
return cachedMapping;
|
||||
|
||||
} catch (error) {
|
||||
logger.error({
|
||||
error: error instanceof Error ? error.message : error,
|
||||
jsonLength: configJson.length
|
||||
}, 'Failed to parse field mapping configuration');
|
||||
|
||||
throw new Error(
|
||||
`Failed to parse Formstack field mapping JSON: ${error instanceof Error ? error.message : error}`
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get a field value from formData using the source field name mapping
|
||||
*/
|
||||
export function getFieldValue(
|
||||
formData: any,
|
||||
internalFieldKey: string,
|
||||
mapping?: FieldMappingConfig
|
||||
): any {
|
||||
const config = mapping || loadFieldMapping();
|
||||
const sourceFieldName = config.sourceFields[internalFieldKey];
|
||||
if (!sourceFieldName) {
|
||||
return undefined;
|
||||
}
|
||||
return formData[sourceFieldName];
|
||||
}
|
||||
|
||||
/**
|
||||
* Get a nested field value (e.g., Name.first)
|
||||
*/
|
||||
export function getNestedFieldValue(
|
||||
fieldValue: any,
|
||||
path: string | undefined
|
||||
): any {
|
||||
if (!path || !fieldValue) {
|
||||
return undefined;
|
||||
}
|
||||
|
||||
const parts = path.split('.');
|
||||
let current = fieldValue;
|
||||
|
||||
for (const part of parts) {
|
||||
if (current && typeof current === 'object') {
|
||||
current = current[part];
|
||||
} else {
|
||||
return undefined;
|
||||
}
|
||||
}
|
||||
|
||||
return current;
|
||||
}
|
||||
|
||||
/**
|
||||
* Format field value (handle arrays, objects, etc.)
|
||||
*/
|
||||
export function formatFieldValue(value: any): string | undefined {
|
||||
if (value === null || value === undefined || value === '') {
|
||||
return undefined;
|
||||
}
|
||||
if (Array.isArray(value)) {
|
||||
return value.join(', ');
|
||||
}
|
||||
if (typeof value === 'object') {
|
||||
return JSON.stringify(value);
|
||||
}
|
||||
return String(value);
|
||||
}
|
||||
|
||||
/**
|
||||
* Build ticket title from template and data
|
||||
* Replaces placeholders like {name}, {organization}, {typeOfSupport} with provided values
|
||||
*/
|
||||
export function buildTicketTitle(
|
||||
mapping: FieldMappingConfig,
|
||||
data: Record<string, string | undefined>
|
||||
): string {
|
||||
const template = mapping.ticket.titleTemplate || '{name}';
|
||||
|
||||
let title = template;
|
||||
|
||||
// Replace all placeholders in the template
|
||||
for (const [key, value] of Object.entries(data)) {
|
||||
const placeholder = `{${key}}`;
|
||||
if (title.includes(placeholder)) {
|
||||
if (value) {
|
||||
title = title.replace(placeholder, value);
|
||||
} else {
|
||||
// Remove empty placeholder and surrounding separators
|
||||
title = title.replace(` - ${placeholder}`, '').replace(`${placeholder} - `, '').replace(placeholder, '');
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return title.trim();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all Zammad field values from form data using the mapping
|
||||
* Returns an object with Zammad field names as keys and formatted values
|
||||
*/
|
||||
export function getZammadFieldValues(
|
||||
formData: any,
|
||||
mapping?: FieldMappingConfig
|
||||
): Record<string, string> {
|
||||
const config = mapping || loadFieldMapping();
|
||||
const result: Record<string, string> = {};
|
||||
|
||||
for (const [zammadFieldName, sourceKey] of Object.entries(config.zammadFields)) {
|
||||
const value = getFieldValue(formData, sourceKey, config);
|
||||
const formatted = formatFieldValue(value);
|
||||
if (formatted !== undefined) {
|
||||
result[zammadFieldName] = formatted;
|
||||
}
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Reset cached mapping (useful for testing)
|
||||
*/
|
||||
export function resetMappingCache(): void {
|
||||
cachedMapping = null;
|
||||
}
|
||||
|
|
@ -1,87 +0,0 @@
|
|||
import { Readable } from "stream";
|
||||
import ffmpeg from "fluent-ffmpeg";
|
||||
import * as R from "remeda";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger('bridge-worker-media-convert');
|
||||
|
||||
const requiredCodecs = ["mp3", "webm", "wav"];
|
||||
|
||||
export interface AudioConvertOpts {
|
||||
bitrate?: string;
|
||||
audioCodec?: string;
|
||||
format?: string;
|
||||
}
|
||||
|
||||
const defaultAudioConvertOpts = {
|
||||
bitrate: "32k",
|
||||
audioCodec: "libmp3lame",
|
||||
format: "mp3",
|
||||
};
|
||||
|
||||
/**
|
||||
* Converts an audio file to a different format. defaults to converting to mp3 with a 32k bitrate using the libmp3lame codec
|
||||
*
|
||||
* @param input the buffer containing the binary data of the input file
|
||||
* @param opts options to control how the audio file is converted
|
||||
* @return resolves to a buffer containing the binary data of the converted file
|
||||
**/
|
||||
export const convert = (
|
||||
input: Buffer,
|
||||
opts?: AudioConvertOpts,
|
||||
): Promise<Buffer> => {
|
||||
const settings = { ...defaultAudioConvertOpts, ...opts };
|
||||
return new Promise((resolve, reject) => {
|
||||
const stream = Readable.from(input);
|
||||
let out = Buffer.alloc(0);
|
||||
const cmd = ffmpeg(stream)
|
||||
.audioCodec(settings.audioCodec)
|
||||
.audioBitrate(settings.bitrate)
|
||||
.toFormat(settings.format)
|
||||
.on("error", (err, _stdout, _stderr) => {
|
||||
logger.error({ error: err }, 'FFmpeg conversion error');
|
||||
reject(err);
|
||||
})
|
||||
.on("end", () => {
|
||||
resolve(out);
|
||||
});
|
||||
const outstream = cmd.pipe();
|
||||
outstream.on("data", (chunk: Buffer) => {
|
||||
out = Buffer.concat([out, chunk]);
|
||||
});
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* Check if ffmpeg is installed and usable. Checks for required codecs and a working ffmpeg installation.
|
||||
*
|
||||
* @return resolves to true if ffmpeg is installed and usable
|
||||
* */
|
||||
export const selfCheck = (): Promise<boolean> => {
|
||||
return new Promise((resolve) => {
|
||||
ffmpeg.getAvailableFormats((err, codecs) => {
|
||||
if (err) {
|
||||
logger.error({ error: err }, 'FFMPEG error');
|
||||
resolve(false);
|
||||
}
|
||||
|
||||
const preds = R.map(
|
||||
requiredCodecs,
|
||||
(codec) => (available: any) =>
|
||||
available[codec] &&
|
||||
available[codec].canDemux &&
|
||||
available[codec].canMux,
|
||||
);
|
||||
|
||||
resolve(R.allPass(codecs, preds));
|
||||
});
|
||||
});
|
||||
};
|
||||
|
||||
export const assertFfmpegAvailable = async (): Promise<void> => {
|
||||
const r = await selfCheck();
|
||||
if (!r)
|
||||
throw new Error(
|
||||
`ffmpeg is not installed, could not be located, or does not support the required codecs: ${requiredCodecs}`,
|
||||
);
|
||||
};
|
||||
|
|
@ -1,69 +0,0 @@
|
|||
export const tagMap = {
|
||||
AccountImpersonation: [
|
||||
{ field: "incidentType tag", value: "account-impersonation" },
|
||||
],
|
||||
AppleID: [{ field: "incidentType tag", value: "malfunction-failure" }],
|
||||
Blocked: [{ field: "incidentType tag", value: "account-deactivation" }],
|
||||
CyberBullying: [{ field: "incidentType tag", value: "cyber-bullying" }],
|
||||
DeviceSuspiciousBehavior: [
|
||||
{ field: "incidentType tag", value: "compromise-device" },
|
||||
],
|
||||
Doxxing: [{ field: "incidentType tag", value: "doxxing" }],
|
||||
DSTips: [{ field: "incidentType tag", value: "informational" }],
|
||||
HackedLaptop: [
|
||||
{ field: "incidentType tag", value: "compromised-device" },
|
||||
{ field: "device tag", value: "laptop" },
|
||||
],
|
||||
"Hacked/StolenAccount": [
|
||||
{ field: "incidentType tag", value: "compromised-account" },
|
||||
],
|
||||
HateSpeech: [{ field: "incidentType tag", value: "hate-speech" }],
|
||||
InfectedPhone: [
|
||||
{ field: "incidentType tag", value: "malware" },
|
||||
{ field: "device tag", value: "smartphone" },
|
||||
],
|
||||
Kidnapping: [{ field: "incidentType tag", value: "kidnapping" }],
|
||||
LaptopGiveaway: [{ field: "incidentType tag", value: "other" }],
|
||||
ForensicAnalysis: [{ field: "incidentType tag", value: "malware" }],
|
||||
ISF: [{ field: "incidentType tag", value: "other" }],
|
||||
NumberBanned: [
|
||||
{ field: "incidentType tag", value: "disruption" },
|
||||
{ field: "device tag", value: "smartphone" },
|
||||
],
|
||||
OnlineHarassment: [{ field: "incidentType tag", value: "online-harassment" }],
|
||||
PhoneHarassment: [{ field: "incidentType tag", value: "phone-harassment" }],
|
||||
PoliticalAds: [{ field: "incidentType tag", value: "spam" }],
|
||||
SeizedPhone: [
|
||||
{ field: "incidentType tag", value: "confiscation" },
|
||||
{ field: "device tag", value: "smartphone" },
|
||||
],
|
||||
SexED: [{ field: "incidentType tag", value: "informational" }],
|
||||
Sextortion: [{ field: "incidentType tag", value: "sextortion" }],
|
||||
Spam: [{ field: "incidentType tag", value: "spam" }],
|
||||
SuspendedAccount: [
|
||||
{ field: "incidentType tag", value: "account-suspension" },
|
||||
],
|
||||
SuspendedActivities: [
|
||||
{ field: "incidentType tag", value: "content-moderation" },
|
||||
],
|
||||
SuspendedGroup: [{ field: "incidentType tag", value: "account-suspension" }],
|
||||
SuspendedPage: [{ field: "incidentType tag", value: "account-suspension" }],
|
||||
"Stolen/LostPhone": [
|
||||
{ field: "incidentType tag", value: "loss" },
|
||||
{ field: "device tag", value: "smartphone" },
|
||||
],
|
||||
Facebook: [{ field: "platform tag", value: "facebook" }],
|
||||
Google: [{ field: "platform tag", value: "google" }],
|
||||
Instagram: [{ field: "platform tag", value: "instagram" }],
|
||||
SMS: [{ field: "service tag", value: "sms" }],
|
||||
Twitter: [{ field: "platform tag", value: "twitter" }],
|
||||
Website: [{ field: "service tag", value: "website" }],
|
||||
WhatsApp: [{ field: "platform tag", value: "whatsapp" }],
|
||||
YouTube: [{ field: "platform tag", value: "youtube" }],
|
||||
Linkedin: [{ field: "platform tag", value: "linkedin" }],
|
||||
PoliticalActivist: [{ field: "targetedGroup tag", value: "policy-politics" }],
|
||||
ElectoralCandidate: [
|
||||
{ field: "targetedGroup tag", value: "policy-politics" },
|
||||
],
|
||||
PhishingLink: [{ field: "incidentType tag", value: "phishing" }],
|
||||
};
|
||||
|
|
@ -1,26 +0,0 @@
|
|||
import * as Worker from "graphile-worker";
|
||||
// import { defState } from "@digiresilience/montar";
|
||||
//import config from "@digiresilience/bridge-config";
|
||||
|
||||
/*
|
||||
const startWorkerUtils = async (): Promise<Worker.WorkerUtils> => {
|
||||
const workerUtils = await Worker.makeWorkerUtils({
|
||||
connectionString: config.worker.connection,
|
||||
});
|
||||
return workerUtils;
|
||||
};
|
||||
|
||||
const stopWorkerUtils = async (): Promise<void> => {
|
||||
return workerUtils.release();
|
||||
};
|
||||
|
||||
const workerUtils = defState("workerUtils", {
|
||||
start: startWorkerUtils,
|
||||
stop: stopWorkerUtils,
|
||||
});
|
||||
|
||||
|
||||
*/
|
||||
|
||||
export const workerUtils: any = {};
|
||||
export default workerUtils;
|
||||
|
|
@ -1,171 +0,0 @@
|
|||
/* eslint-disable camelcase,@typescript-eslint/explicit-module-boundary-types,@typescript-eslint/no-explicit-any */
|
||||
import querystring from "querystring";
|
||||
import Wreck from "@hapi/wreck";
|
||||
|
||||
export interface User {
|
||||
id: number;
|
||||
firstname?: string;
|
||||
lastname?: string;
|
||||
email?: string;
|
||||
phone?: string;
|
||||
}
|
||||
export interface Ticket {
|
||||
id: number;
|
||||
title?: string;
|
||||
group_id?: number;
|
||||
customer_id?: number;
|
||||
}
|
||||
|
||||
export interface ZammadClient {
|
||||
ticket: {
|
||||
create: (data: any) => Promise<Ticket>;
|
||||
update: (id: number, data: any) => Promise<Ticket>;
|
||||
};
|
||||
user: {
|
||||
search: (data: any) => Promise<User[]>;
|
||||
create: (data: any) => Promise<User>;
|
||||
};
|
||||
get: (path: string) => Promise<any>;
|
||||
}
|
||||
|
||||
export type ZammadCredentials =
|
||||
| { username: string; password: string }
|
||||
| { token: string };
|
||||
|
||||
export interface ZammadClientOpts {
|
||||
headers?: Record<string, any>;
|
||||
}
|
||||
|
||||
const formatAuth = (credentials: any) => {
|
||||
if (credentials.username) {
|
||||
return (
|
||||
"Basic " +
|
||||
Buffer.from(`${credentials.username}:${credentials.password}`).toString(
|
||||
"base64",
|
||||
)
|
||||
);
|
||||
}
|
||||
|
||||
if (credentials.token) {
|
||||
return `Token ${credentials.token}`;
|
||||
}
|
||||
|
||||
throw new Error("invalid zammad credentials type");
|
||||
};
|
||||
|
||||
export const Zammad = (
|
||||
credentials: ZammadCredentials,
|
||||
host: string,
|
||||
opts?: ZammadClientOpts,
|
||||
): ZammadClient => {
|
||||
const extraHeaders = (opts && opts.headers) || {};
|
||||
|
||||
const wreck = Wreck.defaults({
|
||||
baseUrl: `${host}/api/v1/`,
|
||||
headers: {
|
||||
authorization: formatAuth(credentials),
|
||||
...extraHeaders,
|
||||
},
|
||||
json: true,
|
||||
});
|
||||
|
||||
return {
|
||||
ticket: {
|
||||
create: async (payload) => {
|
||||
const { payload: result } = await wreck.post("tickets", { payload });
|
||||
return result as Ticket;
|
||||
},
|
||||
update: async (id, payload) => {
|
||||
const { payload: result } = await wreck.put(`tickets/${id}`, {
|
||||
payload,
|
||||
});
|
||||
return result as Ticket;
|
||||
},
|
||||
},
|
||||
user: {
|
||||
search: async (query) => {
|
||||
const qp = querystring.stringify({ query });
|
||||
const { payload: result } = await wreck.get(`users/search?${qp}`);
|
||||
return result as User[];
|
||||
},
|
||||
create: async (payload) => {
|
||||
const { payload: result } = await wreck.post("users", { payload });
|
||||
return result as User;
|
||||
},
|
||||
},
|
||||
get: async (path) => {
|
||||
const { payload: result } = await wreck.get(path);
|
||||
return result;
|
||||
},
|
||||
};
|
||||
};
|
||||
|
||||
/**
|
||||
* Sanitizes phone number to E.164 format: +15554446666
|
||||
* Strips all non-digit characters except +, ensures + prefix
|
||||
* @param phoneNumber - Raw phone number (e.g., "(555) 444-6666", "5554446666", "+1 555 444 6666")
|
||||
* @returns E.164 formatted phone number (e.g., "+15554446666")
|
||||
* @throws Error if phone number is invalid
|
||||
*/
|
||||
export const sanitizePhoneNumber = (phoneNumber: string): string => {
|
||||
// Remove all characters except digits and +
|
||||
let cleaned = phoneNumber.replace(/[^\d+]/g, "");
|
||||
|
||||
// Ensure it starts with +
|
||||
if (!cleaned.startsWith("+")) {
|
||||
// Assume US/Canada if no country code (11 digits starting with 1, or 10 digits)
|
||||
if (cleaned.length === 10) {
|
||||
cleaned = "+1" + cleaned;
|
||||
} else if (cleaned.length === 11 && cleaned.startsWith("1")) {
|
||||
cleaned = "+" + cleaned;
|
||||
} else if (cleaned.length >= 10) {
|
||||
// International number without +, add it
|
||||
cleaned = "+" + cleaned;
|
||||
}
|
||||
}
|
||||
|
||||
// Validate E.164 format: + followed by 10-15 digits
|
||||
if (!/^\+\d{10,15}$/.test(cleaned)) {
|
||||
throw new Error(`Invalid phone number format: ${phoneNumber}`);
|
||||
}
|
||||
|
||||
return cleaned;
|
||||
};
|
||||
|
||||
export const getUser = async (zammad: ZammadClient, phoneNumber: string) => {
|
||||
// Sanitize to E.164 format
|
||||
const sanitized = sanitizePhoneNumber(phoneNumber);
|
||||
|
||||
// Remove + for Zammad search query
|
||||
const searchNumber = sanitized.replace("+", "");
|
||||
|
||||
// Try sanitized format first (e.g., "6464229653" for "+16464229653")
|
||||
let results = await zammad.user.search(`phone:${searchNumber}`);
|
||||
if (results.length > 0) return results[0];
|
||||
|
||||
// Fall back to searching for original input (handles legacy formatted numbers)
|
||||
// This ensures we can find users with "(646) 422-9653" format in database
|
||||
const originalCleaned = phoneNumber.replace(/[^\d+]/g, "").replace("+", "");
|
||||
if (originalCleaned !== searchNumber) {
|
||||
results = await zammad.user.search(`phone:${originalCleaned}`);
|
||||
if (results.length > 0) return results[0];
|
||||
}
|
||||
|
||||
return undefined;
|
||||
};
|
||||
|
||||
export const getOrCreateUser = async (
|
||||
zammad: ZammadClient,
|
||||
phoneNumber: string,
|
||||
) => {
|
||||
const customer = await getUser(zammad, phoneNumber);
|
||||
if (customer) return customer;
|
||||
|
||||
// Sanitize phone number to E.164 format before storing
|
||||
const sanitized = sanitizePhoneNumber(phoneNumber);
|
||||
|
||||
return zammad.user.create({
|
||||
phone: sanitized,
|
||||
note: "User created from incoming voice call",
|
||||
});
|
||||
};
|
||||
|
|
@ -1,30 +0,0 @@
|
|||
{
|
||||
"name": "@link-stack/bridge-worker",
|
||||
"version": "3.3.5",
|
||||
"type": "module",
|
||||
"main": "build/main/index.js",
|
||||
"author": "Darren Clarke <darren@redaranj.com>",
|
||||
"license": "AGPL-3.0-or-later",
|
||||
"scripts": {
|
||||
"build": "tsc -p tsconfig.json && cp crontab build/main/crontab",
|
||||
"dev": "dotenv -- graphile-worker",
|
||||
"start": "node build/main/index.js"
|
||||
},
|
||||
"dependencies": {
|
||||
"@hapi/wreck": "^18.1.0",
|
||||
"@link-stack/bridge-common": "workspace:*",
|
||||
"@link-stack/logger": "workspace:*",
|
||||
"@link-stack/signal-api": "workspace:*",
|
||||
"fluent-ffmpeg": "^2.1.3",
|
||||
"graphile-worker": "^0.16.6",
|
||||
"remeda": "^2.32.0",
|
||||
"twilio": "^5.10.2"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@types/fluent-ffmpeg": "^2.1.27",
|
||||
"dotenv-cli": "^10.0.0",
|
||||
"@link-stack/eslint-config": "workspace:*",
|
||||
"@link-stack/typescript-config": "workspace:*",
|
||||
"typescript": "^5.9.3"
|
||||
}
|
||||
}
|
||||
|
|
@ -1,121 +0,0 @@
|
|||
#!/usr/bin/env node
|
||||
/**
|
||||
* Check Signal group membership status and update Zammad tickets
|
||||
*
|
||||
* This task queries the Signal CLI API to check if users have joined
|
||||
* their assigned groups. When a user joins (moves from pendingInvites to members),
|
||||
* it updates the ticket's group_joined flag in Zammad.
|
||||
*
|
||||
* Note: This task sends webhooks for all group members every time it runs.
|
||||
* The Zammad webhook handler is idempotent and will ignore duplicate notifications
|
||||
* if group_joined is already true.
|
||||
*/
|
||||
|
||||
import { db, getWorkerUtils } from "@link-stack/bridge-common";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import * as signalApi from "@link-stack/signal-api";
|
||||
|
||||
const logger = createLogger("check-group-membership");
|
||||
|
||||
const { Configuration, GroupsApi } = signalApi;
|
||||
|
||||
interface CheckGroupMembershipTaskOptions {
|
||||
// Optional: Check specific group. If not provided, checks all groups with group_joined=false
|
||||
groupId?: string;
|
||||
botToken?: string;
|
||||
}
|
||||
|
||||
const checkGroupMembershipTask = async (
|
||||
options: CheckGroupMembershipTaskOptions = {},
|
||||
): Promise<void> => {
|
||||
const config = new Configuration({
|
||||
basePath: process.env.BRIDGE_SIGNAL_URL,
|
||||
});
|
||||
const groupsClient = new GroupsApi(config);
|
||||
const worker = await getWorkerUtils();
|
||||
|
||||
// Get all Signal bots
|
||||
const bots = await db.selectFrom("SignalBot").selectAll().execute();
|
||||
|
||||
for (const bot of bots) {
|
||||
try {
|
||||
logger.debug(
|
||||
{ botId: bot.id, phoneNumber: bot.phoneNumber },
|
||||
"Checking groups for bot",
|
||||
);
|
||||
|
||||
// Get all groups for this bot
|
||||
const groups = await groupsClient.v1GroupsNumberGet({
|
||||
number: bot.phoneNumber,
|
||||
});
|
||||
|
||||
logger.debug(
|
||||
{ botId: bot.id, groupCount: groups.length },
|
||||
"Retrieved groups from Signal CLI",
|
||||
);
|
||||
|
||||
// For each group, check if we have tickets waiting for members to join
|
||||
for (const group of groups) {
|
||||
if (!group.id || !group.internalId) {
|
||||
logger.debug({ groupName: group.name }, "Skipping group without ID");
|
||||
continue;
|
||||
}
|
||||
|
||||
// Log info about each group temporarily for debugging
|
||||
logger.info(
|
||||
{
|
||||
groupId: group.id,
|
||||
groupName: group.name,
|
||||
membersCount: group.members?.length || 0,
|
||||
members: group.members,
|
||||
pendingInvitesCount: group.pendingInvites?.length || 0,
|
||||
pendingInvites: group.pendingInvites,
|
||||
pendingRequestsCount: group.pendingRequests?.length || 0,
|
||||
},
|
||||
"Checking group membership",
|
||||
);
|
||||
|
||||
// Notify Zammad about each member who has joined
|
||||
// This handles both cases:
|
||||
// 1. New contacts who must accept invite (they move from pendingInvites to members)
|
||||
// 2. Existing contacts who are auto-added (they appear directly in members)
|
||||
if (group.members && group.members.length > 0) {
|
||||
for (const memberPhone of group.members) {
|
||||
// Check if this member was previously pending
|
||||
// We'll send the webhook and let Zammad decide if it needs to update
|
||||
await worker.addJob("common/notify-webhooks", {
|
||||
backendId: bot.id,
|
||||
payload: {
|
||||
event: "group_member_joined",
|
||||
group_id: group.id,
|
||||
member_phone: memberPhone,
|
||||
timestamp: new Date().toISOString(),
|
||||
},
|
||||
});
|
||||
|
||||
logger.info(
|
||||
{
|
||||
groupId: group.id,
|
||||
memberPhone,
|
||||
},
|
||||
"Notified Zammad about group member",
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (error: any) {
|
||||
logger.error(
|
||||
{
|
||||
botId: bot.id,
|
||||
error: error.message,
|
||||
stack: error.stack,
|
||||
},
|
||||
"Error checking group membership for bot",
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
logger.info("Completed group membership check");
|
||||
};
|
||||
|
||||
export default checkGroupMembershipTask;
|
||||
|
|
@ -1,72 +0,0 @@
|
|||
import { db } from "@link-stack/bridge-common";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger('notify-webhooks');
|
||||
|
||||
export interface NotifyWebhooksOptions {
|
||||
backendId: string;
|
||||
payload: any;
|
||||
}
|
||||
|
||||
const notifyWebhooksTask = async (
|
||||
options: NotifyWebhooksOptions,
|
||||
): Promise<void> => {
|
||||
const { backendId, payload } = options;
|
||||
|
||||
logger.debug({
|
||||
backendId,
|
||||
payloadKeys: Object.keys(payload),
|
||||
}, 'Processing webhook notification');
|
||||
|
||||
const webhooks = await db
|
||||
.selectFrom("Webhook")
|
||||
.selectAll()
|
||||
.where("backendId", "=", backendId)
|
||||
.execute();
|
||||
|
||||
logger.debug({ count: webhooks.length, backendId }, 'Found webhooks');
|
||||
|
||||
for (const webhook of webhooks) {
|
||||
const { endpointUrl, httpMethod, headers } = webhook;
|
||||
const finalHeaders = { "Content-Type": "application/json", ...headers };
|
||||
const body = JSON.stringify(payload);
|
||||
|
||||
logger.debug({
|
||||
url: endpointUrl,
|
||||
method: httpMethod,
|
||||
bodyLength: body.length,
|
||||
headerKeys: Object.keys(finalHeaders),
|
||||
}, 'Sending webhook');
|
||||
|
||||
try {
|
||||
const result = await fetch(endpointUrl, {
|
||||
method: httpMethod,
|
||||
headers: finalHeaders,
|
||||
body,
|
||||
});
|
||||
|
||||
logger.debug({
|
||||
url: endpointUrl,
|
||||
status: result.status,
|
||||
statusText: result.statusText,
|
||||
ok: result.ok,
|
||||
}, 'Webhook response');
|
||||
|
||||
if (!result.ok) {
|
||||
const responseText = await result.text();
|
||||
logger.error({
|
||||
url: endpointUrl,
|
||||
status: result.status,
|
||||
responseSample: responseText.substring(0, 500),
|
||||
}, 'Webhook error response');
|
||||
}
|
||||
} catch (error) {
|
||||
logger.error({
|
||||
url: endpointUrl,
|
||||
error: error instanceof Error ? error.message : error,
|
||||
}, 'Webhook request failed');
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
export default notifyWebhooksTask;
|
||||
|
|
@ -1,34 +0,0 @@
|
|||
import { db, getWorkerUtils } from "@link-stack/bridge-common";
|
||||
|
||||
interface ReceiveFacebookMessageTaskOptions {
|
||||
message: any;
|
||||
}
|
||||
|
||||
const receiveFacebookMessageTask = async ({
|
||||
message,
|
||||
}: ReceiveFacebookMessageTaskOptions): Promise<void> => {
|
||||
const worker = await getWorkerUtils();
|
||||
|
||||
for (const entry of message.entry) {
|
||||
for (const messaging of entry.messaging) {
|
||||
const pageId = messaging.recipient.id;
|
||||
const row = await db
|
||||
.selectFrom("FacebookBot")
|
||||
.selectAll()
|
||||
.where("pageId", "=", pageId)
|
||||
.executeTakeFirstOrThrow();
|
||||
const backendId = row.id;
|
||||
const payload = {
|
||||
to: pageId,
|
||||
from: messaging.sender.id,
|
||||
sent_at: new Date(messaging.timestamp).toISOString(),
|
||||
message: messaging.message.text,
|
||||
message_id: messaging.message.mid,
|
||||
};
|
||||
|
||||
await worker.addJob("common/notify-webhooks", { backendId, payload });
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
export default receiveFacebookMessageTask;
|
||||
|
|
@ -1,43 +0,0 @@
|
|||
import { db } from "@link-stack/bridge-common";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
|
||||
const logger = createLogger('bridge-worker-send-facebook-message');
|
||||
|
||||
interface SendFacebookMessageTaskOptions {
|
||||
token: string;
|
||||
to: string;
|
||||
message: string;
|
||||
}
|
||||
|
||||
const sendFacebookMessageTask = async (
|
||||
options: SendFacebookMessageTaskOptions,
|
||||
): Promise<void> => {
|
||||
const { token, to, message } = options;
|
||||
const { pageId, pageAccessToken } = await db
|
||||
.selectFrom("FacebookBot")
|
||||
.selectAll()
|
||||
.where("token", "=", token)
|
||||
.executeTakeFirstOrThrow();
|
||||
|
||||
const endpoint = `https://graph.facebook.com/v19.0/${pageId}/messages`;
|
||||
|
||||
const outgoingMessage = {
|
||||
recipient: { id: to },
|
||||
message: { text: message },
|
||||
messaging_type: "RESPONSE",
|
||||
access_token: pageAccessToken,
|
||||
};
|
||||
|
||||
try {
|
||||
const response = await fetch(endpoint, {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify(outgoingMessage),
|
||||
});
|
||||
} catch (error) {
|
||||
logger.error({ error });
|
||||
throw error;
|
||||
}
|
||||
};
|
||||
|
||||
export default sendFacebookMessageTask;
|
||||
|
|
@ -1,258 +0,0 @@
|
|||
import { db, getWorkerUtils } from "@link-stack/bridge-common";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import * as signalApi from "@link-stack/signal-api";
|
||||
|
||||
const logger = createLogger("fetch-signal-messages");
|
||||
|
||||
const { Configuration, MessagesApi, AttachmentsApi } = signalApi;
|
||||
const config = new Configuration({
|
||||
basePath: process.env.BRIDGE_SIGNAL_URL,
|
||||
});
|
||||
|
||||
const fetchAttachments = async (attachments: any[] | undefined) => {
|
||||
const formattedAttachments = [];
|
||||
|
||||
if (attachments) {
|
||||
const attachmentsClient = new AttachmentsApi(config);
|
||||
|
||||
for (const att of attachments) {
|
||||
const { id, contentType, filename: name } = att;
|
||||
|
||||
const blob = await attachmentsClient.v1AttachmentsAttachmentGet({
|
||||
attachment: id,
|
||||
});
|
||||
const arrayBuffer = await blob.arrayBuffer();
|
||||
const base64Attachment = Buffer.from(arrayBuffer).toString("base64");
|
||||
|
||||
// Generate default filename if not provided by Signal API
|
||||
let defaultFilename = name;
|
||||
if (!defaultFilename) {
|
||||
// Check if id already has an extension
|
||||
const hasExtension = id.includes(".");
|
||||
if (hasExtension) {
|
||||
// ID already includes extension
|
||||
defaultFilename = id;
|
||||
} else {
|
||||
// Add extension based on content type
|
||||
const extension = contentType?.split("/")[1] || "bin";
|
||||
defaultFilename = `${id}.${extension}`;
|
||||
}
|
||||
}
|
||||
|
||||
const formattedAttachment = {
|
||||
filename: defaultFilename,
|
||||
mimeType: contentType,
|
||||
attachment: base64Attachment,
|
||||
};
|
||||
|
||||
formattedAttachments.push(formattedAttachment);
|
||||
}
|
||||
}
|
||||
|
||||
return formattedAttachments;
|
||||
};
|
||||
|
||||
type ProcessMessageArgs = {
|
||||
id: string;
|
||||
phoneNumber: string;
|
||||
message: any;
|
||||
};
|
||||
|
||||
const processMessage = async ({
|
||||
id,
|
||||
phoneNumber,
|
||||
message: msg,
|
||||
}: ProcessMessageArgs): Promise<Record<string, any>[]> => {
|
||||
const { envelope } = msg;
|
||||
const { source, sourceUuid, dataMessage, syncMessage, receiptMessage, typingMessage } =
|
||||
envelope;
|
||||
|
||||
// Log all envelope types to understand what events we're receiving
|
||||
logger.info(
|
||||
{
|
||||
source,
|
||||
sourceUuid,
|
||||
hasDataMessage: !!dataMessage,
|
||||
hasSyncMessage: !!syncMessage,
|
||||
hasReceiptMessage: !!receiptMessage,
|
||||
hasTypingMessage: !!typingMessage,
|
||||
envelopeKeys: Object.keys(envelope),
|
||||
},
|
||||
"Received Signal envelope",
|
||||
);
|
||||
|
||||
const isGroup = !!(
|
||||
dataMessage?.groupV2 ||
|
||||
dataMessage?.groupContext ||
|
||||
dataMessage?.groupInfo
|
||||
);
|
||||
|
||||
// Check if this is a group membership change event
|
||||
const groupInfo = dataMessage?.groupInfo;
|
||||
if (groupInfo) {
|
||||
logger.info(
|
||||
{
|
||||
type: groupInfo.type,
|
||||
groupId: groupInfo.groupId,
|
||||
source,
|
||||
groupInfoKeys: Object.keys(groupInfo),
|
||||
fullGroupInfo: groupInfo,
|
||||
},
|
||||
"Received group info event",
|
||||
);
|
||||
|
||||
// If user joined the group, notify Zammad
|
||||
if (groupInfo.type === "JOIN" || groupInfo.type === "JOINED") {
|
||||
const worker = await getWorkerUtils();
|
||||
const groupId = groupInfo.groupId
|
||||
? `group.${Buffer.from(groupInfo.groupId).toString("base64")}`
|
||||
: null;
|
||||
|
||||
if (groupId) {
|
||||
await worker.addJob("common/notify-webhooks", {
|
||||
backendId: id,
|
||||
payload: {
|
||||
event: "group_member_joined",
|
||||
group_id: groupId,
|
||||
member_phone: source,
|
||||
timestamp: new Date().toISOString(),
|
||||
},
|
||||
});
|
||||
|
||||
logger.info(
|
||||
{
|
||||
groupId,
|
||||
memberPhone: source,
|
||||
},
|
||||
"User joined Signal group, notifying Zammad",
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (!dataMessage) return [];
|
||||
|
||||
const { attachments } = dataMessage;
|
||||
const rawTimestamp = dataMessage?.timestamp;
|
||||
|
||||
logger.debug(
|
||||
{
|
||||
sourceUuid,
|
||||
source,
|
||||
rawTimestamp,
|
||||
hasGroupV2: !!dataMessage?.groupV2,
|
||||
hasGroupContext: !!dataMessage?.groupContext,
|
||||
hasGroupInfo: !!dataMessage?.groupInfo,
|
||||
isGroup,
|
||||
groupV2Id: dataMessage?.groupV2?.id,
|
||||
groupContextType: dataMessage?.groupContext?.type,
|
||||
groupInfoType: dataMessage?.groupInfo?.type,
|
||||
},
|
||||
"Processing message",
|
||||
);
|
||||
const timestamp = new Date(rawTimestamp);
|
||||
|
||||
const formattedAttachments = await fetchAttachments(attachments);
|
||||
const primaryAttachment = formattedAttachments[0] ?? {};
|
||||
const additionalAttachments = formattedAttachments.slice(1);
|
||||
|
||||
const groupId =
|
||||
dataMessage?.groupV2?.id ||
|
||||
dataMessage?.groupContext?.id ||
|
||||
dataMessage?.groupInfo?.groupId;
|
||||
const toRecipient = groupId
|
||||
? `group.${Buffer.from(groupId).toString("base64")}`
|
||||
: phoneNumber;
|
||||
|
||||
const primaryMessage = {
|
||||
token: id,
|
||||
to: toRecipient,
|
||||
from: source,
|
||||
messageId: `${sourceUuid}-${rawTimestamp}`,
|
||||
message: dataMessage?.message,
|
||||
sentAt: timestamp.toISOString(),
|
||||
attachment: primaryAttachment.attachment,
|
||||
filename: primaryAttachment.filename,
|
||||
mimeType: primaryAttachment.mimeType,
|
||||
isGroup,
|
||||
};
|
||||
const formattedMessages = [primaryMessage];
|
||||
|
||||
let count = 1;
|
||||
for (const attachment of additionalAttachments) {
|
||||
const additionalMessage = {
|
||||
...primaryMessage,
|
||||
...attachment,
|
||||
message: attachment.filename,
|
||||
messageId: `${sourceUuid}-${count}-${rawTimestamp}`,
|
||||
};
|
||||
formattedMessages.push(additionalMessage);
|
||||
count++;
|
||||
}
|
||||
|
||||
return formattedMessages;
|
||||
};
|
||||
|
||||
interface FetchSignalMessagesTaskOptions {
|
||||
scheduleTasks: string;
|
||||
}
|
||||
|
||||
const fetchSignalMessagesTask = async ({
|
||||
scheduleTasks = "false",
|
||||
}: FetchSignalMessagesTaskOptions): Promise<void> => {
|
||||
const worker = await getWorkerUtils();
|
||||
|
||||
if (scheduleTasks === "true") {
|
||||
// because cron only has minimum 1 minute resolution
|
||||
for (const offset of [15000, 30000, 45000]) {
|
||||
await worker.addJob(
|
||||
"fetch-signal-messages",
|
||||
{ scheduleTasks: "false" },
|
||||
{
|
||||
maxAttempts: 1,
|
||||
runAt: new Date(Date.now() + offset),
|
||||
jobKey: `fetchSignalMessages-${offset}`,
|
||||
},
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
const messagesClient = new MessagesApi(config);
|
||||
const rows = await db.selectFrom("SignalBot").selectAll().execute();
|
||||
|
||||
for (const row of rows) {
|
||||
const { id, phoneNumber } = row;
|
||||
const messages = await messagesClient.v1ReceiveNumberGet({
|
||||
number: phoneNumber,
|
||||
});
|
||||
|
||||
logger.debug({ botId: id, phoneNumber }, "Fetching messages for bot");
|
||||
|
||||
for (const message of messages) {
|
||||
const formattedMessages = await processMessage({
|
||||
id,
|
||||
phoneNumber,
|
||||
message,
|
||||
});
|
||||
for (const formattedMessage of formattedMessages) {
|
||||
if (formattedMessage.to !== formattedMessage.from) {
|
||||
logger.debug(
|
||||
{
|
||||
messageId: formattedMessage.messageId,
|
||||
from: formattedMessage.from,
|
||||
to: formattedMessage.to,
|
||||
isGroup: formattedMessage.isGroup,
|
||||
hasMessage: !!formattedMessage.message,
|
||||
hasAttachment: !!formattedMessage.attachment,
|
||||
},
|
||||
"Creating job for message",
|
||||
);
|
||||
|
||||
await worker.addJob("signal/receive-signal-message", formattedMessage);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
export default fetchSignalMessagesTask;
|
||||
|
|
@ -1,436 +0,0 @@
|
|||
import { createLogger } from "@link-stack/logger";
|
||||
import { db } from "@link-stack/bridge-common";
|
||||
import { Zammad, getUser, sanitizePhoneNumber } from "../../lib/zammad.js";
|
||||
import {
|
||||
loadFieldMapping,
|
||||
getFieldValue,
|
||||
getNestedFieldValue,
|
||||
formatFieldValue,
|
||||
buildTicketTitle,
|
||||
getZammadFieldValues,
|
||||
type FieldMappingConfig,
|
||||
} from "../../lib/formstack-field-mapping.js";
|
||||
|
||||
const logger = createLogger("create-ticket-from-form");
|
||||
|
||||
export interface CreateTicketFromFormOptions {
|
||||
formData: any;
|
||||
receivedAt: string;
|
||||
}
|
||||
|
||||
const createTicketFromFormTask = async (
|
||||
options: CreateTicketFromFormOptions,
|
||||
): Promise<void> => {
|
||||
const { formData, receivedAt } = options;
|
||||
|
||||
// Load field mapping configuration
|
||||
const mapping = loadFieldMapping();
|
||||
|
||||
// Log only non-PII metadata using configured field names
|
||||
const formId = getFieldValue(formData, "formId", mapping);
|
||||
const uniqueId = getFieldValue(formData, "uniqueId", mapping);
|
||||
|
||||
logger.info(
|
||||
{
|
||||
formId,
|
||||
uniqueId,
|
||||
receivedAt,
|
||||
fieldCount: Object.keys(formData).length,
|
||||
},
|
||||
"Processing Formstack form submission",
|
||||
);
|
||||
|
||||
// Extract fields using dynamic mapping
|
||||
const nameField = getFieldValue(formData, "name", mapping);
|
||||
const firstName = mapping.nestedFields?.name?.firstNamePath
|
||||
? getNestedFieldValue(nameField, mapping.nestedFields.name.firstNamePath) || ""
|
||||
: "";
|
||||
const lastName = mapping.nestedFields?.name?.lastNamePath
|
||||
? getNestedFieldValue(nameField, mapping.nestedFields.name.lastNamePath) || ""
|
||||
: "";
|
||||
const fullName =
|
||||
firstName && lastName
|
||||
? `${firstName} ${lastName}`.trim()
|
||||
: firstName || lastName || "Unknown";
|
||||
|
||||
// Extract well-known fields used for special logic (all optional)
|
||||
const email = getFieldValue(formData, "email", mapping);
|
||||
const rawPhone = getFieldValue(formData, "phone", mapping);
|
||||
const rawSignalAccount = getFieldValue(formData, "signalAccount", mapping);
|
||||
const organization = getFieldValue(formData, "organization", mapping);
|
||||
const typeOfSupport = getFieldValue(formData, "typeOfSupport", mapping);
|
||||
const descriptionOfIssue = getFieldValue(formData, "descriptionOfIssue", mapping);
|
||||
|
||||
// Sanitize phone numbers to E.164 format (+15554446666)
|
||||
let phone: string | undefined;
|
||||
if (rawPhone) {
|
||||
try {
|
||||
phone = sanitizePhoneNumber(rawPhone);
|
||||
logger.info({ rawPhone, sanitized: phone }, "Sanitized phone number");
|
||||
} catch (error: any) {
|
||||
logger.warn({ rawPhone, error: error.message }, "Invalid phone number format, ignoring");
|
||||
phone = undefined;
|
||||
}
|
||||
}
|
||||
|
||||
let signalAccount: string | undefined;
|
||||
if (rawSignalAccount) {
|
||||
try {
|
||||
signalAccount = sanitizePhoneNumber(rawSignalAccount);
|
||||
logger.info({ rawSignalAccount, sanitized: signalAccount }, "Sanitized signal account");
|
||||
} catch (error: any) {
|
||||
logger.warn({ rawSignalAccount, error: error.message }, "Invalid signal account format, ignoring");
|
||||
signalAccount = undefined;
|
||||
}
|
||||
}
|
||||
|
||||
// Validate that at least one contact method is provided
|
||||
if (!email && !phone && !signalAccount) {
|
||||
logger.error(
|
||||
{ formId, uniqueId },
|
||||
"No contact information provided - at least one of email, phone, or signalAccount is required",
|
||||
);
|
||||
throw new Error(
|
||||
"At least one contact method (email, phone, or signalAccount) is required for ticket creation",
|
||||
);
|
||||
}
|
||||
|
||||
// Build ticket title using configured template
|
||||
// Pass all potentially used fields - the template determines which are actually used
|
||||
const title = buildTicketTitle(mapping, {
|
||||
name: fullName,
|
||||
organization: formatFieldValue(organization),
|
||||
typeOfSupport: formatFieldValue(typeOfSupport),
|
||||
});
|
||||
|
||||
// Build article body - format all fields as HTML
|
||||
const formatAllFields = (data: any): string => {
|
||||
let html = "";
|
||||
|
||||
// Add formatted name field first if we have it
|
||||
if (fullName && fullName !== "Unknown") {
|
||||
html += `<strong>Name:</strong><br>${fullName}<br>`;
|
||||
}
|
||||
|
||||
for (const [key, value] of Object.entries(data)) {
|
||||
// Skip metadata fields and name field (we already formatted it above)
|
||||
const skipFields = [
|
||||
mapping.sourceFields.formId,
|
||||
mapping.sourceFields.uniqueId,
|
||||
mapping.sourceFields.name, // Skip raw name field
|
||||
"HandshakeKey",
|
||||
].filter(Boolean);
|
||||
|
||||
if (skipFields.includes(key)) continue;
|
||||
if (value === null || value === undefined || value === "") continue;
|
||||
|
||||
const displayValue = Array.isArray(value)
|
||||
? value.join(", ")
|
||||
: typeof value === "object"
|
||||
? JSON.stringify(value)
|
||||
: value;
|
||||
html += `<strong>${key}:</strong><br>${displayValue}<br>`;
|
||||
}
|
||||
return html;
|
||||
};
|
||||
|
||||
const body = formatAllFields(formData);
|
||||
|
||||
// Get Zammad configuration from environment
|
||||
const zammadUrl = process.env.ZAMMAD_URL || "http://zammad-nginx:8080";
|
||||
const zammadToken = process.env.ZAMMAD_API_TOKEN;
|
||||
|
||||
if (!zammadToken) {
|
||||
logger.error("ZAMMAD_API_TOKEN environment variable is not configured");
|
||||
throw new Error("ZAMMAD_API_TOKEN is required");
|
||||
}
|
||||
|
||||
const zammad = Zammad({ token: zammadToken }, zammadUrl);
|
||||
|
||||
try {
|
||||
// Look up the configured article type
|
||||
let articleTypeId: number | undefined;
|
||||
try {
|
||||
const articleTypes = await zammad.get("ticket_article_types");
|
||||
const configuredType = articleTypes.find(
|
||||
(t: any) => t.name === mapping.ticket.defaultArticleType,
|
||||
);
|
||||
articleTypeId = configuredType?.id;
|
||||
if (articleTypeId) {
|
||||
logger.info(
|
||||
{ articleTypeId, typeName: mapping.ticket.defaultArticleType },
|
||||
"Found configured article type",
|
||||
);
|
||||
} else {
|
||||
logger.warn(
|
||||
{ typeName: mapping.ticket.defaultArticleType },
|
||||
"Configured article type not found, ticket will use default type",
|
||||
);
|
||||
}
|
||||
} catch (error: any) {
|
||||
logger.warn({ error: error.message }, "Failed to look up article type");
|
||||
}
|
||||
|
||||
// Get or create user
|
||||
// Try to find existing user by: phone -> email
|
||||
// Note: We can't search by Signal account since Signal group IDs aren't phone numbers
|
||||
let customer;
|
||||
|
||||
// Try phone if provided
|
||||
if (phone) {
|
||||
customer = await getUser(zammad, phone);
|
||||
if (customer) {
|
||||
logger.info(
|
||||
{ customerId: customer.id, method: "phone" },
|
||||
"Found existing user by phone",
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// Fall back to email if no customer found yet
|
||||
if (!customer && email) {
|
||||
// Validate email format before using in search
|
||||
const emailRegex = /^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$/;
|
||||
if (emailRegex.test(email)) {
|
||||
const emailResults = await zammad.user.search(`email:${email}`);
|
||||
if (emailResults.length > 0) {
|
||||
customer = emailResults[0];
|
||||
logger.info(
|
||||
{ customerId: customer.id, method: "email" },
|
||||
"Found existing user by email",
|
||||
);
|
||||
}
|
||||
} else {
|
||||
logger.warn({ email }, "Invalid email format provided, skipping email search");
|
||||
}
|
||||
}
|
||||
|
||||
if (!customer) {
|
||||
// Create new user
|
||||
logger.info("Creating new user from form submission");
|
||||
|
||||
// Build user data with whatever contact info we have
|
||||
const userData: any = {
|
||||
firstname: firstName,
|
||||
lastname: lastName,
|
||||
roles: ["Customer"],
|
||||
};
|
||||
|
||||
// Add contact info only if provided
|
||||
if (email) {
|
||||
userData.email = email;
|
||||
}
|
||||
|
||||
// Use phone number if provided (don't use Signal group ID as phone)
|
||||
if (phone) {
|
||||
userData.phone = phone;
|
||||
}
|
||||
|
||||
customer = await zammad.user.create(userData);
|
||||
}
|
||||
|
||||
logger.info(
|
||||
{
|
||||
customerId: customer.id,
|
||||
email: customer.email,
|
||||
},
|
||||
"Using customer for ticket",
|
||||
);
|
||||
|
||||
// Look up the configured group
|
||||
const groups = await zammad.get("groups");
|
||||
const targetGroup = groups.find((g: any) => g.name === mapping.ticket.group);
|
||||
|
||||
if (!targetGroup) {
|
||||
logger.error({ groupName: mapping.ticket.group }, "Configured group not found");
|
||||
throw new Error(`Zammad group "${mapping.ticket.group}" not found`);
|
||||
}
|
||||
|
||||
logger.info(
|
||||
{ groupId: targetGroup.id, groupName: targetGroup.name },
|
||||
"Using configured group",
|
||||
);
|
||||
|
||||
// Build custom fields using Zammad field mapping
|
||||
// This dynamically maps all configured fields without hardcoding
|
||||
const customFields = getZammadFieldValues(formData, mapping);
|
||||
|
||||
// Check if this is a Signal ticket
|
||||
let signalArticleType = null;
|
||||
let signalChannelId = null;
|
||||
let signalBotToken = null;
|
||||
|
||||
if (signalAccount) {
|
||||
try {
|
||||
logger.info({ signalAccount }, "Looking up Signal channel and article type");
|
||||
|
||||
// Look up Signal channels from Zammad (admin-only endpoint)
|
||||
// Note: bot_token is NOT included in this response for security reasons
|
||||
const channels = await zammad.get("cdr_signal_channels");
|
||||
if (channels.length > 0) {
|
||||
const zammadChannel = channels[0]; // Use first active Signal channel
|
||||
signalChannelId = zammadChannel.id;
|
||||
|
||||
logger.info(
|
||||
{
|
||||
channelId: zammadChannel.id,
|
||||
phoneNumber: zammadChannel.phone_number,
|
||||
},
|
||||
"Found active Signal channel from Zammad",
|
||||
);
|
||||
|
||||
// Look up the bot_token from our own cdr database using the phone number
|
||||
const signalBot = await db
|
||||
.selectFrom("SignalBot")
|
||||
.selectAll()
|
||||
.where("phoneNumber", "=", zammadChannel.phone_number)
|
||||
.executeTakeFirst();
|
||||
|
||||
if (signalBot) {
|
||||
signalBotToken = signalBot.token;
|
||||
logger.info(
|
||||
{ botId: signalBot.id, phoneNumber: signalBot.phoneNumber },
|
||||
"Found Signal bot token from cdr database",
|
||||
);
|
||||
} else {
|
||||
logger.warn(
|
||||
{ phoneNumber: zammadChannel.phone_number },
|
||||
"Signal bot not found in cdr database",
|
||||
);
|
||||
}
|
||||
} else {
|
||||
logger.warn("No active Signal channels found");
|
||||
}
|
||||
|
||||
// Look up cdr_signal article type
|
||||
const articleTypes = await zammad.get("ticket_article_types");
|
||||
signalArticleType = articleTypes.find((t: any) => t.name === "cdr_signal");
|
||||
|
||||
if (!signalArticleType) {
|
||||
logger.warn("Signal article type (cdr_signal) not found, using default type");
|
||||
} else {
|
||||
logger.info(
|
||||
{ articleTypeId: signalArticleType.id },
|
||||
"Found Signal article type",
|
||||
);
|
||||
}
|
||||
} catch (error: any) {
|
||||
logger.warn(
|
||||
{ error: error.message },
|
||||
"Failed to look up Signal article type, creating regular ticket",
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// Create the ticket
|
||||
const articleData: any = {
|
||||
subject: descriptionOfIssue || "Support Request",
|
||||
body,
|
||||
content_type: "text/html",
|
||||
internal: false,
|
||||
};
|
||||
|
||||
// Use Signal article type if available, otherwise use configured default
|
||||
if (signalArticleType) {
|
||||
articleData.type_id = signalArticleType.id;
|
||||
logger.info({ typeId: signalArticleType.id }, "Using Signal article type");
|
||||
|
||||
// IMPORTANT: Set sender to "Customer" for Signal tickets created from Formstack
|
||||
// This prevents the article from being echoed back to the user via Signal
|
||||
// (enqueue_communicate_cdr_signal_job only sends if sender != 'Customer')
|
||||
articleData.sender = "Customer";
|
||||
} else if (articleTypeId) {
|
||||
articleData.type_id = articleTypeId;
|
||||
}
|
||||
|
||||
const ticketData: any = {
|
||||
title,
|
||||
group_id: targetGroup.id,
|
||||
customer_id: customer.id,
|
||||
article: articleData,
|
||||
...customFields,
|
||||
};
|
||||
|
||||
// Add Signal preferences if we have Signal channel and article type
|
||||
// Note: signalAccount from Formstack is the phone number the user typed in
|
||||
// Groups are added later via update_group webhook from bridge-worker
|
||||
if (signalChannelId && signalBotToken && signalArticleType && signalAccount) {
|
||||
ticketData.preferences = {
|
||||
channel_id: signalChannelId,
|
||||
cdr_signal: {
|
||||
bot_token: signalBotToken,
|
||||
chat_id: signalAccount, // Use Signal phone number as chat_id
|
||||
},
|
||||
};
|
||||
|
||||
logger.info(
|
||||
{
|
||||
channelId: signalChannelId,
|
||||
chatId: signalAccount,
|
||||
},
|
||||
"Adding Signal preferences to ticket",
|
||||
);
|
||||
}
|
||||
|
||||
logger.info(
|
||||
{
|
||||
title,
|
||||
groupId: targetGroup.id,
|
||||
customerId: customer.id,
|
||||
hasArticleType: !!articleTypeId || !!signalArticleType,
|
||||
isSignalTicket: !!signalArticleType && !!signalAccount,
|
||||
customFieldCount: Object.keys(customFields).length,
|
||||
},
|
||||
"Creating ticket",
|
||||
);
|
||||
|
||||
const ticket = await zammad.ticket.create(ticketData);
|
||||
|
||||
// Set create_article_type_id for Signal tickets to enable proper replies
|
||||
if (signalArticleType && signalChannelId) {
|
||||
try {
|
||||
await zammad.ticket.update(ticket.id, {
|
||||
create_article_type_id: signalArticleType.id,
|
||||
});
|
||||
logger.info(
|
||||
{
|
||||
ticketId: ticket.id,
|
||||
articleTypeId: signalArticleType.id,
|
||||
},
|
||||
"Set create_article_type_id for Signal ticket",
|
||||
);
|
||||
} catch (error: any) {
|
||||
logger.warn(
|
||||
{
|
||||
error: error.message,
|
||||
ticketId: ticket.id,
|
||||
},
|
||||
"Failed to set create_article_type_id, ticket may not support Signal replies",
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
logger.info(
|
||||
{
|
||||
ticketId: ticket.id,
|
||||
ticketNumber: ticket.id,
|
||||
title,
|
||||
isSignalTicket: !!signalChannelId,
|
||||
},
|
||||
"Successfully created ticket from Formstack submission",
|
||||
);
|
||||
} catch (error: any) {
|
||||
logger.error(
|
||||
{
|
||||
error: error.message,
|
||||
stack: error.stack,
|
||||
formId,
|
||||
uniqueId,
|
||||
},
|
||||
"Failed to create ticket from Formstack submission",
|
||||
);
|
||||
throw error;
|
||||
}
|
||||
};
|
||||
|
||||
export default createTicketFromFormTask;
|
||||
|
|
@ -1,227 +0,0 @@
|
|||
import { db, getWorkerUtils } from "@link-stack/bridge-common";
|
||||
import { createLogger } from "@link-stack/logger";
|
||||
import * as signalApi from "@link-stack/signal-api";
|
||||
const { Configuration, GroupsApi } = signalApi;
|
||||
|
||||
const logger = createLogger('bridge-worker-receive-signal-message');
|
||||
|
||||
interface ReceiveSignalMessageTaskOptions {
|
||||
token: string;
|
||||
to: string;
|
||||
from: string;
|
||||
messageId: string;
|
||||
sentAt: string;
|
||||
message: string;
|
||||
attachment?: string;
|
||||
filename?: string;
|
||||
mimeType?: string;
|
||||
isGroup?: boolean;
|
||||
}
|
||||
|
||||
const receiveSignalMessageTask = async ({
|
||||
token,
|
||||
to,
|
||||
from,
|
||||
messageId,
|
||||
sentAt,
|
||||
message,
|
||||
attachment,
|
||||
filename,
|
||||
mimeType,
|
||||
isGroup,
|
||||
}: ReceiveSignalMessageTaskOptions): Promise<void> => {
|
||||
logger.debug({
|
||||
messageId,
|
||||
from,
|
||||
to,
|
||||
isGroup,
|
||||
hasMessage: !!message,
|
||||
hasAttachment: !!attachment,
|
||||
token,
|
||||
}, 'Processing incoming message');
|
||||
  const worker = await getWorkerUtils();
  const row = await db
    .selectFrom("SignalBot")
    .selectAll()
    .where("id", "=", token)
    .executeTakeFirstOrThrow();

  const backendId = row.id;
  let finalTo = to;
  let createdInternalId: string | undefined;

  // Check if auto-group creation is enabled and this is NOT already a group message
  const enableAutoGroups = process.env.BRIDGE_SIGNAL_AUTO_GROUPS === "true";

  logger.debug({
    enableAutoGroups,
    isGroup,
    shouldCreateGroup: enableAutoGroups && !isGroup && from && to,
  }, 'Auto-groups config');

  // If this is already a group message and auto-groups is enabled,
  // use group provided in 'to'
  if (enableAutoGroups && isGroup && to) {
    // Signal sends the internal ID (base64) in group messages
    // We should NOT add "group." prefix - that's for sending messages, not receiving
    logger.debug('Message is from existing group with internal ID');

    finalTo = to;
  } else if (enableAutoGroups && !isGroup && from && to) {
    try {
      const config = new Configuration({
        basePath: process.env.BRIDGE_SIGNAL_URL,
      });
      const groupsClient = new GroupsApi(config);

      // Always create a new group for direct messages to the helpdesk
      // This ensures each conversation gets its own group/ticket
      logger.info({ from }, 'Creating new group for user');

      // Include timestamp to make each group unique
      const timestamp = new Date()
        .toISOString()
        .replace(/[:.]/g, "-")
        .substring(0, 19);
      const groupName = `Support: ${from} (${timestamp})`;

      // Create new group for this conversation
      const createGroupResponse = await groupsClient.v1GroupsNumberPost({
        number: row.phoneNumber,
        data: {
          name: groupName,
          members: [from],
          description: "Private support conversation",
        },
      });

      logger.debug({ createGroupResponse }, 'Group creation response from Signal API');

      if (createGroupResponse.id) {
        // The createGroupResponse.id already contains the full group identifier (group.BASE64)
        finalTo = createGroupResponse.id;

        // Fetch the group details to get the actual internalId
        // The base64 part of the ID is NOT the same as the internalId!
        try {
          logger.debug('Fetching group details to get internalId');
          const groups = await groupsClient.v1GroupsNumberGet({
            number: row.phoneNumber,
          });

          logger.debug({ groupsSample: groups.slice(0, 3) }, 'Groups for bot');

          const createdGroup = groups.find((g) => g.id === finalTo);
          if (createdGroup) {
            logger.debug({ createdGroup }, 'Found created group details');
          }

          if (createdGroup && createdGroup.internalId) {
            createdInternalId = createdGroup.internalId;
            logger.debug({ createdInternalId }, 'Got actual internalId');
          } else {
            // Fallback: extract base64 part from ID
            if (finalTo.startsWith("group.")) {
              createdInternalId = finalTo.substring(6);
            }
          }
        } catch (fetchError) {
          logger.debug('Could not fetch group details, using ID base64 part');
          // Fallback: extract base64 part from ID
          if (finalTo.startsWith("group.")) {
            createdInternalId = finalTo.substring(6);
          }
        }

        logger.debug({
          fullGroupId: finalTo,
          internalId: createdInternalId,
        }, 'Group created successfully');
        logger.debug({
          groupId: finalTo,
          internalId: createdInternalId,
          groupName,
          forPhoneNumber: from,
          botNumber: row.phoneNumber,
          response: createGroupResponse,
        }, 'Created new Signal group');
      }

      // Now handle notifications and message forwarding for both new and existing groups
      if (finalTo && finalTo.startsWith("group.")) {
        // Forward the user's initial message to the group using quote feature
        try {
          logger.debug('Forwarding initial message to group using quote feature');

          const attributionMessage = `Message from ${from}:\n"${message}"\n\n---\nSupport team: Your request has been received. An agent will respond shortly.`;

          await worker.addJob("signal/send-signal-message", {
            token: row.token,
            to: finalTo,
            message: attributionMessage,
            conversationId: null,
            quoteMessage: message,
            quoteAuthor: from,
            quoteTimestamp: Date.parse(sentAt),
          });

          logger.debug({ finalTo }, 'Successfully forwarded initial message to group');
        } catch (forwardError) {
          logger.error({ error: forwardError }, 'Error forwarding message to group');
        }

        // Send a response to the original DM informing about the group
        try {
          logger.debug('Sending group notification to original DM');

          const dmNotification = `Hello! A private support group has been created for your conversation.\n\nGroup name: ${groupName}\n\nPlease look for the new group in your Signal app to continue the conversation. Our support team will respond there shortly.\n\nThank you for contacting support!`;

          await worker.addJob("signal/send-signal-message", {
            token: row.token,
            to: from,
            message: dmNotification,
            conversationId: null,
          });

          logger.debug('Successfully sent group notification to user DM');
        } catch (dmError) {
          logger.error({ error: dmError }, 'Error sending DM notification');
        }
      }
    } catch (error: any) {
      // Check if error is because group already exists
      const errorMessage =
        error?.response?.data?.error || error?.message || error;
      const isAlreadyExists =
        errorMessage?.toString().toLowerCase().includes("already") ||
        errorMessage?.toString().toLowerCase().includes("exists");

      if (isAlreadyExists) {
        logger.debug({ from }, 'Group might already exist, continuing with original recipient');
      } else {
        logger.error({
          error: errorMessage,
          from,
          to,
          botNumber: row.phoneNumber,
        }, 'Error creating Signal group');
      }
    }
  }

  const payload = {
    to: finalTo,
    from,
    message_id: messageId,
    sent_at: sentAt,
    message,
    attachment,
    filename,
    mime_type: mimeType,
    is_group: finalTo.startsWith("group"),
  };

  await worker.addJob("common/notify-webhooks", { backendId, payload });
};

export default receiveSignalMessageTask;
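For orientation, the payload that the receive task above hands to `common/notify-webhooks` has roughly this shape. This is a sketch inferred from the listing: the interface name is illustrative and the field types are assumptions, not declarations from the codebase.

```typescript
// Illustrative only - field names mirror the payload object built at the end of
// receiveSignalMessageTask above; types are inferred, not taken from the codebase.
interface InboundSignalWebhookPayload {
  to: string;          // final recipient: the (new or existing) group id, else the original 'to'
  from: string;        // sender's phone number
  message_id: string;
  sent_at: string;     // also reused as the quote timestamp via Date.parse(sentAt)
  message: string;
  attachment?: string; // base64 attachment data, if present
  filename?: string;
  mime_type?: string;
  is_group: boolean;   // true when the final recipient starts with "group"
}
```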
@@ -1,313 +0,0 @@
import {
  db,
  getWorkerUtils,
  getMaxAttachmentSize,
  getMaxTotalAttachmentSize,
  MAX_ATTACHMENTS,
  buildSignalGroupName,
} from "@link-stack/bridge-common";
import { createLogger } from "@link-stack/logger";
import * as signalApi from "@link-stack/signal-api";
const { Configuration, MessagesApi, GroupsApi } = signalApi;

const logger = createLogger("bridge-worker-send-signal-message");

interface SendSignalMessageTaskOptions {
  token: string;
  to: string;
  message: any;
  conversationId?: string; // Zammad ticket/conversation ID for callback
  quoteMessage?: string; // Optional: message text to quote
  quoteAuthor?: string; // Optional: author of quoted message (phone number)
  quoteTimestamp?: number; // Optional: timestamp of quoted message in milliseconds
  attachments?: Array<{
    data: string; // base64
    filename: string;
    mime_type: string;
  }>;
}

const sendSignalMessageTask = async ({
  token,
  to,
  message,
  conversationId,
  quoteMessage,
  quoteAuthor,
  quoteTimestamp,
  attachments,
}: SendSignalMessageTaskOptions): Promise<void> => {
  logger.debug(
    {
      token,
      to,
      conversationId,
      messageLength: message?.length,
    },
    "Processing outgoing message",
  );
  const bot = await db
    .selectFrom("SignalBot")
    .selectAll()
    .where("token", "=", token)
    .executeTakeFirstOrThrow();

  const { phoneNumber: number } = bot;
  const config = new Configuration({
    basePath: process.env.BRIDGE_SIGNAL_URL,
  });
  const messagesClient = new MessagesApi(config);
  const groupsClient = new GroupsApi(config);
  const worker = await getWorkerUtils();

  let finalTo = to;
  let groupCreated = false;

  try {
    // Check if 'to' is a group ID (UUID format, group.base64 format, or base64) vs phone number
    const isUUID = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i.test(
      to,
    );
    const isGroupPrefix = to.startsWith("group.");
    const isBase64 = /^[A-Za-z0-9+/]+=*$/.test(to) && to.length > 20; // Base64 internal_id
    const isGroupId = isUUID || isGroupPrefix || isBase64;
    const enableAutoGroups = process.env.BRIDGE_SIGNAL_AUTO_GROUPS === "true";

    logger.debug(
      {
        to,
        isGroupId,
        enableAutoGroups,
        shouldCreateGroup: enableAutoGroups && !isGroupId && to && conversationId,
      },
      "Recipient analysis",
    );

    // If sending to a phone number and auto-groups is enabled, create a group first
    if (enableAutoGroups && !isGroupId && to && conversationId) {
      try {
        const groupName = buildSignalGroupName(conversationId);
        const createGroupResponse = await groupsClient.v1GroupsNumberPost({
          number: bot.phoneNumber,
          data: {
            name: groupName,
            members: [to],
            description: "Private support conversation",
          },
        });

        if (createGroupResponse.id) {
          // The createGroupResponse.id already contains the full group identifier (group.BASE64)
          finalTo = createGroupResponse.id;
          groupCreated = true;

          // Fetch the group details to get the actual internalId
          let internalId: string | undefined;
          try {
            const groups = await groupsClient.v1GroupsNumberGet({
              number: bot.phoneNumber,
            });

            const createdGroup = groups.find((g) => g.id === finalTo);
            if (createdGroup && createdGroup.internalId) {
              internalId = createdGroup.internalId;
              logger.debug({ internalId }, "Got actual internalId");
            } else {
              // Fallback: extract base64 part from ID
              if (finalTo.startsWith("group.")) {
                internalId = finalTo.substring(6);
              }
            }
          } catch (fetchError) {
            logger.debug("Could not fetch group details, using ID base64 part");
            // Fallback: extract base64 part from ID
            if (finalTo.startsWith("group.")) {
              internalId = finalTo.substring(6);
            }
          }
          logger.debug(
            {
              groupId: finalTo,
              internalId,
              groupName,
              conversationId,
              originalRecipient: to,
              botNumber: bot.phoneNumber,
            },
            "Created new Signal group",
          );

          // Notify Zammad about the new group ID via webhook
          // Set group_joined: false initially - will be updated when user accepts invitation
          await worker.addJob("common/notify-webhooks", {
            backendId: bot.id,
            payload: {
              event: "group_created",
              conversation_id: conversationId,
              original_recipient: to,
              group_id: finalTo,
              internal_group_id: internalId,
              group_joined: false,
              timestamp: new Date().toISOString(),
            },
          });
        }
      } catch (groupError) {
        logger.error(
          {
            error: groupError instanceof Error ? groupError.message : groupError,
            to,
            conversationId,
          },
          "Error creating Signal group",
        );
        // Continue with original recipient if group creation fails
      }
    }

    logger.debug(
      {
        fromNumber: number,
        toRecipient: finalTo,
        originalTo: to,
        recipientChanged: to !== finalTo,
        groupCreated,
        isGroupRecipient: finalTo.startsWith("group."),
      },
      "Sending message via API",
    );

    // Build the message data with optional quote parameters
    const messageData: signalApi.ApiSendMessageV2 = {
      number,
      recipients: [finalTo],
      message,
    };

    logger.debug(
      {
        number,
        recipients: [finalTo],
        messageLength: message?.length,
        hasQuoteParams: !!(quoteMessage && quoteAuthor && quoteTimestamp),
      },
      "Message data being sent",
    );

    // Add quote parameters if all are provided
    if (quoteMessage && quoteAuthor && quoteTimestamp) {
      messageData.quoteTimestamp = quoteTimestamp;
      messageData.quoteAuthor = quoteAuthor;
      messageData.quoteMessage = quoteMessage;

      logger.debug(
        {
          quoteAuthor,
          quoteMessageLength: quoteMessage?.length,
          quoteTimestamp,
        },
        "Including quote in message",
      );
    }

    // Add attachments if provided with size validation
    if (attachments && attachments.length > 0) {
      const MAX_ATTACHMENT_SIZE = getMaxAttachmentSize();
      const MAX_TOTAL_SIZE = getMaxTotalAttachmentSize();

      if (attachments.length > MAX_ATTACHMENTS) {
        throw new Error(
          `Too many attachments: ${attachments.length} (max ${MAX_ATTACHMENTS})`,
        );
      }

      let totalSize = 0;
      const validatedAttachments = [];

      for (const attachment of attachments) {
        // Calculate size from base64 string (rough estimate: length * 3/4)
        const estimatedSize = (attachment.data.length * 3) / 4;

        if (estimatedSize > MAX_ATTACHMENT_SIZE) {
          logger.warn(
            {
              filename: attachment.filename,
              size: estimatedSize,
              maxSize: MAX_ATTACHMENT_SIZE,
            },
            "Attachment exceeds size limit, skipping",
          );
          continue;
        }

        totalSize += estimatedSize;
        if (totalSize > MAX_TOTAL_SIZE) {
          logger.warn(
            {
              totalSize,
              maxTotalSize: MAX_TOTAL_SIZE,
            },
            "Total attachment size exceeds limit, skipping remaining",
          );
          break;
        }

        validatedAttachments.push(attachment.data);
      }

      if (validatedAttachments.length > 0) {
        messageData.base64Attachments = validatedAttachments;
        logger.debug(
          {
            attachmentCount: validatedAttachments.length,
            attachmentNames: attachments
              .slice(0, validatedAttachments.length)
              .map((att) => att.filename),
            totalSizeBytes: totalSize,
          },
          "Including attachments in message",
        );
      }
    }

    const response = await messagesClient.v2SendPost({
      data: messageData,
    });

    logger.debug(
      {
        to: finalTo,
        groupCreated,
        response: response?.timestamp || "no timestamp",
      },
      "Message sent successfully",
    );
  } catch (error: any) {
    // Try to get the actual error message from the response
    if (error.response) {
      try {
        const errorBody = await error.response.text();
        logger.error(
          {
            status: error.response.status,
            statusText: error.response.statusText,
            body: errorBody,
            sentTo: finalTo,
            messageDetails: {
              fromNumber: number,
              toRecipients: [finalTo],
              hasQuote: !!quoteMessage,
            },
          },
          "Signal API error",
        );
      } catch (e) {
        logger.error("Could not parse error response");
      }
    }
    logger.error({ error }, "Full error details");
    throw error;
  }
};

export default sendSignalMessageTask;
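Before deciding whether to auto-create a group, the send task classifies the recipient: UUIDs, `group.`-prefixed ids, and long bare base64 internal ids are treated as groups, everything else as a phone number. A minimal standalone sketch of that check (the helper name and example values are illustrative, not part of the codebase):

```typescript
// Mirrors the recipient analysis in sendSignalMessageTask above; helper name is illustrative.
const looksLikeGroupRecipient = (to: string): boolean => {
  const isUUID =
    /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i.test(to);
  const isGroupPrefix = to.startsWith("group.");
  const isBase64 = /^[A-Za-z0-9+/]+=*$/.test(to) && to.length > 20; // bare internal_id
  return isUUID || isGroupPrefix || isBase64;
};

// looksLikeGroupRecipient("+491701234567")              -> false (phone number; a group may be created)
// looksLikeGroupRecipient("group.Zm9vYmFyYmF6cXV4enk=") -> true  (already a group recipient)
```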
@@ -1,11 +0,0 @@
// import { db, getWorkerUtils } from "@link-stack/bridge-common";

interface ReceiveVoiceMessageTaskOptions {
  message: any;
}

const receiveVoiceMessageTask = async ({
  message,
}: ReceiveVoiceMessageTaskOptions): Promise<void> => {};

export default receiveVoiceMessageTask;
@@ -1,11 +0,0 @@
// import { db, getWorkerUtils } from "@link-stack/bridge-common";

interface SendVoiceMessageTaskOptions {
  message: any;
}

const sendVoiceMessageTask = async ({
  message,
}: SendVoiceMessageTaskOptions): Promise<void> => {};

export default sendVoiceMessageTask;
Some files were not shown because too many files have changed in this diff.