Alshorty·V3
// Master Reference & Command Centre
🧭 How to use this document (read this first)
This is your single source of truth for Alshorty V3 in production. Every section has tasks you can tick, steps you can expand, and warnings that protect you from trouble.
1. Navigate using the left sidebar — each section has a specific focus (Security, Legal, AdSense, Roadmap etc.)
2. Click any task item to mark it done (turns green ✓). Click again to unmark.
3. Click "▼ Show Steps" on any task — reveals a detailed step-by-step guide. Every task has one.
4. Press "💾 Save Progress" after completing tasks. Saves to your browser. Won't disappear unless you clear cache.
5. Use "⬇ Backup / Restore" to export your progress as JSON. Paste it into a Google Doc or Notion. Do this weekly.
6. Start with "🔥 Do Today" in the sidebar — your most urgent 10 items right now.
🚀
V3 is live in production! Magic link auth, Google OAuth, URL shortening, admin panel, Razorpay order creation, and link redirects are all working. The foundation is solid — now it's time to polish and grow.
Security Score
A−
Strong — 3 gaps remain
Legal Status
GDPR + DPDP + IT Rules
AdSense Ready
65%
Cookie consent missing
Open Items
Across all sections
🔴 Critical — Fix Now (Blocking)
Cookie consent banner before AdSense
CRITICAL: GDPR + AdSense policy requirement
Sitemap: regenerate to include all 15 blog posts
CRITICAL: Current sitemap only has 9 static pages
ads.txt: replace placeholder with real publisher ID
CRITICAL: AdSense won't fill ad inventory without it
Confirm all 6 Worker secrets set (not in wrangler.toml)
CRITICAL: wrangler.toml is in git — never store secrets there
Disabled user accounts must cascade to link deactivation
CRITICAL: Disabled users can still redirect visitors via old links
🟠 High Priority — This Week
Create OG image (1200×630) — currently missing
HIGH: Social shares show blank preview
Add blog cover images to /blog-images/
HIGH: 15 articles with broken img tags visible to AdSense
Add GSC verification code to index.html
HIGH: Required to submit sitemap and track impressions
Set up uptime monitoring (alshorty.com + api.alshorty.com)
HIGH: Know when site goes down before users report it
Handle payment.refunded webhook — downgrade user
HIGH: Refunded users stay on PRO indefinitely
⚠️
Before you do anything else: Run grep FRONTEND_ORIGIN worker/wrangler.toml — it must say https://alshorty.com. This keeps getting accidentally reset to dev3 and breaks magic links and short URL generation.
TODAY'S PRIORITY LIST
1. Build cookie consent banner (GDPR + AdSense blocker)
CRITICAL: Cannot run AdSense without this. 2–3 hours of work.
📋 Step-by-step
  1. Create ui/src/components/shared/CookieConsent.tsx. On first visit, check localStorage.getItem('alshorty_cookie_consent'). If null, show the banner.
  2. Banner needs two buttons: "Accept all" (sets value to 'accepted') and "Reject non-essential" (sets value to 'rejected'). Store in localStorage.
  3. In ui/index.html, the AdSense <script> tag is commented out. Conditionally inject it via JS only if consent === 'accepted'. Use document.createElement('script') (same pattern as Razorpay).
  4. Add <CookieConsent /> to ui/src/App.tsx at the root level — renders on every page before any ad loads.
  5. Add a "Manage cookie preferences" link to the footer that re-opens the banner (delete the localStorage key, reload).
✅ This alone unblocks AdSense activation. A minimal self-built banner is sufficient — no paid CMP needed at your scale.
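The consent flow above can be sketched as a few plain functions. This is an illustrative sketch, not existing project code: the function names (`getConsent`, `injectAdsenseScript`) are mine, and the storage/document objects are injected so the same logic is testable outside a browser — in the real component you'd pass `window.localStorage` and `document`.

```javascript
// Consent state machine for the cookie banner. `storage` is any object with
// getItem/setItem/removeItem (window.localStorage in the browser).
const CONSENT_KEY = 'alshorty_cookie_consent';

function getConsent(storage) {
  return storage.getItem(CONSENT_KEY); // 'accepted' | 'rejected' | null
}

function acceptAll(storage) { storage.setItem(CONSENT_KEY, 'accepted'); }
function rejectNonEssential(storage) { storage.setItem(CONSENT_KEY, 'rejected'); }
function resetConsent(storage) { storage.removeItem(CONSENT_KEY); } // "Manage cookie preferences"

// Inject the AdSense loader only after explicit acceptance — the same
// document.createElement pattern the Razorpay SDK already uses.
function injectAdsenseScript(storage, doc, clientId) {
  if (getConsent(storage) !== 'accepted') return false;
  const s = doc.createElement('script');
  s.async = true;
  s.src = `https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=${clientId}`;
  s.crossOrigin = 'anonymous';
  doc.head.appendChild(s);
  return true;
}
```

In CookieConsent.tsx the same logic would run inside a useEffect, with `injectAdsenseScript(localStorage, document, clientId)` called once after "Accept all".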
2. Regenerate sitemap.xml to include blog posts
CRITICAL: Google can't find your 15 blog articles without this.
📋 Step-by-step
  1. Open generate-sitemap.mjs in the project root. Ensure it reads all slugs from ui/src/lib/blog-data.ts and writes /blog/{slug} entries.
  2. Run: node generate-sitemap.mjs from the project root. This rewrites ui/public/sitemap.xml.
  3. Verify the output file includes all 15 blog article URLs + tools pages + pricing + about.
  4. Add sitemap generation to your GitHub Actions deploy-ui.yml — run node generate-sitemap.mjs as a build step so it never goes stale again.
  5. After deploying, submit the updated sitemap in Google Search Console → Sitemaps.
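If generate-sitemap.mjs ever needs rebuilding, the core is a small pure function like the sketch below. This is hedged: the real script's structure and route list may differ, and the slugs should come from ui/src/lib/blog-data.ts rather than a literal array.

```javascript
// Build a sitemap from static routes plus /blog/{slug} entries.
// Inputs here are illustrative — the real slug list comes from blog-data.ts.
function buildSitemap(baseUrl, staticPaths, blogSlugs) {
  const urls = [
    ...staticPaths,
    ...blogSlugs.map((slug) => `/blog/${slug}`),
  ].map((p) => `  <url><loc>${baseUrl}${p}</loc></url>`);
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ...urls,
    '</urlset>',
  ].join('\n');
}
```

The script would then `fs.writeFileSync('ui/public/sitemap.xml', buildSitemap(...))`. Real sitemaps can also carry a `<lastmod>` per URL if you track publish dates.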
3. Fix ads.txt — replace placeholder publisher ID
CRITICAL: 30-second fix. Currently the whole line is commented out with ##.
📋 Step-by-step
  1. Go to AdSense dashboard → Account → Account information → copy your Publisher ID (format: ca-pub-XXXXXXXXXXXXXXXX).
  2. Open ui/public/ads.txt. Replace the commented-out line with: google.com, ca-pub-YOURPUBID, DIRECT, f08c47fec0942fa0
  3. Deploy the UI. Then verify at https://alshorty.com/ads.txt — you should see the plain text line.
4. Create OG image and add to ui/public/
HIGH: Every shared link shows a blank preview. Takes 20 mins in Canva.
📋 Step-by-step
  1. Open Canva.com → Create design → Custom size → 1200×630px.
  2. Add: your logo, tagline (e.g. "Shorten, track, and manage your links"), brand colours.
  3. Export as PNG. Save as og-image.png in ui/public/.
  4. Test at developers.facebook.com/tools/debug/ — paste alshorty.com and verify the preview shows your image.
5. Set up Cloudflare Worker error alerting
HIGH: 5-minute setup. Know about 5xx spikes before users report them.
📋 Step-by-step
  1. Cloudflare Dashboard → Workers & Pages → alshorty-v3 → Observability → Notifications.
  2. Create alert: "Worker Error Rate" → trigger when >1% over 5 minutes → email your address.
  3. Also create a "Worker CPU Time" alert for >50ms average — an early signal of performance regressions.
6. Set up uptime monitoring for both domains
HIGH: Free at Better Uptime or UptimeRobot. Takes 5 minutes.
📋 Step-by-step
  1. Sign up at betteruptime.com (free tier) or uptimerobot.com.
  2. Add monitor 1: https://alshorty.com/health → expect HTTP 200 → check every 3 minutes.
  3. Add monitor 2: https://api.alshorty.com/api/status → expect HTTP 200 → check every 3 minutes.
  4. Set alerts to email + SMS/WhatsApp if available. You need to know within 5 minutes of downtime.
7. Add GSC verification code to index.html
HIGH: 2-minute fix. Required to submit the sitemap and see organic traffic.
📋 Step-by-step
  1. Go to search.google.com/search-console → Add property → URL prefix → https://alshorty.com.
  2. Choose "HTML tag" verification → copy the content="..." value.
  3. In ui/index.html, find the commented-out GSC meta tag and uncomment it, replacing YOUR_GSC_CODE_HERE with your real code.
  4. Deploy UI → click "Verify" in GSC → then submit the sitemap at https://alshorty.com/sitemap.xml.
8. Add blog cover images to ui/public/blog-images/
HIGH: All 15 blog articles have broken img tags visible to AdSense reviewers.
📋 Step-by-step
  1. Create the folder: ui/public/blog-images/
  2. Source royalty-free images from unsplash.com or pexels.com — one per article topic (URLs, QR codes, analytics, affiliate marketing, tech).
  3. Optimise each to WebP format at 1200×630px. Target under 80KB each. Use squoosh.app.
  4. Name each image to match the slugs referenced in ui/src/lib/blog-data.ts.
9. Handle payment.refunded webhook in payments.js
HIGH: Refunded users stay on PRO indefinitely — revenue leak.
📋 Step-by-step
  1. In worker/src/routes/payments.js, in the webhook handler switch/if block, add a case for payment.refunded.
  2. Extract the email from event.payload.payment.entity.notes.email.
  3. Update D1: UPDATE users SET plan='free', sub_expires_at=NULL WHERE email=?
  4. Log the refund event: INSERT INTO audit_log (email, action, meta, created_at) for traceability.
  5. Optionally send a downgrade email via Resend so the user knows their plan changed.
10. Compress logo1.png (173KB→<20KB) and favicon.png (108KB→<5KB)
HIGH: 280KB of uncompressed logo assets loaded on every page. Kills Core Web Vitals.
📋 Step-by-step
  1. Go to squoosh.app. Upload logo1.png. Select WebP format. Reduce quality to 80. Target under 20KB. Save as logo1.webp.
  2. Upload favicon.png. Resize to 64×64 or 32×32. Compress as PNG. Target under 5KB. Save back as favicon.png.
  3. Update any <img src="logo1.png"> references in the UI to logo1.webp. Use grep -r "logo1.png" ui/src/ to find all references.
  4. Run PageSpeed Insights at pagespeed.web.dev on alshorty.com. Target 85+ on mobile before applying for AdSense.
V3 auth is solid. Magic link tokens are one-time-use SHA-256 hashed with 15-min expiry. Google OAuth validates the aud claim against your Client ID. API keys are SHA-256 hashed, PRO-only, 5-key limit. Two open items below.
COMPLETED ✓
Magic link tokens: one-time-use, SHA-256 hashed, 15-min TTL in KV
DONE: Token deleted on first use. KV TTL as backup. No replay possible.
Session cookies: HttpOnly + Secure + SameSite=None + Domain=.alshorty.com
DONE: SameSite=None required for api.alshorty.com ↔ alshorty.com cross-subdomain auth.
Magic link rate limited: 5 per email per 15 minutes (KV counter, 900s TTL)
DONE: Prevents inbox-flooding attacks against Resend quota.
Google OAuth validates aud claim against GOOGLE_CLIENT_ID
DONE: Prevents token substitution — tokens from other Google apps rejected.
API keys hashed (SHA-256), sk_ prefix, 5-key limit per user, PRO-only
DONE: Raw key shown once on creation, never retrievable. Last-used updated async via waitUntil.
Admin secret stored in sessionStorage (clears on tab close, not persistent)
DONE: Both Admin.tsx and api.ts updated. No persistent secret on shared machines.
OPEN ITEMS
Disabled user accounts: existing links still redirect (should return 410)
CRITICAL: Admin "disable" blocks new shortens but old links keep redirecting. Phishing links stay live.
📋 Fix: Cascade disable to links
  1. In routes/redirect.js, after resolving the link, check the owner's account status: const user = await getUserByEmail(link.owner, env);
  2. If user?.is_disabled, return htmlPage(renderErrorPage('This link is no longer available', code, env), 410)
  3. Alternative (better): In the admin disable handler in admin-backend.js, when disabling a user, batch-update all their KV links to is_active: false. This avoids a DB lookup on every redirect.
⚠️ Until this is fixed, a banned abuser's phishing links remain 100% functional. This is your highest-priority security gap.
Add CSRF / Origin check on all state-mutating endpoints (POST/PUT/DELETE)
MEDIUM: SameSite=None means session cookies are sent cross-origin. Mutating routes should verify the Origin header.
📋 Fix: Origin assertion
  1. In utils/cors.js, export a helper: export function assertOrigin(request) { const o = request.headers.get('Origin'); return ALLOWED_ORIGINS.includes(o); }
  2. At the top of the deleteLink, editLink, toggleLink, createLink, and updateBio handlers: if (!assertOrigin(request)) return forbidden('CSRF_CHECK_FAILED');
💡 This adds ~2 lines per handler. Low effort, meaningful protection against cross-origin form attacks.
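A slightly hardened version of the helper from step 1, as a sketch. One caveat worth deciding deliberately: non-browser API-key clients typically send no Origin header at all, so you may want to exempt requests authenticated via an API key before applying this check.

```javascript
// Origin allowlist check for state-mutating routes. The list mirrors the
// CORS allowlist documented elsewhere in this file.
const ALLOWED_ORIGINS = [
  'https://alshorty.com',
  'https://www.alshorty.com',
  'https://dev3.alshorty.com',
];

function assertOrigin(request) {
  const origin = request.headers.get('Origin');
  // Browsers always send Origin on cross-origin POST/PUT/DELETE, so a
  // missing header on a cookie-authenticated mutation is rejected too.
  if (!origin) return false;
  return ALLOWED_ORIGINS.includes(origin);
}
```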
Block lists expanded in April 2026: 40+ shortener domains, 25+ BLOCKED_DOMAINS (IP grabbers, adult content, paste sites), 35+ phishing regex patterns. URL input auto-prefixes https:// on paste and blur. PRO-only features enforced server-side.
COMPLETED ✓
Multi-layer validation: scheme → private IP → shortener → TLD → BLOCKED_DOMAINS → keywords → phishing regex
DONE: validateUrl() + deepUrlScan(). Centralised in constants.js. Return type fixed (was causing "[object Object]" in V2).
editLink and updateBio both validate URLs on every update (was missing in V2)
DONE: Links can no longer be edited to phishing URLs post-creation.
Destination URL re-validated at redirect time (defense in depth)
DONE: Bad URL in KV cannot be served as a redirect. Strips whitespace before validation.
SSRF protection: validateProxyUrl() blocks private IPs/metadata endpoints on health-check and OG-checker
DONE: Worker cannot probe internal infrastructure or AWS metadata (169.254.169.254).
URL input: auto-prefixes https:// on paste AND on blur. type="url" enforced.
DONE: Users no longer need to type https://. onBlur only triggers if the value contains a dot.
PRO-only features enforced server-side: expiry, max-clicks, password, UTM params, category prefix
DONE: Returns UPGRADE_REQUIRED (400) if a free user sends these fields. Two-layer enforcement (frontend + backend).
OPEN ITEMS
Integrate Google Safe Browsing API — catches freshly-registered phishing domains
MEDIUM: Regex patterns miss new phishing domains. GSB has a free tier. High legal-liability leverage.
📋 Step-by-step
  1. Get a free Safe Browsing API key from console.cloud.google.com → APIs & Services → Safe Browsing API.
  2. Add it as a Worker secret: wrangler secret put GOOGLE_SAFE_BROWSING_KEY
  3. In createLink(), after deepUrlScan() passes, call: POST https://safebrowsing.googleapis.com/v4/threatMatches:find with the URL.
  4. If the response contains matches, return badRequest('This URL has been flagged as unsafe', 'BLOCKED_URL').
  5. Wrap in try/catch with fail-open behaviour — if the GSB API is down, log the error and allow the URL through. Don't let GSB downtime block your users.
💡 GSB free tier: 10,000 calls/day. At your scale you're safe for months.
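A sketch of steps 3–5. Function names are illustrative; the request body follows the Safe Browsing v4 `threatMatches:find` schema (client + threatInfo), and the fail-open wrapper implements step 5.

```javascript
// Request body for Safe Browsing v4 threatMatches:find.
function buildThreatRequest(url) {
  return {
    client: { clientId: 'alshorty', clientVersion: '3.0' },
    threatInfo: {
      threatTypes: ['MALWARE', 'SOCIAL_ENGINEERING', 'UNWANTED_SOFTWARE'],
      platformTypes: ['ANY_PLATFORM'],
      threatEntryTypes: ['URL'],
      threatEntries: [{ url }],
    },
  };
}

// v4 returns {} when the URL is clean, { matches: [...] } when flagged.
function isFlagged(gsbResponse) {
  return Array.isArray(gsbResponse.matches) && gsbResponse.matches.length > 0;
}

// Fail-open wrapper: GSB downtime must never block link creation.
async function checkSafeBrowsing(url, apiKey, fetchFn = fetch) {
  try {
    const res = await fetchFn(
      `https://safebrowsing.googleapis.com/v4/threatMatches:find?key=${apiKey}`,
      {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(buildThreatRequest(url)),
      },
    );
    if (!res.ok) return { flagged: false, checked: false };
    return { flagged: isFlagged(await res.json()), checked: true };
  } catch {
    return { flagged: false, checked: false }; // fail open — log in real code
  }
}
```

In createLink(), a `{ flagged: true }` result maps to the badRequest('…', 'BLOCKED_URL') response from step 4.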
UTM parameter length cap (200 chars) and control-char strip
MEDIUM: A 10,000-char UTM value bloats KV storage and downstream URLs.
📋 Quick Fix
  1. In createLink(), add one line before appending UTM params: const trimUtm = v => v ? String(v).slice(0, 200).replace(/[\x00-\x1f]/g, '') : undefined;
  2. Apply: utm_source: trimUtm(utm_source) etc. for all 5 UTM fields.
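Applied to all five fields, the quick fix looks like this (`sanitizeUtm` is an illustrative wrapper, not existing project code):

```javascript
// Cap UTM values at 200 chars and strip ASCII control characters before
// they are appended to the destination URL or stored in KV.
const trimUtm = (v) =>
  v ? String(v).slice(0, 200).replace(/[\x00-\x1f]/g, '') : undefined;

const UTM_FIELDS = ['utm_source', 'utm_medium', 'utm_campaign', 'utm_term', 'utm_content'];

function sanitizeUtm(params) {
  const out = {};
  for (const f of UTM_FIELDS) out[f] = trimUtm(params[f]);
  return out;
}
```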
CORS restricted to explicit origin allowlist — no wildcards. Origin reflection required for credentials:include. All fixed in April 2026 patch bundle.
COMPLETED ✓
Worker security headers on every response via wrap() — X-Frame-Options, nosniff, Referrer-Policy
DONE: Consistent across API, redirect, and error responses. Clickjacking and MIME-sniffing blocked.
Cloudflare Pages _headers file with full CSP (script-src, style-src, connect-src, frame-src)
DONE: Scoped to Google Fonts, Google Auth, AdSense. Created April 2026 — closes the gap where Pages had no headers.
CORS: explicit origin allowlist, no wildcard, Vary: Origin set, origin reflection for credentials
DONE: ALLOWED_ORIGINS: alshorty.com, www.alshorty.com, dev3.alshorty.com.
FRONTEND_ORIGIN = https://alshorty.com in wrangler.toml (was dev3.alshorty.com)
DONE: Affects magic link email URLs and short_url in API responses. Check before every worker deploy.
_redirects: static files (sitemap.xml, robots.txt, ads.txt) served directly before SPA catch-all
DONE: Prevents the /* /index.html 200 catch-all from intercepting static XML/text files.
OPEN ITEMS
Add HSTS header to Worker SECURITY_HEADERS
MEDIUM: Strict-Transport-Security is not in SECURITY_HEADERS. Also enable it in the Cloudflare dashboard.
📋 Quick Fix
  1. In worker/src/index.js, in the SECURITY_HEADERS object, add: 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains'
  2. In Cloudflare Dashboard → SSL/TLS → Edge Certificates → HSTS → enable with max-age 1 year and includeSubDomains checked.
💡 HSTS tells browsers to always use HTTPS for your domain — even if the user types http://. Also enables HSTS Preload eligibility for maximum protection.
Remove dev3 Worker routes once prod is confirmed stable for 2+ weeks
LOW: dev3.alshorty.com/api* and dev3-api.alshorty.com/* are still on the production Worker. They should be a separate Worker.
📋 Clean-up steps
  1. Wait until prod has been stable for 2+ weeks — you may need dev3 for rollback.
  2. Cloudflare Dashboard → Workers → alshorty-v3 → Settings → Domains & Routes → delete the dev3.alshorty.com/api* and dev3-api.alshorty.com/* routes.
  3. Create a new separate Worker called alshorty-dev and add the dev3 routes there.
Razorpay checkout is working. Order creation fixed (receipt was 44 chars, limit is 40). SDK now loaded correctly via DOM API in useEffect — not broken JSX script tag. Webhook HMAC verified. Replay protection active.
COMPLETED ✓
Razorpay order creation: receipt ≤40 chars (was 44 — caused all 500 errors), currency always INR
DONE: Receipt now uses base-36 hashed email + timestamp. Real Razorpay error message passed through to toast.
Razorpay SDK loaded via document.createElement('script') in useEffect — not a JSX script tag
DONE: React silently ignores JSX script tags. "window.Razorpay is not a constructor" error eliminated.
Webhook HMAC-SHA256 verified before processing + replay protection via KV nonce (10-min TTL)
DONE: Body read as raw text before JSON parse. Returns 200 on duplicate events so Razorpay doesn't retry.
subscription.cancelled and subscription.halted webhooks handled — downgrade to free immediately
DONE: Previously only payment.captured was handled. Cancelled subs left users on PRO indefinitely.
OPEN ITEMS
Handle payment.refunded webhook — downgrade user plan when refund issued
HIGH: Refunded users stay on PRO indefinitely. Revenue leak + unfair to paying users.
📋 Step-by-step
  1. In worker/src/routes/payments.js, webhook handler: add a case for the payment.refunded event.
  2. Extract the email from event.payload.payment.entity.notes.email.
  3. Update D1: UPDATE users SET plan='free', sub_expires_at=NULL WHERE email=?
  4. Log to the audit trail: INSERT INTO audit_log (email, action, meta, created_at).
  5. Optionally send a downgrade notification email via Resend.
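A sketch of the handler case. It assumes env.DB is the D1 binding and uses the table/column names from the steps above; `extractRefundEmail` and `handleRefund` are illustrative names, not existing code.

```javascript
// Defensive extraction of the customer email from a Razorpay
// payment.refunded event (same notes.email path the steps above describe).
function extractRefundEmail(event) {
  return event?.payload?.payment?.entity?.notes?.email ?? null;
}

// Handler sketch: downgrade the user and leave an audit trail.
async function handleRefund(event, env) {
  const email = extractRefundEmail(event);
  if (!email) return; // malformed event — log it, still return 200 so Razorpay stops retrying
  await env.DB.prepare(
    "UPDATE users SET plan='free', sub_expires_at=NULL WHERE email=?",
  ).bind(email).run();
  await env.DB.prepare(
    'INSERT INTO audit_log (email, action, meta, created_at) VALUES (?,?,?,?)',
  ).bind(
    email,
    'refund_downgrade',
    JSON.stringify({ payment_id: event?.payload?.payment?.entity?.id }),
    Date.now(),
  ).run();
}
```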
Poll for plan activation after payment (KV eventual consistency delay — up to 60s)
MEDIUM: The user may see the "Free" plan briefly after upgrading due to KV global replication delay.
📋 Step-by-step
  1. After the Razorpay payment success callback fires, redirect to /billing?upgraded=1 instead of refreshing in place.
  2. On the Billing page, detect ?upgraded=1 in the URL params. If present, start polling api.auth.session() every 2 seconds for up to 30 seconds.
  3. While polling, show a spinner: "Activating your Pro plan… this takes a few seconds."
  4. Once session.plan === 'pro', stop polling and show a success message. Remove ?upgraded=1 from the URL.
💡 Most KV updates propagate in under 5 seconds. 30s polling covers even the slowest edge cases.
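The polling loop splits cleanly into a pure decision function plus an async driver. Names are illustrative; `fetchSession` would wrap the existing api.auth.session() call.

```javascript
// Pure decision: keep polling until the session reports 'pro' or timeout.
function pollState(plan, elapsedMs, { timeoutMs = 30_000 } = {}) {
  if (plan === 'pro') return 'activated';
  if (elapsedMs >= timeoutMs) return 'timed_out';
  return 'polling';
}

// Async driver used on the Billing page when ?upgraded=1 is present.
// `sleep` is injectable so tests don't have to wait in real time.
async function waitForPro(fetchSession, { intervalMs = 2000, timeoutMs = 30_000, sleep } = {}) {
  const wait = sleep ?? ((ms) => new Promise((r) => setTimeout(r, ms)));
  const start = Date.now();
  for (;;) {
    const { plan } = await fetchSession();
    const state = pollState(plan, Date.now() - start, { timeoutMs });
    if (state !== 'polling') return state; // 'activated' or 'timed_out'
    await wait(intervalMs);
  }
}
```

On `'timed_out'`, a reasonable fallback is a "still activating — refresh in a minute" message rather than an error, since KV writes almost always land well inside the window.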
Set up Razorpay webhook failure monitoring — check delivery history weekly
MEDIUM: If a webhook silently fails, users pay but stay on the Free plan. The Razorpay dashboard shows failed deliveries.
📋 Monitoring setup
  1. Razorpay Dashboard → Settings → Webhooks → your endpoint → check "Recent Deliveries" weekly.
  2. Consider logging all received webhook events to D1 with timestamp and event type for a permanent audit trail.
  3. Add a Worker health endpoint: GET /__admin/billing/verify?email=x to manually check a user's plan status vs. what Razorpay says.
🚨
Critical rule: wrangler.toml is committed to git. Any value in [vars] is visible to anyone with repo access. ALL sensitive values must be set via wrangler secret put and referenced as env.SECRET_NAME — never hardcoded in the file.
COMPLETED ✓
Dev routes (/__dev/*) return 404 in production — ENV=production guard in wrangler.toml
DONE: Dev-only routes never exposed in prod. Clean environment separation.
GitHub Actions CI/CD uses secrets references — no hardcoded values in workflow YAML
DONE: All secrets via ${{ secrets.* }}. VITE_ vars are build-time only — acceptable for public client IDs.
Admin secret in sessionStorage (clears on tab close) — not localStorage (persists forever)
DONE: Both Admin.tsx and api.ts updated in April 2026 patch.
REQUIRED ACTION
Verify all 6 production Worker secrets are set via wrangler secret put
CRITICAL: Run these commands to verify. If any are missing, the corresponding feature will 500.
📋 Required secrets
  1. wrangler secret put ADMIN_SECRET — the secret header value for /__admin/* endpoints
  2. wrangler secret put RESEND_API_KEY — magic link emails will fail without this
  3. wrangler secret put RAZORPAY_KEY_ID — payment order creation
  4. wrangler secret put RAZORPAY_KEY_SECRET — payment order creation
  5. wrangler secret put RAZORPAY_WEBHOOK_SECRET — webhook HMAC verification
  6. wrangler secret put GOOGLE_CLIENT_ID — Google OAuth (and Microsoft when activated)
💡 To verify what's set: wrangler secret list — shows secret names (not values) currently deployed.
Add .wrangler/ and .dev.vars to .gitignore
HIGH: .wrangler/ contains local KV SQLite state. If committed, it leaks dev data and bindings config.
📋 Quick Fix
  1. Add to .gitignore in the repo root: .wrangler/, .dev.vars, *.sqlite, *.sqlite-shm, *.sqlite-wal
  2. Check if already committed: git ls-files .wrangler/. If the output is non-empty, run: git rm -r --cached .wrangler/ && git commit -m "remove wrangler state from tracking"
Update compatibility_date in wrangler.toml from 2024-06-01 to current
LOW: 12+ months of Workers runtime improvements and security patches missed.
📋 Safe upgrade path
  1. Update worker/wrangler.toml: compatibility_date = "2025-04-01"
  2. Test locally first: wrangler dev --remote — some API behaviours changed between dates.
  3. Deploy to dev3 first. Test the full flow (auth, shorten, redirect, payments) before pushing to prod.
Stack: Worker handles api.alshorty.com/* (via CNAME) + specific paths on alshorty.com (via Worker Routes). Pages serves the React SPA for everything else. D1 for relational data (users, analytics, API keys). KV for links, sessions, rate limits, bio slug index.
COMPLETED ✓
All 4 domain prefixes working: /link/ /s/ /go/ /run/ — backward-compatible with legacy bare-path links
DONE: Old links (alshorty.com/code) still work via fallback handler. New links use prefixed URLs.
Cloudflare Worker routes configured: /api/*, /link/*, /s/*, /go/*, /run/*, /bio/*, /health, /auth/verify
DONE: No /* catch-all (would break SPA). Worker intercepts before Pages for all short-link paths.
Bio slug index: O(1) KV lookup via bio_slug:{slug} — was O(n) full scan
DONE: Index written on create/update. Existing bios backfilled on first visit. No data migration needed.
PRO feature gates: frontend (ProRoute, UI locks) + backend (UPGRADE_REQUIRED 403) two-layer enforcement
DONE: Bulk, Link-in-Bio, Analytics, Advanced options all gated at both layers. Cannot bypass via API.
PRO users get instant 302 redirect. Free/anon users get 5-second interstitial page.
DONE: isFree check uses owner_plan stored on the link at creation (fast, no DB lookup at redirect time).
404 for dead short links: styled HTML page instead of bare JSON {"error":"NOT_FOUND"}
DONE: Black-screen "Link not found" JSON display in Chrome fixed. HTML 404 page with "Create your own →" CTA.
Worker fully modularised: routes/, utils/, admin/, templates/, qr/, emails/, config/
DONE: index.js is only a router + security gate. Each module has a single responsibility.
OPEN ITEMS
Analytics writes on redirect hot path — move to ctx.waitUntil() background
HIGH: A D1 INSERT on every redirect adds latency. At high volume it risks D1 write rate limits.
📋 Fix: Background writes
  1. In routes/redirect.js, find the analytics insert call in recordClick().
  2. Wrap it: ctx.waitUntil(recordClick(request, env, link, code, destination))
  3. Also wrap the KV click_count increment in waitUntil.
  4. Return the Response.redirect(destination, 302) BEFORE any DB writes. The user never waits for analytics.
✅ This change alone can reduce p95 redirect latency by 30-60ms at scale.
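The shape of the fix, reduced to its essentials. `respondAndRecord` is an illustrative name; `recordClick` stands in for the existing analytics call.

```javascript
// Return the 302 immediately and push analytics into the background.
// ctx.waitUntil() keeps the Worker alive until the promise settles, so
// the visitor never waits on the D1 insert or KV counter bump.
function respondAndRecord(destination, ctx, recordClick) {
  ctx.waitUntil(recordClick()); // recordClick() returns a promise
  return Response.redirect(destination, 302);
}
```

Any error thrown inside the background promise should be caught and logged within `recordClick` itself; an unhandled rejection in waitUntil can surface as a Worker error even though the redirect already succeeded.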
Remove V3Launch.tsx banner component post-stabilisation
LOW: A 25KB component used only for the launch banner. Remove from App.tsx and Dashboard.tsx once the launch period ends.
Add unit test suite for validation, auth, and security helpers (vitest is configured)
MEDIUM: Priority: validateUrl() edge cases, validateProxyUrl() private IP ranges, isValidEmail() multi-part TLDs.
📋 Test coverage priorities
  1. validateUrl(): test private IPs (10.x, 192.168.x), .tk/.cf TLDs, shortener domains, phishing keywords, javascript: scheme, empty string, null, very long strings.
  2. validateProxyUrl(): test 192.168.x.x, 10.x.x.x, 172.16-31.x.x, 127.0.0.1, ::1, 169.254.169.254, file://, ftp://.
  3. isValidEmail(): test co.in, com.au, ac.uk, gov.in multi-part TLDs — historically buggy.
  4. Webhook HMAC: test valid signature, tampered body, wrong secret, replay (duplicate nonce).
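To make the validateProxyUrl() cases concrete, here is a stand-in for its private-IP layer — NOT the project's implementation, just a minimal reference for the test matrix. A real check should first confirm the hostname actually parses as an IP, since a name like `10.evil.com` would fool naive prefix tests.

```javascript
// Stand-in private-host check for illustrating the test matrix above.
function isPrivateHost(hostname) {
  if (hostname === 'localhost' || hostname === '127.0.0.1' || hostname === '::1') return true;
  if (hostname === '169.254.169.254') return true; // cloud metadata endpoint
  // Simplified: real code should verify these are genuine IPv4 addresses.
  if (/^10\./.test(hostname)) return true;          // 10.0.0.0/8
  if (/^192\.168\./.test(hostname)) return true;    // 192.168.0.0/16
  if (/^172\.(1[6-9]|2\d|3[01])\./.test(hostname)) return true; // 172.16.0.0/12
  return false;
}
```

In vitest the same matrix becomes `it.each([...])('blocks %s', ...)` run against the real validateProxyUrl().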
COMPLETED ✓
Multi-layer rate limiting: 120/min global IP, 20/min auth endpoints, per-user plan quotas
DONE: security.js applies global limits. links.js enforces day/month quotas. Anon: 2 links total per 24h.
Honeypot auto-ban: 3 hits on scanner paths = 7-day IP ban (returns generic 404)
DONE: Paths: /wp-admin, /.env, /phpmyadmin, /.git/config. Generic 404 — no feedback to attacker.
IP hashing: IPs never stored raw — salted SHA-256 hash used for all rate limit keys
DONE: Privacy-by-design. Hash is deterministic (rate limiting still works) but not reversible.
Expanded block lists: 40+ shortener domains, 25+ BLOCKED_DOMAINS, 35+ phishing patterns
DONE: IP grabbers (grabify.link), paste sites (pastebin.com), adult content, malware, crypto scams.
Bot UA blocking on write endpoints (sqlmap, nikto, dirbuster, masscan) — read endpoints unaffected
DONE: Blunt curl/wget block removed (was breaking legitimate API users). Precise scanner-tool blocking retained.
OPEN ITEMS
Disabled user accounts: links still redirect — propagate disable to all user's links
CRITICAL: Same as the Auth section. Phishing links remain live after the account is disabled.
📋 Fix approach
  1. Best approach: in admin-backend.js, when disabling a user, run a KV list scan for all their links (prefix: 'link:', filter by link.owner === email) and set is_active: false on each.
  2. This is a one-time background operation on the admin action — not on every redirect request. Use ctx.waitUntil() to run it async.
  3. For immediate effect: add an owner check in redirect.js as a faster intermediate fix (2 KV lookups per redirect — owner lookup + link lookup).
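A sketch of the batch-disable scan from step 1. It assumes the KV binding, the `'link:'` prefix, and the `link.owner` / `is_active` fields described above; pagination follows the standard KV `list()` cursor contract.

```javascript
// Disable every link owned by a user. Run from the admin disable handler
// inside ctx.waitUntil() so the admin response returns immediately.
async function disableUserLinks(kv, ownerEmail) {
  let cursor;
  let disabled = 0;
  do {
    const page = await kv.list({ prefix: 'link:', cursor });
    for (const { name } of page.keys) {
      const raw = await kv.get(name);
      if (!raw) continue;
      const link = JSON.parse(raw);
      if (link.owner === ownerEmail && link.is_active !== false) {
        link.is_active = false;
        await kv.put(name, JSON.stringify(link));
        disabled++;
      }
    }
    cursor = page.list_complete ? undefined : page.cursor;
  } while (cursor);
  return disabled;
}
```

The full-namespace scan is acceptable as a rare admin action; if link volume grows large, an `owner:{email}` index (a KV key listing that user's codes) would avoid scanning every link.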
Google Safe Browsing API integration (also in Validation section)
MEDIUM: High legal-liability leverage. The free tier covers your scale for months.
⚠️
Do not apply for AdSense yet. Cookie consent banner is missing (automatic rejection), blog cover images are broken, and sitemap is incomplete. Fix these three first. AdSense reviewers manually check your site.
COMPLETED ✓
Privacy Policy mentions AdSense cookies (__gads, __gpi, IDE) and opt-out links
DONE: Links to Google ads settings and aboutads.info opt-out. Required AdSense disclosure present.
All legal pages present and linked in footer: Privacy, Terms, DMCA, Cookie Policy, About, Contact, Sitemap
DONE: Required by AdSense. All accessible without login.
15 original blog articles with headings, read time, categories — strong editorial depth
DONE: Bitly alternatives, URL shortener APIs, QR codes, UTM tracking, affiliate marketing, link-in-bio.
Ad-bearing pages (blog, home, tools, redirect) accessible without login
DONE: AdSense bots can verify content. Correct public/authenticated page separation.
AdSlot component renders nothing until VITE_ADSENSE_CLIENT_ID is set — safe to ship
DONE: No ad errors in production until you're ready. Clean activation path.
HTTPS enforced on all origins via Cloudflare
DONE: AdSense requires HTTPS. All app URLs use https://. Enable "Always Use HTTPS" in Cloudflare if not already.
REQUIRED BEFORE APPLYING
Cookie consent banner (GDPR — AdSense will auto-reject without this)
CRITICAL
Blog cover images created and deployed (15 broken img tags visible to reviewers)
CRITICAL
ads.txt: replace placeholder ca-pub-XXXXXXXXXXXXXXXX with real publisher ID
CRITICAL
Sitemap regenerated to include all 15 blog posts and submitted to GSC
CRITICAL
OG image created and deployed (1200×630px)
HIGH
PageSpeed Insights score: 85+ on mobile before applying
HIGH: Compress the logo (173KB→<20KB) and favicon (108KB→<5KB) first.
Add AdSense ad slot to redirect interstitial page — primary monetisation window for URL shorteners
HIGH: The 5-second page is where most URL shortener AdSense revenue comes from. No ad slot there yet.
📋 Implementation + AdSense policy notes
⚠️ AdSense Better Ads Standards: the "Continue" button must always be visible above the fold and never visually adjacent to the ad unit. Max redirect delay: keep it at 5 seconds or less.
  1. In worker/src/templates/redirect-page.js, add an AdSense ad slot div between the destination info and the footer. Include the publisher ID from env.
  2. Gate it: only render the ad slot when link.owner_plan !== 'pro'. PRO users' links have no ads.
  3. Keep the "Continue now →" button above the fold. Ensure it is never visually near the ad unit.
  4. Pre-reserve space for the ad (e.g. min-height: 250px) to prevent layout shift.
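The gating in step 2 reduces to a small template helper. Names are illustrative, and a real ad unit also needs a `data-ad-slot` id from your AdSense account (omitted here).

```javascript
// Render the interstitial ad slot HTML only for links owned by free/anon
// users. pubId comes from env; PRO-owned links stay ad-free.
function renderAdSlot(link, pubId) {
  if (!pubId || link.owner_plan === 'pro') return '';
  return `
    <div class="ad-wrap" style="min-height:250px"><!-- reserved to avoid layout shift -->
      <ins class="adsbygoogle" style="display:block"
           data-ad-client="${pubId}" data-ad-format="auto"></ins>
    </div>`;
}
```

Returning an empty string when `pubId` is unset mirrors the existing AdSlot component's behaviour: nothing renders until VITE_ADSENSE_CLIENT_ID (or its Worker-side equivalent) is configured.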
Verify alshorty.com domain age ≥ 6 months before applying
HIGH: Google commonly rejects domains under 6 months old. Check WHOIS for the registration date.
Set up Google Analytics 4 and link to AdSense after approval
MEDIUM: No GA4 detected. AdSense reviewers look for organic traffic evidence. The GA4 ↔ AdSense link enables enhanced reporting.
📋 Setup steps
  1. Create a GA4 property at analytics.google.com for alshorty.com.
  2. Add gtag.js to index.html — conditionally loaded behind cookie consent (same pattern as the AdSense script).
  3. After AdSense approval, link GA4 to AdSense in both platforms for enhanced reporting.
Activate AdSense: set VITE_ADSENSE_CLIENT_ID in Cloudflare Pages + uncomment script in index.html
MEDIUM: Do this AFTER AdSense approval. Do not enable it before approval — it can delay the review.
📋 Activation steps
  1. In Cloudflare Pages Settings → Variables and Secrets: add VITE_ADSENSE_CLIENT_ID = ca-pub-YOURPUBID
  2. In ui/index.html, uncomment the AdSense <script async src="https://pagead2.googlesyndication.com/..."> tag. Replace the placeholder publisher ID.
  3. Add VITE_ADSENSE_CLIENT_ID to the GitHub Actions deploy-ui.yml env vars.
  4. Deploy → ads render wherever AdSlot components exist. They render nothing if the env var is unset.
COMPLETED ✓
robots.txt: blocks all app/auth paths, allows blog/tools/pricing, references sitemap
DONE: Dashboard, analytics, billing, auth, API, admin all blocked. Public content open for crawling.
SEO meta tags, canonical URLs, OG tags, Twitter Card, JSON-LD structured data in index.html
DONE: lib/seo.ts + useSEO() hook per page. Organization, WebSite, SoftwareApplication JSON-LD schemas.
sitemap.xml works at /sitemap.xml — footer link fixed to open as <a> not React Router Link
DONE: Was causing a 404 when clicked from the footer. Now a plain anchor element.
OPEN ITEMS
Sitemap missing all 15 blog posts — regenerate with generate-sitemap.mjs
CRITICAL: The current sitemap has 9 static pages only. Google can't discover your blog content.
Add GSC verification code to index.html and submit sitemap
HIGH: Currently commented out. Required to see search impressions and submit the sitemap.
Compress logo1.png (173KB) and favicon.png (108KB) — Core Web Vitals impact
HIGH: 280KB+ of logo assets on every page load. Kills the LCP score. Target: logo <20KB WebP, favicon <5KB.
Add sitemap generation as a CI build step so it never goes stale
MEDIUM: Currently regenerated manually. Every new blog post should auto-update the sitemap on deploy.
📋 CI integration
  1. In .github/workflows/deploy-ui.yml, before the vite build step, add: - run: node generate-sitemap.mjs
  2. The generated ui/public/sitemap.xml will be included in the Vite build output and deployed automatically.
💡 This means you never have to remember to manually regenerate the sitemap when adding blog posts.
ℹ️
Rule of thumb: Don't start V3.1 features until V3 has been stable in production for 2 full weeks. This gives time to catch and fix real-world bugs without compounding new code on top of them.

🚀 V3.1 — Next 4 Weeks

  • Cookie consent banner (GDPR — blocks AdSense)
  • OG image created and deployed
  • Blog cover images (15 articles)
  • GSC verification + sitemap submitted
  • GA4 set up and linked to AdSense pipeline
  • Image assets compressed (logo + favicon)
  • Razorpay: payment.refunded webhook handler
  • ads.txt publisher ID filled in
  • Disabled user → links 410 cascade fix
  • HSTS header added to Worker responses

🔧 V3.2 — 1–2 Months

  • CSRF Origin check on all mutating endpoints
  • Google Safe Browsing API integration
  • Analytics writes moved to ctx.waitUntil() background
  • Payment plan activation poll (KV consistency UX)
  • Ad slot added to redirect interstitial page
  • UTM parameter length cap and control-char strip
  • Unit test suite (validateUrl, webhook HMAC, etc.)
  • Admin link pagination (cursor-based at scale)
  • Razorpay webhook failure monitoring/alerting

✨ V4 Features — 2–4 Months

  • Custom domain support for PRO users (links.yoursite.com)
  • Team/workspace accounts (multi-user collaboration)
  • Webhook delivery for click events (Zapier/n8n)
  • Link rotation (round-robin across multiple destinations)
  • Retargeting pixel management UI (FB, Google, TikTok)
  • Folder/tag organisation for link management
  • Scheduled link activation/deactivation
  • API v2 with rate-limited public sandbox
  • QR code bulk generation

🏛️ Infrastructure — 3–6 Months

  • Cloudflare Durable Objects for atomic click counters
  • D1 read replicas once Cloudflare GA
  • Admin link audit log (who changed what and when)
  • Automated billing emails (invoice on charge)
  • E2E tests (Playwright): shorten→redirect, checkout flow
  • OpenAPI spec auto-generated from Worker route handlers
  • Multi-region KV pre-warming for viral link bursts
  • Automated abuse report ingestion (GSB reporting)
SET UP NOW
Cloudflare Worker error alerting: error rate >1% over 5 minutes → email alert
HIGHFree in Cloudflare dashboard. 5 minutes to set up.
▼ Show Steps
📋 Setup
  1. Cloudflare Dashboard → Workers & Pages → alshorty-v3 → Observability → Notifications.
  2. Create alert: "Worker Error Rate" → trigger when >1% over 5 minutes → your email.
  3. Create second alert: "Worker CPU Time" → trigger when >50ms average → performance regression signal.
Uptime monitoring for alshorty.com and api.alshorty.com — alert within 3 minutes of downtime
HIGHFree at BetterUptime or UptimeRobot. Takes 5 minutes.
▼ Show Steps
📋 Setup
  1. Sign up at betteruptime.com (free) or uptimerobot.com (free).
  2. Add: https://alshorty.com/health → expect 200 → check every 3 minutes.
  3. Add: https://api.alshorty.com/api/status → expect 200 → check every 3 minutes.
  4. Set alerts to email + SMS if available.
Enable Cloudflare D1 query analytics — identify slow queries before they cause timeouts
MEDIUMCloudflare Dashboard → D1 → SHORTY_DB → Queries. Look for queries >50ms.
▼ Show Steps
📋 What to check
  1. In D1 query analytics, filter for queries taking >50ms. These are your bottlenecks.
  2. Most common issue: unindexed clicks table. Verify: CREATE INDEX IF NOT EXISTS idx_clicks_link_code ON clicks(link_code, created_at)
  3. Also check the users table has an index on email — it's queried on every authenticated request.
Weekly check: Razorpay webhook delivery history for failed events
MEDIUMIf webhook fails silently, users pay but stay on Free plan. Calendar reminder: every Monday.
Create separate alshorty-dev Worker for dev3 routes (separate from production Worker)
LOWdev3 routes currently on the prod Worker. Should be isolated to avoid production/dev confusion.
⚠️
This checklist resets. Unlike other panels, these items should be re-checked before every deploy. They are not "done forever" — they are "done for this deployment."
WORKER DEPLOY CHECKS
Verify FRONTEND_ORIGIN = https://alshorty.com in wrangler.toml (not dev3)
CRITICALThis keeps getting accidentally reset. Run: grep FRONTEND_ORIGIN worker/wrangler.toml
wrangler deploy --dry-run passes with no errors
HIGHcd worker && wrangler deploy --dry-run 2>&1 | grep -i "error\|warn"
No secrets accidentally in wrangler.toml [vars] section
HIGHgrep -E "KEY|SECRET|TOKEN|PASSWORD" worker/wrangler.toml — output should be empty or only comments
UI DEPLOY CHECKS
TypeScript build passes: tsc --noEmit && vite build with no errors
HIGHcd ui && tsc --noEmit && vite build 2>&1 | tail -20
sitemap.xml regenerated if any blog posts were added or removed
MEDIUMnode generate-sitemap.mjs — verify output includes all blog URLs
POST-DEPLOY SMOKE TESTS
Smoke test: shorten a URL → click it → verify 5-second redirect page appears
CRITICALOpen the short link in incognito. Countdown must appear and redirect to correct destination.
Check browser console: only expected noise (__admin/config 401 on Pricing page is normal)
HIGHOpen DevTools on: home page, dashboard, pricing. Any other 4xx/5xx on API calls = real issue.
Verify magic link email has correct domain (alshorty.com, not dev3)
HIGHRequest a magic link → check the email → link must say alshorty.com in the URL.
alshorty.com/sitemap.xml returns valid XML (not SPA 404)
MEDIUMcurl -I https://alshorty.com/sitemap.xml → expect Content-Type: text/xml and HTTP/2 200
alshorty.com/ads.txt is accessible and contains valid content (not ## commented placeholder)
MEDIUMcurl https://alshorty.com/ads.txt → should show google.com, ca-pub-YOURPUBID, DIRECT...
🔑 Credentials & Keys (never store real values here — store hints only)
📌 Decisions & Context
🐛 Bug Log & Known Issues
The current stack handles ~10 million redirects/month comfortably on the Workers Paid plan before any optimisation is needed; on the free tier, D1's 100k writes/day is the first ceiling you hit. The biggest immediate risk is D1 writes on the redirect hot path — fix that first.
📊 Current Limits (Cloudflare Free/Paid)
Resource                | Free      | Paid ($5/mo)
Worker requests/day     | 100,000   | 10 million
Worker CPU time/request | 10ms      | 50ms
KV reads/day            | 100,000   | Unlimited
KV writes/day           | 1,000     | Unlimited
D1 reads/day            | 5 million | 25 billion
D1 writes/day           | 100,000   | 50 million

⚠️ Each redirect = 1 Worker req + 1 KV read + 1 D1 write (analytics). At 100k redirects/day you'll hit the free D1 write limit. Move analytics to waitUntil() first.

🚦 When to Scale
Signal                        | Action
Worker CPU >30ms avg          | Profile + optimise routes
D1 writes >80k/day            | Upgrade to $5 Workers plan
KV reads >80k/day (free)      | Upgrade or batch reads
D1 queries >50ms              | Add indexes (see below)
>50k active links in KV       | Consider prefix-sharding
Click counters race condition | Durable Objects (V4)
IMMEDIATE PERFORMANCE WINS
Move analytics D1 writes to ctx.waitUntil() — removes ~30ms from every redirect
HIGHEvery redirect currently waits for D1 INSERT before returning the 302. User shouldn't wait for analytics.
▼ Show Steps
📋 Fix
  1. In routes/redirect.js, find where recordClick() is called.
  2. Change from: await recordClick(...)
     To: ctx.waitUntil(recordClick(...))
  3. Return the redirect response immediately above this line — user gets the 302 before any DB work.
  4. Same for the KV click_count increment — wrap it in waitUntil.
✅ This single change reduces p95 redirect latency by 30-60ms and protects against D1 rate limits.
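The fix can be sketched end to end. This is a minimal sketch, not the real routes/redirect.js: the binding name SHORTY_LINKS is taken from the KV section of this document, recordClick is simulated, and the HTML string stands in for templates/redirect-page.js (the smoke-test section expects the interstitial's 200, with the eventual redirect happening client-side).

```javascript
// Sketch: return the interstitial immediately, run analytics in the
// background via ctx.waitUntil. Binding/helper names are assumptions
// based on names used elsewhere in this document.
async function handleRedirect(request, env, ctx) {
  const code = new URL(request.url).pathname.split('/').pop();
  const dest = await env.SHORTY_LINKS.get(`link:${code}`);
  if (!dest) return new Response('Not found', { status: 404 });

  // Fire-and-forget: the Worker stays alive until the promise settles,
  // but the visitor does not wait for the D1 INSERT.
  ctx.waitUntil(recordClick(env, code));

  // Stand-in for templates/redirect-page.js (5-second countdown page).
  return new Response(`<html><!-- countdown to ${dest} --></html>`, {
    status: 200,
    headers: { 'Content-Type': 'text/html' },
  });
}

async function recordClick(env, code) {
  // Production: INSERT INTO clicks ...; simulated here as a short delay.
  await new Promise((resolve) => setTimeout(resolve, 5));
  env.recordedClicks = (env.recordedClicks || 0) + 1;
}
```

The key property: the response object is constructed and returned before recordClick settles, so analytics latency never appears in the visitor's redirect time.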
Verify D1 indexes exist on clicks(link_code, created_at) and users(email)
HIGHWithout these, every analytics query does a full table scan. Kills performance at 100k+ clicks.
▼ Show Steps
📋 Check and add indexes
  1. Check existing indexes: wrangler d1 execute alshorty-db-prod --remote --command="SELECT name,tbl_name FROM sqlite_master WHERE type='index'"
  2. Add if missing: wrangler d1 execute alshorty-db-prod --remote --command="CREATE INDEX IF NOT EXISTS idx_clicks_link ON clicks(link_code, created_at DESC)"
  3. wrangler d1 execute alshorty-db-prod --remote --command="CREATE INDEX IF NOT EXISTS idx_clicks_time ON clicks(created_at DESC)"
  4. wrangler d1 execute alshorty-db-prod --remote --command="CREATE INDEX IF NOT EXISTS idx_users_email ON users(email)"
Upgrade to Cloudflare Workers Paid plan ($5/month) when approaching 80k requests/day
MEDIUMFree plan: 100k req/day + 10ms CPU. Paid: 10M req/day + 50ms CPU + no KV write limits.
▼ Show Steps
📋 How to upgrade
  1. Monitor daily requests: Cloudflare Dashboard → Workers → alshorty-v3 → Metrics → Requests.
  2. When approaching 80k/day, go to: Cloudflare → Workers → Plans → Upgrade to Workers Paid ($5/month).
  3. No code changes needed — the same Worker runs on the paid plan automatically.
💡 $5/month gives you 10M requests/day. At average 30-second sessions, that's ~3,000 concurrent users.
Implement click counter with Cloudflare Durable Objects for atomic increments (V4)
LOWCurrent KV read-modify-write has race condition. Two simultaneous clicks both read count=5 and both write 6 instead of 7.
▼ Show Steps
📋 When to do this
  1. This only matters for links getting 100+ simultaneous clicks (viral links). At your current scale, the race condition causes maybe a 0.1% under-count. Acceptable.
  2. When you need exact counts: create a Durable Object LinkCounter with an atomic increment() method. Each link gets its own DO instance keyed by link code.
  3. Durable Objects are available on the Workers Paid plan. No extra cost at your scale.
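Both halves of this can be modelled in a few lines: the lost update the racy KV pattern produces, and the serialised counter a Durable Object gives you. LinkCounter below only imitates the per-object serialisation; it is not the real Durable Objects API.

```javascript
// Racy KV-style read-modify-write: two concurrent writers both read the
// same value, then both write value+1, losing one increment.
async function racyIncrement(kv, key) {
  const current = Number(await kv.get(key)) || 0;
  await new Promise((resolve) => setTimeout(resolve, 5)); // network latency
  await kv.put(key, String(current + 1));
}

// DO-style fix: one counter object per link code; increments are chained
// onto a queue so they can never interleave.
class LinkCounter {
  constructor() {
    this.count = 0;
    this.queue = Promise.resolve();
  }
  increment() {
    this.queue = this.queue.then(() => { this.count += 1; });
    return this.queue.then(() => this.count);
  }
}
```

Two concurrent racyIncrement calls leave the counter at 1 instead of 2; two concurrent LinkCounter.increment calls correctly reach 2.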
Bio slug lookup: O(1) KV index (bio_slug:{slug}) — was O(n) full scan
DONEIndex written on create/update. Old bios backfilled on first visit. Bio pages now scale correctly.
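The pattern behind this fix, sketched with assumed key shapes (bio:{userId} for the record, bio_slug:{slug} for the index; the real record fields may differ):

```javascript
// Write the bio record plus a slug → userId index entry; a lookup then
// costs two O(1) KV gets instead of a full keyspace scan.
async function saveBio(kv, userId, slug, data) {
  await kv.put(`bio:${userId}`, JSON.stringify({ slug, ...data }));
  await kv.put(`bio_slug:${slug}`, userId);
}

async function resolveBioBySlug(kv, slug) {
  const userId = await kv.get(`bio_slug:${slug}`);
  if (!userId) return null; // unknown slug, or pre-index bio awaiting backfill
  const raw = await kv.get(`bio:${userId}`);
  return raw ? JSON.parse(raw) : null;
}
```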
Multi-layer rate limiting: 120/min global IP, 20/min auth, per-user plan quotas
DONEsecurity.js applies global limits. links.js enforces day/month quotas. Anon: 2 links total per 24h.
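For reference, the global limit reduces to a fixed-window counter per IP. A sketch (the ratelimit: key prefix matches the KV section of this document; the exact algorithm in security.js may differ, and the now parameter exists only to make the window deterministic in tests):

```javascript
// Fixed-window limiter: one KV counter per IP per window. Returns false
// when the caller should receive a 429.
async function checkRateLimit(kv, ip, limit = 120, windowSecs = 60, now = Date.now()) {
  const window = Math.floor(now / 1000 / windowSecs);
  const key = `ratelimit:${ip}:${window}`;
  const count = Number(await kv.get(key)) || 0;
  if (count >= limit) return false;
  // A TTL lets old windows expire on their own instead of needing cleanup.
  await kv.put(key, String(count + 1), { expirationTtl: windowSecs * 2 });
  return true;
}
```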
V3 is live. The 7-day tasks fix immediate revenue and legal blockers. 30 days gets AdSense approved. 3 months builds organic traffic. 6 months to first meaningful MRR milestone. 12 months positions you for a real product launch.
🔥 Days 1–7 Do This Week
7 tasks · Critical blockers · ~12 hours total
Fix disabled user → links still redirect (security gap)
CRITICAL~2 hours. Banned abusers keep live phishing links. See Auth section for implementation.
Build and deploy cookie consent banner
CRITICAL~3 hours. Blocks AdSense, GDPR liability until fixed. See Do Today for steps.
Regenerate sitemap.xml with all 15 blog posts + submit to GSC
CRITICAL~30 minutes. Google can't find your content without this.
Set up uptime monitoring (BetterUptime/UptimeRobot)
HIGH~15 minutes. Free. Know when site goes down before users report it.
Set up Cloudflare Worker error alerting (error rate >1% → email)
HIGH~10 minutes. Free. Know about 5xx spikes immediately.
Handle payment.refunded webhook — downgrade user on refund
HIGH~2 hours. Refunded users stay on PRO indefinitely. Revenue leak.
Move D1 analytics writes to ctx.waitUntil() background
HIGH~1 hour. Removes 30ms latency from every redirect. Protects D1 rate limits.
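For the first task in the list (disabled users' links), the cheapest fix is a redirect-time ownership check rather than mass-deleting KV keys: look up the owner on each redirect and serve 410 Gone when the account is disabled. A sketch using D1's prepare/bind/first API; the link record shape is an assumption.

```javascript
// Returns false (→ caller serves 410 Gone) when the link's owner is
// disabled. Anonymous links have no owner and always pass.
async function ownerActive(db, link) {
  if (!link.user_id) return true;
  const row = await db
    .prepare('SELECT is_disabled FROM users WHERE id = ?')
    .bind(link.user_id)
    .first();
  return !(row && row.is_disabled);
}
```

A per-link cached copy of the result (short-TTL KV entry) keeps the hot path from adding a D1 read to every redirect.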
🟠 Days 8–30 AdSense Ready
6 tasks · Get AdSense approved · ~20 hours total
Create and deploy OG image (1200×630px) + all 15 blog cover images
HIGH~4 hours. Required for AdSense review. Social previews broken without OG image.
Compress logo1.png (173KB→<20KB WebP) and favicon.png (108KB→<5KB)
HIGH~1 hour. Squoosh.app. Required for 85+ PageSpeed score before AdSense application.
Fix ads.txt with real publisher ID + apply for Google AdSense
HIGHAfter cookie banner + images are done. Apply at adsense.google.com. Review takes 1–4 weeks.
Add Google Analytics 4 (conditionally loaded with cookie consent)
MEDIUM~2 hours. AdSense reviewers look for organic traffic evidence. Link GA4 to AdSense after approval.
Add AdSense ad slot to free-user redirect interstitial page
HIGH~2 hours. The 5-second redirect page is your primary monetisation window. No ad slot there yet.
Write 5 new blog posts targeting high-value keywords (URL shortener comparison, etc.)
MEDIUM~10 hours. Each blog post is a landing page for organic traffic. Compound returns over time.
🟢 Months 1–3 Traffic & Revenue
5 tasks · First paying users · SEO foundation
Add CSRF Origin checks on all mutating endpoints (createLink, deleteLink, updateBio)
MEDIUMSecurity hardening. ~3 hours. See Auth section for implementation.
Integrate Google Safe Browsing API (free tier, 10k lookups/day)
MEDIUM~4 hours. Catches freshly-registered phishing domains that keyword lists miss. Legal liability reduction.
Implement payment activation polling (KV eventual consistency UX)
MEDIUM~3 hours. Users briefly see "Free" plan after upgrading due to KV propagation delay.
Publish 10 more blog posts + start internal linking strategy
MEDIUM25 total articles is the minimum for AdSense to see "content depth". Interlink related posts.
Set up weekly automated backup (cron + backup.sh)
MEDIUM~1 hour. See Config section. Schedule: crontab every Monday 9am.
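The payment-activation polling task above reduces to a bounded retry loop against the session endpoint. A sketch in which fetchMe and the plan field are assumptions about the frontend's API layer:

```javascript
// Poll until the backend reports the upgraded plan, giving KV time to
// propagate; give up after `attempts` tries so the UI can show a fallback.
async function waitForPlan(fetchMe, want = 'pro', attempts = 10, delayMs = 1000) {
  for (let i = 0; i < attempts; i++) {
    const me = await fetchMe();
    if (me.plan === want) return true;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return false; // e.g. "Payment received, plan activating; try re-login"
}
```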
🔵 Months 3–6 Feature Expansion
4 tasks · PRO feature growth · Team features begin
Custom domain support for PRO users (links.yoursite.com)
MEDIUMHighest-value PRO upsell. Requires Cloudflare custom domain API + Worker route per user. Complex but high leverage.
Webhook delivery for click events (Zapier/Make/n8n integration)
MEDIUMPRO feature. Send click data to user's CRM/analytics. High B2B value.
Team/workspace accounts (multi-user collaboration)
MEDIUMEnables B2B pricing tier. Organisations pay 3-5x individual users.
Folder/tag organisation + link rotation + scheduled activation
LOWPower-user features that increase PRO stickiness and reduce churn.
🟣 Months 6–12 Scale & Monetise
4 tasks · Infrastructure hardening · Growth flywheel
Durable Objects for atomic click counters (eliminate race condition at scale)
LOWOnly needed when links regularly see 100+ simultaneous clicks. Cloudflare Workers Paid plan required.
API v2 with public sandbox, rate-limited free tier, OpenAPI spec
MEDIUMDeveloper-led growth. Public API attracts integrators, drives word-of-mouth, enables partner ecosystem.
E2E test suite (Playwright): shorten → redirect, checkout, auth flows
MEDIUMPrevents regressions as codebase grows. Critical paths must never break silently.
Multi-region KV pre-warming for viral link bursts
LOWWhen a link goes viral, KV needs to be warm at all edges. Pre-warm popular links every 5 minutes.
🏗️ How Everything Connects
USER REQUEST FLOW
Browser → alshorty.com (Cloudflare Pages SPA)
Browser → alshorty.com/link/* → Worker → KV lookup → redirect page HTML
Browser → alshorty.com/api/* → Worker → D1/KV → JSON response
Browser → api.alshorty.com/* → Worker (via CNAME) → D1/KV → JSON
WORKER INTERNAL ROUTING (index.js)
Request → security checks → rate limit → route to handler:
  /api/auth/*     → routes/auth.js
  /api/shorten   → routes/links.js
  /api/links     → routes/links.js
  /api/bio       → routes/links.js
  /api/analytics → routes/analytics.js
  /api/payments/* → routes/payments.js
  /api/webhooks/* → routes/payments.js
  /__admin/*     → admin/admin-backend.js
  /link/* /s/* /go/* /run/* /bio/* → routes/redirect.js
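This dispatch is effectively a first-match prefix table. A sketch with abbreviated handler names (the real index.js passes request, env, and ctx into the module handlers listed above):

```javascript
// First matching prefix wins; order mirrors the routing table above.
const ROUTES = [
  ['/api/auth/', 'auth'],
  ['/api/shorten', 'links'],
  ['/api/links', 'links'],
  ['/api/bio', 'links'],
  ['/api/analytics', 'analytics'],
  ['/api/payments/', 'payments'],
  ['/api/webhooks/', 'payments'],
  ['/__admin/', 'admin'],
  ['/link/', 'redirect'],
  ['/s/', 'redirect'],
  ['/go/', 'redirect'],
  ['/run/', 'redirect'],
  ['/bio/', 'redirect'],
];

function resolveHandler(pathname) {
  const match = ROUTES.find(([prefix]) => pathname.startsWith(prefix));
  return match ? match[1] : 'notFound';
}
```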
SECURITY LAYER (applied to every request)
  security.js   → bot UA check, global rate limit (120/min), IP ban check, honeypot
  constants.js  → SOURCE OF TRUTH for all limits, block lists, pricing
  validation.js → URL validation, alias validation, domain type checks
  cors.js       → CORS headers, origin allowlist
📁 Worker File Structure
worker/
  wrangler.toml            ← config, KV/D1 bindings, routes
  package.json
  src/
    index.js               ← ROUTER + kill switch
    config/
      constants.js          ← ALL limits & block lists ← EDIT HERE
    routes/
      auth.js               ← magic link, Google, Microsoft
      links.js              ← create/edit/delete links, bio
      redirect.js           ← handles /link/, /s/, /go/, /run/
      analytics.js          ← click stats, export
      payments.js           ← Razorpay order + webhook
    utils/
      security.js           ← rate limiting, bot detection
      validation.js         ← URL + alias validation
      auth.js               ← session cookies, token utils
      cors.js               ← CORS headers, origin check
      response.js           ← ok(), badRequest(), notFound()
    admin/
      admin-backend.js      ← all /__admin/* endpoints
    templates/
      redirect-page.js      ← 5-second interstitial HTML
      bio-page.js           ← /bio/{slug} HTML
    db/
      schema.sql            ← D1 table definitions
📁 UI File Structure
ui/
  index.html               ← meta tags, GSC, AdSense script
  vite.config.ts
  public/
    _headers               ← Cloudflare Pages security headers
    _redirects             ← SPA catch-all (/* /index.html)
    sitemap.xml            ← auto-generated, do not edit manually
    robots.txt
    ads.txt                ← AdSense publisher ID goes here
  src/
    App.tsx                ← all routes defined here
    lib/
      api.ts                ← ALL API calls from frontend
      utils.ts              ← formatNumber, formatDate, etc.
    pages/
      Dashboard.tsx, Analytics.tsx
      Pricing.tsx, Billing.tsx
      Admin.tsx              ← admin panel
      LinkInBio.tsx, Bulk.tsx
      Privacy.tsx, Terms.tsx, DMCA.tsx
    components/
      shared/ShortenForm.tsx  ← main URL shortening form
      shared/ProRoute.tsx    ← PRO plan route guard
      layout/Header.tsx
      layout/Footer.tsx
    contexts/
      AuthContext.tsx        ← session state, user object
🔑 Key Variables — Where to Change What
What you want to change         | File              | Variable/Section          | Notes
Pricing (₹ amounts)             | constants.js      | PRICING.MONTHLY.IN        | Or Admin → Config panel (no redeploy)
Free plan limits                | constants.js      | LIMITS.FREE               | links_per_day, links_per_month
PRO plan limits                 | constants.js      | LIMITS.PRO                | links_per_day, links_per_month
Anon user link limit            | constants.js      | ANON_MAX_URLS             | Default: 2 links per 24h per IP
Blocked shortener domains       | constants.js      | BLOCKED_SHORTENER_DOMAINS | Add domains to block re-shortening
Blocked domains (adult/malware) | constants.js      | BLOCKED_DOMAINS           | IP grabbers, paste sites, adult content
Blocked TLDs                    | constants.js      | BLOCKED_TLDS              | .tk .ml .cf .ga .gq etc.
Phishing keywords               | constants.js      | SUSPICIOUS_KEYWORDS       | Crypto scams, credential phishing
Phishing regex patterns         | security.js       | PHISHING_PATTERNS         | Complex pattern matching
Global rate limit               | security.js       | _checkGlobalRate()        | 120 req/min general, 20 req/min auth
Allowed CORS origins            | constants.js      | ALLOWED_ORIGINS           | Add new domains here if adding subdomains
Production domain               | wrangler.toml     | FRONTEND_ORIGIN           | Must be https://alshorty.com — check before every deploy
Free link expiry time           | constants.js      | FREE_LINK_EXPIRY_MS       | How long anon links survive
Bot UA block list               | security.js       | BOT_UA_FRAGMENTS          | Scanner/exploit tool user agents
API base URL (frontend)         | CF Pages env vars | VITE_API_URL              | https://api.alshorty.com
AdSense publisher ID            | CF Pages env vars | VITE_ADSENSE_CLIENT_ID    | Set after AdSense approval
🚀 Deployment Commands — Every Scenario
WORKER DEPLOY
# Standard deploy (most common — do this after any worker code change)
cd worker
wrangler deploy
# Dry run first (always safe to do before a real deploy)
wrangler deploy --dry-run
# Watch live logs after deploying (tail the production worker)
wrangler tail
# Filter logs to just errors
wrangler tail --format pretty --status error
# Local dev against REAL production data (careful — uses prod KV/D1)
wrangler dev --remote
# Check what's deployed right now
wrangler deployments list
# Rollback to previous version
wrangler rollback
FRONTEND DEPLOY
# Normal deploy — just push to main and GitHub Actions deploys automatically
cd ui
git add -A
git commit -m "your message"
git push origin main
# GitHub Actions runs deploy-ui.yml → Cloudflare Pages builds + deploys
# Build locally to test before pushing
npm run build
npx serve dist  # preview at localhost:3000
# TypeScript check (catches errors before push)
npx tsc --noEmit
# Force redeploy without code changes (via Cloudflare dashboard)
# → Pages → alshorty-v3-frontend → Deployments → Retry deployment
SITEMAP REGENERATE
# Run from project root (not from ui/ or worker/)
node generate-sitemap.mjs
# This writes to ui/public/sitemap.xml — then commit and push
git add ui/public/sitemap.xml
git commit -m "regenerate sitemap"
git push
🗄️ Database Operations (D1)
# ── READ DATA ────────────────────────────────────────────────
# Count total users
wrangler d1 execute alshorty-db-prod --remote --command="SELECT COUNT(*) as total FROM users"
# List latest 10 users with plan
wrangler d1 execute alshorty-db-prod --remote --command="SELECT email, plan, created_at FROM users ORDER BY created_at DESC LIMIT 10"
# Count clicks today
wrangler d1 execute alshorty-db-prod --remote --command="SELECT COUNT(*) as clicks_today FROM clicks WHERE date(created_at/1000,'unixepoch')=date('now')"
# List all PRO users
wrangler d1 execute alshorty-db-prod --remote --command="SELECT email, created_at FROM users WHERE plan='pro' ORDER BY created_at DESC"
# Check specific user
wrangler d1 execute alshorty-db-prod --remote --command="SELECT * FROM users WHERE email='user@example.com'"
# ── UPDATE DATA ──────────────────────────────────────────────
# Manually upgrade user to PRO
wrangler d1 execute alshorty-db-prod --remote --command="UPDATE users SET plan='pro' WHERE email='user@example.com'"
# Downgrade user to free
wrangler d1 execute alshorty-db-prod --remote --command="UPDATE users SET plan='free' WHERE email='user@example.com'"
# Make user an admin
wrangler d1 execute alshorty-db-prod --remote --command="UPDATE users SET is_admin=1 WHERE email='your@email.com'"
# Disable user account
wrangler d1 execute alshorty-db-prod --remote --command="UPDATE users SET is_disabled=1 WHERE email='bad@actor.com'"
# ── SCHEMA ───────────────────────────────────────────────────
# List all tables
wrangler d1 execute alshorty-db-prod --remote --command="SELECT name FROM sqlite_master WHERE type='table'"
# Apply schema (fresh install only)
wrangler d1 execute alshorty-db-prod --remote --file=worker/src/db/schema.sql
# Export full DB backup
wrangler d1 export alshorty-db-prod --remote --output=backup-$(date +%Y%m%d).sql
🗃️ KV Operations
# List KV namespaces
wrangler kv namespace list
# Count all link keys in SHORTY_LINKS namespace
wrangler kv key list --namespace-id <SHORTY_LINKS_ID> | python3 -c "import json,sys; print(len(json.load(sys.stdin)), 'keys')"
# Read a specific link
wrangler kv key get "link:abc123" --namespace-id <SHORTY_LINKS_ID>
# Delete a specific link (by code)
wrangler kv key delete "link:abc123" --namespace-id <SHORTY_LINKS_ID>
# List all session keys (rate limit keys)
wrangler kv key list --namespace-id <SHORTY_LIMITS_ID> --prefix "ratelimit:"
# Clear an IP ban manually
wrangler kv key delete "ban:<IP_HASH>" --namespace-id <SHORTY_LIMITS_ID>
# KV Namespace IDs (from wrangler.toml):
# SHORTY_LINKS ← links, bio pages, bio_slug index
# SHORTY_SESSIONS ← active session tokens
# SHORTY_LIMITS ← rate limits, IP bans, anon counters
# SHORTY_CACHE ← temporary cache
# SHORTY_QUEUE ← background job queue
🔐 Secrets Management
# List all secret names (never shows values)
wrangler secret list
# Add or update a secret
wrangler secret put RAZORPAY_KEY_ID
# (you'll be prompted to type the value — it's hidden)
# Delete a secret
wrangler secret delete SECRET_NAME
# Secrets — 6 required, plus 2 optional Microsoft secrets:
ADMIN_SECRET          # /__admin/* access (generate with: openssl rand -hex 32)
RESEND_API_KEY        # magic link emails (from resend.com)
RAZORPAY_KEY_ID       # rzp_live_xxx (public key)
RAZORPAY_KEY_SECRET   # rzp secret (NEVER expose in frontend)
RAZORPAY_WEBHOOK_SECRET # webhook HMAC verification
GOOGLE_CLIENT_ID      # xxx.apps.googleusercontent.com
MICROSOFT_CLIENT_ID   # Azure app client ID (optional)
MICROSOFT_CLIENT_SECRET # Azure app secret (optional)
🌐 Cloudflare Pages Environment Variables
ℹ️
Set these at: Cloudflare Dashboard → Workers & Pages → alshorty-v3-frontend → Settings → Variables and Secrets → Production. After adding/changing, trigger a new deployment.
Variable                 | Value                                | Notes
VITE_API_URL             | https://api.alshorty.com             | Worker API endpoint
VITE_GOOGLE_CLIENT_ID    | Secret — your Google OAuth client ID | Required for Google sign-in button
VITE_RAZORPAY_KEY_ID     | rzp_live_xxx                         | Public key — safe in frontend
VITE_MICROSOFT_CLIENT_ID | Azure app client ID (optional)       | Leave empty if not using Microsoft auth
VITE_ADSENSE_CLIENT_ID   | ca-pub-YOURPUBID                     | Set AFTER AdSense approval. Leave blank until then.
PUBLIC_API_BASE_URL      | https://api.alshorty.com             | Used by some components directly
🧹 Routine Housekeeping
WEEKLY (every Monday ~15 min)
Check Razorpay webhook delivery history for failed events
HIGHRazorpay Dashboard → Settings → Webhooks → your endpoint → Recent Deliveries. Any failures = users paid but didn't get PRO.
Review Cloudflare Worker error rate + CPU time in dashboard
MEDIUMWorkers → alshorty-v3 → Metrics. Error rate should be <0.1%. CPU time should be <20ms avg.
Run database backup: wrangler d1 export alshorty-db-prod --remote
HIGHStore the .sql file in a safe location. This is your only recovery option if data is lost.
▼ Show Steps
📋 Full backup procedure
  1. cd worker && wrangler d1 export alshorty-db-prod --remote --output=backups/db-$(date +%Y%m%d).sql
  2. Also export KV keys: wrangler kv key list --namespace-id <SHORTY_LINKS_ID> > backups/kv-keys-$(date +%Y%m%d).json
  3. Upload both files to Google Drive or Dropbox for offsite storage.
  4. Keep the last 4 weeks of backups, then delete older ones.
⚠️ Cloudflare does NOT back up your D1 data automatically. If you delete the DB or corrupt it, only your manual backup can save you.
MONTHLY (first of each month ~30 min)
Review admin panel: users, links, payments — spot any anomalies
MEDIUMGo to alshorty.com/admin → Overview. Check: new user signups, PRO conversions, link creation volume, any suspicious spike in activity.
Check D1 query analytics for slow queries (>50ms)
MEDIUMCloudflare Dashboard → D1 → SHORTY_DB → Queries. Slow queries need indexes added.
Review and update block lists in constants.js if new abuse patterns discovered
LOWAdd newly-discovered phishing domains to BLOCKED_DOMAINS. Add new scam keywords to SUSPICIOUS_KEYWORDS.
Check Resend dashboard for email delivery failures
MEDIUMresend.com → Logs. Any failed magic link deliveries = users who couldn't sign in.
📋 Common Operational Tasks — Step by Step
How to manually grant PRO to a user
ADMIN TASKVia admin panel or direct D1 command.
▼ Show Steps
📋 Two ways to grant PRO
  1. Via Admin Panel (easiest): Go to alshorty.com/admin → Users → search for the user → click "Grant PRO".
  2. Via CLI: wrangler d1 execute alshorty-db-prod --remote --command="UPDATE users SET plan='pro' WHERE email='user@email.com'"
  3. Ask the user to log out and log back in to refresh their session with the new plan.
How to delete a malicious/spam link
ADMIN TASK
▼ Show Steps
📋 Delete a link
  1. Via Admin Panel: alshorty.com/admin → Links → find the link code → click Delete.
  2. Via CLI: wrangler kv key delete "link:CODE" --namespace-id <SHORTY_LINKS_ID>
  3. Also disable the user if it's part of a larger abuse pattern: Admin → Users → Disable account.
How to enable maintenance mode (kill switch)
ADMIN TASKDisables all public-facing features while keeping admin access.
▼ Show Steps
📋 Kill switch
  1. Via Admin Panel: alshorty.com/admin → System → "Enable Maintenance Mode" toggle.
  2. Via CLI: wrangler kv key put "system:kill_switch" "true" --namespace-id <SHORTY_LIMITS_ID>
  3. All public requests return 503. Admin endpoints (/__admin/*) still work.
  4. To disable, delete the key: wrangler kv key delete "system:kill_switch" --namespace-id <SHORTY_LIMITS_ID>
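Inside index.js the gate reduces to one KV read. A sketch using the binding and key names from the steps above; a null return means "not in maintenance, continue normal routing":

```javascript
// Returns a 503 Response while the kill switch is set; admin paths
// always pass, matching step 3 above.
async function maintenanceGate(request, env) {
  const path = new URL(request.url).pathname;
  if (path.startsWith('/__admin/')) return null;
  const killed = await env.SHORTY_LIMITS.get('system:kill_switch');
  if (killed === 'true') {
    return new Response('Maintenance in progress', { status: 503 });
  }
  return null;
}
```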
How to temporarily reduce pricing for testing
CONFIG TASKUse Admin panel — no code change or redeploy needed.
▼ Show Steps
📋 Change pricing
  1. Via Admin Panel: alshorty.com/admin → Config → Pricing section → change values → Save. Takes effect immediately, no redeploy.
  2. For Razorpay test mode: change the RAZORPAY_KEY_ID secret to a rzp_test_xxx key (and RAZORPAY_KEY_SECRET to the matching test secret), then redeploy the worker. Test cards work without charging real money.
  3. Minimum Razorpay order amount is ₹1 (100 paise). Set pricing.monthly.inr to 1 in Admin Config for testing.
⚠️ Remember to restore real pricing and switch back to live Razorpay keys before going live!
How to help a user who can't sign in (magic links only; there are no passwords to reset)
SUPPORT TASK
▼ Show Steps
📋 Help a user log in
  1. Ask the user to visit alshorty.com/auth and request a new magic link. Resend sends it automatically.
  2. If the email isn't arriving, check Resend dashboard → Logs → filter by their email address.
  3. If their account is disabled, re-enable it via Admin Panel → Users → find user → Enable.
How to add a new blog post to the site
CONTENT TASK
▼ Show Steps
📋 Publishing a blog post
  1. Open ui/src/lib/blog-data.ts. Add a new object to the array with: slug, title, excerpt, content, date, readTime, category, coverImage.
  2. Add a cover image at ui/public/blog-images/your-slug.webp (1200×630px, <80KB).
  3. Regenerate the sitemap: node generate-sitemap.mjs from the project root.
  4. Commit and push: git add -A && git commit -m "add blog: your title" && git push. Pages auto-deploys.
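The step-1 entry looks roughly like this; the field names come from the step itself, and every value is a placeholder:

```javascript
// Placeholder blog-data.ts entry: adjust every value per post.
const newPost = {
  slug: 'my-new-post',        // used in the post URL and sitemap
  title: 'My New Post',
  excerpt: 'One-sentence summary shown in listings.',
  content: 'Full article body…',
  date: '2025-01-01',
  readTime: '6 min',
  category: 'Guides',
  coverImage: '/blog-images/my-new-post.webp', // the file from step 2
};
```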
How to restore from a database backup
DISASTER RECOVERYOnly needed if database is corrupted or accidentally deleted.
▼ Show Steps
📋 Restore procedure
🚨 This overwrites the current database. Only do this in an emergency. Double-check the backup file is the right one.
  1. Get your latest backup .sql file from wherever you store backups.
  2. Enable maintenance mode first (kill switch) so no new writes happen during the restore.
  3. wrangler d1 execute alshorty-db-prod --remote --file=your-backup.sql
  4. Verify row counts: wrangler d1 execute alshorty-db-prod --remote --command="SELECT COUNT(*) FROM users; SELECT COUNT(*) FROM clicks"
  5. Disable maintenance mode. Test login, link creation, and redirects.
How to roll back the Worker to previous version
EMERGENCYIf a bad deploy breaks something, rollback takes 30 seconds.
▼ Show Steps
📋 Rollback procedure
  1. Fastest method: cd worker && wrangler rollback — rolls back to the previous deployment instantly.
  2. Specific version: wrangler deployments list → find the version ID → wrangler rollback <VERSION_ID>
  3. Via Dashboard: Cloudflare → Workers → alshorty-v3 → Deployments → find a good deployment → "Rollback to this version".
  4. Test immediately after rollback: curl https://api.alshorty.com/health
🩺 Health Check Commands — Run These After Every Deploy
# 1. Worker alive?
curl https://alshorty.com/health
# Expected: {"status":"ok","version":"3.0","ts":1234567890}
# 2. All KV namespaces + D1 green?
curl https://api.alshorty.com/api/status
# Expected: all bindings showing "ok"
# 3. Shortening works?
curl -X POST https://api.alshorty.com/api/shorten \
  -H "Content-Type: application/json" \
  -d '{"url":"https://google.com"}'
# Expected: {"ok":true,"code":"abc123","short_url":"https://alshorty.com/link/abc123"}
# 4. Redirect page showing?
curl -I https://alshorty.com/link/abc123
# Expected: HTTP/2 200 (the redirect page HTML, not a 302)
# 5. sitemap.xml accessible?
curl -I https://alshorty.com/sitemap.xml
# Expected: HTTP/2 200 with Content-Type: text/xml
# 6. Admin stats (replace YOUR_ADMIN_SECRET)
curl https://alshorty.com/__admin/stats \
  -H "Authorization: Bearer YOUR_ADMIN_SECRET"
# Expected: JSON with user counts, link counts, etc.
# 7. Watch live logs
wrangler tail --format pretty