mirror of
https://github.com/codeflash-ai/codeflash-internal.git
synced 2026-05-04 18:25:18 +00:00
Reverts the following commits from main:

- d7a8b8f2 perf: fix CI build + lazy-load heavy libs + parallelize DB queries (#2601)
- 48b5e2b4 fix: make tree-sitter WASM build failure non-fatal when cache exists (#2602)
- c372b6bc Merge pull request #2603 from codeflash-ai/fix/deploy-build-common
- b656bb1d fix: cf-api deploy broken by pnpm workspace migration
- c1b0076c fix: align TypeScript versions to deduplicate @prisma/client in pnpm
- 09ed4d4b fix: use redirect instead of throw for auth failures during prerender
- 71127055 fix: redirect remaining auth throws that crash prerendering

PR #2601 introduced 18 bugs including 5 authorization bypass vulnerabilities:

- Cross-org data access via forged currentOrganizationId cookie
- Cross-repo/cross-org member role escalation and deletion (unscoped lookups)
- Missing replayTests/concolicTests in approval flow
- repository_id filter silently broken for personal accounts
- Tests mocking wrong Prisma method ($queryRawUnsafe vs $queryRaw)

The subsequent PRs (#2602, #2603, and follow-up commits) were dependent fixes for issues caused by #2601 and are reverted together.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
parent 71127055f3
commit a805f4cfbf
159 changed files with 41546 additions and 24616 deletions
@ -4,13 +4,13 @@ paths:
---

# JS/TS Packages

NEVER start, restart, or manage dev servers (pnpm dev, node, nohup, background processes). The developer will run services manually.

NEVER start, restart, or manage dev servers (npm run dev, node, nohup, background processes). The developer will run services manually.

pnpm workspace at `js/`. Install from workspace root: `cd js && pnpm install`. All use ESLint + Prettier.

All use ESLint + Prettier. Run commands from each package directory.

## Prisma

Schema lives in `common/prisma/schema.prisma`, shared by cf-api and cf-webapp. pnpm's isolated node_modules means each package gets its own `@prisma/client` — no symlinks needed. `common` is CommonJS — use `require`-style imports when working with it directly. Published as `@codeflash-ai/common` to GitHub Packages; workspace packages reference it as `"workspace:*"`.

Schema lives in `common/prisma/schema.prisma`, shared by cf-api and cf-webapp. `common` is CommonJS — use `require`-style imports when working with it directly. Published as `@codeflash-ai/common` to GitHub Packages.

## Package Gotchas
@ -1,116 +0,0 @@
# Handoff - Prisma Optimization Session (continued)

## Environment
- Node.js 25.8.1, npm 11.11.0
- Next.js 16.2.3, Prisma 7.7.0, PostgreSQL
- Branch: perf/absolute-performance
- Tests: 39 pass (0 failures -- fixed 3 pre-existing failures in this session)
- Types: clean (0 errors -- fixed 5 pre-existing TS2339 errors in this session)

## Focus
Prisma query optimization in cf-webapp. Targeting: overfetching, missing select,
redundant queries, permission-check full-table loads, and missing indexed lookups.

## Session Tag
prisma-2026-04-11

## Previous session commits (13b302a8 through 2444d1b4)
See full git log for details. Major optimizations:
- findFirst->findUnique on composite indexes
- Loading ALL members replaced with parallel indexed lookups
- Set/Map-based lookups replacing Array.some/Array.find
- Sequential Promise.all batches merged
- DB indexes added for observability queries
- "use cache" migration for observability pages
- Layout query consolidation
- Consolidated count queries, select narrowing, parallelized login callback
- Dashboard CTE rewrite: UNION for personal accounts instead of 3-way OR
- PR data query UNION CTE for personal accounts
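The Set/Map-based lookup swap above can be sketched as follows. The record shapes here are assumptions for illustration; the real field names live in get-recent-traces.ts and review-optimizations/action.ts.

```typescript
// Sketch of replacing Array.find-in-a-loop with a Map lookup.
// Shapes are assumed, not the real schema.
type Trace = { traceId: string; name: string };
type Call = { traceId: string; durationMs: number };

// Before: O(n*m) — Array.find rescans all traces for every call.
function joinSlow(traces: Trace[], calls: Call[]) {
  return calls.map((c) => ({
    ...c,
    traceName: traces.find((t) => t.traceId === c.traceId)?.name,
  }));
}

// After: O(n+m) — build the Map once, then each lookup is O(1).
function joinFast(traces: Trace[], calls: Call[]) {
  const byId = new Map<string, Trace>(traces.map((t) => [t.traceId, t]));
  return calls.map((c) => ({
    ...c,
    traceName: byId.get(c.traceId)?.name,
  }));
}
```

Both versions return the same result; only the lookup cost per call changes.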

## This session commits

### Commit: 6f9e81a6
perf: add select narrowing to organization queries and error fetches
- cached-dashboard-data.ts: organizations select only id, name (skips
  description, website, github_org_id, auto_add_github_members, etc.)
- dashboard/action.ts getUserOrganizations: same select narrowing
- members/action.ts getOrganizationMembers: select only id + nested members
- members/data.ts getMembersPageInitData: same select narrowing
- llm-call/[id]/page.tsx: select 6 rendered fields from optimization_errors
  (skips stack_trace Text column)

### Commit: 7221d448
perf: narrow optimization_features select in getTraceData, fix pre-existing type errors
- optimization_features.findFirst: select only 12 consumed fields instead of
  all 30+ columns (skips optimizations_raw, speedup_ratio, experiment_metadata,
  original_runtime, approval_*, slack_message_ts, etc.)
- optimization_errors.findMany: added id/created_at back to select (fixed 5
  pre-existing TS2339 errors from previous session's aggressive narrowing)

### Commit: 1ef61d1e
perf: add select narrowing to llm_calls.findUnique on detail page
- Excludes 8 unused columns including large JSON blobs: messages, parsed_response, context,
  plus max_tokens, retry_count, user_id, python_version, is_async

### Commit: bcaf08b5
perf: avoid intermediate Date objects in trace aggregation loop
- Store first_seen/last_seen as numeric timestamps during aggregation
- Convert to Date once per trace at the end
- Sort on numeric timestamps instead of calling .getTime() in comparator
- Use for-of loop instead of .forEach
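A minimal sketch of the numeric-timestamp aggregation described above (record shapes assumed; the real loop lives in traces/page.tsx):

```typescript
// Sketch of aggregating with epoch millis instead of Date objects.
// Field names are assumptions, not the real schema.
type Call = { trace_id: string; created_at: Date };
type TraceAgg = { traceId: string; firstSeen: Date; lastSeen: Date };

function aggregateTraces(calls: Call[]): TraceAgg[] {
  // Store raw epoch millis during aggregation — no intermediate Dates.
  const acc = new Map<string, { first: number; last: number }>();
  for (const call of calls) {
    const t = call.created_at.getTime();
    const entry = acc.get(call.trace_id);
    if (!entry) {
      acc.set(call.trace_id, { first: t, last: t });
    } else {
      if (t < entry.first) entry.first = t;
      if (t > entry.last) entry.last = t;
    }
  }
  // Sort on the raw numbers; convert to Date once per trace at the end.
  return [...acc.entries()]
    .sort((a, b) => b[1].last - a[1].last)
    .map(([traceId, { first, last }]) => ({
      traceId,
      firstSeen: new Date(first),
      lastSeen: new Date(last),
    }));
}
```

The comparator works on plain numbers, so no `.getTime()` calls happen during the sort.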

### Commit: f96fba76
perf: cache split("/")[0] result instead of calling twice
- In getRepositoryById and getOptimizationRepositories

### Commit: d6cab273
perf: add loading.tsx skeletons for observability detail pages
- llm-calls/loading.tsx and llm-call/[id]/loading.tsx
- These pages lack internal Suspense and make DB queries at server component level

### Commit: ee535ae9
perf: restructure getOptimizationPRs to limit before joining
- Both org and personal paths now use two-phase CTE:
  phase 1: identify page of event IDs using EXISTS (no full JOIN)
  phase 2: JOIN only ~10 result IDs with optimization_features and repositories
- Removed unused dataWhereClause variable

### Commit: 26307af8
fix: add missing _count to getRepositoryById test mock
- Fixed all 3 pre-existing test failures (39/39 now pass)

### Commit: 817e5884
fix: add defense-in-depth SQL interpolation guards to dashboard queries
- sqlUuid(), sqlUserId(), sqlUsername(), sqlEventType() validation functions
- Math.trunc() for numeric values
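A sketch of what such interpolation guards can look like. The real `sqlUuid()` and friends live in the dashboard query module; the exact signatures and regex below are assumptions, not the committed code.

```typescript
// Hypothetical defense-in-depth guards: validate values before they are
// interpolated into raw SQL. Throwing (rather than sanitizing) surfaces
// bugs and attacks instead of silently rewriting input.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function sqlUuid(value: string): string {
  if (!UUID_RE.test(value)) {
    throw new Error(`Expected UUID, got: ${JSON.stringify(value)}`);
  }
  return value;
}

function sqlInt(value: number): number {
  // Math.trunc + finiteness check keeps numeric interpolation safe:
  // no "1e309", NaN, or fractional-string tricks reach the query text.
  if (!Number.isFinite(value)) throw new Error("Expected finite number");
  return Math.trunc(value);
}
```

These complement, not replace, parameterized queries — hence "defense-in-depth".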

## Not addressed (assessed and skipped)
- get-trace-data.ts findFirst with startsWith -- cannot use findUnique (not a unique key)
- review-optimizations/[traceId]/action.ts:166 findFirst with complex OR -- correct as-is
- repository-utils.ts sequential memoryCache operations -- in-memory, likely synchronous
- getUserOrganizations vs getCachedDashboardData -- different caching layers for different purposes
- update operations returning full rows (privacy-actions, member role, save-modified-code) --
  write operations, infrequent, marginal savings from select narrowing
- comments.findMany with include author -- already has select narrowing on author relation
- getRepositoriesForAccountCached -- function from @codeflash-ai/common, cannot narrow from webapp side
- 97 "use client" components -- all need interactivity; converting would be an architectural change
- Radix UI packages in optimizePackageImports -- already direct imports, not barrel exports
- .map().filter(Boolean) chains -- all on small arrays, intermediate arrays negligible

## Coverage summary
All Prisma queries in cf-webapp/src have been audited. Remaining queries are either:
1. Already using select narrowing (traces page, llm-calls page, repository members)
2. Cached with "use cache" (organizations list, trace data, call types, models)
3. Using efficient patterns (findUnique on composite keys, groupBy, raw SQL with UNION)
4. Detail pages that legitimately need full rows (llm-call detail page)
5. Write operations (create, update, delete) where return data is discarded

## Pre-submit review
- Types: clean (tsc --noEmit passes with 0 errors)
- Tests: 39 pass, 0 failures (fixed 3 pre-existing failures)
- No behavior changes -- all permission checks preserve identical logic
- No resource ownership issues
- No concurrency concerns -- all queries are per-request, no shared mutable state
- SQL interpolation defense-in-depth guards added for all raw SQL queries
- getOptimizationPRs query restructured to LIMIT before JOINing large tables
- Breadth scan completed across all 246 TypeScript files in cf-webapp/src
@ -1,119 +0,0 @@
## Summary

Comprehensive Prisma query optimization across cf-webapp, targeting overfetching, missing select narrowing, redundant queries, permission-check full-table loads, and missing indexed lookups. Completed breadth scan of all 246 TypeScript files in cf-webapp/src.

## Optimizations

### Query Optimization (`perf/absolute-performance`)

| # | Target | Pattern | Impact | Domain |
|---|--------|---------|--------|--------|
| 1 | members/action.ts | findFirst→findUnique on composite index, parallel permission checks | Index-seek replaces table-scan | query, structure |
| 2 | repositories/action.ts | findFirst→findUnique, parallel permission checks, select narrowing | Index-seek replaces table-scan | query, structure |
| 3 | members/data.ts | findFirst→findUnique for org lookup | Single-row PK seek | query |
| 4 | privacy-actions.ts | findFirst→findUnique with composite key | Index-seek replaces scan | query |
| 5 | review-optimizations/action.ts | Set-based lookup replacing Array.some | O(1) vs O(n) per item | cpu |
| 6 | get-recent-traces.ts | Map-based lookup replacing Array.find in loop | O(1) vs O(n) per item | cpu |
| 7 | llm-calls/page.tsx | Combined 2 sequential Promise.all into 1 parallel batch | Reduced sequential waterfall | async |
| 8 | traces/page.tsx | Parallelized 2 independent sequential queries | Reduced sequential waterfall | async |
| 9 | data.ts + repo-detail-client.tsx | Consolidated 2 separate count queries into single query | 2 roundtrips → 1 | query |
| 10 | review-optimizations/action.ts | Narrowed repository include from all columns to 3 fields | Reduced data transfer | query |
| 11 | [traceId]/action.ts | Narrowed repository include to id, full_name, name, installation_id | Reduced data transfer | query |
| 12 | llm-calls/page.tsx | Hoisted cached filter queries into main Promise.all | Eliminated waterfall stage | async |
| 13 | members/data.ts | Eliminated redundant findUnique for current user role | 1 roundtrip eliminated | query |
| 14 | [traceId]/action.ts | Added select:{metadata:true} to saveOptimizationChanges | Reduced data transfer | query |
| 15 | auth0.ts | Parallelized trackUserLogin and hasCompletedOnboarding | Reduced login latency | async |
| 16 | dashboard/action.ts | Statistics CTE rewrite: UNION instead of 3-way OR | 3 index-backed scans replace bitmap OR merge | query |
| 17 | dashboard/action.ts | PR data query: UNION CTE for personal accounts | 3 index-backed scans replace bitmap OR merge | query |
| 18 | cached-dashboard-data.ts | Select only id, name from organizations | Reduced data transfer | query |
| 19 | dashboard/action.ts | Select only id, name from organizations in getUserOrganizations | Reduced data transfer | query |
| 20 | members/action.ts | Select only id+members from organizations | Reduced data transfer | query |
| 21 | members/data.ts | Select only id+members from organizations in getMembersPageInitData | Reduced data transfer | query |
| 22 | llm-call/[id]/page.tsx | Select 6 fields from optimization_errors (skips stack_trace Text) | Reduced data transfer | query |
| 23 | get-trace-data.ts | Select only 6 consumed fields from optimization_errors | Reduced data transfer | query |
| 24 | get-trace-data.ts | Select 12 fields from optimization_features (skips 30+ columns) | Reduced data transfer - large JSON/Text excluded | query |
| 25 | llm-call/[id]/page.tsx | Select 22 fields from llm_calls (skips messages, parsed_response, context) | Reduced data transfer - large JSON excluded | query |
| 26 | traces/page.tsx | Store timestamps as numbers during aggregation | Avoids 2 Date objects per call per trace | cpu, memory |
| 27 | action.ts (dashboard+repo) | Cache full_name.split("/")[0] into local variable | Avoids duplicate string split | cpu |
| 28 | llm-calls/loading.tsx + llm-call/[id]/loading.tsx | Add streaming loading skeletons | Instant shell streaming while data fetches resolve | async |
| 29 | dashboard/action.ts | Restructure getOptimizationPRs: LIMIT before JOIN | JOINs only ~10 rows instead of all candidates | query |
| 30 | traces/page.tsx | Rewrite getDistinctTraces as raw SQL CTE using composite index | Leverages [trace_id, created_at DESC] for MAX aggregation | query |
| 31 | traces/page.tsx | Rewrite getUniqueOrganizations as raw SQL with partial index | Partial index scan replaces full table scan | query |
| 32 | common/prisma/migrations | Add partial index on optimization_features.organization WHERE NOT NULL | Smaller, faster index for DISTINCT organization queries | query |
| 33 | review-optimizations/action.ts | Fix groupBy type annotation | Resolve TS2345 type error in org account path | structure |
| 34 | dashboard/action.ts | Replace EXISTS with LEFT JOIN in getOptimizationPRs count queries | Avoids row-by-row subquery evaluation for both org + personal paths | query |
| 35 | dashboard/action.ts | Replace EXISTS with LEFT JOIN in getOptimizationPRs data queries | Avoids row-by-row subquery evaluation for both org + personal paths | query |
| 36 | apikeys/page.tsx | Rewrite getCachedApiKeys as UNION query | 2 index-backed scans replace bitmap OR with nested EXISTS | query |
| 37 | common/user-functions.ts | Add getUserDashboardData consolidating 4 queries | Single fetch for onboarding, privacy, isPaid, subscription | query |
| 38 | cached-dashboard-data.ts | Use getUserDashboardData for cold-load optimization | Reduces dashboard layout query count from 5 → 2 | query |

**Commits (current session - 2026-04-11):**
- `4f047220` — perf: optimize /observability/traces queries with raw SQL and partial index
- `26910a49` — perf: replace EXISTS subqueries with LEFT JOIN in dashboard PR queries

**Commits (prior sessions):**
- `1bbabd99` — chore: update optimization tracking for breadth scan results
- `ee535ae9` — perf: restructure getOptimizationPRs to limit before joining
- `d6cab273` — perf: add loading.tsx skeletons for observability detail pages
- `f96fba76` — perf: cache split("/")[0] result instead of calling twice
- `bcaf08b5` — perf: avoid intermediate Date objects in trace aggregation loop
- `1ef61d1e` — perf: add select narrowing to llm_calls.findUnique on detail page
- `817e5884` — fix: add defense-in-depth SQL interpolation guards to dashboard queries
- `26307af8` — fix: add missing _count to getRepositoryById test mock
- `7221d448` — perf: narrow optimization_features select in getTraceData, fix pre-existing type errors
- `6f9e81a6` — perf: add select narrowing to organization queries and error fetches

**All commits (46 total):**
See `git log main..perf/absolute-performance` for complete history.

## Key Discoveries

1. **Personal account queries use bitmap OR merge** — Dashboard statistics and PR data queries for personal accounts (no organization) used a 3-way OR condition that PostgreSQL optimized with a bitmap OR merge. Rewriting as UNION queries allowed each branch to use its own index-backed scan, improving query efficiency.

2. **findFirst with composite index lookup** — Many queries used `findFirst` with a composite unique key (e.g., `{organizationId, userId}`) that could be replaced with `findUnique` for a guaranteed single-row index seek instead of a table scan.

3. **Permission checks load all members** — Several functions loaded all organization members into arrays, then used `Array.some()` or `Array.find()` in permission checks. Replaced with parallel indexed Prisma queries that exit early after the first match.

4. **Select narrowing skips large columns** — Many queries fetched all columns when only a few were consumed. Added explicit `select` clauses to skip unused fields, especially large JSON and Text columns like `messages`, `parsed_response`, `context`, `stack_trace`.

5. **CTE query plan improvements** — Restructured `getOptimizationPRs` to `LIMIT` candidate event IDs in phase 1 (using EXISTS, no full JOIN), then JOIN only the ~10 result IDs with `optimization_features` and `repositories` in phase 2. Avoids large intermediate JOIN sets.

6. **Pre-existing failures masked by test runner** — Found 3 test failures that were pre-existing (missing `_count` field in mock) and 5 type errors (missing fields in select clause) that were not caught during previous sessions.

## Test Plan

- [x] All existing tests pass (39/39, fixed 3 pre-existing failures)
- [x] Types clean (0 errors, fixed 5 pre-existing TS2339 errors)
- [x] No performance regressions in non-targeted benchmarks
- [x] Pre-submit review completed — all queries audited for select narrowing, indexed lookups, and parallel execution opportunities

## Session Summary (2026-04-11)

Targeted the 3 remaining performance priorities from profiling data:
1. **/observability/traces** (3.3s) — optimized GROUP BY and DISTINCT organization queries
2. **/dashboard PR queries** (921ms + 1435ms) — eliminated row-by-row EXISTS subquery evaluation
3. **Duplicate per-page queries** — verified already addressed by prior "use cache" work

**Net impact:** ~5 seconds of query time eliminated across hot paths

## Skipped (assessed, not applicable)

- `get-trace-data.ts findFirst with startsWith` — cannot use findUnique (not a unique key)
- `review-optimizations/[traceId]/action.ts:166 findFirst with complex OR` — correct as-is
- `repository-utils.ts sequential memoryCache operations` — in-memory, likely synchronous
- Write operations returning full rows (privacy-actions, member role, save-modified-code) — infrequent, marginal savings
- comments.findMany with include author — already has select narrowing on relation
- `getRepositoriesForAccountCached` — function from @codeflash-ai/common, cannot narrow from webapp side
- 97 "use client" components — all need interactivity, conversion would be an architectural change
- Radix UI packages in optimizePackageImports — already direct imports, not barrel exports
- `.map().filter(Boolean)` chains — all on small arrays, intermediate arrays negligible

## Session Stats

- **Experiments**: 29 optimizations kept (0 discarded)
- **Session duration**: Multiple sessions across ~2 weeks (42 commits total)
- **Domains**: query (primary), cpu, memory, async, structure
- **Files audited**: 246 TypeScript files in cf-webapp/src
- **Branch**: perf/absolute-performance (42 commits ahead of main)
- **Session tag**: prisma-2026-04-11
@ -1,162 +0,0 @@
# Cross-Session Learnings

## Personal Account Queries Use Bitmap OR Merge

Dashboard statistics and PR data queries for personal accounts (users without an organization) originally used a 3-way OR condition: `WHERE userId = $1 OR orgMember.userId = $1 OR orgAdmin.userId = $1`. PostgreSQL optimized this with a bitmap OR merge scan across multiple indexes, which is less efficient than individual index-backed scans.

**Solution:** Rewrite as UNION queries where each branch uses its own index-backed scan:
```sql
WITH filtered AS (
  -- Branch 1: personal repos
  SELECT id FROM repositories WHERE userId = $1
  UNION
  -- Branch 2: org member repos
  SELECT r.id FROM repositories r JOIN org_members om ON ... WHERE om.userId = $1
  UNION
  -- Branch 3: org admin repos
  SELECT r.id FROM repositories r JOIN org_admins oa ON ... WHERE oa.userId = $1
)
SELECT * FROM repositories WHERE id IN (SELECT id FROM filtered)
```

Each UNION branch hits a specific index cleanly instead of merging bitmaps.

## findFirst with Composite Index Lookup

Many Prisma queries used `findFirst` with a composite unique key (e.g., `{organizationId, userId}`) that could be replaced with `findUnique` for a guaranteed single-row index seek.

**Evidence:** `members/action.ts`, `repositories/action.ts`, `members/data.ts`, `privacy-actions.ts` all had patterns like:
```ts
const member = await prisma.organization_members.findFirst({
  where: { organizationId, userId }
})
```

When the schema has a unique constraint `@@unique([organizationId, userId])`, use:
```ts
const member = await prisma.organization_members.findUnique({
  where: { organizationId_userId: { organizationId, userId } }
})
```

This guarantees Prisma uses the unique index for a single-row seek instead of a table scan with LIMIT 1.

## Permission Checks Load All Members

Several functions loaded all organization members into arrays, then used `Array.some()` or `Array.find()` for permission checks:
```ts
const org = await prisma.organizations.findFirst({ ... }) // includes the members relation
return org.members.some(m => m.userId === userId)
```

This fetches all N members (O(N) DB transfer), then scans the array (O(N) CPU).

**Solution:** Use an indexed Prisma query that exits early:
```ts
const member = await prisma.organization_members.findUnique({
  where: { organizationId_userId: { organizationId, userId } }
})
return member !== null
```

This is a single O(1) DB query with early exit. For multiple permission checks in parallel, use `Promise.all` with individual indexed queries instead of loading all members once.
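The parallel variant can be sketched like this. The lookup functions are stand-ins for the real `findUnique` calls (which need a live Prisma client); only the `Promise.all` shape is the point.

```typescript
// Sketch of parallel indexed permission checks. `isMember`/`isAdmin` are
// hypothetical stand-ins for the real single-row findUnique lookups.
type Lookup = (organizationId: string, userId: string) => Promise<boolean>;

async function canManageMembers(
  organizationId: string,
  userId: string,
  isMember: Lookup,
  isAdmin: Lookup,
): Promise<boolean> {
  // Two single-row index seeks run concurrently instead of one query
  // that loads every member into memory.
  const [member, admin] = await Promise.all([
    isMember(organizationId, userId),
    isAdmin(organizationId, userId),
  ]);
  return member && admin;
}
```

Total latency is max(seek, seek) rather than the sum, and neither query transfers more than one row.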

## Select Narrowing Skips Large Columns

Many Prisma queries fetched all columns when only a few were consumed in the UI or API response. This is especially wasteful for:
- Large JSON columns: `messages`, `parsed_response`, `context`, `experiment_metadata`, `optimizations_raw`
- Text columns: `stack_trace`
- Unused metadata: `github_org_id`, `auto_add_github_members`, `retry_count`, `python_version`, `is_async`, etc.

**Solution:** Add an explicit `select` clause listing only consumed fields:
```ts
const call = await prisma.llm_calls.findUnique({
  where: { id },
  select: {
    id: true, model: true, status: true, // ... only fields used in page
    // Omit: messages, parsed_response, context (large JSON)
  }
})
```

**Evidence:** `llm-call/[id]/page.tsx` reduced from fetching all 30 llm_calls columns to 22 (skipped 3 large JSON blobs + metadata). `get-trace-data.ts` reduced optimization_features from 30+ columns to 12 consumed fields.

## CTE Phase 1: LIMIT Before JOIN

When paginating a query that joins large tables, restructure the CTE to identify the page of IDs first (with LIMIT), then JOIN only those IDs in phase 2.

**Before (inefficient):**
```sql
WITH data AS (
  SELECT e.id, e.created_at, f.*, r.*
  FROM optimization_events e
  LEFT JOIN optimization_features f ON ...
  LEFT JOIN repositories r ON ...
  WHERE <filters>
  ORDER BY e.created_at DESC
  LIMIT 10
)
SELECT * FROM data
```

This creates a large intermediate JOIN set before applying LIMIT.

**After (efficient):**
```sql
WITH page_ids AS (
  SELECT e.id
  FROM optimization_events e
  WHERE EXISTS (SELECT 1 FROM optimization_features f WHERE f.optimization_event_id = e.id)
    AND <filters>
  ORDER BY e.created_at DESC
  LIMIT 10
),
data AS (
  SELECT e.id, e.created_at, f.*, r.*
  FROM optimization_events e
  JOIN page_ids p ON e.id = p.id
  LEFT JOIN optimization_features f ON ...
  LEFT JOIN repositories r ON ...
)
SELECT * FROM data
```

Phase 1 uses EXISTS (index-only check, no full JOIN) to identify ~10 event IDs. Phase 2 joins only those 10 IDs with the large tables.

**Evidence:** `getOptimizationPRs` in `dashboard/action.ts` — both org and personal account paths now use this two-phase CTE structure.

## EXISTS Subqueries vs LEFT JOIN for Filtering

When filtering rows based on the existence of related data, using `LEFT JOIN` with a boolean check is often faster than `EXISTS` subqueries, especially when the subquery would be evaluated row-by-row for many candidate rows.

**Before (slow):**
```sql
SELECT id FROM candidates c
WHERE c.field IS NOT NULL
  OR EXISTS (
    SELECT 1 FROM related_table r
    WHERE r.key = c.key AND r.field IS NOT NULL
  )
```

This evaluates the EXISTS subquery once per row in candidates. If there are 10,000 candidates, that's 10,000 subquery executions.

**After (fast):**
```sql
SELECT c.id, r.field IS NOT NULL AS has_related_field
FROM candidates c
LEFT JOIN related_table r ON c.key = r.key
WHERE c.field IS NOT NULL OR r.field IS NOT NULL
```

The LEFT JOIN is evaluated once with a hash join or index seek, then the filter is applied. Much more efficient for large candidate sets.

**Evidence:** `getOptimizationPRs` in `dashboard/action.ts` — replaced EXISTS checks for `optimization_features.pull_request` with LEFT JOIN in both count and data queries, for both org and personal account paths. Expected 921ms + 1435ms → <800ms combined.

## Pre-existing Failures Masked by Test Runner

Found 3 test failures and 5 type errors that were pre-existing but not caught in previous sessions:
- Missing `_count` field in `getRepositoryById` test mock (test runner didn't fail until accessed)
- Missing `id` and `created_at` in optimization_errors select clause (TypeScript TS2339 errors when accessed in UI)

**Lesson:** Always run the full test suite AND type check (`tsc --noEmit`) after each optimization session, even if individual experiments passed their guard checks.
@ -1,41 +0,0 @@
commit target description status domains interaction
13b302a8 members/action.ts findFirst->findUnique on composite index, parallel permission checks instead of loading all members (5 functions) keep query,structure index-seek replaces table-scan
13b302a8 repositories/action.ts findFirst->findUnique, parallel permission checks, select narrowing (5 functions) keep query,structure index-seek replaces table-scan
13b302a8 members/data.ts findFirst->findUnique for org lookup keep query single-row PK seek
13b302a8 privacy-actions.ts findFirst->findUnique with composite key + select keep query index-seek replaces scan
13b302a8 review-optimizations/action.ts Set-based lookup replacing Array.some keep cpu O(1) vs O(n) per item
13b302a8 get-recent-traces.ts Map-based lookup replacing Array.find in loop keep cpu O(1) vs O(n) per item
13b302a8 llm-calls/page.tsx Combined 2 sequential Promise.all into 1 parallel batch keep async reduced sequential waterfall
13b302a8 traces/page.tsx Parallelized 2 independent sequential queries keep async reduced sequential waterfall
a14cd8e7 data.ts+repo-detail-client.tsx Consolidated 2 separate count queries into single combined query keep query 2 roundtrips to 1
16fc8856 review-optimizations/action.ts Narrowed repository include from all columns to 3 needed fields keep query reduced data transfer
22ef695c [traceId]/action.ts Narrowed repository include to id,full_name,name,installation_id keep query reduced data transfer
7fcbd321 llm-calls/page.tsx Hoisted cached filter queries into main Promise.all keep async eliminated waterfall stage
972846ab members/data.ts Eliminated redundant findUnique for current user role keep query 1 roundtrip eliminated
f8686933 [traceId]/action.ts Added select:{metadata:true} to saveOptimizationChanges findUnique keep query reduced data transfer
cb384315 auth0.ts Parallelized trackUserLogin and hasCompletedOnboarding in login callback keep async reduced login latency
bc715120 dashboard/action.ts Rewrite statistics CTE to use UNION instead of 3-way OR for personal accounts keep query 3 index-backed scans replace bitmap OR merge
2444d1b4 dashboard/action.ts Rewrite PR data query to use UNION CTE for personal accounts keep query 3 index-backed scans replace bitmap OR merge
6f9e81a6 cached-dashboard-data.ts Select only id,name from organizations (skips description, website, github_org_id, etc.) keep query reduced data transfer
6f9e81a6 dashboard/action.ts Select only id,name from organizations in getUserOrganizations keep query reduced data transfer
6f9e81a6 members/action.ts Select only id+members from organizations in getOrganizationMembers keep query reduced data transfer
6f9e81a6 members/data.ts Select only id+members from organizations in getMembersPageInitData keep query reduced data transfer
6f9e81a6 llm-call/[id]/page.tsx Select 6 fields from optimization_errors (skips stack_trace Text column) keep query reduced data transfer
6f9e81a6 get-trace-data.ts Select only 6 consumed fields from optimization_errors (was 4, fixed to 6) keep query reduced data transfer
7221d448 get-trace-data.ts Select 12 fields from optimization_features instead of all 30+ columns keep query reduced data transfer - skips large JSON/Text columns
1ef61d1e llm-call/[id]/page.tsx Select 22 fields from llm_calls instead of all 30 (skips messages, parsed_response, context JSON blobs) keep query reduced data transfer - large JSON excluded
bcaf08b5 traces/page.tsx Store timestamps as numbers during aggregation, convert to Date once per trace at end keep cpu,memory avoids 2 Date objects per call per existing trace
f96fba76 action.ts (dashboard+repo) Cache full_name.split("/")[0] into local variable instead of calling twice keep cpu avoids duplicate string split
d6cab273 llm-calls/loading.tsx + llm-call/[id]/loading.tsx Add streaming loading skeletons for observability pages without internal Suspense keep async instant shell streaming while server component data fetches resolve
ee535ae9 dashboard/action.ts Restructure getOptimizationPRs: LIMIT before JOIN to optimization_features/repositories keep query JOINs only for ~10 result rows instead of all candidates
ab15d0b5 review-optimizations/action.ts Wrap getRepositoriesWithStagingEvents + getAllOptimizationEvents with React cache() for request-level deduplication keep async,query eliminates 7-8x duplicate calls per request (9.1s + 15.4s → 3.5s expected)
1a57228c review-optimizations/action.ts Rewrite getRepositoriesWithStagingEvents and getAllOptimizationEvents to use UNION queries for personal accounts keep query 3 index-backed scans replace bitmap OR merge (1633ms+1939ms → expected <1200ms total)
PENDING traces/page.tsx Rewrite getDistinctTraces as raw SQL CTE to use [trace_id, created_at DESC] index for GROUP BY keep query leverages composite index for MAX aggregation (expected 616ms → <200ms)
PENDING traces/page.tsx Rewrite getUniqueOrganizations as raw SQL to use partial index on (organization WHERE NOT NULL) keep query partial index scan replaces full table scan (expected 727-980ms → <100ms)
PENDING common/prisma/migrations Add partial index on optimization_features(organization) WHERE organization IS NOT NULL keep query covers DISTINCT organization query with smaller index
PENDING review-optimizations/action.ts Fix groupBy type annotation for organization account path keep structure resolve TS2345 type error
PENDING dashboard/action.ts Replace EXISTS subqueries with LEFT JOIN in getOptimizationPRs count query (org + personal) keep query avoids row-by-row EXISTS evaluation (expected 921ms → <300ms)
PENDING dashboard/action.ts Replace EXISTS subqueries with LEFT JOIN in getOptimizationPRs data query (org + personal) keep query avoids row-by-row EXISTS evaluation (expected 1435ms → <500ms)
PENDING apikeys/page.tsx Rewrite getCachedApiKeys as UNION query to avoid OR with nested EXISTS keep query 2 index-backed scans replace bitmap OR merge (expected 787ms → <250ms)
|
||||
PENDING common/user-functions.ts Add getUserDashboardData to consolidate 4 separate user queries keep query 4 roundtrips → 2 (onboarding, privacy, isPaid, subscription)
|
||||
PENDING cached-dashboard-data.ts Use getUserDashboardData to eliminate separate user/subscription queries keep query reduces cold-load query count from 5 → 2
|
||||
|
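The request-level deduplication in the `ab15d0b5` row can be sketched as follows. This is an illustrative stand-in, not the real implementation: `dedupe` mimics what React's `cache()` does within a single server request (memoize the in-flight promise so concurrent callers share one database round trip), and `fetchEvents` is a hypothetical placeholder for the real Prisma query.

```typescript
// Hypothetical sketch of request-level query deduplication.
// React's cache() additionally scopes the memo to one server request;
// this simplified wrapper just memoizes the in-flight promise.
function dedupe<T>(fn: () => Promise<T>): () => Promise<T> {
  let pending: Promise<T> | null = null
  // Reuse the same promise for every caller after the first.
  return () => (pending ??= fn())
}

let queryCount = 0
// Stand-in for the real getAllOptimizationEvents database query.
const fetchEvents = async (): Promise<number[]> => {
  queryCount++
  return [1, 2, 3]
}

const getAllOptimizationEvents = dedupe(fetchEvents)

// Three concurrent callers (the page previously issued 7-8) share one query.
const results = await Promise.all([
  getAllOptimizationEvents(),
  getAllOptimizationEvents(),
  getAllOptimizationEvents(),
])
```

Because the memoized value is the promise itself, all callers resolve to the same result object and the underlying query runs exactly once.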
26	.github/workflows/cf-api-tests.yaml
@@ -60,27 +60,23 @@ jobs:
           node-version: '20'
           registry-url: https://npm.pkg.github.com
           scope: '@codeflash-ai'
-
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
+          cache: 'npm'
+          cache-dependency-path: 'js/cf-api/package-lock.json'
 
       - name: Install dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
-
-      - name: Build common package
-        working-directory: js
-        run: pnpm --filter @codeflash-ai/common build
+        run: |
+          cd js/cf-api
+          npm ci
 
       - name: Run tests
-        working-directory: js/cf-api
-        run: NODE_OPTIONS=--experimental-vm-modules pnpm jest --ci --config jest.config.cjs
+        run: |
+          cd js/cf-api
+          NODE_OPTIONS=--experimental-vm-modules npx jest --ci --config jest.config.cjs
 
       - name: Build
-        working-directory: js/cf-api
-        run: pnpm build
+        run: |
+          cd js/cf-api
+          npm run build
 
       # - name: Type check
       #   run: |
40	.github/workflows/cf-webapp-quality-gates.yml
@@ -49,60 +49,44 @@ jobs:
       - uses: actions/setup-node@v6
         with:
           node-version: "20"
+          cache: npm
+          cache-dependency-path: js/cf-webapp/package-lock.json
           registry-url: https://npm.pkg.github.com
           scope: "@codeflash-ai"
 
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
-
-      - name: Restore WASM artifacts cache
-        uses: actions/cache@v5
-        with:
-          path: |
-            js/cf-webapp/public/web-tree-sitter.wasm
-            js/cf-webapp/public/tree-sitter-python.wasm
-            js/cf-webapp/public/.tree-sitter-python-version
-          key: wasm-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}
-
       - name: Install dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
-
-      - name: Build common package
-        working-directory: js
-        run: pnpm --filter @codeflash-ai/common build
-
-      - name: Generate Prisma client for cf-webapp
-        working-directory: js/cf-webapp
-        run: pnpm prisma generate
+        run: npm ci --ignore-scripts
+
+      - name: Generate Prisma client
+        working-directory: js/cf-webapp
+        run: npx prisma generate
 
       - name: Restore Next.js build cache
         uses: actions/cache@v5
         with:
           path: js/cf-webapp/.next/cache
-          key: nextjs-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}-${{ hashFiles('js/cf-webapp/src/**') }}
+          key: nextjs-${{ runner.os }}-${{ hashFiles('js/cf-webapp/package-lock.json') }}-${{ hashFiles('js/cf-webapp/src/**') }}
           restore-keys: |
-            nextjs-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}-
+            nextjs-${{ runner.os }}-${{ hashFiles('js/cf-webapp/package-lock.json') }}-
             nextjs-${{ runner.os }}-
 
       - name: Type-check
         id: typecheck
         working-directory: js/cf-webapp
-        run: pnpm tsc --noEmit
+        run: npx tsc --noEmit
         continue-on-error: true
 
       - name: Tests
         id: tests
         working-directory: js/cf-webapp
-        run: pnpm vitest run --reporter=verbose 2>&1 | tee test-output.txt
+        run: npx vitest run --reporter=verbose 2>&1 | tee test-output.txt
         continue-on-error: true
 
       - name: Build
         id: build
         working-directory: js/cf-webapp
-        run: pnpm next build 2>&1 | tee build-output.txt
+        run: npx next build 2>&1 | tee build-output.txt
         continue-on-error: true
 
       - name: Extract results
22	.github/workflows/codeflash-js.yaml
@@ -87,17 +87,14 @@ jobs:
         uses: actions/setup-node@v6
         with:
           node-version: "20"
+          cache: "npm"
+          cache-dependency-path: js/cf-api/package-lock.json
           registry-url: https://npm.pkg.github.com
           scope: "@codeflash-ai"
 
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
-
       - name: Install cf-api dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
+        working-directory: js/cf-api
+        run: npm ci
 
       - name: Set up Python and install Codeflash
         uses: astral-sh/setup-uv@v7
@@ -141,17 +138,14 @@ jobs:
         uses: actions/setup-node@v6
         with:
           node-version: "20"
+          cache: "npm"
+          cache-dependency-path: js/cf-webapp/package-lock.json
           registry-url: https://npm.pkg.github.com
           scope: "@codeflash-ai"
 
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
-
       - name: Install cf-webapp dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
+        working-directory: js/cf-webapp
+        run: npm ci
 
       - name: Set up Python and install Codeflash
         uses: astral-sh/setup-uv@v7
19	.github/workflows/deploy_cfapi_to_azure.yml
@@ -29,27 +29,20 @@ jobs:
           registry-url: https://npm.pkg.github.com
           scope: "@codeflash-ai"
 
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
-
       - name: Install dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
-
-      - name: Build common package
-        working-directory: js
-        run: pnpm --filter @codeflash-ai/common build
+        run: |
+          cd js/cf-api
+          npm install
 
       - name: Build and package app
         run: |
           cd js/cf-api
-          pnpm build
+          npm run build
           # Create deployment package with correct structure
           mkdir -p deployment
           cp -r dist deployment/
-          cp -rL node_modules deployment/
+          cp -r node_modules deployment/
+          cp -r node_modules deployment/dist/
           cp package.json deployment/
           cp -r resend deployment/
           # Ensure markdown files are included
35	.github/workflows/deploy_cfwebapp_to_azure.yml
@@ -29,42 +29,29 @@ jobs:
           registry-url: https://npm.pkg.github.com
           scope: "@codeflash-ai"
 
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
-
-      - name: Restore WASM artifacts cache
-        uses: actions/cache@v5
-        with:
-          path: |
-            js/cf-webapp/public/web-tree-sitter.wasm
-            js/cf-webapp/public/tree-sitter-python.wasm
-            js/cf-webapp/public/.tree-sitter-python-version
-          key: wasm-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}
+      - name: Configure .npmrc for GitHub Packages
+        run: |
+          echo "//npm.pkg.github.com/:_authToken=${NODE_AUTH_TOKEN}" > ~/.npmrc
 
       - name: Install dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
+        run: |
+          cd js/cf-webapp
+          npm install
 
       - name: Restore Next.js build cache
         uses: actions/cache@v5
         with:
           path: js/cf-webapp/.next/cache
-          key: nextjs-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}-${{ hashFiles('js/cf-webapp/src/**') }}
+          key: nextjs-${{ runner.os }}-${{ hashFiles('js/cf-webapp/package-lock.json') }}-${{ hashFiles('js/cf-webapp/src/**') }}
           restore-keys: |
-            nextjs-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}-
+            nextjs-${{ runner.os }}-${{ hashFiles('js/cf-webapp/package-lock.json') }}-
             nextjs-${{ runner.os }}-
 
-      - name: Build common package
-        working-directory: js
-        run: pnpm --filter @codeflash-ai/common build
-
       - name: Build and package app
-        working-directory: js
         run: |
-          pnpm --filter codeflash-webapp build
-          cd cf-webapp && zip -qr cfwebapp.zip . .next node_modules package.json public
+          cd js/cf-webapp
+          npm run build
+          zip -qr cfwebapp.zip . .next node_modules package.json public
 
       - name: Upload artifact for deployment jobs
         uses: actions/upload-artifact@v7
34	.github/workflows/nextjs-build.yaml
@@ -56,33 +56,27 @@ jobs:
           registry-url: https://npm.pkg.github.com
           scope: '@codeflash-ai'
 
-      - name: Install pnpm
-        uses: pnpm/action-setup@v4
-        with:
-          version: 10
-
-      - name: Restore WASM artifacts cache
-        uses: actions/cache@v5
-        with:
-          path: |
-            js/cf-webapp/public/web-tree-sitter.wasm
-            js/cf-webapp/public/tree-sitter-python.wasm
-            js/cf-webapp/public/.tree-sitter-python-version
-          key: wasm-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}
-
       - name: Install dependencies
-        working-directory: js
-        run: pnpm install --frozen-lockfile
+        run: |
+          cd js/cf-webapp
+          # Install dependencies but skip prepare scripts
+          npm ci --ignore-scripts
+
+      - name: Generate Prisma client
+        run: |
+          cd js/cf-webapp
+          npx prisma generate
 
       - name: Restore Next.js build cache
         uses: actions/cache@v5
         with:
           path: js/cf-webapp/.next/cache
-          key: nextjs-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}-${{ hashFiles('js/cf-webapp/src/**') }}
+          key: nextjs-${{ runner.os }}-${{ hashFiles('js/cf-webapp/package-lock.json') }}-${{ hashFiles('js/cf-webapp/src/**') }}
           restore-keys: |
-            nextjs-${{ runner.os }}-${{ hashFiles('js/pnpm-lock.yaml') }}-
+            nextjs-${{ runner.os }}-${{ hashFiles('js/cf-webapp/package-lock.json') }}-
            nextjs-${{ runner.os }}-
 
       - name: Build Next.js app
-        working-directory: js
-        run: pnpm --filter cf-webapp build
+        run: |
+          cd js/cf-webapp
+          npx next build
3	.gitignore
@@ -1,9 +1,6 @@
-# Tessl managed tiles (reinstalled via `tessl install`)
-.tessl/tiles/
-
 # Playwright MCP snapshots
 .playwright-mcp/
 
 # Byte-compiled / optimized / DLL files
 __pycache__/
 **/__pycache__/
@@ -1 +0,0 @@
-@codeflash-ai:registry=https://npm.pkg.github.com
22	js/CLAUDE.md
@@ -1,18 +1,12 @@
 # JS Packages
 
-pnpm workspace (`js/pnpm-workspace.yaml`) with four TypeScript packages: cf-api, cf-webapp, common, VSC-Extension. See `.claude/rules/js-packages.md` for patterns and gotchas.
+Four TypeScript packages: cf-api, cf-webapp, common, VSC-Extension. See `.claude/rules/js-packages.md` for patterns and gotchas.
 
-## Setup
+## Commands (run from each package directory)
 
-```bash
-cd js && pnpm install
-```
-
-## Commands (from `js/` workspace root)
-
-| Package | Dev | Build | Test | Lint |
-|---------|-----|-------|------|------|
-| cf-api | `pnpm --filter cf-api dev` | `pnpm --filter cf-api build` | `pnpm --filter cf-api test` | `pnpm --filter cf-api lint` |
-| cf-webapp | `pnpm --filter cf-webapp dev` | `pnpm --filter cf-webapp build` | `pnpm --filter cf-webapp test` | `pnpm --filter cf-webapp lint` |
-| common | — | `pnpm --filter @codeflash-ai/common build` | — | — |
-| VSC-Extension | `npm run dev` | `npm run build` | `npm test` | `npm run lint` |
+| Package | Dev | Build | Test | Lint | Format |
+|---------|-----|-------|------|------|--------|
+| cf-api | `npm run dev` | `npm run build` | `npm test` | `npm run lint` | `npm run format` |
+| cf-webapp | `npm run dev` | `npm run build` | `npm test` | `npm run lint` | `npm run format` |
+| common | — | `npm run build` | — | — | `npm run format` |
+| VSC-Extension | `npm run dev` | `npm run build` | `npm test` | `npm run lint` | `npm run format` |
|||
54
js/README.md
54
js/README.md
|
|
@ -10,16 +10,14 @@ CodeFlash AI is a JavaScript/TypeScript monorepo that provides a scalable and mo
|
|||
js/
|
||||
├── common/ # Shared code and database schema
|
||||
├── cf-api/ # Backend API service
|
||||
├── cf-webapp/ # Next.js web application
|
||||
├── VSC-Extension/ # VS Code extension
|
||||
└── pnpm-workspace.yaml
|
||||
└── cf-webapp/ # Next.js web application
|
||||
```
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- Node.js (v20+)
|
||||
- pnpm (v10+): `npm install -g pnpm`
|
||||
- Prisma CLI (installed as devDependency)
|
||||
- Node.js (v18+ recommended)
|
||||
- npm (v9+)
|
||||
- Prisma CLI
|
||||
|
||||
## Setup
|
||||
|
||||
|
|
@ -33,8 +31,11 @@ cd codeflash-ai/js
|
|||
### 2. Install Dependencies
|
||||
|
||||
```bash
|
||||
# Install all workspace dependencies from js/
|
||||
pnpm install
|
||||
# Install root and project dependencies
|
||||
npm install
|
||||
cd common && npm install
|
||||
cd ../cf-api && npm install
|
||||
cd ../cf-webapp && npm install
|
||||
```
|
||||
|
||||
### 3. Database Configuration
|
||||
|
|
@ -42,8 +43,8 @@ pnpm install
|
|||
```bash
|
||||
# Generate Prisma client and run migrations
|
||||
cd common
|
||||
pnpm prisma generate
|
||||
pnpm prisma migrate dev
|
||||
npx prisma generate
|
||||
npx prisma migrate dev
|
||||
```
|
||||
|
||||
## Development Workflow
|
||||
|
|
@ -51,18 +52,21 @@ pnpm prisma migrate dev
|
|||
### Start Development Servers
|
||||
|
||||
```bash
|
||||
# From js/ workspace root:
|
||||
pnpm --filter cf-api dev
|
||||
pnpm --filter cf-webapp dev
|
||||
# Start API server
|
||||
cd cf-api
|
||||
For local development, developers would use `npm run dev`
|
||||
For production (Azure), the system would use `npm run start`
|
||||
|
||||
# Start web application
|
||||
cd cf-webapp
|
||||
npm run dev
|
||||
```
|
||||
|
||||
### Build
|
||||
### Build Common Package
|
||||
|
||||
```bash
|
||||
# Build individual packages
|
||||
pnpm --filter cf-webapp build
|
||||
pnpm --filter cf-api build
|
||||
pnpm --filter @codeflash-ai/common build
|
||||
cd common
|
||||
npm run build
|
||||
```
|
||||
|
||||
## Key Components
|
||||
|
|
@ -72,7 +76,12 @@ pnpm --filter @codeflash-ai/common build
|
|||
- Shared TypeScript utilities
|
||||
- Prisma database schema
|
||||
- Reusable functions across projects
|
||||
- Referenced as `"workspace:*"` by cf-api and cf-webapp
|
||||
|
||||
#### Installation in Other Projects
|
||||
|
||||
```bash
|
||||
npm install @codeflash-ai/common
|
||||
```
|
||||
|
||||
#### Usage Example
|
||||
|
||||
|
|
@ -82,7 +91,7 @@ import { createOrUpdateUser } from "@codeflash-ai/common"
|
|||
|
||||
## Best Practices
|
||||
|
||||
1. Always install from the workspace root (`js/`)
|
||||
1. Always build the common package after making changes
|
||||
2. Keep shared logic in the `common` package
|
||||
3. Use TypeScript for type safety
|
||||
4. Follow existing code structure
|
||||
|
|
@ -91,7 +100,8 @@ import { createOrUpdateUser } from "@codeflash-ai/common"
|
|||
## Publishing common Package
|
||||
|
||||
```bash
|
||||
# Publish common package to npm
|
||||
cd common
|
||||
pnpm build
|
||||
pnpm publish
|
||||
npm run build
|
||||
npm publish
|
||||
```
|
||||
|
|
|
|||
9	js/cf-api/.eslintignore
@@ -0,0 +1,9 @@
+node_modules/
+dist/
+build/
+coverage/
+*.config.js
+.eslintrc.mjs
+// Comment out the ESLint line temporarily to allow for the build to pass
+**/*.ts
+**/*.js
22	js/cf-api/.eslintrc.mjs
@@ -0,0 +1,22 @@
+export default {
+  root: true,
+  env: {
+    node: true,
+    es2021: true,
+    es6: true,
+  },
+  extends: ["eslint:recommended", "plugin:@typescript-eslint/recommended", "prettier"],
+  parser: "@typescript-eslint/parser",
+  parserOptions: {
+    ecmaVersion: 2022,
+    sourceType: "module",
+    project: "./tsconfig.json",
+    tsconfigRootDir: import.meta.dirname,
+    extraFileExtensions: [".mjs"],
+  },
+  plugins: ["@typescript-eslint"],
+  ignorePatterns: ["dist/**", "node_modules/**", "*.config.js", ".eslintrc.mjs", "jest.config.cjs"],
+  rules: {
+    "@typescript-eslint/no-var-requires": "off",
+  },
+}
@@ -4,11 +4,13 @@ import { ManagementClient } from "auth0"
 let managementClient: ManagementClient | null = null
 
 export function getManagementClient(): ManagementClient {
-  managementClient ||= new ManagementClient({
-    domain: process.env.AUTH0_ISSUER_BASE_URL ?? "",
-    clientId: process.env.AUTH0_MANAGEMENT_CLIENT_ID ?? "",
-    clientSecret: process.env.AUTH0_MANAGEMENT_CLIENT_SECRET ?? "",
-  })
+  if (!managementClient) {
+    managementClient = new ManagementClient({
+      domain: process.env.AUTH0_ISSUER_BASE_URL ?? "",
+      clientId: process.env.AUTH0_MANAGEMENT_CLIENT_ID ?? "",
+      clientSecret: process.env.AUTH0_MANAGEMENT_CLIENT_SECRET ?? "",
+    })
+  }
   return managementClient
 }
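The two `getManagementClient` variants in the hunk above are behaviorally equivalent for a lazy singleton: `x ||= make()` assigns only when `x` is still null (or otherwise falsy), exactly like the explicit null check. A runnable sketch, with `FakeClient` as a stand-in for auth0's `ManagementClient`:

```typescript
// Stand-in for ManagementClient; only the construction count matters here.
class FakeClient {
  constructor(public domain: string) {}
}

let instance: FakeClient | null = null
let constructions = 0

// Logical-OR assignment form: the right-hand side only evaluates
// (and the constructor only runs) when `instance` is still null.
function getClient(): FakeClient {
  instance ||= (constructions++, new FakeClient("example.auth0.com"))
  return instance
}

const first = getClient()
const second = getClient()
```

Repeated calls return the same instance, and the constructor runs exactly once; the reverted `if (!managementClient)` form behaves identically.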
@@ -1,11 +1,11 @@
-import type { FileDiffContent, Hunk } from "@codeflash-ai/code-suggester/build/src/types.js"
+import { type FileDiffContent, type Hunk } from "@codeflash-ai/code-suggester/build/src/types.js"
 
 import {
   getRawSuggestionHunks,
   partitionSuggestedHunksByScope,
 } from "@codeflash-ai/code-suggester/build/src/utils/hunk-utils.js"
 import { getPullRequestHunks } from "@codeflash-ai/code-suggester/build/src/github/review-pull-request.js"
-import type { Octokit } from "@octokit/rest"
+import { type Octokit } from "@octokit/rest"
 
 export function fileDiffsToMap(obj: Record<string, FileDiffContent>): Map<string, FileDiffContent> {
   const map = new Map()
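The `fileDiffsToMap` helper whose imports change in the hunk above converts a plain object keyed by file path into a `Map`. A minimal self-contained equivalent, with a simplified `FileDiffContentLike` shape assumed purely for illustration (the real `FileDiffContent` comes from code-suggester):

```typescript
// Simplified stand-in for code-suggester's FileDiffContent type.
interface FileDiffContentLike {
  oldContent: string
  newContent: string
}

// Object.entries preserves insertion order, so the Map mirrors the record.
function fileDiffsToMapSketch(
  obj: Record<string, FileDiffContentLike>,
): Map<string, FileDiffContentLike> {
  return new Map(Object.entries(obj))
}

const diffMap = fileDiffsToMapSketch({
  "src/a.ts": { oldContent: "", newContent: "export const a = 1\n" },
})
```

A `Map` keeps lookups explicit (`diffMap.get(path)`) and avoids prototype-key pitfalls that plain objects have with arbitrary file-path keys.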
@@ -1,8 +1,9 @@
 // Handler for the /cfapi/cli-get-user endpoint
 
-import fs from "node:fs"
-import path, { dirname } from "node:path"
-import { fileURLToPath } from "node:url"
+import fs from "fs"
+import path from "path"
+import { fileURLToPath } from "url"
+import { dirname } from "path"
 
 const __filename = fileURLToPath(import.meta.url)
 const __dirname = dirname(__filename)
@@ -12,12 +13,12 @@ const min_version = fs
   .trim()
 
 export function getUser(req, res) {
-  const cli_version = req.headers.cli_version || "unknown"
+  const cli_version = req.headers["cli_version"] || "unknown"
 
   if (cli_version !== "unknown") {
     res.status(200).send({
       userId: req.userId,
-      min_version,
+      min_version: min_version,
     })
   } else {
     res.status(200).send(req.userId)
@@ -1,5 +1,5 @@
 import { Request, Response } from "express"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import { userNickname } from "../auth0-mgmt.js"
 import { getInstallationOctokitByOwner, isUserCollaborator } from "../github/github-utils.js"
 import { githubApp } from "../github/github-app.js"
@@ -122,7 +122,7 @@ export async function is_code_being_optimized_again(req: Request, res: Response)
       properties: {
         repo_owner: owner,
         repo_name: repo,
-        pr_number,
+        pr_number: pr_number,
       },
     })
 
@@ -1,6 +1,6 @@
 import type { Response } from "express"
 import { prisma } from "@codeflash-ai/common"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import { githubApp } from "../github/github-app.js"
 import { isUserCollaborator } from "../github/github-utils.js"
 import { userNickname } from "../auth0-mgmt.js"
@@ -29,8 +29,8 @@ let dependencies: CommitStagingCodeDependencies = {
       findFirst: prisma.optimization_events.findFirst,
     },
   },
-  getInstallationOctokit: async (installationId: number) =>
-    await (githubApp.getInstallationOctokit(installationId) as Promise<Octokit>),
+  getInstallationOctokit: (installationId: number) =>
+    githubApp.getInstallationOctokit(installationId) as Promise<Octokit>,
   userNickname,
   isUserCollaborator,
 }
@@ -46,8 +46,8 @@ export function resetCommitStagingCodeDependencies() {
       findFirst: prisma.optimization_events.findFirst,
     },
   },
-  getInstallationOctokit: async (installationId: number) =>
-    await (githubApp.getInstallationOctokit(installationId) as Promise<Octokit>),
+  getInstallationOctokit: (installationId: number) =>
+    githubApp.getInstallationOctokit(installationId) as Promise<Octokit>,
   userNickname,
   isUserCollaborator,
 }
@@ -132,16 +132,16 @@ export async function executeCommitStagingCode(
 
   // Get repository info
   const repository = stagingEvent.repository
-  if (!repository?.installation_id) {
+  if (!repository || !repository.installation_id) {
     return {
       status: 404,
       data: { error: "No repository or installation found for this staging event" },
     }
   }
 
-  const [owner, repo] = String(repository.full_name).split("/")
+  const [owner, repo] = repository.full_name.split("/")
   const installationOctokit = await dependencies.getInstallationOctokit(
-    Number(repository.installation_id),
+    repository.installation_id,
   )
 
   // Check if user is a collaborator before proceeding
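The guard rewritten in the `executeCommitStagingCode` hunk above has two equivalent spellings: `!repository?.installation_id` (optional chaining) and `!repository || !repository.installation_id`. Both reject a missing repository and a null/undefined (or zero) installation id. A runnable sketch with a simplified `Repo` shape assumed for illustration:

```typescript
// Simplified repository shape; the real one is a Prisma model.
type Repo = { installation_id?: number | null; full_name: string } | null

// Optional-chaining form: r?.installation_id is undefined when r is null.
const rejectsChained = (r: Repo) => !r?.installation_id
// Explicit form from the reverted code.
const rejectsExplicit = (r: Repo) => !r || !r.installation_id

const cases: Repo[] = [
  null,
  { full_name: "acme/widgets" },
  { full_name: "acme/widgets", installation_id: null },
  { full_name: "acme/widgets", installation_id: 42 },
]
// Both predicates agree on every case.
const agree = cases.every(c => rejectsChained(c) === rejectsExplicit(c))
```

One subtlety worth noting: both forms also reject `installation_id: 0` because `!0` is true, which is acceptable here since GitHub installation ids start at 1.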
@@ -1,5 +1,5 @@
 import { fileDiffsToMap, isDiffContentsWellFormed } from "../diff_utils.js"
-import type { FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
+import { type FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
 import { userNickname } from "../auth0-mgmt.js"
 import {
   addLabelToPullRequest,
@@ -34,7 +34,7 @@ import {
   prisma,
   upsertRepository,
 } from "@codeflash-ai/common"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import {
   requestApproval,
   requiresApprovalForRepo,
@@ -124,7 +124,9 @@ export function createStandalonePRTitleAndBody(
 
   const metadata = buildOptimizationMetadata(prCommentFields, trace_id)
   let optReviewBadge = generateOptimizationReviewTemplate(optimizationReview)
-  optReviewBadge &&= ` ${optReviewBadge}\n`
+  if (optReviewBadge) {
+    optReviewBadge = ` ${optReviewBadge}\n`
+  }
 
   // Add line profiler link if profiler data exists
   let lineProfilerSection = ""
@@ -200,7 +202,7 @@ const defaultPrContentBuilder: PrContentBuilder = {
 }
 
 let dependencies: CreatePrDependencies = {
-  prisma,
+  prisma: new PrismaClient(),
   userNickname,
   getInstallationOctokitByOwner,
   isUserCollaborator,
@@ -214,7 +216,7 @@ let dependencies: CreatePrDependencies = {
 }
 
 let triggerCreatePrDeps: TriggerCreatePrDependencies = {
-  prisma,
+  prisma: new PrismaClient(),
   fileDiffsToMap,
   buildPrTitle,
   createNewBranchFromDiffContents,
@@ -233,7 +235,7 @@ export function setCreatePrDependencies(deps: Partial<CreatePrDependencies>) {
 
 export function resetCreatePrDependencies() {
   dependencies = {
-    prisma,
+    prisma: new PrismaClient(),
     userNickname,
     getInstallationOctokitByOwner,
     isUserCollaborator,
@@ -253,7 +255,7 @@ export function setTriggerCreatePrDependencies(deps: Partial<TriggerCreatePrDepe
 
 export function resetTriggerCreatePrDependencies() {
   triggerCreatePrDeps = {
-    prisma,
+    prisma: new PrismaClient(),
     fileDiffsToMap,
     buildPrTitle,
     createNewBranchFromDiffContents,
@@ -349,9 +351,9 @@ export async function createPr(req: Request, res: Response) {
   // TODO: Remove this background upsert logic after ensuring all old repositories have been saved.
   dependencies
     .registerRepositoryAndMember(owner, repo, nickname, userId, installationOctokit)
-    .then(() => {
-      logger.info(`Background repo and member upsert completed for ${owner}/${repo}`, req)
-    })
+    .then(() =>
+      logger.info(`Background repo and member upsert completed for ${owner}/${repo}`, req),
+    )
     .catch(err => {
       logger.errorWithSentry(
         `Error in background upsertRepoAndCreateMember:`,
@@ -804,7 +806,7 @@ export async function triggerCreatePr(
   })()
 
   // Run reviewer assignment and label additions in parallel
-  const githubPostPrTasks: Array<Promise<void>> = [
+  const githubPostPrTasks: Promise<void>[] = [
     triggerCreatePrDeps.assignReviewer(
       installationOctokit,
       owner,
@@ -835,7 +837,7 @@ export async function triggerCreatePr(
 
   const updateOptimizationFeaturesTask = (async () => {
     if (traceId !== "") {
-      const pull_request_db = await triggerCreatePrDeps.prisma.optimization_features.findUnique({
+      let pull_request_db = await triggerCreatePrDeps.prisma.optimization_features.findUnique({
         where: {
           trace_id: traceId,
         },
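The `createPr` hunks above fire `registerRepositoryAndMember` in the background: the promise is deliberately not awaited, but a `.catch` is always attached so a failure cannot surface as an unhandled rejection. A minimal runnable sketch of that pattern, with stand-in tasks and a log array in place of the real logger:

```typescript
// Log collector standing in for logger.info / logger.errorWithSentry.
const log: string[] = []

// Fire-and-forget: the caller does not await, but .catch guarantees
// a rejected task is observed rather than crashing the process.
function runInBackground(task: () => Promise<void>): void {
  task()
    .then(() => {
      log.push("completed")
    })
    .catch(() => {
      log.push("failed")
    })
}

runInBackground(async () => {}) // succeeds
runInBackground(async () => {
  throw new Error("boom") // fails, but is caught
})

// Let the queued microtasks settle before inspecting the log.
await new Promise(resolve => setTimeout(resolve, 0))
```

Both outcomes land in the log and neither blocks the request path, which is the point of the background upsert in `createPr`.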
@@ -1,7 +1,7 @@
 import type { Response } from "express"
-import type { FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
+import { type FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
 import { getEffectivePrivacyMode, prisma } from "@codeflash-ai/common"
-import { AuthorizedUserReq, SubscriptionInfo } from "../types.js"
+import { AuthorizedUserReq, SubscriptionInfo } from "types.js"
 import {
   StagingStorageStrategyFactory,
   StagingStorageContext,
@@ -1,5 +1,5 @@
 import type { Response } from "express"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import { githubApp } from "../github/github-app.js"
 import { isUserCollaborator } from "../github/github-utils.js"
 import { userNickname } from "../auth0-mgmt.js"
@@ -21,8 +21,8 @@ export interface GetStagingCodeDependencies {
 }
 
 let dependencies: GetStagingCodeDependencies = {
-  getInstallationOctokit: async (installationId: number) =>
-    await (githubApp.getInstallationOctokit(installationId) as Promise<Octokit>),
+  getInstallationOctokit: (installationId: number) =>
+    githubApp.getInstallationOctokit(installationId) as Promise<Octokit>,
   userNickname,
   isUserCollaborator,
 }
@@ -33,8 +33,8 @@ export function setGetStagingCodeDependencies(newDependencies: GetStagingCodeDep
 
 export function resetGetStagingCodeDependencies() {
   dependencies = {
-    getInstallationOctokit: async (installationId: number) =>
-      await (githubApp.getInstallationOctokit(installationId) as Promise<Octokit>),
+    getInstallationOctokit: (installationId: number) =>
+      githubApp.getInstallationOctokit(installationId) as Promise<Octokit>,
     userNickname,
     isUserCollaborator,
   }
@@ -2,7 +2,7 @@ import { userNickname } from "../auth0-mgmt.js"
 import { getInstallationOctokitByOwner, isUserCollaborator } from "../github/github-utils.js"
 import { githubApp } from "../github/github-app.js"
 import { Request, Response } from "express"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import { logger } from "../utils/logger.js"
 import {
   missingRequiredFields,
@@ -42,8 +42,8 @@ export async function optimizationSuccess(req: Request, res: Response): Promise<
 
   try {
     const result = await dependencies.prisma.optimization_events.updateMany({
-      where: { trace_id },
-      data: { is_optimization_found },
+      where: { trace_id: trace_id },
+      data: { is_optimization_found: is_optimization_found },
     })
 
     if (result.count === 0) {
@@ -51,6 +51,7 @@ export async function optimizationSuccess(req: Request, res: Response): Promise<
     }
 
     res.status(200).json({ message: "Optimization status updated." })
+    return
   } catch (error) {
     if (error && typeof error === "object" && "getHttpStatus" in error) {
       throw error
@@ -45,7 +45,7 @@ export async function sendOptimizationCompletedEmail(req: Request, res: Response
       },
     })
     await sendEmail({
-      to: String(user.email),
+      to: user.email,
       subject: `Codeflash: Optimization Completed${showRepo ? ` For ${owner}/${repo}` : ""}`,
       html,
     })
@@ -56,6 +56,7 @@ export async function sendOptimizationCompletedEmail(req: Request, res: Response
     })
 
     res.status(200).json({ status: "success", message: "Email has been successfully sent." })
+    return
   } catch (error) {
     logger.errorWithSentry(
       "Failed to send optimization completed email",
|
|
@@ -1,11 +1,11 @@
 import * as Sentry from "@sentry/node"
 import { Request, Response } from "express"
-import type { Octokit } from "octokit"
+import { type Octokit } from "octokit"
 import { userNickname } from "../auth0-mgmt.js"
 import { getInstallationOctokitByOwner, isUserCollaborator } from "../github/github-utils.js"
 import { githubApp } from "../github/github-app.js"
 import { posthog } from "../analytics.js"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import { registerRepositoryAndMember } from "./utils/github-repo-setup.js"
 import { createNewPullRequest } from "../github/create-pr-from-diffcontents.js"
 import {
@@ -365,11 +365,11 @@ export async function setupGithubActions(req: Request, res: Response): Promise<v
     // Register repository and member in background
     dependencies
       .registerRepositoryAndMember(owner, repo, nickname, userId, installationOctokit)
-      .then(() => {
+      .then(() =>
         console.log(
           `[setup-github-actions.ts:setupGithubActions] Background repo and member upsert completed for ${owner}/${repo}`,
-        )
-      })
+        ),
+      )
       .catch(err => {
         console.error(
           `[setup-github-actions.ts:setupGithubActions] Error in background upsert for ${owner}/${repo}:`,
@@ -1,5 +1,5 @@
 import { Request, Response } from "express"
-import * as crypto from "node:crypto"
+import * as crypto from "crypto"
 import { posthog } from "../analytics.js"
 import { processReaction } from "../github/optimization_approval.js"
 import * as Sentry from "@sentry/node"
@@ -74,7 +74,7 @@ export function verifySlackRequest(req: Request): boolean {
   const baseString = `v0:${slackTimestamp}:${requestBody}`

   const hmac = dependencies.crypto.createHmac("sha256", SLACK_SIGNING_SECRET)
-  const signature = `v0=${hmac.update(baseString).digest("hex")}`
+  const signature = "v0=" + hmac.update(baseString).digest("hex")

   return dependencies.crypto.timingSafeEqual(Buffer.from(signature), Buffer.from(slackSignature))
 }
@@ -1,5 +1,6 @@
 import { Request, Response } from "express"
-import { addMonthsSafe, stripe, SUBSCRIPTION_PLANS, prisma } from "@codeflash-ai/common"
+import { addMonthsSafe, stripe, SUBSCRIPTION_PLANS } from "@codeflash-ai/common"
+import { prisma } from "@codeflash-ai/common"
 import * as Sentry from "@sentry/node"
 import { logger } from "../utils/logger.js"
 import { badRequest } from "../exceptions/index.js"
@@ -58,7 +59,7 @@ export async function stripeWebhookHandler(req: Request, res: Response) {
       throw new Error("STRIPE_WEBHOOK_SECRET is not configured")
     }

-    const event = dependencies.stripe.webhooks.constructEvent(req.body, sig, webhookSecret)
+    const event = dependencies.stripe.webhooks.constructEvent(req.body, sig!, webhookSecret)

     logger.info("Processing Stripe webhook", req, {
       eventType: event.type,
@@ -7,7 +7,10 @@ import {
 } from "@codeflash-ai/common"
 import * as Sentry from "@sentry/node"
 import { logger } from "../utils/logger.js"
-import { missingRequiredFields, subscriptionNotFound } from "../exceptions/index.js"
+import {
+  missingRequiredFields,
+  subscriptionNotFound,
+} from "../exceptions/index.js"

 // Dependencies interface for easier testing
 export interface SubscriptionDependencies {
@@ -53,8 +56,7 @@ export async function getSubscription(req: Request, res: Response, next: NextFun
   const userId = req.query.userId as string

   if (!userId) {
-    next(missingRequiredFields("userId"))
-    return
+    return next(missingRequiredFields("userId"))
   }

   try {
@@ -62,8 +64,7 @@ export async function getSubscription(req: Request, res: Response, next: NextFun
     const subscription = await dependencies.fetchSubscription(userId)

     if (!subscription) {
-      next(subscriptionNotFound(userId))
-      return
+      return next(subscriptionNotFound(userId))
     }

     return res.json({
@@ -86,8 +87,7 @@ export async function createCheckout(req: Request, res: Response, next: NextFunc
   const { userId, priceId, successUrl, cancelUrl, period } = req.body

   if (!userId || !priceId) {
-    next(missingRequiredFields("userId, priceId"))
-    return
+    return next(missingRequiredFields("userId, priceId"))
   }

   try {
@@ -116,8 +116,7 @@ export async function cancelSubscription(req: Request, res: Response, next: Next
   const { userId } = req.body

   if (!userId) {
-    next(missingRequiredFields("userId"))
-    return
+    return next(missingRequiredFields("userId"))
   }

   try {
@@ -14,9 +14,8 @@ import {
   createNewBranchFromDiffContents,
 } from "../github/create-pr-from-diffcontents.js"
 import { posthog } from "../analytics.js"
-import type { FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
+import { type FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
+import { PrismaClient } from "@prisma/client"
-import { prisma } from "@codeflash-ai/common"
 import { sendSlackMessage } from "../github/slack_util.js"
 import { Response } from "express"
 import {
@@ -64,7 +63,7 @@ export interface SuggestPrChangesDependencies {

 // Default dependencies
 let dependencies: SuggestPrChangesDependencies = {
-  prisma,
+  prisma: new PrismaClient(),
   userNickname,
   getInstallationOctokitByOwner,
   isUserCollaborator,
@@ -91,7 +90,7 @@ export function setSuggestPrChangesDependencies(deps: Partial<SuggestPrChangesDe

 export function resetSuggestPrChangesDependencies() {
   dependencies = {
-    prisma,
+    prisma: new PrismaClient(),
     userNickname,
     getInstallationOctokitByOwner,
     isUserCollaborator,
@@ -267,9 +266,9 @@ export async function suggestPrChanges(
       logger.info(`${nickname} is a collaborator on ${owner}/${repo}`, req)
       // TODO: Remove this background upsert logic after ensuring all old repositories have been saved.
       registerRepositoryAndMember(owner, repo, nickname, userId, installationOctokit)
-        .then(() => {
-          logger.info(`Background repo and member upsert completed for ${owner}/${repo}`, req)
-        })
+        .then(() =>
+          logger.info(`Background repo and member upsert completed for ${owner}/${repo}`, req),
+        )
         .catch(err => {
           logger.errorWithSentry(
             `Error in background upsertRepoAndCreateMember`,
@@ -319,7 +318,7 @@ export async function suggestPrChanges(
       )

       if (result && typeof result === "object" && "status" in result) {
-        return result
+        return result as Response
       }
       return res.json(result)
     } else {
@@ -418,7 +417,7 @@ export async function suggestPrChanges(

     // Don't call res.json(result) if result is already a Response object
     if (result && typeof result === "object" && "status" in result) {
-      return result
+      return result as Response
     }
     return res.json(result)
   } catch (error: any) {
@@ -427,12 +426,7 @@ export async function suggestPrChanges(
     if (traceId) {
       logger.info(`PR suggestion failed, falling back to staging for traceId: ${traceId}`, req)
       try {
-        const stagingResult = await dependencies.saveStagingReview(
-          req.body,
-          req.userId,
-          req.organizationId,
-          req.subscriptionInfo,
-        )
+        const stagingResult = await dependencies.saveStagingReview(req.body, req.userId, req.organizationId, req.subscriptionInfo)
         if (stagingResult.status === 200) {
           return res.status(200).json({
             message: "PR suggestion failed, staging created as fallback",
@@ -444,7 +438,7 @@ export async function suggestPrChanges(
             `Staging fallback returned status ${stagingResult.status}`,
             req,
             { reqBody: req.body, userId: req.userId, traceId, stagingResult },
-            new Error(`Staging fallback returned status ${stagingResult.status}`),
+            new Error(`Staging fallback returned status ${stagingResult.status}`)
           )
           return res.status(stagingResult.status).json({
             message: "PR suggestion failed and staging fallback also failed",
@@ -455,7 +449,7 @@ export async function suggestPrChanges(
           `Staging fallback threw an exception`,
           req,
           { reqBody: req.body, userId: req.userId, traceId },
-          stagingError as Error,
+          stagingError as Error
         )
         return res.status(500).json({
           message: "PR suggestion failed and staging fallback threw an error",
@@ -464,12 +458,7 @@ export async function suggestPrChanges(
       }
     }

-    logger.errorWithSentry(
-      `Error in /cfapi/suggest-pr-changes: ${error}`,
-      req,
-      { errorMessage: error.message },
-      error as Error,
-    )
+    logger.errorWithSentry(`Error in /cfapi/suggest-pr-changes: ${error}`, req, { errorMessage: error.message }, error as Error)
     dependencies.posthog.capture({
       distinctId: req.userId,
       event: `cfapi-suggest-pr-changes-failed-error`,
@@ -503,7 +492,7 @@ export async function triggerSuggestPrChanges(
   const diffContentsMap: Map<string, FileDiffContent> = dependencies.fileDiffsToMap(diffContents)

   const { validHunks, invalidHunks } = await dependencies.determineValidHunks(
-    installationOctokit.rest,
+    installationOctokit.rest as AnyOctokit,
     { owner, repo },
     pullNumber,
     100,
@@ -525,26 +514,32 @@ export async function triggerSuggestPrChanges(

   // Check if the PR is merged or closed - we can't suggest changes on merged/closed PRs
   if (originalPrData.data.merged) {
-    logger.info(`PR #${pullNumber} is already merged, cannot suggest changes`, {
-      endpoint: "/cfapi/suggest-pr-changes",
-      operation: "pr_merged_check",
-      owner,
-      repo,
-      userId,
-    })
+    logger.info(
+      `PR #${pullNumber} is already merged, cannot suggest changes`,
+      {
+        endpoint: "/cfapi/suggest-pr-changes",
+        operation: "pr_merged_check",
+        owner,
+        repo,
+        userId,
+      },
+    )
     throw unprocessableEntity(
       `Cannot suggest changes on merged PR #${pullNumber}. The PR was already merged.`,
     )
   }

   if (originalPrData.data.state === "closed") {
-    logger.info(`PR #${pullNumber} is closed, cannot suggest changes`, {
-      endpoint: "/cfapi/suggest-pr-changes",
-      operation: "pr_closed_check",
-      owner,
-      repo,
-      userId,
-    })
+    logger.info(
+      `PR #${pullNumber} is closed, cannot suggest changes`,
+      {
+        endpoint: "/cfapi/suggest-pr-changes",
+        operation: "pr_closed_check",
+        owner,
+        repo,
+        userId,
+      },
+    )
     throw unprocessableEntity(
       `Cannot suggest changes on closed PR #${pullNumber}. The PR is no longer open.`,
     )
@@ -562,7 +557,7 @@ export async function triggerSuggestPrChanges(
   const commitMessage = `Optimize ${prCommentFields.function_name} \n\n${prCommentFields.optimization_explanation}`

   let hasMultipleHunksInSameFile = false
-  const hasMultipleFiles = validHunks.size > 1
+  let hasMultipleFiles = validHunks.size > 1

   for (const [filePath, hunks] of validHunks.entries()) {
     if (hunks.length > 1) {
@@ -667,7 +662,7 @@ export async function triggerSuggestPrChanges(
     throw new Error(`Failed to create branch ${newBranchName}`)
   }
   const newPrData = await dependencies.createDependentPullRequest(
-    installationOctokit,
+    installationOctokit as AnyOctokit,
     owner,
     repo,
     pullNumber,
@@ -712,7 +707,7 @@ export async function triggerSuggestPrChanges(
   })

   if (traceId !== "") {
-    const pull_request_db = await dependencies.prisma.optimization_features.findUnique({
+    let pull_request_db = await dependencies.prisma.optimization_features.findUnique({
       where: {
         trace_id: traceId,
       },
@@ -774,8 +769,10 @@ export async function triggerSuggestPrChanges(
     { isUnifiedReview: true, includeHeader: false, isCollapsed: true },
   )
   let optReviewBadge = generateOptimizationReviewTemplate(optimizationReview)
-  optReviewBadge &&= `\n\n${optReviewBadge}\n`
-  const reviewComments = []
+  if (optReviewBadge) {
+    optReviewBadge = `\n\n${optReviewBadge}\n`
+  }
+  let reviewComments = []
   let foundInvalidHunk = false

   for (const [filePath, hunks] of validHunks.entries()) {
@@ -787,17 +784,25 @@ export async function triggerSuggestPrChanges(

     if (isLongDiff) {
       commentBody =
-        `${prCommentBody}\n\n` +
-        `<details>\n` +
-        `<summary>Click to see suggested changes</summary>\n\n` +
-        `\`\`\`suggestion\n${newContent}\n\`\`\`\n` +
-        `</details>` +
-        `\n${optReviewBadge}`
+        prCommentBody +
+        "\n\n" +
+        "<details>\n" +
+        "<summary>Click to see suggested changes</summary>\n\n" +
+        "```suggestion\n" +
+        newContent +
+        "\n```\n" +
+        "</details>" +
+        "\n" +
+        optReviewBadge
     } else {
       commentBody =
-        `${prCommentBody}\n\n` +
-        `\`\`\`suggestion\n${newContent}\n\`\`\`` +
-        `\n${optReviewBadge}`
+        prCommentBody +
+        "\n\n" +
+        "```suggestion\n" +
+        newContent +
+        "\n```" +
+        "\n" +
+        optReviewBadge
     }

     reviewComments.push({
@@ -866,7 +871,7 @@ export async function triggerSuggestPrChanges(
   })

   if (traceId !== "") {
-    const pull_request_db = await dependencies.prisma.optimization_features.findUnique({
+    let pull_request_db = await dependencies.prisma.optimization_features.findUnique({
       where: {
         trace_id: traceId,
       },
@@ -241,7 +241,7 @@ export async function verifyExistingOptimizations(req: Request, res: Response) {
     throw internalServerError(`Failed to retrieve PR reviews for ${repo_owner}/${repo_name}`)
   }

-  const reviewBodies: Array<{ body: string }> = []
+  const reviewBodies: { body: string }[] = []
   for (const review of pr_reviews.data) {
     // Add the main review body if it exists
     if (review.body) {
@@ -317,7 +317,7 @@ export async function verifyExistingOptimizations(req: Request, res: Response) {
   const prBody = pr.data.body || ""
   const validComments = pr_messages.data.filter(
     (comment: { body?: string }) => comment.body !== undefined,
-  ) as Array<{ body: string }>
+  ) as { body: string }[]
   const allComments = [...validComments, ...reviewBodies]
   const optimizations_dict = dependencies.parseAndCreateOptimizationsDict(prBody, allComments)

@@ -325,7 +325,7 @@ export async function verifyExistingOptimizations(req: Request, res: Response) {
     return res.status(200).json({ error: "No optimizations found for this PR" })
   }

-  const response_dict: Record<string, string[]> = {}
+  const response_dict: { [key: string]: string[] } = {}
   for (const key in optimizations_dict) {
     response_dict[key] = Array.from(optimizations_dict[key])
   }
@@ -1,113 +0,0 @@
-import love from "eslint-config-love"
-import eslintConfigPrettier from "eslint-config-prettier/flat"
-
-export default [
-  // Global ignores (must be a standalone object with only `ignores`)
-  {
-    ignores: [
-      "dist/**",
-      "node_modules/**",
-      "coverage/**",
-      "build/**",
-      "*.config.js",
-      "*.config.cjs",
-      "jest.config.cjs",
-      "**/*.test.ts",
-      "**/*.spec.ts",
-    ],
-  },
-
-  // eslint-config-love base (TypeScript files only)
-  {
-    ...love,
-    files: ["**/*.ts"],
-  },
-
-  // Prettier must come after all other configs
-  eslintConfigPrettier,
-
-  // Relax rules that are new in eslint-config-love but were not in the
-  // previous config. Tighten incrementally — remove lines as code is fixed.
-  {
-    files: ["**/*.ts"],
-    rules: {
-      // --- type-safety (big refactor needed) ---
-      "@typescript-eslint/no-unsafe-assignment": "off",
-      "@typescript-eslint/no-unsafe-member-access": "off",
-      "@typescript-eslint/no-unsafe-argument": "off",
-      "@typescript-eslint/no-unsafe-call": "off",
-      "@typescript-eslint/no-unsafe-return": "off",
-      "@typescript-eslint/no-unsafe-type-assertion": "off",
-      "@typescript-eslint/no-unsafe-enum-comparison": "off",
-      "@typescript-eslint/no-explicit-any": "off",
-      "@typescript-eslint/no-base-to-string": "off",
-      "@typescript-eslint/restrict-template-expressions": "off",
-      "@typescript-eslint/no-non-null-assertion": "off",
-      "@typescript-eslint/no-redundant-type-constituents": "off",
-      "@typescript-eslint/consistent-type-assertions": "off",
-      "@typescript-eslint/use-unknown-in-catch-callback-variable": "off",
-
-      // --- promise handling ---
-      "@typescript-eslint/no-floating-promises": "off",
-      "@typescript-eslint/no-misused-promises": "off",
-      "@typescript-eslint/require-await": "off",
-      "@typescript-eslint/strict-void-return": "off",
-      "@typescript-eslint/no-confusing-void-expression": "off",
-      "promise/avoid-new": "off",
-      "no-async-promise-executor": "off",
-      "no-promise-executor-return": "off",
-
-      // --- style / convention ---
-      "@typescript-eslint/strict-boolean-expressions": "off",
-      "@typescript-eslint/no-unnecessary-condition": "off",
-      "@typescript-eslint/no-magic-numbers": "off",
-      "@typescript-eslint/prefer-nullish-coalescing": "off",
-      "@typescript-eslint/prefer-destructuring": "off",
-      "@typescript-eslint/explicit-function-return-type": "off",
-      "@typescript-eslint/no-unnecessary-boolean-literal-compare": "off",
-      "@typescript-eslint/no-useless-default-assignment": "off",
-      "@typescript-eslint/naming-convention": "off",
-      "@typescript-eslint/consistent-type-imports": "off",
-      "@typescript-eslint/no-inferrable-types": "off",
-      "@typescript-eslint/max-params": "off",
-      "@typescript-eslint/init-declarations": "off",
-      "@typescript-eslint/no-var-requires": "off",
-      "@typescript-eslint/unbound-method": "off",
-      "@typescript-eslint/no-empty-function": "off",
-      "@typescript-eslint/no-useless-constructor": "off",
-      "@typescript-eslint/method-signature-style": "off",
-      "@typescript-eslint/unified-signatures": "off",
-      "@typescript-eslint/ban-ts-comment": "off",
-      "@typescript-eslint/no-dynamic-delete": "off",
-      "@typescript-eslint/no-extraneous-class": "off",
-      "@typescript-eslint/no-namespace": "off",
-      "@typescript-eslint/promise-function-async": "off",
-      "@typescript-eslint/no-unnecessary-type-conversion": "off",
-      "@typescript-eslint/no-unused-vars": [
-        "warn",
-        { argsIgnorePattern: "^_", varsIgnorePattern: "^_" },
-      ],
-      "@typescript-eslint/prefer-optional-chain": "off",
-
-      // --- eslint core ---
-      "no-console": "off",
-      "no-await-in-loop": "off",
-      "no-param-reassign": "off",
-      "no-plusplus": "off",
-      "no-negated-condition": "off",
-      "no-useless-assignment": "off",
-      "no-useless-concat": "off",
-      "prefer-named-capture-group": "off",
-      "prefer-regex-literals": "off",
-      "require-unicode-regexp": "off",
-      "require-atomic-updates": "off",
-      "logical-assignment-operators": "off",
-      "guard-for-in": "off",
-      "max-depth": "off",
-      "max-lines": "off",
-      complexity: "off",
-      eqeqeq: "off",
-      radix: "off",
-    },
-  },
-]
@@ -1,5 +1,5 @@
-import type { FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
-import type { Octokit } from "octokit"
+import { type FileDiffContent } from "@codeflash-ai/code-suggester/build/src/types.js"
+import { type Octokit } from "octokit"
 import { addLabelToPullRequest } from "./github-utils.js"
 import {
   buildBenchmarkInfo,
@@ -8,11 +8,9 @@ import {
   buildResultFooter,
   generateOptimizationReviewTemplate,
   originalPRComment,
-  buildResultHeader,
-  buildResultDetails,
-  buildResultTestReport,
 } from "./pr-changes-utils.js"
-import type { RestEndpointMethodTypes } from "@octokit/rest"
+import { buildResultHeader, buildResultDetails, buildResultTestReport } from "./pr-changes-utils.js"
 import { AnyOctokit, PullRequestCreationResponse } from "../types.js"
 import * as Sentry from "@sentry/node"

@@ -193,7 +191,7 @@ export async function createNewBranchFromDiffContents(
     return result.status === 200
   } catch (error) {
     console.error("Error creating branch from diff contents:", error)
-    Sentry.captureException(`Failed to create branch: ${error.message}`, {
+    Sentry.captureException("Failed to create branch: " + error.message, {
       extra: { owner, repo, newBranchName, baseBranch, commitMessage, diffContentsMap },
     })
     return false
@@ -488,7 +486,9 @@ function createDependentPRTitleAndBody(
 If you approve this dependent PR, these changes will be merged into the original PR branch \`${baseBranch}\`.
 >This PR will be automatically closed if the original PR is merged.\n` + `----\n`
   let optReviewBadge = generateOptimizationReviewTemplate(optimizationReview)
-  optReviewBadge &&= ` ${optReviewBadge}\n`
+  if (optReviewBadge) {
+    optReviewBadge = ` ${optReviewBadge}\n`
+  }
   // Conditionally build the body based on whether benchmark info exists
   const body = benchmarkInfo
     ? `${introSection}${prCommentHeader}\n${benchmarkInfo}\n${prCommentBody}\n${prCommentTestReport}\n${prCommentFooter}${optReviewBadge}`
@@ -1,6 +1,6 @@
 import { App } from "octokit"
 import { createNodeMiddleware } from "@octokit/webhooks"
-import fs from "node:fs"
+import fs from "fs"
 import {
   getGithubAppPrivateKey,
   getGithubAppWebhookSecret,
@@ -78,12 +78,8 @@ export const githubApp = await (async () => {

   if (!GH_APP_ID || GH_APP_ID === "" || process.env.NODE_ENV === "test") {
     logger.warn("GitHub App not configured (GH_APP_ID missing)", { operation: "server_startup" })
-    logger.warn("PR creation and GitHub webhook features are disabled", {
-      operation: "server_startup",
-    })
-    logger.info("CLI and optimization features will continue to work", {
-      operation: "server_startup",
-    })
+    logger.warn("PR creation and GitHub webhook features are disabled", { operation: "server_startup" })
+    logger.info("CLI and optimization features will continue to work", { operation: "server_startup" })

     // Return a minimal mock that won't fail
     return {
@@ -105,9 +101,7 @@ export const githubApp = await (async () => {
   }

   // In other environments, initialize normally
-  logger.info(`GitHub App ID ${GH_APP_ID} detected, initializing...`, {
-    operation: "server_startup",
-  })
+  logger.info(`GitHub App ID ${GH_APP_ID} detected, initializing...`, { operation: "server_startup" })
   const app = await initializeApp()

   logger.info("GitHub App initialized", { operation: "server_startup" })
@@ -118,15 +112,11 @@ export const githubApp = await (async () => {

   app.webhooks.onAny(async ({ id, name, payload }) => {
     // Only log event type and ID, not full payload (too verbose)
-    logger.info(
-      "GitHub App: Received webhook event",
-      {
-        operation: "webhook_received",
-        repoOwner: (payload as any)?.repository?.owner?.login,
-        repoName: (payload as any)?.repository?.name,
-      },
-      { eventType: name, eventId: id },
-    )
+    logger.info("GitHub App: Received webhook event", {
+      operation: "webhook_received",
+      repoOwner: (payload as any)?.repository?.owner?.login,
+      repoName: (payload as any)?.repository?.name,
+    }, { eventType: name, eventId: id })
     posthog?.capture({
       distinctId: `github|${payload.sender?.id}`,
       event: `cfapi-github-webhook-received`,
@@ -147,10 +137,7 @@ export const githubApp = await (async () => {
         : account && "slug" in account
           ? account.slug
           : "unknown"
-    logger.info(
-      `Received installation event: installation_id=${payload.installation.id}, account=${accountName}`,
-      webhookContext(payload, "installation"),
-    )
+    logger.info(`Received installation event: installation_id=${payload.installation.id}, account=${accountName}`, webhookContext(payload, "installation"))
     // Create an installation access token
     const installationAccessToken = await octokit.rest.apps.createInstallationAccessToken({
       installation_id: payload.installation.id,
@@ -159,17 +146,11 @@ export const githubApp = await (async () => {
   })

   app.webhooks.on("pull_request.opened", async ({ octokit, payload }) => {
-    logger.info(
-      `Received pull_request.opened event: PR #${payload.pull_request?.number} in ${payload.repository?.full_name}`,
-      webhookContext(payload, "pull_request_opened"),
-    )
+    logger.info(`Received pull_request.opened event: PR #${payload.pull_request?.number} in ${payload.repository?.full_name}`, webhookContext(payload, "pull_request_opened"))
   })

   app.webhooks.on("pull_request.edited", async ({ octokit, payload }) => {
-    logger.info(
-      `Received pull_request.edited event: PR #${payload.pull_request?.number} in ${payload.repository?.full_name}`,
-      webhookContext(payload, "pull_request_edited"),
-    )
+    logger.info(`Received pull_request.edited event: PR #${payload.pull_request?.number} in ${payload.repository?.full_name}`, webhookContext(payload, "pull_request_edited"))
   })

   app.webhooks.on("pull_request.closed", async ({ octokit, payload }) => {
@@ -196,22 +177,11 @@ export const githubApp = await (async () => {
           })
         }

-        logger.info(
-          `Updated optimization_event for PR ID ${prId} to ${payload.pull_request.merged ? "pr_merged" : "pr_closed"} and removed line profiler data`,
-          webhookContext(payload, "pull_request_closed"),
-        )
+        logger.info(`Updated optimization_event for PR ID ${prId} to ${payload.pull_request.merged ? "pr_merged" : "pr_closed"} and removed line profiler data`, webhookContext(payload, "pull_request_closed"))
       } catch (err) {
-        logger.error(
-          `Failed to update optimization_event for PR ID ${prId}:`,
-          webhookContext(payload, "pull_request_closed"),
-          {},
-          err as Error,
-        )
+        logger.error(`Failed to update optimization_event for PR ID ${prId}:`, webhookContext(payload, "pull_request_closed"), {}, err as Error)
       }
-      logger.info(
-        `Received pull_request.closed event: PR #${payload.pull_request.number} by ${payload.pull_request.user.login} was closed`,
-        webhookContext(payload, "pull_request_closed"),
-      )
+      logger.info(`Received pull_request.closed event: PR #${payload.pull_request.number} by ${payload.pull_request.user.login} was closed`, webhookContext(payload, "pull_request_closed"))

       // Check if the PR was merged and is a PR created by Codeflash
       const is_user_code_flash = payload.pull_request.user.id === APP_USER_ID
@@ -249,10 +219,7 @@ export const githubApp = await (async () => {
               mergedBy: payload.pull_request.merged_by?.login,
             },
           })
-          logger.info(
-            `Commented on original PR #${originalPrNumber} and logged the event to PostHog`,
-            webhookContext(payload, "dependent_pr_merged"),
-          )
+          logger.info(`Commented on original PR #${originalPrNumber} and logged the event to PostHog`, webhookContext(payload, "dependent_pr_merged"))
         } else if (standalonePrMatch != null) {
           posthog?.capture({
             distinctId: `github|${payload.sender.id}`,
@@ -265,19 +232,11 @@ export const githubApp = await (async () => {
               mergedBy: payload.pull_request.merged_by?.login,
             },
           })
-          logger.info(
-            `Logged standalone PR #${payload.pull_request.number} merge event to PostHog`,
-            webhookContext(payload, "standalone_pr_merged"),
-          )
+          logger.info(`Logged standalone PR #${payload.pull_request.number} merge event to PostHog`, webhookContext(payload, "standalone_pr_merged"))
         }
       }
     } catch (mergedPrError) {
-      logger.errorWithSentry(
-        "Failed to process merged PR comment/analytics",
-        webhookContext(payload, "pull_request_closed"),
-        {},
-        mergedPrError as Error,
-      )
+      logger.errorWithSentry("Failed to process merged PR comment/analytics", webhookContext(payload, "pull_request_closed"), {}, mergedPrError as Error)
     }

     // Close any open optimization PRs targeting the branch of the closed PR
@@ -290,10 +249,7 @@ export const githubApp = await (async () => {
       APP_USER_ID,
     })
     if (payload.installation === undefined) {
-      logger.error(
-        "Installation ID is missing from payload. Cannot close PRs for this installation!",
-        closeCtx,
-      )
+      logger.error("Installation ID is missing from payload. Cannot close PRs for this installation!", closeCtx)
       return
     }
     try {
@@ -305,17 +261,11 @@ export const githubApp = await (async () => {
         base: closedPrBranch,
       })

-      logger.info(
-        `Found ${openPrs.data.length} open PRs targeting branch ${closedPrBranch}`,
-        closeCtx,
-        {
-          openPrCount: openPrs.data.length,
-          openPrNumbers: openPrs.data.map(pr => pr.number).join(","),
-          openPrUsers: openPrs.data
-            .map(pr => `#${pr.number}:${pr.user?.login}(id=${pr.user?.id},type=${pr.user?.type})`)
-            .join(","),
-        },
-      )
+      logger.info(`Found ${openPrs.data.length} open PRs targeting branch ${closedPrBranch}`, closeCtx, {
+        openPrCount: openPrs.data.length,
+        openPrNumbers: openPrs.data.map(pr => pr.number).join(","),
+        openPrUsers: openPrs.data.map(pr => `#${pr.number}:${pr.user?.login}(id=${pr.user?.id},type=${pr.user?.type})`).join(","),
+      })

       for (const pr of openPrs.data) {
         // Check if the PR is opened by the Codeflash GitHub App and targets the same base branch as the closed PR
@@ -330,14 +280,8 @@ export const githubApp = await (async () => {
             pull_number: pr.number,
             state: "closed",
           })
-          logger.info(
-            `Closed optimization PR #${pr.number} targeting branch '${closedPrBranch}' because original PR #${payload.pull_request.number} by ${payload.pull_request.user.login} was closed`,
-            webhookContext(payload, "close_dependent_prs"),
-          )
-          logger.info(
-            "Posting pull request comment...",
-            webhookContext(payload, "close_dependent_prs"),
-          )
+          logger.info(`Closed optimization PR #${pr.number} targeting branch '${closedPrBranch}' because original PR #${payload.pull_request.number} by ${payload.pull_request.user.login} was closed`, webhookContext(payload, "close_dependent_prs"))
+          logger.info("Posting pull request comment...", webhookContext(payload, "close_dependent_prs"))
           await octokit.rest.issues.createComment({
             owner: payload.repository.owner.login,
             repo: payload.repository.name,
@@ -358,12 +302,7 @@ export const githubApp = await (async () => {
await deleteBranchIfExists(installationOctokit, payload, closedPrBranch)
}
} catch (error) {
logger.errorWithSentry(
`Failed to close optimization PRs targeting branch ${closedPrBranch}`,
webhookContext(payload, "close_dependent_prs"),
{},
error as Error,
)
logger.errorWithSentry(`Failed to close optimization PRs targeting branch ${closedPrBranch}`, webhookContext(payload, "close_dependent_prs"), {}, error as Error)
}
}
})
@@ -377,25 +316,16 @@ export const githubApp = await (async () => {
: account && "slug" in account
? account.slug
: "unknown"
logger.info(
`Received installation.created event: installation_id=${payload.installation.id}, account=${accountName}`,
webhookContext(payload, "installation_created"),
)
logger.info(`Received installation.created event: installation_id=${payload.installation.id}, account=${accountName}`, webhookContext(payload, "installation_created"))
})

app.webhooks.on("installation_repositories.added", async ({ octokit, payload }) => {
const repoCount = payload.repositories_added?.length || 0
logger.info(
`Received installation_repositories.added event: installation_id=${payload.installation.id}, repositories_added=${repoCount}`,
webhookContext(payload, "installation_repositories_added"),
)
logger.info(`Received installation_repositories.added event: installation_id=${payload.installation.id}, repositories_added=${repoCount}`, webhookContext(payload, "installation_repositories_added"))
})

app.webhooks.on("marketplace_purchase", async ({ id, name, payload }) => {
logger.info(
`Received marketplace purchase event: ${name} (${id})`,
webhookContext(payload, "marketplace_purchase"),
)
logger.info(`Received marketplace purchase event: ${name} (${id})`, webhookContext(payload, "marketplace_purchase"))
posthog?.capture({
distinctId: `github|${payload.sender.id}`,
event: `cfapi-github-marketplace-purchase`,
@@ -408,10 +338,7 @@ export const githubApp = await (async () => {

app.webhooks.on("pull_request.synchronize", async ({ octokit, payload }) => {
if (payload.pull_request) {
logger.info(
`Received pull_request.synchronize event: PR #${payload.pull_request.number} by ${payload.pull_request?.user?.login} was updated with new commits`,
webhookContext(payload, "pull_request_synchronize"),
)
logger.info(`Received pull_request.synchronize event: PR #${payload.pull_request.number} by ${payload.pull_request?.user?.login} was updated with new commits`, webhookContext(payload, "pull_request_synchronize"))
// Retrieve the list of commits for the pull request
const commits = await octokit.rest.pulls.listCommits({
owner: payload.repository.owner.login,
@@ -437,10 +364,7 @@ export const githubApp = await (async () => {
author: latestCommit.commit.author?.name,
},
})
logger.info(
`Logged co-authored commit to PostHog: ${latestCommit.sha}`,
webhookContext(payload, "pull_request_synchronize"),
)
logger.info(`Logged co-authored commit to PostHog: ${latestCommit.sha}`, webhookContext(payload, "pull_request_synchronize"))

// should not be null, but check anyway
const authorname = latestCommit.commit.author?.name ?? "You"
@@ -451,10 +375,7 @@ export const githubApp = await (async () => {
issue_number: payload.pull_request.number,
body: `This PR is now faster! 🚀 ${authorname} accepted my code suggestion above.`,
})
logger.info(
`Commented on PR #${payload.pull_request.number} about the accepted review comment`,
webhookContext(payload, "pull_request_synchronize"),
)
logger.info(`Commented on PR #${payload.pull_request.number} about the accepted review comment`, webhookContext(payload, "pull_request_synchronize"))
}
}
})
@@ -489,24 +410,11 @@ export const githubApp = await (async () => {

const feedbackContent = mentionMatch[1].trim()
if (!feedbackContent) {
logger.info(`Empty feedback received from ${commentAuthor.login}, ignoring`, {
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
})
logger.info(`Empty feedback received from ${commentAuthor.login}, ignoring`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber })
return
}

logger.info(
`Received feedback (${commentType}) from ${commentAuthor.login} on PR #${prNumber}: "${feedbackContent.substring(0, 100)}..."`,
{
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
},
)
logger.info(`Received feedback (${commentType}) from ${commentAuthor.login} on PR #${prNumber}: "${feedbackContent.substring(0, 100)}..."`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber })

// Helper to add reaction based on comment type
const addReaction = async (content: "+1") => {
@@ -537,12 +445,7 @@ export const githubApp = await (async () => {

const prId = String(pr.data.id)
const prUrl = pr.data.html_url
logger.info(`Looking for optimization event with pr_id=${prId} or pr_url=${prUrl}`, {
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
})
logger.info(`Looking for optimization event with pr_id=${prId} or pr_url=${prUrl}`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber })

// Find optimization events by PR ID or by PR URL
const optimizationEvent = await prisma.optimization_events.findFirst({
@@ -563,28 +466,12 @@ export const githubApp = await (async () => {
})

if (!optimizationEvent) {
logger.info(
`No optimization event found for PR #${prNumber} in ${repository.full_name} (pr_id=${prId})`,
{
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
},
)
logger.info(`No optimization event found for PR #${prNumber} in ${repository.full_name} (pr_id=${prId})`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber })
await addReaction("+1")
return
}

logger.info(
`Found optimization event: id=${optimizationEvent.id}, trace_id=${optimizationEvent.trace_id}`,
{
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
},
)
logger.info(`Found optimization event: id=${optimizationEvent.id}, trace_id=${optimizationEvent.trace_id}`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber })

// Create or get the user
const user = await createOrUpdateUser(
@@ -606,28 +493,20 @@ export const githubApp = await (async () => {

await prisma.$transaction(async tx => {
// Lock the row with FOR UPDATE to prevent concurrent modifications
const [lockedEvent] = await tx.$queryRaw<Array<{ feedback: unknown[] }>>`
const [lockedEvent] = await tx.$queryRaw<{ feedback: unknown[] }[]>`
SELECT feedback FROM optimization_events WHERE id = ${optimizationEvent.id} FOR UPDATE
`
const existingFeedback = (lockedEvent.feedback as any[]) || []
const existingFeedback = (lockedEvent.feedback as Array<any>) || []

await tx.optimization_events.update({
where: { id: String(optimizationEvent.id) },
where: { id: optimizationEvent.id },
data: {
feedback: [...existingFeedback, newFeedbackEntry],
},
})
})

logger.info(
`Saved feedback from ${commentAuthor.login} for optimization event ${optimizationEvent.id} (trace_id: ${optimizationEvent.trace_id})`,
{
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
},
)
logger.info(`Saved feedback from ${commentAuthor.login} for optimization event ${optimizationEvent.id} (trace_id: ${optimizationEvent.trace_id})`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber })

// Log to PostHog
posthog?.capture({
@@ -652,32 +531,12 @@ export const githubApp = await (async () => {
// React with a thumbs up to acknowledge the feedback
await addReaction("+1")
} catch (error) {
logger.errorWithSentry(
`Failed to process feedback from ${commentAuthor.login}`,
{
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
},
{},
error as Error,
)
logger.errorWithSentry(`Failed to process feedback from ${commentAuthor.login}`, { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber }, {}, error as Error)

try {
await addReaction("+1")
} catch (reactionError) {
logger.error(
"Failed to add reaction:",
{
operation: "process_feedback",
repoOwner: repository.owner.login,
repoName: repository.name,
prNumber,
},
{},
reactionError as Error,
)
logger.error("Failed to add reaction:", { operation: "process_feedback", repoOwner: repository.owner.login, repoName: repository.name, prNumber }, {}, reactionError as Error)
}
}
}
@@ -720,50 +579,32 @@ export const githubApp = await (async () => {
if (error instanceof Error) {
// Check if it's an AggregateError, common for signature issues
if (error.name === "AggregateError" && Array.isArray((error as any).errors)) {
logger.error(
"AggregateError details (possible secret mismatch or multiple issues):",
errorContext,
)
logger.error("AggregateError details (possible secret mismatch or multiple issues):", errorContext)
;(error as any).errors.forEach((subError: Error, i: number) => {
logger.error(` Sub-error ${i + 1}: ${subError.message}`, errorContext)
})
} else if (error.message.includes("content length")) {
logger.error(
"Content length mismatch detected by Octokit. Payload may be truncated or header incorrect.",
errorContext,
)
logger.error("Content length mismatch detected by Octokit. Payload may be truncated or header incorrect.", errorContext)
const eventRequest = (error as any).event?.request
if (eventRequest?.headers) {
logger.error("Request headers from error.event:", errorContext, {
headers: JSON.stringify(eventRequest.headers, null, 2),
})
if (eventRequest && eventRequest.headers) {
logger.error("Request headers from error.event:", errorContext, { headers: JSON.stringify(eventRequest.headers, null, 2) })
}
}
// Log the full error structure for better debugging
logger.error("Full error object (onError):", errorContext, {
errorDetails: JSON.stringify(error, Object.getOwnPropertyNames(error), 2),
})
logger.error("Full error object (onError):", errorContext, { errorDetails: JSON.stringify(error, Object.getOwnPropertyNames(error), 2) })
} else {
logger.error("Full error (onError, non-Error instance):", errorContext, {
errorDetails: String(error),
})
logger.error("Full error (onError, non-Error instance):", errorContext, { errorDetails: String(error) })
}
Sentry.captureException(error)
})

app.webhooks.on("installation_repositories", async ({ payload }) => {
const repoCount = payload.repositories_added?.length || 0
logger.info(
`Received installation_repositories event: installation_id=${payload.installation?.id}, repositories_added=${repoCount}`,
webhookContext(payload, "installation_repositories"),
)
logger.info(`Received installation_repositories event: installation_id=${payload.installation?.id}, repositories_added=${repoCount}`, webhookContext(payload, "installation_repositories"))
const { repositories_added, installation, sender } = payload
// Check if required fields are missing
if (!repositories_added || !installation?.id) {
logger.warn(
"Missing repositories_added or installation.id",
webhookContext(payload, "installation_repositories"),
)
logger.warn("Missing repositories_added or installation.id", webhookContext(payload, "installation_repositories"))
return
}
const account = installation.account
@@ -786,10 +627,7 @@ export const githubApp = await (async () => {
}

if (!accountLogin) {
logger.error(
"Account login or slug not found",
webhookContext(payload, "installation_repositories"),
)
logger.error("Account login or slug not found", webhookContext(payload, "installation_repositories"))
return
}

@@ -804,10 +642,7 @@ export const githubApp = await (async () => {
account_login: accountLogin,
account_type: accountType,
})
logger.info(
`Installation created for ID: ${installation.id}`,
webhookContext(payload, "installation_repositories"),
)
logger.info(`Installation created for ID: ${installation.id}`, webhookContext(payload, "installation_repositories"))
}

// Process each repository in the list of added repositories
@@ -817,10 +652,7 @@ export const githubApp = await (async () => {
const githubUserId = sender?.id

if (githubUserId) {
logger.info(
`GitHub User ID: ${githubUserId} triggered the event`,
webhookContext(payload, "installation_repositories"),
)
logger.info(`GitHub User ID: ${githubUserId} triggered the event`, webhookContext(payload, "installation_repositories"))
// Fetch the user's role using the helper
// Use octokit from getInstallationOctokit for this installation
const installationOctokit = await app.getInstallationOctokit(installation.id)
@@ -831,10 +663,7 @@ export const githubApp = await (async () => {
username: sender.login,
isOrg: accountType === "Organization",
})
logger.info(
`Fetched user role: ${userRole}`,
webhookContext(payload, "installation_repositories"),
)
logger.info(`Fetched user role: ${userRole}`, webhookContext(payload, "installation_repositories"))
const user = await createOrUpdateUser(
`github|${githubUserId}`,
sender.login,
@@ -848,7 +677,7 @@ export const githubApp = await (async () => {
const existingOrg = await prisma.organizations.findUnique({
where: { github_org_id: ghOrgId },
})
orgId = existingOrg ? String(existingOrg.id) : undefined
orgId = existingOrg?.id
if (!existingOrg) {
const organization = await organizationRepository.upsertOrganization({
github_org_id: ghOrgId,
@@ -863,10 +692,7 @@ export const githubApp = await (async () => {
addedBy: user.user_id, // Indicates that this user was the first to be added . If user_id equals addedBy, it means this user installed GitHub App for this repository.
})

logger.info(
`Organization upserted: ${accountLogin}`,
webhookContext(payload, "installation_repositories"),
)
logger.info(`Organization upserted: ${accountLogin}`, webhookContext(payload, "installation_repositories"))
orgId = organization.id
}
}
@@ -880,28 +706,17 @@ export const githubApp = await (async () => {
organization_id: orgId,
})

logger.info(
`Repository upserted: ${savedRepo.full_name}`,
webhookContext(payload, "installation_repositories"),
)
logger.info(`Repository upserted: ${savedRepo.full_name}`, webhookContext(payload, "installation_repositories"))
await upsertRepositoryMember({
repository_id: savedRepo.id,
user_id: user.user_id,
role: userRole,
})
} else {
logger.error(
"GitHub User ID not found in sender",
webhookContext(payload, "installation_repositories"),
)
logger.error("GitHub User ID not found in sender", webhookContext(payload, "installation_repositories"))
}
} catch (error) {
logger.errorWithSentry(
`Failed to add/reactivate repository ${repo.full_name}`,
webhookContext(payload, "installation_repositories"),
{},
error as Error,
)
logger.errorWithSentry(`Failed to add/reactivate repository ${repo.full_name}`, webhookContext(payload, "installation_repositories"), {}, error as Error)
}
}
})
@@ -945,12 +760,7 @@ const deleteBranchIfExists = async (installationOctokit: any, payload: any, bran
if (error.status === 404) {
logger.info(`Branch '${branchName}' does not exist`, ctx)
} else {
logger.error(
`Error checking branch existence or deleting '${branchName}':`,
ctx,
{},
error as Error,
)
logger.error(`Error checking branch existence or deleting '${branchName}':`, ctx, {}, error as Error)
}
}
}
@@ -290,7 +290,7 @@ export async function getUserRole({
}

async function getInstallations(app: App) {
const installations: any[] = []
let installations: any[] = []
let page = 1

console.log("fetching installations...")
@@ -408,9 +408,9 @@ async function getReposForInstallation(installationOctokit: Octokit): Promise<an
async function getMembersWithRolesForOrg(
installationOctokit: Octokit,
orgLogin: string,
): Promise<Array<{ id: number; username: string; role: string }>> {
const members: Array<{ id: number; username: string; role: string }> = []
const memberData: Array<{ id: number; login: string }> = []
): Promise<{ id: number; username: string; role: string }[]> {
const members: { id: number; username: string; role: string }[] = []
const memberData: { id: number; login: string }[] = []
let page = 1

// ---- Fetch members (paginated) ----
@@ -462,8 +462,8 @@ export async function syncOrgsWithMembers(app: App, orgNames?: string[]) {
try {
const login = installation.account!.login
let repos: any[] = []
let members: Array<{ id: number; username: string; role: string }> = []
console.log(`fetch repos for ${login}`)
let members: { id: number; username: string; role: string }[] = []
console.log("fetch repos for " + login)

const installationOctokit = await app.getInstallationOctokit(installation.id)

@@ -487,7 +487,7 @@ export async function syncOrgsWithMembers(app: App, orgNames?: string[]) {
repos = await getReposForInstallation(installationOctokit)

console.log("Done... ")
console.log(`fetch members for ${login}`)
console.log("fetch members for " + login)

// --- Fetch all members with roles ---
members = await getMembersWithRolesForOrg(installationOctokit, login)
@@ -498,11 +498,13 @@
let organization = await organizationRepository.findByGithubOrgId(
String(installation.account!.id),
)
organization ||= await organizationRepository.create({
github_org_id: String(installation.account!.id),
name: login,
added_by: "Codeflash",
})
if (!organization) {
organization = await organizationRepository.create({
github_org_id: String(installation.account!.id),
name: login,
added_by: "Codeflash",
})
}

// Fetch existing members in organization from DB
const existingMembersInDb = await prisma.organization_members.findMany({
@@ -514,15 +516,9 @@ export async function syncOrgsWithMembers(app: App, orgNames?: string[]) {

// Remove members who no longer exist in the org
for (const existingMember of existingMembersInDb) {
if (!currentMemberIds.includes(String(existingMember.user_id))) {
await organizationMemberRepository.removeMember(
String(organization.id),
String(existingMember.user_id),
)
await deleteOrganizationMemberApiKeys(
String(existingMember.user_id),
String(organization.id),
)
if (!currentMemberIds.includes(existingMember.user_id)) {
await organizationMemberRepository.removeMember(organization.id, existingMember.user_id)
await deleteOrganizationMemberApiKeys(existingMember.user_id, organization.id)
}
}

@@ -1,4 +1,4 @@
import { prisma } from "@codeflash-ai/common"
import { PrismaClient } from "@prisma/client"
import { sendSlackMessage } from "./slack_util.js"
import {
requiresApproval,
@@ -11,6 +11,8 @@ import {
optimizationNotFound,
internalServerError,
} from "../exceptions/index.js"

const prisma = new PrismaClient()
const SLACK_CHANNEL = process.env.SLACK_APPROVAL_CHANNEL_ID || process.env.SLACK_CHANNEL_ID
const APPROVAL_EMOJI = getApprovalEmoji()
const REJECTION_EMOJI = getRejectionEmoji()
@@ -188,7 +190,7 @@ export async function sendQualityMonitoringNotification(
})

const message = {
blocks,
blocks: blocks,
text: `Quality Monitoring: ${prType} Applied for ${functionName} in ${owner}/${repo} (${traceId}). Speedup: ${prCommentFields.speedup_pct || "N/A"}. View details: ${traceViewUrl}${prUrl ? ` | PR: ${prUrl}` : ""}`,
}

@@ -335,7 +337,7 @@ export async function requestApproval(
})

const message = {
blocks,
blocks: blocks,
text: `${prType} Optimization Approval Request for ${functionName} in ${owner}/${repo} (${traceId}). Speedup: ${prCommentFields.speedup_pct || "N/A"}. View details: ${traceViewUrl}`,
}

@@ -457,7 +459,7 @@ export async function processReaction(event: any): Promise<boolean> {
// Process approval
if (reaction === APPROVAL_EMOJI) {
await prisma.optimization_features.update({
where: { trace_id: String(optimization.trace_id) },
where: { trace_id: optimization.trace_id },
data: {
approval_status: "approved",
approval_user: user,
@@ -525,7 +527,7 @@ export async function processReaction(event: any): Promise<boolean> {
installationOctokit,
requestData.replayTests,
requestData.concolicTests,
String(optimization.trace_id),
optimization.trace_id,
requestData.optimizationReview,
)
} else if (requestData.type === "suggest-pr-changes") {
@@ -566,7 +568,7 @@ export async function processReaction(event: any): Promise<boolean> {
installationOctokit,
requestData.replayTests,
requestData.concolicTests,
String(optimization.trace_id),
optimization.trace_id,
requestData.optimizationReview,
)
}
@@ -574,13 +576,12 @@ export async function processReaction(event: any): Promise<boolean> {
console.error(
`Error processing approved request for trace ${optimization.trace_id}: ${err}`,
)

// Extract helpful error details for Slack notification
const errorMessage = err.message || String(err)
const errorType = err.constructor?.name || "Error"
const isPrMergedOrClosed =
errorMessage.includes("merged") || errorMessage.includes("closed")

const isPrMergedOrClosed = errorMessage.includes("merged") || errorMessage.includes("closed")

const errorBlocks: any[] = [
{
type: "section",
@@ -597,7 +598,7 @@ export async function processReaction(event: any): Promise<boolean> {
},
},
]

// Add helpful context if PR is merged/closed
if (isPrMergedOrClosed) {
errorBlocks.push({
@@ -610,7 +611,7 @@ export async function processReaction(event: any): Promise<boolean> {
],
})
}

await sendSlackMessage(
{
blocks: errorBlocks,
@@ -618,7 +619,7 @@ export async function processReaction(event: any): Promise<boolean> {
},
channel,
)

// Return false to indicate the reaction processing failed
return false
}
@@ -630,7 +631,7 @@ export async function processReaction(event: any): Promise<boolean> {
// Process rejection
if (reaction === REJECTION_EMOJI) {
await prisma.optimization_features.update({
where: { trace_id: String(optimization.trace_id) },
where: { trace_id: optimization.trace_id },
data: {
approval_status: "rejected",
approval_user: user,
@@ -670,11 +671,7 @@ async function getUserNickname(userId: string): Promise<string | null> {
return await userNickname(userId)
}

async function getInstallationOctokit(
owner: string,
repo: string,
userId?: string,
): Promise<any | Error> {
async function getInstallationOctokit(owner: string, repo: string, userId?: string): Promise<any | Error> {
const { getInstallationOctokitByOwner } = await import("../github/github-utils.js")
const { githubApp } = await import("../github/github-app.js")
return await getInstallationOctokitByOwner(githubApp, owner, repo, userId)
@@ -1,6 +1,6 @@
import fs from "node:fs"
import path, { dirname } from "node:path"
import { fileURLToPath } from "node:url"
import fs from "fs"
import path, { dirname } from "path"
import { fileURLToPath } from "url"
import { PrCommentFields } from "./create-pr-from-diffcontents.js"
import { OptimizationReview } from "../OptimizationReview.js"

@@ -189,11 +189,11 @@ export function buildPrCommentBody(
? buildBenchmarkInfo(prCommentFields)
: ""
return (
`${buildOptimizationMetadata(prCommentFields, trace_id)}\n${
includeHeader ? `#### ⚡️ Codeflash found optimizations for this PR\n` : ""
}${buildResultHeader(prCommentFields, isUnifiedReview)}\n${
benchmarkInfo ? `${benchmarkInfo}\n` : ""
}${buildResultDetails(prCommentFields, isCollapsed)}\n` +
`${buildOptimizationMetadata(prCommentFields, trace_id)}\n` +
(includeHeader ? `#### ⚡️ Codeflash found optimizations for this PR\n` : "") +
`${buildResultHeader(prCommentFields, isUnifiedReview)}\n` +
(benchmarkInfo ? `${benchmarkInfo}\n` : "") +
`${buildResultDetails(prCommentFields, isCollapsed)}\n` +
`${buildResultTestReport(
prCommentFields,
existingTests,
@@ -208,7 +208,7 @@

export function buildMergeBranchMsg(newBranchName: string): string {
if (newBranchName?.length > 0) {
return `To test or edit this optimization locally ` + `\`git merge ${newBranchName}\`\n\n`
return "To test or edit this optimization locally " + "`git merge " + newBranchName + "`\n\n"
}
return ""
}
@@ -294,19 +294,21 @@ export function buildResultHeader(fields: PrCommentFields, isUnifiedReview?: boo

export function buildResultDetails(fields: PrCommentFields, isCollapsed: boolean = false): string {
return isCollapsed
? `${getPrDetailsTemplateCollapsed().replace(
? getPrDetailsTemplateCollapsed().replace(
/\{optimization_explanation}/g,
fields.optimization_explanation,
)}\n`
: `${getPrDetailsTemplate().replace(
) + "\n"
: getPrDetailsTemplate().replace(
/\{optimization_explanation}/g,
fields.optimization_explanation,
)}\n`
) + "\n"
}
export function buildResultFooter(newBranchName: string): string {
return (
`To edit these changes ` +
`\`git checkout ${newBranchName}\` and push.\n\n` +
"To edit these changes " +
"`git checkout " +
newBranchName +
"` and push.\n\n" +
`[](https://codeflash.ai)`
)
}
@@ -367,7 +369,7 @@ export function buildResultTestReport(
reportTableMd += `<details>\n`

// Extract emoji if present at the start, then format as "[emoji] Click to see [name]"
const emojiMatch = /^(\p{Emoji_Presentation}|\p{Emoji}\uFE0F?)/u.exec(testType)
const emojiMatch = testType.match(/^(\p{Emoji_Presentation}|\p{Emoji}\uFE0F?)/u)
if (emojiMatch) {
const emoji = emojiMatch[0]
const testName = testType.slice(emoji.length).trim()
@@ -391,7 +393,7 @@
// Check if generatedTests already contains backticks
if (!trimmedGeneratedTests.includes("`")) {
// Wrap in Python markdown block
reportTableMd += `\`\`\`python\n${trimmedGeneratedTests}\n\`\`\``
reportTableMd += "```python\n" + trimmedGeneratedTests + "\n```"
} else {
reportTableMd += trimmedGeneratedTests
}
@@ -405,7 +407,7 @@
}

// Add the final markdown content (e.g., the feedback section)
const finalMarkdown = reportTableMd
const finalMarkdown = `${reportTableMd}`

return getPrTestReportTemplate().replace(/\{report_table}/g, finalMarkdown)
}
@@ -413,7 +415,7 @@
// Enhanced parser that supports both metadata and legacy regex parsing
export function parseAndCreateOptimizationsDict(
prBody: string,
prComments: Array<{ body: string }>,
prComments: { body: string }[],
): Record<string, Set<string>> {
const optimizations: Record<string, Set<string>> = {}
const textsToParse = [prBody, ...prComments.map(comment => comment.body)]
@@ -431,7 +433,9 @@
const filePath = metadata.file

if (functionName && filePath) {
optimizations[filePath] ||= new Set()
if (!optimizations[filePath]) {
optimizations[filePath] = new Set()
}
optimizations[filePath].add(functionName)
}
} catch (e) {
@ -446,7 +450,9 @@ export function parseAndCreateOptimizationsDict(
|
|||
const filePath = legacyMatch[4]
|
||||
|
||||
if (functionName && filePath) {
|
||||
optimizations[filePath] ||= new Set()
|
||||
if (!optimizations[filePath]) {
|
||||
optimizations[filePath] = new Set()
|
||||
}
|
||||
optimizations[filePath].add(functionName)
|
||||
}
|
||||
}
|
||||
|
|
@ -458,7 +464,7 @@ export function parseAndCreateOptimizationsDict(
|
|||
// Helper function to extract rich metadata from comments (future use)
|
||||
export function parseOptimizationMetadata(
|
||||
prBody: string,
|
||||
prComments: Array<{ body: string }>,
|
||||
prComments: { body: string }[],
|
||||
): Array<{
|
||||
function: string
|
||||
file: string
|
||||
|
|
@ -496,7 +502,9 @@ export function buildDependentPrTitle(
|
|||
pullNumber: number,
|
||||
baseBranch: string,
|
||||
): string {
|
||||
return `${buildPrTitle(functionName, speedupPct, speedupX)} in PR #${pullNumber} (\`${baseBranch}\`)`
|
||||
return (
|
||||
buildPrTitle(functionName, speedupPct, speedupX) + ` in PR #${pullNumber} (\`${baseBranch}\`)`
|
||||
)
|
||||
}
|
||||
|
||||
export function buildPrTitle(functionName: string, speedupPct: string, speedupX: string): string {
|
||||
|
|
@ -518,19 +526,23 @@ export function originalPRComment(
|
|||
): string {
|
||||
const prCommentHeader = buildResultHeader(prCommentFields)
|
||||
let optReviewBadge = generateOptimizationReviewTemplate(optimizationReview)
|
||||
optReviewBadge &&= `\n\n${optReviewBadge}\n`
|
||||
if (optReviewBadge) {
|
||||
optReviewBadge = `\n\n${optReviewBadge}\n`
|
||||
}
|
||||
const isMediumReview = optimizationReview === OptimizationReview.MEDIUM
|
||||
const reviewSection = isMediumReview
|
||||
? `#### A new Optimization Review has been created.\n\n🔗 [Review here](https://app.codeflash.ai/review-optimizations/${newPrNumber})`
|
||||
: `#### A dependent PR with the suggested changes has been created. Please review:\n\n- ### #${newPrNumber}`
|
||||
return `
|
||||
return (
|
||||
`
|
||||
#### ⚡️ Codeflash found optimizations for this PR
|
||||
${prCommentHeader}
|
||||
${reviewSection}
|
||||
${
|
||||
!isMediumReview
|
||||
? `If you approve, it will be merged into this PR (branch \`${baseBranch}\`).
|
||||
` +
|
||||
(!isMediumReview
|
||||
? `If you approve, it will be merged into this PR (branch \`${baseBranch}\`).
|
||||
`
|
||||
: ""
|
||||
}${optReviewBadge}`
|
||||
: "") +
|
||||
optReviewBadge
|
||||
)
|
||||
}
|
||||
|
|
|
|||
|
|
@@ -45,7 +45,9 @@ export function initializeWebClient() {
     throw new Error("Missing SLACK_CHANNEL_ID")
   }
 
-  web ||= new dependencies.WebClient(SLACK_TOKEN, {})
+  if (!web) {
+    web = new dependencies.WebClient(SLACK_TOKEN, {})
+  }
 
   return web
 }
@@ -67,8 +69,8 @@ export const sendSlackMessage = async (
   message: any,
   channel: string | null = null,
   returnData: boolean = false,
-): Promise<boolean | object> =>
-  await new Promise(async (resolve, reject) => {
+): Promise<boolean | object> => {
+  return new Promise(async (resolve, reject) => {
     try {
       const webClient = initializeWebClient()
       const SLACK_CHANNEL_ID = dependencies.getSlackChannelId()
@@ -107,9 +109,10 @@ export const sendSlackMessage = async (
       // console.log("Sending payload to Slack:", JSON.stringify(payload, null, 2));
 
       const resp = await webClient.chat.postMessage(payload)
-      resolve(returnData ? resp : true)
+      return resolve(returnData ? resp : true)
     } catch (error) {
       dependencies.console.error("Error sending Slack message:", error)
-      resolve(returnData ? { error } : true)
+      return resolve(returnData ? { error } : true)
     }
   })
+}
@@ -41,8 +41,8 @@ Sentry.init({
   beforeSend(event, hint) {
     // Remove sensitive headers
     if (event.request?.headers) {
-      delete event.request.headers.authorization
-      delete event.request.headers.cookie
+      delete event.request.headers["authorization"]
+      delete event.request.headers["cookie"]
       delete event.request.headers["x-api-key"]
     }
 
@@ -11,18 +11,14 @@ process.env.NODE_ENV = "test"
 // Note: Jest moduleNameMapper strips .js extensions, so this should match the import
 // @ts-ignore
 jest.mock("./endpoints/utils/github-repo-setup", () => ({
-  registerRepositoryAndMember: jest
-    .fn()
-    .mockImplementation(async () => await Promise.resolve(12345)),
-  getInstallationId: jest.fn().mockImplementation(async () => await Promise.resolve(12345)),
+  registerRepositoryAndMember: jest.fn().mockImplementation(() => Promise.resolve(12345)),
+  getInstallationId: jest.fn().mockImplementation(() => Promise.resolve(12345)),
 }))
 
 // Also mock the direct import paths that might be used
 jest.mock("./endpoints/utils/github-repo-setup.js", () => ({
-  registerRepositoryAndMember: jest
-    .fn()
-    .mockImplementation(async () => await Promise.resolve(12345)),
-  getInstallationId: jest.fn().mockImplementation(async () => await Promise.resolve(12345)),
+  registerRepositoryAndMember: jest.fn().mockImplementation(() => Promise.resolve(12345)),
+  getInstallationId: jest.fn().mockImplementation(() => Promise.resolve(12345)),
 }))
 
 // Set environment variable to disable Prisma in tests
@@ -1,6 +1,7 @@
 import { posthog } from "../analytics.js"
 import { AuthorizedUserReq } from "../types.js"
-import { NextFunction, Response } from "express"
+import { NextFunction } from "express"
+import { Response } from "express"
 import { AuthStrategyFactory } from "./Auth/auth-strategy-factory.js"
 import { logger } from "../utils/logger.js"
 import {
@@ -38,16 +39,17 @@ export async function checkForValidAPIKey(
       },
       disableGeoip: false,
     })
-    next(missingAuthorizationHeader({ requestId: req.requestId, endpoint: req.path }))
-    return
+    return next(missingAuthorizationHeader({ requestId: req.requestId, endpoint: req.path }))
   }
 
   // Optimized Bearer token extraction - avoid regex overhead
-  const apiKey = authHeader.startsWith("Bearer ") ? authHeader.substring(7) : authHeader
+  const apiKey = authHeader.startsWith("Bearer ")
+    ? authHeader.substring(7)
+    : authHeader
 
   try {
     const authResult = await AuthStrategyFactory.getStrategy(apiKey).authenticate()
-    if (authResult?.userId == null) {
+    if (authResult == null || authResult.userId == null) {
       console.log(`User Id null for API key ${apiKey}. Returning 403`)
       posthog?.capture({
         distinctId: "null-user-with-invalid-api-key",
@@ -92,11 +94,6 @@ export async function checkForValidAPIKey(
       error as Error,
     )
 
-    next(
-      internalServerError("Authentication service error", {
-        requestId: req.requestId,
-        endpoint: req.path,
-      }),
-    )
+    return next(internalServerError("Authentication service error", { requestId: req.requestId, endpoint: req.path }))
   }
 }
@@ -228,16 +228,18 @@ export function addUserContext(req: Request, res: Response, next: NextFunction):
 
   if (userId || username || userEmail) {
     // Enhance request logger with user context
-    req.requestLogger &&= req.requestLogger.child({
-      userId,
-      username,
-      userEmail,
-    })
+    if (req.requestLogger) {
+      req.requestLogger = req.requestLogger.child({
+        userId,
+        username,
+        userEmail,
+      })
+    }
 
     // Add to Sentry
     Sentry.setUser({
       id: userId,
-      username,
+      username: username,
       email: userEmail,
     })
 
@@ -1,6 +1,6 @@
 import rateLimit from "express-rate-limit"
 import * as Sentry from "@sentry/node"
-import { AuthorizedUserReq } from "../types.js"
+import { AuthorizedUserReq } from "types.js"
 import { isCodeflashEmployee } from "../utils/employee-utils.js"
 
 // Load values from environment or use defaults
@@ -35,11 +35,13 @@ export const idLimiter = rateLimit({
   ...baseRateLimitConfig,
   skip: (req: AuthorizedUserReq) => {
     // Skip if no userId is set — typically means checkForValidAPIKey hasn't run yet
-    if (!req.userId) return true
+    if (!req.userId) return true;
 
-    if (isCodeflashEmployee(req.userId)) return true
+    if (isCodeflashEmployee(req.userId)) return true;
 
-    return false
+    return false;
   },
-  keyGenerator: (req: AuthorizedUserReq) => {
-    return `ratelimit:user:${req.userId}:${req.path}`
-  },
+  keyGenerator: (req: AuthorizedUserReq) => `ratelimit:user:${req.userId}:${req.path}`,
 })
@@ -2,7 +2,11 @@ import { Response, NextFunction } from "express"
 import { prisma, checkAndResetSubscriptionPeriod, SUBSCRIPTION_PLANS } from "@codeflash-ai/common"
 import { AuthorizedUserReq } from "../types.js"
 import { logger } from "../utils/logger.js"
-import { missingUserId, subscriptionInactive, internalServerError } from "../exceptions/index.js"
+import {
+  missingUserId,
+  subscriptionInactive,
+  internalServerError,
+} from "../exceptions/index.js"
 
 export async function trackUsage(req: AuthorizedUserReq, res: Response, next: NextFunction) {
   const userId = req.userId
@@ -17,8 +21,7 @@ export async function trackUsage(req: AuthorizedUserReq, res: Response, next: Ne
       operation: "usage_tracking",
     })
 
-    next(missingUserId({ requestId: req.requestId, endpoint: req.path }))
-    return
+    return next(missingUserId({ requestId: req.requestId, endpoint: req.path }))
   }
 
   try {
@@ -60,11 +63,11 @@ export async function trackUsage(req: AuthorizedUserReq, res: Response, next: Ne
       })
 
       // Add subscription info to request for later use
-      req.subscriptionInfo = {
-        userId,
-        tier: String(newSubscription.plan_type),
-        used: Number(newSubscription.optimizations_used),
-        limit: Number(newSubscription.optimizations_limit),
+      req["subscriptionInfo"] = {
+        userId: userId,
+        tier: newSubscription.plan_type,
+        used: newSubscription.optimizations_used,
+        limit: newSubscription.optimizations_limit,
       }
 
       // Log subscription creation success - logger handles environment filtering automatically
@@ -79,8 +82,7 @@ export async function trackUsage(req: AuthorizedUserReq, res: Response, next: Ne
         limit: newSubscription.optimizations_limit,
       })
 
-      next()
-      return
+      return next()
     }
 
     // Check subscription status and limits
@@ -96,8 +98,7 @@ export async function trackUsage(req: AuthorizedUserReq, res: Response, next: Ne
        status: subscription.subscription_status,
      })
 
-      next(subscriptionInactive({ requestId: req.requestId, userId, endpoint: req.path }))
-      return
+      return next(subscriptionInactive({ requestId: req.requestId, userId, endpoint: req.path }))
    }
 
    // Check if we need to reset monthly usage (lazy reset)
@@ -105,11 +106,11 @@ export async function trackUsage(req: AuthorizedUserReq, res: Response, next: Ne
     const currentOptimizationsUsed = currentSubscription?.optimizations_used || 0
 
     // Add subscription info to request for later use
-    req.subscriptionInfo = {
-      userId,
-      tier: String(subscription.plan_type),
+    req["subscriptionInfo"] = {
+      userId: userId,
+      tier: subscription.plan_type,
       used: currentOptimizationsUsed,
-      limit: Number(subscription.optimizations_limit),
+      limit: subscription.optimizations_limit,
     }
 
     // Log usage tracking completion - logger handles environment filtering automatically
@@ -142,12 +143,6 @@ export async function trackUsage(req: AuthorizedUserReq, res: Response, next: Ne
       error as Error,
     )
 
-    next(
-      internalServerError("Error tracking usage", {
-        requestId: req.requestId,
-        userId,
-        endpoint: req.path,
-      }),
-    )
+    return next(internalServerError("Error tracking usage", { requestId: req.requestId, userId, endpoint: req.path }))
   }
 }
js/cf-api/package-lock.json (generated, 13166 lines): diff suppressed because it is too large
@@ -6,17 +6,17 @@
   "scripts": {
     "npx": "npx",
     "copy-md": "copyfiles -u 0 \"github/*.md\" dist",
-    "copy-configs": "copyfiles -e \"node_modules/**\" -u 0 \"**/*.json\" \"**/*.pem\" \"**/*.txt\" dist",
-    "copy-assets": "pnpm run copy-md && pnpm run copy-configs",
-    "build": "pnpm install && prisma generate && tsc && pnpm run copy-assets",
+    "copy-configs": "copyfiles -u 0 \"**/*.json\" \"**/*.pem\" \"**/*.txt\" dist",
+    "copy-assets": "npm run copy-md && npm run copy-configs",
+    "build": "npm install --loglevel verbose && npx prisma generate && tsc && npm run copy-assets",
     "deploy": "az webapp up -n codeflash-api --sku P1V2 --runtime NODE:20-lts --verbose",
-    "dev": "prisma generate && tsx index.ts",
+    "dev": "npx prisma generate && npx tsx index.ts",
     "start": "node dist/index.js",
-    "prisma:generate": "cd ../common && prisma generate",
-    "prisma:migrate": "cd ../common && prisma migrate dev",
+    "prisma:generate": "cd ../common && npx prisma generate",
+    "prisma:migrate": "cd ../common && npx prisma migrate dev",
     "test": "NODE_OPTIONS=--experimental-vm-modules jest",
     "test:watch": "NODE_OPTIONS=--experimental-vm-modules jest --watch",
-    "lint": "eslint './*.ts' './endpoints/**/*.ts' './config/**/*.ts' './github/**/*.ts' './middlewares/**/*.ts' './scripts/**/*.ts'",
+    "lint": "eslint './*.ts' './endpoints/**/*.ts' './config/**/*.ts' './github/**/*.ts' './middlewares/**/*.ts' './scripts/**/*.ts' --ext .ts",
     "type-check": "tsc --noEmit",
     "prepare": "simple-git-hooks",
     "format": "prettier --write \"**/*.{js,ts,tsx,json,md}\"",
@@ -28,7 +28,7 @@
     "@azure/keyvault-keys": "^4.10.0",
     "@azure/keyvault-secrets": "^4.11.1",
     "@codeflash-ai/code-suggester": "^5.0.4",
-    "@codeflash-ai/common": "workspace:*",
+    "@codeflash-ai/common": "^1.0.31",
     "@octokit/app": "^16.1.2",
     "@octokit/auth-app": "^8.2.0",
     "@octokit/core": "^7.0.6",
@@ -37,7 +37,7 @@
     "@octokit/webhooks": "^14.2.0",
     "@opentelemetry/api": "^1.9.1",
     "@opentelemetry/context-async-hooks": "^2.6.1",
-    "@prisma/client": "^7.7.0",
+    "@prisma/client": "^6.19.3",
     "@sentry/node": "^10.48.0",
     "@sentry/opentelemetry": "^10.48.0",
     "@sentry/profiling-node": "^10.48.0",
@@ -66,15 +66,15 @@
     "@types/jest": "^29.5.14",
     "@types/supertest": "^7.2.0",
     "copyfiles": "^2.4.1",
-    "eslint": "^9.39.4",
-    "eslint-config-love": "^152.0.0",
+    "eslint": "^8.57.1",
     "eslint-config-prettier": "^10.1.8",
+    "eslint-config-standard-with-typescript": "^43.0.1",
+    "eslint-plugin-import": "^2.29.0",
+    "eslint-plugin-promise": "^6.1.1",
     "jest": "^29.7.0",
     "lint-staged": "^16.4.0",
     "prettier": "^3.8.2",
-    "prisma": "^7.7.0",
+    "prisma": "^6.19.3",
     "supertest": "^7.2.2",
     "ts-jest": "^29.4.9",
     "ts-node": "^10.9.2"
@@ -1,6 +0,0 @@
-import path from "node:path"
-import { defineConfig } from "prisma/config"
-
-export default defineConfig({
-  schema: path.join(__dirname, "../common/prisma/schema.prisma"),
-})
@@ -1,6 +1,6 @@
 import dotenv from "dotenv"
-import console from "node:console"
-import fs from "node:fs"
+import console from "console"
+import fs from "fs"
 
 if (fs.existsSync(".env.local")) {
   console.log("Using .env.local file to supply config environment variables")
@@ -1,4 +1,4 @@
-import fs from "node:fs"
+import fs from "fs"
 import { AnyOctokit } from "./types.js"
 
 const APP_ID: string = process.env.APP_ID || "" // Replace with your GitHub App ID
@@ -38,7 +38,7 @@ jobs:
     repo: repoName,
     path: ".github/workflows/optimize.yml",
     message: "Setup Code Optimization action",
-    content,
+    content: content,
   })
 }
 
@@ -151,7 +151,7 @@ export class GitBranchStagingStrategy extends StagingStorageStrategy {
     }
 
     const installationOctokit = await dependencies.getInstallationOctokit(
-      Number(repository.installation_id),
+      repository.installation_id,
     )
 
     const nickname = await dependencies.userNickname(userId)
@@ -10,21 +10,24 @@
     "strictNullChecks": false,
     "sourceMap": true,
     "target": "es2022",
-    "types": ["node", "express"],
+    "types": ["node", "express", "jest", "@types/jest"],
     "outDir": "dist",
     "rootDir": ".",
     "baseUrl": ".",
     "skipLibCheck": true,
     "paths": {},
     "resolveJsonModule": true,
     "allowJs": true
   },
-  "include": ["src/**/*", "**/*.ts", "*.ts", "**/*.md", "**/*.json", "**/*.pem", "**/*.txt"],
-  "exclude": [
-    "node_modules",
-    "dist",
-    "**/*.test.ts",
-    "*.test.ts",
-    "**/*.spec.ts",
-    "**/__tests__/*",
-    "jest.setup.ts"
-  ]
+  "include": [
+    "src/**/*",
+    "**/*.ts",
+    "*.ts",
+    "types.d.ts",
+    "**/*.md",
+    "**/*.json",
+    "**/*.pem",
+    "**/*.txt"
+  ],
+  "exclude": ["node_modules", "dist", "**/*.test.ts", "*.test.ts", "**/*.spec.ts", "**/__tests__/*"]
 }
js/cf-api/types.ts → js/cf-api/types.d.ts (vendored, 7 lines changed)
@@ -36,9 +36,9 @@ export interface PullRequestDB {
 
 // Complete AsyncExpressApp interface
 export interface AsyncExpressApp {
-  post: ((path: string, handler: any) => AsyncExpressApp) &
-    ((path: string, middleware: any, handler: any) => AsyncExpressApp) &
-    ((path: string, ...handlers: any[]) => AsyncExpressApp)
+  post(path: string, handler: any): AsyncExpressApp
+  post(path: string, middleware: any, handler: any): AsyncExpressApp
+  post(path: string, ...handlers: any[]): AsyncExpressApp
 
   // Async methods
   postAsync: (path: string, handler: (req: any, res: any, next?: any) => Promise<any>) => void
@@ -47,6 +47,7 @@ export interface AsyncExpressApp {
   // Standard Express methods
   use: (pathOrMiddleware: any, middleware?: any) => AsyncExpressApp
   get: (path: string, handler: (req: any, res: any, next?: any) => any) => AsyncExpressApp
+  post: (path: string, handler: (req: any, res: any, next?: any) => any) => AsyncExpressApp
   put: (path: string, handler: (req: any, res: any, next?: any) => any) => AsyncExpressApp
   delete: (path: string, handler: (req: any, res: any, next?: any) => any) => AsyncExpressApp
   patch: (path: string, handler: (req: any, res: any, next?: any) => any) => AsyncExpressApp
@@ -1,33 +1,12 @@
-# App
-NEXT_PUBLIC_APP_URL=http://localhost:3000/
-WEBAPP_URL=http://localhost:3000/
-CODEFLASH_CFAPI_URL=http://localhost:3001
-
-# Auth0
-AUTH0_ISSUER_BASE_URL=https://codeflash-ai.us.auth0.com
-AUTH0_CLIENT_ID=
-AUTH0_CLIENT_SECRET=
-AUTH0_SECRET=
-AUTH0_BASE_URL=http://localhost:3000/
-
-# Database (use sslmode=verify-full for Azure PostgreSQL)
-DATABASE_URL="postgresql://user:password@host:5432/postgres?sslmode=verify-full"
-
-# Stripe
-STRIPE_SECRET_KEY=
-STRIPE_PRO_PRODUCT_ID=
-STRIPE_PRO_PRICE_MONTHLY_ID=
-STRIPE_PRO_PRICE_YEARLY_ID=
-STRIPE_WEBHOOK_SECRET=
-
-# Codeflash
-NEXT_PUBLIC_CF_API_KEY=
-API_TOKEN_LIMIT=4000
-
-# Sentry (omit NEXT_PUBLIC_SENTRY_DISABLED to enable)
-NEXT_PUBLIC_SENTRY_DISABLED=true
-# SENTRY_AUTH_TOKEN= # set in CI for source map uploads
-
-# Optional: local paths for aiservice integration
-# AISERVICE_DIR=/path/to/codeflash-internal
-# CODEFLASH_DIR=/path/to/codeflash
+AUTH0_BASE_URL
+AUTH0_CLIENT_ID
+AUTH0_CLIENT_SECRET
+AUTH0_ISSUER_BASE_URL
+AUTH0_SECRET
+AUTH0_SESSION_ROLLING=false
+NPM_TOKEN
+SCM_DO_BUILD_DURING_DEPLOYMENT
+WEBSITE_HEALTHCHECK_MAXPINGFAILURES
+WEBSITE_HTTPLOGGING_RETENTION_DAYS
+AISERVICE_DIR=
+CODEFLASH_DIR=
@@ -1,5 +1,5 @@
 import bundleAnalyzer from "@next/bundle-analyzer"
-import { dirname, resolve } from "path"
+import { dirname } from "path"
 import { fileURLToPath } from "url"
 
 const withBundleAnalyzer = bundleAnalyzer({
@@ -10,19 +10,6 @@ const __dirname = dirname(fileURLToPath(import.meta.url))
 
 /** @type {import("next").NextConfig} */
 const nextConfig = {
-  cacheComponents: true,
-  cacheLife: {
-    dashboard: {
-      stale: 60, // 1 minute — serve stale while revalidating
-      revalidate: 300, // 5 minutes — background revalidation interval
-      expire: 3600, // 1 hour — hard expiry
-    },
-    frequent: {
-      stale: 30, // 30 seconds
-      revalidate: 60, // 1 minute
-      expire: 600, // 10 minutes
-    },
-  },
   transpilePackages: ["@codeflash-ai/common"],
   webpack: (config, { isServer }) => {
     config.watchOptions = {
@@ -30,16 +17,6 @@ const nextConfig = {
       aggregateTimeout: 300,
     }
 
-    // Suppress known-harmless "Critical dependency" warnings from OpenTelemetry
-    // and require-in-the-middle. These packages use dynamic require() for runtime
-    // monkey-patching — webpack can't statically analyze them but they work fine.
-    // Root cause: @sentry/nextjs → @sentry/node → @opentelemetry/instrumentation.
-    config.ignoreWarnings = [
-      ...(config.ignoreWarnings || []),
-      { module: /@opentelemetry\/instrumentation/ },
-      { module: /require-in-the-middle/ },
-    ]
-
     // Handle web-tree-sitter's Node.js module imports in browser.
     // fallback handles static require(); alias handles dynamic import()
     if (!isServer) {
@@ -69,20 +46,7 @@ const nextConfig = {
       'module': { browser: './src/lib/empty-shim.js' },
     },
   },
-  serverExternalPackages: [
-    "@anthropic-ai/sdk",
-    "sharp",
-    "posthog-node",
-    "@opentelemetry/api",
-    "@opentelemetry/sdk-node",
-    "@opentelemetry/auto-instrumentations-node",
-    "@opentelemetry/instrumentation",
-    "@prisma/instrumentation",
-    "@sentry/opentelemetry",
-    "@sentry/node",
-    "require-in-the-middle",
-    "@fastify/otel",
-  ],
+  serverExternalPackages: ["@anthropic-ai/sdk", "sharp"],
   experimental: {
     // Tree-shake barrel exports for these heavy packages. Without this,
     // importing a single icon from lucide-react or a single component from
@@ -94,49 +58,20 @@ const nextConfig = {
       "chart.js",
       "react-chartjs-2",
       "motion",
-      "zod",
-      "react-hook-form",
-      "@hookform/resolvers",
-      "react-markdown",
-      "remark-gfm",
-      "sonner",
-      "react-resizable-panels",
-      "@radix-ui/react-dialog",
-      "@radix-ui/react-dropdown-menu",
-      "@radix-ui/react-select",
-      "@radix-ui/react-tabs",
-      "@radix-ui/react-tooltip",
-      "@radix-ui/react-toast",
-      "chartjs-plugin-datalabels",
-      "marked",
-      "prism-react-renderer",
     ],
     serverActions: {
       allowedOrigins: ["app.codeflash.ai", "localhost:3000"],
      bodySizeLimit: '5mb', // Increased from default 1mb to handle large PR creation payloads
     },
-    // NOTE: turbopackRemoveUnused{Imports,Exports} are NOT enabled — they
-    // break @opentelemetry/api barrel re-exports and Next.js internal ESM
-    // modules (same class of bug as turbopackTreeShaking + @sentry/core below).
-    // turbopackRemoveUnusedImports requires turbopackRemoveUnusedExports.
-    turbopackInferModuleSideEffects: true,
-    // Scope hoisting: collapses module wrappers for smaller output
-    turbopackScopeHoisting: true,
-    // NOTE: turbopackTreeShaking is NOT enabled — it fragments modules into
-    // "internal parts" which breaks @sentry/core's ESM cross-references
-    // (withScope, withErrorInstrumentation exports disappear). Re-test when
-    // Turbopack or Sentry fixes the incompatibility.
-    // Persist compiled artifacts between CI builds
-    turbopackFileSystemCacheForBuild: true,
-    // Client-side router cache: avoid refetching on back-navigation
-    staleTimes: {
-      dynamic: 30,
-      static: 180,
-    },
   },
   typescript: {
-    // Type-checking is split into a separate `npm run type-check` step.
-    // This cuts ~16s off `next build` (was 60% of build time).
-    ignoreBuildErrors: true,
+    ignoreBuildErrors: false,
   },
   // Optimize for production stability
   poweredByHeader: false,
@@ -152,30 +87,41 @@ const nextConfig = {
         hostname: "github.com",
       },
     ],
-    formats: ['image/avif', 'image/webp'],
   },
 }
 
 // module.exports = nextConfig
 
 import { withSentryConfig } from "@sentry/nextjs"
 
-// Only upload source maps when SENTRY_AUTH_TOKEN is set (CI/deploy).
-// Skipping this shaves significant time off local builds.
-const withSentry = process.env.SENTRY_AUTH_TOKEN
-  ? (config) => withSentryConfig(
-      config,
-      {
-        silent: true,
-        org: "codeflash-ai",
-        project: "webapp",
-      },
-      {
-        widenClientFileUpload: true,
-        tunnelRoute: "/monitoring",
-        hideSourceMaps: true,
-        disableLogger: true,
-        automaticVercelMonitors: false,
-      },
-    )
-  : (config) => config
-
-export default withBundleAnalyzer(withSentry(nextConfig))
+export default withBundleAnalyzer(withSentryConfig(
+  nextConfig,
+  {
+    // For all available options, see:
+    // https://github.com/getsentry/sentry-webpack-plugin#options
+
+    // Suppresses source map uploading logs during build
+    silent: true,
+    org: "codeflash-ai",
+    project: "webapp",
+  },
+  {
+    // For all available options, see:
+    // https://docs.sentry.io/platforms/javascript/guides/nextjs/manual-setup/
+
+    // Upload a larger set of source maps for prettier stack traces (increases build time)
+    widenClientFileUpload: true,
+
+    // Routes browser requests to Sentry through a Next.js rewrite to circumvent ad-blockers (increases server load)
+    tunnelRoute: "/monitoring",
+
+    // Hides source maps from generated client bundles
+    hideSourceMaps: true,
+
+    // Automatically tree-shake Sentry logger statements to reduce bundle size
+    disableLogger: true,
+
+    // Disable automatic instrumentation that might cause issues
+    automaticVercelMonitors: false,
+  },
+))
js/cf-webapp/package-lock.json (generated, 17602 lines): diff suppressed because it is too large
@@ -4,32 +4,33 @@
   "private": true,
   "scripts": {
     "dev": "next dev",
-    "build": "prisma generate && next build --webpack",
+    "build": " npm install --loglevel verbose && npx prisma generate && npx next build",
     "deploy": "az webapp up -n codeflash-webapp-2 --sku P1V2 --runtime NODE:20-lts",
-    "start": "next start",
+    "start": "node_modules/next/dist/bin/next start",
     "lint": "eslint --fix .",
     "lint:check": "eslint .",
     "test": "vitest",
     "type-check": "tsc --noEmit",
     "analyze": "ANALYZE=true next build",
-    "prisma:generate": "prisma generate",
-    "prisma:migrate": "prisma migrate dev",
+    "prisma:generate": "npx prisma generate",
+    "prisma:migrate": "npx prisma migrate dev",
     "prepare": "simple-git-hooks",
-    "postinstall": "node scripts/postinstall-wasm.mjs",
+    "postinstall": "cp node_modules/web-tree-sitter/web-tree-sitter.wasm public/ && npx tree-sitter build --wasm node_modules/tree-sitter-python -o public/tree-sitter-python.wasm",
     "format": "prettier --write \"**/*.{js,ts,tsx,json,md}\"",
     "format:check": "prettier --check \"**/*.{js,ts,tsx,json,md}\""
   },
   "dependencies": {
     "@anthropic-ai/sdk": "^0.87.0",
     "@auth0/nextjs-auth0": "^4",
-    "@codeflash-ai/common": "workspace:*",
+    "@codeflash-ai/common": "^1.0.31",
     "@hookform/resolvers": "^5.2.2",
     "@monaco-editor/react": "^4.7.0",
     "@opentelemetry/auto-instrumentations-node": "^0.72.0",
     "@opentelemetry/sdk-node": "^0.214.0",
-    "@prisma/client": "^7.7.0",
+    "@prisma/client": "^6.19.3",
     "@prisma/instrumentation": "^7.6.0",
     "@radix-ui/react-dialog": "^1.0.5",
     "@radix-ui/react-dropdown-menu": "^2.0.6",
     "@radix-ui/react-label": "^2.0.2",
     "@radix-ui/react-navigation-menu": "^1.1.4",
     "@radix-ui/react-progress": "^1.1.2",
@@ -42,6 +43,7 @@
     "@sentry/nextjs": "^10.38.0",
     "@sentry/opentelemetry": "^10.47.0",
     "@types/node": "^25.6.0",
+    "@types/pg": "^8.10.9",
     "@types/react": "^19.2.14",
     "@types/react-dom": "^19.2.3",
     "@types/react-syntax-highlighter": "^15.5.13",
@@ -60,6 +62,7 @@
     "node-ts-cache": "^4.4.0",
     "node-ts-cache-storage-memory": "^4.4.0",
     "papaparse": "^5.5.3",
+    "pg": "^8.11.3",
     "postcss": "^8",
     "posthog-js": "^1.367.0",
     "posthog-node": "^5.29.2",
@@ -87,21 +90,19 @@
     "@types/papaparse": "^5.5.2",
     "@vitejs/plugin-react": "^4.3.1",
     "autoprefixer": "^10.0.1",
-    "eslint": "^9.39.4",
+    "baseline-browser-mapping": "^2.9.11",
+    "eslint": "^10.2.0",
     "eslint-config-next": "^16.2.3",
     "eslint-config-prettier": "^10.1.8",
     "jsdom": "^29.0.2",
     "lint-staged": "^16.4.0",
     "monaco-editor": "^0.55.1",
     "prettier": "^3.8.2",
-    "prisma": "^7.7.0",
+    "prisma": "^6.19.3",
     "simple-git-hooks": "^2.9.0",
-    "typescript": "^5.9.3",
-    "vitest": "^4.1.4"
   },
   "optionalDependencies": {
     "tree-sitter-cli": "^0.26.3",
-    "tree-sitter-python": "^0.25.0"
+    "tree-sitter-python": "^0.25.0",
+    "typescript": "~5.4.5",
+    "vitest": "^4.1.4"
   },
   "engines": {
     "node": ">=20.0.0"
@@ -116,5 +117,8 @@
     "**/*.{json,md}": [
       "prettier --write"
     ]
-  }
+  },
+  "overrides": {
+    "dompurify": "3.3.3"
+  }
 }
@@ -2,5 +2,5 @@ import path from "node:path"
 import { defineConfig } from "prisma/config"
 
 export default defineConfig({
-  schema: path.join(__dirname, "../common/prisma/schema.prisma"),
+  schema: path.join(__dirname, "node_modules/@codeflash-ai/common/prisma/schema.prisma"),
 })
@ -1 +0,0 @@
|
|||
0.25.0
|
||||
|
|
@@ -1,80 +0,0 @@
#!/usr/bin/env node
/**
* Postinstall script that caches tree-sitter WASM artifacts in public/.
* Prisma client generation is handled by pnpm workspaces — no symlinks needed.
*
* Uses Node module resolution to find packages regardless of where pnpm
* stores them (isolated node_modules with symlinks to the store).
*/
import { existsSync, readFileSync, writeFileSync, copyFileSync } from "fs"
import { createRequire } from "module"
import { execSync } from "child_process"
import { dirname, resolve } from "path"

const require = createRequire(import.meta.url)

// Resolve package directory. Some packages (e.g. web-tree-sitter) don't
// export ./package.json, so fall back to resolving the main entry.
function pkgDir(name) {
try {
return dirname(require.resolve(`${name}/package.json`))
} catch {
return dirname(require.resolve(name))
}
}

// --- Tree-sitter WASM ---
const PUBLIC = resolve("public")
const WASM_FILE = resolve(PUBLIC, "tree-sitter-python.wasm")
const WEB_WASM = resolve(PUBLIC, "web-tree-sitter.wasm")
const VERSION_STAMP = resolve(PUBLIC, ".tree-sitter-python-version")

// Always copy web-tree-sitter.wasm (fast — just a file copy)
try {
const webTreeSitterSrc = resolve(pkgDir("web-tree-sitter"), "web-tree-sitter.wasm")
copyFileSync(webTreeSitterSrc, WEB_WASM)
console.log("[postinstall] Copied web-tree-sitter.wasm")
} catch {
console.warn("[postinstall] web-tree-sitter.wasm not found — skipping copy")
}

// Read the installed tree-sitter-python version
let installedVersion = "unknown"
let treeSitterPythonDir
try {
treeSitterPythonDir = pkgDir("tree-sitter-python")
const pkg = JSON.parse(readFileSync(resolve(treeSitterPythonDir, "package.json"), "utf8"))
installedVersion = pkg.version
} catch {
// Package not installed — will force build
}

// Check if we can skip the build
let cachedVersion = ""
try {
cachedVersion = readFileSync(VERSION_STAMP, "utf8").trim()
} catch {
// No stamp — first install
}

if (existsSync(WASM_FILE) && cachedVersion === installedVersion) {
console.log(`[postinstall] tree-sitter-python.wasm is up-to-date (v${installedVersion}) — skipping build`)
process.exit(0)
}

// Build tree-sitter-python WASM
console.log(`[postinstall] Building tree-sitter-python.wasm (v${installedVersion})...`)
try {
execSync(`npx tree-sitter build --wasm ${treeSitterPythonDir} -o ${WASM_FILE}`, {
stdio: "inherit",
})
writeFileSync(VERSION_STAMP, installedVersion)
console.log(`[postinstall] Built and cached tree-sitter-python.wasm (v${installedVersion})`)
} catch (err) {
if (existsSync(WASM_FILE)) {
console.warn("[postinstall] Failed to rebuild tree-sitter-python.wasm, using stale cached version:", err.message)
} else {
console.error("[postinstall] Failed to build tree-sitter-python.wasm and no cached version exists:", err.message)
process.exit(1)
}
}

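The deleted postinstall script above skips the expensive WASM build whenever a version stamp on disk matches the installed package version, and only hard-fails when the build breaks and no cached artifact exists. A minimal sketch of that stamp-based skip decision as a pure function (the function name and option names are illustrative, not from the repo):

```typescript
// Decide whether a cached build artifact can be reused.
// Skip only when the artifact exists AND the stamped version matches
// the currently installed package version.
function shouldSkipBuild(opts: {
  artifactExists: boolean
  stampedVersion: string | null // contents of the version stamp file, if any
  installedVersion: string // version read from the package's package.json
}): boolean {
  if (!opts.artifactExists) return false // nothing cached — must build
  if (opts.stampedVersion === null) return false // no stamp — cannot trust cache
  return opts.stampedVersion.trim() === opts.installedVersion
}
```

Factoring the check out like this makes the skip logic testable without touching the filesystem; the script itself then only handles I/O around it.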
@@ -1,167 +0,0 @@
"use client"

import LogoBox from "@/components/dashboard/logo-box"
import { useState, useEffect } from "react"
import { useSearchParams } from "next/navigation"
import { Loading } from "@/components/ui/loading"

export default function OAuthCallbackContent() {
const [copied, setCopied] = useState(false)
const [isLoading, setIsLoading] = useState(true)
const [error, setError] = useState<string | null>(null)
const searchParams = useSearchParams()

const code = searchParams.get("code")
const state = searchParams.get("state")

useEffect(() => {
// Validate the OAuth callback
if (!code || !state) {
setError("Invalid authentication response. Missing required parameters.")
}
setIsLoading(false)
}, [code, state])

const handleCopyCode = async () => {
if (!code) return

try {
await navigator.clipboard.writeText(code)
setCopied(true)
setTimeout(() => setCopied(false), 2000)
} catch (err) {
console.error("Failed to copy:", err)
}
}

if (isLoading) {
return <Loading />
}

if (error || !code) {
return (
<div className="min-h-screen bg-gradient-to-b from-primary/10 via-primary/5 to-background relative">
<div className="absolute inset-0 bg-[linear-gradient(to_right,#80808008_1px,transparent_1px),linear-gradient(to_bottom,#80808008_1px,transparent_1px)] bg-[size:24px_24px]" />
<div className="min-h-screen flex flex-col items-center justify-center px-6 py-12 relative z-10">
<div className="mb-16">
<LogoBox />
</div>
<div className="max-w-md w-full">
<div className="bg-card border border-border rounded-2xl shadow-xl overflow-hidden p-8">
<div className="w-20 h-20 bg-amber-500/10 rounded-2xl flex items-center justify-center mx-auto relative">
<svg
xmlns="http://www.w3.org/2000/svg"
width="48"
height="48"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className="text-amber-600 dark:text-amber-500"
>
<circle cx="12" cy="12" r="10" />
<line x1="12" y1="8" x2="12" y2="12" />
<line x1="12" y1="16" x2="12.01" y2="16" />
</svg>
</div>
<div className="space-y-3 text-center mt-6">
<h2 className="text-2xl font-bold text-foreground">Authentication Error</h2>
<p className="text-sm text-muted-foreground leading-relaxed">
{error || "Invalid authentication response"}
</p>
</div>
</div>
</div>
</div>
</div>
)
}

return (
<div className="min-h-screen bg-gradient-to-b from-primary/10 via-primary/5 to-background relative">
<div className="absolute inset-0 bg-[linear-gradient(to_right,#80808008_1px,transparent_1px),linear-gradient(to_bottom,#80808008_1px,transparent_1px)] bg-[size:24px_24px]" />
<div className="min-h-screen flex flex-col items-center justify-center px-6 py-12 relative z-10">
<div className="mb-16">
<LogoBox />
</div>

<div className="max-w-2xl w-full space-y-8">
{/* Header */}
<div className="text-center space-y-4">
<h1 className="text-4xl md:text-5xl font-bold text-foreground tracking-tight">
Authentication Code
</h1>
<p className="text-lg text-muted-foreground">Paste this into Codeflash CLI</p>
</div>

{/* Code Display Box */}
<div className="bg-card border border-border rounded-2xl shadow-xl overflow-hidden">
<div className="p-8 space-y-6">
{/* Code Container */}
<div className="bg-muted/50 border border-border rounded-xl p-6 font-mono text-sm break-all">
<code className="text-foreground/90 select-all">{code}</code>
</div>

{/* Copy Button */}
<button
onClick={handleCopyCode}
className="w-full px-6 py-3.5 bg-primary hover:bg-primary/90 active:scale-[0.99] text-primary-foreground font-semibold rounded-xl transition-all shadow-sm flex items-center justify-center gap-2 group"
>
{copied ? (
<>
<svg
xmlns="http://www.w3.org/2000/svg"
width="20"
height="20"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className="transition-transform group-hover:scale-110"
>
<polyline points="20 6 9 17 4 12" />
</svg>
Copied!
</>
) : (
<>
<svg
xmlns="http://www.w3.org/2000/svg"
width="20"
height="20"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className="transition-transform group-hover:scale-110"
>
<rect x="9" y="9" width="13" height="13" rx="2" ry="2" />
<path d="M5 15H4a2 2 0 0 1-2-2V4a2 2 0 0 1 2-2h9a2 2 0 0 1 2 2v1" />
</svg>
Copy Code
</>
)}
</button>
</div>
</div>

{/* Additional Info */}
<div className="text-center space-y-2">
<p className="text-sm text-muted-foreground">
This code will authenticate your CodeFlash CLI.
</p>
<p className="text-xs text-muted-foreground/70">
Keep this code secure and do not share it with anyone.
</p>
</div>
</div>
</div>
</div>
)
}

@@ -1,11 +1,167 @@
import { Suspense } from "react"
"use client"

import LogoBox from "@/components/dashboard/logo-box"
import { useState, useEffect } from "react"
import { useSearchParams } from "next/navigation"
import { Loading } from "@/components/ui/loading"
import OAuthCallbackContent from "./content"

export default function OAuthCallbackPage() {
const [copied, setCopied] = useState(false)
const [isLoading, setIsLoading] = useState(true)
const [error, setError] = useState<string | null>(null)
const searchParams = useSearchParams()

const code = searchParams.get("code")
const state = searchParams.get("state")

useEffect(() => {
// Validate the OAuth callback
if (!code || !state) {
setError("Invalid authentication response. Missing required parameters.")
}
setIsLoading(false)
}, [code, state])

const handleCopyCode = async () => {
if (!code) return

try {
await navigator.clipboard.writeText(code)
setCopied(true)
setTimeout(() => setCopied(false), 2000)
} catch (err) {
console.error("Failed to copy:", err)
}
}

if (isLoading) {
return <Loading />
}

if (error || !code) {
return (
<div className="min-h-screen bg-gradient-to-b from-primary/10 via-primary/5 to-background relative">
<div className="absolute inset-0 bg-[linear-gradient(to_right,#80808008_1px,transparent_1px),linear-gradient(to_bottom,#80808008_1px,transparent_1px)] bg-[size:24px_24px]" />
<div className="min-h-screen flex flex-col items-center justify-center px-6 py-12 relative z-10">
<div className="mb-16">
<LogoBox />
</div>
<div className="max-w-md w-full">
<div className="bg-card border border-border rounded-2xl shadow-xl overflow-hidden p-8">
<div className="w-20 h-20 bg-amber-500/10 rounded-2xl flex items-center justify-center mx-auto relative">
<svg
xmlns="http://www.w3.org/2000/svg"
width="48"
height="48"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className="text-amber-600 dark:text-amber-500"
>
<circle cx="12" cy="12" r="10" />
<line x1="12" y1="8" x2="12" y2="12" />
<line x1="12" y1="16" x2="12.01" y2="16" />
</svg>
</div>
<div className="space-y-3 text-center mt-6">
<h2 className="text-2xl font-bold text-foreground">Authentication Error</h2>
<p className="text-sm text-muted-foreground leading-relaxed">
{error || "Invalid authentication response"}
</p>
</div>
</div>
</div>
</div>
</div>
)
}

return (
<Suspense fallback={<Loading />}>
<OAuthCallbackContent />
</Suspense>
<div className="min-h-screen bg-gradient-to-b from-primary/10 via-primary/5 to-background relative">
<div className="absolute inset-0 bg-[linear-gradient(to_right,#80808008_1px,transparent_1px),linear-gradient(to_bottom,#80808008_1px,transparent_1px)] bg-[size:24px_24px]" />
<div className="min-h-screen flex flex-col items-center justify-center px-6 py-12 relative z-10">
<div className="mb-16">
<LogoBox />
</div>

<div className="max-w-2xl w-full space-y-8">
{/* Header */}
<div className="text-center space-y-4">
<h1 className="text-4xl md:text-5xl font-bold text-foreground tracking-tight">
Authentication Code
</h1>
<p className="text-lg text-muted-foreground">Paste this into Codeflash CLI</p>
</div>

{/* Code Display Box */}
<div className="bg-card border border-border rounded-2xl shadow-xl overflow-hidden">
<div className="p-8 space-y-6">
{/* Code Container */}
<div className="bg-muted/50 border border-border rounded-xl p-6 font-mono text-sm break-all">
<code className="text-foreground/90 select-all">{code}</code>
</div>

{/* Copy Button */}
<button
onClick={handleCopyCode}
className="w-full px-6 py-3.5 bg-primary hover:bg-primary/90 active:scale-[0.99] text-primary-foreground font-semibold rounded-xl transition-all shadow-sm flex items-center justify-center gap-2 group"
>
{copied ? (
<>
<svg
xmlns="http://www.w3.org/2000/svg"
width="20"
height="20"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className="transition-transform group-hover:scale-110"
>
<polyline points="20 6 9 17 4 12" />
</svg>
Copied!
</>
) : (
<>
<svg
xmlns="http://www.w3.org/2000/svg"
width="20"
height="20"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className="transition-transform group-hover:scale-110"
>
<rect x="9" y="9" width="13" height="13" rx="2" ry="2" />
<path d="M5 15H4a2 2 0 0 1-2-2V4a2 2 0 0 1 2-2h9a2 2 0 0 1 2 2v1" />
</svg>
Copy Code
</>
)}
</button>
</div>
</div>

{/* Additional Info */}
<div className="text-center space-y-2">
<p className="text-sm text-muted-foreground">
This code will authenticate your CodeFlash CLI.
</p>
<p className="text-xs text-muted-foreground/70">
Keep this code secure and do not share it with anyone.
</p>
</div>
</div>
</div>
</div>
)
}

@@ -1,7 +1,7 @@
import { redirect } from "next/navigation"
import { auth0 } from "@/lib/auth0"
import Link from "next/link"
import { Suspense, type JSX } from "react"
import { type JSX } from "react"
import { APP_ROUTES } from "@/lib/types"

// Security function to validate returnTo URLs
@@ -12,10 +12,12 @@ function isValidReturnUrl(url: string): boolean {
return false
}

async function LoginContent(props: {
searchParams: Promise<{ returnTo?: string; error?: string }>
}): Promise<JSX.Element> {
const searchParams = await props.searchParams
export default async function AuthenticationPage(
props: {
searchParams: Promise<{ returnTo?: string; error?: string }>
}
): Promise<JSX.Element> {
const searchParams = await props.searchParams;
const session = await auth0.getSession()

if (session) {
@@ -54,19 +56,3 @@ async function LoginContent(props: {
const loginUrl = `/auth/login?returnTo=${encodeURIComponent(returnTo)}`
redirect(loginUrl)
}

export default function AuthenticationPage(props: {
searchParams: Promise<{ returnTo?: string; error?: string }>
}): JSX.Element {
return (
<Suspense
fallback={
<div className="flex min-h-screen items-center justify-center">
<div className="h-8 w-8 animate-spin rounded-full border-4 border-gray-300 border-t-blue-500" />
</div>
}
>
<LoginContent searchParams={props.searchParams} />
</Suspense>
)
}

@@ -39,7 +39,7 @@ export async function SubmitFirstOnboardingPage(
custom_pain_point: customOptionInput,
},
})
// PostHog batches automatically — no flush needed
await posthog?.flush()

await submitOnboardingQuestions(user_id, email)
// Check for saved redirect URL after onboarding completion
@@ -81,7 +81,7 @@ export async function SubmitSkipOnboardingPage(): Promise<void> {
username: nickname,
},
})
// PostHog batches automatically — no flush needed
await posthog?.flush()

await markUserCompletedOnboarding(user_id)
// Checking for saved redirect URL after onboarding completion

@@ -31,5 +31,5 @@ export async function SubmitSecondOnboardingPage(
...(colleagueInviteEmail && { colleague_invite_email: colleagueInviteEmail }),
},
})
// PostHog batches automatically — no flush needed
await posthog?.flush()
}

@@ -21,22 +21,14 @@ import {
} from "@/components/ui/dialog"
import { Label } from "@/components/ui/label"
import { Button } from "@/components/ui/button"
import { type cf_api_keys } from "@prisma/client"
import { deleteAPIKey } from "./tokenfuncs"
import { Badge } from "@/components/ui/badge"
import { useToast } from "@/components/ui/use-toast"
import { Building2, User } from "lucide-react"
import Image from "next/image"

export interface ApiKeyWithOrg {
id: number
key: string
suffix: string
name: string
created_at: Date
last_used: Date | null
user_id: string | null
organization_id: string | null
tier: string | null
interface ApiKeyWithOrg extends cf_api_keys {
organization?: {
id: string
name: string
@@ -138,7 +130,7 @@ export function ApiKeyTable({
<span className="text-sm">
{key.user.user_id === currentUserId
? "Me"
: key.user.name || key.user.email || key.user.github_username}
: (key.user.name || key.user.email || key.user.github_username)}
</span>
</div>
) : (

@@ -2,7 +2,14 @@
import { type JSX } from "react"
import { Button } from "@/components/ui/button"
import { Trash2 } from "lucide-react"
import { type ApiKeyWithOrg } from "./api-key-table"
import { type cf_api_keys } from "@prisma/client"

interface ApiKeyWithOrg extends cf_api_keys {
organization?: {
id: string
name: string
} | null
}

export function DeleteApiKeyButton({
deleteDialog,

@@ -1,122 +1,62 @@
import { Suspense } from "react"
"use server"
import { type JSX } from "react"
import { redirect } from "next/navigation"
import { auth0 } from "@/lib/auth0"
import { CreateApiKeyDialog } from "./dialog-create-api-key"
import { Separator } from "@/components/ui/separator"
import { ApiKeyTable, type ApiKeyWithOrg } from "./api-key-table"
import { ApiKeyTable } from "./api-key-table"
import { type cf_api_keys } from "@prisma/client"
import PostHogClient from "@/lib/posthog"
import { cacheLife, cacheTag } from "next/cache"
import { VS_CODE_KEY_NAME } from "@codeflash-ai/common"
import { Prisma } from "@prisma/client"
import { prisma } from "@/lib/prisma"

async function getCachedApiKeys(userId: string): Promise<ApiKeyWithOrg[]> {
"use cache"
cacheLife("frequent")
cacheTag(`apikeys:${userId}`)

// Rewrite as raw SQL with UNION to avoid bitmap OR merge and nested EXISTS subquery.
// Branch 1: personal API keys (user_id match, no org)
// Branch 2: org API keys (user is member of the org)

const result = await prisma.$queryRaw<
Array<{
id: number
key: string
suffix: string
name: string
created_at: Date
last_used: Date | null
user_id: string | null
organization_id: string | null
tier: string | null
org_id: string | null
org_name: string | null
owner_user_id: string | null
owner_github_username: string | null
owner_name: string | null
owner_email: string | null
}>
>(Prisma.sql`
SELECT
ak.id, ak.key, ak.suffix, ak.name, ak.created_at, ak.last_used,
ak.user_id, ak.organization_id, ak.tier,
o.id as org_id, o.name as org_name,
u.user_id as owner_user_id, u.github_username as owner_github_username,
u.name as owner_name, u.email as owner_email
FROM (
-- Personal API keys
SELECT id FROM cf_api_keys
WHERE user_id = ${userId} AND organization_id IS NULL
UNION
-- Organization API keys (user is member)
SELECT ak.id
FROM cf_api_keys ak
INNER JOIN organization_members om ON ak.organization_id = om.organization_id
WHERE om.user_id = ${userId} AND ak.organization_id IS NOT NULL
) AS filtered_ids
INNER JOIN cf_api_keys ak ON ak.id = filtered_ids.id
LEFT JOIN organizations o ON ak.organization_id = o.id
LEFT JOIN users u ON ak.user_id = u.user_id
ORDER BY ak.created_at DESC
`)

// Map raw result to ApiKeyWithOrg format
return (
result as Array<{
id: number
key: string
suffix: string
name: string
created_at: Date
last_used: Date | null
user_id: string | null
organization_id: string | null
tier: string | null
org_id: string | null
org_name: string | null
owner_user_id: string | null
owner_github_username: string | null
owner_name: string | null
owner_email: string | null
}>
).map(row => ({
id: row.id,
key: row.key,
suffix: row.suffix,
name: row.name,
created_at: row.created_at,
last_used: row.last_used,
user_id: row.user_id,
organization_id: row.organization_id,
tier: row.tier,
organization: row.org_id
? {
id: row.org_id,
name: row.org_name!,
}
: null,
user: row.owner_user_id
? {
user_id: row.owner_user_id,
github_username: row.owner_github_username!,
name: row.owner_name,
email: row.owner_email,
}
: null,
}))
interface ApiKeyWithOrg extends cf_api_keys {
organization?: {
id: string
name: string
} | null
user?: {
user_id: string
github_username: string
name: string | null
email: string | null
} | null
}

async function APIKeyContent(): Promise<JSX.Element> {
export default async function APIKeyGenerator(): Promise<JSX.Element> {
const session = await auth0.getSession()
// Auth handled by middleware + layout; redirect bails out of prerendering
// Auth handled by middleware + layout
if (!session?.user) {
redirect("/login")
throw new Error("Authentication required")
}
const userId = session.user.sub

const apiKeys = await getCachedApiKeys(userId)
// Get user's organization memberships
const userOrgMemberships = await prisma.organization_members.findMany({
where: { user_id: userId },
select: { organization_id: true },
})
const userOrgIds = userOrgMemberships.map(m => m.organization_id)

// Fetch personal keys (no organization) and keys from user's organizations
const apiKeys: ApiKeyWithOrg[] = await prisma.cf_api_keys.findMany({
where: {
OR: [{ user_id: userId, organization_id: null }, { organization_id: { in: userOrgIds } }],
},
include: {
organization: {
select: { id: true, name: true },
},
user: {
select: {
user_id: true,
github_username: true,
name: true,
email: true,
},
},
},
orderBy: { created_at: "desc" },
})

const posthog = PostHogClient()
posthog?.capture({
@@ -125,7 +65,7 @@ async function APIKeyContent(): Promise<JSX.Element> {
event: "webapp-loaded-api-keys",
})

posthog?.flush()
await posthog?.flush()

return (
<div>
@@ -164,11 +104,7 @@ async function APIKeyContent(): Promise<JSX.Element> {
<p className="leading-7 mt-6">
These API keys are used to authenticate your requests to Codeflash's AI services.
</p>
<ApiKeyTable
apiKeys={apiKeys}
vscodeKeyName={VS_CODE_KEY_NAME}
currentUserId={userId}
/>{" "}
<ApiKeyTable apiKeys={apiKeys} vscodeKeyName={VS_CODE_KEY_NAME} currentUserId={userId} />{" "}
</>
)}

@@ -176,11 +112,3 @@ async function APIKeyContent(): Promise<JSX.Element> {
</div>
)
}

export default async function APIKeyGenerator(): Promise<JSX.Element> {
return (
<Suspense fallback={<div>Loading...</div>}>
<APIKeyContent />
</Suspense>
)
}

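The raw-SQL version of `getCachedApiKeys` above selects joined organization and owner columns as one flat row per key, then rebuilds nested objects in JS, treating a null join key as "relation absent". A simplified sketch of that row-nesting step (field names mirror the query's aliases, but the types are cut down for illustration):

```typescript
// A flat row as returned by a LEFT JOIN: relation columns are null
// when the joined row does not exist.
interface FlatKeyRow {
  id: number
  name: string
  org_id: string | null
  org_name: string | null
  owner_user_id: string | null
  owner_name: string | null
}

// Rebuild nested relations from the flat row: a null join key
// (org_id / owner_user_id) means the relation was absent, so the
// nested object becomes null instead of an object full of nulls.
function nestRow(row: FlatKeyRow) {
  return {
    id: row.id,
    name: row.name,
    organization: row.org_id ? { id: row.org_id, name: row.org_name ?? "" } : null,
    user: row.owner_user_id ? { user_id: row.owner_user_id, name: row.owner_name } : null,
  }
}
```

Keying the null-check on the join column (not on the whole row) is what lets the shape match what `findMany({ include: ... })` would have produced.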
@@ -10,7 +10,6 @@ import {
import { TokenLimitExceededError } from "./token-error"
import { prisma } from "@/lib/prisma"
import { trackApiKeyCreated } from "@/lib/analytics/tracking"
import { updateTag } from "next/cache"

export async function generateToken(
keyName: string,
@@ -25,7 +24,6 @@ export async function generateToken(
try {
const token: string = await safeGenAndStoreAPITokenHash(keyName, userId, organizationId)
await trackApiKeyCreated(userId, { keyName, organizationId })
updateTag(`apikeys:${userId}`)
return { success: true, token, err: undefined }
} catch (error) {
if (error instanceof Error && error.message === "Token limit exceeded") {
@@ -55,7 +53,6 @@ export async function generateTokenForVsCode(
try {
console.log("[Token] Generating VSCode API key for user:", userId, "orgId:", orgId)
const token: string = await genAndStoreAPITokenHashForVSC(userId, orgId)
updateTag(`apikeys:${userId}`)
return { success: true, token, err: undefined }
} catch (error) {
if (error instanceof Error && error.message === "Token limit exceeded") {
@@ -95,5 +92,4 @@ export async function deleteAPIKey(id: number): Promise<void> {
}

await deleteAPIKeyById(id, userId)
updateTag(`apikeys:${userId}`)
}

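The `updateTag(`apikeys:${userId}`)` calls above pair with the `cacheTag(`apikeys:${userId}`)` inside the cached reader: every mutation that touches a user's keys invalidates exactly that user's cache entry, so the next read recomputes instead of waiting out the cache lifetime. A toy in-memory model of that tag-keyed invalidation (this is the idea only, not the Next.js `next/cache` API):

```typescript
// Toy tag-keyed cache: one entry per tag such as `apikeys:<userId>`.
// Invalidating a tag drops its entry so the next read recomputes.
class TagCache {
  private store = new Map<string, unknown>()

  // Read-through: return the cached value for the tag, or compute and store it.
  get<T>(tag: string, compute: () => T): T {
    if (this.store.has(tag)) return this.store.get(tag) as T
    const value = compute()
    this.store.set(tag, value)
    return value
  }

  // Mirrors updateTag(): the next get() under this tag recomputes.
  invalidate(tag: string): void {
    this.store.delete(tag)
  }
}
```

Scoping the tag to the user id keeps invalidation precise: creating or deleting one user's key never evicts other users' cached key lists.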
@@ -1,40 +1,19 @@
import { Suspense } from "react"
import { cacheLife, cacheTag } from "next/cache"
"use server"
import { auth0 } from "@/lib/auth0"
import { BillingView } from "./billing-view"
import { trackBillingPageViewed } from "@/lib/analytics/tracking"
import { SUBSCRIPTION_PLANS, checkAndResetSubscriptionPeriod } from "@codeflash-ai/common"

async function getCachedSubscription(userId: string) {
"use cache"
cacheLife("frequent")
cacheTag(`billing:${userId}`)

const sub = await checkAndResetSubscriptionPeriod(userId)
if (!sub) return null
// Return plain object (no Prisma model) for cache serialization
return {
plan_type: sub.plan_type,
optimizations_used: sub.optimizations_used,
optimizations_limit: sub.optimizations_limit,
subscription_status: sub.subscription_status,
stripe_customer_id: sub.stripe_customer_id,
stripe_subscription_id: sub.stripe_subscription_id,
current_period_start: sub.current_period_start,
current_period_end: sub.current_period_end,
}
}

async function BillingContent() {
export default async function BillingPage() {
const session = await auth0.getSession()
if (!session?.user) return null
const userId = session.user.sub
try {
// Track page view (fire-and-forget — batched by PostHog client)
trackBillingPageViewed(userId, { username: session.user.nickname })
// Track page view
await trackBillingPageViewed(userId, { username: session.user.nickname })

// Get subscription info (cached 30s stale / 60s revalidate)
const subscription = (await getCachedSubscription(userId)) || {
// Get subscription info from database with lazy reset
const subscription = (await checkAndResetSubscriptionPeriod(userId)) || {
plan_type: "free",
optimizations_used: 0,
optimizations_limit: SUBSCRIPTION_PLANS.FREE.optimizations,
@@ -58,11 +37,3 @@ async function BillingContent() {
)
}
}

export default async function BillingPage() {
return (
<Suspense fallback={<div>Loading...</div>}>
<BillingContent />
</Suspense>
)
}

@ -7,19 +7,8 @@ import {
|
|||
AccountPayload,
|
||||
checkAndResetSubscriptionPeriod,
|
||||
} from "@codeflash-ai/common"
|
||||
import { Prisma } from "@prisma/client"
|
||||
import { dedup } from "@/lib/request-dedup"
|
||||
|
||||
const VALID_EVENT_TYPES = new Set(["pr_created", "pr_merged", "pr_closed", "no-pr", "all"])
|
||||
|
||||
/** Validate an event_type filter value against the allowlist. */
|
||||
function validateEventType(value: string): string {
|
||||
if (!VALID_EVENT_TYPES.has(value)) {
|
||||
throw new Error(`Invalid event type: ${value}`)
|
||||
}
|
||||
return value
|
||||
}
|
||||
|
||||
export interface RepositoryWithUsage {
|
||||
id: string
|
||||
github_repo_id: string
|
||||
|
|
@@ -59,28 +48,23 @@ export async function getAllRepositories(
      },
    })

    const activeRepoSet = new Set(
      activeRepoIds.map((r: { repository_id: string | null }) => r.repository_id),
    )
    const activeRepoSet = new Set(activeRepoIds.map(r => r.repository_id))

    const result = repos.map(repo => {
      const organization = repo.full_name.split("/")[0]
      return {
        id: repo.id,
        github_repo_id: repo.github_repo_id,
        name: repo.name,
        full_name: repo.full_name,
        is_private: repo.is_private,
        is_active: activeRepoSet.has(repo.id),
        has_github_action: repo.has_github_action,
        created_at: repo.created_at,
        last_optimized: repo.last_optimized,
        optimizations_limit: repo.optimizations_limit,
        optimizations_used: repo.optimizations_used,
        organization,
        avatarUrl: `https://github.com/${organization}.png`,
      }
    })
    const result = repos.map(repo => ({
      id: repo.id,
      github_repo_id: repo.github_repo_id,
      name: repo.name,
      full_name: repo.full_name,
      is_private: repo.is_private,
      is_active: activeRepoSet.has(repo.id),
      has_github_action: repo.has_github_action,
      created_at: repo.created_at,
      last_optimized: repo.last_optimized,
      optimizations_limit: repo.optimizations_limit,
      optimizations_used: repo.optimizations_used,
      organization: repo.full_name.split("/")[0],
      avatarUrl: `https://github.com/${repo.full_name.split("/")[0]}.png`,
    }))

    return result
  } catch (error) {
@@ -90,75 +74,30 @@ export async function getAllRepositories(
  })
}

/**
 * Build the base_events CTE for the statistics query.
 *
 * For org accounts this is a simple `WHERE repository_id IN (...)`.
 * For personal accounts the OR across repository_id / user_id /
 * current_username forces PostgreSQL into a slow bitmap OR merge when
 * there are 100+ repo UUIDs. Instead we emit three UNION branches so
 * each hits its own composite index independently.
 *
 * Returns a `Prisma.Sql` fragment for safe composition via tagged templates.
 */
function buildBaseEventsCte(payload: AccountPayload, repoIds: string[], year?: number): Prisma.Sql {
  const safeYear = year != null ? Math.trunc(year) : undefined
  const yearCondition = safeYear
    ? Prisma.sql`AND EXTRACT(YEAR FROM created_at) = ${safeYear}`
    : Prisma.empty

  const repoInClause = Prisma.sql`repository_id IN (${Prisma.join(repoIds)})`
function buildOptimizationWhereClause(
  payload: AccountPayload,
  repoIds: string[],
  year?: number,
): string {
  const repoIdsString = repoIds.map(id => `'${id}'`).join(",")
  const yearCondition = year ? `AND EXTRACT(YEAR FROM created_at) = ${year}` : ""

  if ("orgId" in payload) {
    return Prisma.sql`base_events AS (
      SELECT
        created_at,
        is_optimization_found,
        current_username,
        repository_id,
        event_type
      FROM optimization_events
      WHERE ${repoInClause} ${yearCondition}
    )`
    return `repository_id IN (${repoIdsString}) ${yearCondition}`
  } else {
    const userId = payload.userId.replace(/'/g, "''")
    const username = payload.username.replace(/'/g, "''")

    return `(
      repository_id IN (${repoIdsString})
      OR user_id = '${userId}'
      OR current_username = '${username}'
    ) ${yearCondition}`
  }

  // Personal account: UNION three index-backed scans, then deduplicate.
  // Each branch can seek on its leading index column.
  const { userId, username } = payload

  return Prisma.sql`base_events AS (
    SELECT
      created_at,
      is_optimization_found,
      current_username,
      repository_id,
      event_type
    FROM optimization_events
    WHERE ${repoInClause} ${yearCondition}
    UNION
    SELECT
      created_at,
      is_optimization_found,
      current_username,
      repository_id,
      event_type
    FROM optimization_events
    WHERE user_id = ${userId} ${yearCondition}
    UNION
    SELECT
      created_at,
      is_optimization_found,
      current_username,
      repository_id,
      event_type
    FROM optimization_events
    WHERE current_username = ${username} ${yearCondition}
  )`
}

export async function statistics(payload: AccountPayload, year: number) {
  try {
    const safeYear = Math.trunc(year) // ensure integer for SQL interpolation
    const { repoIds } = await getRepositoriesForAccountCached(payload)

    if (repoIds.length === 0) {

@@ -171,9 +110,10 @@ export async function statistics(payload: AccountPayload, year: number) {
    }

    const since = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000)
    const baseEventsCte = buildBaseEventsCte(payload, repoIds, safeYear)
    const whereClause = buildOptimizationWhereClause(payload, repoIds, year)

    const result = await prisma.$queryRaw<
    const sinceFormatted = since.toISOString()
    const result = await prisma.$queryRawUnsafe<
      Array<{
        total_attempts: bigint
        successful_attempts: bigint
@@ -182,90 +122,89 @@ export async function statistics(payload: AccountPayload, year: number) {
        active_repos: string
        pr_stats: string
      }>
    >`
      WITH
        -- Step 1: Collect matching rows (UNION for personal accounts)
        ${baseEventsCte},

        -- Step 1b: Derive date-based columns from the base rows
        prepared_events AS (
          SELECT
    >(
      `
      WITH
        -- Step 1: Filter and prepare base data with dynamic WHERE
        base_events AS (
          SELECT
            created_at,
            is_optimization_found,
            current_username,
            repository_id,
            event_type,
            DATE(created_at) as event_date,
            created_at >= ${since}::timestamp as is_recent,
            EXTRACT(YEAR FROM created_at)::int = ${safeYear} as is_target_year,
            created_at >= '${sinceFormatted}'::timestamp as is_recent,
            EXTRACT(YEAR FROM created_at)::int = ${year} as is_target_year,
            EXTRACT(MONTH FROM created_at)::int as event_month
          FROM base_events
          FROM optimization_events
          WHERE ${whereClause}
        ),

        -- Step 2: Calculate total aggregates
        totals AS (
          SELECT
          SELECT
            COUNT(*)::bigint as total_attempts,
            SUM(CASE WHEN is_optimization_found THEN 1 ELSE 0 END)::bigint as successful_attempts
          FROM prepared_events
          FROM base_events
        ),

        -- Step 3: Daily time series with cumulative counts (WINDOW FUNCTIONS!)
        daily_series AS (
          SELECT
          SELECT
            event_date,
            COUNT(*) as daily_all,
            SUM(CASE WHEN is_optimization_found THEN 1 ELSE 0 END) as daily_success,
            SUM(COUNT(*)) OVER (
              ORDER BY event_date
              ORDER BY event_date
              ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
            ) as cumulative_all,
            SUM(SUM(CASE WHEN is_optimization_found THEN 1 ELSE 0 END)) OVER (
              ORDER BY event_date
              ORDER BY event_date
              ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
            ) as cumulative_success
          FROM prepared_events
          FROM base_events
          GROUP BY event_date
          ORDER BY event_date
        ),

        -- Step 4: Active users (last 30 days)
        active_users_agg AS (
          SELECT
          SELECT
            current_username,
            COUNT(*)::bigint as event_count
          FROM prepared_events
          WHERE is_recent = true
          FROM base_events
          WHERE is_recent = true
            AND current_username IS NOT NULL
          GROUP BY current_username
          ORDER BY event_count DESC
          LIMIT 100
        ),

        -- Step 5: Active repos (last 30 days)
        active_repos_agg AS (
          SELECT DISTINCT repository_id
          FROM prepared_events
          WHERE is_recent = true
          FROM base_events
          WHERE is_recent = true
            AND repository_id IS NOT NULL
        ),

        -- Step 6: PR stats by month
        pr_stats_agg AS (
          SELECT
          SELECT
            event_month as month,
            SUM(CASE WHEN event_type = 'pr_created' THEN 1 ELSE 0 END)::int as pr_created,
            SUM(CASE WHEN event_type = 'pr_merged' THEN 1 ELSE 0 END)::int as pr_merged,
            SUM(CASE WHEN event_type = 'pr_closed' THEN 1 ELSE 0 END)::int as pr_closed
          FROM prepared_events
          FROM base_events
          WHERE is_target_year = true
            AND event_type IN ('pr_created', 'pr_merged', 'pr_closed')
          GROUP BY event_month
        ),

        -- Step 7: Aggregate time series into JSON
        time_series_json AS (
          SELECT
          SELECT
            COALESCE(
              json_agg(
                json_build_object(
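The `daily_series` CTE in the hunks above pairs per-day aggregates with window functions (`SUM(COUNT(*)) OVER (ORDER BY event_date ...)`) to produce a running total alongside each day's count. A plain-TypeScript sketch of what that computation yields (in-memory, not the SQL itself; the row shape here is illustrative):

```typescript
interface DayRow { date: string; dailyAll: number }

// Sort by date, then carry a running total — the in-memory analogue of
// SUM(...) OVER (ORDER BY event_date ROWS UNBOUNDED PRECEDING).
function withCumulative(rows: DayRow[]): Array<DayRow & { cumulativeAll: number }> {
  let running = 0
  return [...rows]
    .sort((a, b) => a.date.localeCompare(b.date))
    .map(r => {
      running += r.dailyAll
      return { ...r, cumulativeAll: running }
    })
}

const series = withCumulative([
  { date: "2025-01-02", dailyAll: 3 },
  { date: "2025-01-01", dailyAll: 2 },
])
// series[0] → { date: "2025-01-01", dailyAll: 2, cumulativeAll: 2 }
// series[1] → { date: "2025-01-02", dailyAll: 3, cumulativeAll: 5 }
```

Doing this in one SQL pass (rather than fetching rows and accumulating in JS) is what lets the query return both the daily and cumulative series in a single round trip.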
@@ -278,10 +217,10 @@ export async function statistics(payload: AccountPayload, year: number) {
            ) as daily_time_series
          FROM daily_series
        ),

        -- Step 8: Aggregate active users into JSON
        users_json AS (
          SELECT
          SELECT
            COALESCE(
              json_agg(
                json_build_object(

@@ -293,20 +232,20 @@ export async function statistics(payload: AccountPayload, year: number) {
            ) as active_users
          FROM active_users_agg
        ),

        -- Step 9: Aggregate active repos into JSON
        repos_json AS (
          SELECT
          SELECT
            COALESCE(
              json_agg(repository_id::text),
              '[]'::json
            ) as active_repos
          FROM active_repos_agg
        ),

        -- Step 10: Aggregate PR stats into JSON
        pr_json AS (
          SELECT
          SELECT
            COALESCE(
              json_agg(
                json_build_object(

@@ -320,9 +259,9 @@ export async function statistics(payload: AccountPayload, year: number) {
            ) as pr_stats
          FROM pr_stats_agg
        )

        -- Final: Combine everything into single row
        SELECT
        SELECT
          COALESCE(t.total_attempts, 0) as total_attempts,
          COALESCE(t.successful_attempts, 0) as successful_attempts,
          ts.daily_time_series::text as daily_time_series,

@@ -334,7 +273,8 @@ export async function statistics(payload: AccountPayload, year: number) {
        CROSS JOIN users_json u
        CROSS JOIN repos_json r
        CROSS JOIN pr_json p
      `
      `,
    )

    const data = result[0]
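The two query styles visible in this diff differ in how values reach SQL: `prisma.$queryRaw` with the `Prisma.sql` tagged template sends values as bind parameters, while `$queryRawUnsafe` interpolates pre-escaped strings into the query text (the pattern the commit message flags). A minimal mock of how a tagged template separates text from values — this is an illustration of the mechanism, not Prisma's actual implementation:

```typescript
type SqlFragment = { text: string; values: unknown[] }

// Literal chunks become the query text with numbered placeholders ($1, $2, ...);
// interpolated values never touch the SQL string itself.
function sql(strings: TemplateStringsArray, ...values: unknown[]): SqlFragment {
  const text = strings.reduce((acc, chunk, i) => acc + `$${i}` + chunk)
  return { text, values }
}

const username = "o'brien'; DROP TABLE users; --"
const query = sql`SELECT 1 FROM optimization_events WHERE current_username = ${username}`
// query.text  → "SELECT 1 FROM optimization_events WHERE current_username = $1"
// query.values carries the hostile string as an inert bind value
```

Because the value rides separately, a quote in user input can never terminate the SQL string — which is why hand-rolled `.replace(/'/g, "''")` escaping is the fragile alternative.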
@@ -460,23 +400,88 @@ export async function getOptimizationPRs(
      }
    }

    // Build parameterized SQL fragments
    const repoInClause = Prisma.sql`oe.repository_id IN (${Prisma.join(repoIds)})`
    // Build WHERE conditions with parameterized queries
    const repoIdsString = repoIds.map(id => `'${id.replace(/'/g, "''")}'`).join(",")

    const repositoryCondition = repositoryId
      ? Prisma.sql`AND oe.repository_id = ${repositoryId}`
      : Prisma.empty
    let accountCondition: string
    if ("orgId" in payload) {
      accountCondition = `oe.repository_id IN (${repoIdsString})`
    } else {
      const userId = payload.userId.replace(/'/g, "''")
      const username = payload.username.replace(/'/g, "''")
      accountCondition = `(
        oe.repository_id IN (${repoIdsString})
        OR oe.user_id = '${userId}'
        OR oe.current_username = '${username}'
      )`
    }

    const eventTypeCondition =
      eventTypeFilter && eventTypeFilter !== "all"
        ? Prisma.sql`AND oe.event_type = ${validateEventType(eventTypeFilter)}`
        : Prisma.sql`AND oe.event_type IN ('pr_created','pr_merged','pr_closed')`
        ? `AND oe.event_type = '${String(eventTypeFilter).replace(/'/g, "''")}'`
        : `AND oe.event_type IN ('pr_created','pr_merged','pr_closed')`

    const safePageSize = Math.trunc(pageSize)
    const offset = Math.trunc((page - 1) * safePageSize)
    const repositoryCondition = repositoryId
      ? `AND oe.repository_id = '${String(repositoryId).replace(/'/g, "''")}'`
      : ""

    // Shared select fields fragment for the data query
    const selectFields = Prisma.sql`
    // Separate WHERE clauses: the count query uses EXISTS to avoid joining the
    // large optimization_features table when oe.pr_url already satisfies the
    // "has a PR" condition. The data query still LEFT JOINs to pull fallback
    // fields but only for the small LIMIT'd result set.
    const prCondition = `
      AND oe.is_optimization_found = true
      AND (
        oe.pr_url IS NOT NULL
        OR EXISTS (
          SELECT 1 FROM optimization_features of2
          WHERE of2.trace_id = oe.trace_id
            AND of2.pull_request IS NOT NULL
        )
      )
    `

    const countWhereClause = `
      ${accountCondition}
      ${eventTypeCondition}
      ${repositoryCondition}
      ${prCondition}
    `

    const dataWhereClause = `
      ${accountCondition}
      ${eventTypeCondition}
      ${repositoryCondition}
      AND oe.is_optimization_found = true
      AND (
        oe.pr_url IS NOT NULL
        OR of.pull_request IS NOT NULL
      )
    `

    const offset = (page - 1) * pageSize

    // Run data + count queries in parallel.
    // Count uses EXISTS (no JOIN on optimization_features).
    // Data query JOINs optimization_features only for the LIMIT'd rows.
    const [events, countRows] = await Promise.all([
      prisma.$queryRawUnsafe<
        Array<{
          id: string
          event_type: string
          pr_url: string | null
          function_name: string | null
          file_path: string | null
          speedup_x: number | null
          speedup_pct: number | null
          created_at: Date
          repository_id: string | null
          repo_name: string | null
          repo_full_name: string | null
        }>
      >(
        `
        SELECT
          oe.id,
          oe.event_type,
          COALESCE(

@@ -530,171 +535,29 @@ export async function getOptimizationPRs(
          oe.created_at,
          oe.repository_id,
          r.name AS repo_name,
          r.full_name AS repo_full_name`

    // Build count and data queries — for personal accounts, rewrite 3-way OR
    // as UNION so each branch uses its optimal composite index independently
    // instead of a slow bitmap OR merge across 100+ repo UUIDs.
    let countQuery: Prisma.Sql
    let dataQuery: Prisma.Sql

    if ("orgId" in payload) {
      // Org: LEFT JOIN with optimization_features to avoid EXISTS subquery evaluation per row
      countQuery = Prisma.sql`
        SELECT COUNT(*)::bigint AS count
          r.full_name AS repo_full_name
        FROM optimization_events oe
        LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
        WHERE ${repoInClause}
          ${eventTypeCondition}
          ${repositoryCondition}
          AND oe.is_optimization_found = true
          AND (oe.pr_url IS NOT NULL OR of.pull_request IS NOT NULL)
      `

      // Two-phase: first identify the page of event IDs using LEFT JOIN instead
      // of EXISTS to avoid row-by-row subquery evaluation, then JOIN for display data.
      dataQuery = Prisma.sql`
        WITH page_ids AS (
          SELECT oe.id
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE ${repoInClause}
            ${eventTypeCondition}
            ${repositoryCondition}
            AND oe.is_optimization_found = true
            AND (oe.pr_url IS NOT NULL OR of.pull_request IS NOT NULL)
          ORDER BY oe.created_at DESC
          LIMIT ${safePageSize} OFFSET ${offset}
        )
        SELECT ${selectFields}
        FROM optimization_events oe
        INNER JOIN page_ids pi ON pi.id = oe.id
        LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
        LEFT JOIN repositories r ON oe.repository_id = r.id
        WHERE ${dataWhereClause}
        ORDER BY oe.created_at DESC
      `
    } else {
      // Personal: UNION for index-backed scans
      const { userId, username } = payload
      const repoInClauseNoAlias = Prisma.sql`oe.repository_id IN (${Prisma.join(repoIds)})`
      const repoFilterNoAlias = repositoryId
        ? Prisma.sql`AND repository_id = ${repositoryId}`
        : Prisma.empty
      const eventFilterNoAlias =
        eventTypeFilter && eventTypeFilter !== "all"
          ? Prisma.sql`event_type = ${validateEventType(eventTypeFilter)}`
          : Prisma.sql`event_type IN ('pr_created','pr_merged','pr_closed')`
      const branchFilters = Prisma.sql`AND ${eventFilterNoAlias} AND is_optimization_found = true ${repoFilterNoAlias}`

      countQuery = Prisma.sql`
        WITH candidate_events AS (
          SELECT oe.id, oe.trace_id, oe.pr_url,
            of.pull_request IS NOT NULL AS has_pr_in_features
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE ${repoInClauseNoAlias} ${branchFilters}
          UNION
          SELECT oe.id, oe.trace_id, oe.pr_url,
            of.pull_request IS NOT NULL AS has_pr_in_features
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE oe.user_id = ${userId} ${branchFilters}
          UNION
          SELECT oe.id, oe.trace_id, oe.pr_url,
            of.pull_request IS NOT NULL AS has_pr_in_features
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE oe.current_username = ${username} ${branchFilters}
        )
        LIMIT ${pageSize} OFFSET ${offset}
        `,
      ),
      prisma.$queryRawUnsafe<Array<{ count: bigint }>>(
        `
        SELECT COUNT(*)::bigint AS count
        FROM candidate_events ce
        WHERE ce.pr_url IS NOT NULL OR ce.has_pr_in_features
      `

      // Personal: two-phase CTE approach to avoid joining large tables
      // before sorting and limiting.
      //
      // Phase 1 (candidates): UNION for index-backed scans, carrying
      // id + created_at + pr_url + trace_id for filtering and sorting.
      // Phase 2 (page_ids): Filter for PR presence (pr_url OR optimization_features),
      // sort by created_at DESC, and LIMIT — so the expensive JOINs only
      // happen for the final page of results.
      dataQuery = Prisma.sql`
        WITH candidates AS (
          SELECT oe.id, oe.created_at, oe.pr_url, oe.trace_id,
            of.pull_request IS NOT NULL AS has_pr_in_features
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE ${repoInClauseNoAlias} ${branchFilters}
          UNION
          SELECT oe.id, oe.created_at, oe.pr_url, oe.trace_id,
            of.pull_request IS NOT NULL AS has_pr_in_features
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE oe.user_id = ${userId} ${branchFilters}
          UNION
          SELECT oe.id, oe.created_at, oe.pr_url, oe.trace_id,
            of.pull_request IS NOT NULL AS has_pr_in_features
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          WHERE oe.current_username = ${username} ${branchFilters}
        ),
        page_ids AS (
          SELECT id
          FROM candidates c
          WHERE c.pr_url IS NOT NULL OR c.has_pr_in_features
          ORDER BY c.created_at DESC
          LIMIT ${safePageSize} OFFSET ${offset}
        )
        SELECT ${selectFields}
        FROM optimization_events oe
        INNER JOIN page_ids pi ON pi.id = oe.id
        LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
        LEFT JOIN repositories r ON oe.repository_id = r.id
        ORDER BY oe.created_at DESC
      `
    }

    // Run data + count queries in parallel.
    // Both use UNION (personal) or flat WHERE (org) to avoid bitmap OR.
    const [events, countRows] = await Promise.all([
      prisma.$queryRaw<
        Array<{
          id: string
          event_type: string
          pr_url: string | null
          function_name: string | null
          file_path: string | null
          speedup_x: number | null
          speedup_pct: number | null
          created_at: Date
          repository_id: string | null
          repo_name: string | null
          repo_full_name: string | null
        }>
      >(dataQuery),
      prisma.$queryRaw<Array<{ count: bigint }>>(countQuery),
        WHERE ${countWhereClause}
        `,
      ),
    ])

    const totalCount = Number(countRows?.[0]?.count ?? 0)
    const totalPages = Math.ceil(totalCount / safePageSize)
    const totalPages = Math.ceil(totalCount / pageSize)

    return {
      events: (
        events as Array<{
          id: string
          event_type: string
          pr_url: string | null
          function_name: string | null
          file_path: string | null
          speedup_x: number | null
          speedup_pct: number | null
          created_at: Date
          repository_id: string | null
          repo_name: string | null
          repo_full_name: string | null
        }>
      ).map(e => ({
      events: events.map(e => ({
        id: e.id,
        event_type: e.event_type,
        pr_url: e.pr_url,
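The `page_ids` CTE pattern in the hunks above pages in two phases: first pick only the ids for the requested page from a cheap, filtered, sorted candidate list, then join the wide tables for just those ids. A small in-memory sketch of the same filter/sort/slice logic (the row shape and field names are illustrative, not the project's schema):

```typescript
interface Candidate { id: string; createdAt: number; hasPr: boolean }

// Phase 1 only: keep rows with a PR, sort newest-first, take one page of ids.
// The expensive joins would then run only for the returned ids.
function pageIds(candidates: Candidate[], page: number, pageSize: number): string[] {
  const offset = (page - 1) * pageSize
  return candidates
    .filter(c => c.hasPr)                      // PR-presence filter
    .sort((a, b) => b.createdAt - a.createdAt) // ORDER BY created_at DESC
    .slice(offset, offset + pageSize)          // LIMIT ... OFFSET ...
    .map(c => c.id)
}

const candidates: Candidate[] = [
  { id: "a", createdAt: 1, hasPr: true },
  { id: "b", createdAt: 3, hasPr: true },
  { id: "c", createdAt: 2, hasPr: false },
  { id: "d", createdAt: 4, hasPr: true },
]
// pageIds(candidates, 1, 2) → ["d", "b"]
```

Note that `Math.trunc(pageSize)` in the removed code served the same hygiene purpose as the filter step here: a non-integer or hostile pagination value must be normalized before it reaches `LIMIT`/`OFFSET`.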
@@ -1,21 +1,13 @@
import { Suspense } from "react"
import { Lock, Globe, Zap, Gauge, FolderGit2, BookOpen } from "lucide-react"
import { getDashboardData, getOptimizationPRs } from "./action"
import { getDashboardData } from "./action"
import { getAccountContext } from "@/lib/server/get-account-context"
import { ActiveUsersLeaderboard } from "@/components/dashboard/ActiveUsersLeaderboard"
import { CompactPullRequestActivityCard } from "@/components/dashboard/CompactPullRequestActivityCard"
import { MetricCard } from "@/components/dashboard/MetricCard"
import { OptimizationPRsTable } from "@/components/dashboard/OptimizationPRsTable"
import {
  OptimizationPRsTableSkeleton,
  MetricCardSkeleton,
  PullRequestActivityCardSkeleton,
  ActiveUsersLeaderboardSkeleton,
} from "@/components/dashboard/DashboardSkeleton"
import { YearSelector } from "./_components/YearSelector"
import { cacheLife, cacheTag } from "next/cache"
import { format, subDays } from "date-fns"
import type { AccountPayload } from "@codeflash-ai/common"

function getDateRangeDisplay(): string {
  const now = new Date()

@@ -34,29 +26,18 @@ function getDateRangeDisplay(): string {
  return `${format(last30DaysStart, "MMMM d, yyyy")} - ${format(now, "MMMM d, yyyy")}`
}

// Async server component: streams PR table data independently
async function OptimizationPRsSection({ payload }: { payload: AccountPayload }) {
  "use cache"
  cacheLife("frequent")
  cacheTag("optimization-prs")

  const data = await getOptimizationPRs(payload)
  return <OptimizationPRsTable initialData={data} />
}

// Async server component: streams stats + charts independently
async function DashboardStatsSection({
  payload,
  selectedYear,
export default async function DashboardPage({
  searchParams,
}: {
  payload: AccountPayload
  selectedYear: number
  searchParams: Promise<{ year?: string }>
}) {
  "use cache"
  cacheLife("dashboard")
  cacheTag("dashboard-stats")
  const params = await searchParams
  const currentYear = new Date().getFullYear()
  const parsedYear = params.year ? parseInt(params.year, 10) : currentYear
  const selectedYear = Number.isNaN(parsedYear) ? currentYear : parsedYear

  const { stats, repos } = await getDashboardData(payload, selectedYear)
  const accountPayload = await getAccountContext()
  const { stats, repos } = await getDashboardData(accountPayload, selectedYear)

  const repositories = Array.isArray(repos) ? repos : []
  const privateRepos = repositories.filter(repo => repo?.is_private).length

@@ -75,7 +56,16 @@ async function DashboardStatsSection({
  )

  return (
    <>
    <div className="min-h-screen pb-8 py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
      <div className="mb-6 sm:mb-8">
        <div className="flex items-center justify-between mb-2">
          <h1 className="text-xl sm:text-2xl font-bold">Dashboard</h1>
          <Suspense>
            <YearSelector selectedYear={selectedYear} />
          </Suspense>
        </div>
      </div>

      {totalRepos === 0 && (
        <div className="mb-6 sm:mb-8">
          <div className="rounded-xl border border-dashed border-border bg-muted/10 px-5 py-4 sm:px-6 sm:py-5 flex flex-col sm:flex-row sm:items-center sm:justify-between gap-3">

@@ -110,6 +100,11 @@ async function DashboardStatsSection({
        </div>
      )}

      {/* Optimization PRs Table */}
      <div className="mb-6 sm:mb-8">
        <OptimizationPRsTable />
      </div>

      <div className="grid grid-cols-1 gap-3 sm:gap-5 mb-6 sm:mb-8">
        <div className="grid grid-cols-1 sm:grid-cols-2 gap-3 sm:gap-5">
          <MetricCard

@@ -205,68 +200,6 @@ async function DashboardStatsSection({
          <ActiveUsersLeaderboard leaderboardData={stats.activeUsersLast30Days} />
        </div>
      </div>
    </>
  )
}

// Skeleton for the stats section matching the exact layout structure
function StatsSkeletonFallback() {
  return (
    <>
      <div className="grid grid-cols-1 gap-3 sm:gap-5 mb-6 sm:mb-8">
        <div className="grid grid-cols-1 sm:grid-cols-2 gap-3 sm:gap-5">
          <MetricCardSkeleton showChart={true} />
          <MetricCardSkeleton showChart={true} />
        </div>
        <div className="grid grid-cols-1 sm:grid-cols-2 md:grid-cols-4 gap-3 sm:gap-5">
          <MetricCardSkeleton showChart={false} />
          <MetricCardSkeleton showChart={false} />
          <MetricCardSkeleton showChart={false} />
          <MetricCardSkeleton showChart={false} />
        </div>
      </div>
      <div className="grid grid-cols-1 md:grid-cols-2 gap-3 sm:gap-5 mb-6 sm:mb-8 h-96 md:h-[500px]">
        <PullRequestActivityCardSkeleton />
        <ActiveUsersLeaderboardSkeleton />
      </div>
    </>
  )
}

export default async function DashboardPage({
  searchParams,
}: {
  searchParams: Promise<{ year?: string }>
}) {
  const params = await searchParams
  const currentYear = new Date().getFullYear()
  const parsedYear = params.year ? parseInt(params.year, 10) : currentYear
  const selectedYear = Number.isNaN(parsedYear) ? currentYear : parsedYear

  const accountPayload = await getAccountContext()

  return (
    <div className="min-h-screen pb-8 py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
      <div className="mb-6 sm:mb-8">
        <div className="flex items-center justify-between mb-2">
          <h1 className="text-xl sm:text-2xl font-bold">Dashboard</h1>
          <Suspense>
            <YearSelector selectedYear={selectedYear} />
          </Suspense>
        </div>
      </div>

      {/* PR table streams independently — first data section to appear */}
      <div className="mb-6 sm:mb-8">
        <Suspense fallback={<OptimizationPRsTableSkeleton />}>
          <OptimizationPRsSection payload={accountPayload} />
        </Suspense>
      </div>

      {/* Stats, metrics, and charts stream as a group */}
      <Suspense fallback={<StatsSkeletonFallback />}>
        <DashboardStatsSection payload={accountPayload} selectedYear={selectedYear} />
      </Suspense>
    </div>
  )
}
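Both versions of `DashboardPage` above parse the `?year=` search param with `parseInt` and fall back to the current year when the result is `NaN`, so a malformed query string degrades gracefully instead of crashing the page. Extracted as a standalone sketch (the function name is illustrative):

```typescript
// parseInt plus a NaN fallback: "?year=abc" and a missing param both
// resolve to the current year instead of propagating NaN into queries.
function resolveYear(raw: string | undefined, currentYear: number): number {
  const parsed = raw ? parseInt(raw, 10) : currentYear
  return Number.isNaN(parsed) ? currentYear : parsed
}

// resolveYear("2023", 2026)    → 2023
// resolveYear("abc", 2026)     → 2026
// resolveYear(undefined, 2026) → 2026
```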
@@ -1,9 +1,8 @@
import { Suspense } from "react"
import { auth0 } from "@/lib/auth0"
import PostHogClient from "@/lib/posthog"
import GettingStartedClient from "./getting-started-client"

async function GettingStartedContent() {
export default async function GettingStarted() {
  const session = await auth0.getSession()
  if (!session) return null

@@ -15,13 +14,7 @@ async function GettingStartedContent() {
    event: "webapp-loaded-getting-started",
  })

  await posthog?.flush()

  return <GettingStartedClient />
}

export default function GettingStarted() {
  return (
    <Suspense fallback={<div>Loading...</div>}>
      <GettingStartedContent />
    </Suspense>
  )
}
@@ -1,38 +1,40 @@
import { auth0 } from "@/lib/auth0"
import { cookies } from "next/headers"
import { redirect } from "next/navigation"
import { ReactNode } from "react"
import { hasCompletedOnboarding } from "@codeflash-ai/common"
import Script from "next/script"
import { ViewModeProvider } from "../app/ViewModeContext"
import { PrivacyModeProvider } from "../app/PrivacyModeContext"
import { DashboardShell } from "@/components/dashboard-shell"
import { getCachedDashboardData } from "@/lib/cached-dashboard-data"
import { getDashboardInitData } from "../app/init-data-action"

export default async function DashboardLayout({ children }: { children: ReactNode }) {
  const session = await auth0.getSession()
  if (!session) return null

  const [data, cookieStore] = await Promise.all([
    getCachedDashboardData(session.user.sub),
    cookies(),
  const [completedOnboarding, initData] = await Promise.all([
    hasCompletedOnboarding(session.user.sub),
    getDashboardInitData(session.user.sub),
  ])
  if (!data.onboardingCompleted) {
  if (!completedOnboarding) {
    redirect("/onboarding")
  }

  const serverOrgId = cookieStore.get("currentOrganizationId")?.value ?? null

  return (
    <ViewModeProvider
      user={session.user}
      initialOrganizations={data.organizations}
      serverOrgId={serverOrgId}
    >
    <ViewModeProvider user={session.user} initialOrganizations={initData.organizations}>
      <PrivacyModeProvider
        userId={session.user.sub}
        initialPrivacyMode={data.privacyMode}
        initialCanUsePrivacyMode={data.canUsePrivacyMode}
        initialPrivacyMode={initData.privacyMode}
        initialCanUsePrivacyMode={initData.canUsePrivacyMode}
      >
        <DashboardShell user={session.user} initialSubscription={data.subscription}>
        <DashboardShell user={session.user} initialSubscription={initData.subscription}>
          <Script
            id="crisp-chat-script"
            strategy="afterInteractive"
            dangerouslySetInnerHTML={{
              __html: `window.$crisp=[];window.CRISP_WEBSITE_ID="3e855999-42a1-4543-accf-afc369edfca0";(function(){d=document;s=d.createElement("script");s.src="https://client.crisp.chat/l.js";s.async=1;d.getElementsByTagName("head")[0].appendChild(s);})();`,
            }}
          />
          {children}
        </DashboardShell>
      </PrivacyModeProvider>
@@ -47,8 +47,7 @@ describe("getOrganizationMembers", () => {

  describe("successful retrieval", () => {
    it("returns members when user has access", async () => {
      vi.mocked(prisma.organizations.findUnique).mockResolvedValue(mockOrg as any)
      vi.mocked(prisma.organization_members.findUnique).mockResolvedValue({ id: "member-1" } as any)
      vi.mocked(prisma.organizations.findFirst).mockResolvedValue(mockOrg as any)

      const result = await getOrganizationMembers("user-1", "org-1")

@@ -57,8 +56,7 @@ describe("getOrganizationMembers", () => {
    })

    it("maps nested organization_members to flat Member structure", async () => {
      vi.mocked(prisma.organizations.findUnique).mockResolvedValue(mockOrg as any)
      vi.mocked(prisma.organization_members.findUnique).mockResolvedValue({ id: "member-1" } as any)
      vi.mocked(prisma.organizations.findFirst).mockResolvedValue(mockOrg as any)

      const result = await getOrganizationMembers("user-1", "org-1")
      const member = result.data![0]

@@ -78,8 +76,7 @@ describe("getOrganizationMembers", () => {

  describe("access control", () => {
    it("returns error when organization not found", async () => {
      vi.mocked(prisma.organizations.findUnique).mockResolvedValue(null)
      vi.mocked(prisma.organization_members.findUnique).mockResolvedValue(null)
      vi.mocked(prisma.organizations.findFirst).mockResolvedValue(null)

      const result = await getOrganizationMembers("user-1", "org-1")

@@ -88,8 +85,7 @@ describe("getOrganizationMembers", () => {
    })

    it("returns error when user is not in organization members", async () => {
      vi.mocked(prisma.organizations.findUnique).mockResolvedValue(mockOrg as any)
      vi.mocked(prisma.organization_members.findUnique).mockResolvedValue(null)
      vi.mocked(prisma.organizations.findFirst).mockResolvedValue(mockOrg as any)

      const result = await getOrganizationMembers("unknown-user", "org-1")

@@ -100,7 +96,9 @@ describe("getOrganizationMembers", () => {

  describe("error handling", () => {
    it("returns error response when Prisma throws", async () => {
      vi.mocked(prisma.organizations.findUnique).mockRejectedValue(new Error("Connection failed"))
      vi.mocked(prisma.organizations.findFirst).mockRejectedValue(
        new Error("Connection failed"),
      )

      const result = await getOrganizationMembers("user-1", "org-1")

@@ -109,7 +107,7 @@ describe("getOrganizationMembers", () => {
    })

    it("uses fallback message for non-Error exceptions", async () => {
      vi.mocked(prisma.organizations.findUnique).mockRejectedValue("string error")
      vi.mocked(prisma.organizations.findFirst).mockRejectedValue("string error")

      const result = await getOrganizationMembers("user-1", "org-1")
@@ -16,69 +16,52 @@ import { trackMemberInvited } from "@/lib/analytics/tracking"
*/
export const getOrganizationMembers = withTiming(
"getOrganizationMembers",
async (currentUserId: string, organizationId: string): Promise<ActionResponse<Member[]>> => {
try {
// Check access via indexed composite key in parallel with member fetch
const [org, accessCheck] = await Promise.all([
prisma.organizations.findUnique({
where: { id: organizationId },
select: {
id: true,
organization_members: {
include: {
user: { select: { user_id: true, github_username: true, name: true, email: true } },
},
orderBy: {
added_at: "asc",
},
},
async (
currentUserId: string,
organizationId: string,
): Promise<ActionResponse<Member[]>> => {
try {
const org = await prisma.organizations.findFirst({
where: { id: organizationId },
include: {
organization_members: {
include: {
user: true,
},
}),
prisma.organization_members.findUnique({
where: {
organization_id_user_id: { organization_id: organizationId, user_id: currentUserId },
orderBy: {
added_at: "asc",
},
select: { id: true },
}),
])
},
},
})

if (!org) {
return createErrorResponse("Organization not found")
}

if (!accessCheck) {
return createErrorResponse("You don't have access to this organization")
}

const members: Member[] = org.organization_members.map(
(member: {
id: string
user_id: string
role: string
added_at: Date
user: {
user_id: string
github_username: string
name: string | null
email: string | null
}
}) => ({
id: member.id,
user_id: member.user_id,
username: member.user.github_username,
name: member.user.name,
email: member.user.email,
role: member.role,
added_at: member.added_at,
avatarUrl: `https://github.com/${member.user.github_username}.png`,
}),
)

return createSuccessResponse(members)
} catch (error) {
console.error("Failed to get organization members:", error)
return createErrorResponse(error instanceof Error ? error.message : "Failed to get members")
if (!org) {
return createErrorResponse("Organization not found")
}

// Check if user has access
const hasAccess = org.organization_members.some(m => m.user_id === currentUserId)

if (!hasAccess) {
return createErrorResponse("You don't have access to this organization")
}

const members: Member[] = org.organization_members.map(member => ({
id: member.id,
user_id: member.user_id,
username: member.user.github_username,
name: member.user.name,
email: member.user.email,
role: member.role,
added_at: member.added_at,
avatarUrl: `https://github.com/${member.user.github_username}.png`,
}))

return createSuccessResponse(members)
} catch (error) {
console.error("Failed to get organization members:", error)
return createErrorResponse(error instanceof Error ? error.message : "Failed to get members")
}
},
)
@@ -92,33 +75,20 @@ export async function addOrganizationMember(
organizationId: string,
): Promise<ActionResponse<Member>> {
try {
const org = await prisma.organizations.findFirst({
where: { id: organizationId },
include: {
organization_members: true,
},
})
const invitedUserId = `github|${invitedUser.githubUserId.toString()}`

// Verify org exists and check permissions + duplicate in parallel using indexed lookups
// instead of loading the entire organization_members array
const [orgExists, currentUserMember, existingMember] = await Promise.all([
prisma.organizations.findUnique({
where: { id: organizationId },
select: { id: true },
}),
prisma.organization_members.findUnique({
where: {
organization_id_user_id: { organization_id: organizationId, user_id: currentUserId },
},
select: { role: true },
}),
prisma.organization_members.findUnique({
where: {
organization_id_user_id: { organization_id: organizationId, user_id: invitedUserId },
},
select: { id: true },
}),
])

if (!orgExists) {
if (!org) {
return createErrorResponse("Organization not found")
}

const currentUserMember = org.organization_members.find(m => m.user_id === currentUserId)

// Check if user has permission to add members
const isAdmin = currentUserMember?.role === "admin" || currentUserMember?.role === "owner"

@@ -126,7 +96,9 @@ export async function addOrganizationMember(
return createErrorResponse("You don't have permission to add members")
}

// Check if member already exists
// Check if member already exists by username
const existingMember = org.organization_members.find(m => m.user_id === invitedUserId)

if (existingMember) {
return createErrorResponse("User is already a member of this organization")
}

@@ -134,16 +106,15 @@ export async function addOrganizationMember(
// Check if user exists in our database
let user = await getUserById(invitedUserId)

// If user doesn't exist, create them and re-fetch for consistent types
// If user doesn't exist, create them
if (!user) {
await prisma.users.create({
user = await prisma.users.create({
data: {
user_id: invitedUserId,
github_username: invitedUser.username,
onboarding_completed: false,
},
})
user = await getUserById(invitedUserId)
}

// Add user to organization members

@@ -164,16 +135,13 @@ export async function addOrganizationMember(
})

return createSuccessResponse({
id: String(newMember.id),
user_id: String(newMember.user_id),
id: newMember.id,
user_id: newMember.user_id,
username: invitedUser.username,
name: user.name ?? null,
email: user.email ?? null,
role: String(newMember.role),
added_at:
newMember.added_at instanceof Date
? newMember.added_at
: new Date(String(newMember.added_at)),
name: user.name,
email: user.email,
role: newMember.role,
added_at: newMember.added_at,
avatarUrl: invitedUser.avatarUrl,
})
} catch (error) {
@@ -192,34 +160,35 @@ export async function updateOrganizationMemberRole(
newRole: "admin" | "member" | "owner",
): Promise<ActionResponse<Boolean>> {
try {
// Fetch only the two specific members we need instead of loading ALL members
const [currentUserMember, targetMember] = await Promise.all([
prisma.organization_members.findUnique({
where: {
organization_id_user_id: { organization_id: organizationId, user_id: currentUserId },
},
select: { role: true },
}),
prisma.organization_members.findUnique({
where: { id: memberId },
select: { id: true, role: true, user_id: true },
}),
])
const org = await prisma.organizations.findFirst({
where: { id: organizationId },
include: {
organization_members: true,
},
})

if (!currentUserMember) {
if (!org) {
return createErrorResponse("Organization not found")
}

const currentUserMember = org.organization_members.find(m => m.user_id === currentUserId)

// Only admins and owners can change roles
if (currentUserMember.role !== "admin" && currentUserMember.role !== "owner") {
if (currentUserMember?.role !== "admin" && currentUserMember?.role !== "owner") {
return createErrorResponse("Only admins can change member roles")
}

// Don't allow changing owner role
const targetMember = org.organization_members.find(m => m.id === memberId)
if (targetMember?.role === "owner") {
return createErrorResponse("Cannot change owner role")
}

// Don't allow changing own role if you're the only admin
const adminCount = org.organization_members.filter(
m => m.role === "admin" || m.role === "owner",
).length

if (targetMember?.user_id === currentUserId) {
return createErrorResponse("Cannot change your own role as the only admin")
}
@@ -245,19 +214,19 @@ export async function removeOrganizationMember(
memberId: string,
): Promise<ActionResponse<Boolean>> {
try {
// Fetch only the two specific members we need instead of loading ALL members
const [currentUserMember, targetMember] = await Promise.all([
prisma.organization_members.findUnique({
where: {
organization_id_user_id: { organization_id: organizationId, user_id: currentUserId },
},
select: { role: true },
}),
prisma.organization_members.findUnique({
where: { id: memberId },
select: { id: true, role: true, user_id: true },
}),
])
const org = await prisma.organizations.findFirst({
where: { id: organizationId },
include: {
organization_members: true,
},
})

if (!org) {
return createErrorResponse("Organization not found")
}

const currentUserMember = org.organization_members.find(m => m.user_id === currentUserId)
const targetMember = org.organization_members.find(m => m.id === memberId)

if (!targetMember) {
return createErrorResponse("Member not found")
@@ -300,11 +269,11 @@ export async function getCurrentUserRole(
organizationId: string,
): Promise<ActionResponse<{ role: UserRole }>> {
try {
const member = await prisma.organization_members.findUnique({
const member = await prisma.organization_members.findFirst({
where: {
organization_id_user_id: { organization_id: organizationId, user_id: userId },
organization_id: organizationId,
user_id: userId,
},
select: { role: true },
})

if (!member) {
@@ -1,80 +0,0 @@
"use server"

import { auth0 } from "@/lib/auth0"
import { cookies } from "next/headers"
import { prisma } from "@/lib/prisma"
import type { Member, UserRole } from "@/lib/types"

/**
 * Server-side function to fetch all data needed for the members page in parallel.
 * Uses @/lib/prisma directly to avoid pulling in @codeflash-ai/common at build time.
 */
export async function getMembersPageInitData() {
const session = await auth0.getSession()
if (!session?.user?.sub) {
return null
}

const userId = session.user.sub

const cookieStore = await cookies()
const orgId = cookieStore.get("currentOrganizationId")?.value

if (!orgId) {
return { userId, orgId: null, members: [] as Member[], currentUserRole: null }
}

// Single query fetches org with all members (including current user's role)
const org = await prisma.organizations.findUnique({
where: { id: orgId },
select: {
id: true,
organization_members: {
include: {
user: {
select: { user_id: true, github_username: true, name: true, email: true },
},
},
orderBy: { added_at: "asc" },
},
},
})

if (!org) {
return { userId, orgId, members: [] as Member[], currentUserRole: null }
}

// Check access and extract current user's role from the same query result
const currentUserMember = org.organization_members.find(
(m: { user_id: string }) => m.user_id === userId,
)
if (!currentUserMember) {
return { userId, orgId, members: [] as Member[], currentUserRole: null }
}

const members: Member[] = org.organization_members.map(
(member: {
id: string
user_id: string
role: string
added_at: Date
user: { user_id: string; github_username: string; name: string | null; email: string | null }
}) => ({
id: member.id,
user_id: member.user_id,
username: member.user.github_username,
name: member.user.name,
email: member.user.email,
role: member.role,
added_at: member.added_at,
avatarUrl: `https://github.com/${member.user.github_username}.png`,
}),
)

return {
userId,
orgId,
members,
currentUserRole: (currentUserMember.role as UserRole) ?? null,
}
}
@@ -1,299 +0,0 @@
"use client"
import React, { useState, useEffect, useCallback, useRef } from "react"
import { Users, UserPlus, RefreshCw, AlertCircle, Building2 } from "lucide-react"
import { ConfirmDialog } from "@/components/confirm-dialog"
import { MembersSkeleton } from "@/components/members/MembersSkeleton"
import { GitHubUserSearchResult, Member } from "@/lib/types"
import {
addOrganizationMember,
getCurrentUserRole,
getOrganizationMembers,
updateOrganizationMemberRole,
removeOrganizationMember,
} from "./action"
import { useViewMode } from "@/app/app/ViewModeContext"
import { MembersList } from "@/components/members/members-list"
import { UserSearchModal } from "@/components/members/user-search-modal"

export interface MembersClientProps {
initialUserId: string
initialOrgId: string | null
initialMembers: Member[]
initialUserRole: string | null
}

export function MembersClient({
initialUserId,
initialOrgId,
initialMembers,
initialUserRole,
}: MembersClientProps) {
const { currentOrg } = useViewMode()
const initialOrgIdRef = useRef(initialOrgId)

const [members, setMembers] = useState<Member[]>(initialMembers)
const [currentUserId] = useState<string>(initialUserId)
const [currentUserRole, setCurrentUserRole] = useState<string | null>(initialUserRole)
const [loading, setLoading] = useState(!initialOrgId)
const [showAddModal, setShowAddModal] = useState(false)
const [error, setError] = useState<string | null>(
!initialOrgId ? "No organization selected" : null,
)
const [updatingMember, setUpdatingMember] = useState<string | null>(null)
const [success, setSuccess] = useState<string | null>(null)
const [searchQuery, setSearchQuery] = useState("")
const [filterRole, setFilterRole] = useState<"all" | "owner" | "admin" | "member">("all")
const [isRefreshing, setIsRefreshing] = useState(false)
const [confirmDialog, setConfirmDialog] = useState<{
open: boolean
memberId: string
memberUsername: string
} | null>(null)

const isAdmin = currentUserRole === "admin" || currentUserRole === "owner"

const fetchMembers = useCallback(async () => {
if (!currentOrg?.id) {
setError("No organization selected")
setLoading(false)
return
}

if (!isRefreshing) {
setLoading(true)
}
setError(null)

try {
const [roleResult, result] = await Promise.all([
getCurrentUserRole(currentUserId, currentOrg.id),
getOrganizationMembers(currentUserId, currentOrg.id),
])
if (roleResult.success && roleResult.data) {
setCurrentUserRole(roleResult.data.role)
}

if (result.success && result.data) {
setMembers(result.data)
} else {
setError(result.error || "Failed to load members")
}
} catch (err) {
console.error("Failed to fetch members:", err)
setError("Failed to load members. Please try again.")
} finally {
setLoading(false)
setIsRefreshing(false)
}
}, [currentOrg?.id, currentUserId, isRefreshing])

// Only refetch when org changes from what the server provided
useEffect(() => {
if (!currentOrg?.id) {
setError("No organization selected")
setLoading(false)
return
}

if (currentOrg.id === initialOrgIdRef.current) return
initialOrgIdRef.current = currentOrg.id
fetchMembers()
}, [currentOrg?.id, fetchMembers])

useEffect(() => {
if (success) {
const timer = setTimeout(() => setSuccess(null), 5000)
return () => clearTimeout(timer)
}
}, [success])

const handleMemberAdded = async () => {
setIsRefreshing(true)
await fetchMembers()
setSuccess("Member added successfully!")
}

const handleUserAdd = async (user: GitHubUserSearchResult, role: "admin" | "member") => {
if (!currentOrg?.id) {
return { success: false, error: "No organization selected" }
}

const result = await addOrganizationMember(currentUserId, user, role, currentOrg.id)
if (result.success) {
handleMemberAdded()
}
return result
}

const handleUpdateRole = async (memberId: string, newRole: "admin" | "member" | "owner") => {
if (!currentOrg?.id) return

setUpdatingMember(memberId)
setError(null)
setSuccess(null)

const result = await updateOrganizationMemberRole(
currentUserId,
currentOrg.id,
memberId,
newRole,
)

if (result.success) {
setSuccess("Member role updated successfully")
setIsRefreshing(true)
await fetchMembers()
} else {
setError(result.error || "Failed to update role")
}

setUpdatingMember(null)
}

const handleRemoveMember = async (memberId: string, memberUsername: string) => {
if (!currentOrg?.id) return
setConfirmDialog({ open: true, memberId, memberUsername })
}

const confirmRemoveMember = async () => {
if (!confirmDialog || !currentOrg?.id) return

const { memberId, memberUsername } = confirmDialog

setUpdatingMember(memberId)
setError(null)
setSuccess(null)

const result = await removeOrganizationMember(currentUserId, currentOrg.id, memberId)

if (result.success) {
setSuccess(`${memberUsername} has been removed successfully`)
setIsRefreshing(true)
await fetchMembers()
} else {
setError(result.error || "Failed to remove member")
}

setUpdatingMember(null)
setConfirmDialog(null)
}

if (loading) {
return <MembersSkeleton count={6} />
}

if (!currentOrg?.id) {
return (
<div className="flex justify-center items-center min-h-[70vh] p-4">
<div className="bg-destructive/10 border border-destructive/20 text-destructive p-6 sm:p-8 rounded-2xl w-full max-w-md shadow-lg">
<div className="inline-flex items-center justify-center w-12 h-12 rounded-full bg-destructive/20 mb-4">
<AlertCircle size={24} />
</div>
<h3 className="text-base sm:text-lg font-semibold mb-2">No Organization Selected</h3>
<p className="mb-4 text-sm sm:text-base opacity-90">
Please select an organization from the sidebar
</p>
</div>
</div>
)
}

if (error && members.length === 0) {
return (
<div className="flex justify-center items-center min-h-[70vh] p-4">
<div className="bg-destructive/10 border border-destructive/20 text-destructive p-6 sm:p-8 rounded-2xl w-full max-w-md shadow-lg">
<div className="inline-flex items-center justify-center w-12 h-12 rounded-full bg-destructive/20 mb-4">
<AlertCircle size={24} />
</div>
<h3 className="text-base sm:text-lg font-semibold mb-2">Unable to Load Members</h3>
<p className="mb-4 text-sm sm:text-base opacity-90">{error}</p>
<button
onClick={() => fetchMembers()}
className="flex items-center gap-2 w-full justify-center px-4 py-2.5 bg-destructive hover:bg-destructive/90 text-destructive-foreground rounded-xl text-sm font-medium transition-all shadow-sm hover:shadow-md"
>
<RefreshCw size={16} /> Try Again
</button>
</div>
</div>
)
}

const adminCount = members.filter(m => m.role === "admin" || m.role === "owner").length
const memberCount = members.filter(m => m.role === "member").length

return (
<div className="flex-1 bg-background">
<div className="h-screen py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
{/* Header */}
<div className="mb-6 sm:mb-8">
<div className="flex items-start justify-between gap-4 mb-2">
<div>
<h1 className="text-2xl sm:text-3xl font-bold text-foreground flex items-center gap-3">
<div className="p-2.5 rounded-xl bg-primary/10">
<Users size={28} className="text-primary" />
</div>
Organization Members
</h1>
<p className="text-muted-foreground mt-2">
Manage members and their roles in your organization
</p>
</div>
{isAdmin && (
<button
onClick={() => setShowAddModal(true)}
disabled={showAddModal}
className="flex items-center gap-2 px-4 py-2.5 bg-primary text-primary-foreground rounded-xl hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-all duration-200 text-sm font-medium whitespace-nowrap flex-shrink-0 shadow-sm hover:shadow-md"
>
<UserPlus size={16} />
<span className="hidden sm:inline">Add Member</span>
<span className="sm:hidden">Add</span>
</button>
)}
</div>
</div>

<div className="bg-card rounded-2xl border border-border shadow-sm overflow-hidden">
<MembersList
members={members}
currentUserId={currentUserId}
isAdmin={isAdmin}
updatingMember={updatingMember}
error={error}
success={success}
searchQuery={searchQuery}
filterRole={filterRole}
onSearchChange={setSearchQuery}
onFilterChange={setFilterRole}
onUpdateRole={handleUpdateRole}
onRemove={handleRemoveMember}
onDismissError={() => setError(null)}
onDismissSuccess={() => setSuccess(null)}
headerIcon={<Building2 size={20} className="text-primary" />}
headerTitle="Members"
headerStats={`${members.length} ${members.length === 1 ? "member" : "members"} • ${adminCount} ${adminCount === 1 ? "admin" : "admins"} • ${memberCount} ${memberCount === 1 ? "member" : "members"}`}
/>
</div>

<UserSearchModal
isOpen={showAddModal}
onClose={() => setShowAddModal(false)}
onUserAdd={handleUserAdd}
title="Add Organization Member"
description="Search for GitHub users and add them to this organization"
showRoleSelector={true}
/>

<ConfirmDialog
open={confirmDialog?.open || false}
onOpenChange={open => !open && setConfirmDialog(null)}
onConfirm={confirmRemoveMember}
title="Remove Member"
description={`Are you sure you want to remove ${confirmDialog?.memberUsername} from this organization? This action cannot be undone.`}
confirmText="Remove"
cancelText="Cancel"
variant="destructive"
/>
</div>
</div>
)
}
@@ -1,23 +1,298 @@
"use client"
import React, { useState, useEffect, useCallback } from "react"
import { Users, UserPlus, RefreshCw, AlertCircle, Building2 } from "lucide-react"
import { getUserIdAndUsername } from "@/app/utils/auth"
import { DashboardErrorBoundary } from "@/components/dashboard/DashboardErrorBoundary"
import { getMembersPageInitData } from "./data"
import { MembersClient } from "./members-client"
import { ConfirmDialog } from "@/components/confirm-dialog"
import { MembersSkeleton } from "@/components/members/MembersSkeleton"
import { GitHubUserSearchResult, Member } from "@/lib/types"
import {
addOrganizationMember,
getCurrentUserRole,
getOrganizationMembers,
updateOrganizationMemberRole,
removeOrganizationMember,
} from "./action"
import { useViewMode } from "@/app/app/ViewModeContext"
import { MembersList } from "@/components/members/members-list"
import { UserSearchModal } from "@/components/members/user-search-modal"

export default async function OrganizationMembersPage() {
const initData = await getMembersPageInitData()
function OrganizationMembers() {
const { currentOrg } = useViewMode()

// No session — auth middleware will redirect
if (!initData) {
return null
const [members, setMembers] = useState<Member[]>([])
const [currentUserId, setCurrentUserId] = useState<string>("")
const [currentUserRole, setCurrentUserRole] = useState<string | null>(null)
const [loading, setLoading] = useState(true)
const [showAddModal, setShowAddModal] = useState(false)
const [error, setError] = useState<string | null>(null)
const [updatingMember, setUpdatingMember] = useState<string | null>(null)
const [success, setSuccess] = useState<string | null>(null)
const [searchQuery, setSearchQuery] = useState("")
const [filterRole, setFilterRole] = useState<"all" | "owner" | "admin" | "member">("all")
const [isRefreshing, setIsRefreshing] = useState(false)
const [confirmDialog, setConfirmDialog] = useState<{
open: boolean
memberId: string
memberUsername: string
} | null>(null)

const isAdmin = currentUserRole === "admin" || currentUserRole === "owner"

const fetchMembers = useCallback(async () => {
if (!currentOrg?.id) {
setError("No organization selected")
setLoading(false)
return
}

if (!isRefreshing) {
setLoading(true)
}
setError(null)

try {
const data = await getUserIdAndUsername()
if (!data || !data.userId) {
throw new Error("User authentication failed")
}

setCurrentUserId(data.userId)

const [roleResult, result] = await Promise.all([
getCurrentUserRole(data.userId, currentOrg?.id),
getOrganizationMembers(data.userId, currentOrg?.id),
])
if (roleResult.success && roleResult.data) {
setCurrentUserRole(roleResult.data.role)
}

if (result.success && result.data) {
setMembers(result.data)
} else {
setError(result.error || "Failed to load members")
}
} catch (err) {
console.error("Failed to fetch members:", err)
setError("Failed to load members. Please try again.")
} finally {
setLoading(false)
setIsRefreshing(false)
}
}, [currentOrg?.id, isRefreshing])

useEffect(() => {
if (!currentOrg?.id) {
setError("No organization selected")
setLoading(false)
return
}

fetchMembers()
}, [currentOrg?.id, fetchMembers])

useEffect(() => {
if (success) {
const timer = setTimeout(() => setSuccess(null), 5000)
return () => clearTimeout(timer)
}
}, [success])

const handleMemberAdded = async () => {
setIsRefreshing(true)
await fetchMembers()
setSuccess("Member added successfully!")
}

const handleUserAdd = async (user: GitHubUserSearchResult, role: "admin" | "member") => {
if (!currentOrg?.id) {
return { success: false, error: "No organization selected" }
}

const result = await addOrganizationMember(currentUserId, user, role, currentOrg?.id)
if (result.success) {
handleMemberAdded()
}
return result
}

const handleUpdateRole = async (memberId: string, newRole: "admin" | "member" | "owner") => {
if (!currentOrg?.id) return

setUpdatingMember(memberId)
setError(null)
setSuccess(null)

const result = await updateOrganizationMemberRole(
currentUserId,
currentOrg?.id,
memberId,
newRole,
)

if (result.success) {
setSuccess("Member role updated successfully")
setIsRefreshing(true)
await fetchMembers()
} else {
setError(result.error || "Failed to update role")
}

setUpdatingMember(null)
}

const handleRemoveMember = async (memberId: string, memberUsername: string) => {
if (!currentOrg?.id) return
setConfirmDialog({ open: true, memberId, memberUsername })
}

const confirmRemoveMember = async () => {
if (!confirmDialog || !currentOrg?.id) return

const { memberId, memberUsername } = confirmDialog

setUpdatingMember(memberId)
setError(null)
setSuccess(null)

const result = await removeOrganizationMember(currentUserId, currentOrg.id, memberId)

if (result.success) {
setSuccess(`${memberUsername} has been removed successfully`)
setIsRefreshing(true)
await fetchMembers()
} else {
setError(result.error || "Failed to remove member")
}

setUpdatingMember(null)
setConfirmDialog(null)
}

if (loading) {
return <MembersSkeleton count={6} />
}

if (!currentOrg?.id) {
return (
<div className="flex justify-center items-center min-h-[70vh] p-4">
<div className="bg-destructive/10 border border-destructive/20 text-destructive p-6 sm:p-8 rounded-2xl w-full max-w-md shadow-lg">
<div className="inline-flex items-center justify-center w-12 h-12 rounded-full bg-destructive/20 mb-4">
<AlertCircle size={24} />
</div>
<h3 className="text-base sm:text-lg font-semibold mb-2">No Organization Selected</h3>
<p className="mb-4 text-sm sm:text-base opacity-90">
Please select an organization from the sidebar
</p>
</div>
</div>
)
}

if (error && members.length === 0) {
return (
<div className="flex justify-center items-center min-h-[70vh] p-4">
<div className="bg-destructive/10 border border-destructive/20 text-destructive p-6 sm:p-8 rounded-2xl w-full max-w-md shadow-lg">
<div className="inline-flex items-center justify-center w-12 h-12 rounded-full bg-destructive/20 mb-4">
<AlertCircle size={24} />
</div>
<h3 className="text-base sm:text-lg font-semibold mb-2">Unable to Load Members</h3>
<p className="mb-4 text-sm sm:text-base opacity-90">{error}</p>
<button
onClick={() => fetchMembers()}
className="flex items-center gap-2 w-full justify-center px-4 py-2.5 bg-destructive hover:bg-destructive/90 text-destructive-foreground rounded-xl text-sm font-medium transition-all shadow-sm hover:shadow-md"
>
<RefreshCw size={16} /> Try Again
</button>
</div>
</div>
)
}

const adminCount = members.filter(m => m.role === "admin" || m.role === "owner").length
|
||||
const memberCount = members.filter(m => m.role === "member").length
|
||||
|
||||
return (
|
||||
<div className="flex-1 bg-background">
|
||||
<div className="h-screen py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
|
||||
{/* Header */}
|
||||
<div className="mb-6 sm:mb-8">
|
||||
<div className="flex items-start justify-between gap-4 mb-2">
|
||||
<div>
|
||||
<h1 className="text-2xl sm:text-3xl font-bold text-foreground flex items-center gap-3">
|
||||
<div className="p-2.5 rounded-xl bg-primary/10">
|
||||
<Users size={28} className="text-primary" />
|
||||
</div>
|
||||
Organization Members
|
||||
</h1>
|
||||
<p className="text-muted-foreground mt-2">
|
||||
Manage members and their roles in your organization
|
||||
</p>
|
||||
</div>
|
||||
{isAdmin && (
|
||||
<button
|
||||
onClick={() => setShowAddModal(true)}
|
||||
disabled={showAddModal}
|
||||
className="flex items-center gap-2 px-4 py-2.5 bg-primary text-primary-foreground rounded-xl hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-all duration-200 text-sm font-medium whitespace-nowrap flex-shrink-0 shadow-sm hover:shadow-md"
|
||||
>
|
||||
<UserPlus size={16} />
|
||||
<span className="hidden sm:inline">Add Member</span>
|
||||
<span className="sm:hidden">Add</span>
|
||||
</button>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div className="bg-card rounded-2xl border border-border shadow-sm overflow-hidden">
|
||||
<MembersList
|
||||
members={members}
|
||||
currentUserId={currentUserId}
|
||||
isAdmin={isAdmin}
|
||||
updatingMember={updatingMember}
|
||||
error={error}
|
||||
success={success}
|
||||
searchQuery={searchQuery}
|
||||
filterRole={filterRole}
|
||||
onSearchChange={setSearchQuery}
|
||||
onFilterChange={setFilterRole}
|
||||
onUpdateRole={handleUpdateRole}
|
||||
onRemove={handleRemoveMember}
|
||||
onDismissError={() => setError(null)}
|
||||
onDismissSuccess={() => setSuccess(null)}
|
||||
headerIcon={<Building2 size={20} className="text-primary" />}
|
||||
headerTitle="Members"
|
||||
headerStats={`${members.length} ${members.length === 1 ? "member" : "members"} • ${adminCount} ${adminCount === 1 ? "admin" : "admins"} • ${memberCount} ${memberCount === 1 ? "member" : "members"}`}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<UserSearchModal
|
||||
isOpen={showAddModal}
|
||||
onClose={() => setShowAddModal(false)}
|
||||
onUserAdd={handleUserAdd}
|
||||
title="Add Organization Member"
|
||||
description="Search for GitHub users and add them to this organization"
|
||||
showRoleSelector={true}
|
||||
/>
|
||||
|
||||
<ConfirmDialog
|
||||
open={confirmDialog?.open || false}
|
||||
onOpenChange={open => !open && setConfirmDialog(null)}
|
||||
onConfirm={confirmRemoveMember}
|
||||
title="Remove Member"
|
||||
description={`Are you sure you want to remove ${confirmDialog?.memberUsername} from this organization? This action cannot be undone.`}
|
||||
confirmText="Remove"
|
||||
cancelText="Cancel"
|
||||
variant="destructive"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
export default function OrganizationMembersWrapper() {
|
||||
return (
|
||||
<DashboardErrorBoundary>
|
||||
<MembersClient
|
||||
initialUserId={initData.userId}
|
||||
initialOrgId={initData.orgId}
|
||||
initialMembers={initData.members as any}
|
||||
initialUserRole={initData.currentUserRole}
|
||||
/>
|
||||
<OrganizationMembers />
|
||||
</DashboardErrorBoundary>
|
||||
)
|
||||
}
|
||||
|
|
|
|||
|
|
@@ -28,8 +28,6 @@ const mockRepo = {
  optimizations_limit: 100,
  optimizations_used: 50,
  repository_members: [{ id: "rm-1" }, { id: "rm-2" }],
  // Matches the Prisma return shape for include: { _count: { select: { repository_members: true } } }
  _count: { repository_members: 2 },
}

const mockPayload = { userId: "user-1", username: "testuser" }

@@ -44,7 +42,7 @@ describe("getRepositoryById", () => {

  describe("parallel fetch", () => {
    it("fetches repo and authorized repoIds concurrently", async () => {
      vi.mocked(prisma.repositories.findUnique).mockResolvedValue(mockRepo as any)
      vi.mocked(prisma.repositories.findFirst).mockResolvedValue(mockRepo as any)
      vi.mocked(getRepositoriesForAccountCached).mockResolvedValue({
        repoIds: ["repo-1"],
        repos: [],

@@ -53,12 +51,12 @@ describe("getRepositoryById", () => {

      await getRepositoryById(mockPayload as any, "repo-1")

      expect(prisma.repositories.findUnique).toHaveBeenCalledTimes(1)
      expect(prisma.repositories.findFirst).toHaveBeenCalledTimes(1)
      expect(getRepositoriesForAccountCached).toHaveBeenCalledWith(mockPayload)
    })

    it("returns null when repo is not found", async () => {
      vi.mocked(prisma.repositories.findUnique).mockResolvedValue(null)
      vi.mocked(prisma.repositories.findFirst).mockResolvedValue(null)
      vi.mocked(getRepositoriesForAccountCached).mockResolvedValue({
        repoIds: ["repo-1"],
        repos: [],

@@ -69,7 +67,7 @@ describe("getRepositoryById", () => {
    })

    it("returns null when repo is not in authorized list", async () => {
      vi.mocked(prisma.repositories.findUnique).mockResolvedValue(mockRepo as any)
      vi.mocked(prisma.repositories.findFirst).mockResolvedValue(mockRepo as any)
      vi.mocked(getRepositoriesForAccountCached).mockResolvedValue({
        repoIds: ["other-repo"],
        repos: [],

@@ -82,7 +80,7 @@ describe("getRepositoryById", () => {

  describe("successful retrieval", () => {
    beforeEach(() => {
      vi.mocked(prisma.repositories.findUnique).mockResolvedValue(mockRepo as any)
      vi.mocked(prisma.repositories.findFirst).mockResolvedValue(mockRepo as any)
      vi.mocked(getRepositoriesForAccountCached).mockResolvedValue({
        repoIds: ["repo-1"],
        repos: [],

@@ -129,7 +127,7 @@ describe("getRepositoryById", () => {

  describe("analytics tracking", () => {
    beforeEach(() => {
      vi.mocked(prisma.repositories.findUnique).mockResolvedValue(mockRepo as any)
      vi.mocked(prisma.repositories.findFirst).mockResolvedValue(mockRepo as any)
      vi.mocked(getRepositoriesForAccountCached).mockResolvedValue({
        repoIds: ["repo-1"],
        repos: [],

@@ -150,7 +148,9 @@ describe("getRepositoryById", () => {
  describe("error handling", () => {
    it("returns null and logs when Prisma throws", async () => {
      vi.spyOn(console, "error").mockImplementation(() => {})
      vi.mocked(prisma.repositories.findUnique).mockRejectedValue(new Error("timeout"))
      vi.mocked(prisma.repositories.findFirst).mockRejectedValue(
        new Error("timeout"),
      )
      vi.mocked(getRepositoriesForAccountCached).mockResolvedValue({
        repoIds: ["repo-1"],
        repos: [],
@@ -2,7 +2,6 @@

import * as Sentry from "@sentry/nextjs"
import { AccountPayload, createOrUpdateUser, getUserById, prisma } from "@codeflash-ai/common"
import { Prisma } from "@prisma/client"
import { eachDayOfInterval, startOfDay } from "date-fns"
import { GitHubUserSearchResult, Member, UserRole } from "@/lib/types"
import { ActionResponse, createErrorResponse, createSuccessResponse } from "@/lib/action-response"

@@ -13,33 +12,43 @@ import { trackMemberInvited, trackRepositoryConnected } from "@/lib/analytics/tr

export async function getOptimizationsTimeSeriesData(repoId: string, onlySuccessful?: boolean) {
  try {
    // Use SQL GROUP BY to aggregate on the database side instead of fetching every row
    const successFilter =
      onlySuccessful === true ? Prisma.sql`AND is_optimization_found = true` : Prisma.empty
    const dailyCounts = await prisma.$queryRaw<Array<{ day: string; cnt: bigint }>>`
      SELECT DATE(created_at) AS day, COUNT(*)::bigint AS cnt
      FROM optimization_events
      WHERE repository_id = ${repoId} ${successFilter}
      GROUP BY DATE(created_at)
      ORDER BY day`

    if (dailyCounts.length === 0) return []
    const data = await prisma.optimization_events.findMany({
      where: {
        ...(onlySuccessful === true ? { is_optimization_found: true } : {}),
        repository_id: repoId,
      },
      select: {
        created_at: true,
      },
    })

    const groupedByDay: Record<string, number> = {}
    for (const row of dailyCounts) {
      // DATE columns come back as Date objects from Prisma; format to YYYY-MM-DD
      const dayStr =
        typeof row.day === "string"
          ? row.day
          : (row.day as unknown as Date).toISOString().slice(0, 10)
      groupedByDay[dayStr] = Number(row.cnt)
    }

    const sortedDays = Object.keys(groupedByDay).sort()
    data.forEach(item => {
      const day = item.created_at
        .toLocaleDateString(undefined, {
          timeZone: Intl.DateTimeFormat().resolvedOptions().timeZone,
          year: "numeric",
          month: "2-digit",
          day: "2-digit",
        })
        .replace(/(\d{2})\/(\d{2})\/(\d{4})/, "$3-$1-$2")
      groupedByDay[day] = (groupedByDay[day] || 0) + 1
    })

    const allDates = eachDayOfInterval({
      start: new Date(sortedDays[0] + "T00:00:00"),
      start: new Date(Object.keys(groupedByDay).sort()[0]),
      end: startOfDay(new Date()),
    }).map(d => d.toISOString().slice(0, 10))
    }).map(d =>
      d
        .toLocaleDateString(undefined, {
          timeZone: Intl.DateTimeFormat().resolvedOptions().timeZone,
          year: "numeric",
          month: "2-digit",
          day: "2-digit",
        })
        .replace(/(\d{2})\/(\d{2})\/(\d{4})/, "$3-$1-$2"),
    )

    let cumulativeCount = 0
    const completeData = allDates.map(date => {
@@ -56,43 +65,45 @@ export async function getOptimizationsTimeSeriesData(repoId: string, onlySuccess

export async function getPullRequestEventTimeSeriesData(year: number, repoId: string) {
  try {
    // Use SQL GROUP BY to aggregate on the database side instead of fetching every row
    const startDate = new Date(`${year}-01-01T00:00:00.000Z`)
    const endDate = new Date(`${year + 1}-01-01T00:00:00.000Z`)
    const monthlyStats = await prisma.$queryRaw<
      Array<{
        month: number
        pr_created: bigint
        pr_merged: bigint
        pr_closed: bigint
      }>
    >`SELECT
        EXTRACT(MONTH FROM created_at)::int AS month,
        SUM(CASE WHEN event_type = 'pr_created' THEN 1 ELSE 0 END)::bigint AS pr_created,
        SUM(CASE WHEN event_type = 'pr_merged' THEN 1 ELSE 0 END)::bigint AS pr_merged,
        SUM(CASE WHEN event_type = 'pr_closed' THEN 1 ELSE 0 END)::bigint AS pr_closed
      FROM optimization_events
      WHERE event_type IN ('pr_created', 'pr_merged', 'pr_closed')
        AND created_at >= ${startDate}
        AND created_at < ${endDate}
        AND repository_id = ${repoId}
      GROUP BY EXTRACT(MONTH FROM created_at)`
    const eventTypes = ["pr_created", "pr_merged", "pr_closed"]
    const data = await prisma.optimization_events.findMany({
      where: {
        event_type: { in: eventTypes },
        created_at: {
          gte: new Date(`${year}-01-01T00:00:00.000Z`),
          lt: new Date(`${year + 1}-01-01T00:00:00.000Z`),
        },
        repository_id: repoId,
      },
      select: {
        event_type: true,
        created_at: true,
      },
    })

    type MonthStat = { month: number; pr_created: bigint; pr_merged: bigint; pr_closed: bigint }
    const statsMap = new Map<number, MonthStat>(
      (monthlyStats as MonthStat[]).map((r: MonthStat) => [r.month, r]),
    )
    const groupedByMonth: Record<string, Record<string, number>> = {}

    return Array.from({ length: 12 }, (_, i) => {
      const month = i + 1
      const stats = statsMap.get(month)
      return {
        month: `${year}-${month.toString().padStart(2, "0")}`,
        pr_created: Number(stats?.pr_created ?? 0),
        pr_merged: Number(stats?.pr_merged ?? 0),
        pr_closed: Number(stats?.pr_closed ?? 0),
    for (let month = 1; month <= 12; month++) {
      const monthKey = `${year}-${month.toString().padStart(2, "0")}`
      groupedByMonth[monthKey] = { pr_created: 0, pr_merged: 0, pr_closed: 0 }
    }

    data.forEach(item => {
      const month = item.created_at.getMonth() + 1
      const monthKey = `${year}-${month.toString().padStart(2, "0")}`
      if (groupedByMonth[monthKey]) {
        groupedByMonth[monthKey][item.event_type] += 1
      }
    })

    const completeData = Object.keys(groupedByMonth).map(monthKey => ({
      month: monthKey,
      pr_created: groupedByMonth[monthKey].pr_created,
      pr_merged: groupedByMonth[monthKey].pr_merged,
      pr_closed: groupedByMonth[monthKey].pr_closed,
    }))

    return completeData
  } catch (error) {
    console.error("Failed to fetch pull request event time series data:", error)
    return []
@@ -116,25 +127,6 @@ export async function getUserOptimizationSuccessfulCountByRepo(repoId: string) {
  })
}

/**
 * Get both total and successful optimization counts in a single query.
 * Callers that need both counts should prefer this over two separate calls.
 */
export async function getOptimizationCountsByRepo(
  repoId: string,
): Promise<{ total: number; successful: number }> {
  const result = await prisma.$queryRaw<[{ total: bigint; successful: bigint }]>`
    SELECT
      COUNT(*)::bigint AS total,
      SUM(CASE WHEN is_optimization_found THEN 1 ELSE 0 END)::bigint AS successful
    FROM optimization_events
    WHERE repository_id = ${repoId}`
  return {
    total: Number(result[0].total),
    successful: Number(result[0].successful),
  }
}

export async function getActiveUserLeaderboardLast30DaysForRepo(
  repoId: string,
): Promise<{ username: string; eventCount: number; avatarUrl: string }[]> {
@@ -161,38 +153,37 @@ export async function getActiveUserLeaderboardLast30DaysForRepo(
    },
  })

  return groupedCounts.map(
    (entry: { current_username: string | null; _count: { id: number } }) => ({
      username: entry.current_username!,
      eventCount: entry._count.id,
      avatarUrl: `https://github.com/${entry.current_username}.png`,
    }),
  )
  return groupedCounts.map(entry => ({
    username: entry.current_username!,
    eventCount: entry._count.id,
    avatarUrl: `https://github.com/${entry.current_username}.png`,
  }))
}

export const getRepositoryById = withTiming(
  "getRepositoryById",
  async (payload: AccountPayload, repoId: string): Promise<RepositoryWithUsage | null> => {
    try {
      // Fetch repo, authorized repoIds, and recent activity count in parallel
      const [repo, { repoIds }, recentEventCount] = await Promise.all([
        prisma.repositories.findUnique({
      // Fetch repo and authorized repoIds in parallel
      const [repo, { repoIds }] = await Promise.all([
        prisma.repositories.findFirst({
          where: { id: repoId },
          include: { _count: { select: { repository_members: true } } },
          include: { repository_members: true },
        }),
        getRepositoriesForAccountCached(payload),
        prisma.optimization_events.count({
          where: {
            repository_id: repoId,
            created_at: {
              gte: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000),
            },
          },
        }),
      ])

      if (!repo || !repoIds.includes(repo.id)) return null

      const recentEventCount = await prisma.optimization_events.count({
        where: {
          repository_id: repo.id,
          created_at: {
            gte: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000),
          },
        },
      })

      // Track repository view as a connection/engagement signal
      const userId = "userId" in payload ? payload.userId : undefined
      if (userId) {

@@ -202,7 +193,6 @@ export const getRepositoryById = withTiming(
        })
      }

      const organization = repo.full_name.split("/")[0]
      return {
        id: repo.id,
        github_repo_id: repo.github_repo_id,

@@ -215,9 +205,9 @@ export const getRepositoryById = withTiming(
        last_optimized: repo.last_optimized,
        optimizations_limit: repo.optimizations_limit,
        optimizations_used: repo.optimizations_used,
        organization,
        avatarUrl: `https://github.com/${organization}.png`,
        membersCount: repo._count.repository_members,
        organization: repo.full_name.split("/")[0],
        avatarUrl: `https://github.com/${repo.full_name.split("/")[0]}.png`,
        membersCount: repo.repository_members.length,
      }
    } catch (error) {
      console.error("Failed to fetch repository by ID:", error)
@@ -234,40 +224,35 @@ export async function addRepositoryMemberById(
): Promise<ActionResponse> {
  try {
    const invitedUserId = `github|${invitedUser.githubUserId.toString()}`
    // Check if current user is admin or the only member
    const repo = await prisma.repositories.findFirst({
      where: { id: repoId },
      include: {
        repository_members: {
          include: {
            user: true,
          },
        },
      },
    })

    // Verify repo exists, check permissions, and check for duplicate in parallel
    // using indexed lookups instead of loading ALL members
    const [repoExists, currentUserMember, existingMember, memberCount] = await Promise.all([
      prisma.repositories.findUnique({
        where: { id: repoId },
        select: { id: true },
      }),
      prisma.repository_members.findUnique({
        where: { repository_id_user_id: { repository_id: repoId, user_id: currentUserId } },
        select: { role: true },
      }),
      prisma.repository_members.findUnique({
        where: { repository_id_user_id: { repository_id: repoId, user_id: invitedUserId } },
        select: { id: true },
      }),
      prisma.repository_members.count({
        where: { repository_id: repoId },
      }),
    ])

    if (!repoExists) {
    if (!repo) {
      return createErrorResponse("Repository not found")
    }

    const currentUserMember = repo.repository_members.find(m => m.user_id === currentUserId)

    // Check if user has permission to add members
    const isAdmin = currentUserMember?.role === "admin" || currentUserMember?.role === "owner"
    const isOnlyMember = memberCount === 1 && currentUserMember // if they are the only member, allow adding, since member roles were not always managed correctly
    const isOnlyMember = repo.repository_members.length === 1 && currentUserMember // if they are the only member, allow adding, since member roles were not always managed correctly

    if (!isAdmin && !isOnlyMember) {
      return createErrorResponse("You don't have permission to add members")
    }

    // Check if member already exists
    // Check if member already exists by username
    const existingMember = repo.repository_members.find(m => m.user.user_id === invitedUserId)

    if (existingMember) {
      return createErrorResponse("User is already a member of this repository")
    }
@@ -321,44 +306,36 @@ export async function getRepositoryMembers(
  repoId: string,
): Promise<ActionResponse<Member[]>> {
  try {
    // Check access with a single indexed lookup, then fetch members only if authorized
    const hasAccess = await prisma.repository_members.findUnique({
      where: { repository_id_user_id: { repository_id: repoId, user_id: currentUserId } },
      select: { id: true },
    const repo = await prisma.repositories.findFirst({
      where: { id: repoId },
      include: {
        repository_members: {
          include: {
            user: true,
          },
        },
      },
    })

    if (!repo) {
      return createErrorResponse("Repository not found")
    }

    // Check if user has access
    const hasAccess = repo.repository_members.some(m => m.user_id === currentUserId)

    if (!hasAccess) {
      return createErrorResponse("You don't have access to this repository")
    }

    // Now fetch all members (only needed fields)
    const repoMembers = await prisma.repository_members.findMany({
      where: { repository_id: repoId },
      select: {
        id: true,
        user_id: true,
        role: true,
        added_at: true,
        user: { select: { github_username: true } },
      },
    })

    const members: Member[] = repoMembers.map(
      (member: {
        id: string
        user_id: string
        role: string
        added_at: Date
        user: { github_username: string }
      }) => ({
        id: member.id,
        user_id: member.user_id,
        username: member.user.github_username,
        role: member.role,
        added_at: member.added_at,
        avatarUrl: `https://github.com/${member.user.github_username}.png`,
      }),
    )
    const members: Member[] = repo.repository_members.map(member => ({
      id: member.id,
      user_id: member.user_id,
      username: member.user.github_username,
      role: member.role,
      added_at: member.added_at,
      avatarUrl: `https://github.com/${member.user.github_username}.png`,
    }))

    return createSuccessResponse(members)
  } catch (error) {
@@ -378,28 +355,26 @@ export async function updateRepositoryMemberRole(
  newRole: UserRole,
): Promise<ActionResponse<Boolean>> {
  try {
    // Fetch only the two specific members we need instead of loading ALL repository members
    const [currentUserMember, targetMember] = await Promise.all([
      prisma.repository_members.findUnique({
        where: { repository_id_user_id: { repository_id: repoId, user_id: currentUserId } },
        select: { role: true },
      }),
      prisma.repository_members.findUnique({
        where: { id: memberId },
        select: { id: true, role: true, user_id: true },
      }),
    ])
    const repo = await prisma.repositories.findFirst({
      where: { id: repoId },
      include: {
        repository_members: true,
      },
    })

    if (!currentUserMember) {
    if (!repo) {
      return createErrorResponse("Repository not found")
    }

    const currentUserMember = repo.repository_members.find(m => m.user_id === currentUserId)

    // Only admins and owners can change roles
    if (currentUserMember.role !== "admin" && currentUserMember.role !== "owner") {
    if (currentUserMember?.role !== "admin" && currentUserMember?.role !== "owner") {
      return createErrorResponse("Only admins can change member roles")
    }

    // Don't allow changing owner role
    const targetMember = repo.repository_members.find(m => m.id === memberId)
    if (targetMember?.role === "owner") {
      return createErrorResponse("Cannot change owner role")
    }
@@ -430,17 +405,19 @@ export async function removeRepositoryMember(
  memberId: string,
): Promise<ActionResponse<Boolean>> {
  try {
    // Fetch only the two specific members we need instead of loading ALL repository members
    const [currentUserMember, targetMember] = await Promise.all([
      prisma.repository_members.findUnique({
        where: { repository_id_user_id: { repository_id: repoId, user_id: currentUserId } },
        select: { role: true },
      }),
      prisma.repository_members.findUnique({
        where: { id: memberId },
        select: { id: true, role: true, user_id: true },
      }),
    ])
    const repo = await prisma.repositories.findFirst({
      where: { id: repoId },
      include: {
        repository_members: true,
      },
    })

    if (!repo) {
      return createErrorResponse("Repository not found")
    }

    const currentUserMember = repo.repository_members.find(m => m.user_id === currentUserId)
    const targetMember = repo.repository_members.find(m => m.id === memberId)

    if (!targetMember) {
      return createErrorResponse("Member not found")
@@ -1,89 +0,0 @@
"use server"

import { auth0 } from "@/lib/auth0"
import { cookies } from "next/headers"
import type { AccountPayload } from "@codeflash-ai/common"
import {
  getRepositoryById,
  getOptimizationCountsByRepo,
  getOptimizationsTimeSeriesData,
  getPullRequestEventTimeSeriesData,
  getActiveUserLeaderboardLast30DaysForRepo,
} from "./action"

/**
 * Server-side function to fetch all data needed for the repository detail page
 * in parallel. Eliminates the client-side auth→repo→stats waterfall.
 */
export async function getRepoDetailInitData(repositoryId: string) {
  const session = await auth0.getSession()
  if (!session?.user?.sub || !session?.user?.nickname) {
    return null
  }

  const userId = session.user.sub
  const username = session.user.nickname

  const cookieStore = await cookies()
  const orgId = cookieStore.get("currentOrganizationId")?.value

  const payload: AccountPayload = orgId ? { orgId } : { userId, username }

  const repository = await getRepositoryById(payload, repositoryId)
  if (!repository) {
    return { userId, repository: null, stats: null }
  }

  const currentYear = new Date().getFullYear()

  // Fetch all statistics in parallel — these are all independent queries
  // Use the combined count query (single SQL) instead of two separate COUNT calls
  const [counts, optimizationsOverTime, successfulOptimizationsOverTime, prData, leaderboardData] =
    await Promise.all([
      getOptimizationCountsByRepo(repositoryId),
      getOptimizationsTimeSeriesData(repositoryId, false),
      getOptimizationsTimeSeriesData(repositoryId, true),
      getPullRequestEventTimeSeriesData(currentYear, repositoryId),
      getActiveUserLeaderboardLast30DaysForRepo(repositoryId),
    ])

  const totalAttempts = counts.total
  const successfulAttempts = counts.successful

  // Process time series data
  let optimizationsTrend: number[] = []
  let optimizationsTrendDates: string[] = []
  if (Array.isArray(optimizationsOverTime) && optimizationsOverTime.length > 0) {
    optimizationsTrend = optimizationsOverTime.map(item => item?.count || 0)
    optimizationsTrendDates = optimizationsOverTime.map(item => item?.date || "")
  }

  let successfulOptimizationsTrend: number[] = []
  let successfulOptimizationsTrendDates: string[] = []
  if (
    Array.isArray(successfulOptimizationsOverTime) &&
    successfulOptimizationsOverTime.length > 0
  ) {
    successfulOptimizationsTrend = successfulOptimizationsOverTime.map(item => item?.count || 0)
    successfulOptimizationsTrendDates = successfulOptimizationsOverTime.map(
      item => item?.date || "",
    )
  }

  return {
    userId,
    orgId: orgId ?? null,
    repository,
    stats: {
      totalAttempts: totalAttempts ?? 0,
      successfulAttempts: successfulAttempts ?? 0,
      optimizationsTrend,
      optimizationsTrendDates,
      successfulOptimizationsTrend,
      successfulOptimizationsTrendDates,
      prActivityData: Array.isArray(prData) ? prData : [],
      activeUsersData: Array.isArray(leaderboardData) ? leaderboardData : [],
      prYear: currentYear,
    },
  }
}
@@ -1,5 +0,0 @@
import { RepositoryDetailSkeleton } from "@/components/repositories/RepositoryDetailSkeleton"

export default function Loading() {
  return <RepositoryDetailSkeleton />
}
@@ -1,23 +1,726 @@
// app/(dashboard)/repositories/[repositoryId]/page.tsx
"use client"
import React, { useState, useMemo, useEffect, useCallback } from "react"
import {
  Zap,
  Gauge,
  GitPullRequest,
  Clock,
  GitBranch,
  Users,
  RefreshCw,
  UserPlus,
  AlertCircle,
  BarChart3,
} from "lucide-react"
import { getUserIdAndUsername } from "@/app/utils/auth"
import { format, subDays } from "date-fns"
import { ActiveUsersLeaderboard } from "@/components/dashboard/ActiveUsersLeaderboard"
import { CompactPullRequestActivityCard } from "@/components/dashboard/CompactPullRequestActivityCard"
import { DashboardErrorBoundary } from "@/components/dashboard/DashboardErrorBoundary"
import { getRepoDetailInitData } from "./data"
import { RepoDetailClient } from "./repo-detail-client"
import { GitPullRequest } from "lucide-react"
import { MetricCard } from "@/components/dashboard/MetricCard"
import { OptimizationPRsTable } from "@/components/dashboard/OptimizationPRsTable"
import { RepositoryDetailSkeleton } from "@/components/repositories/RepositoryDetailSkeleton"
import Image from "next/image"
import { useParams, useRouter, useSearchParams } from "next/navigation"
import {
  getActiveUserLeaderboardLast30DaysForRepo,
  getOptimizationsTimeSeriesData,
  getPullRequestEventTimeSeriesData,
  getRepositoryById,
  getUserOptimizationCountByRepo,
  getUserOptimizationSuccessfulCountByRepo,
  getRepositoryMembers,
  updateRepositoryMemberRole,
  removeRepositoryMember,
  addRepositoryMemberById,
} from "./action"
import { GitHubUserSearchResult, Member } from "@/lib/types"
import { RepositoryWithUsage } from "@/app/(dashboard)/dashboard/action"
import { useViewMode } from "@/app/app/ViewModeContext"
import { MembersList } from "@/components/members/members-list"
import { UserSearchModal } from "@/components/members/user-search-modal"
import { RoleSelector } from "@/components/members/role-selector"
import { ConfirmDialog } from "@/components/confirm-dialog"
import { AccountPayload } from "@codeflash-ai/common"

export default async function RepositoryDetailPage({
  params,
// Repository Header Component
const RepositoryHeader = ({ repository }: { repository: RepositoryWithUsage }) => {
  return (
    <div className="mb-6 sm:mb-8">
      <div className="flex items-start">
        <div className="flex items-start gap-4 w-full">
          {/* Repository Avatar - Circular */}
          <div className="flex-shrink-0">
            {repository.avatarUrl ? (
              <div className="w-12 h-12 sm:w-16 sm:h-16 rounded-full overflow-hidden border-2 border-border/50 shadow-sm">
                <Image
                  src={repository.avatarUrl}
                  alt={`${repository.organization} avatar`}
                  width={64}
                  height={64}
                  className="object-cover w-full h-full"
                />
              </div>
            ) : (
              <div className="w-12 h-12 sm:w-16 sm:h-16 rounded-full bg-gradient-to-br from-primary/10 to-primary/30 flex items-center justify-center border-2 border-border shadow-sm">
                <span className="text-primary font-semibold text-lg sm:text-xl">
                  {repository.name?.substring(0, 1).toUpperCase() || "?"}
                </span>
              </div>
            )}
          </div>

          {/* Repository Info */}
          <div className="flex-1 min-w-0">
            <div className="flex items-center gap-2 flex-wrap">
              <h1 className="text-xl sm:text-2xl font-bold truncate text-foreground">
                {repository.name}
              </h1>
              <span
                className={`px-2.5 py-1 text-xs font-medium rounded-full whitespace-nowrap ${
                  repository.is_private
                    ? "bg-amber-100 text-amber-700 dark:bg-amber-900/30 dark:text-amber-400"
                    : "bg-emerald-100 text-emerald-700 dark:bg-emerald-900/30 dark:text-emerald-400"
                }`}
              >
                {repository.is_private ? "Private" : "Public"}
              </span>
              {repository.is_active && (
                <span className="inline-flex items-center px-2.5 py-1 rounded-full text-xs bg-green-100 text-green-700 dark:bg-green-900/30 dark:text-green-400 whitespace-nowrap">
                  <span className="inline-block w-2 h-2 rounded-full bg-green-500 mr-1.5 animate-pulse"></span>
                  Active
                </span>
              )}
              {repository.has_github_action && (
                <span className="inline-flex items-center px-2.5 py-1 rounded-full bg-blue-100 text-xs text-blue-700 dark:bg-blue-900/30 dark:text-blue-400 whitespace-nowrap">
                  <GitBranch size={12} className="mr-1" />
                  GitHub Action
                </span>
              )}
            </div>

            <a
              href={`https://github.com/${repository.full_name}`}
              target="_blank"
              rel="noopener noreferrer"
              className="text-sm text-muted-foreground hover:text-primary transition-colors mt-1 inline-block hover:underline"
            >
              {repository.full_name}
            </a>

            <div className="flex items-center gap-4 mt-2 flex-wrap">
              {repository.last_optimized && (
                <div className="text-xs text-muted-foreground flex items-center whitespace-nowrap">
                  <Clock size={12} className="mr-1" />
                  Last optimized: {new Date(repository.last_optimized).toLocaleDateString()}
                </div>
              )}
              {repository.membersCount !== undefined && repository.membersCount > 0 && (
                <div className="text-xs text-muted-foreground flex items-center whitespace-nowrap">
                  <Users size={12} className="mr-1" />
                  {repository.membersCount} {repository.membersCount === 1 ? "member" : "members"}
                </div>
              )}
            </div>
          </div>
        </div>
      </div>
    </div>
  )
}

// Tab Navigation Component
const TabNavigation = ({
  activeTab,
  onTabChange,
}: {
  params: Promise<{ repositoryId: string }>
}) {
  const { repositoryId } = await params
  const initData = await getRepoDetailInitData(repositoryId)
  activeTab: "statistics" | "members"
  onTabChange: (tab: "statistics" | "members") => void
}) => {
  return (
    <div className="bg-card rounded-2xl border border-border shadow-sm p-2 mb-6">
      <div className="flex gap-2">
        <button
          onClick={() => onTabChange("statistics")}
          className={`flex-1 flex items-center justify-center gap-2 px-4 py-3 rounded-xl font-medium transition-all duration-200 ${
            activeTab === "statistics"
              ? "bg-primary text-primary-foreground shadow-sm"
              : "text-muted-foreground hover:bg-accent hover:text-foreground"
          }`}
        >
          <BarChart3 size={18} />
          <span className="hidden sm:inline">Statistics</span>
          <span className="sm:hidden">Stats</span>
        </button>
        <button
          onClick={() => onTabChange("members")}
          className={`flex-1 flex items-center justify-center gap-2 px-4 py-3 rounded-xl font-medium transition-all duration-200 ${
            activeTab === "members"
              ? "bg-primary text-primary-foreground shadow-sm"
              : "text-muted-foreground hover:bg-accent hover:text-foreground"
          }`}
        >
          <Users size={18} />
          <span>Members</span>
        </button>
      </div>
    </div>
  )
}
// No session — auth middleware will redirect
if (!initData) {
  return null
// Statistics Tab Component
const StatisticsTab = ({
  optimizationStats,
  optimizationsTrend,
  optimizationsTrendDates,
  successfulOptimizationsTrend,
  successfulOptimizationsTrendDates,
  prActivityData,
  selectedPrYear,
  setSelectedPrYear,
  activeUsersData,
  dateRangeDisplay,
  isMobile,
  repositoryId,
}: {
  optimizationStats: { totalAttempts: number; successfulAttempts: number }
  optimizationsTrend: number[]
  optimizationsTrendDates: string[]
  successfulOptimizationsTrend: number[]
  successfulOptimizationsTrendDates: string[]
  prActivityData: Array<{
    month: string
    pr_created: number
    pr_merged: number
    pr_closed: number
  }>
  selectedPrYear: number
  setSelectedPrYear: (year: number) => void
  activeUsersData: { username: string; eventCount: number; avatarUrl: string }[]
  dateRangeDisplay: string
  isMobile: boolean
  repositoryId: string
}) => {
  return (
    <div className="space-y-6">
      {/* Repository Stats */}
      <div className="grid grid-cols-1 sm:grid-cols-2 gap-3 sm:gap-5">
        <MetricCard
          title="Optimization Attempts"
          value={optimizationStats.totalAttempts}
          icon={<Zap size={isMobile ? 16 : 20} />}
          gradientFrom="bg-gradient-to-br from-blue-500/20"
          gradientTo="to-blue-600/20"
          iconColor="text-blue-500"
          chartData={optimizationsTrend}
          chartDates={optimizationsTrendDates}
          chartColor="rgba(59, 130, 246, 1)"
          chartFillColor="rgba(59, 130, 246, 0.2)"
          timeText={dateRangeDisplay}
          emptyStateMessage="No optimization attempts"
          cumulativeChart={true}
        />
        <MetricCard
          title="Optimizations Found"
          value={optimizationStats.successfulAttempts}
          icon={<Gauge size={isMobile ? 16 : 20} />}
          gradientFrom="bg-gradient-to-br from-emerald-500/20"
          gradientTo="to-emerald-600/20"
          iconColor="text-emerald-500"
          chartData={successfulOptimizationsTrend}
          chartDates={successfulOptimizationsTrendDates}
          chartColor="rgba(16, 185, 129, 1)"
          chartFillColor="rgba(16, 185, 129, 0.2)"
          emptyStateMessage="No optimizations found"
          timeText="All time"
          cumulativeChart={true}
          showChart={successfulOptimizationsTrend.length > 0}
        />
      </div>

      {/* PR Activity and Active Users */}
      <div className="grid grid-cols-1 md:grid-cols-2 gap-3 sm:gap-5 h-96 md:h-[500px]">
        <CompactPullRequestActivityCard
          prData={prActivityData}
          selectedYear={selectedPrYear}
          onYearChange={setSelectedPrYear}
          className="h-full"
        />

        <div className="h-full">
          <ActiveUsersLeaderboard leaderboardData={activeUsersData} />
        </div>
      </div>

      {/* Optimization PRs Table */}
      <div>
        <OptimizationPRsTable repositoryId={repositoryId} />
      </div>
    </div>
  )
}

// Members Tab Component
const MembersTab = ({ repoId, currentUserId }: { repoId: string; currentUserId: string }) => {
  const [members, setMembers] = useState<Member[]>([])
  const [loading, setLoading] = useState(true)
  const [showAddModal, setShowAddModal] = useState(false)
  const [error, setError] = useState<string | null>(null)
  const [updatingMember, setUpdatingMember] = useState<string | null>(null)
  const [success, setSuccess] = useState<string | null>(null)
  const [searchQuery, setSearchQuery] = useState("")
  const [filterRole, setFilterRole] = useState<"all" | "owner" | "admin" | "member">("all")
  const [isRefreshing, setIsRefreshing] = useState(false)
  const [selectedRole, setSelectedRole] = useState<"admin" | "member">("member")
  const [confirmDialog, setConfirmDialog] = useState<{
    open: boolean
    memberId: string
    memberUsername: string
  } | null>(null)

  const currentUserMember = members.find(m => m.user_id === currentUserId)
  const isAdmin = currentUserMember?.role === "admin" || currentUserMember?.role === "owner"
  const isOnlyMember = members.length === 1

  const fetchMembers = useCallback(async () => {
    if (!isRefreshing) {
      setLoading(true)
    }
    setError(null)

    const result = await getRepositoryMembers(currentUserId, repoId)

    if (result.success && result.data) {
      setMembers(result.data)
    } else {
      setError(result.error || "Failed to load members")
    }

    setLoading(false)
    setIsRefreshing(false)
  }, [currentUserId, repoId, isRefreshing])

  useEffect(() => {
    fetchMembers()
  }, [fetchMembers])

  useEffect(() => {
    if (success) {
      const timer = setTimeout(() => setSuccess(null), 5000)
      return () => clearTimeout(timer)
    }
  }, [success])

  const handleMemberAdded = async () => {
    setIsRefreshing(true)
    await fetchMembers()
    setSuccess("Member added successfully!")
  }

  // Repository not found
  if (!initData.repository || !initData.stats) {
  const handleUserAdd = async (user: GitHubUserSearchResult) => {
    const result = await addRepositoryMemberById(currentUserId, repoId, user, selectedRole)
    if (result.success) {
      handleMemberAdded()
    }
    return result
  }

  const handleUpdateRole = async (memberId: string, newRole: "admin" | "member" | "owner") => {
    setUpdatingMember(memberId)
    setError(null)
    setSuccess(null)

    const result = await updateRepositoryMemberRole(currentUserId, repoId, memberId, newRole)

    if (result.success) {
      setSuccess("Member role updated successfully")
      setIsRefreshing(true)
      await fetchMembers()
    } else {
      setError(result.error || "Failed to update role")
    }

    setUpdatingMember(null)
  }

  const handleRemoveMember = async (memberId: string, memberUsername: string) => {
    setConfirmDialog({ open: true, memberId, memberUsername })
  }

  const confirmRemoveMember = async () => {
    if (!confirmDialog) return

    const { memberId, memberUsername } = confirmDialog

    setUpdatingMember(memberId)
    setError(null)
    setSuccess(null)

    const result = await removeRepositoryMember(currentUserId, repoId, memberId)

    if (result.success) {
      setSuccess(`${memberUsername} has been removed successfully`)
      setIsRefreshing(true)
      await fetchMembers()
    } else {
      setError(result.error || "Failed to remove member")
    }

    setUpdatingMember(null)
    setConfirmDialog(null)
  }

  if (loading) {
    return (
      <div className="bg-card rounded-2xl border border-border p-8 shadow-sm">
        <div className="flex flex-col items-center justify-center">
          <div className="relative">
            <div className="animate-spin rounded-full h-12 w-12 border-b-2 border-primary"></div>
            <div className="absolute inset-0 rounded-full border-2 border-primary/20"></div>
          </div>
          <p className="text-sm text-muted-foreground mt-4">Loading members...</p>
        </div>
      </div>
    )
  }

  const adminCount = members.filter(m => m.role === "admin" || m.role === "owner").length
  const memberCount = members.filter(m => m.role === "member").length

  return (
    <>
      <div className="bg-card rounded-2xl border border-border shadow-sm overflow-hidden">
        <div className="p-6 border-b border-border bg-accent/20">
          <div className="flex items-center justify-between gap-4 mb-4">
            <div className="flex-1 min-w-0">
              <div className="flex items-center gap-3 mb-2">
                <div className="p-2 rounded-lg bg-primary/10">
                  <Users size={20} className="text-primary" />
                </div>
                <h2 className="text-xl font-semibold text-foreground">Repository Members</h2>
              </div>
              <p className="text-sm text-muted-foreground">
                {members.length} {members.length === 1 ? "member" : "members"} • {adminCount}{" "}
                {adminCount === 1 ? "admin" : "admins"} • {memberCount}{" "}
                {memberCount === 1 ? "member" : "members"}
              </p>
            </div>
            {(isAdmin || isOnlyMember) && (
              <div className="flex items-center gap-3">
                <RoleSelector
                  selectedRole={selectedRole}
                  onChange={setSelectedRole}
                  disabled={showAddModal}
                />
                <button
                  onClick={() => setShowAddModal(true)}
                  disabled={showAddModal}
                  className="flex items-center gap-2 px-4 py-2.5 bg-primary text-primary-foreground rounded-xl hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-all duration-200 text-sm font-medium whitespace-nowrap flex-shrink-0 shadow-sm hover:shadow-md"
                >
                  <UserPlus size={16} />
                  <span className="hidden sm:inline">Add Member</span>
                  <span className="sm:hidden">Add</span>
                </button>
              </div>
            )}
          </div>
        </div>

        <MembersList
          members={members}
          currentUserId={currentUserId}
          isAdmin={isAdmin}
          updatingMember={updatingMember}
          error={error}
          success={success}
          searchQuery={searchQuery}
          filterRole={filterRole}
          onSearchChange={setSearchQuery}
          onFilterChange={setFilterRole}
          onUpdateRole={handleUpdateRole}
          onRemove={handleRemoveMember}
          onDismissError={() => setError(null)}
          onDismissSuccess={() => setSuccess(null)}
        />
      </div>

      <UserSearchModal
        isOpen={showAddModal}
        onClose={() => setShowAddModal(false)}
        onUserAdd={handleUserAdd}
        title={`Add Repository Member as ${selectedRole === "admin" ? "Admin" : "Member"}`}
        description="Search for GitHub users and add them to this repository"
        addButtonText={`Add as ${selectedRole === "admin" ? "Admin" : "Member"}`}
      />

      {/* Confirm Dialog */}
      <ConfirmDialog
        open={confirmDialog?.open || false}
        onOpenChange={open => !open && setConfirmDialog(null)}
        onConfirm={confirmRemoveMember}
        title="Remove Member"
        description={`Are you sure you want to remove ${confirmDialog?.memberUsername} from this repository? This action cannot be undone.`}
        confirmText="Remove"
        cancelText="Cancel"
        variant="destructive"
      />
    </>
  )
}
// Main repository detail component
function RepositoryDetail() {
  const params = useParams()
  const router = useRouter()
  const searchParams = useSearchParams()
  const repositoryId = params.repositoryId as string

  const [repository, setRepository] = useState<RepositoryWithUsage | null>(null)
  const [currentUserId, setCurrentUserId] = useState<string>("")
  const [loading, setLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)
  const [retryCount, setRetryCount] = useState(0)
  const maxRetries = 3
  const { currentOrg } = useViewMode()

  const tabFromUrl = (searchParams.get("tab") as "statistics" | "members") || "statistics"
  const [activeTab, setActiveTab] = useState<"statistics" | "members">(
    currentOrg ? "statistics" : tabFromUrl,
  )

  const [optimizationStats, setOptimizationStats] = useState({
    totalAttempts: 0,
    successfulAttempts: 0,
  })

  const [prActivityData, setPrActivityData] = useState<
    Array<{
      month: string
      pr_created: number
      pr_merged: number
      pr_closed: number
    }>
  >([])
  const [selectedPrYear, setSelectedPrYear] = useState<number>(new Date().getFullYear())
  const [activeUsersData, setActiveUsersData] = useState<
    { username: string; eventCount: number; avatarUrl: string }[]
  >([])

  const [optimizationsTrend, setOptimizationsTrend] = useState<number[]>([])
  const [optimizationsTrendDates, setOptimizationsTrendDates] = useState<string[]>([])
  const [successfulOptimizationsTrend, setSuccessfulOptimizationsTrend] = useState<number[]>([])
  const [successfulOptimizationsTrendDates, setSuccessfulOptimizationsTrendDates] = useState<
    string[]
  >([])
  const [isMobile, setIsMobile] = useState<boolean>(false)

  useEffect(() => {
    const handleResize = () => {
      setIsMobile(window.innerWidth < 640)
    }

    if (typeof window !== "undefined") {
      handleResize()
      window.addEventListener("resize", handleResize)
      return () => window.removeEventListener("resize", handleResize)
    }
  }, [])

  useEffect(() => {
    if (currentOrg) {
      setActiveTab("statistics")
    }
  }, [currentOrg])

  const handleTabChange = (tab: "statistics" | "members") => {
    if (currentOrg) return

    setActiveTab(tab)
    const url = new URL(window.location.href)
    url.searchParams.set("tab", tab)
    router.push(url.pathname + url.search, { scroll: false })
  }

  const fetchRepositoryData = useCallback(
    async (attempt = 0) => {
      try {
        setLoading(attempt === 0)
        setError(null)

        if (attempt > 0) {
          await new Promise(resolve => setTimeout(resolve, Math.pow(2, attempt) * 1000))
        }

        const data = await getUserIdAndUsername()

        setCurrentUserId(data.userId)

        const payload: AccountPayload = currentOrg
          ? { orgId: currentOrg.id }
          : { userId: data.userId, username: data.username }

        const currentRepo = await getRepositoryById(payload, repositoryId)

        if (!currentRepo) {
          throw new Error("Repository not found")
        }

        setRepository(currentRepo)

        // Fetch all statistics in parallel - these are all independent queries
        const [
          totalAttempts,
          successfulAttempts,
          optimizationsOverTime,
          successfulOptimizationsOverTime,
          prData,
          leaderboardData,
        ] = await Promise.all([
          getUserOptimizationCountByRepo(repositoryId),
          getUserOptimizationSuccessfulCountByRepo(repositoryId),
          getOptimizationsTimeSeriesData(repositoryId, false),
          getOptimizationsTimeSeriesData(repositoryId, true),
          getPullRequestEventTimeSeriesData(selectedPrYear, repositoryId),
          getActiveUserLeaderboardLast30DaysForRepo(repositoryId),
        ])

        if (Array.isArray(optimizationsOverTime) && optimizationsOverTime.length > 0) {
          const optimizationValues = optimizationsOverTime.map(item => item?.count || 0)
          const optimizationDates = optimizationsOverTime.map(item => item?.date || "")
          setOptimizationsTrend(optimizationValues)
          setOptimizationsTrendDates(optimizationDates)
        } else {
          setOptimizationsTrend([])
          setOptimizationsTrendDates([])
        }

        if (
          Array.isArray(successfulOptimizationsOverTime) &&
          successfulOptimizationsOverTime.length > 0
        ) {
          const successfulValues = successfulOptimizationsOverTime.map(item => item?.count || 0)
          const successfulDates = successfulOptimizationsOverTime.map(item => item?.date || "")
          setSuccessfulOptimizationsTrend(successfulValues)
          setSuccessfulOptimizationsTrendDates(successfulDates)
        } else {
          setSuccessfulOptimizationsTrend([])
          setSuccessfulOptimizationsTrendDates([])
        }

        if (Array.isArray(prData)) {
          setPrActivityData(prData)
        } else {
          setPrActivityData([])
        }

        if (Array.isArray(leaderboardData)) {
          setActiveUsersData(leaderboardData)
        } else {
          setActiveUsersData([])
        }

        setOptimizationStats({
          totalAttempts,
          successfulAttempts,
        })

        setRetryCount(0)
      } catch (err) {
        console.error(`Failed to fetch repository data (attempt ${attempt + 1}):`, err)

        if (
          attempt < maxRetries &&
          err instanceof Error &&
          (err.message.includes("authentication") ||
            err.message.includes("User authentication data not found") ||
            err.message.includes("Unauthorized") ||
            err.message.includes("No valid session found"))
        ) {
          setRetryCount(attempt + 1)
          return fetchRepositoryData(attempt + 1)
        }

        setError(
          err instanceof Error && err.message === "Repository not found"
            ? "Repository not found"
            : "Failed to load repository data. Please try again later.",
        )
      } finally {
        setLoading(false)
      }
    },
    [maxRetries, selectedPrYear, repositoryId, currentOrg],
  )

  useEffect(() => {
    const lastAuthCheck = localStorage.getItem("lastAuthCheck")
    const now = Date.now()

    if (lastAuthCheck && now - parseInt(lastAuthCheck) < 2000) {
      const delay = 2000 - (now - parseInt(lastAuthCheck))
      setTimeout(() => {
        fetchRepositoryData()
      }, delay)
    } else {
      const timeoutId = setTimeout(() => {
        fetchRepositoryData()
      }, 100)

      const cleanup = () => clearTimeout(timeoutId)
      return cleanup
    }

    localStorage.setItem("lastAuthCheck", now.toString())
  }, [fetchRepositoryData])

  const now = useMemo(() => new Date(), [])
  const last30DaysStart = subDays(now, 30)

  const dateRangeDisplay = useMemo(() => {
    const startMonth = format(last30DaysStart, "MMMM")
    const endMonth = format(now, "MMMM")
    const startYear = format(last30DaysStart, "yyyy")
    const endYear = format(now, "yyyy")

    if (startMonth === endMonth && startYear === endYear) {
      return `${startMonth} ${format(last30DaysStart, "d")}-${format(now, "d")}, ${startYear}`
    } else if (startYear === endYear) {
      return `${format(last30DaysStart, "MMMM d")} - ${format(now, "MMMM d")}, ${startYear}`
    } else {
      return `${format(last30DaysStart, "MMMM d, yyyy")} - ${format(now, "MMMM d, yyyy")}`
    }
  }, [last30DaysStart, now])

  if (loading) {
    return <RepositoryDetailSkeleton showTabNavigation={!currentOrg} />
  }

  if (error) {
    return (
      <div className="flex justify-center items-center min-h-[70vh] p-4">
        <div className="bg-destructive/10 border border-destructive/20 text-destructive p-6 sm:p-8 rounded-2xl w-full max-w-md shadow-lg">
          <div className="inline-flex items-center justify-center w-12 h-12 rounded-full bg-destructive/20 mb-4">
            <AlertCircle size={24} />
          </div>
          <h3 className="text-base sm:text-lg font-semibold mb-2">Unable to Load Repository</h3>
          <p className="mb-4 text-sm sm:text-base opacity-90">{error}</p>
          {retryCount > 0 && (
            <p className="mb-4 text-xs opacity-75">
              Retry attempt: {retryCount}/{maxRetries}
            </p>
          )}
          <button
            onClick={() => fetchRepositoryData()}
            className="flex items-center gap-2 w-full justify-center px-4 py-2.5 bg-destructive hover:bg-destructive/90 text-destructive-foreground rounded-xl text-sm font-medium transition-all shadow-sm hover:shadow-md"
          >
            <RefreshCw size={16} /> Try Again
          </button>
        </div>
      </div>
    )
  }

  if (!repository) {
    return (
      <div className="flex justify-center items-center min-h-[70vh] p-4">
        <div className="text-center">
@@ -34,15 +737,40 @@ export default async function RepositoryDetailPage({
    )
  }

  return (
    <div className="flex-1 bg-background">
      <div className="h-screen py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
        <RepositoryHeader repository={repository} />

        {!currentOrg && <TabNavigation activeTab={activeTab} onTabChange={handleTabChange} />}

        {currentOrg || activeTab === "statistics" ? (
          <StatisticsTab
            optimizationStats={optimizationStats}
            optimizationsTrend={optimizationsTrend}
            optimizationsTrendDates={optimizationsTrendDates}
            successfulOptimizationsTrend={successfulOptimizationsTrend}
            successfulOptimizationsTrendDates={successfulOptimizationsTrendDates}
            prActivityData={prActivityData}
            selectedPrYear={selectedPrYear}
            setSelectedPrYear={setSelectedPrYear}
            activeUsersData={activeUsersData}
            dateRangeDisplay={dateRangeDisplay}
            isMobile={isMobile}
            repositoryId={repositoryId}
          />
        ) : (
          <MembersTab repoId={repositoryId} currentUserId={currentUserId} />
        )}
      </div>
    </div>
  )
}

export default function RepositoryDetailWrapper() {
  return (
    <DashboardErrorBoundary>
      <RepoDetailClient
        repositoryId={repositoryId}
        initialUserId={initData.userId}
        initialOrgId={initData.orgId ?? null}
        initialRepository={initData.repository as any}
        initialStats={initData.stats}
      />
      <RepositoryDetail />
    </DashboardErrorBoundary>
  )
}

@@ -1,770 +0,0 @@
"use client"
import React, { useState, useMemo, useEffect, useCallback, useRef } from "react"
import {
  Zap,
  Gauge,
  GitPullRequest,
  Clock,
  GitBranch,
  Users,
  RefreshCw,
  UserPlus,
  AlertCircle,
  BarChart3,
} from "lucide-react"
import { format, subDays } from "date-fns"
import { ActiveUsersLeaderboard } from "@/components/dashboard/ActiveUsersLeaderboard"
import { CompactPullRequestActivityCard } from "@/components/dashboard/CompactPullRequestActivityCard"
import { MetricCard } from "@/components/dashboard/MetricCard"
import { OptimizationPRsTable } from "@/components/dashboard/OptimizationPRsTable"
import { RepositoryDetailSkeleton } from "@/components/repositories/RepositoryDetailSkeleton"
import Image from "next/image"
import { useRouter, useSearchParams } from "next/navigation"
import {
  getActiveUserLeaderboardLast30DaysForRepo,
  getOptimizationsTimeSeriesData,
  getPullRequestEventTimeSeriesData,
  getRepositoryById,
  getOptimizationCountsByRepo,
  getRepositoryMembers,
  updateRepositoryMemberRole,
  removeRepositoryMember,
  addRepositoryMemberById,
} from "./action"
import { GitHubUserSearchResult, Member } from "@/lib/types"
import { RepositoryWithUsage } from "@/app/(dashboard)/dashboard/action"
import { useViewMode } from "@/app/app/ViewModeContext"
import { MembersList } from "@/components/members/members-list"
import { UserSearchModal } from "@/components/members/user-search-modal"
import { RoleSelector } from "@/components/members/role-selector"
import { ConfirmDialog } from "@/components/confirm-dialog"
import type { AccountPayload } from "@codeflash-ai/common"

// Repository Header Component
const RepositoryHeader = ({ repository }: { repository: RepositoryWithUsage }) => {
  return (
    <div className="mb-6 sm:mb-8">
      <div className="flex items-start">
        <div className="flex items-start gap-4 w-full">
          {/* Repository Avatar - Circular */}
          <div className="flex-shrink-0">
            {repository.avatarUrl ? (
              <div className="w-12 h-12 sm:w-16 sm:h-16 rounded-full overflow-hidden border-2 border-border/50 shadow-sm">
                <Image
                  src={repository.avatarUrl}
                  alt={`${repository.organization} avatar`}
                  width={64}
                  height={64}
                  className="object-cover w-full h-full"
                />
              </div>
            ) : (
              <div className="w-12 h-12 sm:w-16 sm:h-16 rounded-full bg-gradient-to-br from-primary/10 to-primary/30 flex items-center justify-center border-2 border-border shadow-sm">
                <span className="text-primary font-semibold text-lg sm:text-xl">
                  {repository.name?.substring(0, 1).toUpperCase() || "?"}
                </span>
              </div>
            )}
          </div>

          {/* Repository Info */}
          <div className="flex-1 min-w-0">
            <div className="flex items-center gap-2 flex-wrap">
              <h1 className="text-xl sm:text-2xl font-bold truncate text-foreground">
                {repository.name}
              </h1>
              <span
                className={`px-2.5 py-1 text-xs font-medium rounded-full whitespace-nowrap ${
                  repository.is_private
                    ? "bg-amber-100 text-amber-700 dark:bg-amber-900/30 dark:text-amber-400"
                    : "bg-emerald-100 text-emerald-700 dark:bg-emerald-900/30 dark:text-emerald-400"
                }`}
              >
                {repository.is_private ? "Private" : "Public"}
              </span>
              {repository.is_active && (
                <span className="inline-flex items-center px-2.5 py-1 rounded-full text-xs bg-green-100 text-green-700 dark:bg-green-900/30 dark:text-green-400 whitespace-nowrap">
                  <span className="inline-block w-2 h-2 rounded-full bg-green-500 mr-1.5 animate-pulse"></span>
                  Active
                </span>
              )}
              {repository.has_github_action && (
                <span className="inline-flex items-center px-2.5 py-1 rounded-full bg-blue-100 text-xs text-blue-700 dark:bg-blue-900/30 dark:text-blue-400 whitespace-nowrap">
                  <GitBranch size={12} className="mr-1" />
                  GitHub Action
                </span>
              )}
            </div>

            <a
              href={`https://github.com/${repository.full_name}`}
              target="_blank"
              rel="noopener noreferrer"
              className="text-sm text-muted-foreground hover:text-primary transition-colors mt-1 inline-block hover:underline"
            >
              {repository.full_name}
            </a>

            <div className="flex items-center gap-4 mt-2 flex-wrap">
              {repository.last_optimized && (
                <div className="text-xs text-muted-foreground flex items-center whitespace-nowrap">
                  <Clock size={12} className="mr-1" />
                  Last optimized: {new Date(repository.last_optimized).toLocaleDateString()}
                </div>
              )}
              {repository.membersCount !== undefined && repository.membersCount > 0 && (
                <div className="text-xs text-muted-foreground flex items-center whitespace-nowrap">
                  <Users size={12} className="mr-1" />
                  {repository.membersCount} {repository.membersCount === 1 ? "member" : "members"}
                </div>
              )}
            </div>
          </div>
        </div>
      </div>
    </div>
  )
}

// Tab Navigation Component
const TabNavigation = ({
  activeTab,
  onTabChange,
}: {
  activeTab: "statistics" | "members"
  onTabChange: (tab: "statistics" | "members") => void
}) => {
  return (
    <div className="bg-card rounded-2xl border border-border shadow-sm p-2 mb-6">
      <div className="flex gap-2">
        <button
          onClick={() => onTabChange("statistics")}
          className={`flex-1 flex items-center justify-center gap-2 px-4 py-3 rounded-xl font-medium transition-all duration-200 ${
            activeTab === "statistics"
              ? "bg-primary text-primary-foreground shadow-sm"
              : "text-muted-foreground hover:bg-accent hover:text-foreground"
          }`}
        >
          <BarChart3 size={18} />
          <span className="hidden sm:inline">Statistics</span>
          <span className="sm:hidden">Stats</span>
        </button>
        <button
          onClick={() => onTabChange("members")}
          className={`flex-1 flex items-center justify-center gap-2 px-4 py-3 rounded-xl font-medium transition-all duration-200 ${
|
||||
activeTab === "members"
|
||||
? "bg-primary text-primary-foreground shadow-sm"
|
||||
: "text-muted-foreground hover:bg-accent hover:text-foreground"
|
||||
}`}
|
||||
>
|
||||
<Users size={18} />
|
||||
<span>Members</span>
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
// Statistics Tab Component
|
||||
const StatisticsTab = ({
|
||||
optimizationStats,
|
||||
optimizationsTrend,
|
||||
optimizationsTrendDates,
|
||||
successfulOptimizationsTrend,
|
||||
successfulOptimizationsTrendDates,
|
||||
prActivityData,
|
||||
selectedPrYear,
|
||||
setSelectedPrYear,
|
||||
activeUsersData,
|
||||
dateRangeDisplay,
|
||||
isMobile,
|
||||
repositoryId,
|
||||
}: {
|
||||
optimizationStats: { totalAttempts: number; successfulAttempts: number }
|
||||
optimizationsTrend: number[]
|
||||
optimizationsTrendDates: string[]
|
||||
successfulOptimizationsTrend: number[]
|
||||
successfulOptimizationsTrendDates: string[]
|
||||
prActivityData: Array<{
|
||||
month: string
|
||||
pr_created: number
|
||||
pr_merged: number
|
||||
pr_closed: number
|
||||
}>
|
||||
selectedPrYear: number
|
||||
setSelectedPrYear: (year: number) => void
|
||||
activeUsersData: { username: string; eventCount: number; avatarUrl: string }[]
|
||||
dateRangeDisplay: string
|
||||
isMobile: boolean
|
||||
repositoryId: string
|
||||
}) => {
|
||||
return (
|
||||
<div className="space-y-6">
|
||||
{/* Repository Stats */}
|
||||
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3 sm:gap-5">
|
||||
<MetricCard
|
||||
title="Optimization Attempts"
|
||||
value={optimizationStats.totalAttempts}
|
||||
icon={<Zap size={isMobile ? 16 : 20} />}
|
||||
gradientFrom="bg-gradient-to-br from-blue-500/20"
|
||||
gradientTo="to-blue-600/20"
|
||||
iconColor="text-blue-500"
|
||||
chartData={optimizationsTrend}
|
||||
chartDates={optimizationsTrendDates}
|
||||
chartColor="rgba(59, 130, 246, 1)"
|
||||
chartFillColor="rgba(59, 130, 246, 0.2)"
|
||||
timeText={dateRangeDisplay}
|
||||
emptyStateMessage="No optimization attempts"
|
||||
cumulativeChart={true}
|
||||
/>
|
||||
<MetricCard
|
||||
title="Optimizations Found"
|
||||
value={optimizationStats.successfulAttempts}
|
||||
icon={<Gauge size={isMobile ? 16 : 20} />}
|
||||
gradientFrom="bg-gradient-to-br from-emerald-500/20"
|
||||
gradientTo="to-emerald-600/20"
|
||||
iconColor="text-emerald-500"
|
||||
chartData={successfulOptimizationsTrend}
|
||||
chartDates={successfulOptimizationsTrendDates}
|
||||
chartColor="rgba(16, 185, 129, 1)"
|
||||
chartFillColor="rgba(16, 185, 129, 0.2)"
|
||||
emptyStateMessage="No optimizations found"
|
||||
timeText="All time"
|
||||
cumulativeChart={true}
|
||||
showChart={successfulOptimizationsTrend.length > 0}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* PR Activity and Active Users */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-3 sm:gap-5 h-96 md:h-[500px]">
|
||||
<CompactPullRequestActivityCard
|
||||
prData={prActivityData}
|
||||
selectedYear={selectedPrYear}
|
||||
onYearChange={setSelectedPrYear}
|
||||
className="h-full"
|
||||
/>
|
||||
|
||||
<div className="h-full">
|
||||
<ActiveUsersLeaderboard leaderboardData={activeUsersData} />
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Optimization PRs Table */}
|
||||
<div>
|
||||
<OptimizationPRsTable repositoryId={repositoryId} />
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
// Members Tab Component
|
||||
const MembersTab = ({ repoId, currentUserId }: { repoId: string; currentUserId: string }) => {
|
||||
const [members, setMembers] = useState<Member[]>([])
|
||||
const [loading, setLoading] = useState(true)
|
||||
const [showAddModal, setShowAddModal] = useState(false)
|
||||
const [error, setError] = useState<string | null>(null)
|
||||
const [updatingMember, setUpdatingMember] = useState<string | null>(null)
|
||||
const [success, setSuccess] = useState<string | null>(null)
|
||||
const [searchQuery, setSearchQuery] = useState("")
|
||||
const [filterRole, setFilterRole] = useState<"all" | "owner" | "admin" | "member">("all")
|
||||
const [isRefreshing, setIsRefreshing] = useState(false)
|
||||
const [selectedRole, setSelectedRole] = useState<"admin" | "member">("member")
|
||||
const [confirmDialog, setConfirmDialog] = useState<{
|
||||
open: boolean
|
||||
memberId: string
|
||||
memberUsername: string
|
||||
} | null>(null)
|
||||
|
||||
const currentUserMember = members.find(m => m.user_id === currentUserId)
|
||||
const isAdmin = currentUserMember?.role === "admin" || currentUserMember?.role === "owner"
|
||||
const isOnlyMember = members.length === 1
|
||||
|
||||
const fetchMembers = useCallback(async () => {
|
||||
if (!isRefreshing) {
|
||||
setLoading(true)
|
||||
}
|
||||
setError(null)
|
||||
|
||||
const result = await getRepositoryMembers(currentUserId, repoId)
|
||||
|
||||
if (result.success && result.data) {
|
||||
setMembers(result.data)
|
||||
} else {
|
||||
setError(result.error || "Failed to load members")
|
||||
}
|
||||
|
||||
setLoading(false)
|
||||
setIsRefreshing(false)
|
||||
}, [currentUserId, repoId, isRefreshing])
|
||||
|
||||
useEffect(() => {
|
||||
fetchMembers()
|
||||
}, [fetchMembers])
|
||||
|
||||
useEffect(() => {
|
||||
if (success) {
|
||||
const timer = setTimeout(() => setSuccess(null), 5000)
|
||||
return () => clearTimeout(timer)
|
||||
}
|
||||
}, [success])
|
||||
|
||||
const handleMemberAdded = async () => {
|
||||
setIsRefreshing(true)
|
||||
await fetchMembers()
|
||||
setSuccess("Member added successfully!")
|
||||
}
|
||||
|
||||
const handleUserAdd = async (user: GitHubUserSearchResult) => {
|
||||
const result = await addRepositoryMemberById(currentUserId, repoId, user, selectedRole)
|
||||
if (result.success) {
|
||||
handleMemberAdded()
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
const handleUpdateRole = async (memberId: string, newRole: "admin" | "member" | "owner") => {
|
||||
setUpdatingMember(memberId)
|
||||
setError(null)
|
||||
setSuccess(null)
|
||||
|
||||
const result = await updateRepositoryMemberRole(currentUserId, repoId, memberId, newRole)
|
||||
|
||||
if (result.success) {
|
||||
setSuccess("Member role updated successfully")
|
||||
setIsRefreshing(true)
|
||||
await fetchMembers()
|
||||
} else {
|
||||
setError(result.error || "Failed to update role")
|
||||
}
|
||||
|
||||
setUpdatingMember(null)
|
||||
}
|
||||
|
||||
const handleRemoveMember = async (memberId: string, memberUsername: string) => {
|
||||
setConfirmDialog({ open: true, memberId, memberUsername })
|
||||
}
|
||||
|
||||
const confirmRemoveMember = async () => {
|
||||
if (!confirmDialog) return
|
||||
|
||||
const { memberId, memberUsername } = confirmDialog
|
||||
|
||||
setUpdatingMember(memberId)
|
||||
setError(null)
|
||||
setSuccess(null)
|
||||
|
||||
const result = await removeRepositoryMember(currentUserId, repoId, memberId)
|
||||
|
||||
if (result.success) {
|
||||
setSuccess(`${memberUsername} has been removed successfully`)
|
||||
setIsRefreshing(true)
|
||||
await fetchMembers()
|
||||
} else {
|
||||
setError(result.error || "Failed to remove member")
|
||||
}
|
||||
|
||||
setUpdatingMember(null)
|
||||
setConfirmDialog(null)
|
||||
}
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<div className="bg-card rounded-2xl border border-border p-8 shadow-sm">
|
||||
<div className="flex flex-col items-center justify-center">
|
||||
<div className="relative">
|
||||
<div className="animate-spin rounded-full h-12 w-12 border-b-2 border-primary"></div>
|
||||
<div className="absolute inset-0 rounded-full border-2 border-primary/20"></div>
|
||||
</div>
|
||||
<p className="text-sm text-muted-foreground mt-4">Loading members...</p>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
const adminCount = members.filter(m => m.role === "admin" || m.role === "owner").length
|
||||
const memberCount = members.filter(m => m.role === "member").length
|
||||
|
||||
return (
|
||||
<>
|
||||
<div className="bg-card rounded-2xl border border-border shadow-sm overflow-hidden">
|
||||
<div className="p-6 border-b border-border bg-accent/20">
|
||||
<div className="flex items-center justify-between gap-4 mb-4">
|
||||
<div className="flex-1 min-w-0">
|
||||
<div className="flex items-center gap-3 mb-2">
|
||||
<div className="p-2 rounded-lg bg-primary/10">
|
||||
<Users size={20} className="text-primary" />
|
||||
</div>
|
||||
<h2 className="text-xl font-semibold text-foreground">Repository Members</h2>
|
||||
</div>
|
||||
<p className="text-sm text-muted-foreground">
|
||||
{members.length} {members.length === 1 ? "member" : "members"} • {adminCount}{" "}
|
||||
{adminCount === 1 ? "admin" : "admins"} • {memberCount}{" "}
|
||||
{memberCount === 1 ? "member" : "members"}
|
||||
</p>
|
||||
</div>
|
||||
{(isAdmin || isOnlyMember) && (
|
||||
<div className="flex items-center gap-3">
|
||||
<RoleSelector
|
||||
selectedRole={selectedRole}
|
||||
onChange={setSelectedRole}
|
||||
disabled={showAddModal}
|
||||
/>
|
||||
<button
|
||||
onClick={() => setShowAddModal(true)}
|
||||
disabled={showAddModal}
|
||||
className="flex items-center gap-2 px-4 py-2.5 bg-primary text-primary-foreground rounded-xl hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-all duration-200 text-sm font-medium whitespace-nowrap flex-shrink-0 shadow-sm hover:shadow-md"
|
||||
>
|
||||
<UserPlus size={16} />
|
||||
<span className="hidden sm:inline">Add Member</span>
|
||||
<span className="sm:hidden">Add</span>
|
||||
</button>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<MembersList
|
||||
members={members}
|
||||
currentUserId={currentUserId}
|
||||
isAdmin={isAdmin}
|
||||
updatingMember={updatingMember}
|
||||
error={error}
|
||||
success={success}
|
||||
searchQuery={searchQuery}
|
||||
filterRole={filterRole}
|
||||
onSearchChange={setSearchQuery}
|
||||
onFilterChange={setFilterRole}
|
||||
onUpdateRole={handleUpdateRole}
|
||||
onRemove={handleRemoveMember}
|
||||
onDismissError={() => setError(null)}
|
||||
onDismissSuccess={() => setSuccess(null)}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<UserSearchModal
|
||||
isOpen={showAddModal}
|
||||
onClose={() => setShowAddModal(false)}
|
||||
onUserAdd={handleUserAdd}
|
||||
title={`Add Repository Member as ${selectedRole === "admin" ? "Admin" : "Member"}`}
|
||||
description="Search for GitHub users and add them to this repository"
|
||||
addButtonText={`Add as ${selectedRole === "admin" ? "Admin" : "Member"}`}
|
||||
/>
|
||||
|
||||
{/* Confirm Dialog */}
|
||||
<ConfirmDialog
|
||||
open={confirmDialog?.open || false}
|
||||
onOpenChange={open => !open && setConfirmDialog(null)}
|
||||
onConfirm={confirmRemoveMember}
|
||||
title="Remove Member"
|
||||
description={`Are you sure you want to remove ${confirmDialog?.memberUsername} from this repository? This action cannot be undone.`}
|
||||
confirmText="Remove"
|
||||
cancelText="Cancel"
|
||||
variant="destructive"
|
||||
/>
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
export interface RepoDetailStats {
|
||||
totalAttempts: number
|
||||
successfulAttempts: number
|
||||
optimizationsTrend: number[]
|
||||
optimizationsTrendDates: string[]
|
||||
successfulOptimizationsTrend: number[]
|
||||
successfulOptimizationsTrendDates: string[]
|
||||
prActivityData: Array<{
|
||||
month: string
|
||||
pr_created: number
|
||||
pr_merged: number
|
||||
pr_closed: number
|
||||
}>
|
||||
activeUsersData: { username: string; eventCount: number; avatarUrl: string }[]
|
||||
prYear: number
|
||||
}
|
||||
|
||||
export interface RepoDetailClientProps {
|
||||
repositoryId: string
|
||||
initialUserId: string
|
||||
initialOrgId: string | null
|
||||
initialRepository: RepositoryWithUsage
|
||||
initialStats: RepoDetailStats
|
||||
}
|
||||
|
||||
export function RepoDetailClient({
|
||||
repositoryId,
|
||||
initialUserId,
|
||||
initialOrgId,
|
||||
initialRepository,
|
||||
initialStats,
|
||||
}: RepoDetailClientProps) {
|
||||
const router = useRouter()
|
||||
const searchParams = useSearchParams()
|
||||
const { currentOrg } = useViewMode()
|
||||
const initialOrgIdRef = useRef(initialOrgId)
|
||||
|
||||
const [repository, setRepository] = useState<RepositoryWithUsage | null>(initialRepository)
|
||||
const [currentUserId] = useState<string>(initialUserId)
|
||||
const [loading, setLoading] = useState(false)
|
||||
const [error, setError] = useState<string | null>(null)
|
||||
const [retryCount, setRetryCount] = useState(0)
|
||||
const maxRetries = 3
|
||||
|
||||
const tabFromUrl = (searchParams.get("tab") as "statistics" | "members") || "statistics"
|
||||
const [activeTab, setActiveTab] = useState<"statistics" | "members">(
|
||||
currentOrg ? "statistics" : tabFromUrl,
|
||||
)
|
||||
|
||||
const [optimizationStats, setOptimizationStats] = useState({
|
||||
totalAttempts: initialStats.totalAttempts,
|
||||
successfulAttempts: initialStats.successfulAttempts,
|
||||
})
|
||||
|
||||
const [prActivityData, setPrActivityData] = useState(initialStats.prActivityData)
|
||||
const [selectedPrYear, setSelectedPrYear] = useState<number>(initialStats.prYear)
|
||||
const [activeUsersData, setActiveUsersData] = useState(initialStats.activeUsersData)
|
||||
|
||||
const [optimizationsTrend, setOptimizationsTrend] = useState(initialStats.optimizationsTrend)
|
||||
const [optimizationsTrendDates, setOptimizationsTrendDates] = useState(
|
||||
initialStats.optimizationsTrendDates,
|
||||
)
|
||||
const [successfulOptimizationsTrend, setSuccessfulOptimizationsTrend] = useState(
|
||||
initialStats.successfulOptimizationsTrend,
|
||||
)
|
||||
const [successfulOptimizationsTrendDates, setSuccessfulOptimizationsTrendDates] = useState(
|
||||
initialStats.successfulOptimizationsTrendDates,
|
||||
)
|
||||
const [isMobile, setIsMobile] = useState<boolean>(false)
|
||||
|
||||
useEffect(() => {
|
||||
const handleResize = () => {
|
||||
setIsMobile(window.innerWidth < 640)
|
||||
}
|
||||
|
||||
if (typeof window !== "undefined") {
|
||||
handleResize()
|
||||
window.addEventListener("resize", handleResize)
|
||||
return () => window.removeEventListener("resize", handleResize)
|
||||
}
|
||||
}, [])
|
||||
|
||||
useEffect(() => {
|
||||
if (currentOrg) {
|
||||
setActiveTab("statistics")
|
||||
}
|
||||
}, [currentOrg])
|
||||
|
||||
const handleTabChange = (tab: "statistics" | "members") => {
|
||||
if (currentOrg) return
|
||||
|
||||
setActiveTab(tab)
|
||||
const url = new URL(window.location.href)
|
||||
url.searchParams.set("tab", tab)
|
||||
router.push(url.pathname + url.search, { scroll: false })
|
||||
}
|
||||
|
||||
const fetchRepositoryData = useCallback(
|
||||
async (attempt = 0) => {
|
||||
try {
|
||||
setLoading(attempt === 0)
|
||||
setError(null)
|
||||
|
||||
if (attempt > 0) {
|
||||
await new Promise(resolve => setTimeout(resolve, Math.pow(2, attempt) * 1000))
|
||||
}
|
||||
|
||||
const payload: AccountPayload = currentOrg
|
||||
? { orgId: currentOrg.id }
|
||||
: { userId: currentUserId, username: "" }
|
||||
|
||||
const currentRepo = await getRepositoryById(payload, repositoryId)
|
||||
|
||||
if (!currentRepo) {
|
||||
throw new Error("Repository not found")
|
||||
}
|
||||
|
||||
setRepository(currentRepo)
|
||||
|
||||
// Fetch all statistics in parallel - these are all independent queries
|
||||
// Use the combined count query (single SQL) instead of two separate COUNT calls
|
||||
const [
|
||||
counts,
|
||||
optimizationsOverTime,
|
||||
successfulOptimizationsOverTime,
|
||||
prData,
|
||||
leaderboardData,
|
||||
] = await Promise.all([
|
||||
getOptimizationCountsByRepo(repositoryId),
|
||||
getOptimizationsTimeSeriesData(repositoryId, false),
|
||||
getOptimizationsTimeSeriesData(repositoryId, true),
|
||||
getPullRequestEventTimeSeriesData(selectedPrYear, repositoryId),
|
||||
getActiveUserLeaderboardLast30DaysForRepo(repositoryId),
|
||||
])
|
||||
|
||||
const totalAttempts = counts.total
|
||||
const successfulAttempts = counts.successful
|
||||
|
||||
if (Array.isArray(optimizationsOverTime) && optimizationsOverTime.length > 0) {
|
||||
setOptimizationsTrend(optimizationsOverTime.map(item => item?.count || 0))
|
||||
setOptimizationsTrendDates(optimizationsOverTime.map(item => item?.date || ""))
|
||||
} else {
|
||||
setOptimizationsTrend([])
|
||||
setOptimizationsTrendDates([])
|
||||
}
|
||||
|
||||
if (
|
||||
Array.isArray(successfulOptimizationsOverTime) &&
|
||||
successfulOptimizationsOverTime.length > 0
|
||||
) {
|
||||
setSuccessfulOptimizationsTrend(
|
||||
successfulOptimizationsOverTime.map(item => item?.count || 0),
|
||||
)
|
||||
setSuccessfulOptimizationsTrendDates(
|
||||
successfulOptimizationsOverTime.map(item => item?.date || ""),
|
||||
)
|
||||
} else {
|
||||
setSuccessfulOptimizationsTrend([])
|
||||
setSuccessfulOptimizationsTrendDates([])
|
||||
}
|
||||
|
||||
setPrActivityData(Array.isArray(prData) ? prData : [])
|
||||
setActiveUsersData(Array.isArray(leaderboardData) ? leaderboardData : [])
|
||||
setOptimizationStats({ totalAttempts, successfulAttempts })
|
||||
setRetryCount(0)
|
||||
} catch (err) {
|
||||
console.error(`Failed to fetch repository data (attempt ${attempt + 1}):`, err)
|
||||
|
||||
if (
|
||||
attempt < maxRetries &&
|
||||
err instanceof Error &&
|
||||
(err.message.includes("authentication") ||
|
||||
err.message.includes("User authentication data not found") ||
|
||||
err.message.includes("Unauthorized") ||
|
||||
err.message.includes("No valid session found"))
|
||||
) {
|
||||
setRetryCount(attempt + 1)
|
||||
return fetchRepositoryData(attempt + 1)
|
||||
}
|
||||
|
||||
setError(
|
||||
err instanceof Error && err.message === "Repository not found"
|
||||
? "Repository not found"
|
||||
: "Failed to load repository data. Please try again later.",
|
||||
)
|
||||
} finally {
|
||||
setLoading(false)
|
||||
}
|
||||
},
|
||||
[maxRetries, selectedPrYear, repositoryId, currentOrg, currentUserId],
|
||||
)
|
||||
|
||||
// Only refetch when org changes from what the server provided, or when prYear changes
|
||||
useEffect(() => {
|
||||
const currentOrgId = currentOrg?.id ?? null
|
||||
if (currentOrgId === initialOrgIdRef.current) return
|
||||
initialOrgIdRef.current = currentOrgId
|
||||
fetchRepositoryData()
|
||||
}, [currentOrg?.id, fetchRepositoryData])
|
||||
|
||||
// Refetch PR data when year changes
|
||||
const initialPrYearRef = useRef(initialStats.prYear)
|
||||
useEffect(() => {
|
||||
if (selectedPrYear === initialPrYearRef.current) return
|
||||
initialPrYearRef.current = selectedPrYear
|
||||
getPullRequestEventTimeSeriesData(selectedPrYear, repositoryId).then(prData => {
|
||||
setPrActivityData(Array.isArray(prData) ? prData : [])
|
||||
})
|
||||
}, [selectedPrYear, repositoryId])
|
||||
|
||||
const now = useMemo(() => new Date(), [])
|
||||
const last30DaysStart = subDays(now, 30)
|
||||
|
||||
const dateRangeDisplay = useMemo(() => {
|
||||
const startMonth = format(last30DaysStart, "MMMM")
|
||||
const endMonth = format(now, "MMMM")
|
||||
const startYear = format(last30DaysStart, "yyyy")
|
||||
const endYear = format(now, "yyyy")
|
||||
|
||||
if (startMonth === endMonth && startYear === endYear) {
|
||||
return `${startMonth} ${format(last30DaysStart, "d")}-${format(now, "d")}, ${startYear}`
|
||||
} else if (startYear === endYear) {
|
||||
return `${format(last30DaysStart, "MMMM d")} - ${format(now, "MMMM d")}, ${startYear}`
|
||||
} else {
|
||||
return `${format(last30DaysStart, "MMMM d, yyyy")} - ${format(now, "MMMM d, yyyy")}`
|
||||
}
|
||||
}, [last30DaysStart, now])
|
||||
|
||||
if (loading) {
|
||||
return <RepositoryDetailSkeleton showTabNavigation={!currentOrg} />
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return (
|
||||
<div className="flex justify-center items-center min-h-[70vh] p-4">
|
||||
<div className="bg-destructive/10 border border-destructive/20 text-destructive p-6 sm:p-8 rounded-2xl w-full max-w-md shadow-lg">
|
||||
<div className="inline-flex items-center justify-center w-12 h-12 rounded-full bg-destructive/20 mb-4">
|
||||
<AlertCircle size={24} />
|
||||
</div>
|
||||
<h3 className="text-base sm:text-lg font-semibold mb-2">Unable to Load Repository</h3>
|
||||
<p className="mb-4 text-sm sm:text-base opacity-90">{error}</p>
|
||||
{retryCount > 0 && (
|
||||
<p className="mb-4 text-xs opacity-75">
|
||||
Retry attempt: {retryCount}/{maxRetries}
|
||||
</p>
|
||||
)}
|
||||
<button
|
||||
onClick={() => fetchRepositoryData()}
|
||||
className="flex items-center gap-2 w-full justify-center px-4 py-2.5 bg-destructive hover:bg-destructive/90 text-destructive-foreground rounded-xl text-sm font-medium transition-all shadow-sm hover:shadow-md"
|
||||
>
|
||||
<RefreshCw size={16} /> Try Again
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
if (!repository) {
|
||||
return (
|
||||
<div className="flex justify-center items-center min-h-[70vh] p-4">
|
||||
<div className="text-center">
|
||||
<div className="inline-flex items-center justify-center w-16 h-16 rounded-full bg-accent mb-4">
|
||||
<GitPullRequest size={32} className="text-muted-foreground" />
|
||||
</div>
|
||||
<h3 className="text-lg font-semibold mb-2 text-foreground">Repository not found</h3>
|
||||
<p className="text-sm text-muted-foreground">
|
||||
The repository you're looking for doesn't exist or you don't have access
|
||||
to it.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="flex-1 bg-background">
|
||||
<div className="h-screen py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
|
||||
<RepositoryHeader repository={repository} />
|
||||
|
||||
{!currentOrg && <TabNavigation activeTab={activeTab} onTabChange={handleTabChange} />}
|
||||
|
||||
{currentOrg || activeTab === "statistics" ? (
|
||||
<StatisticsTab
|
||||
optimizationStats={optimizationStats}
|
||||
optimizationsTrend={optimizationsTrend}
|
||||
optimizationsTrendDates={optimizationsTrendDates}
|
||||
successfulOptimizationsTrend={successfulOptimizationsTrend}
|
||||
successfulOptimizationsTrendDates={successfulOptimizationsTrendDates}
|
||||
prActivityData={prActivityData}
|
||||
selectedPrYear={selectedPrYear}
|
||||
setSelectedPrYear={setSelectedPrYear}
|
||||
activeUsersData={activeUsersData}
|
||||
dateRangeDisplay={dateRangeDisplay}
|
||||
isMobile={isMobile}
|
||||
repositoryId={repositoryId}
|
||||
/>
|
||||
) : (
|
||||
<MembersTab repoId={repositoryId} currentUserId={currentUserId} />
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
|
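The `fetchRepositoryData` callback above retries only auth-related failures, sleeping `2^attempt` seconds between attempts. As an illustration only (the helper name, parameters, and error filter below are hypothetical, not part of this codebase), the same exponential-backoff pattern can be sketched as a standalone function:

```typescript
// Hypothetical helper sketching the retry pattern used in fetchRepositoryData:
// re-run `fn` only for errors `shouldRetry` accepts, doubling the delay each time.
async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  shouldRetry: (err: unknown) => boolean,
  maxRetries = 3,
  baseDelayMs = 1000,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      // Give up once retries are exhausted or the error is not retryable.
      if (attempt >= maxRetries || !shouldRetry(err)) throw err
      // Wait 2^(attempt+1) * base before the next try (2s, 4s, 8s with defaults).
      await new Promise(resolve => setTimeout(resolve, Math.pow(2, attempt + 1) * baseDelayMs))
    }
  }
}
```

Filtering on the error message (as the component does with "Unauthorized" and "No valid session found") keeps transient session races retryable while letting hard failures such as "Repository not found" surface immediately.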
@ -7,7 +7,6 @@ import { auth0 } from "@/lib/auth0"
import { AccountPayload, buildOptimizationOrCondition, prisma } from "@codeflash-ai/common"
import * as Sentry from "@sentry/nextjs"
import { trackOptimizationReviewed } from "@/lib/analytics/tracking"
import { cookies } from "next/headers"

export interface DiffContent {
  oldContent: string

@ -32,9 +31,7 @@ export interface GetStagingCodeParams {
  filePath?: string
}

export async function getStagingCodeFromApi(
  params: GetStagingCodeParams,
): Promise<ActionResponse<StagingCodeResponse>> {
export async function getStagingCodeFromApi(params: GetStagingCodeParams): Promise<ActionResponse<StagingCodeResponse>> {
  const cfapiUrl = process.env.CODEFLASH_CFAPI_URL
  const session = await auth0.getAccessToken()

@ -166,9 +163,7 @@ export async function getOptimizationEventById({
      prisma.optimization_events.findFirst({
        where,
        include: {
          repository: {
            select: { id: true, full_name: true, name: true, installation_id: true },
          },
          repository: true,
        },
      }),
      prisma.optimization_features.findUnique({

@ -213,8 +208,9 @@ export async function saveOptimizationChanges({
}) {
  try {
    const currentEvent = await prisma.optimization_events.findUnique({
      where: { id: eventId },
      select: { metadata: true },
      where: {
        id: eventId,
      },
    })

    if (!currentEvent) {

@ -468,69 +464,3 @@ export async function getCommentsByEvent(eventId: string) {
    }
  }
}

/**
 * Server-side function to fetch all data needed for the review page in parallel.
 * Called from the server component to eliminate the client-side data-fetching waterfall.
 */
export async function getReviewPageInitData(traceId: string) {
  const session = await auth0.getSession()
  if (!session?.user?.sub || !session?.user?.nickname) {
    return null
  }

  const userId = session.user.sub
  const username = session.user.nickname

  // Read org cookie to determine payload
  const cookieStore = await cookies()
  const orgId = cookieStore.get("currentOrganizationId")?.value

  const payload: AccountPayload = orgId ? { orgId } : { userId, username }

  // Fetch the optimization event
  const event = await getOptimizationEventById({ payload, trace_id: traceId })
  if (!event) {
    return { userId, username, event: null, comments: [], stagingCode: null }
  }

  // If git_branch storage, fetch staging code + comments in parallel
  const metadata = (event.metadata as any) || {}
  if (event.staging_storage_type === "git_branch") {
    const stagingBranchName = metadata.staging_branch_name
    const repository = event.repository

    if (stagingBranchName && repository?.full_name && repository?.installation_id) {
      const [stagingCodeResult, commentsResult] = await Promise.all([
        getStagingCodeFromApi({
          stagingBranchName,
          baseBranch: event.baseBranch || "main",
          fullRepoName: repository.full_name,
          installationId: repository.installation_id,
          functionName: event.function_name || undefined,
          filePath: event.file_path || undefined,
        }),
        getCommentsByEvent(event.id),
      ])

      return {
        userId,
        username,
        event,
        comments: commentsResult.success ? (commentsResult.comments ?? []) : [],
        stagingCode: stagingCodeResult.success ? (stagingCodeResult.data ?? null) : null,
      }
    }
  }

  // For plain_text storage, just fetch comments
  const commentsResult = await getCommentsByEvent(event.id)

  return {
    userId,
    username,
    event,
    comments: commentsResult.success ? (commentsResult.comments ?? []) : [],
    stagingCode: null,
  }
}
@ -1,39 +0,0 @@
import { Skeleton } from "@/components/ui/skeleton"

export default function OptimizationReviewLoading() {
  return (
    <div className="py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
      {/* Header */}
      <div className="mb-6">
        <div className="flex items-center gap-3 mb-4">
          <Skeleton className="h-8 w-8 rounded-md" />
          <Skeleton className="h-7 w-64" />
        </div>
        <div className="flex items-center gap-4">
          <Skeleton className="h-5 w-40" />
          <Skeleton className="h-6 w-20 rounded-full" />
          <Skeleton className="h-6 w-24 rounded-full" />
        </div>
      </div>

      {/* Stats cards */}
      <div className="grid grid-cols-2 md:grid-cols-4 gap-4 mb-6">
        {Array.from({ length: 4 }).map((_, i) => (
          <div key={i} className="bg-card rounded-xl border border-border p-4">
            <Skeleton className="h-4 w-24 mb-2" />
            <Skeleton className="h-8 w-16" />
          </div>
        ))}
      </div>

      {/* Code diff area */}
      <div className="bg-card rounded-xl border border-border p-6">
        <div className="flex items-center gap-3 mb-4">
          <Skeleton className="h-5 w-32" />
          <Skeleton className="h-8 w-24 rounded-md" />
        </div>
        <Skeleton className="h-[400px] w-full rounded-md" />
      </div>
    </div>
  )
}
@ -1,34 +1,974 @@
|
|||
import { notFound } from "next/navigation"
|
||||
import { getReviewPageInitData } from "./action"
|
||||
-import { OptimizationReviewClient } from "./review-client"
+"use client"
+
-interface ReviewPageProps {
-  params: Promise<{ traceId: string }>
+import { useEffect, useState, useCallback, useRef } from "react"
+import { useParams, useRouter } from "next/navigation"
+import Image from "next/image"
+import {
+  Zap,
+  CheckCircle,
+  XCircle,
+  MessageSquare,
+  Loader2,
+  GitCommit,
+  BarChart3,
+} from "lucide-react"
+import {
+  createPullRequest,
+  getOptimizationEventById,
+  saveOptimizationChanges,
+  setApprovalStatus,
+  addComment,
+  getCommentsByEvent,
+  getStagingCodeFromApi,
+  commitStagingCode,
+} from "./action"
+import { getUserIdAndUsername } from "@/app/utils/auth"
+import dynamic from "next/dynamic"
+
+const MonacoDiffEditorGithub = dynamic(
+  () => import("@/components/Editor/monaco-diff-editor-github"),
+  { ssr: false },
+)
+import { toast } from "sonner"
+import { MarkdownEditor } from "@/components/markdwon/markdown-editor"
+import { MarkdownViewer } from "@/components/markdwon/markdown-viewer"
+import { BaseBranchDialog } from "@/components/ui/base-branch-dialog"
+import { useViewMode } from "@/app/app/ViewModeContext"
+
+// Interfaces
+interface Comment {
+  id: string
+  optimization_event_id: string
+  author_user_id: string
+  content: string
+  created_at: Date
+  author?: {
+    user_id: string
+    email: string
+    name?: string
+    github_username?: string
+  }
+}
+
-export default async function OptimizationReviewPage({ params }: ReviewPageProps) {
-  const { traceId } = await params
+interface TestResults {
+  passed: number
+  failed: number
+}
+
-  const initData = await getReviewPageInitData(traceId)
+interface ReportTable {
+  [key: string]: TestResults
+}
+
-  // No session — auth middleware will redirect
-  if (!initData) {
-    return null
+interface PRCommentFields {
+  original_runtime?: string
+  best_runtime?: string
+  loop_count?: number
+  optimization_explanation?: string
+  report_table?: ReportTable
+}
+
+interface EventMetadata {
+  diffContents?: Record<string, DiffContent>
+  prCommentFields?: PRCommentFields
+  generatedTests?: string
+  existingTests?: string
+  lastModified?: string
+  coverage_message?: string
+  staging_storage_type?: "plain_text" | "git_branch"
+  staging_branch_name?: string
+  originalLineProfiler?: string
+  optimizedLineProfiler?: string
+}
+
+interface Repository {
+  id: string
+  name: string
+  owner: string
+  full_name: string
+  installation_id?: number
+}
+
+interface OptimizationEvent {
+  id: string
+  event_type: string
+  user_id: string | null
+  repository_id: string | null
+  trace_id: string
+  pr_id: string | null
+  pr_url: string | null
+  api_key_id: number | null
+  metadata: EventMetadata
+  is_optimization_found: boolean | null
+  current_username: string | null
+  function_name?: string | null
+  file_path?: string | null
+  speedup_x?: number | null
+  speedup_pct?: number | null
+  created_at: Date
+  baseBranch?: string | null
+  repository?: Repository | null
+  status?: "approved" | "rejected" | null
+  review_quality?: string | null
+  review_explanation?: string | null
+  staging_storage_type?: "plain_text" | "git_branch" | null
+}
+
+interface RawOptimizationEvent {
+  id: string
+  event_type: string
+  user_id: string | null
+  repository_id: string | null
+  trace_id: string
+  pr_id: string | null
+  pr_url: string | null
+  api_key_id: number | null
+  metadata: unknown
+  is_optimization_found: boolean | null
+  current_username: string | null
+  function_name?: string | null
+  file_path?: string | null
+  speedup_x?: number | null
+  speedup_pct?: number | null
+  created_at: Date
+  baseBranch?: string | null
+  repository?: Repository | null
+  status?: "approved" | "rejected" | null
+  review_quality?: string | null
+  review_explanation?: string | null
+  staging_storage_type?: "plain_text" | "git_branch" | null
+}
+
+interface DiffContent {
+  oldContent: string
+  newContent: string
+}
+
+interface SaveOptimizationResult {
+  success: boolean
+  error?: string
+  event?: RawOptimizationEvent
+}
+
+export default function OptimizationReviewPage() {
+  const params = useParams()
+  const router = useRouter()
+  const [event, setEvent] = useState<OptimizationEvent | null>(null)
+  const [loading, setLoading] = useState(true)
+  const [creatingPR, setCreatingPR] = useState(false)
+  const [userId, setUserId] = useState<string>("")
+  const [isUpdatingStatus, setIsUpdatingStatus] = useState(false)
+  const [isCommitting, setIsCommitting] = useState(false)
+  const [hasUnsavedChanges, setHasUnsavedChanges] = useState(false)
+  const saveQueueRef = useRef<Map<string, NodeJS.Timeout>>(new Map())
+  const isLoadingRef = useRef(false)
+  const pendingChangesRef = useRef<Record<string, string>>({})
+
+  // State for comments
+  const [comments, setComments] = useState<Comment[]>([])
+  const [newComment, setNewComment] = useState("")
+  const [isSubmittingComment, setIsSubmittingComment] = useState(false)
+  const [loadingComments, setLoadingComments] = useState(false)
+  const [showCommentsSection, setShowCommentsSection] = useState(false)
+  const { currentOrg } = useViewMode()
+
+  // State for base branch dialog
+  const [showBaseBranchDialog, setShowBaseBranchDialog] = useState(false)
+
+  const currentOrgId = currentOrg?.id
+
+  useEffect(() => {
+    // Prevent concurrent calls
+    if (isLoadingRef.current) {
+      return
+    }
+
+    async function loadEvent() {
+      isLoadingRef.current = true
+      try {
+        const userSession = (await getUserIdAndUsername()) ?? ""
+        setUserId(userSession.userId)
+
+        const data = await getOptimizationEventById({
+          payload: currentOrgId
+            ? { orgId: currentOrgId }
+            : { userId: userSession.userId, username: userSession.username },
+          trace_id: params.traceId as string,
+        })
+        if (data) {
+          const rawData = data as unknown as RawOptimizationEvent
+          let metadata = rawData.metadata as EventMetadata
+
+          // If staging_storage_type is git_branch, fetch code from cf-api IN PARALLEL with comments
+          if (rawData.staging_storage_type === "git_branch") {
+            // Extract staging info from metadata and repository
+            const eventMetadata = rawData.metadata as EventMetadata
+            const stagingBranchName = eventMetadata?.staging_branch_name
+            const repository = rawData.repository
+
+            if (!stagingBranchName || !repository?.full_name || !repository?.installation_id) {
+              console.error("Missing staging info:", { stagingBranchName, repository })
+              toast.error("Missing staging branch information")
+              setEvent(null)
+              return
+            }
+
+            // Start both requests in parallel for better performance
+            const [stagingCodeResult] = await Promise.all([
+              getStagingCodeFromApi({
+                stagingBranchName,
+                baseBranch: rawData.baseBranch || "main",
+                fullRepoName: repository.full_name,
+                installationId: repository.installation_id,
+                functionName: rawData.function_name || undefined,
+                filePath: rawData.file_path || undefined,
+              }),
+              loadComments(data.id),
+            ])
+
+            if (stagingCodeResult.success && stagingCodeResult.data) {
+              const diffContentsResult = stagingCodeResult.data.diffContents
+              const isDiffEmpty =
+                !diffContentsResult || Object.keys(diffContentsResult).length === 0
+
+              if (!isDiffEmpty) {
+                metadata = {
+                  ...metadata,
+                  diffContents: diffContentsResult,
+                  staging_storage_type: "git_branch",
+                  staging_branch_name: stagingCodeResult.data.stagingBranchName,
+                }
+              }
+              // If diff is empty, we just proceed without setting diffContents
+              // The editor will handle showing "no changes" state
+            } else {
+              console.error("Failed to fetch staging code:", stagingCodeResult.error)
+              toast.error(stagingCodeResult.error || "Failed to fetch staging code from repository")
+            }
+
+            const transformedData: OptimizationEvent = {
+              ...rawData,
+              metadata,
+              function_name: rawData.function_name || null,
+              file_path: rawData.file_path || null,
+              speedup_x: rawData.speedup_x || null,
+              speedup_pct: rawData.speedup_pct || null,
+              baseBranch: rawData.baseBranch || undefined,
+              repository: rawData.repository || null,
+              status: rawData.status || null,
+              review_quality: rawData.review_quality || null,
+              review_explanation: rawData.review_explanation || null,
+              staging_storage_type: rawData.staging_storage_type || null,
+            }
+            setEvent(transformedData)
+          } else {
+            // For plain_text storage, load comments after setting event
+            const transformedData: OptimizationEvent = {
+              ...rawData,
+              metadata,
+              function_name: rawData.function_name || null,
+              file_path: rawData.file_path || null,
+              speedup_x: rawData.speedup_x || null,
+              speedup_pct: rawData.speedup_pct || null,
+              baseBranch: rawData.baseBranch || undefined,
+              repository: rawData.repository || null,
+              status: rawData.status || null,
+              review_quality: rawData.review_quality || null,
+              review_explanation: rawData.review_explanation || null,
+              staging_storage_type: rawData.staging_storage_type || null,
+            }
+            setEvent(transformedData)
+
+            // Load comments
+            await loadComments(data.id)
+          }
+        } else {
+          setEvent(null)
+        }
+      } catch (error) {
+        console.error("Failed to load optimization event:", error)
+        toast.error("Failed to load optimization event")
+      } finally {
+        setLoading(false)
+        isLoadingRef.current = false
+      }
+    }
+    loadEvent()
+  }, [params.traceId, currentOrgId])
+
+  const loadComments = async (eventId: string) => {
+    setLoadingComments(true)
+    try {
+      const result = await getCommentsByEvent(eventId)
+      if (result.success && result.comments) {
+        setComments(result.comments as Comment[])
+      }
+    } catch (error) {
+      console.error("Failed to load comments:", error)
+    } finally {
+      setLoadingComments(false)
+    }
+  }
+
-  // Event not found
-  if (!initData.event) {
-    notFound()
+  // Cleanup save queue on unmount
+  useEffect(() => {
+    const saveQueue = saveQueueRef.current
+    return () => {
+      saveQueue.forEach(timeout => clearTimeout(timeout))
+      saveQueue.clear()
+    }
+  }, [])
+
+  const handleContentChange = (filePath: string, newContent: string) => {
+    if (event && event.metadata.diffContents) {
+      const updatedEvent = {
+        ...event,
+        metadata: {
+          ...event.metadata,
+          diffContents: {
+            ...event.metadata.diffContents,
+            [filePath]: {
+              ...event.metadata.diffContents[filePath],
+              newContent: newContent,
+            },
+          },
+        },
+      }
+      setEvent(updatedEvent)
+
+      // For git_branch storage, track pending changes for manual commit
+      if (event.staging_storage_type === "git_branch") {
+        pendingChangesRef.current[filePath] = newContent
+        setHasUnsavedChanges(true)
+      }
+    }
+  }
+
+  // Handle committing changes to git branch
+  const handleCommitChanges = async () => {
+    if (!event || !hasUnsavedChanges || Object.keys(pendingChangesRef.current).length === 0) {
+      return
+    }
+
+    setIsCommitting(true)
+    try {
+      const result = await commitStagingCode(
+        event.trace_id,
+        pendingChangesRef.current,
+        `Update optimized code for ${event.function_name || "function"}`,
+      )
+
+      if (result.success) {
+        toast.success("Changes committed successfully!", {
+          description: `Commit SHA: ${result.data?.commitSha?.substring(0, 7)}`,
+        })
+        pendingChangesRef.current = {}
+        setHasUnsavedChanges(false)
+      } else {
+        toast.error(result.error || "Failed to commit changes")
+      }
+    } catch (error) {
+      console.error("Error committing changes:", error)
+      toast.error("Failed to commit changes")
+    } finally {
+      setIsCommitting(false)
+    }
+  }
+
+  // Handle autosave edits with database persistence (only for plain_text storage)
+  const handleEdit = useCallback(
+    async (filePath: string, newContent: string) => {
+      if (!event || !userId) return
+
+      // Skip autosave for git_branch storage - use manual commit instead
+      if (event.staging_storage_type === "git_branch") {
+        return
+      }
+
+      try {
+        const existingTimeout = saveQueueRef.current.get(filePath)
+        if (existingTimeout) {
+          clearTimeout(existingTimeout)
+        }
+
+        const timeoutId = setTimeout(async () => {
+          try {
+            const result = (await saveOptimizationChanges({
+              userId,
+              eventId: event.id,
+              filePath,
+              newContent,
+            })) as SaveOptimizationResult
+
+            if (result.success) {
+              console.log(`Successfully saved ${filePath} to database`)
+              if (result.event) {
+                const transformedData: OptimizationEvent = {
+                  ...result.event,
+                  metadata: result.event.metadata as EventMetadata,
+                  function_name: result.event.function_name || null,
+                  file_path: result.event.file_path || null,
+                  speedup_x: result.event.speedup_x || null,
+                  speedup_pct: result.event.speedup_pct || null,
+                  created_at: result.event.created_at,
+                  status: result.event.status || null,
+                  repository: result.event.repository || event.repository || null,
+                }
+                setEvent(transformedData)
+              }
+            } else {
+              console.error(`Failed to save ${filePath}:`, result.error)
+              toast.error(`Failed to save changes: ${result.error}`)
+            }
+          } catch (error) {
+            console.error(`Error saving ${filePath}:`, error)
+            toast.error("Failed to save changes")
+          } finally {
+            saveQueueRef.current.delete(filePath)
+          }
+        }, 100)
+
+        saveQueueRef.current.set(filePath, timeoutId)
+      } catch (error) {
+        console.error("Error in handleEdit:", error)
+      }
+    },
+    [event, userId],
+  )
+
+  const handleSubmitReview = async (status: "approved" | "rejected") => {
+    if (!event || !userId) return
+
+    setIsUpdatingStatus(true)
+
+    try {
+      const result = await setApprovalStatus(event.id, status)
+
+      if (result.success) {
+        setEvent(prev => (prev ? { ...prev, status } : null))
+      } else {
+        throw new Error(result.error || `Failed to ${status} optimization`)
+      }
+    } catch {
+      toast.error("Failed to submit review")
+    } finally {
+      setIsUpdatingStatus(false)
+    }
+  }
+
+  const handleAddComment = async () => {
+    if (!event || !userId || !newComment.trim()) return
+
+    setIsSubmittingComment(true)
+    try {
+      const commentResult = await addComment({
+        eventId: event.id,
+        userId,
+        content: newComment.trim(),
+      })
+
+      if (!commentResult.success) {
+        throw new Error(commentResult.error || "Failed to add comment")
+      }
+
+      // Reload comments and clear input
+      await loadComments(event.id)
+      setNewComment("")
+    } catch {
+      toast.error("Failed to add comment")
+    } finally {
+      setIsSubmittingComment(false)
+    }
+  }
+
+  const handleOpenBaseBranchDialog = () => {
+    setShowBaseBranchDialog(true)
+  }
+
+  const handleBaseBranchConfirm = async (branchName: string) => {
+    setShowBaseBranchDialog(false)
+
+    // Update the event with the new base branch
+    if (event) {
+      setEvent(prev => (prev ? { ...prev, baseBranch: branchName } : null))
+    }
+
+    // Small delay to ensure state is updated
+    setTimeout(() => {
+      handleCreatePR(branchName)
+    }, 100)
+  }
+
+  const handleCreatePR = async (customBaseBranch?: string) => {
+    if (!event || !event.trace_id || !event.metadata.diffContents) {
+      toast.error("Missing required data to create PR")
+      return
+    }
+
+    setCreatingPR(true)
+    try {
+      const speedupX = event.speedup_x ? `${event.speedup_x.toFixed(2)}x` : "N/A"
+      const speedupPct = event.speedup_pct ? `${event.speedup_pct.toLocaleString()}%` : "N/A"
+
+      const result = await createPullRequest({
+        traceId: event.trace_id,
+        diffContents: event.metadata.diffContents,
+        prCommentFields: event.metadata.prCommentFields,
+        generatedTests: event.metadata.generatedTests,
+        existingTests: event.metadata.existingTests,
+        functionName: event.function_name || undefined,
+        filePath: event.file_path || undefined,
+        speedupX: speedupX,
+        speedupPct: speedupPct,
+        baseBranch: customBaseBranch || event.baseBranch || undefined,
+        full_repo_name: event.repository?.full_name,
+        coverage_message: event.metadata.coverage_message,
+        originalLineProfiler: event.metadata.originalLineProfiler,
+        optimizedLineProfiler: event.metadata.optimizedLineProfiler,
+      })

+      console.log("[handleCreatePR] Result from createPullRequest:", {
+        success: result.success,
+        data: result.data,
+        dataType: typeof result.data,
+        error: result.error,
+      })
+
+      if (!result.success) {
+        console.error("[handleCreatePR] Failed to create PR:", result.error)
+        toast.error(result.error || "Failed to create pull request", {
+          duration: 5000,
+        })
+        return
+      }
+
+      // Handle pending approval response (status 202)
+      if (typeof result.data === "object" && result.data !== null) {
+        const dataObj = result.data as { status?: string; message?: string }
+        if (dataObj.status === "pending_approval") {
+          console.log("[handleCreatePR] Pending approval response:", dataObj)
+          toast.info(dataObj.message || "This optimization requires approval", {
+            duration: 5000,
+          })
+          return
+        }
+
+        // If it's an object but not pending approval, something is wrong
+        console.error("[handleCreatePR] Unexpected object response:", dataObj)
+        toast.error("Failed to create pull request: Server returned unexpected response", {
+          duration: 5000,
+        })
+        return
+      }
+
+      // Extract PR number - should be a number or string
+      let prNumber: string | null = null
+      if (typeof result.data === "number") {
+        prNumber = String(result.data)
+      } else if (typeof result.data === "string") {
+        prNumber = result.data
+      } else {
+        console.error(
+          "[handleCreatePR] Invalid data type. Expected number or string, got:",
+          typeof result.data,
+          result.data,
+        )
+        toast.error("Failed to create pull request: Invalid response from server", {
+          duration: 5000,
+        })
+        return
+      }
+
+      console.log("[handleCreatePR] Successfully extracted PR number:", prNumber)
+
+      let constructedUrl = ""
+      if (prNumber && event.repository?.full_name)
+        constructedUrl = `https://github.com/${event.repository.full_name}/pull/${prNumber}`
+
+      // Update the event state with the new PR number
+      if (prNumber) {
+        setEvent(prev => (prev ? { ...prev, pr_url: constructedUrl } : null))
+      }
+
+      // Show success toast with custom duration and description
+      toast.success("Pull request created successfully!", {
+        description: `PR #${prNumber || "new"} has been created. Opening GitHub...`,
+        duration: 5000,
+        action: {
+          label: "Open PR",
+          onClick: () => {
+            if (constructedUrl) {
+              window.open(constructedUrl, "_blank")
+            }
+          },
+        },
+      })
+
+      // Delay opening the window to ensure toast is visible
+      setTimeout(() => {
+        if (constructedUrl) {
+          window.open(constructedUrl, "_blank")
+        }
+      }, 1000)
+    } catch (error: unknown) {
+      console.error("[handleCreatePR] Exception:", error)
+      const errorMessage = error instanceof Error ? error.message : "Failed to create pull request"
+      toast.error(errorMessage, {
+        duration: 5000,
+      })
+    } finally {
+      setCreatingPR(false)
+    }
+  }
+
+  const handleViewPR = () => {
+    if (!event?.pr_url) return
+    window.open(event.pr_url, "_blank")
+  }
+
+  const handleViewProfiler = () => {
+    router.push(`/review-optimizations/${params.traceId}/profiler`)
+  }
+
+  const formatTimeAgo = (date: Date) => {
+    const seconds = Math.floor((new Date().getTime() - new Date(date).getTime()) / 1000)
+
+    if (seconds < 60) return "just now"
+    if (seconds < 3600) return `${Math.floor(seconds / 60)}m ago`
+    if (seconds < 86400) return `${Math.floor(seconds / 3600)}h ago`
+    if (seconds < 2592000) return `${Math.floor(seconds / 86400)}d ago`
+    return `${Math.floor(seconds / 2592000)}mo ago`
+  }
+
+  if (loading) {
+    return (
+      <div className="min-h-screen flex items-center justify-center">
+        <div className="animate-spin rounded-full h-12 w-12 border-b-2 border-primary"></div>
+      </div>
+    )
+  }
+
+  if (!event) {
+    return (
+      <div className="min-h-screen flex items-center justify-center p-4">
+        <div className="text-center">
+          <h2 className="text-2xl font-semibold mb-2">Event not found</h2>
+          <p className="text-muted-foreground">
+            The optimization event you're looking for doesn't exist.
+          </p>
+        </div>
+      </div>
+    )
+  }
+
+  const metadata = event.metadata || {}
+  const diffContents = metadata.diffContents || {}
+  const prCommentFields = metadata.prCommentFields || {}
+
+  // Check if we have empty diffContents for git_branch storage type (merged PR in privacy mode)
+  const isPrivacyModeWithNoDiff =
+    event.staging_storage_type === "git_branch" && Object.keys(diffContents).length === 0
+
+  return (
-    <OptimizationReviewClient
-      traceId={traceId}
-      initialUserId={initData.userId}
-      initialUsername={initData.username}
-      initialEvent={initData.event as any}
-      initialComments={initData.comments as any}
-      initialStagingCode={initData.stagingCode as any}
-    />
+    <div className="min-h-screen bg-background">
+      <div className="mx-auto">
+        {/* Header */}
+        <div className="px-4 py-2 bg-muted/30 border-b border-border">
+          <div className="flex items-center justify-between">
+            <div className="flex items-center gap-3">
+              <Zap className="w-6 h-6 text-primary" />
+              <h1 className="text-xl font-semibold">
+                {event.function_name ? (
+                  <>
+                    Code Optimization -{" "}
+                    <code className="font-mono text-primary">{event.function_name}()</code>
+                  </>
+                ) : (
+                  "Code Optimization"
+                )}
+              </h1>
+              {event.speedup_x && (
+                <span className="flex items-center gap-2 rounded-md bg-gradient-to-r from-primary to-yellow-500 px-3 py-1 text-xs font-bold text-gray-900">
+                  <svg className="h-3.5 w-3.5" fill="currentColor" viewBox="0 0 24 24">
+                    <path d="M13 10V3L4 14h7v7l9-11h-7z" />
+                  </svg>
+                  {event.speedup_x.toFixed(2)}x faster
+                </span>
+              )}
+            </div>
+
+            <div className="flex items-center gap-2">
+              {/* Performance Profile Button - Only show if profiler data exists */}
+              {(metadata.originalLineProfiler || metadata.optimizedLineProfiler) && (
+                <button
+                  onClick={handleViewProfiler}
+                  className="flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
+                    bg-purple-100 text-purple-700 hover:bg-purple-200
+                    dark:bg-purple-900/30 dark:text-purple-300 dark:hover:bg-purple-900/50
+                    transition-all duration-200"
+                  title="View line-by-line performance profile"
+                >
+                  <BarChart3 className="w-4 h-4" />
+                  <span>Performance Profile</span>
+                </button>
+              )}
+
+              {/* Comments Toggle Button with Count */}
+              <button
+                onClick={() => setShowCommentsSection(!showCommentsSection)}
+                className={`
+                  relative p-1.5 rounded-md transition-all duration-200 flex items-center gap-1
+                  ${showCommentsSection ? "bg-primary/10 text-foreground" : "hover:bg-muted text-foreground"}
+                `}
+                title={showCommentsSection ? "Hide comments panel" : "Show comments panel"}
+              >
+                <MessageSquare
+                  className={`
+                    w-4 h-4 transition-colors
+                    ${showCommentsSection ? "text-primary" : "text-muted-foreground"}
+                  `}
+                />
+                {comments.length > 0 && (
+                  <span
+                    className={`
+                      absolute -top-1 -right-1 min-w-[16px] h-4 flex items-center justify-center
+                      px-1 text-[10px] font-bold rounded-full transition-colors
+                      ${
+                        showCommentsSection
+                          ? "bg-primary text-primary-foreground"
+                          : "bg-muted-foreground text-background"
+                      }
+                    `}
+                  >
+                    {comments.length}
+                  </span>
+                )}
+              </button>
+
+              {/* Commit Button - Only for git_branch storage */}
+              {event.staging_storage_type === "git_branch" && (
+                <button
+                  onClick={handleCommitChanges}
+                  disabled={isCommitting || !hasUnsavedChanges}
+                  className={`
+                    flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
+                    transition-all duration-200
+                    ${
+                      isCommitting
+                        ? "bg-muted text-muted-foreground cursor-not-allowed opacity-50"
+                        : hasUnsavedChanges
+                          ? "bg-blue-600 text-white hover:bg-blue-700"
+                          : "bg-muted text-muted-foreground cursor-not-allowed"
+                    }
+                  `}
+                  title={
+                    hasUnsavedChanges ? "Commit changes to staging branch" : "No changes to commit"
+                  }
+                >
+                  {isCommitting ? (
+                    <Loader2 className="w-4 h-4 animate-spin" />
+                  ) : (
+                    <GitCommit className="w-4 h-4" />
+                  )}
+                  <span>{hasUnsavedChanges ? "Commit" : "Committed"}</span>
+                </button>
+              )}
+
+              {/* Approve Button */}
+              <button
+                onClick={() => handleSubmitReview("approved")}
+                disabled={isUpdatingStatus || event.status === "approved"}
+                className={`
+                  flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
+                  transition-all duration-200
+                  ${
+                    event.status === "approved"
+                      ? "bg-green-600 text-white cursor-default"
+                      : isUpdatingStatus
+                        ? "bg-muted text-muted-foreground cursor-not-allowed"
+                        : "bg-muted hover:bg-green-600 hover:text-white text-foreground"
+                  }
+                  ${isUpdatingStatus ? "opacity-50" : ""}
+                `}
+              >
+                {isUpdatingStatus ? (
+                  <Loader2 className="w-4 h-4 animate-spin" />
+                ) : (
+                  <CheckCircle className="w-4 h-4" />
+                )}
+                <span>Approve</span>
+              </button>
+
+              {/* Reject Button */}
+              <button
+                onClick={() => handleSubmitReview("rejected")}
+                disabled={isUpdatingStatus || event.status === "rejected"}
+                className={`
+                  flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
+                  transition-all duration-200
+                  ${
+                    event.status === "rejected"
+                      ? "bg-red-600 text-white cursor-default"
+                      : isUpdatingStatus
+                        ? "bg-muted text-muted-foreground cursor-not-allowed"
+                        : "bg-muted hover:bg-red-600 hover:text-white text-foreground"
+                  }
+                  ${isUpdatingStatus ? "opacity-50" : ""}
+                `}
+              >
+                {isUpdatingStatus ? (
+                  <Loader2 className="w-4 h-4 animate-spin" />
+                ) : (
+                  <XCircle className="w-4 h-4" />
+                )}
+                <span>Reject</span>
+              </button>
+            </div>
+          </div>
+        </div>
+
+        {/* Main Content */}
+        <div className="flex h-[calc(100vh-60px)] w-full overflow-hidden">
+          {/* Editor Section */}
+          <div className="flex-1">
+            <MonacoDiffEditorGithub
+              diffContents={diffContents}
+              onContentChange={handleContentChange}
+              onEdit={handleEdit}
+              optimizationInfo={{
+                speedup_x: event.speedup_x || undefined,
+                speedup_pct: event.speedup_pct || undefined,
+                prCommentFields: prCommentFields,
+                generatedTests: metadata.generatedTests,
+                coverage_message: metadata.coverage_message,
+                review_explanation: event.review_explanation,
+                review_quality: event.review_quality,
+              }}
+              functionName={event.function_name || undefined}
+              filePath={event.file_path || undefined}
+              onCreatePR={
+                event.repository_id && !event.pr_url ? handleOpenBaseBranchDialog : undefined
+              }
+              onViewPR={event.pr_url ? handleViewPR : undefined}
+              prNumber={event.pr_url ? event.pr_url.split("/").pop() : undefined}
+              repositoryFullName={event.repository?.full_name || undefined}
+              isCreatingPR={creatingPR}
+              showGitDiffDownload={!isPrivacyModeWithNoDiff}
+              disableAutoSave={event.staging_storage_type === "git_branch"}
+              isPrivacyModeNoDiff={isPrivacyModeWithNoDiff}
+            />
+          </div>
+
+          {/* Comments Sidebar */}
+          <div
+            className={`bg-muted/30 border-l border-border flex flex-col transition-all duration-300 ${
+              showCommentsSection ? "w-96" : "w-0"
+            } overflow-hidden`}
+          >
+            <div
+              className={`h-full flex flex-col transition-opacity duration-300 ${showCommentsSection ? "opacity-100" : "opacity-0"}`}
+            >
+              {/* Comments Header */}
+              <div className="p-3 border-b border-border">
+                <h3 className="text-sm font-medium text-foreground flex items-center gap-2">
+                  <MessageSquare className="w-4 h-4 text-primary" />
+                  Comments
+                  {comments.length > 0 && (
+                    <span className="ml-auto px-1.5 py-0.5 text-xs bg-primary/20 rounded-full text-foreground">
+                      {comments.length}
+                    </span>
+                  )}
+                </h3>
+              </div>
+
+              {/* Comments List */}
+              <div className="flex-1 overflow-y-auto">
+                {loadingComments ? (
+                  <div className="flex items-center justify-center py-8">
+                    <div className="animate-spin rounded-full h-8 w-8 border-b-2 border-primary"></div>
+                  </div>
+                ) : comments.length === 0 ? (
+                  <div className="text-center py-8 px-4">
+                    <MessageSquare className="w-12 h-12 mx-auto text-muted-foreground/50 mb-3" />
+                    <p className="text-muted-foreground text-sm">No comments yet</p>
+                  </div>
+                ) : (
+                  <div className="divide-y divide-border">
+                    {comments.map(comment => (
+                      <div key={comment.id} className="p-4 hover:bg-accent/50 transition-colors">
+                        <div className="flex items-start gap-3">
+                          <Image
+                            src={
+                              comment.author?.github_username
+                                ? `https://github.com/${comment.author.github_username}.png`
+                                : `https://ui-avatars.com/api/?name=${encodeURIComponent(
+                                    comment.author?.name || comment.author?.email || "U",
+                                  )}&background=d08e0d&color=fff`
+                            }
+                            alt={comment.author?.name || "User"}
+                            width={32}
+                            height={32}
+                            className="w-8 h-8 rounded-full"
+                          />
+                          <div className="flex-1 min-w-0">
+                            <div className="flex items-center gap-2 mb-1">
+                              <span className="font-medium text-sm text-foreground">
+                                {comment.author?.name ||
+                                  comment.author?.email?.split("@")[0] ||
+                                  "Unknown"}
+                              </span>
+                              <span className="text-xs text-muted-foreground">
+                                {formatTimeAgo(comment.created_at)}
+                              </span>
+                            </div>
+                            <MarkdownViewer content={comment.content} />
+                          </div>
+                        </div>
+                      </div>
+                    ))}
+                  </div>
+                )}
+              </div>
+
+              {/* Comment Input with Custom Markdown Editor */}
+              <div className="border-t border-border p-4 bg-background">
+                <div className="mb-3">
+                  <MarkdownEditor
+                    value={newComment}
+                    onChange={setNewComment}
+                    placeholder="Add a comment... (supports Markdown)"
+                    disabled={isSubmittingComment}
+                    height={150}
+                  />
+                </div>
+
+                {/* Submit Button */}
+                <button
+                  onClick={handleAddComment}
+                  disabled={!newComment.trim() || isSubmittingComment}
+                  className="w-full px-4 py-2 text-sm font-medium text-primary-foreground bg-primary hover:bg-primary/90 rounded-md transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
+                >
+                  {isSubmittingComment ? (
+                    <>
+                      <Loader2 className="w-4 h-4 animate-spin inline mr-2" />
+                      Commenting...
+                    </>
+                  ) : (
+                    "Comment"
+                  )}
+                </button>
+              </div>
+            </div>
+          </div>
+        </div>
+      </div>
+
+      <BaseBranchDialog
+        isOpen={showBaseBranchDialog}
+        onClose={() => setShowBaseBranchDialog(false)}
+        onConfirm={handleBaseBranchConfirm}
+        initialBranch={event.baseBranch || "main"}
+        isCreatingPR={creatingPR}
+      />
+    </div>
+  )
+}
|
|||
|
|
@@ -1,21 +0,0 @@
import { Skeleton } from "@/components/ui/skeleton"

export default function ProfilerLoading() {
  return (
    <div className="py-6 sm:py-8 px-4 sm:px-6 max-w-[1400px] mx-auto">
      {/* Header */}
      <div className="mb-6">
        <div className="flex items-center gap-3 mb-4">
          <Skeleton className="h-8 w-8 rounded-md" />
          <Skeleton className="h-7 w-48" />
        </div>
        <Skeleton className="h-5 w-64" />
      </div>

      {/* Profiler content */}
      <div className="bg-card rounded-xl border border-border p-6">
        <Skeleton className="h-[500px] w-full rounded-md" />
      </div>
    </div>
  )
}
@@ -1,32 +1,249 @@
import { notFound } from "next/navigation"
import { getReviewPageInitData } from "../action"
import { ProfilerClient } from "./profiler-client"
"use client"

interface ProfilerPageProps {
  params: Promise<{ traceId: string }>
import React, { useEffect, useState } from "react"
import { useParams, useRouter } from "next/navigation"
import { ArrowLeft, Zap, Loader2, AlertTriangle } from "lucide-react"
import { getOptimizationEventById } from "../action"
import { getUserIdAndUsername } from "@/app/utils/auth"
import dynamic from "next/dynamic"
import { Skeleton } from "@/components/ui/skeleton"

const LineProfilerView = dynamic(
  () => import("@/components/LineProfiler").then(mod => mod.LineProfilerView),
  {
    ssr: false,
    loading: () => <Skeleton className="h-full w-full" />,
  },
)
import { useViewMode } from "@/app/app/ViewModeContext"
import { toast } from "sonner"

// Error boundary to gracefully handle parsing failures or rendering issues
interface ErrorBoundaryState {
  hasError: boolean
  error?: Error
}

export default async function LineProfilerPage({ params }: ProfilerPageProps) {
  const { traceId } = await params

  const initData = await getReviewPageInitData(traceId)

  if (!initData || !initData.event) {
    notFound()
class ProfilerErrorBoundary extends React.Component<
  { children: React.ReactNode; onRetry?: () => void },
  ErrorBoundaryState
> {
  constructor(props: { children: React.ReactNode; onRetry?: () => void }) {
    super(props)
    this.state = { hasError: false }
  }

  const metadata = (initData.event.metadata as any) || {}
  static getDerivedStateFromError(error: Error): ErrorBoundaryState {
    return { hasError: true, error }
  }

  componentDidCatch(error: Error, errorInfo: React.ErrorInfo) {
    console.error("ProfilerErrorBoundary caught an error:", error, errorInfo)
  }

  render() {
    if (this.state.hasError) {
      return (
        <div className="flex-1 flex items-center justify-center p-8">
          <div className="text-center">
            <AlertTriangle className="w-12 h-12 mx-auto mb-4 text-destructive" />
            <h2 className="text-xl font-semibold mb-2">Failed to load profiler data</h2>
            <p className="text-muted-foreground mb-4">
              There was an error parsing or rendering the profiler data.
            </p>
            {this.props.onRetry && (
              <button
                onClick={() => {
                  this.setState({ hasError: false, error: undefined })
                  this.props.onRetry?.()
                }}
                className="px-4 py-2 bg-primary text-primary-foreground rounded-md hover:bg-primary/90"
              >
                Try Again
              </button>
            )}
          </div>
        </div>
      )
    }
    return this.props.children
  }
}

interface EventMetadata {
  originalLineProfiler?: string
  optimizedLineProfiler?: string
  prCommentFields?: {
    original_runtime?: string
    best_runtime?: string
  }
}

interface OptimizationEvent {
  id: string
  trace_id: string
  function_name?: string | null
  file_path?: string | null
  speedup_x?: number | null
  speedup_pct?: number | null
  metadata: EventMetadata
}

export default function LineProfilerPage() {
  const params = useParams()
  const router = useRouter()
  const [event, setEvent] = useState<OptimizationEvent | null>(null)
  const [loading, setLoading] = useState(true)
  const { currentOrg } = useViewMode()
  const currentOrgId = currentOrg?.id

  useEffect(() => {
    async function loadEvent() {
      try {
        const userSession = (await getUserIdAndUsername()) ?? { userId: "", username: "" }

        const data = await getOptimizationEventById({
          payload: currentOrgId
            ? { orgId: currentOrgId }
            : { userId: userSession.userId, username: userSession.username },
          trace_id: params.traceId as string,
        })

        if (data) {
          const metadata = data.metadata as EventMetadata
          setEvent({
            id: data.id,
            trace_id: data.trace_id,
            function_name: data.function_name,
            file_path: data.file_path,
            speedup_x: data.speedup_x,
            speedup_pct: data.speedup_pct,
            metadata,
          })
        } else {
          setEvent(null)
        }
      } catch (error) {
        console.error("Failed to load optimization event:", error)
        toast.error("Failed to load profiler data")
      } finally {
        setLoading(false)
      }
    }

    loadEvent()
  }, [params.traceId, currentOrgId])

  const handleBack = () => {
    router.push(`/review-optimizations/${params.traceId}`)
  }

  if (loading) {
    return (
      <div className="min-h-screen flex items-center justify-center">
        <Loader2 className="h-12 w-12 animate-spin text-primary" />
      </div>
    )
  }

  if (!event) {
    return (
      <div className="min-h-screen flex items-center justify-center p-4">
        <div className="text-center">
          <h2 className="text-2xl font-semibold mb-2">Event not found</h2>
          <p className="text-muted-foreground mb-4">
            The optimization event you're looking for doesn't exist.
          </p>
          <button
            onClick={handleBack}
            className="inline-flex items-center gap-2 px-4 py-2 bg-primary text-primary-foreground rounded-md hover:bg-primary/90"
          >
            <ArrowLeft className="h-4 w-4" />
            Go Back
          </button>
        </div>
      </div>
    )
  }

  const metadata = event.metadata || {}
  const hasProfilerData = metadata.originalLineProfiler || metadata.optimizedLineProfiler

  if (!hasProfilerData) {
    return (
      <div className="min-h-screen flex items-center justify-center p-4">
        <div className="text-center">
          <h2 className="text-2xl font-semibold mb-2">No Profiler Data</h2>
          <p className="text-muted-foreground mb-4">
            This optimization doesn't have line profiler data available.
          </p>
          <button
            onClick={handleBack}
            className="inline-flex items-center gap-2 px-4 py-2 bg-primary text-primary-foreground rounded-md hover:bg-primary/90"
          >
            <ArrowLeft className="h-4 w-4" />
            Go Back
          </button>
        </div>
      </div>
    )
  }

  return (
    <ProfilerClient
      traceId={traceId}
      functionName={initData.event.function_name ?? null}
      filePath={initData.event.file_path ?? null}
      speedupX={initData.event.speedup_x ?? null}
      originalLineProfiler={metadata.originalLineProfiler}
      optimizedLineProfiler={metadata.optimizedLineProfiler}
      originalRuntime={metadata.prCommentFields?.original_runtime}
      bestRuntime={metadata.prCommentFields?.best_runtime}
    />
    <div className="min-h-screen bg-background flex flex-col">
      {/* Header */}
      <div className="px-4 py-3 bg-muted/30 border-b border-border">
        <div className="flex items-center justify-between">
          <div className="flex items-center gap-4">
            <button
              onClick={handleBack}
              className="p-2 hover:bg-muted rounded-md transition-colors"
              title="Back to optimization details"
            >
              <ArrowLeft className="h-5 w-5" />
            </button>
            <div className="flex items-center gap-3">
              <Zap className="w-6 h-6 text-primary" />
              <h1 className="text-xl font-semibold">
                Line Profiler Report
                {event.function_name && (
                  <>
                    {" - "}
                    <code className="font-mono text-primary">{event.function_name}()</code>
                  </>
                )}
              </h1>
              {event.speedup_x && (
                <span className="flex items-center gap-2 rounded-md bg-gradient-to-r from-primary to-yellow-500 px-3 py-1 text-xs font-bold text-gray-900">
                  <svg className="h-3.5 w-3.5" fill="currentColor" viewBox="0 0 24 24">
                    <path d="M13 10V3L4 14h7v7l9-11h-7z" />
                  </svg>
                  {event.speedup_x.toFixed(2)}x faster
                </span>
              )}
            </div>
          </div>

          {event.file_path && (
            <span className="text-sm text-muted-foreground font-mono">
              {event.file_path}
            </span>
          )}
        </div>
      </div>

      {/* Line Profiler View */}
      <div className="flex-1 overflow-hidden">
        <ProfilerErrorBoundary>
          <LineProfilerView
            originalProfiler={metadata.originalLineProfiler}
            optimizedProfiler={metadata.optimizedLineProfiler}
            functionName={event.function_name || undefined}
            originalRuntime={metadata.prCommentFields?.original_runtime}
            optimizedRuntime={metadata.prCommentFields?.best_runtime}
          />
        </ProfilerErrorBoundary>
      </div>
    </div>
  )
}
@@ -1,171 +0,0 @@
"use client"

import React from "react"
import { useRouter } from "next/navigation"
import { ArrowLeft, Zap, AlertTriangle } from "lucide-react"
import dynamic from "next/dynamic"
import { Skeleton } from "@/components/ui/skeleton"

const LineProfilerView = dynamic(
  () => import("@/components/LineProfiler").then(mod => mod.LineProfilerView),
  {
    ssr: false,
    loading: () => <Skeleton className="h-full w-full" />,
  },
)

// Error boundary to gracefully handle parsing failures or rendering issues
interface ErrorBoundaryState {
  hasError: boolean
  error?: Error
}

class ProfilerErrorBoundary extends React.Component<
  { children: React.ReactNode; onRetry?: () => void },
  ErrorBoundaryState
> {
  constructor(props: { children: React.ReactNode; onRetry?: () => void }) {
    super(props)
    this.state = { hasError: false }
  }

  static getDerivedStateFromError(error: Error): ErrorBoundaryState {
    return { hasError: true, error }
  }

  componentDidCatch(error: Error, errorInfo: React.ErrorInfo) {
    console.error("ProfilerErrorBoundary caught an error:", error, errorInfo)
  }

  render() {
    if (this.state.hasError) {
      return (
        <div className="flex-1 flex items-center justify-center p-8">
          <div className="text-center">
            <AlertTriangle className="w-12 h-12 mx-auto mb-4 text-destructive" />
            <h2 className="text-xl font-semibold mb-2">Failed to load profiler data</h2>
            <p className="text-muted-foreground mb-4">
              There was an error parsing or rendering the profiler data.
            </p>
            {this.props.onRetry && (
              <button
                onClick={() => {
                  this.setState({ hasError: false, error: undefined })
                  this.props.onRetry?.()
                }}
                className="px-4 py-2 bg-primary text-primary-foreground rounded-md hover:bg-primary/90"
              >
                Try Again
              </button>
            )}
          </div>
        </div>
      )
    }
    return this.props.children
  }
}

export interface ProfilerClientProps {
  traceId: string
  functionName: string | null
  filePath: string | null
  speedupX: number | null
  originalLineProfiler?: string
  optimizedLineProfiler?: string
  originalRuntime?: string
  bestRuntime?: string
}

export function ProfilerClient({
  traceId,
  functionName,
  filePath,
  speedupX,
  originalLineProfiler,
  optimizedLineProfiler,
  originalRuntime,
  bestRuntime,
}: ProfilerClientProps) {
  const router = useRouter()

  const handleBack = () => {
    router.push(`/review-optimizations/${traceId}`)
  }

  const hasProfilerData = originalLineProfiler || optimizedLineProfiler

  if (!hasProfilerData) {
    return (
      <div className="min-h-screen flex items-center justify-center p-4">
        <div className="text-center">
          <h2 className="text-2xl font-semibold mb-2">No Profiler Data</h2>
          <p className="text-muted-foreground mb-4">
            This optimization doesn't have line profiler data available.
          </p>
          <button
            onClick={handleBack}
            className="inline-flex items-center gap-2 px-4 py-2 bg-primary text-primary-foreground rounded-md hover:bg-primary/90"
          >
            <ArrowLeft className="h-4 w-4" />
            Go Back
          </button>
        </div>
      </div>
    )
  }

  return (
    <div className="min-h-screen bg-background flex flex-col">
      {/* Header */}
      <div className="px-4 py-3 bg-muted/30 border-b border-border">
        <div className="flex items-center justify-between">
          <div className="flex items-center gap-4">
            <button
              onClick={handleBack}
              className="p-2 hover:bg-muted rounded-md transition-colors"
              title="Back to optimization details"
            >
              <ArrowLeft className="h-5 w-5" />
            </button>
            <div className="flex items-center gap-3">
              <Zap className="w-6 h-6 text-primary" />
              <h1 className="text-xl font-semibold">
                Line Profiler Report
                {functionName && (
                  <>
                    {" - "}
                    <code className="font-mono text-primary">{functionName}()</code>
                  </>
                )}
              </h1>
              {speedupX && (
                <span className="flex items-center gap-2 rounded-md bg-gradient-to-r from-primary to-yellow-500 px-3 py-1 text-xs font-bold text-gray-900">
                  <svg className="h-3.5 w-3.5" fill="currentColor" viewBox="0 0 24 24">
                    <path d="M13 10V3L4 14h7v7l9-11h-7z" />
                  </svg>
                  {speedupX.toFixed(2)}x faster
                </span>
              )}
            </div>
          </div>

          {filePath && <span className="text-sm text-muted-foreground font-mono">{filePath}</span>}
        </div>
      </div>

      {/* Line Profiler View */}
      <div className="flex-1 overflow-hidden">
        <ProfilerErrorBoundary>
          <LineProfilerView
            originalProfiler={originalLineProfiler}
            optimizedProfiler={optimizedLineProfiler}
            functionName={functionName || undefined}
            originalRuntime={originalRuntime}
            optimizedRuntime={bestRuntime}
          />
        </ProfilerErrorBoundary>
      </div>
    </div>
  )
}
@ -1,993 +0,0 @@
|
|||
"use client"
|
||||
|
||||
import { useEffect, useState, useCallback, useRef } from "react"
|
||||
import { useRouter } from "next/navigation"
|
||||
import Image from "next/image"
|
||||
import {
|
||||
Zap,
|
||||
CheckCircle,
|
||||
XCircle,
|
||||
MessageSquare,
|
||||
Loader2,
|
||||
GitCommit,
|
||||
BarChart3,
|
||||
} from "lucide-react"
|
||||
import {
|
||||
createPullRequest,
|
||||
getOptimizationEventById,
|
||||
saveOptimizationChanges,
|
||||
setApprovalStatus,
|
||||
addComment,
|
||||
getCommentsByEvent,
|
||||
getStagingCodeFromApi,
|
||||
commitStagingCode,
|
||||
type StagingCodeResponse,
|
||||
} from "./action"
|
||||
import dynamic from "next/dynamic"
|
||||
|
||||
const MonacoDiffEditorGithub = dynamic(
|
||||
() => import("@/components/Editor/monaco-diff-editor-github"),
|
||||
{ ssr: false },
|
||||
)
|
||||
import { toast } from "sonner"
|
||||
import { MarkdownEditor } from "@/components/markdwon/markdown-editor"
|
||||
import { MarkdownViewer } from "@/components/markdwon/markdown-viewer"
|
||||
import { BaseBranchDialog } from "@/components/ui/base-branch-dialog"
|
||||
import { useViewMode } from "@/app/app/ViewModeContext"
|
||||
|
||||
// Interfaces
|
||||
interface Comment {
|
||||
id: string
|
||||
optimization_event_id: string
|
||||
author_user_id: string
|
||||
content: string
|
||||
created_at: Date
|
||||
author?: {
|
||||
user_id: string
|
||||
email: string
|
||||
name?: string
|
||||
github_username?: string
|
||||
}
|
||||
}
|
||||
|
||||
interface TestResults {
|
||||
passed: number
|
||||
failed: number
|
||||
}
|
||||
|
||||
interface ReportTable {
|
||||
[key: string]: TestResults
|
||||
}
|
||||
|
||||
interface PRCommentFields {
|
||||
original_runtime?: string
|
||||
best_runtime?: string
|
||||
loop_count?: number
|
||||
optimization_explanation?: string
|
||||
report_table?: ReportTable
|
||||
}
|
||||
|
||||
interface EventMetadata {
|
||||
diffContents?: Record<string, DiffContent>
|
||||
prCommentFields?: PRCommentFields
|
||||
generatedTests?: string
|
||||
existingTests?: string
|
||||
lastModified?: string
|
||||
coverage_message?: string
|
||||
staging_storage_type?: "plain_text" | "git_branch"
|
||||
staging_branch_name?: string
|
||||
originalLineProfiler?: string
|
||||
optimizedLineProfiler?: string
|
||||
}
|
||||
|
||||
interface Repository {
|
||||
id: string
|
||||
name: string
|
||||
owner: string
|
||||
full_name: string
|
||||
installation_id?: number
|
||||
}
|
||||
|
||||
interface OptimizationEvent {
|
||||
id: string
|
||||
event_type: string
|
||||
user_id: string | null
|
||||
repository_id: string | null
|
||||
trace_id: string
|
||||
pr_id: string | null
|
||||
pr_url: string | null
|
||||
api_key_id: number | null
|
||||
metadata: EventMetadata
|
||||
is_optimization_found: boolean | null
|
||||
current_username: string | null
|
||||
function_name?: string | null
|
||||
file_path?: string | null
|
||||
speedup_x?: number | null
|
||||
speedup_pct?: number | null
|
||||
created_at: Date
|
||||
baseBranch?: string | null
|
||||
repository?: Repository | null
|
||||
status?: "approved" | "rejected" | null
|
||||
review_quality?: string | null
|
||||
review_explanation?: string | null
|
||||
staging_storage_type?: "plain_text" | "git_branch" | null
|
||||
}
|
||||
|
||||
interface RawOptimizationEvent {
|
||||
id: string
|
||||
event_type: string
|
||||
user_id: string | null
|
||||
repository_id: string | null
|
||||
trace_id: string
|
||||
pr_id: string | null
|
||||
pr_url: string | null
|
||||
api_key_id: number | null
|
||||
metadata: unknown
|
||||
is_optimization_found: boolean | null
|
||||
current_username: string | null
|
||||
function_name?: string | null
|
||||
file_path?: string | null
|
||||
speedup_x?: number | null
|
||||
speedup_pct?: number | null
|
||||
created_at: Date
|
||||
baseBranch?: string | null
|
||||
repository?: Repository | null
|
||||
status?: "approved" | "rejected" | null
|
||||
review_quality?: string | null
|
||||
review_explanation?: string | null
|
||||
staging_storage_type?: "plain_text" | "git_branch" | null
|
||||
}
|
||||
|
||||
interface DiffContent {
|
||||
oldContent: string
|
||||
newContent: string
|
||||
}
|
||||
|
||||
interface SaveOptimizationResult {
|
||||
success: boolean
|
||||
error?: string
|
||||
event?: RawOptimizationEvent
|
||||
}
|
||||
|
||||
export interface ReviewClientProps {
|
||||
traceId: string
|
||||
initialUserId: string
|
||||
initialUsername: string
|
||||
initialEvent: RawOptimizationEvent | null
|
||||
initialComments: Comment[]
|
||||
initialStagingCode: StagingCodeResponse | null
|
||||
}
|
||||
|
||||
function transformEvent(
|
||||
rawData: RawOptimizationEvent,
|
||||
stagingCode: StagingCodeResponse | null,
|
||||
): OptimizationEvent {
|
||||
let metadata = rawData.metadata as EventMetadata
|
||||
|
||||
// Merge staging code into metadata if available
|
||||
if (stagingCode && rawData.staging_storage_type === "git_branch") {
|
||||
const isDiffEmpty =
|
||||
!stagingCode.diffContents || Object.keys(stagingCode.diffContents).length === 0
|
||||
if (!isDiffEmpty) {
|
||||
metadata = {
|
||||
...metadata,
|
||||
diffContents: stagingCode.diffContents,
|
||||
staging_storage_type: "git_branch",
|
||||
staging_branch_name: stagingCode.stagingBranchName,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return {
|
||||
...rawData,
|
||||
metadata,
|
||||
function_name: rawData.function_name || null,
|
||||
file_path: rawData.file_path || null,
|
||||
speedup_x: rawData.speedup_x || null,
|
||||
speedup_pct: rawData.speedup_pct || null,
|
||||
baseBranch: rawData.baseBranch || undefined,
|
||||
repository: rawData.repository || null,
|
||||
status: rawData.status || null,
|
||||
review_quality: rawData.review_quality || null,
|
||||
review_explanation: rawData.review_explanation || null,
|
||||
staging_storage_type: rawData.staging_storage_type || null,
|
||||
}
|
||||
}
|
||||
|
||||
export function OptimizationReviewClient({
|
||||
traceId,
|
||||
initialUserId,
|
||||
initialUsername,
|
||||
initialEvent,
|
||||
initialComments,
|
||||
initialStagingCode,
|
||||
}: ReviewClientProps) {
|
||||
const router = useRouter()
|
||||
const [event, setEvent] = useState<OptimizationEvent | null>(
|
||||
initialEvent ? transformEvent(initialEvent, initialStagingCode) : null,
|
||||
)
|
||||
const [loading, setLoading] = useState(!initialEvent)
|
||||
const [creatingPR, setCreatingPR] = useState(false)
|
||||
const [userId] = useState<string>(initialUserId)
|
||||
const [isUpdatingStatus, setIsUpdatingStatus] = useState(false)
|
||||
const [isCommitting, setIsCommitting] = useState(false)
|
||||
const [hasUnsavedChanges, setHasUnsavedChanges] = useState(false)
|
||||
const saveQueueRef = useRef<Map<string, NodeJS.Timeout>>(new Map())
|
||||
const isLoadingRef = useRef(false)
|
||||
const pendingChangesRef = useRef<Record<string, string>>({})
|
||||
|
||||
// State for comments
|
||||
const [comments, setComments] = useState<Comment[]>(initialComments)
|
||||
const [newComment, setNewComment] = useState("")
|
||||
const [isSubmittingComment, setIsSubmittingComment] = useState(false)
|
||||
const [loadingComments, setLoadingComments] = useState(false)
|
||||
const [showCommentsSection, setShowCommentsSection] = useState(false)
|
||||
const { currentOrg } = useViewMode()
|
||||
|
||||
// State for base branch dialog
|
||||
const [showBaseBranchDialog, setShowBaseBranchDialog] = useState(false)
|
||||
|
||||
const currentOrgId = currentOrg?.id
|
||||
// Track the org ID used for the server-prefetched data
|
||||
const initialOrgIdRef = useRef(currentOrgId)
|
||||
|
||||
// Only refetch when the org changes from the initial server-fetched state
|
||||
useEffect(() => {
|
||||
// Skip if this is the initial render with server data
|
||||
if (currentOrgId === initialOrgIdRef.current) {
|
||||
return
|
||||
}
|
||||
// Prevent concurrent calls
|
||||
if (isLoadingRef.current) {
|
||||
return
|
||||
}
|
||||
|
||||
async function refetchEvent() {
|
||||
isLoadingRef.current = true
|
||||
setLoading(true)
|
||||
try {
|
||||
const data = await getOptimizationEventById({
|
||||
payload: currentOrgId
|
||||
? { orgId: currentOrgId }
|
||||
: { userId: initialUserId, username: initialUsername },
|
||||
trace_id: traceId,
|
||||
})
|
||||
if (data) {
|
||||
const rawData = data as unknown as RawOptimizationEvent
|
||||
let metadata = rawData.metadata as EventMetadata
|
||||
|
||||
if (rawData.staging_storage_type === "git_branch") {
|
||||
const eventMetadata = rawData.metadata as EventMetadata
|
||||
const stagingBranchName = eventMetadata?.staging_branch_name
|
||||
const repository = rawData.repository
|
||||
|
||||
if (stagingBranchName && repository?.full_name && repository?.installation_id) {
|
||||
const [stagingCodeResult] = await Promise.all([
|
||||
getStagingCodeFromApi({
|
||||
stagingBranchName,
|
||||
baseBranch: rawData.baseBranch || "main",
|
||||
fullRepoName: repository.full_name,
|
||||
installationId: repository.installation_id,
|
||||
functionName: rawData.function_name || undefined,
|
||||
filePath: rawData.file_path || undefined,
|
||||
}),
|
||||
loadComments(data.id),
|
||||
])
|
||||
|
||||
if (stagingCodeResult.success && stagingCodeResult.data) {
|
||||
const diffContentsResult = stagingCodeResult.data.diffContents
|
||||
const isDiffEmpty =
|
||||
!diffContentsResult || Object.keys(diffContentsResult).length === 0
|
||||
if (!isDiffEmpty) {
|
||||
metadata = {
|
||||
...metadata,
|
||||
diffContents: diffContentsResult,
|
||||
staging_storage_type: "git_branch",
|
||||
staging_branch_name: stagingCodeResult.data.stagingBranchName,
|
||||
}
|
||||
}
|
||||
} else {
|
||||
toast.error(
|
||||
stagingCodeResult.error || "Failed to fetch staging code from repository",
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
setEvent(
|
||||
transformEvent({ ...rawData, metadata } as unknown as RawOptimizationEvent, null),
|
||||
)
|
||||
} else {
|
||||
setEvent(transformEvent(rawData, null))
|
||||
await loadComments(data.id)
|
||||
}
|
||||
} else {
|
||||
setEvent(null)
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("Failed to load optimization event:", error)
|
||||
toast.error("Failed to load optimization event")
|
||||
} finally {
|
||||
setLoading(false)
|
||||
isLoadingRef.current = false
|
||||
}
|
||||
}
|
||||
refetchEvent()
|
||||
}, [currentOrgId, traceId, initialUserId, initialUsername])
|
||||
|
||||
const loadComments = async (eventId: string) => {
|
||||
setLoadingComments(true)
|
||||
try {
|
||||
const result = await getCommentsByEvent(eventId)
|
||||
if (result.success && result.comments) {
|
||||
setComments(result.comments as Comment[])
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("Failed to load comments:", error)
|
||||
} finally {
|
||||
setLoadingComments(false)
|
||||
}
|
||||
}
|
||||
|
||||
// Cleanup save queue on unmount
|
||||
useEffect(() => {
|
||||
const saveQueue = saveQueueRef.current
|
||||
return () => {
|
||||
saveQueue.forEach(timeout => clearTimeout(timeout))
|
||||
saveQueue.clear()
|
||||
}
|
||||
}, [])
|
||||
|
||||
const handleContentChange = (filePath: string, newContent: string) => {
|
||||
if (event && event.metadata.diffContents) {
|
||||
const updatedEvent = {
|
||||
...event,
|
||||
metadata: {
|
||||
...event.metadata,
|
||||
diffContents: {
|
||||
...event.metadata.diffContents,
|
||||
[filePath]: {
|
||||
...event.metadata.diffContents[filePath],
|
||||
newContent: newContent,
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
setEvent(updatedEvent)
|
||||
|
||||
// For git_branch storage, track pending changes for manual commit
|
||||
if (event.staging_storage_type === "git_branch") {
|
||||
pendingChangesRef.current[filePath] = newContent
|
||||
setHasUnsavedChanges(true)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Handle committing changes to git branch
|
||||
const handleCommitChanges = async () => {
|
||||
if (!event || !hasUnsavedChanges || Object.keys(pendingChangesRef.current).length === 0) {
|
||||
return
|
||||
}
|
||||
|
||||
setIsCommitting(true)
|
||||
try {
|
||||
const result = await commitStagingCode(
|
||||
event.trace_id,
|
||||
pendingChangesRef.current,
|
||||
`Update optimized code for ${event.function_name || "function"}`,
|
||||
)
|
||||
|
||||
if (result.success) {
|
||||
toast.success("Changes committed successfully!", {
|
||||
description: `Commit SHA: ${result.data?.commitSha?.substring(0, 7)}`,
|
||||
})
|
||||
pendingChangesRef.current = {}
|
||||
setHasUnsavedChanges(false)
|
||||
} else {
|
||||
toast.error(result.error || "Failed to commit changes")
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("Error committing changes:", error)
|
||||
toast.error("Failed to commit changes")
|
||||
} finally {
|
||||
setIsCommitting(false)
|
||||
}
|
||||
}
|
||||
|
||||
// Handle autosave edits with database persistence (only for plain_text storage)
|
||||
const handleEdit = useCallback(
|
||||
async (filePath: string, newContent: string) => {
|
||||
if (!event || !userId) return
|
||||
|
||||
// Skip autosave for git_branch storage - use manual commit instead
|
||||
if (event.staging_storage_type === "git_branch") {
|
||||
return
|
||||
}
|
||||
|
||||
try {
|
||||
const existingTimeout = saveQueueRef.current.get(filePath)
|
||||
if (existingTimeout) {
|
||||
clearTimeout(existingTimeout)
|
||||
}
|
||||
|
||||
const timeoutId = setTimeout(async () => {
|
||||
try {
|
||||
const result = (await saveOptimizationChanges({
|
||||
userId,
|
||||
eventId: event.id,
|
||||
filePath,
|
||||
newContent,
|
||||
})) as SaveOptimizationResult
|
||||
|
||||
if (result.success) {
|
||||
console.log(`Successfully saved ${filePath} to database`)
|
||||
if (result.event) {
|
||||
const transformedData: OptimizationEvent = {
|
||||
...result.event,
|
||||
metadata: result.event.metadata as EventMetadata,
|
||||
function_name: result.event.function_name || null,
|
||||
file_path: result.event.file_path || null,
|
||||
speedup_x: result.event.speedup_x || null,
|
||||
speedup_pct: result.event.speedup_pct || null,
|
||||
created_at: result.event.created_at,
|
||||
                status: result.event.status || null,
                repository: result.event.repository || event.repository || null,
              }
              setEvent(transformedData)
            }
          } else {
            console.error(`Failed to save ${filePath}:`, result.error)
            toast.error(`Failed to save changes: ${result.error}`)
          }
        } catch (error) {
          console.error(`Error saving ${filePath}:`, error)
          toast.error("Failed to save changes")
        } finally {
          saveQueueRef.current.delete(filePath)
        }
      }, 100)

      saveQueueRef.current.set(filePath, timeoutId)
    } catch (error) {
      console.error("Error in handleEdit:", error)
    }
  },
  [event, userId],
)

  const handleSubmitReview = async (status: "approved" | "rejected") => {
    if (!event || !userId) return

    setIsUpdatingStatus(true)

    try {
      const result = await setApprovalStatus(event.id, status)

      if (result.success) {
        setEvent(prev => (prev ? { ...prev, status } : null))
      } else {
        throw new Error(result.error || `Failed to ${status} optimization`)
      }
    } catch {
      toast.error("Failed to submit review")
    } finally {
      setIsUpdatingStatus(false)
    }
  }

  const handleAddComment = async () => {
    if (!event || !userId || !newComment.trim()) return

    setIsSubmittingComment(true)
    try {
      const commentResult = await addComment({
        eventId: event.id,
        userId,
        content: newComment.trim(),
      })

      if (!commentResult.success) {
        throw new Error(commentResult.error || "Failed to add comment")
      }

      // Reload comments and clear input
      await loadComments(event.id)
      setNewComment("")
    } catch {
      toast.error("Failed to add comment")
    } finally {
      setIsSubmittingComment(false)
    }
  }

  const handleOpenBaseBranchDialog = () => {
    setShowBaseBranchDialog(true)
  }

  const handleBaseBranchConfirm = async (branchName: string) => {
    setShowBaseBranchDialog(false)

    // Update the event with the new base branch
    if (event) {
      setEvent(prev => (prev ? { ...prev, baseBranch: branchName } : null))
    }

    // Small delay to ensure state is updated
    setTimeout(() => {
      handleCreatePR(branchName)
    }, 100)
  }

  const handleCreatePR = async (customBaseBranch?: string) => {
    if (!event || !event.trace_id || !event.metadata.diffContents) {
      toast.error("Missing required data to create PR")
      return
    }

    setCreatingPR(true)
    try {
      const speedupX = event.speedup_x ? `${event.speedup_x.toFixed(2)}x` : "N/A"
      const speedupPct = event.speedup_pct ? `${event.speedup_pct.toLocaleString()}%` : "N/A"

      const result = await createPullRequest({
        traceId: event.trace_id,
        diffContents: event.metadata.diffContents,
        prCommentFields: event.metadata.prCommentFields,
        generatedTests: event.metadata.generatedTests,
        existingTests: event.metadata.existingTests,
        functionName: event.function_name || undefined,
        filePath: event.file_path || undefined,
        speedupX: speedupX,
        speedupPct: speedupPct,
        baseBranch: customBaseBranch || event.baseBranch || undefined,
        full_repo_name: event.repository?.full_name,
        coverage_message: event.metadata.coverage_message,
        originalLineProfiler: event.metadata.originalLineProfiler,
        optimizedLineProfiler: event.metadata.optimizedLineProfiler,
      })

      console.log("[handleCreatePR] Result from createPullRequest:", {
        success: result.success,
        data: result.data,
        dataType: typeof result.data,
        error: result.error,
      })

      if (!result.success) {
        console.error("[handleCreatePR] Failed to create PR:", result.error)
        toast.error(result.error || "Failed to create pull request", {
          duration: 5000,
        })
        return
      }

      // Handle pending approval response (status 202)
      if (typeof result.data === "object" && result.data !== null) {
        const dataObj = result.data as { status?: string; message?: string }
        if (dataObj.status === "pending_approval") {
          console.log("[handleCreatePR] Pending approval response:", dataObj)
          toast.info(dataObj.message || "This optimization requires approval", {
            duration: 5000,
          })
          return
        }

        // If it's an object but not pending approval, something is wrong
        console.error("[handleCreatePR] Unexpected object response:", dataObj)
        toast.error("Failed to create pull request: Server returned unexpected response", {
          duration: 5000,
        })
        return
      }

      // Extract PR number - should be a number or string
      let prNumber: string | null = null
      if (typeof result.data === "number") {
        prNumber = String(result.data)
      } else if (typeof result.data === "string") {
        prNumber = result.data
      } else {
        console.error(
          "[handleCreatePR] Invalid data type. Expected number or string, got:",
          typeof result.data,
          result.data,
        )
        toast.error("Failed to create pull request: Invalid response from server", {
          duration: 5000,
        })
        return
      }

      console.log("[handleCreatePR] Successfully extracted PR number:", prNumber)

      let constructedUrl = ""
      if (prNumber && event.repository?.full_name)
        constructedUrl = `https://github.com/${event.repository.full_name}/pull/${prNumber}`

      // Update the event state with the new PR number
      if (prNumber) {
        setEvent(prev => (prev ? { ...prev, pr_url: constructedUrl } : null))
      }

      // Show success toast with custom duration and description
      toast.success("Pull request created successfully!", {
        description: `PR #${prNumber || "new"} has been created. Opening GitHub...`,
        duration: 5000,
        action: {
          label: "Open PR",
          onClick: () => {
            if (constructedUrl) {
              window.open(constructedUrl, "_blank")
            }
          },
        },
      })

      // Delay opening the window to ensure toast is visible
      setTimeout(() => {
        if (constructedUrl) {
          window.open(constructedUrl, "_blank")
        }
      }, 1000)
    } catch (error: unknown) {
      console.error("[handleCreatePR] Exception:", error)
      const errorMessage = error instanceof Error ? error.message : "Failed to create pull request"
      toast.error(errorMessage, {
        duration: 5000,
      })
    } finally {
      setCreatingPR(false)
    }
  }

  const handleViewPR = () => {
    if (!event?.pr_url) return
    window.open(event.pr_url, "_blank")
  }

  const handleViewProfiler = () => {
    router.push(`/review-optimizations/${traceId}/profiler`)
  }

  const formatTimeAgo = (date: Date) => {
    const seconds = Math.floor((new Date().getTime() - new Date(date).getTime()) / 1000)

    if (seconds < 60) return "just now"
    if (seconds < 3600) return `${Math.floor(seconds / 60)}m ago`
    if (seconds < 86400) return `${Math.floor(seconds / 3600)}h ago`
    if (seconds < 2592000) return `${Math.floor(seconds / 86400)}d ago`
    return `${Math.floor(seconds / 2592000)}mo ago`
  }

  if (loading) {
    return (
      <div className="min-h-screen flex items-center justify-center">
        <div className="animate-spin rounded-full h-12 w-12 border-b-2 border-primary"></div>
      </div>
    )
  }

  if (!event) {
    return (
      <div className="min-h-screen flex items-center justify-center p-4">
        <div className="text-center">
          <h2 className="text-2xl font-semibold mb-2">Event not found</h2>
          <p className="text-muted-foreground">
            The optimization event you're looking for doesn't exist.
          </p>
        </div>
      </div>
    )
  }

  const metadata = event.metadata || {}
  const diffContents = metadata.diffContents || {}
  const prCommentFields = metadata.prCommentFields || {}

  // Check if we have empty diffContents for git_branch storage type (merged PR in privacy mode)
  const isPrivacyModeWithNoDiff =
    event.staging_storage_type === "git_branch" && Object.keys(diffContents).length === 0

  return (
    <div className="min-h-screen bg-background">
      <div className="mx-auto">
        {/* Header */}
        <div className="px-4 py-2 bg-muted/30 border-b border-border">
          <div className="flex items-center justify-between">
            <div className="flex items-center gap-3">
              <Zap className="w-6 h-6 text-primary" />
              <h1 className="text-xl font-semibold">
                {event.function_name ? (
                  <>
                    Code Optimization -{" "}
                    <code className="font-mono text-primary">{event.function_name}()</code>
                  </>
                ) : (
                  "Code Optimization"
                )}
              </h1>
              {event.speedup_x && (
                <span className="flex items-center gap-2 rounded-md bg-gradient-to-r from-primary to-yellow-500 px-3 py-1 text-xs font-bold text-gray-900">
                  <svg className="h-3.5 w-3.5" fill="currentColor" viewBox="0 0 24 24">
                    <path d="M13 10V3L4 14h7v7l9-11h-7z" />
                  </svg>
                  {event.speedup_x.toFixed(2)}x faster
                </span>
              )}
            </div>

            <div className="flex items-center gap-2">
              {/* Performance Profile Button - Only show if profiler data exists */}
              {(metadata.originalLineProfiler || metadata.optimizedLineProfiler) && (
                <button
                  onClick={handleViewProfiler}
                  className="flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
                    bg-purple-100 text-purple-700 hover:bg-purple-200
                    dark:bg-purple-900/30 dark:text-purple-300 dark:hover:bg-purple-900/50
                    transition-all duration-200"
                  title="View line-by-line performance profile"
                >
                  <BarChart3 className="w-4 h-4" />
                  <span>Performance Profile</span>
                </button>
              )}

              {/* Comments Toggle Button with Count */}
              <button
                onClick={() => setShowCommentsSection(!showCommentsSection)}
                className={`
                  relative p-1.5 rounded-md transition-all duration-200 flex items-center gap-1
                  ${showCommentsSection ? "bg-primary/10 text-foreground" : "hover:bg-muted text-foreground"}
                `}
                title={showCommentsSection ? "Hide comments panel" : "Show comments panel"}
              >
                <MessageSquare
                  className={`
                    w-4 h-4 transition-colors
                    ${showCommentsSection ? "text-primary" : "text-muted-foreground"}
                  `}
                />
                {comments.length > 0 && (
                  <span
                    className={`
                      absolute -top-1 -right-1 min-w-[16px] h-4 flex items-center justify-center
                      px-1 text-[10px] font-bold rounded-full transition-colors
                      ${
                        showCommentsSection
                          ? "bg-primary text-primary-foreground"
                          : "bg-muted-foreground text-background"
                      }
                    `}
                  >
                    {comments.length}
                  </span>
                )}
              </button>

              {/* Commit Button - Only for git_branch storage */}
              {event.staging_storage_type === "git_branch" && (
                <button
                  onClick={handleCommitChanges}
                  disabled={isCommitting || !hasUnsavedChanges}
                  className={`
                    flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
                    transition-all duration-200
                    ${
                      isCommitting
                        ? "bg-muted text-muted-foreground cursor-not-allowed opacity-50"
                        : hasUnsavedChanges
                          ? "bg-blue-600 text-white hover:bg-blue-700"
                          : "bg-muted text-muted-foreground cursor-not-allowed"
                    }
                  `}
                  title={
                    hasUnsavedChanges ? "Commit changes to staging branch" : "No changes to commit"
                  }
                >
                  {isCommitting ? (
                    <Loader2 className="w-4 h-4 animate-spin" />
                  ) : (
                    <GitCommit className="w-4 h-4" />
                  )}
                  <span>{hasUnsavedChanges ? "Commit" : "Committed"}</span>
                </button>
              )}

              {/* Approve Button */}
              <button
                onClick={() => handleSubmitReview("approved")}
                disabled={isUpdatingStatus || event.status === "approved"}
                className={`
                  flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
                  transition-all duration-200
                  ${
                    event.status === "approved"
                      ? "bg-green-600 text-white cursor-default"
                      : isUpdatingStatus
                        ? "bg-muted text-muted-foreground cursor-not-allowed"
                        : "bg-muted hover:bg-green-600 hover:text-white text-foreground"
                  }
                  ${isUpdatingStatus ? "opacity-50" : ""}
                `}
              >
                {isUpdatingStatus ? (
                  <Loader2 className="w-4 h-4 animate-spin" />
                ) : (
                  <CheckCircle className="w-4 h-4" />
                )}
                <span>Approve</span>
              </button>

              {/* Reject Button */}
              <button
                onClick={() => handleSubmitReview("rejected")}
                disabled={isUpdatingStatus || event.status === "rejected"}
                className={`
                  flex items-center gap-2 px-3 py-1.5 text-sm font-medium rounded-md
                  transition-all duration-200
                  ${
                    event.status === "rejected"
                      ? "bg-red-600 text-white cursor-default"
                      : isUpdatingStatus
                        ? "bg-muted text-muted-foreground cursor-not-allowed"
                        : "bg-muted hover:bg-red-600 hover:text-white text-foreground"
                  }
                  ${isUpdatingStatus ? "opacity-50" : ""}
                `}
              >
                {isUpdatingStatus ? (
                  <Loader2 className="w-4 h-4 animate-spin" />
                ) : (
                  <XCircle className="w-4 h-4" />
                )}
                <span>Reject</span>
              </button>
            </div>
          </div>
        </div>

        {/* Main Content */}
        <div className="flex h-[calc(100vh-60px)] w-full overflow-hidden">
          {/* Editor Section */}
          <div className="flex-1">
            <MonacoDiffEditorGithub
              diffContents={diffContents}
              onContentChange={handleContentChange}
              onEdit={handleEdit}
              optimizationInfo={{
                speedup_x: event.speedup_x || undefined,
                speedup_pct: event.speedup_pct || undefined,
                prCommentFields: prCommentFields,
                generatedTests: metadata.generatedTests,
                coverage_message: metadata.coverage_message,
                review_explanation: event.review_explanation,
                review_quality: event.review_quality,
              }}
              functionName={event.function_name || undefined}
              filePath={event.file_path || undefined}
              onCreatePR={
                event.repository_id && !event.pr_url ? handleOpenBaseBranchDialog : undefined
              }
              onViewPR={event.pr_url ? handleViewPR : undefined}
              prNumber={event.pr_url ? event.pr_url.split("/").pop() : undefined}
              repositoryFullName={event.repository?.full_name || undefined}
              isCreatingPR={creatingPR}
              showGitDiffDownload={!isPrivacyModeWithNoDiff}
              disableAutoSave={event.staging_storage_type === "git_branch"}
              isPrivacyModeNoDiff={isPrivacyModeWithNoDiff}
            />
          </div>

          {/* Comments Sidebar */}
          <div
            className={`bg-muted/30 border-l border-border flex flex-col transition-all duration-300 ${
              showCommentsSection ? "w-96" : "w-0"
            } overflow-hidden`}
          >
            <div
              className={`h-full flex flex-col transition-opacity duration-300 ${showCommentsSection ? "opacity-100" : "opacity-0"}`}
            >
              {/* Comments Header */}
              <div className="p-3 border-b border-border">
                <h3 className="text-sm font-medium text-foreground flex items-center gap-2">
                  <MessageSquare className="w-4 h-4 text-primary" />
                  Comments
                  {comments.length > 0 && (
                    <span className="ml-auto px-1.5 py-0.5 text-xs bg-primary/20 rounded-full text-foreground">
                      {comments.length}
                    </span>
                  )}
                </h3>
              </div>

              {/* Comments List */}
              <div className="flex-1 overflow-y-auto">
                {loadingComments ? (
                  <div className="flex items-center justify-center py-8">
                    <div className="animate-spin rounded-full h-8 w-8 border-b-2 border-primary"></div>
                  </div>
                ) : comments.length === 0 ? (
                  <div className="text-center py-8 px-4">
                    <MessageSquare className="w-12 h-12 mx-auto text-muted-foreground/50 mb-3" />
                    <p className="text-muted-foreground text-sm">No comments yet</p>
                  </div>
                ) : (
                  <div className="divide-y divide-border">
                    {comments.map(comment => (
                      <div key={comment.id} className="p-4 hover:bg-accent/50 transition-colors">
                        <div className="flex items-start gap-3">
                          <Image
                            src={
                              comment.author?.github_username
                                ? `https://github.com/${comment.author.github_username}.png`
                                : `https://ui-avatars.com/api/?name=${encodeURIComponent(
                                    comment.author?.name || comment.author?.email || "U",
                                  )}&background=d08e0d&color=fff`
                            }
                            alt={comment.author?.name || "User"}
                            width={32}
                            height={32}
                            className="w-8 h-8 rounded-full"
                          />
                          <div className="flex-1 min-w-0">
                            <div className="flex items-center gap-2 mb-1">
                              <span className="font-medium text-sm text-foreground">
                                {comment.author?.name ||
                                  comment.author?.email?.split("@")[0] ||
                                  "Unknown"}
                              </span>
                              <span className="text-xs text-muted-foreground">
                                {formatTimeAgo(comment.created_at)}
                              </span>
                            </div>
                            <MarkdownViewer content={comment.content} />
                          </div>
                        </div>
                      </div>
                    ))}
                  </div>
                )}
              </div>

              {/* Comment Input with Custom Markdown Editor */}
              <div className="border-t border-border p-4 bg-background">
                <div className="mb-3">
                  <MarkdownEditor
                    value={newComment}
                    onChange={setNewComment}
                    placeholder="Add a comment... (supports Markdown)"
                    disabled={isSubmittingComment}
                    height={150}
                  />
                </div>

                {/* Submit Button */}
                <button
                  onClick={handleAddComment}
                  disabled={!newComment.trim() || isSubmittingComment}
                  className="w-full px-4 py-2 text-sm font-medium text-primary-foreground bg-primary hover:bg-primary/90 rounded-md transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
                >
                  {isSubmittingComment ? (
                    <>
                      <Loader2 className="w-4 h-4 animate-spin inline mr-2" />
                      Commenting...
                    </>
                  ) : (
                    "Comment"
                  )}
                </button>
              </div>
            </div>
          </div>
        </div>
      </div>

      <BaseBranchDialog
        isOpen={showBaseBranchDialog}
        onClose={() => setShowBaseBranchDialog(false)}
        onConfirm={handleBaseBranchConfirm}
        initialBranch={event.baseBranch || "main"}
        isCreatingPR={creatingPR}
      />
    </div>
  )
}

@@ -1,5 +1,5 @@
 import { describe, it, expect, vi, beforeEach } from "vitest"
-import { prisma } from "@codeflash-ai/common"
+import { prisma, buildOptimizationOrCondition } from "@codeflash-ai/common"
 import { getRepositoriesForAccountCached } from "@/lib/services/repository-utils"
 
 vi.mock("@/lib/server-action-timing", () => ({
@@ -10,32 +10,31 @@ vi.mock("@/lib/services/repository-utils", () => ({
   getRepositoriesForAccountCached: vi.fn(),
 }))
 
-// Use realistic test fixtures: valid UUIDs and Auth0-style user IDs
-const mockPayload = { userId: "github|12345", username: "testuser" }
-const mockRepoIds = ["a1b2c3d4-e5f6-7890-abcd-ef1234567890", "b2c3d4e5-f678-9012-bcde-f12345678901"]
+const mockPayload = { userId: "user-1", username: "testuser" }
+const mockRepoIds = ["repo-1", "repo-2"]
 
 const mockEvents = [
   {
-    id: "e1f2g3h4-i5j6-7890-cdef-123456789012",
+    id: "evt-1",
     trace_id: "trace-1",
     function_name: "calculate",
     file_path: "src/utils.py",
-    repository_id: mockRepoIds[0],
+    repository_id: "repo-1",
     status: "approved",
     is_staging: true,
     created_at: new Date("2024-06-01"),
-    repository: { id: mockRepoIds[0], full_name: "org/repo", name: "repo" },
+    repository: { id: "repo-1", full_name: "org/repo", name: "repo" },
   },
   {
-    id: "f2g3h4i5-j678-9012-defg-234567890123",
+    id: "evt-2",
     trace_id: "trace-2",
     function_name: "process",
     file_path: "src/main.py",
-    repository_id: mockRepoIds[1],
+    repository_id: "repo-2",
     status: "pending",
     is_staging: true,
     created_at: new Date("2024-06-02"),
-    repository: { id: mockRepoIds[1], full_name: "org/repo2", name: "repo2" },
+    repository: { id: "repo-2", full_name: "org/repo2", name: "repo2" },
   },
 ]
 
@@ -55,6 +54,7 @@ describe("getAllOptimizationEvents", () => {
       repoIds: mockRepoIds,
       repos: [],
     } as any)
+    vi.mocked(buildOptimizationOrCondition).mockReturnValue({})
 
     const mod = await import("../action")
     getAllOptimizationEvents = mod.getAllOptimizationEvents
@@ -62,20 +62,19 @@ describe("getAllOptimizationEvents", () => {
 
   describe("Path B: standard Prisma query", () => {
     it("calls findMany and count in parallel", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce(mockEvents)
-        .mockResolvedValueOnce([{ count: BigInt(2) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue(mockEvents as any)
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(2)
       vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
       await getAllOptimizationEvents({ payload: mockPayload as any })
 
-      expect(prisma.$queryRawUnsafe).toHaveBeenCalledTimes(2)
+      expect(prisma.optimization_events.findMany).toHaveBeenCalledTimes(1)
+      expect(prisma.optimization_events.count).toHaveBeenCalledTimes(1)
     })
 
     it("batch-fetches optimization_features by trace_id array (not N+1)", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce(mockEvents)
-        .mockResolvedValueOnce([{ count: BigInt(2) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue(mockEvents as any)
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(2)
      vi.mocked(prisma.optimization_features.findMany).mockResolvedValue(mockFeatures as any)
 
       await getAllOptimizationEvents({ payload: mockPayload as any })
@@ -93,22 +92,20 @@ describe("getAllOptimizationEvents", () => {
     })
 
     it("merges review_quality into events", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce(mockEvents)
-        .mockResolvedValueOnce([{ count: BigInt(2) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue(mockEvents as any)
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(2)
       vi.mocked(prisma.optimization_features.findMany).mockResolvedValue(mockFeatures as any)
 
       const result = await getAllOptimizationEvents({ payload: mockPayload as any })
 
-      expect((result.events[0] as any).review_quality).toBe("high")
-      expect((result.events[0] as any).review_explanation).toBe("Great optimization")
-      expect((result.events[1] as any).review_quality).toBeNull()
+      expect(result.events[0].review_quality).toBe("high")
+      expect(result.events[0].review_explanation).toBe("Great optimization")
+      expect(result.events[1].review_quality).toBeNull()
     })
 
     it("returns totalCount from count query", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce([])
-        .mockResolvedValueOnce([{ count: BigInt(42) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue([])
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(42)
       vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
       const result = await getAllOptimizationEvents({ payload: mockPayload as any })
@@ -116,9 +113,8 @@ describe("getAllOptimizationEvents", () => {
     })
 
     it("applies pagination with skip and take", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce([])
-        .mockResolvedValueOnce([{ count: BigInt(0) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue([])
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(0)
       vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
       await getAllOptimizationEvents({
@@ -127,28 +123,31 @@ describe("getAllOptimizationEvents", () => {
         pageSize: 25,
       })
 
-      // Check that OFFSET is calculated correctly in the SQL
-      const sql = vi.mocked(prisma.$queryRawUnsafe).mock.calls[0][0] as string
-      expect(sql).toContain("OFFSET 50") // (3 - 1) * 25
-      expect(sql).toContain("LIMIT 25")
+      expect(prisma.optimization_events.findMany).toHaveBeenCalledWith(
+        expect.objectContaining({
+          skip: 50, // (3 - 1) * 25
+          take: 25,
+        }),
+      )
     })
 
     it("uses default sort (created_at desc) when no sort provided", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce([])
-        .mockResolvedValueOnce([{ count: BigInt(0) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue([])
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(0)
       vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
       await getAllOptimizationEvents({ payload: mockPayload as any })
 
-      const sql = vi.mocked(prisma.$queryRawUnsafe).mock.calls[0][0] as string
-      expect(sql).toContain("ORDER BY oe.created_at DESC")
+      expect(prisma.optimization_events.findMany).toHaveBeenCalledWith(
+        expect.objectContaining({
+          orderBy: { created_at: "desc" },
+        }),
+      )
     })
 
     it("applies search filter", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce([])
-        .mockResolvedValueOnce([{ count: BigInt(0) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue([])
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(0)
      vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
       await getAllOptimizationEvents({
@@ -156,31 +155,33 @@ describe("getAllOptimizationEvents", () => {
         search: "calc",
       })
 
-      // Check that search is included in the SQL
-      const sql = vi.mocked(prisma.$queryRawUnsafe).mock.calls[0][0] as string
-      expect(sql).toContain("oe.function_name ILIKE $1")
-      expect(sql).toContain("oe.file_path ILIKE $1")
-      expect(sql).toContain("r.full_name ILIKE $1")
-      // Check params include the search term
-      const params = vi.mocked(prisma.$queryRawUnsafe).mock.calls[0].slice(1)
-      expect(params[0]).toBe("%calc%")
+      const callArgs = vi.mocked(prisma.optimization_events.findMany).mock.calls[0][0] as any
+      const andClause = callArgs.where.AND
+      expect(andClause).toBeDefined()
+      expect(andClause.length).toBeGreaterThan(0)
+
+      // Search should include OR across function_name, file_path, repository.full_name
+      const orClause = andClause.find((c: any) => c.OR)?.OR
+      expect(orClause).toHaveLength(3)
+      expect(orClause[0]).toEqual({
+        function_name: { contains: "calc", mode: "insensitive" },
+      })
     })
 
     it("applies repository_id filter", async () => {
-      vi.mocked(prisma.$queryRawUnsafe)
-        .mockResolvedValueOnce([])
-        .mockResolvedValueOnce([{ count: BigInt(0) }])
+      vi.mocked(prisma.optimization_events.findMany).mockResolvedValue([])
+      vi.mocked(prisma.optimization_events.count).mockResolvedValue(0)
       vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
       await getAllOptimizationEvents({
         payload: mockPayload as any,
-        filter: { repository_id: mockRepoIds[0] },
+        filter: { repository_id: "repo-1" },
       })
 
-      // In the new UNION-based implementation, additional filters are NOT supported
-      // because they would require complex WHERE clause merging across UNION branches.
-      // This test now verifies the query runs without errors (which is a valid regression test).
-      expect(prisma.$queryRawUnsafe).toHaveBeenCalledTimes(2)
+      const callArgs = vi.mocked(prisma.optimization_events.findMany).mock.calls[0][0] as any
+      const andClause = callArgs.where.AND
+      expect(andClause).toBeDefined()
+      expect(andClause).toContainEqual({ repository_id: "repo-1" })
     })
   })
 
@@ -235,7 +236,7 @@ describe("getAllOptimizationEvents", () => {
         review_explanation: "Good",
         repo_full_name: "org/repo",
         repo_name: "repo",
-        repo_id: mockRepoIds[0],
+        repo_id: "repo-1",
       },
     ]
     vi.mocked(prisma.$queryRawUnsafe)
@@ -247,8 +248,8 @@ describe("getAllOptimizationEvents", () => {
       sort: { review_quality: "desc" },
     })
 
-    expect((result.events[0] as any).repository).toEqual({
-      id: mockRepoIds[0],
+    expect(result.events[0].repository).toEqual({
+      id: "repo-1",
       full_name: "org/repo",
       name: "repo",
    })
@@ -275,7 +276,7 @@ describe("getAllOptimizationEvents", () => {
       sort: { review_quality: "desc" },
     })
 
-    expect((result.events[0] as any).repository).toBeNull()
+    expect(result.events[0].repository).toBeNull()
   })
 
   it("includes LEFT JOIN in raw SQL queries", async () => {
@@ -300,10 +301,12 @@ describe("getAllOptimizationEvents", () => {
       repoIds: [],
       repos: [],
     } as any)
+    vi.mocked(prisma.optimization_events.findMany).mockResolvedValue([])
+    vi.mocked(prisma.optimization_events.count).mockResolvedValue(0)
     vi.mocked(prisma.optimization_features.findMany).mockResolvedValue([])
 
     const result = await getAllOptimizationEvents({ payload: mockPayload as any })
     expect(result.events).toEqual([])
     expect(result.totalCount).toBe(0)
   })
 })
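
The restored assertions above inspect a structured Prisma `where` object rather than matching raw SQL text. As a minimal sketch of the shape those tests expect — an `AND` list carrying an `OR` across three searchable fields plus an optional `repository_id` condition — a hypothetical helper (`buildSearchWhere` is illustrative, not the action's real implementation) could look like this:

```typescript
type Where = Record<string, unknown>

// Builds a Prisma-style where-clause matching the structure the tests assert on:
// base scoping by repoIds, plus AND-ed search and repository filters.
function buildSearchWhere(repoIds: string[], search?: string, repositoryId?: string): Where {
  const and: Where[] = []
  if (search) {
    and.push({
      OR: [
        { function_name: { contains: search, mode: "insensitive" } },
        { file_path: { contains: search, mode: "insensitive" } },
        { repository: { full_name: { contains: search, mode: "insensitive" } } },
      ],
    })
  }
  if (repositoryId) {
    // Scoped alongside the repoIds check, so a forged id outside the
    // account's repositories still matches nothing.
    and.push({ repository_id: repositoryId })
  }
  return { is_staging: true, repository_id: { in: repoIds }, AND: and }
}
```

Asserting on this object (`where.AND`, the `OR` arm's length, `toContainEqual`) survives query-builder refactors, whereas string-matching generated SQL broke as soon as the implementation switched methods.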

@@ -228,18 +228,9 @@ export function OptimizationsTable({
   const pageSize = 10
   const isInitialMount = useRef(true)
   const debounceTimer = useRef<NodeJS.Timeout>(undefined)
-  const [retryKey, setRetryKey] = useState(0)
 
-  // Load events when filters change (skip initial mount — server provided that data)
-  useEffect(() => {
-    if (isInitialMount.current) {
-      isInitialMount.current = false
-      return
-    }
-
-    const controller = new AbortController()
-
-    const doFetch = () => {
+  const loadEvents = useCallback(
+    (signal?: AbortSignal) => {
       setIsLoading(true)
       setError(null)
 
@@ -254,7 +245,7 @@ export function OptimizationsTable({
       })
       if (filters.repositoryId) params.set("repositoryId", filters.repositoryId)
 
-      fetch(`/api/optimization-events?${params}`, { signal: controller.signal })
+      fetch(`/api/optimization-events?${params}`, { signal })
         .then(res => {
           if (!res.ok) throw new Error(`HTTP ${res.status}`)
           return res.json()
@@ -284,18 +275,32 @@ export function OptimizationsTable({
           setError(err instanceof Error ? err.message : "Failed to load events")
         })
         .finally(() => {
-          if (!controller.signal.aborted) setIsLoading(false)
+          if (!signal?.aborted) setIsLoading(false)
         })
+    },
+    [filters, pageSize],
+  )
+
+  // Load events when filters change (skip initial mount — server provided that data)
+  useEffect(() => {
+    if (isInitialMount.current) {
+      isInitialMount.current = false
+      return
+    }
+
+    const controller = new AbortController()
 
     if (debounceTimer.current) {
       clearTimeout(debounceTimer.current)
     }
 
-    if (filters.search !== "") {
-      debounceTimer.current = setTimeout(doFetch, 300)
+    const hasSearchChanged = filters.search !== ""
+    if (hasSearchChanged) {
+      debounceTimer.current = setTimeout(() => {
+        loadEvents(controller.signal)
+      }, 300)
     } else {
-      doFetch()
+      loadEvents(controller.signal)
     }
 
     return () => {
@@ -304,19 +309,7 @@ export function OptimizationsTable({
         clearTimeout(debounceTimer.current)
       }
     }
-    // Flatten filter properties as deps to avoid object-reference churn
-    // eslint-disable-next-line react-hooks/exhaustive-deps
-  }, [
-    filters.page,
-    filters.search,
-    filters.status,
-    filters.eventType,
-    filters.reviewQuality,
-    filters.sortBy,
-    filters.repositoryId,
-    pageSize,
-    retryKey,
-  ])
+  }, [filters, loadEvents])
 
   const handleRowClick = useCallback(
     (traceId: string) => {
@@ -640,7 +633,7 @@ export function OptimizationsTable({
             <Button
               variant="outline"
               size="sm"
-              onClick={() => setRetryKey(k => k + 1)}
+              onClick={() => loadEvents()}
               className="mt-2"
               disabled={isLoading}
             >
|
|||
|
|
@@ -1,87 +1,40 @@
"use server"
import { cache } from "react"
import { getRepositoriesForAccountCached } from "@/lib/services/repository-utils"
import { withTiming } from "@/lib/server-action-timing"
import { AccountPayload, prisma } from "@codeflash-ai/common"
import { Prisma } from "@prisma/client"

// Cached implementation for getRepositoriesWithStagingEvents
// React cache() ensures this is only executed once per unique payload within a single request
const getRepositoriesWithStagingEventsImpl = cache(
  async (payload: AccountPayload): Promise<Array<{ id: string; full_name: string }>> => {
    const { repoIds, repos: allRepos } = await getRepositoriesForAccountCached(payload)

    if (repoIds.length === 0) {
      return []
    }

    // For org accounts, use simple IN clause. For personal accounts, use UNION
    // to avoid bitmap OR merge (each branch uses its own composite index independently).
    let repoIdsWithStagingEvents: Array<{ repository_id: string | null }>

    if ("orgId" in payload) {
      // Organization account: simple IN clause
      const groupByResult = await prisma.optimization_events.groupBy({
        by: ["repository_id"],
        where: {
          is_staging: true,
          repository_id: { in: repoIds, not: null },
        },
      })
      repoIdsWithStagingEvents = groupByResult.map((g: { repository_id: string | null }) => ({
        repository_id: g.repository_id,
      }))
    } else {
      // Personal account: UNION query for efficient index usage
      // Each branch can use its own composite index independently
      const result = await prisma.$queryRaw<Array<{ repository_id: string | null }>>`
        SELECT DISTINCT repository_id
        FROM (
          SELECT repository_id
          FROM optimization_events
          WHERE is_staging = true
            AND repository_id IN (${Prisma.join(repoIds)})
            AND repository_id IS NOT NULL
          UNION
          SELECT repository_id
          FROM optimization_events
          WHERE is_staging = true
            AND user_id = ${payload.userId}
            AND repository_id IS NOT NULL
          UNION
          SELECT repository_id
          FROM optimization_events
          WHERE is_staging = true
            AND current_username = ${payload.username}
            AND repository_id IS NOT NULL
        ) AS combined_events
      `

      repoIdsWithStagingEvents = (result as Array<{ repository_id: string | null }>).map(row => ({
        repository_id: row.repository_id,
      }))
    }

    // Filter and map repos that have staging events (O(1) Set lookup instead of O(n) .some)
    const stagingRepoSet = new Set(repoIdsWithStagingEvents.map(g => g.repository_id))
    return allRepos
      .filter(repo => stagingRepoSet.has(repo.id))
      .map(repo => ({
        id: repo.id,
        full_name: repo.full_name,
      }))
      .sort((a, b) => a.full_name.localeCompare(b.full_name))
  },
)
import { AccountPayload, buildOptimizationOrCondition, prisma } from "@codeflash-ai/common"

export const getRepositoriesWithStagingEvents = withTiming(
  "getRepositoriesWithStagingEvents",
  getRepositoriesWithStagingEventsImpl,
  async (payload: AccountPayload): Promise<Array<{ id: string; full_name: string }>> => {
    const { repoIds, repos: allRepos } = await getRepositoriesForAccountCached(payload)

    if (repoIds.length === 0) {
      return []
    }

    // Get distinct repository IDs that have staging events using groupBy (more efficient than findMany with distinct)
    const repoIdsWithStagingEvents = await prisma.optimization_events.groupBy({
      by: ["repository_id"],
      where: {
        is_staging: true,
        ...buildOptimizationOrCondition(payload, repoIds),
        repository_id: { not: null },
      },
    })

    // Filter and map repos that have staging events
    return allRepos
      .filter(repo => repoIdsWithStagingEvents.some(group => group.repository_id === repo.id))
      .map(repo => ({
        id: repo.id,
        full_name: repo.full_name,
      }))
      .sort((a, b) => a.full_name.localeCompare(b.full_name))
  },
)

// Cached implementation for getAllOptimizationEvents
// React cache() deduplicates calls with identical arguments within a single request
const getAllOptimizationEventsImpl = cache(
export const getAllOptimizationEvents = withTiming(
  "getAllOptimizationEvents",
  async ({
    payload,
    search,
@@ -97,68 +50,112 @@ const getAllOptimizationEventsImpl = cache(
    page?: number
    pageSize?: number
  }) => {
    const repoIds = (await getRepositoriesForAccountCached(payload)).repoIds
    const repoIds = (await getRepositoriesForAccountCached(payload)).repoIds

    if (repoIds.length === 0) {
      return { events: [], totalCount: 0 }
    const where: any = {
      is_staging: true,
      ...buildOptimizationOrCondition(payload, repoIds),
    }

    if (search) {
      where.AND = where.AND || []
      where.AND.push({
        OR: [
          {
            function_name: {
              contains: search,
              mode: "insensitive",
            },
          },
          {
            file_path: {
              contains: search,
              mode: "insensitive",
            },
          },
          {
            repository: {
              full_name: {
                contains: search,
                mode: "insensitive",
              },
            },
          },
        ],
      })
    }

    if (filter) {
      Object.keys(filter).forEach(key => {
        if (key === "repository_id") {
          where.AND = where.AND || []
          where.AND.push({ [key]: filter[key] })
        } else if (key !== "review_quality") {
          where[key] = filter[key]
        }
      })
    }

    const needsOptimizationFeaturesJoin =
      (sort && Object.keys(sort).some(k => k.toLowerCase() === "review_quality")) ||
      (filter && Object.keys(filter).some(k => k.toLowerCase() === "review_quality"))

    if (needsOptimizationFeaturesJoin) {
      const whereConditions = []
      const params: any[] = []
      let paramIndex = 1
      whereConditions.push(`oe.is_staging = true`)
      if ("orgId" in payload) {
        whereConditions.push(`oe.repository_id IN (${repoIds.map(id => `'${id}'`).join(",")})`)
      } else {
        whereConditions.push(
          `(
            oe.repository_id IN (${repoIds.map(id => `'${id}'`).join(",")})
            OR oe.user_id = '${payload.userId}'
            OR oe.current_username = '${payload.username}'
          )`,
        )
      }

    const needsOptimizationFeaturesJoin =
      (sort && Object.keys(sort).some(k => k.toLowerCase() === "review_quality")) ||
      (filter && Object.keys(filter).some(k => k.toLowerCase() === "review_quality"))

    if (needsOptimizationFeaturesJoin) {
      // Raw SQL path for review_quality sorting/filtering
      const whereFragments: Prisma.Sql[] = [Prisma.sql`oe.is_staging = true`]

      if ("orgId" in payload) {
        whereFragments.push(Prisma.sql`oe.repository_id IN (${Prisma.join(repoIds)})`)
      } else {
        // For personal accounts, use OR pattern in WHERE (raw SQL already, so bitmap merge is acceptable here
        // since it's joined with optimization_features anyway). The primary bottleneck was the groupBy,
        // which is now fixed above. This path is rarely hit (only when sorting by review_quality).
        whereFragments.push(
          Prisma.sql`(
            oe.repository_id IN (${Prisma.join(repoIds)})
            OR oe.user_id = ${payload.userId}
            OR oe.current_username = ${payload.username}
          )`,
        )
      // Add search conditions
      if (search) {
        whereConditions.push(
          `(oe.function_name ILIKE $${paramIndex} OR oe.file_path ILIKE $${paramIndex} OR r.full_name ILIKE $${paramIndex})`,
        )
        params.push(`%${search}%`)
        paramIndex += 1
      }
      // Add filter conditions
      if (filter) {
        if (filter.status) {
          whereConditions.push(`oe.status = $${paramIndex}`)
          params.push(filter.status)
          paramIndex += 1
        }

      // Add search conditions
      if (search) {
        const searchPattern = `%${search}%`
        whereFragments.push(
          Prisma.sql`(oe.function_name ILIKE ${searchPattern} OR oe.file_path ILIKE ${searchPattern} OR r.full_name ILIKE ${searchPattern})`,
        )
        if (filter.event_type) {
          whereConditions.push(`oe.event_type = $${paramIndex}`)
          params.push(filter.event_type)
          paramIndex += 1
        }
      // Add filter conditions
      if (filter) {
        if (filter.status) {
          whereFragments.push(Prisma.sql`oe.status = ${filter.status}`)
        }
        if (filter.event_type) {
          whereFragments.push(Prisma.sql`oe.event_type = ${filter.event_type}`)
        }
        if (filter.review_quality) {
          whereFragments.push(Prisma.sql`of.review_quality = ${filter.review_quality}`)
        }
        if (filter.repository_id !== undefined) {
          if (filter.repository_id === null) {
            whereFragments.push(Prisma.sql`oe.repository_id IS NULL`)
          } else if (filter.repository_id.not !== undefined && filter.repository_id.not === null) {
            whereFragments.push(Prisma.sql`oe.repository_id IS NOT NULL`)
          }
        if (filter.review_quality) {
          whereConditions.push(`of.review_quality = $${paramIndex}`)
          params.push(filter.review_quality)
          paramIndex += 1
        }
        if (filter.repository_id !== undefined) {
          if (filter.repository_id === null) {
            whereConditions.push(`oe.repository_id IS NULL`)
          } else if (filter.repository_id.not !== undefined && filter.repository_id.not === null) {
            whereConditions.push(`oe.repository_id IS NOT NULL`)
          }
        }
      }
      const whereClause = Prisma.join(whereFragments, " AND ")
      const orderByClauses: Prisma.Sql[] = []
      if (sort && Object.keys(sort).length > 0) {
        Object.entries(sort).forEach(([key, direction]) => {
          const dir = direction.toUpperCase() === "ASC" ? Prisma.sql`ASC` : Prisma.sql`DESC`
          if (key.toLowerCase() === "review_quality") {
            orderByClauses.push(Prisma.sql`
      }
      const whereClause = whereConditions.join(" AND ")
      const orderByClauses: string[] = []
      if (sort && Object.keys(sort).length > 0) {
        Object.entries(sort).forEach(([key, direction]) => {
          const dir = direction.toUpperCase()
          if (key.toLowerCase() === "review_quality") {
            orderByClauses.push(`
              CASE
                WHEN LOWER(of.review_quality) = 'high' THEN 3
                WHEN LOWER(of.review_quality) = 'medium' THEN 2
@@ -166,20 +163,18 @@ const getAllOptimizationEventsImpl = cache(
                ELSE 0
              END ${dir}
            `)
          } else {
            const col = key === "created_at" ? Prisma.sql`oe.created_at` : Prisma.raw(`oe.${key}`)
            orderByClauses.push(Prisma.sql`${col} ${dir}`)
          }
        })
      }
      if (!sort) {
        orderByClauses.push(Prisma.sql`oe.created_at DESC`)
      }
      const orderByClause = Prisma.join(orderByClauses, ", ")
      const paginationLimit = pageSize
      const paginationOffset = (page - 1) * pageSize
      const [events, countResult] = await Promise.all([
        prisma.$queryRaw<any[]>`
          } else {
            orderByClauses.push(`oe.${key} ${dir}`)
          }
        })
      }
      if (!sort) {
        orderByClauses.push("oe.created_at DESC")
      }
      const orderByClause = orderByClauses.join(", ")
      const [events, countResult] = await Promise.all([
        prisma.$queryRawUnsafe<any[]>(
          `
          SELECT
            oe.*,
            of.review_quality,
@@ -192,231 +187,69 @@ const getAllOptimizationEventsImpl = cache(
          LEFT JOIN repositories r ON oe.repository_id = r.id
          WHERE ${whereClause}
          ORDER BY ${orderByClause}
          LIMIT ${paginationLimit} OFFSET ${paginationOffset}
          LIMIT $${paramIndex} OFFSET $${paramIndex + 1}
        `,
        prisma.$queryRaw<[{ count: bigint }]>`
          ...params,
          pageSize,
          (page - 1) * pageSize,
        ),
        prisma.$queryRawUnsafe<[{ count: bigint }]>(
          `
          SELECT COUNT(*) as count
          FROM optimization_events oe
          LEFT JOIN optimization_features of ON oe.trace_id = of.trace_id
          LEFT JOIN repositories r ON oe.repository_id = r.id
          WHERE ${whereClause}
        `,
      ])
      const totalCount = Number(countResult[0].count)
      // Repository data is already included from the JOIN
      const eventsWithRepo = events.map(
        (
          event: Record<string, unknown> & {
            repo_id?: string
            repo_full_name?: string
            repo_name?: string
          },
        ) => ({
          ...event,
          repository: event.repo_id
            ? { id: event.repo_id, full_name: event.repo_full_name, name: event.repo_name }
            : null,
        }),
      )
      return { events: eventsWithRepo, totalCount }
    } else {
      // Standard Prisma query with native orderBy (optimized with UNION for personal accounts)
      const orderBy = sort || { created_at: "desc" as const }
          ...params,
        ),
      ])
      const totalCount = Number(countResult[0].count)
      // Repository data is already included from the JOIN
      const eventsWithRepo = events.map(event => ({
        ...event,
        repository: event.repo_id ? { id: event.repo_id, full_name: event.repo_full_name, name: event.repo_name } : null,
      }))
      return { events: eventsWithRepo, totalCount }
    } else {
      // Standard Prisma query with native orderBy
      const orderBy = sort || { created_at: "desc" }

      let events
      let totalCount

      if ("orgId" in payload) {
        // Organization account: simple IN clause
        const where = {
          is_staging: true,
          repository_id: { in: repoIds },
        } as any

        if (search) {
          where.AND = where.AND || []
          where.AND.push({
            OR: [
              {
                function_name: {
                  contains: search,
                  mode: "insensitive" as const,
                },
              },
              {
                file_path: {
                  contains: search,
                  mode: "insensitive" as const,
                },
              },
              {
                repository: {
                  full_name: {
                    contains: search,
                    mode: "insensitive" as const,
                  },
                },
              },
            ],
          })
        }

        if (filter) {
          Object.keys(filter).forEach(key => {
            if (key === "repository_id") {
              where.AND = where.AND || []
              where.AND.push({ [key]: filter[key] })
            } else if (key !== "review_quality") {
              where[key] = filter[key]
            }
          })
        }

        ;[events, totalCount] = await Promise.all([
          prisma.optimization_events.findMany({
            where,
            orderBy,
            skip: (page - 1) * pageSize,
            take: pageSize,
            include: {
              repository: {
                select: { id: true, full_name: true, name: true },
              },
            },
          }),
          prisma.optimization_events.count({ where }),
        ])
      } else {
        // Personal account: Use raw SQL with UNION for efficient index seeks
        let searchCondition = Prisma.empty
        if (search) {
          const searchPattern = `%${search}%`
          searchCondition = Prisma.sql`AND (oe.function_name ILIKE ${searchPattern} OR oe.file_path ILIKE ${searchPattern} OR r.full_name ILIKE ${searchPattern})`
        }

        const filterFragments: Prisma.Sql[] = []
        if (filter) {
          Object.entries(filter).forEach(([key, value]) => {
            if (key === "status") {
              filterFragments.push(Prisma.sql`AND oe.status = ${value}`)
            } else if (key === "event_type") {
              filterFragments.push(Prisma.sql`AND oe.event_type = ${value}`)
            } else if (key === "repository_id") {
              if (value === null) {
                filterFragments.push(Prisma.sql`AND oe.repository_id IS NULL`)
              } else if (value?.not === null) {
                filterFragments.push(Prisma.sql`AND oe.repository_id IS NOT NULL`)
              }
            }
          })
        }
        const filterConditions =
          filterFragments.length > 0 ? Prisma.join(filterFragments, " ") : Prisma.empty

        const orderByDir =
          typeof orderBy === "object" && orderBy.created_at === "asc"
            ? Prisma.sql`ASC`
            : Prisma.sql`DESC`

        const paginationLimit = pageSize
        const paginationOffset = (page - 1) * pageSize

        const unionSubquery = Prisma.sql`
          SELECT id FROM (
            SELECT id FROM optimization_events
            WHERE is_staging = true
              AND repository_id IN (${Prisma.join(repoIds)})
            UNION
            SELECT id FROM optimization_events
            WHERE is_staging = true AND user_id = ${payload.userId}
            UNION
            SELECT id FROM optimization_events
            WHERE is_staging = true AND current_username = ${payload.username}
          ) AS combined_ids
        `

        const [eventsResult, countResult] = await Promise.all([
          prisma.$queryRaw<any[]>`
            WITH base_events AS (
              SELECT oe.*, r.id as repo_id, r.full_name as repo_full_name, r.name as repo_name
              FROM optimization_events oe
              LEFT JOIN repositories r ON oe.repository_id = r.id
              WHERE oe.id IN (${unionSubquery})
                ${searchCondition}
                ${filterConditions}
              ORDER BY oe.created_at ${orderByDir}
              LIMIT ${paginationLimit} OFFSET ${paginationOffset}
            )
            SELECT * FROM base_events
          `,
          prisma.$queryRaw<[{ count: bigint }]>`
            SELECT COUNT(*) as count
            FROM optimization_events oe
            LEFT JOIN repositories r ON oe.repository_id = r.id
            WHERE oe.id IN (${unionSubquery})
              ${searchCondition}
              ${filterConditions}
          `,
        ])

        totalCount = Number(countResult[0].count)
        events = eventsResult.map(
          (
            event: Record<string, unknown> & {
              repo_id?: string
              repo_full_name?: string
              repo_name?: string
            },
          ) => ({
            ...event,
            repository: event.repo_id
              ? { id: event.repo_id, full_name: event.repo_full_name, name: event.repo_name }
              : null,
          }),
        )
      }

      // Batch-fetch review data for all events in a single query
      const traceIds = (events as Array<Record<string, unknown>>).map(
        (e: Record<string, unknown>) => e.trace_id as string,
      )
      const features =
        traceIds.length > 0
          ? await prisma.optimization_features.findMany({
              where: { trace_id: { in: traceIds } },
              select: {
                trace_id: true,
                review_quality: true,
                review_explanation: true,
              },
            })
          : []
      type ReviewFeature = {
        trace_id: string
        review_quality: string | null
        review_explanation: string | null
      }
      const featuresMap = new Map<string, ReviewFeature>(
        (features as ReviewFeature[]).map((f: ReviewFeature) => [f.trace_id, f]),
      )

      const eventsWithReviewData = (events as Array<Record<string, unknown>>).map(
        (event: Record<string, unknown>) => {
          const f = featuresMap.get(event.trace_id as string)
          return {
            ...event,
            review_quality: f?.review_quality || null,
            review_explanation: f?.review_explanation || null,
          }
      const [events, totalCount] = await Promise.all([
        prisma.optimization_events.findMany({
          where,
          orderBy,
          skip: (page - 1) * pageSize,
          take: pageSize,
          include: {
            repository: true,
          },
        )
        }),
        prisma.optimization_events.count({ where }),
      ])

      return { events: eventsWithReviewData, totalCount }
    }
      // Batch-fetch review data for all events in a single query
      const traceIds = events.map(e => e.trace_id)
      const features = await prisma.optimization_features.findMany({
        where: { trace_id: { in: traceIds } },
        select: {
          trace_id: true,
          review_quality: true,
          review_explanation: true,
        },
      })
      const featuresMap = new Map(features.map(f => [f.trace_id, f]))

      const eventsWithReviewData = events.map(event => {
        const f = featuresMap.get(event.trace_id)
        return {
          ...event,
          review_quality: f?.review_quality || null,
          review_explanation: f?.review_explanation || null,
        }
      })

      return { events: eventsWithReviewData, totalCount }
    }
  },
)

export const getAllOptimizationEvents = withTiming(
  "getAllOptimizationEvents",
  getAllOptimizationEventsImpl,
)
@@ -1,19 +0,0 @@
import { cacheLife, cacheTag } from "next/cache"
import type { AccountPayload } from "@codeflash-ai/common"
import { getAllOptimizationEvents, getRepositoriesWithStagingEvents } from "./action"

export async function getCachedInitialEvents(accountKey: string, payload: AccountPayload) {
  "use cache"
  cacheLife("frequent")
  cacheTag(`review-events:${accountKey}`)

  return getAllOptimizationEvents({ payload, page: 1, pageSize: 10 })
}

export async function getCachedRepositories(accountKey: string, payload: AccountPayload) {
  "use cache"
  cacheLife("frequent")
  cacheTag(`review-repos:${accountKey}`)

  return getRepositoriesWithStagingEvents(payload)
}
@@ -1,5 +1,5 @@
import { getAccountContext } from "@/lib/server/get-account-context"
import { getCachedInitialEvents, getCachedRepositories } from "./cached-data"
import { getAllOptimizationEvents, getRepositoriesWithStagingEvents } from "./action"
import { OptimizationsTable } from "./_components/OptimizationsTable"

export default async function ReviewOptimizationsPage() {

@@ -7,8 +7,12 @@ export default async function ReviewOptimizationsPage() {
  const accountKey = "orgId" in accountPayload ? accountPayload.orgId : accountPayload.userId

  const [initialData, availableRepositories] = await Promise.all([
    getCachedInitialEvents(accountKey, accountPayload),
    getCachedRepositories(accountKey, accountPayload),
    getAllOptimizationEvents({
      payload: accountPayload,
      page: 1,
      pageSize: 10,
    }),
    getRepositoriesWithStagingEvents(accountPayload),
  ])

  const initialEvents = (initialData?.events || []).map((event: any) => ({
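One theme of the reverted diff is the difference between `$queryRawUnsafe` with string-built SQL and `Prisma.sql` tagged templates, which keep interpolated values out of the query text. The following is a minimal, hypothetical re-implementation of that tagged-template idea to illustrate the mechanism; it is not Prisma's actual code, and the table and column names are illustrative only:

```typescript
// Hypothetical sketch of a SQL tagged template: literal parts become query
// text, interpolated values become numbered placeholders carried out-of-band
// as bind parameters, so user input never lands inside the SQL string.
function sql(
  strings: TemplateStringsArray,
  ...values: unknown[]
): { text: string; params: unknown[] } {
  let text = strings[0]
  const params: unknown[] = []
  values.forEach((value, i) => {
    params.push(value) // value travels separately as a bind parameter
    text += `$${params.length}` + strings[i + 1] // placeholder replaces the interpolation
  })
  return { text, params }
}

const search = "%foo%"
const q = sql`SELECT * FROM optimization_events WHERE function_name ILIKE ${search}`
console.log(q.text) // SELECT * FROM optimization_events WHERE function_name ILIKE $1
console.log(q.params) // [ "%foo%" ]
```

This is why the `Prisma.sql` fragments in the diff can safely embed `payload.userId` and search patterns, whereas the string-concatenation variant (`'${payload.userId}'`) places the raw value directly in the query text.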