Increase test data size for TS findDuplicates benchmark

The js-ts-class E2E test was flaky because n=100 is too small for
the O(n²)→O(n) optimization to overcome Map/Set per-operation overhead.
At n=100, the LLM correctly generates a Map-based O(n) solution but it
benchmarks as slower (-10.6%) due to constant factor dominance.

Bump to n=10,000 so the algorithmic improvement produces measurable
speedup, making the 30% E2E threshold reliably achievable.
Kevin Turcios 2026-04-09 19:18:50 -05:00
parent 477dfa246e
commit a73ccca426

@@ -24,7 +24,7 @@ describe('DataProcessor', () => {
   test('handles larger arrays with duplicates', () => {
     const data: number[] = [];
-    for (let i = 0; i < 100; i++) {
+    for (let i = 0; i < 10000; i++) {
       data.push(i % 20);
     }
     const processor = new DataProcessor(data);
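The two strategies the message contrasts can be sketched as follows. This is a hypothetical illustration, not the repo's actual `DataProcessor` code; the function names are invented for the example.

```typescript
// Hypothetical O(n^2) baseline: nested scan. Low per-iteration overhead,
// so it can win at small n despite the worse asymptotic complexity.
function findDuplicatesQuadratic(data: number[]): number[] {
  const result: number[] = [];
  for (let i = 0; i < data.length; i++) {
    if (result.includes(data[i])) continue; // already recorded as duplicate
    for (let j = i + 1; j < data.length; j++) {
      if (data[i] === data[j]) {
        result.push(data[i]);
        break;
      }
    }
  }
  return result;
}

// Hypothetical O(n) version: single pass with Sets. Hashing and allocation
// add constant per-operation cost, which dominates at n=100 but is
// amortized away at n=10,000.
function findDuplicatesLinear(data: number[]): number[] {
  const seen = new Set<number>();
  const dups = new Set<number>();
  for (const x of data) {
    if (seen.has(x)) {
      dups.add(x);
    } else {
      seen.add(x);
    }
  }
  return [...dups];
}
```

With the bumped test data (`i % 20` over 10,000 elements), both return the 20 distinct values, but only at that scale does the linear version's speedup reliably clear the benchmark threshold.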