23 examples

Stack overflow error

Call stack exceeds limit, causing crashes or halts.

[ FAQ1 ]

What is a stack overflow error?

A stack overflow error occurs when a program exhausts the available space in the call stack, usually resulting from excessively deep or infinite recursion, or occasionally from improperly managed function calls. Each function call adds a new stack frame to the call stack, which stores local variables and execution context. If calls become too deeply nested without returning, the stack limit is breached, causing the program to crash or terminate abruptly. Common symptoms include sudden crashes, segmentation faults, or explicit stack overflow errors reported by the runtime environment.
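As a minimal TypeScript illustration (not taken from the examples below), a recursive function with no base case keeps adding frames until the engine's stack limit is hit:

// No base case: every call pushes a new stack frame until the limit is reached.
function countDown(n: number): number {
  return countDown(n - 1); // never returns, so frames keep accumulating
}

try {
  countDown(100_000);
} catch (e) {
  // V8-based runtimes report this as
  // RangeError: Maximum call stack size exceeded
  console.error(e);
}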
[ FAQ2 ]

How to fix a stack overflow error

To fix a stack overflow error, carefully analyze and manage recursive function calls, ensuring they have proper base cases and termination conditions. Limit recursion depth explicitly, or refactor recursive logic into iterative loops, which keep stack usage constant. Use debugging and profiling tools to identify problematic recursion or call-stack-intensive functions, and consider increasing the stack size only if absolutely necessary. Regular code reviews and testing can catch unbounded recursion or excessively deep call chains early, preventing stack overflow errors proactively.
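As a sketch of the recursive-to-iterative refactor described above, using a hypothetical sumTo helper rather than code from the examples below:

// Recursive version: stack depth grows with n and overflows for large inputs.
function sumToRecursive(n: number): number {
  if (n <= 0) return 0; // base case
  return n + sumToRecursive(n - 1);
}

// Iterative version: stack depth stays constant no matter how large n is.
function sumToIterative(n: number): number {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

console.log(sumToIterative(1_000_000)); // fine
// sumToRecursive(1_000_000) would exhaust the call stack instead.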
diff block
+import {
+ BusterChatMessageReasoning,
+ BusterChatMessageReasoning_thought
+} from '@/api/asset_interfaces';
+import { useMemoizedFn } from 'ahooks';
+import sample from 'lodash/sample';
+import last from 'lodash/last';
+import { useBusterChatContextSelector } from '../ChatProvider';
+import { timeout } from '@/utils';
+import random from 'lodash/random';
+
+export const useAutoAppendThought = () => {
+ const onUpdateChatMessage = useBusterChatContextSelector((x) => x.onUpdateChatMessage);
+ const getChatMessagesMemoized = useBusterChatContextSelector((x) => x.getChatMessagesMemoized);
+
+ const removeAutoThoughts = useMemoizedFn(
+ (reasoningMessages: BusterChatMessageReasoning[]): BusterChatMessageReasoning[] => {
+ return reasoningMessages.filter((rm) => rm.id !== AUTO_THOUGHT_ID);
+ }
+ );
+
+ const autoAppendThought = useMemoizedFn(
+ (
+ reasoningMessages: BusterChatMessageReasoning[],
+ chatId: string
+ ): BusterChatMessageReasoning[] => {
+ const lastReasoningMessage = reasoningMessages[reasoningMessages.length - 1];
+ const lastMessageIsCompleted =
+ !lastReasoningMessage || lastReasoningMessage?.status === 'completed';
+
+ if (lastMessageIsCompleted) {
+ _loopAutoThought(chatId);
+
+ return [...reasoningMessages, createAutoThought()];
+ }
+
+ return removeAutoThoughts(reasoningMessages);
+ }
+ );
+
+ const _loopAutoThought = useMemoizedFn(async (chatId: string) => {
+ const randomDelay = random(3000, 5000);
+ await timeout(randomDelay);
+ const chatMessages = getChatMessagesMemoized(chatId);
+ const lastMessage = last(chatMessages);
+ const isCompletedStream = !!lastMessage?.isCompletedStream;
+ const lastReasoningMessage = last(lastMessage?.reasoning);
+ const lastReasoningMessageIsAutoAppended =
+ !lastReasoningMessage || lastReasoningMessage?.id === AUTO_THOUGHT_ID;
+
+ if (!isCompletedStream && lastReasoningMessageIsAutoAppended && lastMessage) {
+ const lastMessageId = lastMessage?.id!;
+ const lastReasoningMessageIndex = lastMessage?.reasoning.length - 1;
+ const updatedReasoning = lastMessage?.reasoning.slice(0, lastReasoningMessageIndex);
+ const newReasoningMessages = [...updatedReasoning, createAutoThought()];
+
+ onUpdateChatMessage({
+ id: lastMessageId,
+ reasoning: newReasoningMessages,
+ isCompletedStream: false
+ });
+
+ _loopAutoThought(chatId);
+ }
Greptile
greptile
recursive call to _loopAutoThought could potentially cause memory issues if stream never completes
suggested fix
if (!isCompletedStream && lastReasoningMessageIsAutoAppended && lastMessage) {
const lastMessageId = lastMessage?.id!;
const lastReasoningMessageIndex = lastMessage?.reasoning.length - 1;
const updatedReasoning = lastMessage?.reasoning.slice(0, lastReasoningMessageIndex);
const newReasoningMessages = [...updatedReasoning, createAutoThought()];
onUpdateChatMessage({
id: lastMessageId,
reasoning: newReasoningMessages,
isCompletedStream: false
});
+ // Use setTimeout instead of recursive call to prevent stack overflow
+ setTimeout(() => _loopAutoThought(chatId), 0);
}
diff block
+import { useSettingsPermissionMap } from '@/settings/roles/hooks/useSettingsPermissionMap';
+import { NavigationDrawerSection } from '@/ui/navigation/navigation-drawer/components/NavigationDrawerSection';
+import { NavigationDrawerSectionTitle } from '@/ui/navigation/navigation-drawer/components/NavigationDrawerSectionTitle';
+import { useFeatureFlagsMap } from '@/workspace/hooks/useFeatureFlagsMap';
+import { Children, ReactNode, isValidElement } from 'react';
+import { isDefined } from 'twenty-shared';
+import { FeatureFlagKey } from '~/generated/graphql';
+import {
+ SettingsNavigationItemWrapper,
+ SettingsNavigationItemWrapperProps,
+} from './SettingsNavigationItemWrapper';
+
+type SettingsNavigationSectionWrapperProps = {
+ title: string;
+ children: ReactNode;
+};
+
+export const SettingsNavigationSectionWrapper = ({
+ title,
+ children,
+}: SettingsNavigationSectionWrapperProps) => {
+ const settingsPermissionMap = useSettingsPermissionMap();
+ const featureFlagsMap = useFeatureFlagsMap();
+
+ const hasVisibleChildren = (children: ReactNode): boolean => {
+ return Children.toArray(children).some((child) => {
+ if (!isValidElement(child)) {
+ return false;
+ }
+
+ if (child.type === SettingsNavigationItemWrapper) {
+ const { requiredFeatureFlag, feature } =
+ child.props as SettingsNavigationItemWrapperProps;
+
+ const hasPermissionEnabled =
+ featureFlagsMap[FeatureFlagKey.IsPermissionsEnabled];
+ const requiredFeatureFlagEnabled =
+ requiredFeatureFlag && featureFlagsMap[requiredFeatureFlag];
+
+ if (!hasPermissionEnabled) {
+ return true;
+ }
+
+ if (!requiredFeatureFlagEnabled) {
+ return false;
+ }
+
+ if (isDefined(feature)) {
+ return settingsPermissionMap[feature];
+ }
+
+ return true;
+ }
+
+ if (isDefined(child.props?.children)) {
+ return hasVisibleChildren(child.props.children);
+ }
Greptile
greptile
style: Recursive call could cause stack overflow with deeply nested children. Consider adding depth limit
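One standalone sketch of the depth-limit pattern the reviewer suggests; the MAX_DEPTH value and helper name are illustrative, not part of the original component:

import { Children, isValidElement, type ReactNode } from 'react';

const MAX_DEPTH = 20; // illustrative cap

// Walks nested children but refuses to descend past MAX_DEPTH levels.
const hasAnyVisibleDescendant = (children: ReactNode, depth = 0): boolean => {
  if (depth > MAX_DEPTH) {
    return false; // treat pathologically deep trees as having nothing visible
  }
  return Children.toArray(children).some((child) => {
    if (!isValidElement(child)) {
      return false;
    }
    const nested = (child.props as { children?: ReactNode }).children;
    if (nested !== undefined) {
      return hasAnyVisibleDescendant(nested, depth + 1);
    }
    return true;
  });
};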
diff block
import { global } from '@storybook/global';
+import type { AxeResults } from 'axe-core';
+
import { EVENTS } from './constants';
import type { A11yParameters } from './params';
const { document } = global;
const channel = addons.getChannel();
-// Holds axe core running state
-let active = false;
-// Holds latest story we requested a run
-let activeStoryId: string | undefined;
const defaultParameters = { config: {}, options: {} };
-/** Handle A11yContext events. Because the event are sent without manual check, we split calls */
-const handleRequest = async (storyId: string, input: A11yParameters | null) => {
- if (!input?.manual) {
- await run(storyId, input ?? defaultParameters);
+const disabledRules = [
+ // In component testing, landmarks are not always present
+ // and the rule check can cause false positives
+ 'region',
+];
+
+// A simple queue to run axe-core in sequence
+// This is necessary because axe-core is not designed to run in parallel
+const queue: (() => Promise<void>)[] = [];
+let isRunning = false;
+
+const runNext = async () => {
+ if (queue.length === 0) {
+ isRunning = false;
+ return;
+ }
+
+ isRunning = true;
+ const next = queue.shift();
+ if (next) {
+ await next();
}
+ runNext();
Greptile
greptile
logic: runNext is called recursively without any error handling, which could cause a stack overflow if there are many queued tasks
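One possible shape of an iterative drain loop with per-task error handling, assuming each queued task is a plain async thunk as above:

type Task = () => Promise<void>;

const queue: Task[] = [];
let isRunning = false;

// Drains the queue in a loop rather than re-entrant calls, and keeps one
// failing task from stopping the rest.
const drainQueue = async (): Promise<void> => {
  if (isRunning) return;
  isRunning = true;
  try {
    while (queue.length > 0) {
      const next = queue.shift();
      if (!next) continue;
      try {
        await next();
      } catch (error) {
        console.error('Queued axe run failed:', error);
      }
    }
  } finally {
    isRunning = false;
  }
};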
diff block
+import { ChartData } from "./useChart";
+import { LineChart, Line, XAxis, YAxis } from "recharts";
+import ReactDOMServer from "react-dom/server";
+
+const getChartDataUrl = async (chart: ChartData) => {
+ const isUp = chart[0].price < chart[chart.length - 1].price;
+ const lineColor = isUp ? "#63fe7d" : "#FE6364";
+
+ const formattedChart = chart.map((c) => ({
+ time: new Date(c.timestamp).toLocaleTimeString(),
+ price: c.price,
+ }));
+
+ const minPrice = Math.min(...chart.map((c) => c.price));
+ const maxPrice = Math.max(...chart.map((c) => c.price));
Greptile
greptile
style: Math.min/max with spread operator on large arrays could cause stack overflow. Consider using reduce() instead
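A minimal sketch of the single-pass alternative the reviewer mentions, assuming chart entries expose a numeric price field as above:

interface ChartPoint {
  price: number;
}

// Spreading a huge array into Math.min/Math.max passes every element as a
// separate argument, which can exceed the engine's argument/stack limit.
// A single reduce pass avoids that.
const getPriceRange = (chart: ChartPoint[]) =>
  chart.reduce(
    (range, point) => ({
      min: Math.min(range.min, point.price),
      max: Math.max(range.max, point.price),
    }),
    { min: Infinity, max: -Infinity }
  );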
diff block
public async storeSessionConsoleLogs(logs: ConsoleLogEntry[]): Promise<void> {
if (logs.length === 0 || !this.topic) {
- return
+ return Promise.resolve()
}
- await this.producer.queueMessages({
- topic: this.topic,
- messages: logs.map((log) => ({
- value: JSON.stringify(log),
- key: log.log_source_id, // Using session_id as the key for partitioning
- })),
- })
+ if (this.pendingPromises.length >= this.promiseLimit) {
+ this.syncPromise = this.sync()
+ }
+
+ if (this.syncPromise) {
+ await this.syncPromise
+ return this.storeSessionConsoleLogs(logs)
+ }
+
+ this.pendingPromises.push(
+ this.producer.queueMessages({
+ topic: this.topic,
+ messages: logs.map((log) => ({
+ value: JSON.stringify(log),
+ key: log.log_source_id,
+ })),
+ })
+ )
this.consoleLogsCount += logs.length
logger.debug(`stored ${logs.length} console logs for session ${logs[0].log_source_id}`)
SessionBatchMetrics.incrementConsoleLogsStored(logs.length)
+ return Promise.resolve()
}
public async flush(): Promise<void> {
+ if (this.syncPromise) {
+ await this.syncPromise
+ return this.flush()
+ } else {
Greptile
greptile
style: Recursive flush call could lead to stack overflow with many pending syncs. Consider using iteration instead.
suggested fix
+ while (this.syncPromise) {
await this.syncPromise
}
+ await this.sync()
diff block
export const GlobalErrorContext = React.createContext<{
isModalOpen: boolean;
setModalOpen: (isOpen: boolean) => void;
- error?: string;
}>({
isModalOpen: false,
setModalOpen: () => {},
- error: undefined,
});
interface GlobalErrorModalProps {
onRerun: () => void;
+ storeState: StoreState;
}
-export function GlobalErrorModal({ onRerun }: GlobalErrorModalProps) {
+function ErrorCause({ error }: { error: ErrorLike }) {
+ if (!error) {
+ return null;
+ }
+
+ return (
+ <div>
+ <h4>
+ Caused by: {error.name || 'Error'}: {error.message}
+ </h4>
+ {error.stack && <pre>{error.stack}</pre>}
+ {error.cause && <ErrorCause error={error.cause} />}
+ </div>
+ );
+}
Greptile
greptile
style: Recursive error cause handling could potentially cause stack overflow with deeply nested error causes
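One possible way to bound the recursion, sketched with depth and maxDepth props that the original component does not have:

import React from 'react';

interface ErrorLike {
  name?: string;
  message: string;
  stack?: string;
  cause?: ErrorLike;
}

// Renders the cause chain but stops after maxDepth levels, so a pathological
// chain cannot exhaust the call stack.
function BoundedErrorCause({
  error,
  depth = 0,
  maxDepth = 10,
}: {
  error?: ErrorLike;
  depth?: number;
  maxDepth?: number;
}) {
  if (!error || depth >= maxDepth) {
    return null;
  }
  return (
    <div>
      <h4>
        Caused by: {error.name || 'Error'}: {error.message}
      </h4>
      {error.stack && <pre>{error.stack}</pre>}
      {error.cause && (
        <BoundedErrorCause error={error.cause} depth={depth + 1} maxDepth={maxDepth} />
      )}
    </div>
  );
}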
diff block
return channel;
}
- const execute =
- (eventName: string) =>
- (...args: any[]) => {
- if (args[0]?.providerId === TEST_PROVIDER_ID) {
- runTestRunner(channel, eventName, args);
+ const fsCache = createFileSystemCache({
+ basePath: resolvePathInStorybookCache(ADDON_ID.replace('/', '-')),
+ ns: 'storybook',
+ ttl: 14 * 24 * 60 * 60 * 1000, // 14 days
+ });
+ const cachedState: CachedState = await fsCache.get<CachedState>('state', {
+ config: storeOptions.initialState.config,
+ watching: storeOptions.initialState.watching,
+ });
+
+ const store = experimental_UniversalStore.create<StoreState, StoreEvent>({
+ ...storeOptions,
+ initialState: {
+ ...storeOptions.initialState,
+ ...cachedState,
+ },
+ leader: true,
+ });
+ store.onStateChange((state, previousState) => {
+ const selectCachedState = (s: StoreState): CachedState => ({
+ config: s.config,
+ watching: s.watching,
+ });
+ if (!isEqual(selectCachedState(state), selectCachedState(previousState))) {
+ fsCache.set('state', selectCachedState(state));
+ }
+ });
+ if (cachedState.watching) {
+ runTestRunner(channel, store);
+ }
+ const testProviderStore = experimental_getTestProviderStore(ADDON_ID);
+
+ store.subscribe('TRIGGER_RUN', (event, eventInfo) => {
+ testProviderStore.setState('test-provider-state:running');
+ store.setState((s) => ({
+ ...s,
+ fatalError: undefined,
+ }));
+ runTestRunner(channel, store, STORE_CHANNEL_EVENT_NAME, [{ event, eventInfo }]);
+ });
+ store.subscribe('TOGGLE_WATCHING', (event, eventInfo) => {
+ store.setState((s) => ({
+ ...s,
+ watching: event.payload.to,
+ currentRun: {
+ ...s.currentRun,
+ // when enabling watch mode, clear the coverage summary too
+ ...(event.payload.to && {
+ coverageSummary: undefined,
+ }),
+ },
+ }));
+ if (event.payload.to) {
+ runTestRunner(channel, store, STORE_CHANNEL_EVENT_NAME, [{ event, eventInfo }]);
+ }
+ });
+ store.subscribe('FATAL_ERROR', (event) => {
+ const { message, error } = event.payload;
+ const name = error.name || 'Error';
+ log(`${name}: ${message}`);
+ if (error.stack) {
+ log(error.stack);
+ }
+
+ function logErrorWithCauses(err: ErrorLike) {
+ if (!err) {
+ return;
}
- };
- channel.on(TESTING_MODULE_RUN_REQUEST, execute(TESTING_MODULE_RUN_REQUEST));
+ log(`Caused by: ${err.name ?? 'Error'}: ${err.message}`);
- store.onStateChange((state) => {
- if (state.watching) {
- runTestRunner(channel);
+ if (err.stack) {
+ log(err.stack);
+ }
+
+ if (err.cause) {
+ logErrorWithCauses(err.cause);
+ }
+ }
Greptile
greptile
logic: Recursive error cause logging could potentially cause stack overflow with deeply nested error causes
suggested fix
+ function logErrorWithCauses(err: ErrorLike, depth = 0) {
+ if (!err || depth > 10) { // Limit recursion depth
 return;
 }
 log(`Caused by: ${err.name ?? 'Error'}: ${err.message}`);
 if (err.stack) {
 log(err.stack);
 }
 if (err.cause) {
+ logErrorWithCauses(err.cause, depth + 1);
 }
 }
diff block
return false;
}
+ // Use the Web Animations API to wait for any animations and transitions to finish
+ private async waitForAnimations(signal: AbortSignal) {
+ let timedOut = false;
+
+ await Promise.race([
+ // After 50ms, retrieve any running animations and wait for them to finish
+ // If new animations are created while waiting, we'll wait for them too
+ new Promise((resolve) => {
+ setTimeout(() => {
+ const animationRoots = [global.document, ...getShadowRoots()];
+ const checkAnimationsFinished = async () => {
+ if (this.checkIfAborted(signal) || timedOut) {
+ return;
+ }
+ const runningAnimations = animationRoots
+ .flatMap((el) => el?.getAnimations() || [])
+ .filter((a) => a.playState === 'running');
+ if (runningAnimations.length > 0) {
+ await Promise.all(runningAnimations.map((a) => a.finished));
+ await checkAnimationsFinished();
+ }
Greptile
greptile
style: Recursive call to checkAnimationsFinished could cause stack overflow if animations keep getting added. Consider using a loop instead.
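A rough sketch of the loop form, assuming the caller passes in the same animation roots and an abort check like the one used above:

type AnimationRoot = Pick<Document, 'getAnimations'>;

// Waits in a loop: each pass awaits whatever is currently running, then
// re-checks, so newly started animations are also awaited without recursion.
async function waitForRunningAnimations(
  roots: AnimationRoot[],
  isAborted: () => boolean
): Promise<void> {
  while (!isAborted()) {
    const running = roots
      .flatMap((root) => root.getAnimations())
      .filter((animation) => animation.playState === 'running');
    if (running.length === 0) {
      return;
    }
    await Promise.all(running.map((animation) => animation.finished));
  }
}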
diff block
run_migrations(host, port, retries, user, password)
Greptile
greptile
logic: Recursive call to run_migrations() inside a loop could cause stack overflow with many failed migrations
suggested fix
+ continue # Skip this migration and try the next one
diff block
+import {
+ DeleteMessageBatchCommand,
+ ReceiveMessageCommand,
+ SQSClient,
+ Message as SQSMessage,
+} from "@aws-sdk/client-sqs";
+import { LogManager } from "../../../managers/LogManager";
+import { ScoreManager } from "../../../managers/score/ScoreManager";
+import { SettingsManager } from "../../../utils/settings";
+import { mapDlqKafkaMessageToMessage } from "../../consumer/helpers/mapDlqKafkaMessageToMessage";
+import { mapKafkaMessageToMessage } from "../../consumer/helpers/mapKafkaMessageToMessage";
+import { mapKafkaMessageToScoresMessage } from "../../consumer/helpers/mapKafkaMessageToScoresMessage";
+
+// do not go above 10, this is the max sqs can handle
+const MAX_NUMBER_OF_MESSAGES = 10;
+const QUEUE_NAMES = {
+ requestResponseLogs: "request-response-logs-queue",
+ heliconeScores: "helicone-scores-queue",
+ requestResponseLogsDlq: "request-response-logs-dlq",
+ heliconeScoresDlq: "helicone-scores-dlq",
+} as const;
+
+const QUEUE_URLS = {
+ requestResponseLogs: `https://sqs.${process.env.AWS_REGION}.amazonaws.com/${process.env.AWS_ACCOUNT_ID}/${QUEUE_NAMES.requestResponseLogs}`,
+ heliconeScores: `https://sqs.${process.env.AWS_REGION}.amazonaws.com/${process.env.AWS_ACCOUNT_ID}/${QUEUE_NAMES.heliconeScores}`,
+ requestResponseLogsDlq: `https://sqs.${process.env.AWS_REGION}.amazonaws.com/${process.env.AWS_ACCOUNT_ID}/${QUEUE_NAMES.requestResponseLogsDlq}`,
+ heliconeScoresDlq: `https://sqs.${process.env.AWS_REGION}.amazonaws.com/${process.env.AWS_ACCOUNT_ID}/${QUEUE_NAMES.heliconeScoresDlq}`,
+} as const;
+const SQS_CLIENT = new SQSClient({
+ region: process.env.AWS_REGION,
+});
+
+const settingsManager = new SettingsManager();
+
+const pullMessages = async ({
+ sqs,
+ queueUrl,
+ count,
+ accumulatedMessages = [],
+}: {
+ sqs: SQSClient;
+ queueUrl: string;
+ count: number;
+ accumulatedMessages: SQSMessage[];
+}) => {
+ const command = new ReceiveMessageCommand({
+ QueueUrl: queueUrl,
+ MaxNumberOfMessages: count,
+ });
+ const result = await sqs.send(command);
+ if (result.Messages === undefined || result.Messages.length === 0) {
+ return accumulatedMessages;
+ }
+ return pullMessages({
+ sqs,
+ queueUrl,
+ count: count - result.Messages.length,
+ accumulatedMessages: [...accumulatedMessages, ...result.Messages],
+ });
+};
Greptile
greptile
style: Recursive message pulling without limits could cause stack overflow with large message counts. Consider using iteration instead.
suggested fix
const pullMessages = async ({
sqs,
queueUrl,
count,
accumulatedMessages = [],
}: {
sqs: SQSClient;
queueUrl: string;
count: number;
accumulatedMessages: SQSMessage[];
}) => {
+ let remaining = count;
+ let messages = [...accumulatedMessages];
+ while (remaining > 0) {
const command = new ReceiveMessageCommand({
QueueUrl: queueUrl,
+ MaxNumberOfMessages: remaining,
});
const result = await sqs.send(command);
if (result.Messages === undefined || result.Messages.length === 0) {
+ break;
}
+ messages = [...messages, ...result.Messages];
+ remaining -= result.Messages.length;
}
+ return messages;
};
diff block
+//
+// AccessibilityParserConfig.swift
+// Onit
+//
+// Created by Kévin Naudin on 28/04/2025.
+//
+
+struct AccessibilityParserConfig {
+
+ static let recursiveDepthMax = 1000
Greptile
greptile
style: 1000 levels of recursion could lead to stack overflow in deeply nested UIs. Consider a lower default like 100-200
diff block
import type { InValue } from '@libsql/client';
+
import type {
- VectorFilter,
BasicOperator,
NumericOperator,
ArrayOperator,
ElementOperator,
LogicalOperator,
- RegexOperator,
+ VectorFilter,
} from '@mastra/core/vector/filter';
-export type OperatorType =
+type OperatorType =
| BasicOperator
| NumericOperator
| ArrayOperator
| ElementOperator
| LogicalOperator
| '$contains'
- | Exclude<RegexOperator, '$options'>;
+ | '$size';
type FilterOperator = {
sql: string;
needsValue: boolean;
- transformValue?: (value: any) => any;
+ transformValue?: () => any;
};
type OperatorFn = (key: string, value?: any) => FilterOperator;
+export function validateIdentifier(name: string, kind = 'identifier') {
+ if (!/^[a-zA-Z_][a-zA-Z0-9_]*$/.test(name) || name.length > 63) {
+ throw new Error(
+ `Invalid ${kind}: ${name}.
+ Must start with a letter or underscore,
+ contain only letters, numbers, or underscores,
+ and be at most 63 characters long.`,
+ );
+ }
+}
+
+function validateFieldKey(key: string) {
+ if (!key) return;
+ const segments = key.split('.');
+ for (const segment of segments) {
+ if (!/^[a-zA-Z_][a-zA-Z0-9_]*$/.test(segment) || segment.length > 63) {
+ throw new Error(`Invalid field key segment: ${segment} in ${key}`);
+ }
+ }
+}
// Helper functions to create operators
const createBasicOperator = (symbol: string) => {
- return (key: string): FilterOperator => ({
- sql: `CASE
- WHEN ? IS NULL THEN json_extract(metadata, '$."${handleKey(key)}"') IS ${symbol === '=' ? '' : 'NOT'} NULL
- ELSE json_extract(metadata, '$."${handleKey(key)}"') ${symbol} ?
- END`,
- needsValue: true,
- transformValue: (value: any) => {
- // Return the values directly, not in an object
- return [value, value];
- },
- });
+ return (key: string, value: any): FilterOperator => {
+ validateFieldKey(key);
+ const jsonPathKey = toJsonPathKey(key);
+ return {
+ sql: `CASE
+ WHEN ? IS NULL THEN json_extract(metadata, '$."${jsonPathKey}"') IS ${symbol === '=' ? '' : 'NOT'} NULL
+ ELSE json_extract(metadata, '$."${jsonPathKey}"') ${symbol} ?
+ END`,
+ needsValue: true,
+ transformValue: () => {
+ // Return the values directly, not in an object
+ return [value, value];
+ },
+ };
+ };
};
const createNumericOperator = (symbol: string) => {
- return (key: string): FilterOperator => ({
- sql: `CAST(json_extract(metadata, '$."${handleKey(key)}"') AS NUMERIC) ${symbol} ?`,
- needsValue: true,
- });
+ return (key: string): FilterOperator => {
+ validateFieldKey(key);
+ const jsonPathKey = toJsonPathKey(key);
+ return {
+ sql: `CAST(json_extract(metadata, '$."${jsonPathKey}"') AS NUMERIC) ${symbol} ?`,
+ needsValue: true,
+ };
+ };
};
const validateJsonArray = (key: string) =>
- `json_valid(json_extract(metadata, '$."${handleKey(key)}"'))
- AND json_type(json_extract(metadata, '$."${handleKey(key)}"')) = 'array'`;
+ `json_valid(json_extract(metadata, '$."${key}"'))
+ AND json_type(json_extract(metadata, '$."${key}"')) = 'array'`;
+
+const pattern = /json_extract\(metadata, '\$\."[^"]*"(\."[^"]*")*'\)/g;
+
+function buildElemMatchConditions(value: any) {
+ const conditions = Object.entries(value).map(([field, fieldValue]) => {
+ if (field.startsWith('$')) {
+ // Direct operators on array elements ($in, $gt, etc)
+ const { sql, values } = buildCondition('elem.value', { [field]: fieldValue }, '');
+ // Replace the metadata path with elem.value
Greptile
greptile
logic: buildElemMatchConditions recursively calls buildCondition without depth limit - could lead to stack overflow with deeply nested objects
suggested fix
+function buildElemMatchConditions(value: any, depth = 0) {
+ if (depth > 100) { // Reasonable max depth to prevent stack overflow
+ throw new Error('Maximum nesting depth exceeded in $elemMatch condition');
}
const conditions = Object.entries(value).map(([field, fieldValue]) => {
if (field.startsWith('$')) {
// Direct operators on array elements ($in, $gt, etc)
+ const { sql, values } = buildCondition('elem.value', { [field]: fieldValue }, '', depth + 1);
// Replace the metadata path with elem.value
diff block
return options.maxZoom ? options.maxZoom / 100 : DEFAULT_MAX_ZOOM;
};
+
+export const setFontNameRecursively = (obj: Record<string, unknown>, fontName: string): void => {
+ if (!obj || typeof obj !== 'object') return;
+
+ for (const key in obj) {
+ if (key === 'fontName' && obj[key] === undefined) {
+ obj[key] = fontName;
+ } else if (typeof obj[key] === 'object' && obj[key] !== null) {
+ setFontNameRecursively(obj[key] as Record<string, unknown>, fontName);
+ }
+ }
+};
Greptile
greptile
logic: No check for circular references could cause stack overflow with cyclical objects
suggested fix
+export const setFontNameRecursively = (obj: Record<string, unknown>, fontName: string, seen = new WeakSet()): void => {
+ if (!obj || typeof obj !== 'object' || seen.has(obj)) return;
+ seen.add(obj);
for (const key in obj) {
if (key === 'fontName' && obj[key] === undefined) {
obj[key] = fontName;
} else if (typeof obj[key] === 'object' && obj[key] !== null) {
+ setFontNameRecursively(obj[key] as Record<string, unknown>, fontName, seen);
}
}
};
diff block
+import Parser from 'tree-sitter';
+import { TypescriptLanguage } from 'tree-sitter-typescript';
+
+type ErrorWithMessage = {
+ message: string;
+};
+
+export interface TreeSitterOptions {
+ includeComments?: boolean;
+ parseServerComponents?: boolean;
+}
+
+export class TreeSitterProcessor {
+ private parser: Parser;
+
+ constructor() {
+ this.parser = new Parser();
+ this.parser.setLanguage(TypescriptLanguage.tsx);
+ }
+
+ async parseNextCode(code: string, options: TreeSitterOptions = {}): Promise<Parser.Tree> {
+ try {
+ return this.parser.parse(code);
+ } catch (error) {
+ const err = error as ErrorWithMessage;
+ throw new Error(`Failed to parse Next.js code: ${err.message}`);
+ }
+ }
+
+ async getASTForLLM(code: string, options: TreeSitterOptions = {}): Promise<object> {
+ const tree = await this.parseNextCode(code, options);
+ return this.transformTreeForLLM(tree.rootNode);
+ }
+
+ private transformTreeForLLM(node: Parser.SyntaxNode): object {
+ return {
+ type: node.type,
+ text: node.text,
+ startPosition: node.startPosition,
+ endPosition: node.endPosition,
+ children: node.children.map((child) => this.transformTreeForLLM(child)),
+ };
Greptile
greptile
style: transformTreeForLLM recursively processes all children without a depth limit, which could cause stack overflow for deeply nested ASTs
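A minimal sketch of a depth-capped transform, with an illustrative maxDepth parameter that the original class does not define:

import type Parser from 'tree-sitter';

// Mirrors transformTreeForLLM but stops descending past maxDepth, marking
// the cut-off so callers can tell the subtree was truncated.
function transformNodeWithLimit(
  node: Parser.SyntaxNode,
  depth = 0,
  maxDepth = 50
): object {
  if (depth >= maxDepth) {
    return { type: node.type, truncated: true };
  }
  return {
    type: node.type,
    text: node.text,
    startPosition: node.startPosition,
    endPosition: node.endPosition,
    children: node.children.map((child) =>
      transformNodeWithLimit(child, depth + 1, maxDepth)
    ),
  };
}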
diff block
+import os
+from datetime import datetime
+from datetime import timezone
+from typing import Any, Optional
+
+import msal # type: ignore
+import requests
+from requests.exceptions import RequestException
+
+from onyx.configs.app_configs import INDEX_BATCH_SIZE
+from onyx.configs.constants import DocumentSource
+from onyx.connectors.cross_connector_utils.miscellaneous_utils import time_str_to_utc
+from onyx.connectors.exceptions import ConnectorValidationError
+from onyx.connectors.exceptions import CredentialExpiredError
+from onyx.connectors.exceptions import InsufficientPermissionsError
+from onyx.connectors.exceptions import UnexpectedValidationError
+from onyx.connectors.interfaces import GenerateDocumentsOutput
+from onyx.connectors.interfaces import LoadConnector
+from onyx.connectors.interfaces import PollConnector
+from onyx.connectors.interfaces import SecondsSinceUnixEpoch
+from onyx.connectors.models import BasicExpertInfo
+from onyx.connectors.models import ConnectorMissingCredentialError
+from onyx.connectors.models import Document
+from onyx.connectors.models import TextSection
+from onyx.utils.logger import setup_logger
+
+logger = setup_logger()
+
+
+class OutlookConnector(LoadConnector, PollConnector):
+ def __init__(
+ self,
+ batch_size: int = INDEX_BATCH_SIZE,
+ indexing_scope: str = "everything",
+ folders: list[str] = [],
+ email_addresses: list[str] = [],
+ include_attachments: bool = True,
+ start_date: str | None = None,
+ include_metadata: bool = False,
+ max_emails: int | None = None,
+ ) -> None:
+ self.batch_size = batch_size
+ self.access_token: Optional[str] = None
+ self.indexing_scope = indexing_scope
+ self.folders = folders
+ self.email_addresses = email_addresses
+ self.include_attachments = include_attachments
+ self.start_date = start_date
+ self.include_metadata = include_metadata
+ self.max_emails = max_emails
+ self.msal_app: Optional[msal.ConfidentialClientApplication] = None
+ self.base_url = "https://graph.microsoft.com/v1.0"
+
+ def load_credentials(self, credentials: dict[str, Any]) -> dict[str, Any] | None:
+ outlook_client_id = credentials["outlook_client_id"]
+ outlook_client_secret = credentials["outlook_client_secret"]
+ outlook_directory_id = credentials["outlook_directory_id"]
+
+ authority_url = f"https://login.microsoftonline.com/{outlook_directory_id}"
+ self.msal_app = msal.ConfidentialClientApplication(
+ authority=authority_url,
+ client_id=outlook_client_id,
+ client_credential=outlook_client_secret,
+ )
+ return None
+
+ def _get_access_token(self) -> str:
+ if self.msal_app is None:
+ raise ConnectorMissingCredentialError("Outlook credentials not loaded.")
+
+ token = self.msal_app.acquire_token_for_client(
+ scopes=["https://graph.microsoft.com/.default"]
+ )
+ if "access_token" not in token:
+ raise CredentialExpiredError("Failed to acquire access token")
+ return token["access_token"]
+
+ def _make_request(self, endpoint: str, params: Optional[dict] = None) -> dict:
+ if not self.access_token:
+ self.access_token = self._get_access_token()
+
+ headers = {
+ "Authorization": f"Bearer {self.access_token}",
+ "Content-Type": "application/json",
+ }
+
+ try:
+ response = requests.get(
+ f"{self.base_url}/{endpoint}",
+ headers=headers,
+ params=params,
+ timeout=30
+ )
+ response.raise_for_status()
+ return response.json()
+ except RequestException as e:
+ if e.response is not None:
+ status_code = e.response.status_code
+ if status_code == 401:
+ # Token might be expired, try to get a new one
+ self.access_token = None
+ return self._make_request(endpoint, params)
Greptile
greptile
style: Recursive call to _make_request could potentially cause a stack overflow if authentication repeatedly fails.
suggested fix
+ # Token might be expired, try to get a new one once
+ if not getattr(self, '_retried', False):
+ self._retried = True
self.access_token = None
+ result = self._make_request(endpoint, params)
+ self._retried = False
+ return result
+ raise CredentialExpiredError("Failed to refresh access token")
diff block
+import {
+ BusterChatMessageReasoning,
+ BusterChatMessageReasoning_thought
+} from '@/api/asset_interfaces';
+import { useMemoizedFn } from 'ahooks';
+import sample from 'lodash/sample';
+import last from 'lodash/last';
+import { useBusterChatContextSelector } from '../ChatProvider';
+import { timeout } from '@/utils';
+import random from 'lodash/random';
+
+export const useAutoAppendThought = () => {
+ const onUpdateChatMessage = useBusterChatContextSelector((x) => x.onUpdateChatMessage);
+ const getChatMessagesMemoized = useBusterChatContextSelector((x) => x.getChatMessagesMemoized);
+
+ const removeAutoThoughts = useMemoizedFn(
+ (reasoningMessages: BusterChatMessageReasoning[]): BusterChatMessageReasoning[] => {
+ return reasoningMessages.filter((rm) => rm.id !== AUTO_THOUGHT_ID);
+ }
+ );
+
+ const autoAppendThought = useMemoizedFn(
+ (
+ reasoningMessages: BusterChatMessageReasoning[],
+ chatId: string
+ ): BusterChatMessageReasoning[] => {
+ const lastReasoningMessage = reasoningMessages[reasoningMessages.length - 1];
+ const lastMessageIsCompleted =
+ !lastReasoningMessage || lastReasoningMessage?.status === 'completed';
+
+ if (lastMessageIsCompleted) {
+ _loopAutoThought(chatId);
+
+ return [...reasoningMessages, createAutoThought()];
+ }
+
+ return removeAutoThoughts(reasoningMessages);
+ }
+ );
+
+ const _loopAutoThought = useMemoizedFn(async (chatId: string) => {
+ const randomDelay = random(3000, 5000);
+ await timeout(randomDelay);
+ const chatMessages = getChatMessagesMemoized(chatId);
+ const lastMessage = last(chatMessages);
+ const isCompletedStream = !!lastMessage?.isCompletedStream;
+ const lastReasoningMessage = last(lastMessage?.reasoning);
+ const lastReasoningMessageIsAutoAppended =
+ !lastReasoningMessage || lastReasoningMessage?.id === AUTO_THOUGHT_ID;
+
+ if (!isCompletedStream && lastReasoningMessageIsAutoAppended && lastMessage) {
+ const lastMessageId = lastMessage?.id!;
+ const lastReasoningMessageIndex = lastMessage?.reasoning.length - 1;
+ const updatedReasoning = lastMessage?.reasoning.slice(0, lastReasoningMessageIndex);
+ const newReasoningMessages = [...updatedReasoning, createAutoThought()];
+
+ onUpdateChatMessage({
+ id: lastMessageId,
+ reasoning: newReasoningMessages,
+ isCompletedStream: false
+ });
+
+ _loopAutoThought(chatId);
+ }
Greptile
greptile
logic: recursive call to _loopAutoThought could potentially cause a stack overflow if the stream never completes
suggested fix
if (!isCompletedStream && lastReasoningMessageIsAutoAppended && lastMessage) {
const lastMessageId = lastMessage?.id!;
const lastReasoningMessageIndex = lastMessage?.reasoning.length - 1;
const updatedReasoning = lastMessage?.reasoning.slice(0, lastReasoningMessageIndex);
const newReasoningMessages = [...updatedReasoning, createAutoThought()];
onUpdateChatMessage({
id: lastMessageId,
reasoning: newReasoningMessages,
isCompletedStream: false
});
+ // Use setTimeout to prevent stack overflow
+ setTimeout(() => _loopAutoThought(chatId), 0);
}
diff block
};
export const getDirSize = async (dir: string): Promise<number> => {
- const getItemSize = async (filePath: string): Promise<number> => {
- const stat = await fs.promises.stat(filePath);
+ try {
+ const stat = await fs.promises.stat(dir);
- if (stat.isDirectory()) {
- return getDirSize(filePath);
+ // If it's a file, return its size directly
+ if (!stat.isDirectory()) {
+ return stat.size;
}
- return stat.size;
- };
+ const getItemSize = async (filePath: string): Promise<number> => {
+ const stat = await fs.promises.stat(filePath);
+
+ if (stat.isDirectory()) {
+ return getDirSize(filePath);
+ }
+
+ return stat.size;
+ };
Greptile
greptile
style: Recursive stat calls inside getItemSize can cause stack overflow for deeply nested directories. Consider implementing an iterative approach.
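A minimal sketch of the iterative approach, using an explicit work list so directory depth never grows the call stack (helper name is illustrative):

import fs from 'fs';
import path from 'path';

export const getDirSizeIterative = async (dir: string): Promise<number> => {
  let total = 0;
  const pending: string[] = [dir]; // explicit work list instead of recursion

  while (pending.length > 0) {
    const current = pending.pop() as string;
    const stat = await fs.promises.stat(current);

    if (!stat.isDirectory()) {
      total += stat.size;
      continue;
    }

    for (const entry of await fs.promises.readdir(current)) {
      pending.push(path.join(current, entry));
    }
  }

  return total;
};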
diff block
throw error;
}
}
+
+export async function deleteImageFromProject(
+ projectRoot: string,
+ imageName: string,
+): Promise<string> {
+ try {
+ const imageFolder = path.join(projectRoot, DefaultSettings.IMAGE_FOLDER);
+ const imagePath = path.join(imageFolder, imageName);
+ await fs.unlink(imagePath);
+ return imagePath;
+ } catch (error) {
+ console.error('Error deleting image:', error);
+ throw error;
+ }
+}
+
+export async function renameImageInProject(
+ projectRoot: string,
+ imageName: string,
+ newName: string,
+): Promise<string> {
+ if (!imageName || !newName) {
+ throw new Error('Image name and new name are required');
+ }
+
+ const imageFolder = path.join(projectRoot, DefaultSettings.IMAGE_FOLDER);
+ const oldImagePath = path.join(imageFolder, imageName);
+ const newImagePath = path.join(imageFolder, newName);
+
+ try {
+ await validateRename(oldImagePath, newImagePath);
+ await fs.rename(oldImagePath, newImagePath);
+
+ await updateImageReferences(projectRoot, imageName, newName);
+ return newImagePath;
+ } catch (error) {
+ console.error('Error renaming image:', error);
+ throw error;
+ }
+}
+
+const MAX_FILENAME_LENGTH = 255;
+const VALID_FILENAME_REGEX = /^[a-zA-Z0-9-_. ]+$/;
+
+async function validateRename(oldImagePath: string, newImagePath: string): Promise<void> {
+ try {
+ await fs.access(oldImagePath);
+ } catch (err) {
+ throw new Error(`Source image does not exist`);
+ }
+
+ const newFileName = path.basename(newImagePath);
+
+ if (newFileName.length > MAX_FILENAME_LENGTH) {
+ throw new Error(`File name is too long (max ${MAX_FILENAME_LENGTH} characters)`);
+ }
+
+ if (!VALID_FILENAME_REGEX.test(newFileName)) {
+ throw new Error(
+ 'File name can only contain letters, numbers, spaces, hyphens, underscores, and periods',
+ );
+ }
+
+ try {
+ await fs.access(newImagePath);
+ throw new Error(`A file with this name already exists`);
+ } catch (err: any) {
+ if (err.code !== 'ENOENT') {
+ throw err;
+ }
+ }
+}
+
+async function updateImageReferences(
+ projectRoot: string,
+ oldName: string,
+ newName: string,
+): Promise<void> {
+ const prefix = DefaultSettings.IMAGE_FOLDER.replace(/^public\//, '');
+ const oldImageUrl = `/${prefix}/${oldName}`;
+ const newImageUrl = `/${prefix}/${newName}`;
+ const pattern = new RegExp(oldImageUrl.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'), 'g');
+
+ const sourceFiles = await findSourceFiles(projectRoot);
+ await Promise.all(
+ sourceFiles.map(async (file) => {
+ const content = await fs.readFile(file, 'utf8');
+ if (!content.includes(oldImageUrl)) {
+ return;
+ }
+
+ const updatedContent = content.replace(pattern, newImageUrl);
+ await fs.writeFile(file, updatedContent, 'utf8');
+ }),
+ );
+}
+
+async function findSourceFiles(dir: string): Promise<string[]> {
+ const files: string[] = [];
+ const entries = await fs.readdir(dir, { withFileTypes: true });
+
+ for (const entry of entries) {
+ const fullPath = path.join(dir, entry.name);
+ if (entry.isDirectory() && !entry.name.startsWith('.') && entry.name !== 'node_modules') {
+ files.push(...(await findSourceFiles(fullPath)));
Greptile
greptile
style: Recursive directory traversal without depth limit could cause stack overflow with deeply nested directories. Add a max depth parameter.
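A possible sketch of the depth-limited variant the reviewer asks for; the maxDepth default and the file-collection branch are illustrative, since the original snippet is cut off:

import { promises as fs } from 'fs';
import path from 'path';

async function findSourceFilesWithLimit(
  dir: string,
  depth = 0,
  maxDepth = 25 // illustrative cap
): Promise<string[]> {
  if (depth >= maxDepth) {
    return [];
  }
  const files: string[] = [];
  const entries = await fs.readdir(dir, { withFileTypes: true });

  for (const entry of entries) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory() && !entry.name.startsWith('.') && entry.name !== 'node_modules') {
      files.push(...(await findSourceFilesWithLimit(fullPath, depth + 1, maxDepth)));
    } else if (entry.isFile()) {
      files.push(fullPath); // assumed: collect every regular file found
    }
  }
  return files;
}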
diff block
+use crate::error::BusterError;
+use dirs;
+use std::fs;
+use std::io::{self, Write};
+use std::path::{Path, PathBuf};
+
+// Moved from run.rs
+pub fn prompt_for_input(prompt_message: &str, default_value: Option<&str>, is_sensitive: bool) -> Result<String, BusterError> {
+ if let Some(def_val) = default_value {
+ print!("{} (default: {}): ", prompt_message, def_val);
+ } else {
+ print!("{}: ", prompt_message);
+ }
+ io::stdout().flush().map_err(|e| BusterError::CommandError(format!("Failed to flush stdout: {}", e)))?;
+
+ let mut input = String::new();
+ // Simple masking for sensitive input is complex in raw terminal io without extra crates.
+ // For a real CLI, rpassword or similar would be used.
+ // Here, we just read the line.
+ io::stdin().read_line(&mut input).map_err(|e| BusterError::CommandError(format!("Failed to read line: {}", e)))?;
+ let trimmed_input = input.trim().to_string();
+
+ if trimmed_input.is_empty() {
+ if let Some(def_val) = default_value {
+ Ok(def_val.to_string())
+ } else {
+ println!("Input cannot be empty. Please try again.");
+ prompt_for_input(prompt_message, default_value, is_sensitive) // Recurse
+ }
Greptile
greptile
logic: Recursive prompt without stack limit could cause stack overflow with repeated empty inputs
diff block
+use crate::error::BusterError;
+use dirs;
+use std::fs;
+use std::io::{self, Write};
+use std::path::{Path, PathBuf};
+
+// Moved from run.rs
+pub fn prompt_for_input(prompt_message: &str, default_value: Option<&str>, is_sensitive: bool) -> Result<String, BusterError> {
+ if let Some(def_val) = default_value {
+ print!("{} (default: {}): ", prompt_message, def_val);
+ } else {
+ print!("{}: ", prompt_message);
+ }
+ io::stdout().flush().map_err(|e| BusterError::CommandError(format!("Failed to flush stdout: {}", e)))?;
+
+ let mut input = String::new();
+ // Simple masking for sensitive input is complex in raw terminal io without extra crates.
+ // For a real CLI, rpassword or similar would be used.
+ // Here, we just read the line.
+ io::stdin().read_line(&mut input).map_err(|e| BusterError::CommandError(format!("Failed to read line: {}", e)))?;
+ let trimmed_input = input.trim().to_string();
+
+ if trimmed_input.is_empty() {
+ if let Some(def_val) = default_value {
+ Ok(def_val.to_string())
+ } else {
+ println!("Input cannot be empty. Please try again.");
+ prompt_for_input(prompt_message, default_value, is_sensitive) // Recurse
+ }
Greptile
greptile
logic: Recursive call without a maximum depth limit could cause stack overflow with repeated empty inputs. Consider using a loop instead.
diff block
+use crate::error::BusterError;
+use dirs;
+use std::fs;
+use std::io::{self, Write};
+use std::path::{Path, PathBuf};
+
+// Moved from run.rs
+pub fn prompt_for_input(prompt_message: &str, default_value: Option<&str>, is_sensitive: bool) -> Result<String, BusterError> {
+ if let Some(def_val) = default_value {
+ print!("{} (default: {}): ", prompt_message, def_val);
+ } else {
+ print!("{}: ", prompt_message);
+ }
+ io::stdout().flush().map_err(|e| BusterError::CommandError(format!("Failed to flush stdout: {}", e)))?;
+
+ let mut input = String::new();
+ // Simple masking for sensitive input is complex in raw terminal io without extra crates.
+ // For a real CLI, rpassword or similar would be used.
+ // Here, we just read the line.
+ io::stdin().read_line(&mut input).map_err(|e| BusterError::CommandError(format!("Failed to read line: {}", e)))?;
+ let trimmed_input = input.trim().to_string();
+
+ if trimmed_input.is_empty() {
+ if let Some(def_val) = default_value {
+ Ok(def_val.to_string())
+ } else {
+ println!("Input cannot be empty. Please try again.");
+ prompt_for_input(prompt_message, default_value, is_sensitive) // Recurse
+ }
Greptile
greptile
logic: Recursive prompt without depth limit could cause stack overflow with repeated empty inputs
diff block
+import { NextRequest, NextResponse } from 'next/server';
+import path from 'path';
+import fs from 'fs';
+
+export async function GET(request: NextRequest) {
+ try {
+ // In a real implementation, this would query a database of mobile app events
+ // For now, we'll try to generate realistic activity based on any available data
+
+ const syncDir = process.env.SYNC_DIR || path.join(process.cwd(), 'sync_data');
+ let mobileActivity = [];
+
+ try {
+ if (fs.existsSync(syncDir)) {
+ // Get the most recent files to create activity items
+ const allFiles = [];
+
+ const getAllFiles = (dir: string) => {
+ const items = fs.readdirSync(dir, { withFileTypes: true });
+
+ for (const item of items) {
+ const fullPath = path.join(dir, item.name);
+
+ if (item.isDirectory()) {
+ getAllFiles(fullPath);
+ } else if (item.isFile()) {
+ const stats = fs.statSync(fullPath);
+ allFiles.push({
+ path: fullPath,
+ name: item.name,
+ mtime: stats.mtime,
+ ext: path.extname(item.name).toLowerCase()
+ });
+ }
+ }
+ };
Greptile
greptile
logic: Recursive file traversal without depth limit could cause stack overflow for deeply nested directories
suggested fix
+ const getAllFiles = (dir: string, depth = 0, maxDepth = 10) => {
+ if (depth >= maxDepth) return;
const items = fs.readdirSync(dir, { withFileTypes: true });
for (const item of items) {
const fullPath = path.join(dir, item.name);
if (item.isDirectory()) {
+ getAllFiles(fullPath, depth + 1, maxDepth);
} else if (item.isFile()) {
const stats = fs.statSync(fullPath);
allFiles.push({
path: fullPath,
name: item.name,
mtime: stats.mtime,
ext: path.extname(item.name).toLowerCase()
});
}
}
};
diff block
+import fetch from "node-fetch";
+
+import { togglApiToken } from "../helpers/preferences";
+
+const base64encode = (str: string) => {
+ return Buffer.from(str).toString("base64");
+};
+
+const baseUrl = "https://api.track.toggl.com/api/v9";
+const authHeader = { Authorization: `Basic ${base64encode(`${togglApiToken}:api_token`)}` };
+
+export const get = <T>(endpoint: string) => togglFetch<T>("GET", endpoint);
+export const post = <T = void>(endpoint: string, body?: unknown) => togglFetch<T>("POST", endpoint, body);
+export const patch = <T = void>(endpoint: string, body?: unknown) => togglFetch<T>("PATCH", endpoint, body);
+export const put = <T = void>(endpoint: string, body?: unknown) => togglFetch<T>("PUT", endpoint, body);
+export const remove = (endpoint: string) => togglFetch("DELETE", endpoint);
+
+async function togglFetch<T>(method: string, endpoint: string, body?: unknown): Promise<T>;
+async function togglFetch(method: "DELETE", endpoint: string): Promise<void>;
+async function togglFetch<T>(method: string, endpoint: string, body?: unknown): Promise<T | void> {
+ const headers: Record<string, string> = authHeader;
+ if (body !== undefined) headers["Content-Type"] = "application/json";
+ const res = await fetch(baseUrl + endpoint, {
+ method: method,
+ headers,
+ body: body ? JSON.stringify(body) : undefined,
+ });
+ if (!res.ok) {
+ if (res.status == 429) {
+ await delay(1000);
+ return await togglFetch(method, endpoint, body);
+ }
Greptile
greptile
logic: Recursive retry on rate limit could cause stack overflow with no max retries limit
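One possible shape of a bounded retry, using an illustrative attempt counter and inline delay helper rather than the extension's actual utilities:

const delay = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Retries rate-limited calls at most maxRetries times instead of recursing
// without a cap, then surfaces an error.
async function fetchWithRetry<T extends { status: number }>(
  doFetch: () => Promise<T>,
  maxRetries = 5
): Promise<T> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await doFetch();
    if (res.status !== 429) {
      return res;
    }
    await delay(1000 * (attempt + 1)); // simple linear backoff
  }
  throw new Error('Rate limited: exceeded maximum retry attempts');
}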