OpenGIN-bot project#114
Conversation
… entity attributes and relationships
…tool and updating `get_entity_attributes` parameters and prompt instructions.
…rate chat functionality to use its new API.
…node, and migrate backend to port 9000.
…imits, refine prompt instructions for cleaner output
…ced agent memory and context management.
… messages when a new topic is detected.
…GIN-bot into backend-migration
Backend migration
Code Review
This pull request introduces the OpenGIN Bot, a full-stack application featuring a Next.js frontend and a FastAPI backend powered by LangGraph for querying temporal graph databases. The implementation includes sophisticated context management strategies like topic shift detection and fact distillation. Feedback highlights several critical issues: a session leakage risk caused by global variables in the Next.js route handler, potential runtime errors in the graph API client when handling 500 responses, and security concerns regarding insecure CORS policies and sensitive information leakage in error responses. Additionally, corrections are needed for hardcoded environment URLs, invalid dependency versions in package.json, and incorrect LLM model identifiers.
    },
    "dependencies": {
        "groq-sdk": "^0.37.0",
        "next": "16.0.10",
    from fastapi import FastAPI
    from langchain_core.messages import HumanMessage
    from app.graph.router import app_graph

    app = FastAPI(title="OpenGIN Bot Backend")
The GraphAPIClient should be properly closed to prevent connection leaks. While a FastAPI lifespan handler is one way to manage this, in low-throughput contexts it is also acceptable to instantiate the client within a context manager for each request. This ensures safe, automatic cleanup with minimal overhead.
References
- In low-throughput contexts where HTTP requests are made sequentially at a human pace, it is acceptable to instantiate a new httpx.Client for each request using a context manager. The negligible performance overhead is outweighed by the benefit of safe, automatic cleanup.
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
This PR introduces the OpenGIN AI agent chatbot.