Graph Your Way to Inspiration: Integrating Co-Author Graphs with Retrieval-Augmented Generation for Large Language Model Based Scientific Idea Generation

2026-02-28 · ★ 76 · 1 min read
GYWI combines author knowledge graphs with RAG + GraphRAG to generate more novel and feasible scientific ideas, evaluated across 5 dimensions on arXiv data.

GYWI builds author-centered knowledge graphs and pairs them with retrieval-augmented generation so that LLMs can produce research ideas that are more novel, feasible, and relevant, with traceable inspiration sources.

The Problem

LLMs can generate scientific ideas, but the results often lack controllable academic context and traceable inspiration pathways. Generated ideas may be generic or disconnected from real research trajectories.

The GYWI System

Author Knowledge Graphs: An author-centered knowledge graph construction method, paired with inspiration-source sampling algorithms, creates an external knowledge base grounded in academic context.
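The summary does not specify the paper's actual construction or sampling algorithms, but the general idea can be sketched as follows: build a co-author graph from paper records, then sample candidate "inspiration sources" from an author's neighborhood. All function names and the sampling heuristic below are hypothetical.

```python
import random
from collections import defaultdict

def build_coauthor_graph(papers):
    """papers: list of dicts with 'authors' and 'title' keys."""
    graph = defaultdict(set)   # author -> set of co-authors
    works = defaultdict(list)  # author -> list of that author's paper titles
    for paper in papers:
        authors = paper["authors"]
        for a in authors:
            works[a].append(paper["title"])
            for b in authors:
                if a != b:
                    graph[a].add(b)
    return graph, works

def sample_inspiration_sources(graph, works, author, k=2, seed=0):
    """Toy sampler: pick up to k papers written by the author's co-authors."""
    rng = random.Random(seed)
    candidates = [t for coauthor in sorted(graph[author])
                    for t in works[coauthor]]
    rng.shuffle(candidates)
    return candidates[:k]

papers = [
    {"title": "P1", "authors": ["Alice", "Bob"]},
    {"title": "P2", "authors": ["Bob", "Carol"]},
    {"title": "P3", "authors": ["Carol"]},
]
graph, works = build_coauthor_graph(papers)
print(sorted(graph["Bob"]))  # Bob's co-authors
print(sample_inspiration_sources(graph, works, "Bob", k=2))
```

A real system would attach richer node attributes (venues, abstracts, citation links) so the sampled sources carry usable academic context.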

Hybrid Retrieval: Combine standard RAG with GraphRAG to retrieve both broad and deep knowledge, forming a hybrid context for the LLM.
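A minimal sketch of this hybrid step, under the assumption that "broad" means similarity-based retrieval and "deep" means expanding top hits through their graph neighborhood. The token-overlap similarity stands in for real embeddings, and all names here are illustrative, not the paper's API.

```python
def similarity(query, doc):
    # Toy token-overlap score standing in for embedding similarity.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / (len(q | d) or 1)

def hybrid_retrieve(query, docs, graph, top_k=2, hops=1):
    """docs: {doc_id: text}; graph: {doc_id: [related doc_ids]}."""
    # Broad: top-k by similarity (vanilla RAG).
    ranked = sorted(docs, key=lambda i: similarity(query, docs[i]), reverse=True)
    broad = ranked[:top_k]
    # Deep: expand each hit through its graph neighborhood (GraphRAG-style).
    deep, frontier = [], list(broad)
    for _ in range(hops):
        frontier = [n for d in frontier for n in graph.get(d, [])]
        deep.extend(frontier)
    # Merge into one hybrid context, preserving order, dropping duplicates.
    seen, context = set(), []
    for d in broad + deep:
        if d not in seen:
            seen.add(d)
            context.append(d)
    return context

docs = {
    "d1": "graph neural networks for citation analysis",
    "d2": "retrieval augmented generation survey",
    "d3": "sampling algorithms on coauthor graphs",
}
graph = {"d1": ["d3"], "d2": [], "d3": ["d1"]}
print(hybrid_retrieve("citation graph retrieval", docs, graph))
```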

Prompt Optimization: Use reinforcement learning principles to automatically guide LLMs in optimizing generated ideas based on the hybrid context.
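The summary does not detail the RL procedure, but the "reinforcement learning principles" framing can be illustrated with a greedy generate-score-keep loop: propose prompt variants, score each with a reward function, and retain improvements. The reward below is a toy heuristic; in the actual system the signal would reflect judged idea quality.

```python
import random

def reward(prompt):
    # Toy reward: prefer prompts that explicitly invoke the hybrid context.
    return sum(word in prompt for word in ("context", "inspiration", "novel"))

def mutate(prompt, rng):
    # Hypothetical mutation: append one instruction fragment.
    additions = [" Use the retrieved context.",
                 " Cite inspiration sources.",
                 " Aim for a novel idea."]
    return prompt + rng.choice(additions)

def optimize_prompt(base_prompt, steps=10, seed=0):
    rng = random.Random(seed)
    best, best_r = base_prompt, reward(base_prompt)
    for _ in range(steps):
        candidate = mutate(best, rng)
        r = reward(candidate)
        if r > best_r:  # greedy improvement, a stand-in for a policy update
            best, best_r = candidate, r
    return best, best_r

best, score = optimize_prompt("Propose a research idea.")
print(score)
```

Hill-climbing is the simplest member of this family; a full RL treatment would learn from reward over many episodes rather than keeping a single best candidate.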

Evaluation

Tested on arXiv papers (2018-2023) using GPT-4o, DeepSeek-V3, Qwen3-8B, and Gemini 2.5. Ideas evaluated across five dimensions:

| Dimension | Description |
| --- | --- |
| Novelty | How original the idea is |
| Feasibility | Whether it can actually be done |
| Clarity | How well the idea is articulated |
| Relevance | Fit to the target research area |
| Significance | Potential impact |
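
Aggregating such rubric scores is straightforward; a sketch, assuming per-idea numeric scores (e.g., 1-10 from an LLM judge, a detail not specified in this summary):

```python
from statistics import mean

# The five evaluation dimensions from the paper.
DIMENSIONS = ("novelty", "feasibility", "clarity", "relevance", "significance")

def aggregate(scores):
    """scores: list of dicts mapping dimension -> numeric score.
    Returns the mean score per dimension, rounded to 2 decimals."""
    return {d: round(mean(s[d] for s in scores), 2) for d in DIMENSIONS}

ideas = [
    {"novelty": 8, "feasibility": 6, "clarity": 7, "relevance": 9, "significance": 7},
    {"novelty": 6, "feasibility": 8, "clarity": 9, "relevance": 7, "significance": 6},
]
print(aggregate(ideas))
```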

In this evaluation, GYWI significantly outperforms mainstream LLM baselines across several of the five dimensions.


Source: arXiv:2602.22215
