Metadata-Version: 2.4
Name: llama-index-core
Version: 0.12.51
Summary: Interface between LLMs and your data
Project-URL: Homepage, https://llamaindex.ai
Project-URL: Repository, https://github.com/run-llama/llama_index
Project-URL: Documentation, https://docs.llamaindex.ai/en/stable/
Author-email: Jerry Liu
Maintainer-email: Andrei Fajardo, Haotian Zhang, Jerry Liu, Logan Markewich, Simon Suo, Sourabh Desai
License-Expression: MIT
License-File: LICENSE
Keywords: LLM,NLP,RAG,data,devtools,index,retrieval
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: <4.0,>=3.9
Requires-Dist: aiohttp<4,>=3.8.6
Requires-Dist: aiosqlite
Requires-Dist: banks<3,>=2.0.0
Requires-Dist: dataclasses-json
Requires-Dist: deprecated>=1.2.9.3
Requires-Dist: dirtyjson<2,>=1.0.8
Requires-Dist: eval-type-backport<0.3,>=0.2.0; python_version < '3.10'
Requires-Dist: filetype<2,>=1.2.0
Requires-Dist: fsspec>=2023.5.0
Requires-Dist: httpx
Requires-Dist: llama-index-workflows<2,>=1.0.1
Requires-Dist: nest-asyncio<2,>=1.5.8
Requires-Dist: networkx>=3.0
Requires-Dist: nltk>3.8.1
Requires-Dist: numpy
Requires-Dist: pillow>=9.0.0
Requires-Dist: platformdirs
Requires-Dist: pydantic>=2.8.0
Requires-Dist: pyyaml>=6.0.1
Requires-Dist: requests>=2.31.0
Requires-Dist: setuptools>=80.9.0
Requires-Dist: sqlalchemy[asyncio]>=1.4.49
Requires-Dist: tenacity!=8.4.0,<10.0.0,>=8.2.0
Requires-Dist: tiktoken>=0.7.0
Requires-Dist: tqdm<5,>=4.66.1
Requires-Dist: typing-extensions>=4.5.0
Requires-Dist: typing-inspect>=0.8.0
Requires-Dist: wrapt
Description-Content-Type: text/markdown

# LlamaIndex Core

The core Python package for the LlamaIndex library. Core classes and abstractions represent the foundational building blocks for LLM applications, most notably RAG.
Such building blocks include abstractions for LLMs, Vector Stores, Embeddings, Storage, Callables and several others. We've designed the core library so that it can be easily extended through subclasses. Building LLM applications with LlamaIndex thus involves building with LlamaIndex core as well as with the LlamaIndex [integrations](https://github.com/run-llama/llama_index/tree/main/llama-index-integrations) needed for your application.