ZENODO
Other literature type · 2026
License: CC BY
Data sources: ZENODO

Information Retrieval Systems for Efficient Multimedia Information Access

Author: Dr. Suneel Pappala

Abstract

An Information Retrieval System (IRS) is designed to store, organize, retrieve, and maintain information in response to user queries. Unlike traditional database systems that rely on structured data and exact matching, an IRS focuses on retrieving relevant information from large collections of unstructured or semi-structured data such as text, images, audio, video, and other multimedia content. With the rapid growth of the Internet and advances in low-cost computing and storage technologies, information retrieval systems have become essential tools for managing vast digital repositories and enabling efficient access to knowledge. The primary objective of an IRS is to reduce the user’s effort in locating needed information. This effort, known as information retrieval overhead, includes query formulation, execution, examination of retrieved results, and reading non-relevant items. To evaluate system effectiveness, two key performance measures are used: precision, which reflects the accuracy of retrieved results, and recall, which measures the completeness of retrieval. A balance between these measures is crucial for effective information access. Modern information retrieval systems support natural language queries, allowing users to express their information needs in everyday language. Internally, an IRS operates through several functional processes, including item normalization, selective dissemination of information, document database search, and index database search. Item normalization converts diverse data formats into standardized, searchable representations through processes such as zoning, token identification, and stop-word removal. Indexing and automatic file-building techniques further enhance retrieval efficiency.
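The precision and recall measures described above can be sketched in a few lines. This is an illustrative example, not code from the paper; the document IDs and helper name `precision_recall` are assumptions for demonstration.

```python
def precision_recall(retrieved, relevant):
    """Precision = fraction of retrieved items that are relevant;
    recall = fraction of relevant items that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# A query retrieves d1-d4; only d2, d3, d5 are actually relevant.
p, r = precision_recall(["d1", "d2", "d3", "d4"], ["d2", "d3", "d5"])
print(p, r)  # 0.5 0.6666666666666666
```

High precision with low recall (or vice versa) signals the imbalance the abstract warns about: tuning one measure in isolation degrades overall information access.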
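The item-normalization steps mentioned above (token identification and stop-word removal) can be sketched as below. This is a minimal illustration assuming a toy stop-word list; the paper's zoning step (splitting an item into fields such as title and body) is omitted.

```python
import re

# Toy stop-word list for illustration; real systems use larger curated lists.
STOP_WORDS = {"the", "a", "an", "of", "in", "and", "to", "is"}

def normalize(text):
    """Item normalization: lowercase, identify tokens, drop stop words."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

print(normalize("An Information Retrieval System is designed to store information"))
# → ['information', 'retrieval', 'system', 'designed', 'store', 'information']
```

The resulting token stream is what the indexing and automatic file-building stages consume to produce a searchable representation.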

Keywords

Information Retrieval System, Precision, Recall, Relevance, Item Normalization, Indexing
