Added journal publication to citation section #498

Merged 3 commits on Oct 14, 2024
README.md: 26 additions, 0 deletions
@@ -16,6 +16,32 @@ For development, besides the JDK, additionally [NPM](https://www.npmjs.com/) an

We kindly ask you to refer to the following paper in publications mentioning or employing DRES:

> Loris Sauter, Ralph Gasser, Heiko Schuldt, Abraham Bernstein, and Luca Rossetto. 2024. Performance Evaluation in Multimedia Retrieval. ACM Trans. Multimedia Comput. Commun. Appl. Just Accepted (October 2024). https://doi.org/10.1145/3678881

**Link:** https://dl.acm.org/doi/10.1145/3678881

**Bibtex:**

```
@article{10.1145/3678881,
author = {Sauter, Loris and Gasser, Ralph and Schuldt, Heiko and Bernstein, Abraham and Rossetto, Luca},
title = {Performance Evaluation in Multimedia Retrieval},
year = {2024},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
issn = {1551-6857},
url = {https://doi.org/10.1145/3678881},
doi = {10.1145/3678881},
abstract = {Performance evaluation in multimedia retrieval, as in the information retrieval domain at large, relies heavily on retrieval experiments, employing a broad range of techniques and metrics. These can involve human-in-the-loop and machine-only settings for the retrieval process itself and the subsequent verification of results. Such experiments can be elaborate and use-case-specific, which can make them difficult to compare or replicate. In this paper, we present a formal model to express all relevant aspects of such retrieval experiments, as well as a flexible open-source evaluation infrastructure that implements the model. These contributions intend to make a step towards lowering the hurdles for conducting retrieval experiments and improving their reproducibility.},
note = {Just Accepted},
journal = {ACM Trans. Multimedia Comput. Commun. Appl.},
month = oct,
keywords = {Interactive Multimedia Retrieval, Retrieval Evaluation, Interactive Evaluation, Evaluation System}
}
```

We also have a demo publication:

> Rossetto L., Gasser R., Sauter L., Bernstein A., Schuldt H. (2021) A System for Interactive Multimedia Retrieval Evaluations. In: Lokoč J. et al. (eds) MultiMedia Modeling. MMM 2021. Lecture Notes in Computer Science, vol 12573. Springer, Cham.

**Link:** https://doi.org/10.1007/978-3-030-67835-7_33